## Examples

### Batch Processing

```typescript
const texts = ["text1", "text2", "text3"];
const results = await client.maskBatch(texts);
```
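One way to consume the batch output, continuing the example above; this is a sketch that assumes `maskBatch()` resolves to an array with one response per input, each carrying the same `usage` field as a single `mask()` call:

```typescript
// Sketch: iterate the batch results, assuming one response per input text
// with the same `usage` field as a single mask() call.
results.forEach((result, i) => {
  console.log(`Text ${i + 1}: ${result.usage.total_tokens} tokens used`);
});
```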
### Usage Tracking

```typescript
const result = await client.mask(text);
console.log(`Tokens used: ${result.usage.total_tokens}`);
```
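For cost management it can help to accumulate usage across calls; a small sketch, assuming each response exposes `usage.total_tokens` as above:

```typescript
// Sketch: keep a running total of tokens across several mask() calls.
let totalTokens = 0;

for (const text of ["text1", "text2", "text3"]) {
  const result = await client.mask(text);
  totalTokens += result.usage.total_tokens;
}

console.log(`Total tokens used: ${totalTokens}`);
```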
### Chat with History

```typescript
const response = await client.safeChat(
  "What did I tell you about my contact info?",
  "openai:gpt-4o",
  {
    masking: true,
    chat_history: [
      { role: "user", content: "My email is john@example.com" },
      { role: "assistant", content: "I understand. How can I help?" },
    ],
  },
);
```
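The reply can then be read from the response; the field access below is only an assumption modeled on OpenAI-style chat completions, so check the `ChatCompletionResponse` type for the actual shape:

```typescript
// Sketch only: assumes an OpenAI-style shape (choices[0].message.content);
// see the ChatCompletionResponse type for the actual fields.
console.log(response.choices[0].message.content);
```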
## Best Practices

- **Always handle errors** - Wrap calls in try-catch blocks (see the sketch after this list)
- **Adjust confidence thresholds** - Lower (0.3) = more sensitive, higher (0.8) = more strict
- **Use batch operations** - Use `maskBatch()` for multiple texts
- **Monitor token usage** - Track `usage.total_tokens` for cost management
- **Leverage TypeScript** - Use the included type definitions for better DX
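A minimal error-handling sketch for the first point; it relies only on a plain try-catch, since no SDK-specific error class is assumed here:

```typescript
// Sketch: wrap SDK calls in try-catch so a failed request is handled gracefully.
// Narrow the catch further if the SDK exports a dedicated error class.
try {
  const result = await client.mask("My email is john@example.com");
  console.log(`Tokens used: ${result.usage.total_tokens}`);
} catch (error) {
  console.error("PII masking failed:", error);
}
```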
## TypeScript Support

Full TypeScript definitions are included:

```typescript
import type {
  MaskResponse,
  MaskPreviewResponse,
  UnmaskResponse,
  ChatCompletionResponse,
  PiiEntity,
  PiiMapItem,
} from "@anotiai/pii-masker";
```
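As one illustration (a sketch, not prescribed usage), the exported types can annotate helpers around the client; this assumes `mask()` resolves to a `MaskResponse` and that a `client` instance exists as in the setup section:

```typescript
import type { MaskResponse } from "@anotiai/pii-masker";

// Sketch: a helper annotated with the exported MaskResponse type.
// Assumes a `client` instance created earlier and that mask()
// resolves to a MaskResponse.
async function maskAndCount(text: string): Promise<MaskResponse> {
  const result: MaskResponse = await client.mask(text);
  console.log(`Tokens used: ${result.usage.total_tokens}`);
  return result;
}
```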