Glaider allows you to anonymize personally identifiable information (PII) in text data before processing it with AI models or storing it. The service automatically detects and anonymizes sensitive information such as personal names, locations, organizations, email addresses, IP addresses, access tokens, API keys, credit card numbers, and more.
Header | Value |
---|---|
Authorization | Bearer YOUR_API_KEY |
Content-Type | application/json |
Replace `YOUR_API_KEY` with your actual API key.
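As a minimal sketch, the two headers can be assembled once in Python and reused for every request. Reading the key from an environment variable named GLAIDER_API_KEY is a convention of this example, not a requirement of the API:

```python
import os

# Keep the key out of source code; the variable name is just this example's choice.
API_KEY = os.environ["GLAIDER_API_KEY"]

HEADERS = {
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
}
```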
Parameter | Type | Required | Description |
---|---|---|---|
prompt | string | Yes | The input text to anonymize |
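A possible request, reusing HEADERS from the snippet above. The base URL is a placeholder, since only the `anonymize-pii` path appears in this documentation, and the requests library is this example's choice of HTTP client:

```python
import requests

# Placeholder host -- substitute the actual Glaider API base URL for your account.
ANONYMIZE_URL = "https://api.example.com/anonymize-pii"

payload = {"prompt": "Contact Mario Rossi at mario.rossi@example.com from Rome."}

# HEADERS is the Authorization / Content-Type dictionary built in the previous snippet.
response = requests.post(ANONYMIZE_URL, headers=HEADERS, json=payload, timeout=30)
response.raise_for_status()
result = response.json()
```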
A successful request returns 200 OK with a Content-Type of application/json and a body containing:

- `anonymized_text` (string): The input text with all detected sensitive information replaced with anonymized tokens.
- `entities` (object): A mapping of anonymized tokens to their original values.

The API automatically detects and anonymizes the following types of sensitive information:

- Personal names (replaced with `[PER_n]`)
- Locations (replaced with `[LOC_n]`)
- Organizations (replaced with `[ORG_n]`)
- Email addresses (replaced with `[Email Address_n]`)
- IPv4 addresses (replaced with `[IPv4 Address_n]`)
- Access tokens (replaced with `[Access Token_n]`)
- API keys (replaced with `[API Key_n]`)
- Credit card numbers (replaced with `[Credit Card Number_n]`)
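To illustrate the response shape, here is a hypothetical parsed body; the concrete values are invented for this example, not captured from a real response:

```python
# Hypothetical parsed JSON body of a 200 OK response (values invented for illustration).
result = {
    "anonymized_text": "Contact [PER_1] at [Email Address_1] from [LOC_1].",
    "entities": {
        "[PER_1]": "Mario Rossi",
        "[Email Address_1]": "mario.rossi@example.com",
        "[LOC_1]": "Rome",
    },
}

anonymized_text = result["anonymized_text"]  # safe to store or forward to an AI model
entities = result["entities"]                # keep under stricter access control if you need to map back
```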
Rate Limiting: The API enforces rate limits to ensure fair usage. If you exceed the rate limit, you will receive a 429 Too Many Requests
response. Please implement appropriate retry logic with exponential backoff in your applications.
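One way to follow that advice is a small wrapper that retries on 429 with exponential backoff. Everything below (the function name, backoff parameters, and placeholder URL) is illustrative rather than prescribed by the API:

```python
import os
import time

import requests

ANONYMIZE_URL = "https://api.example.com/anonymize-pii"  # placeholder: substitute your actual base URL
HEADERS = {
    "Authorization": f"Bearer {os.environ['GLAIDER_API_KEY']}",
    "Content-Type": "application/json",
}


def anonymize_with_retry(prompt: str, max_retries: int = 5) -> dict:
    """POST the prompt, retrying with exponential backoff on 429 responses."""
    delay = 1.0
    for _ in range(max_retries):
        response = requests.post(
            ANONYMIZE_URL, headers=HEADERS, json={"prompt": prompt}, timeout=30
        )
        if response.status_code == 429:
            time.sleep(delay)
            delay *= 2  # back off exponentially before the next attempt
            continue
        response.raise_for_status()  # raise on any other 4xx/5xx status
        return response.json()
    raise RuntimeError("Rate limited: retry attempts exhausted")
```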
Error Handling: Always check the response status code and handle errors appropriately in your application.
Security: Keep your API key secure. Do not expose it in client-side code, public repositories, or logs.
Support: For assistance or inquiries, contact our support team at info@glaider.it.
Example Scenario:
Suppose you are processing user-submitted text and need to ensure that any personally identifiable information (PII) is anonymized before storing it or processing it further. You can use the `anonymize-pii` endpoint to detect and replace sensitive information with anonymized tokens.
By sending the text to the API, you receive the anonymized version along with a mapping of the anonymized tokens to the original values. This allows you to safely work with the anonymized text while still having the ability to refer back to the original data if necessary, in a secure and controlled manner.
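For example, a simple reverse substitution over the entities mapping restores the original text when it is needed again. The helper below is an illustration of the idea, not part of the API:

```python
def deanonymize(anonymized_text: str, entities: dict[str, str]) -> str:
    """Substitute each anonymized token back with its original value."""
    for token, original in entities.items():
        anonymized_text = anonymized_text.replace(token, original)
    return anonymized_text


# The anonymized text can be stored or sent to an AI model as-is, while the
# entities mapping is kept separately under stricter access controls.
anonymized = "Please email [PER_1] at [Email Address_1]."
mapping = {"[PER_1]": "Mario Rossi", "[Email Address_1]": "mario.rossi@example.com"}

assert deanonymize(anonymized, mapping) == "Please email Mario Rossi at mario.rossi@example.com."
```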