Export or delete user data in Content Moderator

Important

Azure Content Moderator is deprecated as of February 2024 and will be retired by February 2027. It is replaced by Azure AI Content Safety, which offers advanced AI features and enhanced performance.

Azure AI Content Safety is a comprehensive solution designed to detect harmful user-generated and AI-generated content in applications and services. Azure AI Content Safety is suitable for many scenarios such as online marketplaces, gaming companies, social messaging platforms, enterprise media companies, and K-12 education solution providers. Here's an overview of its features and capabilities:

  • Text and Image Detection APIs: Scan text and images for sexual content, violence, hate, and self-harm with multiple severity levels (a request sketch follows this list).
  • Content Safety Studio: An online tool designed to handle potentially offensive, risky, or undesirable content using our latest content moderation ML models. It provides templates and customized workflows that enable users to build their own content moderation systems.
  • Language support: Azure AI Content Safety supports more than 100 languages and is specifically trained on English, German, Japanese, Spanish, French, Italian, Portuguese, and Chinese.
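
As a rough illustration of the Text Detection API, the following sketch sends a text sample to a Content Safety resource and prints the severity reported for each harm category. The endpoint, API version, key, and response field names used here are placeholder assumptions; verify them against the current Azure AI Content Safety reference before use.

```python
import requests

# Placeholder values: substitute your own Content Safety resource endpoint and key.
ENDPOINT = "https://<your-resource>.cognitiveservices.azure.com"
KEY = "<your-content-safety-key>"

def analyze_text(text: str) -> dict:
    """Send text to the Content Safety text analysis endpoint (API version assumed)."""
    response = requests.post(
        f"{ENDPOINT}/contentsafety/text:analyze",
        params={"api-version": "2023-10-01"},
        headers={"Ocp-Apim-Subscription-Key": KEY, "Content-Type": "application/json"},
        json={"text": text},
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    result = analyze_text("Sample text to screen for harmful content.")
    # The response is expected to list one severity value per harm category.
    for item in result.get("categoriesAnalysis", []):
        print(item.get("category"), item.get("severity"))
```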

Azure AI Content Safety provides a robust and flexible solution for your content moderation needs. By switching from Content Moderator to Azure AI Content Safety, you can take advantage of the latest tools and technologies to ensure that your content is always moderated to your exact specifications.

Learn more about Azure AI Content Safety and explore how it can elevate your content moderation strategy.

Content Moderator collects user data to operate the service, but customers have full control to view, export, and delete their data using the Moderation APIs.

Note

This article provides steps for deleting personal data from the device or service, and it can be used to support your obligations under the GDPR. For general information about GDPR, see the GDPR section of the Microsoft Trust Center and the GDPR section of the Service Trust Portal.

For more information on how to export and delete user data in Content Moderator, see the following table.

| Data | Export operation | Delete operation |
|------|------------------|------------------|
| Account info (subscription keys) | N/A | Delete by using the Azure portal (Azure subscriptions). |
| Images for custom matching | Call the Get image IDs API. Images are stored in a one-way proprietary hash format, and there is no way to extract the actual images. | Call the Delete all Images API, or delete the Content Moderator resource by using the Azure portal. |
| Terms for custom matching | Call the Get all terms API. | Call the Delete all terms API, or delete the Content Moderator resource by using the Azure portal. |
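
To make the table concrete, the following sketch shows one possible way to call the export (get) and delete-all operations for a custom image list and a custom term list through the Content Moderator List Management REST API. The base URL, route paths, list IDs, and language code are illustrative assumptions; confirm the exact routes and parameters in the Content Moderator API reference.

```python
import requests

# Placeholders: substitute your Content Moderator resource endpoint, key, and list IDs.
ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com"
KEY = "<your-content-moderator-key>"
HEADERS = {"Ocp-Apim-Subscription-Key": KEY}

def export_image_ids(list_id: str) -> dict:
    """Get image IDs API: returns the hashed image IDs stored in a custom image list."""
    url = f"{ENDPOINT}/contentmoderator/lists/v1.0/imagelists/{list_id}/images"
    resp = requests.get(url, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

def delete_all_images(list_id: str) -> None:
    """Delete all Images API: removes every image from a custom image list."""
    url = f"{ENDPOINT}/contentmoderator/lists/v1.0/imagelists/{list_id}/images"
    requests.delete(url, headers=HEADERS).raise_for_status()

def export_terms(list_id: str, language: str = "eng") -> dict:
    """Get all terms API: returns the terms stored in a custom term list."""
    url = f"{ENDPOINT}/contentmoderator/lists/v1.0/termlists/{list_id}/terms/{language}"
    resp = requests.get(url, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()

def delete_all_terms(list_id: str, language: str = "eng") -> None:
    """Delete all terms API: removes every term from a custom term list."""
    url = f"{ENDPOINT}/contentmoderator/lists/v1.0/termlists/{list_id}/terms/{language}"
    requests.delete(url, headers=HEADERS).raise_for_status()
```

Deleting the Content Moderator resource itself (or the Azure subscription) removes all of the data above in one step, so the per-list calls are only needed when you want to clear data while keeping the resource.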