What is Azure Content Moderator?

The Azure Content Moderator API is a cognitive service that checks text, image, and video content for material that is potentially offensive, risky, or otherwise undesirable. When such material is found, the service applies appropriate labels (flags) to the content. Your app can then handle flagged content in order to comply with regulations or maintain the intended environment for users. See the Content Moderator APIs section to learn more about what the different content flags indicate.

Where it is used

The following are a few scenarios in which a software developer or team would use Content Moderator:

  • Online marketplaces that moderate product catalogs and other user-generated content
  • Gaming companies that moderate user-generated game artifacts and chat rooms
  • Social messaging platforms that moderate images, text, and videos added by their users
  • Enterprise media companies that implement centralized moderation for their content
  • K-12 education solution providers that filter out content that is inappropriate for students and educators

What it includes

The Content Moderator service consists of several web service APIs available through both REST calls and a .NET SDK. It also includes the human review tool, which allows human reviewers to aid the service and improve or fine-tune its moderation function.

[Block diagram of Content Moderator, showing the moderation APIs, the review APIs, and the human review tool]
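To give a sense of what one of these REST calls looks like, the following Python sketch screens a short piece of text for profanity, personally identifiable information (PII), and machine-assisted classification. It is a minimal sketch rather than the definitive call pattern: the region, subscription key, endpoint path, query parameters, and response field names shown here are assumptions drawn from the public API reference and should be verified against the current documentation for your own resource.

```python
import requests

# Assumptions: replace the region and key with those of your own Content
# Moderator resource. The endpoint path and query parameters below follow
# the public API reference (ProcessText/Screen) and may change.
REGION = "westus"
SUBSCRIPTION_KEY = "<your-content-moderator-key>"

url = (f"https://{REGION}.api.cognitive.microsoft.com"
       "/contentmoderator/moderate/v1.0/ProcessText/Screen")

response = requests.post(
    url,
    params={"classify": "True", "PII": "True", "language": "eng"},
    headers={
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "text/plain",
    },
    data="Is this a crap email abcdef@abcd.com, phone: 4255550111".encode("utf-8"),
)
response.raise_for_status()
result = response.json()

# Field names are assumptions based on the API reference: "Terms" lists
# matched profanity, "PII" lists detected personal data, and
# "Classification" carries category scores plus a ReviewRecommended flag.
for term in result.get("Terms") or []:
    print("Flagged term:", term.get("Term"))
print("PII detected:", result.get("PII"))
print("Review recommended:",
      result.get("Classification", {}).get("ReviewRecommended"))
```

The same request pattern (a regional endpoint plus the Ocp-Apim-Subscription-Key header) applies to the other moderation operations described in the next section.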

Content Moderator APIs

The Content Moderator service includes APIs for the following scenarios.

  • Text moderation: Scans text for offensive content, sexually explicit or suggestive content, profanity, and personally identifiable information (PII).
  • Custom term lists: Scans text against a custom list of terms in addition to the built-in terms. Use custom lists to block or allow content according to your own content policies.
  • Image moderation: Scans images for adult or racy content, detects text in images with the Optical Character Recognition (OCR) capability, and detects faces. (A minimal sketch of an image moderation call follows this list.)
  • Custom image lists: Scans images against a custom list of images. Use custom image lists to filter out instances of commonly recurring content that you don't want to classify again.
  • Video moderation: Scans videos for adult or racy content and returns time markers for that content.
  • Review: Use the Jobs, Reviews, and Workflow operations to create and automate human-in-the-loop workflows with the human review tool. The Workflow API is not yet available through the .NET SDK.
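As referenced in the image moderation item above, here is a minimal Python sketch that evaluates a publicly reachable image URL for adult and racy content. The endpoint path, request body shape, and response field names are assumptions taken from the public API reference; verify them against the current documentation before relying on them.

```python
import requests

# Assumptions: region, key, and image URL are placeholders for your own values.
REGION = "westus"
SUBSCRIPTION_KEY = "<your-content-moderator-key>"
IMAGE_URL = "https://example.com/sample-image.jpg"  # hypothetical image URL

url = (f"https://{REGION}.api.cognitive.microsoft.com"
       "/contentmoderator/moderate/v1.0/ProcessImage/Evaluate")

response = requests.post(
    url,
    headers={
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/json",
    },
    # Passing an image by URL via "DataRepresentation"/"Value" follows the
    # public API reference at the time of writing.
    json={"DataRepresentation": "URL", "Value": IMAGE_URL},
)
response.raise_for_status()
result = response.json()

# Response field names are assumptions based on the API reference.
print("Adult score:", result.get("AdultClassificationScore"))
print("Is adult:", result.get("IsImageAdultClassified"))
print("Racy score:", result.get("RacyClassificationScore"))
print("Is racy:", result.get("IsImageRacyClassified"))
```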

Human review tool

The Content Moderator service also includes the web-based human review tool.

[Screenshot of the Content Moderator human review tool home page]

You can use the Review APIs to set up team reviews of text, image, and video content, according to filters that you specify. Then, human moderators can make the final moderation decisions. The human input does not train the service, but the combined work of the service and human review teams allows developers to strike the right balance between efficiency and accuracy.
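As a rough illustration of how a review might be created programmatically, the sketch below posts a single image item to a review team so that human moderators can make the final decision. The team name, callback endpoint, path, and request body shape are assumptions based on the public Review API reference and on placeholder values; check the current documentation and your own review team settings before using them.

```python
import requests

# Assumptions: all of these values are placeholders for your own resource,
# review team, and callback service.
REGION = "westus"
SUBSCRIPTION_KEY = "<your-content-moderator-key>"
TEAM_NAME = "<your-review-team-name>"

url = (f"https://{REGION}.api.cognitive.microsoft.com"
       f"/contentmoderator/review/api/v1.0/teams/{TEAM_NAME}/reviews")

# The body is a JSON array of review items; the field names follow the
# public Review API reference at the time of writing and may differ.
review_items = [
    {
        "Type": "Image",
        "Content": "https://example.com/sample-image.jpg",        # hypothetical URL
        "ContentId": "sample-content-001",                         # your own content ID
        "CallbackEndpoint": "https://example.com/moderation-hook", # hypothetical callback
        "Metadata": [{"Key": "source", "Value": "user-upload"}],
    }
]

response = requests.post(
    url,
    headers={
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        "Content-Type": "application/json",
    },
    json=review_items,
)
response.raise_for_status()

# The service is expected to return the IDs of the reviews it created;
# human moderators then see these items in the review tool.
print("Created review IDs:", response.json())
```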

Next steps

Follow the Quickstart to get started using Content Moderator.