Content Moderator Documentation

The Azure Content Moderator API is a cognitive service that checks text, image, and video content for material that is potentially offensive, risky, or otherwise undesirable. When such material is found, the service applies appropriate labels (flags) to the content. Your app can then handle flagged content in order to comply with regulations or maintain the intended environment for users.
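The submit-content, receive-flags, act-on-flags flow described above can be sketched against the text screening operation. This is a minimal sketch, not a definitive implementation: the endpoint region, the subscription key placeholder, and the helper names (`screen_text`, `extract_flags`) are illustrative assumptions, and the response fields read here (`Terms`, `Classification`) follow the shape the Screen operation returns.

```python
import json
import urllib.request

# Hypothetical values -- substitute your own resource's region and key.
ENDPOINT = ("https://westus.api.cognitive.microsoft.com"
            "/contentmoderator/moderate/v1.0/ProcessText/Screen")
SUBSCRIPTION_KEY = "<your-subscription-key>"


def screen_text(text: str) -> dict:
    """Submit text to the Screen operation and return the parsed JSON response."""
    req = urllib.request.Request(
        ENDPOINT + "?classify=True&PII=True",
        data=text.encode("utf-8"),
        headers={
            "Content-Type": "text/plain",
            "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


def extract_flags(result: dict) -> list[str]:
    """Summarize the flags in a Screen response so the app can act on them."""
    # Matched profanity terms, if any were found.
    flags = [t["Term"] for t in (result.get("Terms") or [])]
    # The classifier can also recommend human review of the whole text.
    classification = result.get("Classification") or {}
    if classification.get("ReviewRecommended"):
        flags.append("review-recommended")
    return flags
```

A moderation workflow would call `screen_text` and then route content based on `extract_flags`, for example holding anything flagged for human review.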

5-Minute Quickstarts

Learn how to get started with Content Moderator by following one of these quickstarts:

Analyze text content
Analyze image content

Step-by-Step Tutorials

Learn how to develop applications using Content Moderator:

Samples

Find samples for the Content Moderator API:

Reference