Quickstart: Try Content Moderator on the web


The Content Moderator Review tool is now deprecated and will be retired on 12/31/2021.

In this quickstart, you'll use the online Content Moderator Review tool to test out the basic functionality of Content Moderator without having to write any code. If you wish to integrate this service into your content moderation app more quickly, see the other quickstarts in the Next steps section.


Prerequisites

  • A web browser

Set up the review tool

The Content Moderator Review tool is a web-based tool that allows human reviewers to aid the cognitive service in making decisions. In this guide, you'll go through the short process of setting up the review tool so that you can see how the Content Moderator service works. Go to the Content Moderator Review tool site and sign up.

Content Moderator Home Page

Create a review team

Next, create a review team. In a working scenario, this team will be the group of people who manually review the service's moderation decisions. To create a team, you'll need to select a Region, and provide a Team Name and a Team ID. If you wish to invite colleagues to the team, you can do so by entering their email addresses here.


Team Name is a friendly name for your review team; it's the name displayed in the Azure portal. The Team ID is what's used to identify your review team programmatically.

Invite team member

If you choose to encrypt data using a customer-managed key (CMK), you'll be prompted for the Resource ID for your Content Moderator resource in the E0 pricing tier. The resource you provide must be unique to this team.
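The Resource ID follows the standard Azure Resource Manager format. Below is a hedged sketch of what such an ID looks like, with a loose shape check; the subscription GUID, resource group, and account name are placeholders, not real values:

```python
import re

# Hypothetical resource ID in the standard Azure ARM format; every
# segment value here is a placeholder to be replaced with your own.
resource_id = (
    "/subscriptions/00000000-0000-0000-0000-000000000000"
    "/resourceGroups/my-resource-group"
    "/providers/Microsoft.CognitiveServices/accounts/my-content-moderator"
)

# Loose shape check for a Cognitive Services account resource ID.
ARM_ID_PATTERN = re.compile(
    r"^/subscriptions/[0-9a-fA-F-]{36}"
    r"/resourceGroups/[^/]+"
    r"/providers/Microsoft\.CognitiveServices/accounts/[^/]+$"
)

assert ARM_ID_PATTERN.match(resource_id)
```

You can copy the Resource ID from the Properties page of your Content Moderator resource in the Azure portal.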

Invite team member with CMK

Upload sample content

Now you're ready to upload sample content. Select Try > Image, Try > Text, or Try > Video.

Try Image or Text Moderation

Submit your content for moderation. You can use the following sample text content:

Is this a garbage email abcdef@abcd.com, phone: 4255550111, IP:, 1234 Main Boulevard, Panapolis WA 96555.
Crap is the profanity here. Is this information PII? phone 4255550111

Internally, the review tool will call the moderation APIs to scan your content. Once the scanning is complete, you'll see a message informing you that there are results waiting for your review.
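The same scan can be made from code with the Text Moderation API's ProcessText/Screen operation. This is a minimal sketch that only builds the request so you can inspect it before sending it with the HTTP client of your choice; the region and subscription key are placeholders for your own resource's values:

```python
from urllib.parse import urlencode

REGION = "westus"                            # placeholder: your resource's region
SUBSCRIPTION_KEY = "YOUR_SUBSCRIPTION_KEY"   # placeholder: your resource's key

def build_screen_request(text, classify=True, detect_pii=True):
    """Build (but don't send) a Text Moderation Screen call."""
    params = urlencode({
        "classify": str(classify).lower(),   # request category classification
        "PII": str(detect_pii).lower(),      # request personal-data detection
        "language": "eng",
    })
    url = (f"https://{REGION}.api.cognitive.microsoft.com"
           f"/contentmoderator/moderate/v1.0/ProcessText/Screen?{params}")
    headers = {
        "Content-Type": "text/plain",
        "Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY,
    }
    return url, headers, text.encode("utf-8")

url, headers, body = build_screen_request(
    "Is this a garbage email abcdef@abcd.com, phone: 4255550111")
# POST the body to the URL with these headers, for example:
# requests.post(url, headers=headers, data=body)
```

The JSON response includes any detected PII (email addresses, phone numbers, addresses) and profanity term matches.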

Moderate files

Review moderation tags

Review the applied moderation tags. You can see which tags were applied to your content and what the score was in each category. See the Image, Text, and Video moderation articles to learn more about what the different content tags indicate.

In a project, you or your review team can change these tags or add more tags as needed. You'll submit these changes with the Next button. As your business application calls the Moderator APIs, the tagged content will queue up here, ready to be reviewed by the human review teams. You can quickly review large volumes of content using this approach.

At this point, you've used the Content Moderator Review tool to see examples of what the Content Moderator service can do. Next, you can either learn more about the review tool and how to integrate it into a software project using the Review APIs, or you can skip to the Next steps section to learn how to use the Moderation APIs themselves in your app.

Learn more about the review tool

To learn more about how to use the Content Moderator Review tool, take a look at the Review tool guide, and see the Review tool APIs to learn how to fine-tune the human review experience:

  • The Job API scans your content by using the moderation APIs and generates reviews in the review tool.
  • The Review API directly creates image, text, or video reviews for human moderators without first scanning the content.
  • The Workflow API creates, updates, and gets details about the custom workflows that your team creates.
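As a concrete illustration of the second bullet, this sketch builds a Review API "create reviews" request, which puts content in front of human moderators without a prior scan. The team ID is the one you chose when creating your review team; the region, key, and content URL here are placeholders. The request is only constructed, not sent:

```python
import json

def build_create_review_request(region, team_id, subscription_key,
                                content_url, content_id):
    """Build (but don't send) a Review API create-reviews call."""
    url = (f"https://{region}.api.cognitive.microsoft.com"
           f"/contentmoderator/review/v1.0/teams/{team_id}/reviews")
    headers = {
        "Content-Type": "application/json",
        "Ocp-Apim-Subscription-Key": subscription_key,
    }
    # One review item per piece of content; Type may be "Image" or "Text".
    body = json.dumps([{
        "Type": "Image",
        "Content": content_url,
        "ContentId": content_id,       # your own identifier for this content
        "CallbackEndpoint": "",        # optional webhook called when the review completes
    }])
    return url, headers, body

url, headers, body = build_create_review_request(
    "westus", "myreviewteam", "YOUR_SUBSCRIPTION_KEY",
    "https://example.com/sample-image.jpg", "image-001")
# POST the body to the URL with these headers to create the review.
```

Once created, the review appears in the review tool's queue for your team, just like the content you uploaded through the Try menu.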

Or, continue with the next steps to get started using the Moderation APIs in your code.

Next steps

Learn how to use the Moderation APIs themselves in your app.

  • Implement image moderation. Use the API console or follow a quickstart to scan images and detect potential adult and racy content by using tags, confidence scores, and other extracted information.
  • Implement text moderation. Use the API console or follow a quickstart to scan text content for potential profanity, personal data, and other unwanted text.
  • Implement video moderation. Follow the Video moderation how-to guide for C# to scan videos and detect potential adult and racy content.
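For the image case in the list above, the Image Moderation API's ProcessImage/Evaluate operation scores an image for adult and racy content. This is a hedged sketch that only constructs the request; the region, key, and image URL are placeholders, and the service fetches the image from the URL you supply:

```python
import json

def build_evaluate_request(region, subscription_key, image_url):
    """Build (but don't send) an Image Moderation Evaluate call."""
    url = (f"https://{region}.api.cognitive.microsoft.com"
           "/contentmoderator/moderate/v1.0/ProcessImage/Evaluate")
    headers = {
        "Content-Type": "application/json",
        "Ocp-Apim-Subscription-Key": subscription_key,
    }
    # Pass the image by public URL; the service downloads and evaluates it.
    body = json.dumps({"DataRepresentation": "URL", "Value": image_url})
    return url, headers, body

url, headers, body = build_evaluate_request(
    "westus", "YOUR_SUBSCRIPTION_KEY", "https://example.com/sample-image.jpg")
# POST the body to the URL with these headers; the response contains
# confidence scores and boolean classifications for adult and racy content.
```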