Detect adult content

Computer Vision can detect adult material in images so that developers can restrict the display of these images in their software. Each content flag is accompanied by a confidence score between zero and one, so developers can interpret the results according to their own thresholds.

Note

Much of this functionality is also offered by the Azure Content Moderator service. Consider that alternative for more rigorous content moderation scenarios, such as text moderation and human review workflows.

Content flag definitions

Within the "adult" classification are several different categories:

  • Adult images are defined as those which are explicitly sexual in nature and often depict nudity and sexual acts.
  • Racy images are defined as those which are sexually suggestive in nature and often contain less sexually explicit content than images tagged as Adult.
  • Gory images are defined as those which depict blood or gore.

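The Analyze Image API (described in the next section) reports each of these categories as a boolean flag plus a confidence score, grouped in an adult object in the JSON response. The excerpt below shows the shape of that object; the values are illustrative only:

    {
      "adult": {
        "isAdultContent": false,
        "isRacyContent": true,
        "isGoryContent": false,
        "adultScore": 0.118,
        "racyScore": 0.654,
        "goreScore": 0.011
      }
    }
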
Use the API

You can detect adult content with the Analyze Image API. When you add the value of Adult to the visualFeatures query parameter, the API returns three boolean properties in its JSON response: isAdultContent, isRacyContent, and isGoryContent. It also returns three corresponding properties, adultScore, racyScore, and goreScore, which represent confidence scores between zero and one for each respective category.
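
The following is a minimal sketch of calling the REST interface directly with Python's requests library. The endpoint, subscription key, and image URL are placeholders you would replace with your own resource's values, and the API version shown (v3.2) is an assumption you should check against your deployment:

    import requests

    # Placeholders: substitute your own Computer Vision resource values.
    endpoint = "https://<your-resource-name>.cognitiveservices.azure.com"
    subscription_key = "<your-subscription-key>"
    image_url = "https://example.com/image.jpg"

    # Request only the Adult visual feature from the Analyze Image operation.
    analyze_url = endpoint + "/vision/v3.2/analyze"
    response = requests.post(
        analyze_url,
        params={"visualFeatures": "Adult"},
        headers={"Ocp-Apim-Subscription-Key": subscription_key},
        json={"url": image_url},
    )
    response.raise_for_status()

    adult = response.json()["adult"]

    # Option 1: rely on the boolean flags, which use the service's defaults.
    if adult["isAdultContent"] or adult["isRacyContent"] or adult["isGoryContent"]:
        print("Image flagged by default thresholds")

    # Option 2: apply your own threshold to the raw confidence scores.
    RACY_THRESHOLD = 0.5  # illustrative value; tune for your scenario
    if adult["racyScore"] > RACY_THRESHOLD:
        print("Image exceeds custom racy threshold:", adult["racyScore"])

Because the raw scores are returned alongside the flags, you can enforce stricter or looser limits than the service defaults, as the second check above illustrates.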