Quickstart: Detect faces in an image using the Face REST API and cURL

In this quickstart, you will use the Azure Face REST API with cURL to detect human faces in an image.

If you don't have an Azure subscription, create a free account before you begin.


Write the command

You'll use a command like the following to call the Face API and get face attribute data from an image. First, copy the command into a text editor&mdash;you'll need to change certain parts of it before you can run it.

curl -H "Ocp-Apim-Subscription-Key: <Subscription Key>" \
  -H "Content-Type: application/json" \
  "https://westcentralus.api.cognitive.microsoft.com/face/v1.0/detect?returnFaceId=true&returnFaceLandmarks=false&returnFaceAttributes=age,gender,headPose,smile,facialHair,glasses,emotion,hair,makeup,occlusion,accessories,blur,exposure,noise" \
  --data-ascii "{\"url\":\"https://upload.wikimedia.org/wikipedia/commons/c/c3/RH_Louise_Lillian_Gish.jpg\"}"

Subscription key

Replace <Subscription Key> with your valid Face subscription key.
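Rather than pasting the key directly into the command, you could keep it in an environment variable and reference it in the header. This is a minimal sketch; the variable name `FACE_SUBSCRIPTION_KEY` and the key value shown are placeholders, not real values:

```shell
# Store your Face subscription key in an environment variable
# (the value below is a placeholder -- substitute your own key)
export FACE_SUBSCRIPTION_KEY="0123456789abcdef0123456789abcdef"

# Reference the variable in the header instead of pasting the key inline
echo "Ocp-Apim-Subscription-Key: ${FACE_SUBSCRIPTION_KEY}"
```

This keeps the key out of your shell history when you reuse the command.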

Face endpoint URL

The URL https://westcentralus.api.cognitive.microsoft.com/face/v1.0/detect indicates the Azure Face endpoint to query. You may need to change the first part of this URL to match the region that corresponds to your subscription key (see the Face API docs for a list of all region endpoints).
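For example, swapping the region is just a matter of changing the first URL segment. A small sketch, using `westus2` as an assumed example region (check the Face API docs for the regions actually available to your subscription):

```shell
# Build the detect endpoint for a different region
# ("westus2" is an example -- use the region tied to your key)
FACE_REGION="westus2"
FACE_ENDPOINT="https://${FACE_REGION}.api.cognitive.microsoft.com/face/v1.0/detect"
echo "$FACE_ENDPOINT"
```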

URL query string

The query string of the Face endpoint URL specifies which face attributes to retrieve. You may wish to change this string depending on your intended use.
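For instance, if you only need a subset of attributes, you can trim the `returnFaceAttributes` list. A sketch requesting only `age` and `emotion` (a smaller attribute set than the full command above uses):

```shell
# A trimmed query string requesting only the age and emotion attributes
QUERY="returnFaceId=true&returnFaceLandmarks=false&returnFaceAttributes=age,emotion"
echo "https://westcentralus.api.cognitive.microsoft.com/face/v1.0/detect?${QUERY}"
```

Requesting fewer attributes keeps the response smaller and simpler to parse.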


Image source URL

The source URL indicates the image to use as input. You can change this to point to any image you wish to analyze.
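Swapping in a different image means replacing the `url` value in the JSON request body. A minimal sketch of constructing that body; the image URL below is a hypothetical placeholder, not a real image:

```shell
# Point the request body at a different image
# (the URL below is a hypothetical placeholder -- any publicly
#  accessible image URL works)
IMAGE_URL="https://example.com/photos/group-portrait.jpg"
BODY="{\"url\":\"${IMAGE_URL}\"}"
echo "$BODY"
```

Note that the image must be reachable by the Face service itself, so URLs behind a login or on a private network won't work.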


Run the command

Once you've made your changes, open a command prompt window and run the command. You should see the detected face information displayed as JSON data in the console window. For example:

    [
      {
        "faceId": "49d55c17-e018-4a42-ba7b-8cbbdfae7c6f",
        "faceRectangle": {
          "top": 131,
          "left": 177,
          "width": 162,
          "height": 162
        },
        "faceAttributes": {
          "smile": 0,
          "headPose": {
            "pitch": 0,
            "roll": 0.1,
            "yaw": -32.9
          },
          "gender": "female",
          "age": 22.9,
          "facialHair": {
            "moustache": 0,
            "beard": 0,
            "sideburns": 0
          },
          "glasses": "NoGlasses",
          "emotion": {
            "anger": 0,
            "contempt": 0,
            "disgust": 0,
            "fear": 0,
            "happiness": 0,
            "neutral": 0.986,
            "sadness": 0.009,
            "surprise": 0.005
          },
          "blur": {
            "blurLevel": "low",
            "value": 0.06
          },
          "exposure": {
            "exposureLevel": "goodExposure",
            "value": 0.67
          },
          "noise": {
            "noiseLevel": "low",
            "value": 0
          },
          "makeup": {
            "eyeMakeup": true,
            "lipMakeup": true
          },
          "accessories": [],
          "occlusion": {
            "foreheadOccluded": false,
            "eyeOccluded": false,
            "mouthOccluded": false
          },
          "hair": {
            "bald": 0,
            "invisible": false,
            "hairColor": [
              {
                "color": "brown",
                "confidence": 1
              },
              {
                "color": "black",
                "confidence": 0.87
              },
              {
                "color": "other",
                "confidence": 0.51
              },
              {
                "color": "blond",
                "confidence": 0.08
              },
              {
                "color": "red",
                "confidence": 0.08
              },
              {
                "color": "gray",
                "confidence": 0.02
              }
            ]
          }
        }
      }
    ]

Next steps

In this quickstart, you wrote a cURL command that calls the Azure Face API to detect faces in an image and return their attributes. Next, explore the Face API reference documentation to learn more.