Install and run Face containers

Face provides a standardized Linux container for Docker, named Face, which detects human faces in images and identifies attributes, including face landmarks (such as noses and eyes), gender, age, and other machine-predicted facial features. In addition to detection, Face can verify whether two faces in the same image or in different images belong to the same person by using a confidence score, or compare faces against a database to see whether a similar-looking or identical face already exists. It can also organize similar faces into groups by using shared visual traits.

If you don't have an Azure subscription, create a free account before you begin.

Prerequisites

You must meet the following prerequisites before using Face API containers:

Required Purpose
Docker Engine You need the Docker Engine installed on a host computer. Docker provides packages that configure the Docker environment on macOS, Windows, and Linux. For a primer on Docker and container basics, see the Docker overview.

Docker must be configured to allow the containers to connect with and send billing data to Azure.

On Windows, Docker must also be configured to support Linux containers.

Familiarity with Docker You should have a basic understanding of Docker concepts, like registries, repositories, containers, and container images, as well as knowledge of basic docker commands.
Face API resource In order to use the container, you must have:

A Face API Azure resource to get the associated billing key and billing endpoint URI. Both values are available on the Azure portal's Face API Overview and Keys pages and are required to start the container.

{BILLING_KEY}: resource key

{BILLING_ENDPOINT_URI}: endpoint URI, for example: https://westus.api.cognitive.microsoft.com/face/v1.0

Request access to the private container registry

You must first complete and submit the Cognitive Services Vision Containers Request form to request access to the container. The form requests information about you, your company, and the user scenario for which you'll use the container. Once submitted, the Azure Cognitive Services team reviews the form to ensure that you meet the criteria for access to the private container registry.

Important

You must use an email address associated with either a Microsoft Account (MSA) or Azure Active Directory (Azure AD) account in the form.

If your request is approved, you then receive an email with instructions describing how to obtain your credentials and access the private container registry.

Log in to the private container registry

There are several ways to authenticate with the private container registry for Cognitive Services Containers, but the recommended method from the command line is to use the Docker CLI.

Use the docker login command, as shown in the following example, to log into containerpreview.azurecr.io, the private container registry for Cognitive Services Containers. Replace <username> with the user name and <password> with the password provided in the credentials you received from the Azure Cognitive Services team.

docker login containerpreview.azurecr.io -u <username> -p <password>

If you have secured your credentials in a text file, you can pipe the contents of that text file to the docker login command by using the cat command, as shown in the following example. Replace <passwordFile> with the path and name of the text file containing the password and <username> with the user name provided in your credentials.

cat <passwordFile> | docker login containerpreview.azurecr.io -u <username> --password-stdin

The host computer

The host is the computer that runs the Docker container. It can be a computer on your premises or a Docker hosting service in Azure, such as Azure Kubernetes Service or Azure Container Instances.
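
Because the Face image is a Linux container, the Docker daemon on the host must be running in Linux container mode, as noted in the prerequisites. One quick way to confirm this is to ask the Docker daemon for its operating system; the command should print linux:

docker version --format '{{.Server.Os}}'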

Container requirements and recommendations

The following table describes the minimum and recommended CPU cores and memory to allocate for each Face API container.

Container | Minimum | Recommended | TPS (Minimum, Maximum)
Face | 1 core, 2 GB memory | 1 core, 4 GB memory | 10, 20
  • Each core must be at least 2.6 gigahertz (GHz) or faster.
  • TPS - transactions per second

Core and memory correspond to the --cpus and --memory settings, which are used as part of the docker run command.

Get the container image with docker pull

Container images for the Face API are available from the private container registry:

Container Repository
Face containerpreview.azurecr.io/microsoft/cognitive-services-face:latest

Tip

You can use the docker images command to list your downloaded container images. For example, the following command lists the ID, repository, and tag of each downloaded container image, formatted as a table:

docker images --format "table {{.ID}}\t{{.Repository}}\t{{.Tag}}"

IMAGE ID            REPOSITORY              TAG
ebbee78a6baa       <container-name>         latest

Docker pull for the Face container

docker pull containerpreview.azurecr.io/microsoft/cognitive-services-face:latest

How to use the container

Once the container is on the host computer, use the following process to work with the container.

  1. Run the container, with the required billing settings. More examples of the docker run command are available.
  2. Query the container's prediction endpoint.

Run the container with docker run

Use the docker run command to run the Face container. The command uses the following parameters:

Placeholder Value
{BILLING_KEY} This key is used to start the container, and is available on the Azure portal's Face API Keys page.
{BILLING_ENDPOINT_URI} The billing endpoint URI value is available on the Azure portal's Face API Overview page.

Replace these parameters with your own values in the following example docker run command.

docker run --rm -it -p 5000:5000 --memory 4g --cpus 1 \
containerpreview.azurecr.io/microsoft/cognitive-services-face \
Eula=accept \
Billing={BILLING_ENDPOINT_URI} \
ApiKey={BILLING_KEY}

This command:

  • Runs a face container from the container image
  • Allocates one CPU core and 4 gigabytes (GB) of memory
  • Exposes TCP port 5000 and allocates a pseudo-TTY for the container
  • Automatically removes the container after it exits. The container image is still available on the host computer.

More examples of the docker run command are available.

Important

The Eula, Billing, and ApiKey options must be specified to run the container; otherwise, the container won't start. For more information, see Billing.

Running multiple containers on the same host

If you intend to run multiple containers with exposed ports, make sure to run each container with a different port. For example, run the first container on port 5000 and the second container on port 5001.

Replace <container-registry> and <container-name> with the values for the containers you use. These don't have to be the same container; for example, you can run the Face container and the LUIS container on the host together, or run multiple Face containers.

Run the first container on port 5000.

docker run --rm -it -p 5000:5000 --memory 4g --cpus 1 \
<container-registry>/microsoft/<container-name> \
Eula=accept \
Billing={BILLING_ENDPOINT_URI} \
ApiKey={BILLING_KEY}

Run the second container on port 5001.

docker run --rm -it -p 5001:5000 --memory 4g --cpus 1 \
<container-registry>/microsoft/<container-name> \
Eula=accept \
Billing={BILLING_ENDPOINT_URI} \
ApiKey={BILLING_KEY}

Each subsequent container should be on a different port.
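
To confirm that each container is listening on its assigned port, you can request the swagger page (described later in this article) from each mapped host port. For example:

curl http://localhost:5000/swagger
curl http://localhost:5001/swagger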

Query the container's prediction endpoint

The container provides REST-based query prediction endpoint APIs.

Use the host, http://localhost:5000, for container APIs.
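
For example, the following curl command submits an image URL to the container's face detection operation. This is a minimal sketch that assumes the container exposes the same /face/v1.0/detect route as the cloud Face API; the container's /swagger page lists the exact routes it supports. Replace the placeholder image URL with one of your own:

curl -X POST "http://localhost:5000/face/v1.0/detect" \
  -H "Content-Type: application/json" \
  -d '{"url": "https://example.com/photo.jpg"}'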

Stop the container

To shut down the container, in the command-line environment where the container is running, press Ctrl+C.
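
If you started the container detached (by using -d instead of -it), Ctrl+C isn't available. In that case, stop it from another shell with standard Docker commands, replacing <container-id> with the ID reported by docker ps:

docker ps
docker stop <container-id>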

Troubleshooting

If you run the container with an output mount and logging enabled, the container generates log files that are helpful to troubleshoot issues that happen while starting or running the container.
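
The exact mount and logging settings are described in Configure containers. As a rough sketch only, and assuming the shared Cognitive Services container conventions of an /output mount target and a Logging:Disk:Format setting, a run command with logging to disk might look like the following. Replace the source path with a directory on your host:

docker run --rm -it -p 5000:5000 --memory 4g --cpus 1 \
--mount type=bind,src=/home/azureuser/output,target=/output \
containerpreview.azurecr.io/microsoft/cognitive-services-face \
Eula=accept \
Billing={BILLING_ENDPOINT_URI} \
ApiKey={BILLING_KEY} \
Logging:Disk:Format=json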

Container's API documentation

The container provides a full set of documentation for the endpoints as well as a Try it now feature. This feature allows you to enter your settings into a web-based HTML form and make the query without having to write any code. Once the query returns, an example cURL command is provided to demonstrate the HTTP headers and body format required.

Tip

Read the OpenAPI specification, describing the API operations supported by the container, from the /swagger relative URI. For example:

http://localhost:5000/swagger

Billing

The Face API containers send billing information to Azure, using a Face API resource on your Azure account.

Queries to the container are billed at the pricing tier of the Azure resource used for the <ApiKey>.

Cognitive Services containers are not licensed to run without being connected to the billing endpoint for metering. Customers need to enable the containers to communicate billing information with the billing endpoint at all times. Cognitive Services containers do not send customer data (for example, the image or text that is being analyzed) to Microsoft.

Connecting to Azure

The container needs the billing argument values to run. These values allow the container to connect to the billing endpoint. The container reports usage about every 10 to 15 minutes. If the container doesn't connect to Azure within the allowed time window, it continues to run but doesn't serve queries until the billing endpoint is restored. The connection is attempted 10 times at the same interval of 10 to 15 minutes. If the container can't connect to the billing endpoint within those 10 tries, it stops running.

Billing arguments

All three of the following options must be specified with valid values in order for the docker run command to start the container:

Option Description
ApiKey The API key of the Cognitive Service resource used to track billing information.
The value of this option must be set to an API key for the provisioned resource specified in Billing.
Billing The endpoint of the Cognitive Service resource used to track billing information.
The value of this option must be set to the endpoint URI of a provisioned Face API Azure resource.
Eula Indicates that you've accepted the license for the container.
The value of this option must be set to accept.

For more information about these options, see Configure containers.

Summary

In this article, you learned concepts and workflow for downloading, installing, and running Face API containers. In summary:

  • Face API provides a Linux container for Docker, which detects faces in images and identifies attributes such as face landmarks, gender, and age.
  • Container images are downloaded from the private container registry in Azure.
  • Container images run in Docker.
  • You can use either the REST API or SDK to call operations in Face API containers by specifying the host URI of the container.
  • You must specify billing information when instantiating a container.

Important

Cognitive Services containers are not licensed to run without being connected to Azure for metering. Customers need to enable the containers to communicate billing information with the metering service at all times. Cognitive Services containers do not send customer data (e.g., the image or text that is being analyzed) to Microsoft.

Next steps