Install and run Text Analytics containers


  • The containers for Sentiment Analysis and language detection are now generally available. The key phrase extraction container is available as an ungated public preview.
  • Entity linking and NER are not currently available as containers.
  • The container image locations may have recently changed. Read this article to see the updated locations for these containers.

Containers enable you to run the Text Analytics APIs in your own environment, and can help you meet specific security and data governance requirements. The Text Analytics containers provide advanced natural language processing over raw text, and include three main functions: sentiment analysis, key phrase extraction, and language detection.

If you don't have an Azure subscription, create a free account before you begin.


The free account is limited to 5,000 transactions per month and only the Free and Standard pricing tiers are valid for containers. For more information on transaction request rates, see Data Limits.


To run any of the Text Analytics containers, you must have a suitable host computer and a configured container environment.


You must meet the following prerequisites before using Text Analytics containers:

Required Purpose
Docker Engine You need the Docker Engine installed on a host computer. Docker provides packages that configure the Docker environment on macOS, Windows, and Linux. For a primer on Docker and container basics, see the Docker overview.

Docker must be configured to allow the containers to connect with and send billing data to Azure.

On Windows, Docker must also be configured to support Linux containers.

Familiarity with Docker You should have a basic understanding of Docker concepts, like registries, repositories, containers, and container images, as well as knowledge of basic docker commands.
Text Analytics resource In order to use the container, you must have:

An Azure Text Analytics resource with the free (F0) or standard (S) pricing tier. You will need to get the associated API key and endpoint URI by navigating to your resource's Key and endpoint page in the Azure portal.

{API_KEY}: One of the two available resource keys.

{ENDPOINT_URI}: The endpoint for your resource.
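If you prefer the command line over the portal, the Azure CLI can retrieve both values. A minimal sketch, assuming a resource named my-textanalytics in resource group my-rg (both names are placeholders for your own):

```shell
# Endpoint URI of the Text Analytics resource
az cognitiveservices account show \
  --name my-textanalytics \
  --resource-group my-rg \
  --query "properties.endpoint" \
  --output tsv

# Both resource keys; either one works as {API_KEY}
az cognitiveservices account keys list \
  --name my-textanalytics \
  --resource-group my-rg
```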

Gathering required parameters

All Cognitive Services containers require three primary parameters. The end-user license agreement (EULA) must be present with a value of accept. Additionally, both an endpoint URI and an API key are needed.


The Endpoint URI value is available on the Azure portal Overview page of the corresponding Cognitive Service resource. Navigate to the Overview page, hover over the Endpoint, and a Copy to clipboard icon will appear. Copy and use where needed.


Keys {API_KEY}

This key is used to start the container, and is available on the Azure portal's Keys page of the corresponding Cognitive Service resource. Navigate to the Keys page, and click on the Copy to clipboard icon.



These subscription keys are used to access your Cognitive Service API. Do not share your keys. Store them securely, for example, using Azure Key Vault. We also recommend regenerating these keys regularly. Only one key is necessary to make an API call. When regenerating the first key, you can use the second key for continued access to the service.
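Key rotation can also be scripted. A sketch with the Azure CLI, again assuming placeholder resource names (my-textanalytics, my-rg):

```shell
# Regenerate the first key; switch clients to Key2 beforehand for continued access
az cognitiveservices account keys regenerate \
  --name my-textanalytics \
  --resource-group my-rg \
  --key-name Key1
```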

If you're using the Text Analytics for health container, the responsible AI (RAI) acknowledgment must also be present with a value of accept.

The host computer

The host is an x64-based computer that runs the Docker container. It can be a computer on your premises or a Docker hosting service in Azure, such as:

  • Azure Kubernetes Service
  • Azure Container Instances
  • A Kubernetes cluster deployed to Azure Stack

Container requirements and recommendations

The following table describes the minimum and recommended specifications for the Text Analytics containers. At least 2 gigabytes (GB) of memory are required, and each CPU core must be at least 2.6 gigahertz (GHz) or faster. The allowable Transactions Per Second (TPS) are also listed.

Container Minimum host specs Recommended host specs Minimum TPS Maximum TPS
Language detection, key phrase extraction 1 core, 2GB memory 1 core, 4GB memory 15 30
Sentiment Analysis 1 core, 2GB memory 4 cores, 8GB memory 15 30
Text Analytics for health - 1 document/request 4 cores, 10GB memory 6 cores, 12GB memory 15 30
Text Analytics for health - 10 documents/request 6 cores, 16GB memory 8 cores, 20GB memory 15 30

CPU core and memory correspond to the --cpus and --memory settings, which are used as part of the docker run command.
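For example, the minimum Sentiment Analysis specs map directly onto those flags. A sketch, with the image path, tag, and credentials left as placeholders:

```shell
# Allocate 1 CPU core and 2 GB of memory, per the minimum specs above
docker run --rm -it -p 5000:5000 \
  --memory 2g --cpus 1 \
  <registry-path/repository:tag> \
  Eula=accept \
  Billing={ENDPOINT_URI} \
  ApiKey={API_KEY}
```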

Get the container image with docker pull


You can use the docker images command to list your downloaded container images. For example, the following command lists the ID, repository, and tag of each downloaded container image, formatted as a table:

docker images --format "table {{.ID}}\t{{.Repository}}\t{{.Tag}}"

IMAGE ID         REPOSITORY                TAG
<image-id>       <repository-path/name>    <tag-name>

Container images for Text Analytics are available on the Microsoft Container Registry.

Docker pull for the Sentiment Analysis v3 container

The Sentiment Analysis v3 container is available in several languages. To download the English container, use the command below.

docker pull mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment:3.0-en

To download the container for another language, replace en with one of the language codes below.

Text Analytics Container Language code
Chinese-Simplified zh-hans
Chinese-Traditional zh-hant
Dutch nl
English en
French fr
German de
Hindi hi
Italian it
Japanese ja
Korean ko
Norwegian (Bokmål) no
Portuguese (Brazil) pt-BR
Portuguese (Portugal) pt-PT
Spanish es
Turkish tr
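For instance, to pull the German container you would swap the language code into the image tag. A sketch, assuming the Sentiment Analysis image lives under mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment (verify the current path on MCR):

```shell
# Pull the German (de) Sentiment Analysis v3 container
docker pull mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment:3.0-de
```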

For a full description of available tags for the Text Analytics containers, see Docker Hub.

How to use the container

Once the container is on the host computer, use the following process to work with the container.

  1. Run the container, with the required billing settings.
  2. Query the container's prediction endpoint.

Run the container with docker run

Use the docker run command to run the containers. The container will continue to run until you stop it.


  • The docker commands in the following sections use the backslash, \, as a line continuation character. Replace or remove this based on your host operating system's requirements.
  • The Eula, Billing, and ApiKey options must be specified to run the container; otherwise, the container won't start. For more information, see Billing.
  • The sentiment analysis and language detection containers are generally available. The key phrase extraction container uses v2 of the API, and is in preview.

To run the Sentiment Analysis v3 container, execute the following docker run command. Replace the placeholders below with your own values:

Placeholder Value Format or example
{API_KEY} The key for your Text Analytics resource. You can find it on your resource's Key and endpoint page, on the Azure portal. xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
{ENDPOINT_URI} The endpoint for accessing the Text Analytics API. You can find it on your resource's Key and endpoint page, on the Azure portal. https://<your-custom-subdomain>.cognitiveservices.azure.com
docker run --rm -it -p 5000:5000 --memory 8g --cpus 1 \
mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment:3.0-en \
Eula=accept \
Billing={ENDPOINT_URI} \
ApiKey={API_KEY}

This command:

  • Runs a Sentiment Analysis container from the container image
  • Allocates one CPU core and 8 gigabytes (GB) of memory
  • Exposes TCP port 5000 and allocates a pseudo-TTY for the container
  • Automatically removes the container after it exits. The container image is still available on the host computer.

Run multiple containers on the same host

If you intend to run multiple containers with exposed ports, make sure to run each container with a different exposed port. For example, run the first container on port 5000 and the second container on port 5001.
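Sketching that with two Sentiment Analysis containers (the image path and credentials are placeholders):

```shell
# First container on host port 5000
docker run --rm -d -p 5000:5000 --memory 8g --cpus 1 \
  <registry-path/sentiment:tag> \
  Eula=accept Billing={ENDPOINT_URI} ApiKey={API_KEY}

# Second container on host port 5001; the in-container port stays 5000
docker run --rm -d -p 5001:5000 --memory 8g --cpus 1 \
  <registry-path/sentiment:tag> \
  Eula=accept Billing={ENDPOINT_URI} ApiKey={API_KEY}
```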

You can have this container and a different Azure Cognitive Services container running on the HOST together. You also can have multiple containers of the same Cognitive Services container running.

Query the container's prediction endpoint

The container provides REST-based query prediction endpoint APIs.

Use the host, http://localhost:5000, for container APIs.
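For example, a sentiment query against a local container might look like the following sketch; the v3.0 route and request shape mirror the cloud Text Analytics API, so adjust the version segment if your container exposes a different one:

```shell
curl -X POST "http://localhost:5000/text/analytics/v3.0/sentiment" \
  -H "Content-Type: application/json" \
  -d '{
        "documents": [
          { "id": "1", "language": "en", "text": "The service was wonderful." }
        ]
      }'
```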

Validate that a container is running

There are several ways to validate that the container is running. Locate the external IP address and exposed port of the container in question, and open your favorite web browser. Use the request URLs below to validate that the container is running. The example request URLs listed below use http://localhost:5000, but your specific container may vary; rely on your container's external IP address and exposed port.

Request URL Purpose
http://localhost:5000/ The container provides a home page.
http://localhost:5000/ready Requested with GET, this provides a verification that the container is ready to accept a query against the model. This request can be used for Kubernetes liveness and readiness probes.
http://localhost:5000/status Also requested with GET, this verifies if the api-key used to start the container is valid without causing an endpoint query. This request can be used for Kubernetes liveness and readiness probes.
http://localhost:5000/swagger The container provides a full set of documentation for the endpoints and a Try it out feature. With this feature, you can enter your settings into a web-based HTML form and make the query without having to write any code. After the query returns, an example CURL command is provided to demonstrate the HTTP headers and body format that's required.
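The same checks can be scripted. A minimal sketch against a local container:

```shell
# Expect an HTTP 200 status code once the model is ready to serve queries
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:5000/ready

# Verify the API key and billing connection without running a prediction
curl -s http://localhost:5000/status
```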


Stop the container

To shut down the container, in the command-line environment where the container is running, select Ctrl+C.
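If the container is running detached (started with the -d flag), you can stop it from another shell instead:

```shell
# Find the running container's ID, then stop it gracefully
docker ps
docker stop <container-id>
```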


If you run the container with an output mount and logging enabled, the container generates log files that are helpful to troubleshoot issues that happen while starting or running the container.
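One way to enable that is a bind mount for the container's /output directory plus a disk-logging setting. A sketch, assuming the Logging:Disk options supported by Cognitive Services containers (see Configure containers for the exact settings; the image path and credentials are placeholders):

```shell
# Bind-mount a host directory as /output and write JSON-formatted logs to it
docker run --rm -it -p 5000:5000 --memory 8g --cpus 1 \
  --mount type=bind,src=/host/output,target=/output \
  <registry-path/sentiment:tag> \
  Eula=accept Billing={ENDPOINT_URI} ApiKey={API_KEY} \
  Logging:Disk:Format=json
```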


For more troubleshooting information and guidance, see Cognitive Services containers frequently asked questions (FAQ).


The Text Analytics containers send billing information to Azure, using a Text Analytics resource on your Azure account.

Queries to the container are billed at the pricing tier of the Azure resource that's used for the ApiKey.

Azure Cognitive Services containers aren't licensed to run without being connected to the metering / billing endpoint. You must enable the containers to communicate billing information with the billing endpoint at all times. Cognitive Services containers don't send customer data, such as the image or text that's being analyzed, to Microsoft.

Connect to Azure

The container needs the billing argument values to run. These values allow the container to connect to the billing endpoint. The container reports usage about every 10 to 15 minutes. If the container doesn't connect to Azure within the allowed time window, the container continues to run but doesn't serve queries until the billing endpoint is restored. The connection is attempted 10 times at the same time interval of 10 to 15 minutes. If it can't connect to the billing endpoint within the 10 tries, the container stops serving requests. See the Cognitive Services container FAQ for an example of the information sent to Microsoft for billing.

Billing arguments

The docker run command will start the container when all three of the following options are provided with valid values:

Option Description
ApiKey The API key of the Cognitive Services resource that's used to track billing information.
The value of this option must be set to an API key for the provisioned resource that's specified in Billing.
Billing The endpoint of the Cognitive Services resource that's used to track billing information.
The value of this option must be set to the endpoint URI of a provisioned Azure resource.
Eula Indicates that you accepted the license for the container.
The value of this option must be set to accept.

For more information about these options, see Configure containers.


In this article, you learned concepts and workflow for downloading, installing, and running Text Analytics containers. In summary:

  • Text Analytics provides the following Linux containers for Docker, encapsulating various capabilities:
    • Sentiment Analysis
    • Key Phrase Extraction (preview)
    • Language Detection
    • Text Analytics for health (preview)
  • Container images are downloaded from the Microsoft Container Registry (MCR) or preview container repository.
  • Container images run in Docker.
  • You can use either the REST API or SDK to call operations in Text Analytics containers by specifying the host URI of the container.
  • You must specify billing information when instantiating a container.


Cognitive Services containers are not licensed to run without being connected to Azure for metering. Customers need to enable the containers to communicate billing information with the metering service at all times. Cognitive Services containers do not send customer data (e.g. text that is being analyzed) to Microsoft.

Next steps