Access Azure resources from an online endpoint with a managed identity

APPLIES TO: Azure CLI ml extension v2 (current) Python SDK azure-ai-ml v2 (current)

Learn how to access Azure resources from your scoring script with an online endpoint and either a system-assigned managed identity or a user-assigned managed identity.

Both managed online endpoints and Kubernetes online endpoints let Azure Machine Learning handle the burden of provisioning your compute resources and deploying your machine learning model. Typically, your model needs to access Azure resources such as the Azure Container Registry or your blob storage for inferencing. With a managed identity, you can access these resources without needing to manage credentials in your code. Learn more about managed identities.

This guide assumes you don't have a managed identity, a storage account, or an online endpoint. If you already have these components, skip to the Give access permission to the managed identity section.

Prerequisites

  • To use Azure Machine Learning, you must have an Azure subscription. If you don't have an Azure subscription, create a free account before you begin. Try the free or paid version of Azure Machine Learning today.

  • Install and configure the Azure CLI and ML (v2) extension. For more information, see Install, set up, and use the 2.0 CLI.

  • An Azure resource group, in which you (or the service principal you use) need to have User Access Administrator and Contributor access. You have such a resource group if you configured your ML extension per the preceding article.

  • An Azure Machine Learning workspace. You already have a workspace if you configured your ML extension per the preceding article.

  • A trained machine learning model ready for scoring and deployment. If you're following along with the sample, a model is provided.

  • If you haven't already set the defaults for the Azure CLI, save your default settings. To avoid passing in the values for your subscription, workspace, and resource group multiple times, run this code:

    az account set --subscription <subscription ID>
    az configure --defaults workspace=<Azure Machine Learning workspace name> group=<resource group>
    
  • To follow along with the sample, clone the samples repository and then change directory to cli.

    git clone https://github.com/Azure/azureml-examples --depth 1
    cd azureml-examples/cli
    

Limitations

  • The identity for an endpoint is immutable. During endpoint creation, you can associate it with a system-assigned identity (default) or a user-assigned identity. You can't change the identity after the endpoint is created.
  • If your Azure Container Registry (ACR) and blob storage are configured as private, that is, behind a virtual network, access from the Kubernetes endpoint should be over the private link regardless of whether your workspace is public or private. For more details about the private link setting, see How to secure workspace vnet.

Configure variables for deployment

Configure the variable names for the workspace, workspace location, and the endpoint you want to create for use with your deployment.

The following code exports these values as environment variables:

export WORKSPACE="<WORKSPACE_NAME>"
export LOCATION="<WORKSPACE_LOCATION>"
export ENDPOINT_NAME="<ENDPOINT_NAME>"

Next, specify what you want to name your blob storage account, blob container, and file. These variables are defined here and are referenced by the az storage account create and az storage container create commands in the next section.

The following code exports those values as environment variables:

export STORAGE_ACCOUNT_NAME="<BLOB_STORAGE_TO_ACCESS>"
export STORAGE_CONTAINER_NAME="<CONTAINER_TO_ACCESS>"
export FILE_NAME="<FILE_TO_ACCESS>"

After these variables are exported, create a text file locally. When the endpoint is deployed, the scoring script accesses this text file using the system-assigned managed identity that's generated upon endpoint creation.
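
If the sample text file isn't already present in your clone of the samples repository, you can create one yourself. This is a minimal sketch; it assumes the same relative path that the upload command later in this article uses, and the file contents are arbitrary:

echo "This is a sample file for the managed identity example." > endpoints/online/managed/managed-identities/hello.txt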

Define the deployment configuration

To deploy an online endpoint with the CLI, you need to define the configuration in a YAML file. For more information on the YAML schema, see the online endpoint YAML reference document.

The YAML files in the following examples are used to create online endpoints.

The following YAML example is located at endpoints/online/managed/managed-identities/1-sai-create-endpoint.yml. The file:

  • Defines the name by which you want to refer to the endpoint, my-sai-endpoint.
  • Specifies the type of authorization to use to access the endpoint, auth_mode: key.
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineEndpoint.schema.json
name: my-sai-endpoint
auth_mode: key

This YAML example, 2-sai-deployment.yml:

  • Specifies that the type of endpoint you want to create is an online endpoint.
  • Indicates that the endpoint has an associated deployment called blue.
  • Configures the details of the deployment, such as which model to deploy and which environment and scoring script to use.
$schema: https://azuremlschemas.azureedge.net/latest/managedOnlineDeployment.schema.json
name: blue
model:
  path: ../../model-1/model/
code_configuration:
  code: ../../model-1/onlinescoring/
  scoring_script: score_managedidentity.py
environment:
  conda_file: ../../model-1/environment/conda-managedidentity.yaml
  image: mcr.microsoft.com/azureml/openmpi4.1.0-ubuntu20.04:latest
instance_type: Standard_DS3_v2
instance_count: 1
environment_variables:
  STORAGE_ACCOUNT_NAME: "storage_place_holder"
  STORAGE_CONTAINER_NAME: "container_place_holder"
  FILE_NAME: "file_place_holder"

Create the managed identity

To access Azure resources, create a system-assigned or user-assigned managed identity for your online endpoint.

When you create an online endpoint, a system-assigned managed identity is automatically generated for you, so you don't need to create a separate one.
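
If you want to use a user-assigned managed identity instead, you must create the identity yourself before creating the endpoint. A minimal sketch, where <UAI_NAME> is a placeholder name of your choosing (the rest of this article follows the system-assigned path):

az identity create --name <UAI_NAME> --resource-group <resource group>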

Create storage account and container

For this example, create a blob storage account and blob container, and then upload the previously created text file to the blob container. Later, you give the online endpoint's managed identity access to this storage account and blob container.

First, create a storage account.

az storage account create --name $STORAGE_ACCOUNT_NAME --location $LOCATION

Next, create the blob container in the storage account.

az storage container create --account-name $STORAGE_ACCOUNT_NAME --name $STORAGE_CONTAINER_NAME

Then, upload your text file to the blob container.

az storage blob upload --account-name $STORAGE_ACCOUNT_NAME --container-name $STORAGE_CONTAINER_NAME --name $FILE_NAME --file endpoints/online/managed/managed-identities/hello.txt
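
To confirm the upload, you can optionally list the blobs in the container. This is a quick sanity check using the same variables (depending on your setup, you might need to supply credentials or add --auth-mode login):

az storage blob list --account-name $STORAGE_ACCOUNT_NAME --container-name $STORAGE_CONTAINER_NAME --output table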

Create an online endpoint

The following code creates an online endpoint without specifying a deployment.

Warning

The identity for an endpoint is immutable. During endpoint creation, you can associate it with a system-assigned identity (default) or a user-assigned identity. You can't change the identity after the endpoint has been created.

When you create an online endpoint, a system-assigned managed identity is created for the endpoint by default.

az ml online-endpoint create --name $ENDPOINT_NAME -f endpoints/online/managed/managed-identities/1-sai-create-endpoint.yml

Check the status of the endpoint with the following command.

az ml online-endpoint show --name $ENDPOINT_NAME
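
If you only want the provisioning state rather than the full endpoint definition, one possible query is:

az ml online-endpoint show --name $ENDPOINT_NAME --query "provisioning_state" -o tsv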

If you encounter any issues, see Troubleshooting online endpoints deployment and scoring.

Give access permission to the managed identity

Important

Online endpoints require Azure Container Registry pull (AcrPull) permission to the container registry and Storage Blob Data Reader permission to the default datastore of the workspace.

You can give the online endpoint permission to access your storage through its system-assigned managed identity, or you can give a user-assigned managed identity permission to access the storage account created in the previous section. This article uses the endpoint's system-assigned managed identity.

Retrieve the system-assigned managed identity that was created for your endpoint.

system_identity=`az ml online-endpoint show --name $ENDPOINT_NAME --query "identity.principal_id" -o tsv`
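
The role assignment in the next step is scoped to the storage account's resource ID. If you haven't already captured it, one way to store it in the storage_id variable used below (assuming the storage account you created earlier) is:

storage_id=`az storage account show --name $STORAGE_ACCOUNT_NAME --query "id" -o tsv`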

From here, you can give the system-assigned managed identity permission to access your storage.

az role assignment create --assignee-object-id $system_identity --assignee-principal-type ServicePrincipal --role "Storage Blob Data Reader" --scope $storage_id
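
Role assignments can take a few minutes to propagate. To confirm that the assignment exists, you can optionally list the role assignments at the storage account scope:

az role assignment list --assignee $system_identity --scope $storage_id --output table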

Scoring script to access Azure resource

Refer to the following script to understand how to use your identity token to access Azure resources, in this case the storage account created in the previous sections.

import os
import logging
import json
import numpy
import joblib
import requests
from azure.identity import ManagedIdentityCredential
from azure.storage.blob import BlobClient


def access_blob_storage_sdk():
    credential = ManagedIdentityCredential(client_id=os.getenv("UAI_CLIENT_ID"))
    storage_account = os.getenv("STORAGE_ACCOUNT_NAME")
    storage_container = os.getenv("STORAGE_CONTAINER_NAME")
    file_name = os.getenv("FILE_NAME")

    blob_client = BlobClient(
        account_url=f"https://{storage_account}.blob.core.windows.net/",
        container_name=storage_container,
        blob_name=file_name,
        credential=credential,
    )
    blob_contents = blob_client.download_blob().content_as_text()
    logging.info(f"Blob contains: {blob_contents}")


def get_token_rest():
    """
    Retrieve an access token via REST.
    """

    access_token = None
    msi_endpoint = os.environ.get("MSI_ENDPOINT", None)
    msi_secret = os.environ.get("MSI_SECRET", None)

    # If UAI_CLIENT_ID is provided, assume that the endpoint was created with a user-assigned identity;
    # otherwise, assume a system-assigned identity deployment.
    client_id = os.environ.get("UAI_CLIENT_ID", None)
    if client_id is not None:
        token_url = (
            msi_endpoint + f"?clientid={client_id}&resource=https://storage.azure.com/"
        )
    else:
        token_url = msi_endpoint + "?resource=https://storage.azure.com/"

    logging.info("Trying to get identity token...")
    headers = {"secret": msi_secret, "Metadata": "true"}
    resp = requests.get(token_url, headers=headers)
    resp.raise_for_status()
    access_token = resp.json()["access_token"]
    logging.info("Retrieved token successfully.")
    return access_token


def access_blob_storage_rest():
    """
    Access a blob via REST.
    """

    logging.info("Trying to access blob storage...")
    storage_account = os.environ.get("STORAGE_ACCOUNT_NAME")
    storage_container = os.environ.get("STORAGE_CONTAINER_NAME")
    file_name = os.environ.get("FILE_NAME")
    logging.info(
        f"storage_account: {storage_account}, container: {storage_container}, filename: {file_name}"
    )
    token = get_token_rest()

    blob_url = f"https://{storage_account}.blob.core.windows.net/{storage_container}/{file_name}?api-version=2019-04-01"
    auth_headers = {
        "Authorization": f"Bearer {token}",
        "x-ms-blob-type": "BlockBlob",
        "x-ms-version": "2019-02-02",
    }
    resp = requests.get(blob_url, headers=auth_headers)
    resp.raise_for_status()
    logging.info(f"Blob contains: {resp.text}")


def init():
    global model
    # AZUREML_MODEL_DIR is an environment variable created during deployment.
    # It is the path to the model folder (./azureml-models/$MODEL_NAME/$VERSION)
    # For multiple models, it points to the folder containing all deployed models (./azureml-models)
    # Please provide your model's folder name if there is one
    model_path = os.path.join(
        os.getenv("AZUREML_MODEL_DIR"), "model/sklearn_regression_model.pkl"
    )
    # deserialize the model file back into a sklearn model
    model = joblib.load(model_path)
    logging.info("Model loaded")

    # Access Azure resource (Blob storage) using system assigned identity token
    access_blob_storage_rest()
    access_blob_storage_sdk()

    logging.info("Init complete")


# note you can pass in multiple rows for scoring
def run(raw_data):
    logging.info("Request received")
    data = json.loads(raw_data)["data"]
    data = numpy.array(data)
    result = model.predict(data)
    logging.info("Request processed")
    return result.tolist()

Create a deployment with your configuration

Create a deployment that's associated with the online endpoint. Learn more about deploying to online endpoints.

Warning

This deployment can take approximately 8 to 14 minutes, depending on whether the underlying environment or image is being built for the first time. Subsequent deployments that use the same environment complete more quickly.

az ml online-deployment create --endpoint-name $ENDPOINT_NAME --all-traffic --name blue --file endpoints/online/managed/managed-identities/2-sai-deployment.yml --set environment_variables.STORAGE_ACCOUNT_NAME=$STORAGE_ACCOUNT_NAME environment_variables.STORAGE_CONTAINER_NAME=$STORAGE_CONTAINER_NAME environment_variables.FILE_NAME=$FILE_NAME

Note

The value of the --name argument may override the name key inside the YAML file.

Check the status of the deployment.

az ml online-deployment show --endpoint-name $ENDPOINT_NAME --name blue

To refine the preceding query to return only specific data, see Query Azure CLI command output.
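
For example, one way to return only the deployment's provisioning state is:

az ml online-deployment show --endpoint-name $ENDPOINT_NAME --name blue --query "provisioning_state" -o tsv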

Note

The init method in the scoring script reads the file from your storage account using the system-assigned managed identity token.

To check the init method output, see the deployment log with the following command.

# Check deployment logs to confirm blob storage file contents read operation success.
az ml online-deployment get-logs --endpoint-name $ENDPOINT_NAME --name blue

When your deployment completes, the model, the environment, and the endpoint are registered to your Azure Machine Learning workspace.

Test the endpoint

Once your online endpoint is deployed, test and confirm its operation with a request. Details of inferencing vary from model to model. For this guide, the JSON request body looks like:

{"data": [
    [1,2,3,4,5,6,7,8,9,10], 
    [10,9,8,7,6,5,4,3,2,1]
]}

To call your endpoint, run:

az ml online-endpoint invoke --name $ENDPOINT_NAME --request-file endpoints/online/model-1/sample-request.json
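
You can also call the endpoint outside the CLI by retrieving the scoring URI and a key and then sending the same request with curl. This is a sketch that assumes key authentication, as configured in the endpoint YAML:

scoring_uri=$(az ml online-endpoint show --name $ENDPOINT_NAME --query "scoring_uri" -o tsv)
primary_key=$(az ml online-endpoint get-credentials --name $ENDPOINT_NAME --query "primaryKey" -o tsv)
curl --request POST "$scoring_uri" --header "Authorization: Bearer $primary_key" --header "Content-Type: application/json" --data @endpoints/online/model-1/sample-request.json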

Delete the endpoint and storage account

If you don't plan to continue using the deployed online endpoint and storage, delete them to reduce costs. When you delete the endpoint, all of its associated deployments are deleted as well.

az ml online-endpoint delete --name $ENDPOINT_NAME --yes
az storage account delete --name $STORAGE_ACCOUNT_NAME --yes