Tutorial: Upload, encode, and stream videos with Media Services v3


Even though this tutorial uses .NET SDK examples, the general steps are the same for REST API, CLI, or other supported SDKs.

Azure Media Services lets you encode your media files into formats that play on a wide variety of browsers and devices. For example, you might want to stream your content in Apple's HLS or MPEG DASH formats. Before streaming, you should encode your high-quality digital media file. For help with encoding, see Encoding concept. This tutorial uploads a local video file and encodes the uploaded file. You can also encode content that you make accessible via an HTTPS URL. For more information, see Create a job input from an HTTP(s) URL.

Play a video with Azure Media Player

This tutorial shows you how to:

  • Download the sample app described in the topic.
  • Examine the code that uploads, encodes, and streams.
  • Run the app.
  • Test the streaming URL.
  • Clean up resources.

If you don't have an Azure subscription, create a free account before you begin.


Download and set up the sample

Clone a GitHub repository that has the streaming .NET sample to your machine using the following command:

git clone https://github.com/Azure-Samples/media-services-v3-dotnet-tutorials.git

The sample is located in the UploadEncodeAndStreamFiles folder.

Open appsettings.json in your downloaded project. Replace the values with credentials that you got from accessing APIs.
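For reference, a typical appsettings.json for these samples has a shape similar to the following. The key names and endpoint values here are illustrative placeholders drawn from common configurations of this sample, so check them against the file in your clone before relying on them:

```json
{
  "AadClientId": "00000000-0000-0000-0000-000000000000",
  "AadSecret": "<your-aad-application-secret>",
  "AadTenantId": "00000000-0000-0000-0000-000000000000",
  "AccountName": "<your-media-services-account-name>",
  "ResourceGroup": "<your-resource-group-name>",
  "SubscriptionId": "00000000-0000-0000-0000-000000000000",
  "ArmAadAudience": "https://management.core.windows.net/",
  "AadEndpoint": "https://login.microsoftonline.com",
  "ArmEndpoint": "https://management.azure.com/"
}
```

The GUID-like values come from the Azure AD app registration and subscription that you set up when getting API access credentials.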

Examine the code that uploads, encodes, and streams

This section examines functions defined in the Program.cs file of the UploadEncodeAndStreamFiles project.

The sample performs the following actions:

  1. Creates a new Transform (first, checks if the specified Transform exists).
  2. Creates an output Asset that's used as the encoding Job's output.
  3. Creates an input Asset and uploads the specified local video file into it. The asset is used as the job's input.
  4. Submits the encoding job using the input and output that were created.
  5. Checks the job's status.
  6. Creates a Streaming Locator.
  7. Builds streaming URLs.

Start using Media Services APIs with .NET SDK

To start using Media Services APIs with .NET, you need to create an AzureMediaServicesClient object. To create the object, you must supply the credentials the client needs to connect to Azure using Azure AD. In the code you cloned at the beginning of the article, the GetCredentialsAsync function creates the ServiceClientCredentials object based on the credentials supplied in the local configuration file.

private static async Task<IAzureMediaServicesClient> CreateMediaServicesClientAsync(ConfigWrapper config)
{
    var credentials = await GetCredentialsAsync(config);

    return new AzureMediaServicesClient(config.ArmEndpoint, credentials)
    {
        SubscriptionId = config.SubscriptionId,
    };
}

Create an input asset and upload a local file into it

The CreateInputAsset function creates a new input Asset and uploads the specified local video file into it. This Asset is used as the input to your encoding job. In Media Services v3, the input to a Job can either be an Asset or content that you make available to your Media Services account via HTTPS URLs. To learn how to encode from an HTTPS URL, see this article.

In Media Services v3, you use Azure Storage APIs to upload files. The following .NET snippet shows how.

The following function performs these actions:

  • Creates an Asset.

  • Gets a writable SAS URL to the asset’s container in storage.

    If you use the asset's ListContainerSas function to get SAS URLs, note that the function returns multiple SAS URLs because there are two storage account keys for each storage account. A storage account has two keys to allow for seamless key rotation (for example, change one key while using the other, then start using the new key and rotate the other). The first SAS URL corresponds to storage key1, and the second to storage key2.

  • Uploads the file into the container in storage using the SAS URL.

private static async Task<Asset> CreateInputAssetAsync(
    IAzureMediaServicesClient client,
    string resourceGroupName,
    string accountName,
    string assetName,
    string fileToUpload)
{
    // In this example, we are assuming that the asset name is unique.
    // If you already have an asset with the desired name, use the Assets.Get method
    // to get the existing asset. In Media Services v3, the Get method on entities returns null
    // if the entity doesn't exist (a case-insensitive check on the name).

    // Call Media Services API to create an Asset.
    // This method creates a container in storage for the Asset.
    // The files (blobs) associated with the asset will be stored in this container.
    Asset asset = await client.Assets.CreateOrUpdateAsync(resourceGroupName, accountName, assetName, new Asset());

    // Use Media Services API to get back a response that contains
    // SAS URL for the Asset container into which to upload blobs.
    // That is where you would specify read-write permissions
    // and the expiration time for the SAS URL.
    var response = await client.Assets.ListContainerSasAsync(
        resourceGroupName,
        accountName,
        assetName,
        permissions: AssetContainerPermission.ReadWrite,
        expiryTime: DateTime.UtcNow.AddHours(4).ToUniversalTime());

    var sasUri = new Uri(response.AssetContainerSasUrls.First());

    // Use Storage API to get a reference to the Asset container
    // that was created by calling Asset's CreateOrUpdate method.
    CloudBlobContainer container = new CloudBlobContainer(sasUri);
    var blob = container.GetBlockBlobReference(Path.GetFileName(fileToUpload));

    // Use Storage API to upload the file into the container in storage.
    await blob.UploadFromFileAsync(fileToUpload);

    return asset;
}

Create an output asset to store the result of a job

The output Asset stores the result of your encoding job. The project defines the DownloadResults function that downloads the results from this output asset into the "output" folder, so you can see what you got.

private static async Task<Asset> CreateOutputAssetAsync(IAzureMediaServicesClient client, string resourceGroupName, string accountName, string assetName)
{
    // Check if an Asset already exists
    Asset outputAsset = await client.Assets.GetAsync(resourceGroupName, accountName, assetName);
    Asset asset = new Asset();
    string outputAssetName = assetName;

    if (outputAsset != null)
    {
        // Name collision! In order to get the sample to work, let's just go ahead and create a unique asset name
        // Note that the returned Asset can have a different name than the one specified as an input parameter.
        // You may want to update this part to throw an Exception instead, and handle name collisions differently.
        string uniqueness = $"-{Guid.NewGuid().ToString("N")}";
        outputAssetName += uniqueness;
        Console.WriteLine("Warning: found an existing Asset with name = " + assetName);
        Console.WriteLine("Creating an Asset with this name instead: " + outputAssetName);
    }

    return await client.Assets.CreateOrUpdateAsync(resourceGroupName, accountName, outputAssetName, asset);
}

Create a Transform and a Job that encodes the uploaded file

When encoding or processing content in Media Services, it's a common pattern to set up the encoding settings as a recipe. You would then submit a Job to apply that recipe to a video. By submitting new jobs for each new video, you're applying that recipe to all the videos in your library. A recipe in Media Services is called a Transform. For more information, see Transforms and Jobs. The sample described in this tutorial defines a recipe that encodes the video in order to stream it to a variety of iOS and Android devices.


When creating a new Transform instance, you need to specify what you want it to produce as an output. The required parameter is a TransformOutput object, as shown in the code below. Each TransformOutput contains a Preset. A Preset describes the step-by-step instructions for the video and/or audio processing operations used to generate the desired TransformOutput. The sample described in this article uses a built-in Preset called AdaptiveStreaming. The Preset encodes the input video into an auto-generated bitrate ladder (bitrate-resolution pairs) based on the input resolution and bitrate, and produces ISO MP4 files with H.264 video and AAC audio corresponding to each bitrate-resolution pair. For information about this Preset, see auto-generating bitrate ladder.

You can use a built-in EncoderNamedPreset or use custom presets. For more information, see How to customize encoder presets.

When creating a Transform, you should first check if one already exists using the Get method, as shown in the code that follows. In Media Services v3, Get methods on entities return null if the entity doesn’t exist (a case-insensitive check on the name).

private static async Task<Transform> GetOrCreateTransformAsync(
    IAzureMediaServicesClient client,
    string resourceGroupName,
    string accountName,
    string transformName)
{
    // Does a Transform already exist with the desired name? Assume that an existing Transform with the desired name
    // also uses the same recipe or Preset for processing content.
    Transform transform = await client.Transforms.GetAsync(resourceGroupName, accountName, transformName);

    if (transform == null)
    {
        // You need to specify what you want it to produce as an output
        TransformOutput[] output = new TransformOutput[]
        {
            new TransformOutput
            {
                // The preset for the Transform is set to one of Media Services built-in sample presets.
                // You can customize the encoding settings by changing this to use the "StandardEncoderPreset" class.
                Preset = new BuiltInStandardEncoderPreset()
                {
                    // This sample uses the built-in encoding preset for Adaptive Bitrate Streaming.
                    PresetName = EncoderNamedPreset.AdaptiveStreaming
                }
            }
        };

        // Create the Transform with the output defined above
        transform = await client.Transforms.CreateOrUpdateAsync(resourceGroupName, accountName, transformName, output);
    }

    return transform;
}


As mentioned above, the Transform object is the recipe and a Job is the actual request to Media Services to apply that Transform to a given input video or audio content. The Job specifies information like the location of the input video, and the location for the output.

In this example, the input video has been uploaded from your local machine. If you want to learn how to encode from an HTTPS URL, see this article.

private static async Task<Job> SubmitJobAsync(IAzureMediaServicesClient client,
    string resourceGroupName,
    string accountName,
    string transformName,
    string jobName,
    string inputAssetName,
    string outputAssetName)
{
    // Use the name of the created input asset to create the job input.
    JobInput jobInput = new JobInputAsset(assetName: inputAssetName);

    JobOutput[] jobOutputs =
    {
        new JobOutputAsset(outputAssetName),
    };

    // In this example, we are assuming that the job name is unique.
    // If you already have a job with the desired name, use the Jobs.Get method
    // to get the existing job. In Media Services v3, the Get method on entities returns null
    // if the entity doesn't exist (a case-insensitive check on the name).
    Job job = await client.Jobs.CreateAsync(
        resourceGroupName,
        accountName,
        transformName,
        jobName,
        new Job
        {
            Input = jobInput,
            Outputs = jobOutputs,
        });

    return job;
}

Wait for the Job to complete

The job takes some time to complete, and when it does, you want to be notified. The code sample below shows how to poll the service for the status of the Job. Polling isn't a recommended best practice for production apps because of potential latency, and it can be throttled if overused on an account. Developers should instead use Event Grid.

Event Grid is designed for high availability, consistent performance, and dynamic scale. With Event Grid, your apps can listen for and react to events from virtually all Azure services, as well as custom sources. Simple, HTTP-based reactive event handling helps you build efficient solutions through intelligent filtering and routing of events. See Route events to a custom web endpoint.

The Job usually goes through the following states: Scheduled, Queued, Processing, Finished (the final state). If the job has encountered an error, you get the Error state. If the job is in the process of being canceled, you get Canceling and Canceled when it's done.

private static async Task<Job> WaitForJobToFinishAsync(IAzureMediaServicesClient client,
    string resourceGroupName,
    string accountName,
    string transformName,
    string jobName)
{
    const int SleepIntervalMs = 20 * 1000;

    Job job = null;

    do
    {
        job = await client.Jobs.GetAsync(resourceGroupName, accountName, transformName, jobName);

        Console.WriteLine($"Job is '{job.State}'.");
        for (int i = 0; i < job.Outputs.Count; i++)
        {
            JobOutput output = job.Outputs[i];
            Console.Write($"\tJobOutput[{i}] is '{output.State}'.");
            if (output.State == JobState.Processing)
            {
                Console.Write($"  Progress: '{output.Progress}'.");
            }

            Console.WriteLine();
        }

        if (job.State != JobState.Finished && job.State != JobState.Error && job.State != JobState.Canceled)
        {
            await Task.Delay(SleepIntervalMs);
        }
    }
    while (job.State != JobState.Finished && job.State != JobState.Error && job.State != JobState.Canceled);

    return job;
}

Job error codes

See Error codes.

Get a Streaming Locator

After the encoding is complete, the next step is to make the video in the output Asset available to clients for playback. You can make it available in two steps: first, create a Streaming Locator, and second, build the streaming URLs that clients can use.

The process of creating a Streaming Locator is called publishing. By default, the Streaming Locator is valid immediately after you make the API calls, and lasts until it's deleted, unless you configure the optional start and end times.

When creating a StreamingLocator, you'll need to specify the desired StreamingPolicyName. In this example, you'll be streaming in-the-clear (or non-encrypted content) so the predefined clear streaming policy (PredefinedStreamingPolicy.ClearStreamingOnly) is used.


When using a custom Streaming Policy, you should design a limited set of such policies for your Media Services account, and reuse them for your StreamingLocators whenever the same encryption options and protocols are needed. Your Media Services account has a quota for the number of Streaming Policy entries, so you shouldn't create a new Streaming Policy for each Streaming Locator.

The following code assumes that you're calling the function with a unique locatorName.

private static async Task<StreamingLocator> CreateStreamingLocatorAsync(
    IAzureMediaServicesClient client,
    string resourceGroup,
    string accountName,
    string assetName,
    string locatorName)
{
    StreamingLocator locator = await client.StreamingLocators.CreateAsync(
        resourceGroup,
        accountName,
        locatorName,
        new StreamingLocator
        {
            AssetName = assetName,
            StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly
        });

    return locator;
}

While the sample in this topic discusses streaming, you can use the same call to create a Streaming Locator for delivering video via progressive download.

Get streaming URLs

Now that the Streaming Locator has been created, you can get the streaming URLs, as shown in GetStreamingURLs. To build a URL, you need to concatenate the Streaming Endpoint host name and the Streaming Locator path. In this sample, the default Streaming Endpoint is used. When you first create a Media Services account, this default Streaming Endpoint is in a stopped state, so you need to call Start.


In this method, you need the locatorName that was used when creating the Streaming Locator for the output Asset.

private static async Task<IList<string>> GetStreamingUrlsAsync(
    IAzureMediaServicesClient client,
    string resourceGroupName,
    string accountName,
    string locatorName)
{
    const string DefaultStreamingEndpointName = "default";

    IList<string> streamingUrls = new List<string>();

    StreamingEndpoint streamingEndpoint = await client.StreamingEndpoints.GetAsync(resourceGroupName, accountName, DefaultStreamingEndpointName);

    if (streamingEndpoint != null)
    {
        if (streamingEndpoint.ResourceState != StreamingEndpointResourceState.Running)
        {
            await client.StreamingEndpoints.StartAsync(resourceGroupName, accountName, DefaultStreamingEndpointName);
        }
    }

    ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(resourceGroupName, accountName, locatorName);

    foreach (StreamingPath path in paths.StreamingPaths)
    {
        UriBuilder uriBuilder = new UriBuilder();
        uriBuilder.Scheme = "https";
        uriBuilder.Host = streamingEndpoint.HostName;

        uriBuilder.Path = path.Paths[0];
        streamingUrls.Add(uriBuilder.ToString());
    }

    return streamingUrls;
}

Clean up resources in your Media Services account

Generally, you should clean up everything except objects that you're planning to reuse (typically, you'll reuse Transforms, and you'll persist StreamingLocators, etc.). If you want your account to be clean after experimenting, delete the resources that you don't plan to reuse. For example, the following code deletes Jobs:

private static async Task CleanUpAsync(
    IAzureMediaServicesClient client,
    string resourceGroupName,
    string accountName,
    string transformName,
    string contentKeyPolicyName,
    List<string> assetNames,
    string jobName)
{
    await client.Jobs.DeleteAsync(resourceGroupName, accountName, transformName, jobName);

    foreach (var assetName in assetNames)
    {
        await client.Assets.DeleteAsync(resourceGroupName, accountName, assetName);
    }

    await client.ContentKeyPolicies.DeleteAsync(resourceGroupName, accountName, contentKeyPolicyName);
}

Run the sample app

  1. Press Ctrl+F5 to run the UploadEncodeAndStreamFiles app.
  2. Copy one of the streaming URLs from the console.

This example displays URLs that can be used to play back the video using different protocols:

Example output showing URLs for Media Services streaming video
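The exact console output isn't reproduced here, but the streaming URLs follow a predictable pattern, one per protocol. The host, locator ID, and file name below are hypothetical placeholders standing in for your Streaming Endpoint host name, Streaming Locator ID, and source file name:

```
https://<endpoint-host>/<locator-id>/<file-name>.ism/manifest(format=m3u8-aapl)    (HLS)
https://<endpoint-host>/<locator-id>/<file-name>.ism/manifest(format=mpd-time-csf) (MPEG-DASH)
https://<endpoint-host>/<locator-id>/<file-name>.ism/manifest                      (Smooth Streaming)
```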

Test the streaming URL

To test the stream, this article uses Azure Media Player.


If a player is hosted on an https site, make sure to update the URL to "https".

  1. Open a web browser and navigate to https://aka.ms/azuremediaplayer/.
  2. In the URL: box, paste one of the streaming URL values you got when you ran the app.
  3. Select Update Player.

Azure Media Player can be used for testing but shouldn't be used in a production environment.

Clean up resources

If you no longer need any of the resources in your resource group, including the Media Services and storage accounts you created for this tutorial, delete the resource group you created earlier.

Execute the following CLI command:

az group delete --name amsResourceGroup


The Azure Media Services v3 SDKs aren't thread-safe. When developing a multi-threaded app, you should generate and use a new AzureMediaServicesClient object per thread.

Ask questions, give feedback, get updates

Check out the Azure Media Services community article to see different ways you can ask questions, give feedback, and get updates about Media Services.

Next steps

Now that you know how to upload, encode, and stream your video, see the following article: