Signal descriptive audio tracks
Having trouble? See the Troubleshooting guide for solutions to issues with using Media Services.
Code samples can be found on the Samples page.
You can add a narration track to your video to help visually impaired clients follow the recording by listening to the narration. In Media Services v3, you signal descriptive audio tracks by annotating the audio track in the manifest file.
This article shows how to encode a video, upload an audio-only MP4 file (AAC codec) containing descriptive audio into the output asset, and edit the .ism file to include the descriptive audio.
Prerequisites
- Create a Media Services account.
- Follow the steps in Access Azure Media Services API with the Azure CLI and save the credentials. You will need to use them to access the API.
- Review Dynamic packaging.
- Review the Upload, encode, and stream videos tutorial.
Create an input asset and upload a local file into it
The CreateInputAsset function creates a new input asset and uploads the specified local video file into it. This asset is used as the input to your encoding job. In Media Services v3, the input to a job can either be an asset, or it can be content that you make available to your Media Services account via HTTPS URLs.
In Media Services v3, you use Azure Storage APIs to upload files. The following .NET snippet shows how.
The following function performs these actions:
- Creates an Asset
- Gets a writable SAS URL to the asset’s container in storage
- Uploads the file into the container in storage using the SAS URL
private static async Task<Asset> CreateInputAssetAsync(
IAzureMediaServicesClient client,
string resourceGroupName,
string accountName,
string assetName,
string fileToUpload)
{
// In this example, we are assuming that the asset name is unique.
//
// If you already have an asset with the desired name, use the Assets.Get method
// to get the existing asset. In Media Services v3, the Get method on entities returns null
// if the entity doesn't exist (a case-insensitive check on the name).
// Call Media Services API to create an Asset.
// This method creates a container in storage for the Asset.
// The files (blobs) associated with the asset will be stored in this container.
Asset asset = await client.Assets.CreateOrUpdateAsync(resourceGroupName, accountName, assetName, new Asset());
// Use Media Services API to get back a response that contains
// SAS URL for the Asset container into which to upload blobs.
// That is where you would specify read-write permissions
// and the expiration time for the SAS URL.
var response = await client.Assets.ListContainerSasAsync(
resourceGroupName,
accountName,
assetName,
permissions: AssetContainerPermission.ReadWrite,
expiryTime: DateTime.UtcNow.AddHours(4).ToUniversalTime());
var sasUri = new Uri(response.AssetContainerSasUrls.First());
// Use Storage API to get a reference to the Asset container
// that was created by calling Asset's CreateOrUpdate method.
BlobContainerClient container = new BlobContainerClient(sasUri);
BlobClient blob = container.GetBlobClient(Path.GetFileName(fileToUpload));
// Use Storage API to upload the file into the container in storage.
await blob.UploadAsync(fileToUpload);
return asset;
}
If you need to pass the name of the created input asset to other methods, make sure to use the Name property on the asset object returned from CreateInputAssetAsync, for example, inputAsset.Name.
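For example, a call to the function above might look like the following sketch. The client and config objects follow the pattern of the other calls in this article; the asset name and local file path are placeholders:

```csharp
// Hypothetical call; "input-myvideo" and "ignite.mp4" are placeholder names.
Asset inputAsset = await CreateInputAssetAsync(
    client, config.ResourceGroup, config.AccountName, "input-myvideo", "ignite.mp4");
Console.WriteLine($"Created input asset: {inputAsset.Name}");
```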
Create an output asset to store the result of the encoding job
The output Asset stores the result of your encoding job. The following function shows how to create an output asset.
private static async Task<Asset> CreateOutputAssetAsync(IAzureMediaServicesClient client, string resourceGroupName, string accountName, string assetName)
{
Asset outputAsset = null;
try
{
// Check if an Asset already exists.
outputAsset = await client.Assets.GetAsync(resourceGroupName, accountName, assetName);
}
catch (ErrorResponseException ex) when (ex.Response.StatusCode == System.Net.HttpStatusCode.NotFound)
{
// The asset doesn't exist.
}
// Treat a null return the same as NotFound: some SDK versions return null instead of throwing.
bool existingAsset = outputAsset != null;
Asset asset = new Asset();
string outputAssetName = assetName;
if (existingAsset)
{
// Name collision! In order to get the sample to work, let's just go ahead and create a unique asset name
// Note that the returned Asset can have a different name than the one specified as an input parameter.
// You may want to update this part to throw an Exception instead, and handle name collisions differently.
string uniqueness = $"-{Guid.NewGuid():N}";
outputAssetName += uniqueness;
Console.WriteLine("Warning – found an existing Asset with name = " + assetName);
Console.WriteLine("Creating an Asset with this name instead: " + outputAssetName);
}
return await client.Assets.CreateOrUpdateAsync(resourceGroupName, accountName, outputAssetName, asset);
}
If you need to pass the name of the created output asset to other methods, make sure to use the Name property on the asset object returned from CreateOutputAssetAsync, for example, outputAsset.Name.
In the case of this article, pass the outputAsset.Name value to the SubmitJobAsync and UploadAudioIntoOutputAsset functions.
Create a transform and a job that encodes the uploaded file
When encoding or processing content in Media Services, it is a common pattern to set up the encoding settings as a recipe. You then submit a Job to apply that recipe to a video. By submitting new jobs for each new video, you apply that recipe to all the videos in your library. A recipe in Media Services is called a Transform. For more information, see Transforms and Jobs. The sample described in this tutorial defines a recipe that encodes the video in order to stream it to a variety of iOS and Android devices.
The following example creates a transform (if one does not exist).
private static async Task<Transform> GetOrCreateTransformAsync(
IAzureMediaServicesClient client,
string resourceGroupName,
string accountName,
string transformName)
{
bool createTransform = false;
Transform transform = null;
try
{
// Does a transform already exist with the desired name? Assume that an existing Transform with the desired name
// also uses the same recipe or Preset for processing content.
transform = await client.Transforms.GetAsync(resourceGroupName, accountName, transformName);
}
catch (ErrorResponseException ex) when (ex.Response.StatusCode == System.Net.HttpStatusCode.NotFound)
{
createTransform = true;
}
if (createTransform)
{
// You need to specify what you want it to produce as an output
TransformOutput[] output = new TransformOutput[]
{
new TransformOutput
{
// The preset for the Transform is set to one of the Media Services built-in presets.
// You can customize the encoding settings by using the StandardEncoderPreset class instead.
Preset = new BuiltInStandardEncoderPreset()
{
// This sample uses the built-in encoding preset for Adaptive Bitrate Streaming.
PresetName = EncoderNamedPreset.AdaptiveStreaming
}
}
};
// Create the Transform with the output defined above
transform = await client.Transforms.CreateOrUpdateAsync(resourceGroupName, accountName, transformName, output);
}
return transform;
}
The following function submits a job.
private static async Task<Job> SubmitJobAsync(IAzureMediaServicesClient client,
string resourceGroupName,
string accountName,
string transformName,
string jobName,
string inputAssetName,
string outputAssetName)
{
// Use the name of the created input asset to create the job input.
JobInput jobInput = new JobInputAsset(assetName: inputAssetName);
JobOutput[] jobOutputs =
{
new JobOutputAsset(outputAssetName),
};
// In this example, we are assuming that the job name is unique.
//
// If you already have a job with the desired name, use the Jobs.Get method
// to get the existing job. In Media Services v3, the Get method on entities returns null
// if the entity doesn't exist (a case-insensitive check on the name).
Job job = await client.Jobs.CreateAsync(
resourceGroupName,
accountName,
transformName,
jobName,
new Job
{
Input = jobInput,
Outputs = jobOutputs,
});
return job;
}
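Putting the two functions above together, a call sequence might look like the following sketch. The transform and job names are placeholders, and inputAsset and outputAsset are assumed to come from the earlier create-asset calls:

```csharp
// Hypothetical names; reuse the same transform name for both calls.
Transform transform = await GetOrCreateTransformAsync(
    client, config.ResourceGroup, config.AccountName, "AdaptiveBitrateTransform");
Job job = await SubmitJobAsync(
    client, config.ResourceGroup, config.AccountName, "AdaptiveBitrateTransform",
    "job-" + Guid.NewGuid().ToString("N"), inputAsset.Name, outputAsset.Name);
```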
Wait for the job to complete
The job takes some time to complete, and when it does, you want to be notified. We recommend using Event Grid to wait for the job to complete.
The job usually goes through the following states: Scheduled, Queued, Processing, and Finished (the final state). If the job encounters an error, you get the Error state. If the job is in the process of being canceled, you get Canceling, and then Canceled when it is done.
For more information, see Handling Event Grid events.
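If Event Grid isn't available in your scenario, a simple alternative is to poll the job's state. The following is a minimal sketch, assuming the same client and names used above; the polling interval is an arbitrary choice, and polling is less efficient than Event Grid for production workloads:

```csharp
private static async Task<Job> WaitForJobToFinishAsync(
    IAzureMediaServicesClient client,
    string resourceGroupName,
    string accountName,
    string transformName,
    string jobName)
{
    const int SleepIntervalMs = 20 * 1000;
    Job job;
    while (true)
    {
        // Re-read the job to observe its current state.
        job = await client.Jobs.GetAsync(resourceGroupName, accountName, transformName, jobName);
        Console.WriteLine($"Job state: {job.State}");
        if (job.State == JobState.Finished ||
            job.State == JobState.Error ||
            job.State == JobState.Canceled)
        {
            break;
        }
        await Task.Delay(SleepIntervalMs);
    }
    return job;
}
```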
Upload the audio-only MP4 file
Upload the additional audio-only MP4 file (AAC codec) containing descriptive audio into the output asset.
private static async Task UploadAudioIntoOutputAsset(
IAzureMediaServicesClient client,
string resourceGroupName,
string accountName,
string outputAssetName,
string fileToUpload)
{
// Use the Assets.Get method to get the existing asset.
// In Media Services v3, the Get method on entities returns null
// if the entity doesn't exist (a case-insensitive check on the name).
Asset asset = await client.Assets.GetAsync(resourceGroupName, accountName, outputAssetName);
if (asset != null)
{
// Use Media Services API to get back a response that contains
// SAS URL for the Asset container into which to upload blobs.
// That is where you would specify read-write permissions
// and the expiration time for the SAS URL.
var response = await client.Assets.ListContainerSasAsync(
resourceGroupName,
accountName,
outputAssetName,
permissions: AssetContainerPermission.ReadWrite,
expiryTime: DateTime.UtcNow.AddHours(4).ToUniversalTime());
var sasUri = new Uri(response.AssetContainerSasUrls.First());
// Use Storage API to get a reference to the Asset container
// that was created by calling Asset's CreateOrUpdate method.
BlobContainerClient container = new BlobContainerClient(sasUri);
BlobClient blob = container.GetBlobClient(Path.GetFileName(fileToUpload));
// Use Storage API to upload the file into the container in storage.
await blob.UploadAsync(fileToUpload);
}
}
Here is an example of a call to the UploadAudioIntoOutputAsset function:
await UploadAudioIntoOutputAsset(client, config.ResourceGroup, config.AccountName, outputAsset.Name, "audio_description.m4a");
Edit the .ism file
When your encoding job is done, the output asset will contain the files generated by the encoding job.
1. In the Azure portal, navigate to the storage account associated with your Media Services account.
2. Find the container with the name of your output asset.
3. In the container, find the .ism file and click Edit blob (in the right window).
4. Edit the .ism file by adding the information about the uploaded audio-only MP4 file (AAC codec) containing descriptive audio, and press Save when done.
To signal the descriptive audio tracks, you need to add the "accessibility" and "role" parameters to the .ism file. It is your responsibility to set these parameters correctly to signal an audio track as audio description. For example, add <param name="accessibility" value="description" /> and <param name="role" value="alternate" /> to the .ism file for a specific audio track, as shown in the following example.
<?xml version="1.0" encoding="utf-8"?>
<smil xmlns="http://www.w3.org/2001/SMIL20/Language">
<head>
<meta name="clientManifestRelativePath" content="ignite.ismc" />
<meta name="formats" content="mp4-v3" />
</head>
<body>
<switch>
<audio src="ignite_320x180_AACAudio_381.mp4" systemBitrate="128041" systemLanguage="eng">
<param name="systemBitrate" value="128041" valuetype="data" />
<param name="trackID" value="2" valuetype="data" />
<param name="trackName" value="aac_eng_2_128041_2_1" valuetype="data" />
<param name="systemLanguage" value="eng" valuetype="data" />
<param name="trackIndex" value="ignite_320x180_AACAudio_381_2.mpi" valuetype="data" />
</audio>
<audio src="audio_description.m4a" systemBitrate="194000" systemLanguage="eng">
<param name="trackName" value="aac_eng_audio_description" />
<param name="accessibility" value="description" />
<param name="role" value="alternate" />
</audio>
<video src="ignite_1280x720_AACAudio_3549.mp4" systemBitrate="3549855">
<param name="systemBitrate" value="3549855" valuetype="data" />
<param name="trackID" value="1" valuetype="data" />
<param name="trackName" value="video" valuetype="data" />
<param name="trackIndex" value="ignite_1280x720_AACAudio_3549_1.mpi" valuetype="data" />
</video>
<video src="ignite_960x540_AACAudio_2216.mp4" systemBitrate="2216764">
<param name="systemBitrate" value="2216764" valuetype="data" />
<param name="trackID" value="1" valuetype="data" />
<param name="trackName" value="video" valuetype="data" />
<param name="trackIndex" value="ignite_960x540_AACAudio_2216_1.mpi" valuetype="data" />
</video>
<video src="ignite_640x360_AACAudio_1154.mp4" systemBitrate="1154569">
<param name="systemBitrate" value="1154569" valuetype="data" />
<param name="trackID" value="1" valuetype="data" />
<param name="trackName" value="video" valuetype="data" />
<param name="trackIndex" value="ignite_640x360_AACAudio_1154_1.mpi" valuetype="data" />
</video>
<video src="ignite_480x270_AACAudio_721.mp4" systemBitrate="721893">
<param name="systemBitrate" value="721893" valuetype="data" />
<param name="trackID" value="1" valuetype="data" />
<param name="trackName" value="video" valuetype="data" />
<param name="trackIndex" value="ignite_480x270_AACAudio_721_1.mpi" valuetype="data" />
</video>
<video src="ignite_320x180_AACAudio_381.mp4" systemBitrate="381027">
<param name="systemBitrate" value="381027" valuetype="data" />
<param name="trackID" value="1" valuetype="data" />
<param name="trackName" value="video" valuetype="data" />
<param name="trackIndex" value="ignite_320x180_AACAudio_381_1.mpi" valuetype="data" />
</video>
</switch>
</body>
</smil>
Get a streaming locator
After the encoding is complete, the next step is to make the video in the output Asset available to clients for playback. You can accomplish this in two steps: first, create a Streaming Locator, and second, build the streaming URLs that clients can use.
The process of creating a Streaming Locator is called publishing. By default, the Streaming Locator is valid immediately after you make the API calls, and lasts until it is deleted, unless you configure the optional start and end times.
When creating a StreamingLocator, you need to specify the desired StreamingPolicyName. In this example, you stream in-the-clear (non-encrypted) content, so the predefined clear streaming policy (PredefinedStreamingPolicy.ClearStreamingOnly) is used.
Important
When using a custom Streaming Policy, you should design a limited set of such policies for your Media Services account and reuse them for your Streaming Locators whenever the same encryption options and protocols are needed. Your Media Services account has a quota on the number of Streaming Policy entries. Do not create a new Streaming Policy for each Streaming Locator.
The following code assumes that you are calling the function with a unique locatorName.
private static async Task<StreamingLocator> CreateStreamingLocatorAsync(
IAzureMediaServicesClient client,
string resourceGroup,
string accountName,
string assetName,
string locatorName)
{
StreamingLocator locator = await client.StreamingLocators.CreateAsync(
resourceGroup,
accountName,
locatorName,
new StreamingLocator
{
AssetName = assetName,
StreamingPolicyName = PredefinedStreamingPolicy.ClearStreamingOnly
});
return locator;
}
While the sample in this topic discusses streaming, you can use the same call to create a Streaming Locator for delivering video via progressive download.
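As a minimal sketch of the progressive-download case: the same ListPathsAsync response used for streaming paths also exposes a DownloadPaths collection, whose entries address the individual files in the asset. The streamingEndpoint variable here is assumed to be obtained the same way as in GetStreamingUrlsAsync:

```csharp
// Hedged sketch: list progressive-download URLs for a Streaming Locator.
ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(
    resourceGroupName, accountName, locatorName);
foreach (string downloadPath in paths.DownloadPaths)
{
    // Each download path addresses an individual file (blob) in the asset.
    Console.WriteLine($"https://{streamingEndpoint.HostName}{downloadPath}");
}
```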
Get streaming URLs
Now that the Streaming Locator has been created, you can get the streaming URLs, as shown in GetStreamingURLs. To build a URL, you need to concatenate the Streaming Endpoint host name and the Streaming Locator path. In this sample, the default Streaming Endpoint is used. When you first create a Media Service account, this default Streaming Endpoint will be in a stopped state, so you need to call Start.
Note
In this method, you need the locatorName that was used when creating the Streaming Locator for the output Asset.
private static async Task<IList<string>> GetStreamingUrlsAsync(
IAzureMediaServicesClient client,
string resourceGroupName,
string accountName,
string locatorName)
{
const string DefaultStreamingEndpointName = "default";
IList<string> streamingUrls = new List<string>();
StreamingEndpoint streamingEndpoint = await client.StreamingEndpoints.GetAsync(resourceGroupName, accountName, DefaultStreamingEndpointName);
if (streamingEndpoint.ResourceState != StreamingEndpointResourceState.Running)
{
await client.StreamingEndpoints.StartAsync(resourceGroupName, accountName, DefaultStreamingEndpointName);
}
ListPathsResponse paths = await client.StreamingLocators.ListPathsAsync(resourceGroupName, accountName, locatorName);
foreach (StreamingPath path in paths.StreamingPaths)
{
UriBuilder uriBuilder = new UriBuilder
{
Scheme = "https",
Host = streamingEndpoint.HostName,
Path = path.Paths[0]
};
streamingUrls.Add(uriBuilder.ToString());
}
return streamingUrls;
}
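For reference, the URLs returned by GetStreamingUrlsAsync typically differ only in the format suffix of the manifest path. The host name, locator GUID, and manifest name below are placeholders:

```
https://<endpoint-host>/<locator-guid>/ignite.ism/manifest(format=m3u8-aapl)    (HLS)
https://<endpoint-host>/<locator-guid>/ignite.ism/manifest(format=mpd-time-csf) (DASH)
https://<endpoint-host>/<locator-guid>/ignite.ism/manifest                      (Smooth Streaming)
```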
Test with Azure Media Player
To test the stream, this article uses Azure Media Player.
Note
If a player is hosted on an https site, make sure to update the URL to "https".
1. Open a web browser and navigate to https://aka.ms/azuremediaplayer/.
2. In the URL box, paste one of the streaming URL values you got from your application. You can paste the URL in HLS, DASH, or Smooth format, and Azure Media Player automatically switches to an appropriate streaming protocol for playback on your device.
3. Press Update Player.
Azure Media Player can be used for testing but should not be used in a production environment.