Frequently asked questions

This article addresses frequently asked questions raised by the Azure Media Services (AMS) user community.

General AMS FAQs

Q: How do you stream to Apple iOS devices?

A: Add "(format=m3u8-aapl)" to the "/Manifest" portion of the URL to tell the streaming origin server to return HLS content for consumption on Apple iOS native devices. For details, see delivering content.
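As an illustrative sketch, the URL rewrite is a simple string operation; the streaming endpoint host, GUID, and asset name below are hypothetical placeholders:

```python
def to_hls_url(smooth_manifest_url: str) -> str:
    """Append the HLS format specifier to a Smooth Streaming manifest URL.

    Assumes the URL ends with the "/Manifest" segment, as in the
    streaming locator URLs Media Services generates.
    """
    if not smooth_manifest_url.endswith("/Manifest"):
        raise ValueError("expected a URL ending in /Manifest")
    return smooth_manifest_url + "(format=m3u8-aapl)"

# Hypothetical streaming endpoint and asset path:
url = ("http://myaccount.streaming.mediaservices.windows.net/"
       "00000000-0000-0000-0000-000000000000/video.ism/Manifest")
print(to_hls_url(url))
```

The same pattern applies to other delivery formats (for example, "(format=mpd-time-csf)" for MPEG-DASH).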

Q: How do you scale indexing?

A: The reserved units are the same for Encoding and Indexing tasks. Follow the instructions in How to Scale Encoding Reserved Units. Note that Indexer performance is not affected by the Reserved Unit Type.

Q: I uploaded, encoded, and published a video. What would be the reason the video does not play when I try to stream it?

A: One of the most common reasons is that the streaming endpoint from which you are trying to play back is not in the Running state.

Q: Can I do compositing on a live stream?

A: Compositing on live streams is currently not offered in Azure Media Services, so you would need to pre-compose on your computer.

Q: Can I use Azure CDN with Live Streaming?

A: Media Services supports integration with Azure CDN (for more information, see How to Manage Streaming Endpoints in a Media Services Account). You can use live streaming with CDN. Azure Media Services provides Smooth Streaming, HLS, and MPEG-DASH outputs. All these formats use HTTP for transferring data and benefit from HTTP caching. In live streaming, the actual video/audio data is divided into fragments, and these individual fragments get cached in the CDN. Only the manifest data needs to be refreshed, and the CDN refreshes it periodically.

Q: Does Azure Media services support storing images?

A: If you are just looking to store JPEG or PNG images, you should keep them in Azure Blob Storage. There is no benefit to putting them in your Media Services account unless you want to keep them associated with your video or audio assets, or you might need to use the images as overlays in the video encoder. Media Encoder Standard supports overlaying images on top of videos, and it lists JPEG and PNG as supported input formats. For more information, see Creating Overlays.

Q: How can I copy assets from one Media Services account to another?

A: To copy assets from one Media Services account to another using .NET, use the IAsset.Copy extension method available in the Azure Media Services .NET SDK Extensions repository. For more information, see this forum thread.

Q: What are the supported characters for naming files when working with AMS?

A: Media Services uses the value of the IAssetFile.Name property when building URLs for the streaming content (for example, http://{AMSAccount}{GUID}/{IAssetFile.Name}/streamingParameters.) For this reason, percent-encoding is not allowed. The value of the Name property cannot have any of the following percent-encoding-reserved characters: !*'();:@&=+$,/?%#[]". Also, there can only be one ‘.’ for the file name extension.
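A quick way to pre-check names before upload is to scan for the reserved characters listed above. This validator is an illustrative sketch, not part of any SDK:

```python
# Percent-encoding-reserved characters that must not appear in
# IAssetFile.Name, per the rule above, plus the constraint that at
# most one '.' (the file-extension separator) is allowed.
RESERVED_CHARS = set("!*'();:@&=+$,/?%#[]\"")

def is_valid_asset_file_name(name: str) -> bool:
    """Return True if the name avoids the reserved characters and
    contains at most one '.'."""
    if set(name) & RESERVED_CHARS:
        return False
    return name.count(".") <= 1

print(is_valid_asset_file_name("BigBuckBunny.mp4"))  # True
print(is_valid_asset_file_name("clip?.mp4"))         # False ('?' is reserved)
print(is_valid_asset_file_name("my.clip.mp4"))       # False (two '.')
```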

Q: How do I connect using REST?

A: For information on how to connect to the AMS API, see Access the Azure Media Services API with Azure AD authentication.
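As a rough sketch of the service-principal (client credentials) flow described in that article, the Azure AD token request can be assembled as shown below. The tenant, client ID, and secret are placeholders, and the AMS resource URI used here is an assumption to verify against the linked article:

```python
from urllib.parse import urlencode

# Placeholders -- substitute your own Azure AD values.
TENANT = "your-tenant.onmicrosoft.com"
CLIENT_ID = "00000000-0000-0000-0000-000000000000"
CLIENT_SECRET = "your-client-secret"

def build_token_request(tenant: str, client_id: str, client_secret: str):
    """Build an Azure AD v1 client-credentials token request.

    Returns the token endpoint URL and the form-encoded body to POST.
    The resource URI for the AMS REST API is assumed here to be
    https://rest.media.azure.net; confirm it in the linked article.
    """
    url = f"https://login.microsoftonline.com/{tenant}/oauth2/token"
    body = urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": "https://rest.media.azure.net",
    })
    return url, body

url, body = build_token_request(TENANT, CLIENT_ID, CLIENT_SECRET)
# POST `body` to `url`; the access_token in the JSON response then goes
# in an "Authorization: Bearer <token>" header on each AMS REST call.
print(url)
```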

Q: How can I rotate a video during the encoding process?

A: Media Encoder Standard supports rotation by angles of 90, 180, and 270 degrees. The default behavior is "Auto", which tries to detect the rotation metadata in the incoming MP4/MOV file and compensate for it. Include the following Sources element in one of the JSON presets defined here:

```json
{
  "Version": 1.0,
  "Sources": [
    {
      "Streams": [],
      "Filters": {
        "Rotation": "90"
      }
    }
  ],
  "Codecs": [
  ]
}
```


Media Services learning paths

Check out the latest version of Azure Media Services: Azure Media Services v3.


Provide feedback

Use the User Voice forum to provide feedback and make suggestions on how to improve Azure Media Services.