Migration guidance for moving from Media Services v2 to v3
To get notified when to revisit this page for updates, copy and paste this URL into your RSS feed reader:
https://docs.microsoft.com/api/search/rss?search=%22Migrate+from+Azure+Media+Services+v2+to+v3%22&locale=en-us
This article describes the changes introduced in Azure Media Services v3, shows the differences between the two versions, and provides migration guidance.
If you have a video service built on the legacy Media Services v2 APIs, review the following guidelines and considerations before migrating to the v3 APIs. The v3 API brings many benefits and new features that improve the developer experience and capabilities of Media Services. However, as called out in the Known issues section of this article, there are also some limitations due to changes between the API versions. This page will be maintained as the Media Services team makes continued improvements to the v3 APIs and addresses the gaps between the versions.
Benefits of Media Services v3
API is more approachable
- v3 is based on a unified API surface, which exposes both management and operations functionality built on Azure Resource Manager. Azure Resource Manager templates can be used to create and deploy Transforms, Streaming Endpoints, Live Events, and more.
- OpenAPI Specification (formerly called Swagger) document that exposes the schema for all service components, including file-based encoding.
- SDKs available for .NET, .NET Core, Node.js, Python, Java, Go, and Ruby.
- Azure CLI integration for simple scripting support.
- For file-based Job processing, you can use an HTTP(S) URL as the input. You do not need to have content already stored in Azure, nor do you need to create Assets.
- Introduces the concept of Transforms for file-based Job processing. A Transform can be used to build reusable configurations, to create Azure Resource Manager Templates, and isolate processing settings between multiple customers or tenants.
- An Asset can have multiple Streaming Locators, each with different Dynamic Packaging and Dynamic Encryption settings.
- Content protection supports multi-key features.
- You can stream Live Events that are up to 24 hours long when using Media Services for transcoding a single bitrate contribution feed into an output stream that has multiple bitrates.
- New Low Latency live streaming support on Live Events. For more information, see latency.
- Live Event Preview supports Dynamic Packaging and Dynamic Encryption. This enables content protection on Preview as well as DASH and HLS packaging.
- Live Output is simpler to use than the Program entity in the v2 APIs.
- Improved RTMP support (increased stability and more source encoder support).
- RTMPS secure ingest.
When you create a Live Event, you get four ingest URLs. The four URLs are almost identical: they share the same streaming token (AppId) and differ only in the port number. Two of the URLs are the primary and backup for RTMPS.
- You have role-based access control (RBAC) over your entities.
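The four-ingest-URL layout described above (same streaming token, different ports per scheme) can be checked programmatically before configuring an encoder. This is a minimal sketch; the host name and token below are placeholders, not values returned by a real Live Event, and `summarize_ingest_urls` is a hypothetical helper rather than part of any Media Services SDK.

```python
from urllib.parse import urlparse

def summarize_ingest_urls(urls):
    """Group Live Event ingest URLs by scheme and verify they all carry
    the same streaming token (the path component of the URL)."""
    paths = {urlparse(u).path for u in urls}
    if len(paths) != 1:
        raise ValueError("expected all ingest URLs to share one streaming token")
    ports_by_scheme = {}
    for u in urls:
        parsed = urlparse(u)
        ports_by_scheme.setdefault(parsed.scheme, []).append(parsed.port)
    return {"token": paths.pop().lstrip("/"), "ports_by_scheme": ports_by_scheme}

# Hypothetical ingest URLs; host and token are made-up placeholders.
urls = [
    "rtmp://myevent.channel.media.azure.net:1935/live/0123abcd",
    "rtmp://myevent.channel.media.azure.net:1936/live/0123abcd",
    "rtmps://myevent.channel.media.azure.net:2935/live/0123abcd",
    "rtmps://myevent.channel.media.azure.net:2936/live/0123abcd",
]
summary = summarize_ingest_urls(urls)
```

A mismatch in the token across the four URLs would indicate a configuration mix-up (for example, URLs copied from two different Live Events).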
Changes from v2
- For assets created with v3, Media Services supports only Azure Storage server-side encryption.
- The Asset's properties in v3 differ from v2; see how the properties map.
- The v3 SDKs are now decoupled from the Storage SDK, which gives you more control over the version of Storage SDK you want to use and avoids versioning issues.
- In the v3 APIs, all encoding bitrates are in bits per second. This is different from the v2 Media Encoder Standard presets. For example, a bitrate specified as 128 (kbps) in v2 would be 128000 (bits/second) in v3.
- Entities AssetFiles, AccessPolicies, and IngestManifests do not exist in v3.
- The IAsset.ParentAssets property does not exist in v3.
- ContentKeys is no longer an entity; it is now a property of the Streaming Locator.
- Event Grid support replaces NotificationEndpoints.
- The following entities were renamed: Channels are now Live Events, and Programs are now Live Outputs.
- Live Outputs start on creation and stop when deleted. Programs worked differently in the v2 APIs, they had to be started after creation.
- To get information about a job, you need to know the Transform name under which the job was created.
- In v2, XML input and output metadata files get generated as the result of an encoding job. In v3, the metadata format changed from XML to JSON.
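The bitrate unit change noted in the list above (kbps in v2 presets, bits per second in v3) is a common source of off-by-1000 mistakes when porting preset definitions. The sketch below converts a v2-style layer list to v3 units; the `bitrate`/`height` field names are illustrative, not the exact preset schema of either API version.

```python
def v2_kbps_to_v3_bps(kbps):
    """Convert a v2 bitrate (kbps) to the unit v3 expects (bits per second)."""
    return kbps * 1000

def convert_layers(v2_layers):
    """Convert every 'bitrate' field in a v2-style layer list to v3 units.
    Field names here are illustrative, not the exact preset schema."""
    return [{**layer, "bitrate": v2_kbps_to_v3_bps(layer["bitrate"])}
            for layer in v2_layers]

layers = convert_layers([
    {"bitrate": 128, "height": 720},   # 128 kbps in v2 terms
    {"bitrate": 400, "height": 1080},  # 400 kbps in v2 terms
])
```

Any tooling that copies bitrates from v2 presets into v3 Transforms needs this multiplication; passing a raw v2 value such as 128 to v3 would request a 128 bits-per-second stream.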
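Because of the XML-to-JSON metadata change noted above, code that reads job output metadata must switch parsers during migration. The snippet below shows the shape of that switch with simplified stand-in documents; the element and field names are illustrative only, and the real metadata schemas carry many more fields.

```python
import json
import xml.etree.ElementTree as ET

# Simplified stand-ins for the two metadata formats; illustrative only.
V2_XML = '<AssetFiles><AssetFile Name="video.mp4" Duration="PT30S"/></AssetFiles>'
V3_JSON = '{"AssetFile": [{"Name": "video.mp4", "Duration": "PT30S"}]}'

def duration_from_v2(xml_text):
    """Read the Duration attribute from v2-style XML metadata."""
    root = ET.fromstring(xml_text)
    return root.find("AssetFile").get("Duration")

def duration_from_v3(json_text):
    """Read the Duration field from v3-style JSON metadata."""
    return json.loads(json_text)["AssetFile"][0]["Duration"]
```

During a staged migration, keeping both readers behind one interface lets downstream code consume metadata from v2 and v3 jobs uniformly.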
Feature gaps with respect to v2 APIs
The v3 API has the following feature gaps with respect to the v2 API. Closing the gaps is work in progress.
The Premium Encoder and the legacy media analytics processors (Azure Media Services Indexer 2 Preview, Face Redactor, etc.) are not accessible via v3.
Customers who wish to migrate from the Media Indexer 1 or 2 preview can immediately use the AudioAnalyzer preset in the v3 API. This new preset contains more features than the older Media Indexer 1 or 2.
Many of the advanced features of the Media Encoder Standard in v2 APIs are currently not available in v3, such as:
- Stitching of Assets
- Thumbnail Sprites
- Inserting a silent audio track when input has no audio
- Inserting a video track when input has no video
Live Events with transcoding currently do not support mid-stream slate insertion or ad marker insertion via API call.
Please bookmark this article and keep checking for updates.
The following table shows the code differences between v2 and v3 for common scenarios.
|Scenario|V2 API|V3 API|
|---|---|---|
|Create an asset and upload a file|v2 .NET example|v3 .NET example|
|Submit a job|v2 .NET example|v3 .NET example<br/>Shows how to first create a Transform and then submit a Job.|
|Publish an asset with AES encryption|1. Create ContentKeyAuthorizationPolicyOption<br/>2. Create ContentKeyAuthorizationPolicy<br/>3. Create AssetDeliveryPolicy<br/>4. Create Asset and upload content OR submit job and use output asset<br/>5. Associate AssetDeliveryPolicy with Asset<br/>6. Create ContentKey<br/>7. Attach ContentKey to Asset<br/>8. Create AccessPolicy<br/>9. Create Locator<br/>v2 .NET example|1. Create Content Key Policy<br/>2. Create Asset<br/>3. Upload content or use Asset as JobOutput<br/>4. Create Streaming Locator<br/>v3 .NET example|
|Get job details and manage jobs|Manage jobs with v2|Manage jobs with v3|
Known issues
- Currently, you cannot use the Azure portal to manage v3 resources. Use the REST API, CLI, or one of the supported SDKs.
- You need to provision Media Reserved Units (MRUs) in your account to control the concurrency and performance of your Jobs, particularly those involving video or audio analysis; this applies whether you use the v2 or v3 APIs. For more information, see Scaling Media Processing. You can manage MRUs using CLI 2.0 for Media Services v3, the Azure portal, or the v2 APIs.
- Media Services entities created with the v3 API cannot be managed by the v2 API.
- Not all entities in the V2 API automatically show up in the V3 API. Following are examples of entities in the two versions that are incompatible:
- Jobs and Tasks created in v2 do not show up in v3 because they are not associated with a Transform. The recommendation is to switch to v3 Transforms and Jobs; during the switchover, you will need to monitor in-flight v2 Jobs for a relatively short period.
- Channels and Programs created with v2 (which are mapped to Live Events and Live Outputs in v3) cannot continue being managed with v3. The recommendation is to switch to v3 Live Events and Live Outputs at a convenient Channel stop.
Presently, you cannot migrate continuously running Channels.
Ask questions, give feedback, get updates
Check out the Azure Media Services community article to see different ways you can ask questions, give feedback, and get updates about Media Services.
To see how easy it is to start encoding and streaming video files, check out Stream files.