Tutorial: Stream live with Media Services by using Node.js and TypeScript



Warning

Azure Media Services will be retired June 30th, 2024. For more information, see the AMS Retirement Guide.

In Azure Media Services, live events are responsible for processing live streaming content. A live event provides an input endpoint (ingest URL) that you then provide to a live encoder. The live event receives input streams from the live encoder and makes them available for streaming through one or more streaming endpoints. Live events also provide a preview endpoint (preview URL) that you use to preview and validate your stream before further processing and delivery.

This tutorial shows how to use Node.js and TypeScript to create a pass-through live event and broadcast a live stream to it by using OBS Studio.

In this tutorial, you'll download and configure the sample, examine the code that creates and manages the live event, broadcast to it with OBS Studio, watch the stream, and clean up resources.

Note

Even though the tutorial uses Node.js examples, the general steps are the same for REST API, CLI, or other supported SDKs.

Prerequisites

You need the following items to complete the tutorial:

You need these additional items for live-streaming software:

  • A camera or a device (like a laptop) that's used to broadcast an event.

  • An on-premises software encoder that encodes your camera stream and sends it to the Media Services live-streaming service through the Real-Time Messaging Protocol (RTMP). For more information, see Recommended on-premises live encoders. The stream has to be in RTMP or Smooth Streaming format.

    This sample assumes that you'll use Open Broadcaster Software (OBS) Studio to broadcast RTMP to the ingest endpoint. Install OBS Studio.

    Use the following encoding settings in OBS Studio:

    • Encoder: NVIDIA NVENC (if available) or x264
    • Rate control: CBR
    • Bit rate: 2,500 Kbps (or something reasonable for your computer)
    • Keyframe interval: 2 s, or 1 s for low latency
    • Preset: Low-latency Quality or Performance (NVENC) or "veryfast" using x264
    • Profile: high
    • GPU: 0 (Auto)
    • Max B-frames: 2

Tip

Review Live streaming with Media Services v3 before proceeding.

Download and configure the sample

Clone the GitHub repository that contains the live-streaming Node.js sample to your machine by using the following command:

git clone https://github.com/Azure-Samples/media-services-v3-node-tutorials.git

The live-streaming sample is in the Live folder.

In the root folder, copy the file named sample.env to a new file called .env to store the environment variable settings that you gathered in the article Access the Azure Media Services API with the Azure CLI. Make sure that the file name includes the dot (.) in front of "env" so it works with the code sample correctly.

The .env file contains your Azure Active Directory (Azure AD) application key and secret. It also contains the account name and subscription information required to authenticate SDK access to your Media Services account. The .gitignore file is already configured to prevent publishing this file into your forked repository. Don't allow these credentials to be leaked, because they're important secrets for your account.
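As a minimal sketch, the sample loads these values with the dotenv package at the top of index.ts. The variable names shown here are illustrative; use the keys that actually appear in your sample.env file.

import * as dotenv from "dotenv";
dotenv.config();

// Illustrative environment variable names; match them to the keys in your .env file.
const subscriptionId: string = process.env.AZURE_SUBSCRIPTION_ID as string;
const resourceGroup: string = process.env.AZURE_RESOURCE_GROUP as string;
const accountName: string = process.env.AZURE_MEDIA_SERVICES_ACCOUNT_NAME as string;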

Important

This sample uses a unique suffix for each resource. If you cancel the debugging or terminate the app without running it through, you'll end up with multiple live events in your account.

Be sure to stop the running live events. Otherwise, you'll be billed! Run the program all the way to completion to clean up resources automatically. If the program stops, or if you inadvertently stop the debugger and break out of the program execution, double-check the portal to confirm that you haven't left any live events in the Running or StandBy state, which would result in unwanted billing charges.

Examine the TypeScript code for live streaming

This section examines functions defined in the index.ts file of the Live/Standard_Passthrough_Live_Event project.

The sample creates a unique suffix for each resource so that you don't have name collisions if you run the sample multiple times without cleaning up.
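For example, the resource names might be built from a short random suffix along these lines (a sketch only; the exact suffix logic in index.ts may differ):

// Append a short unique suffix to each resource name to avoid collisions across runs.
const uniqueness: string = Math.random().toString(36).substring(2, 7);
const liveEventName: string = `liveEvent-${uniqueness}`;
const assetName: string = `archiveAsset-${uniqueness}`;
const liveOutputName: string = `liveOutput-${uniqueness}`;
const streamingLocatorName: string = `liveLocator-${uniqueness}`;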

Start using the Media Services SDK for Node.js with TypeScript

To start using Media Services APIs with Node.js, you need to first add the @azure/arm-mediaservices SDK module by using the npm package manager:

npm install @azure/arm-mediaservices

In the package.json file, this is already configured for you. You just need to run npm install to load the modules and dependencies:

  1. Install the packages and dependencies listed in the package.json file:

    npm install
    
  2. Open Visual Studio Code from the root folder. (This is required so that you start from the folder where the .vscode folder and the tsconfig.json file are located.)

    code .
    

Open the folder for Live/Standard_Passthrough_Live_Event, and open the index.ts file in the Visual Studio Code editor.

While the index.ts file is open, press F5 to start the debugger.
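The snippets in the rest of this article use a mediaServicesClient object. Here's a minimal sketch of how that client can be constructed, assuming DefaultAzureCredential from @azure/identity and the values loaded from your .env file; the setup code in index.ts may differ in detail.

import { AzureMediaServices } from "@azure/arm-mediaservices";
import { DefaultAzureCredential } from "@azure/identity";

// DefaultAzureCredential picks up the Azure AD application credentials from the environment.
const credential = new DefaultAzureCredential();

// The client used by the liveEvents, liveOutputs, assets, and streamingLocators calls shown below.
const mediaServicesClient = new AzureMediaServices(credential, subscriptionId);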

Setting the longRunningOperationUpdateIntervalMs

To speed up polling of long-running operations from the default of 30 seconds down to a couple of seconds, set longRunningOperationUpdateIntervalMs and pass it to the updateIntervalInMs property of the options parameter on the beginCreateAndWait() and similar live event operations, as shown throughout the sample. This sample uses a value of 2,000 ms (2 seconds), which reduces the time it takes to poll the Azure Resource Manager endpoint for the status of a long-running operation and shortens major asynchronous operations like creating, starting, and stopping live events. We recommend a value of 2 seconds for most time-sensitive scenarios.


// Long running operation polling interval in milliseconds
const longRunningOperationUpdateIntervalMs = 2000;

Create a live event

This section shows how to create a standard pass-through live event (LiveEventEncodingType set to PassthroughStandard). For information about the available types, see Live event types. In addition to basic or standard pass-through, you can use a live encoding event for 720p or 1080p adaptive bitrate cloud encoding. Examples of each of these event types are available in the Live folder of the sample repository. A sample that demonstrates how to listen to Event Grid events through Event Hubs is also included.

You might want to specify the following things when you're creating the live event:

  • The ingest protocol for the live event. Currently, the RTMP, RTMPS, and Smooth Streaming protocols are supported. You can't change the protocol option while the live event or its associated live outputs are running. If you need different protocols, create a separate live event for each streaming protocol.

  • IP restrictions on the ingest and preview. You can define the IP addresses that are allowed to ingest a video to this live event. Allowed IP addresses can be specified as one of these choices:

    • A single IP address (for example, 10.0.0.1)
    • An IP range that uses an IP address and a Classless Inter-Domain Routing (CIDR) subnet mask (for example, 10.0.0.1/22)
    • An IP range that uses an IP address and a dotted decimal subnet mask (for example, 10.0.0.1(255.255.252.0))

    If no IP addresses are specified and there's no rule definition, then no IP address will be allowed. To allow any IP address, create a rule and set 0.0.0.0/0. The IP addresses have to be in one of the following formats: an IPv4 address with four numbers, or a CIDR address range. For more information about using IPv4 or IPv6, see Restrict access to DRM license and AES key delivery using IP allowlists.

  • Autostart on an event as you create it. When autostart is set to true, the live event will start after creation. That means the billing starts as soon as the live event starts running. You must explicitly call Stop on the live event resource to halt further billing. For more information, see Live event states and billing.

    Standby modes are available to start the live event in a lower-cost "allocated" state that makes it faster to move to a running state. This is useful for situations like hot pools that need to hand out channels quickly to streamers.

  • A static host name and a unique GUID. For an ingest URL to be predictable and easier to maintain in a hardware-based live encoder, set the useStaticHostname property to true. For accessToken, use a custom, unique GUID. For detailed information, see Live event ingest URLs.


// Creating the LiveEvent - the primary object for live streaming in AMS. 
// See the overview - https://docs.microsoft.com/azure/media-services/latest/live-streaming-overview

// Create the LiveEvent

// Understand the concepts of what a live event and a live output is in AMS first!
// Read the following - https://docs.microsoft.com/azure/media-services/latest/live-events-outputs-concept
// 1) Understand the billing implications for the various states
// 2) Understand the different live event types, pass-through and encoding
// 3) Understand how to use long-running async operations 
// 4) Understand the available Standby mode and how it differs from the Running Mode. 
// 5) Understand the differences between a LiveOutput and the Asset that it records to.  They are two different concepts.
//    A live output can be considered as the "tape recorder" and the Asset is the tape that is inserted into it for recording.
// 6) Understand the advanced options such as low latency, and live transcription/captioning support. 
//    Live Transcription - https://docs.microsoft.com/en-us/azure/media-services/latest/live-transcription
//    Low Latency - https://docs.microsoft.com/en-us/azure/media-services/latest/live-event-latency

// When broadcasting to a live event, please use one of the verified on-premises live streaming encoders.
// While operating this tutorial, it is recommended to start out using OBS Studio before moving to another encoder. 

// Note: When creating a LiveEvent, you can specify allowed IP addresses in one of the following formats:                 
//      To allow all IPv4 addresses and block all IPv6 addresses, set the IP allow list to [ "0.0.0.0/0" ]
//      IpV4 address with 4 numbers
//      CIDR address range

let allowAllIPv4InputRange: IPRange = {
    name: "Allow all IPv4 addresses",
    address: "0.0.0.0",
    subnetPrefixLength: 0
};

// IpV6 addresses or ranges
//  For this example, requests from the following addresses will be accepted:
//      - IPv6 addresses between 2001:1234:1234:0000:0000:0000:0000:4567 and 2001:1234:FFFF:FFFF:FFFF:FFFF:FFFF:FFFF
//      - IPv6 address 2001:1235:0000:0000:0000:0000:0000:0000
//  Additional examples:
//      - To allow requests from any IP address, set the "defaultAction" of the "accessControl" block to "Allow" (and do not specify an "ipAllowList")
//      - To allow all IPv6 addresses and block all IPv4 addresses, set the IP allow list to [ "::/0" ]

let allowAllIPv6InputRange: IPRange = {
    name: "Allow all IPv6 addresses",
    address: "::",
    subnetPrefixLength: 0
};

// Create the LiveEvent input IP access control object.
// This controls which IP addresses the encoder can send from and restricts ingest access to that IP range.
let liveEventInputAccess: LiveEventInputAccessControl = {
    ip: {
        allow: [
            // re-use the same range here for the sample, but in production you can lock this
            // down to the ip range for your on-premises live encoder, laptop, or device that is sending
            // the live stream
            allowAllIPv4InputRange,
            allowAllIPv6InputRange
        ]
    }
};

// Create the LiveEvent Preview IP access control object. 
// This will restrict which clients can view the preview endpoint
let liveEventPreview: LiveEventPreview = {
    accessControl: {
        ip: {
            allow: [
                // re-use the same range here for the sample, but in production you can lock this to the IPs of your 
                // devices that would be monitoring the live preview. 
                allowAllIPv4InputRange,
                allowAllIPv6InputRange
            ]
        }
    }
}

// To get the same ingest URL for the same LiveEvent name every single time...
// 1. Set useStaticHostname  to true so you have ingest like: 
//        rtmps://liveevent-hevc12-eventgridmediaservice-usw22.channel.media.azure.net:2935/live/522f9b27dd2d4b26aeb9ef8ab96c5c77           
// 2. Set accessToken to a desired GUID string (with or without hyphen)

// See REST API documentation for details on each setting value
// https://docs.microsoft.com/rest/api/media/liveevents/create 

let liveEventCreate: LiveEvent = {
    location: mediaAccount.location,
    description: "Sample Live Event from Node.js SDK sample",
    // Set useStaticHostname to true to make the ingest and preview URL host name the same. 
    // This can slow things down a bit. 
    useStaticHostname: true,
    //hostnamePrefix: "somethingstatic", /// When using Static host name true, you can control the host prefix name here if desired 
    // 1) Set up the input settings for the Live event...
    input: {
        streamingProtocol: KnownLiveEventInputProtocol.Rtmp, // options are RTMP or Smooth Streaming ingest format.
        accessControl: liveEventInputAccess,  // controls the IP restriction for the source encoder. 
        // keyFrameIntervalDuration: "PT2S",  // Set this to match the ingest encoder's settings. This should not be used for encoding live events  
        accessToken: "9eb1f703b149417c8448771867f48501" // Use this value when you want to make sure the ingest URL is static and always the same. If omitted, the service will generate a random GUID value.
    },

    // 2) Set the live event to use pass-through or cloud encoding modes...
    encoding: {
        // Set this to PassthroughBasic or PassthroughStandard for pass-through mode,
        // or to Standard or Premium1080p to use the cloud live encoder.
        // See https://go.microsoft.com/fwlink/?linkid=2095101 for more information
        encodingType: KnownLiveEventEncodingType.PassthroughStandard,
        // OPTIONS for encoding type you can use:
        // encodingType: KnownLiveEventEncodingType.PassthroughBasic, // Basic pass-through mode - the cheapest option!
        // encodingType: KnownLiveEventEncodingType.PassthroughStandard, // also known as standard pass-through mode (formerly "none")
        // encodingType: KnownLiveEventEncodingType.Premium1080p,// live transcoding up to 1080P 30fps with adaptive bitrate set
        // encodingType: KnownLiveEventEncodingType.Standard,// use live transcoding in the cloud for 720P 30fps with adaptive bitrate set
        //
        // OPTIONS using live cloud encoding type:
        // keyFrameInterval: "PT2S", //If this value is not set for an encoding live event, the fragment duration defaults to 2 seconds. The value cannot be set for pass-through live events.
        // presetName: null, // only used for custom defined presets. 
        //stretchMode: KnownStretchMode.None // can be used to determine stretch on encoder mode
    },
    // 3) Set up the Preview endpoint for monitoring based on the settings above we already set. 
    preview: liveEventPreview,

    // 4) Set up more advanced options on the live event. Low Latency is the most common one. 
    streamOptions: [
        "LowLatency"
    ],

    // 5) Optionally enable live transcriptions if desired. 
    // WARNING: This adds extra cost ($$$), so please check pricing before enabling. Transcriptions are not supported on PassthroughBasic.
    //          This sample already uses encodingType: PassthroughStandard, so you can un-comment the transcriptions object below if desired.

    /* transcriptions : [
        {
            inputTrackSelection: [], // choose which track to transcribe on the source input.
            // The value should be in BCP-47 format (e.g: 'en-US'). See https://go.microsoft.com/fwlink/?linkid=2133742
            language: "en-us", 
            outputTranscriptionTrack: {
                trackName : "English" // set the name you want to appear in the output manifest
            }
        }
    ]
    */
}



console.log("Creating the LiveEvent, please be patient as this can take time to complete async.")
console.log("Live Event creation is an async operation in Azure and timing can depend on resources available.")
console.log();

let timeStart = process.hrtime();
// When autostart is set to true, the Live Event will be started after creation. 
// That means, the billing starts as soon as the Live Event starts running. 
// You must explicitly call Stop on the Live Event resource to halt further billing.
// The following operation can sometimes take a while. Be patient.
// An optional workflow is to first call allocate() instead of create.
// https://docs.microsoft.com/en-us/rest/api/media/liveevents/allocate 
// This allows you to allocate the resources and place the live event into a "Standby" mode until 
// you are ready to transition to "Running". This is useful when you want to pool resources in a warm "Standby" state at a reduced cost.
// The transition from Standby to "Running" is much faster than cold creation to "Running" using the autostart property.
// Returns a long running operation polling object that can be used to poll until completion.
await mediaServicesClient.liveEvents.beginCreateAndWait(
    resourceGroup,
    accountName,
    liveEventName,
    liveEventCreate,
    // When autostart is set to true, you should "await" this method operation to complete. 
    // The Live Event will be started after creation. 
    // You may choose not to do this, but create the object, and then start it using the standby state to 
    // keep the resources "warm" and billing at a lower cost until you are ready to go live. 
    // That increases the speed of startup when you are ready to go live. 
    {
        autoStart: false,
        updateIntervalInMs: longRunningOperationUpdateIntervalMs // This sets the polling interval for the long running ARM operation (LRO)
    }
).then((liveEvent) => {
    let timeEnd = process.hrtime(timeStart);
    console.info(`Live Event Created - long running operation complete! Name: ${liveEvent.name}`)
    console.info(`Execution time for create LiveEvent: %ds %dms`, timeEnd[0], timeEnd[1] / 1000000);
    console.log();
}).catch((reason) => {
    if (reason.error && reason.error.message) {
        console.info(`Live Event creation failed: ${reason.error.message}`);
    }
})
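As the comments above mention, an optional workflow is to allocate the live event into the lower-cost StandBy state instead of starting it right away. Here's a minimal sketch of that call, assuming the same names and polling interval used above:

// Optional: place the live event into StandBy so the later transition to Running is faster than a cold start.
await mediaServicesClient.liveEvents.beginAllocateAndWait(
    resourceGroup,
    accountName,
    liveEventName,
    { updateIntervalInMs: longRunningOperationUpdateIntervalMs }
);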

Create an asset to record and archive the live event

In the following block of code, you create an empty asset to use as the "tape" to record your live event archive to.

When you're learning these concepts, it's helpful to think of the asset object as the tape that you would insert into a video tape recorder in the old days. The live output is the tape recorder machine. The live event is just the video signal coming into the back of the machine.

Keep in mind that the asset, or "tape," can be created at any time. You'll hand the empty asset to the live output object, the "tape recorder" in this analogy.


// Create an Asset for the LiveOutput to use. Think of this as the "tape" that will be recorded to. 
// The asset entity points to a folder/container in your Azure Storage account. 
console.log(`Creating an asset named: ${assetName}`);
console.log();
let asset = await mediaServicesClient.assets.createOrUpdate(resourceGroup, accountName, assetName, {});

// Create the Live Output - think of this as the "tape recorder for the live event". 
// Live outputs are optional, but are required if you want to archive the event to storage,
// use the asset for on-demand playback later, or if you want to enable cloud DVR time-shifting.
// We will use the asset created above for the "tape" to record to. 
let manifestName: string = "output";
console.log(`Creating a live output named: ${liveOutputName}`);
console.log();

// See the REST API for details on each of the settings on Live Output
// https://docs.microsoft.com/rest/api/media/liveoutputs/create

Create the live output

In this section, you create a live output that uses the asset name as input to tell where to record the live event to. In addition, you set up the time-shifting (DVR) window to be used in the recording.

The sample code shows how to set up a 30-minute time-shifting window (see the archiveWindowLength and rewindWindowLength values in the code that follows). This window allows clients to play back anything in the last 30 minutes of the event, and only the last 30 minutes of the live event remains in the archive. You can extend this window to be up to 25 hours if needed. Also note that you can control the output manifest name that the HTTP Live Streaming (HLS) and Dynamic Adaptive Streaming over HTTP (DASH) manifests use in your URL paths when published.

The live output, or "tape recorder" in our analogy, can be created at any time as well. You can create a live output before starting the signal flow, or after. If you need to speed things up, it's often helpful to create the output before you start the signal flow.

Live outputs start when they're created and stop when they're deleted. When you delete the live output, you're not deleting the underlying asset or content in the asset. Think of it as ejecting the "tape." The asset with the recording will last as long as you like. When it's ejected (meaning, when the live output is deleted), it will be available for on-demand viewing immediately.

let liveOutputCreate: LiveOutput;
if (asset.name) {
    liveOutputCreate = {
        description: "Optional description when using more than one live output",
        assetName: asset.name,
        manifestName: manifestName, // The HLS and DASH manifest file name. This is recommended to set if you want a deterministic manifest path up front.
        archiveWindowLength: "PT30M", // sets the asset archive window to 30 minutes. Uses ISO 8601 format string.
        rewindWindowLength: "PT30M", // sets the time-shift (DVR) window to 30 minutes. Uses ISO 8601 format string.
        hls: {
            fragmentsPerTsSegment: 1 // Advanced setting when using HLS TS output only.
        },
    }

    // Create and await the live output
    await mediaServicesClient.liveOutputs.beginCreateAndWait(
        resourceGroup,
        accountName,
        liveEventName,
        liveOutputName,
        liveOutputCreate,
        {
            updateIntervalInMs: longRunningOperationUpdateIntervalMs // Setting this adjusts the polling interval of the long running operation. 
        })
        .then((liveOutput) => {
            console.log(`Live Output Created: ${liveOutput.name}`);
            let timeEnd = process.hrtime(timeStart);
            console.info(`Execution time for create Live Output: %ds %dms`, timeEnd[0], timeEnd[1] / 1000000);
            console.log();
        })
        .catch((reason) => {
            if (reason.error && reason.error.message) {
                console.info(`Live Output creation failed: ${reason.error.message}`);
            }
        });


}

Get ingest URLs

After the live event is created, you can get ingest URLs that you'll provide to the live encoder. The encoder uses these URLs to input a live stream by using the RTMP protocol.


// Get the RTMP ingest URL to configure in OBS Studio. 
// The endpoints property is a collection of RTMP primary and secondary, and RTMPS primary and secondary URLs.
// To get the primary secure RTMPS URL, it is usually index 3, but you could add a loop here to confirm.
if (liveEvent.input?.endpoints) {
    let ingestUrl = liveEvent.input.endpoints[0].url;
    console.log(`The RTMP ingest URL to enter into OBS Studio is:`);
    console.log(`RTMP ingest : ${ingestUrl}`);
    console.log(`Make sure to enter a Stream Key into the OBS studio settings. It can be any value or you can repeat the accessToken used in the ingest URL path.`);
    console.log();
}

Get the preview URL

Use previewEndpoint to preview and verify that the input from the encoder is being received.

Important

Make sure that the video is flowing to the preview URL before you continue.

if (liveEvent.preview?.endpoints) {
    // Use the previewEndpoint to preview and verify
    // that the input from the encoder is actually being received
    // The preview endpoint URL also supports the addition of various format strings, for example HLS (format=m3u8-cmaf) and DASH (format=mpd-time-cmaf).
    // The default manifest is Smooth. 
    let previewEndpoint = liveEvent.preview.endpoints[0].url;
    console.log("The preview url is:");
    console.log(previewEndpoint);
    console.log();
    console.log("Open the live preview in your browser and use any DASH or HLS player to monitor the preview playback:");
    console.log(`https://ampdemo.azureedge.net/?url=${previewEndpoint}(format=mpd-time-cmaf)&heuristicprofile=lowlatency`);
    console.log("You will need to refresh the player page SEVERAL times until enough data has arrived to allow for manifest creation.");
    console.log("In a production player, the player can inspect the manifest to see if it contains enough content for the player to load and auto reload.");
    console.log();
}

console.log("Start the live stream now, sending the input to the ingest url and verify that it is arriving with the preview url.");
console.log("IMPORTANT TIP!: Make CERTAIN that the video is flowing to the Preview URL before continuing!");

Create and manage live events and live outputs

After you have the stream flowing into the live event, you can begin the streaming event by publishing a streaming locator for your client players to use. This will make it available to viewers through the streaming endpoint.

You first create the signal by creating the live event. The signal is not flowing until you start that live event and connect your encoder to the input.

To stop the "tape recorder," you call delete on LiveOutput. This action doesn't delete the contents of your archive on the "tape" (asset). It only deletes the "tape recorder" and stops the archiving. The asset is always kept with the archived video content until you call delete explicitly on the asset itself. Even after you delete the live output, the recorded content of the asset remains available to play back through any published streaming locator URLs.

If you want to remove the ability of a client to play back the archived content, you first need to remove all locators from the asset. If you're using a CDN for delivery, you also need to flush the content delivery network (CDN) cache on the URL path. Otherwise, the content will live in the CDN's cache for the standard time-to-live setting on the CDN (which might be up to 72 hours).
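Because the sample creates the live event with autoStart set to false, it has to start the event explicitly, and it later deletes the live output to stop archiving. Here's a sketch of those two calls, assuming the same client, names, and polling interval used earlier:

// Start the live event so it can receive the encoder's signal. Billing starts once the event is running.
await mediaServicesClient.liveEvents.beginStartAndWait(
    resourceGroup,
    accountName,
    liveEventName,
    { updateIntervalInMs: longRunningOperationUpdateIntervalMs }
);

// Deleting the live output stops archiving ("ejects the tape") but keeps the recorded asset and any published locators.
await mediaServicesClient.liveOutputs.beginDeleteAndWait(
    resourceGroup,
    accountName,
    liveEventName,
    liveOutputName,
    { updateIntervalInMs: longRunningOperationUpdateIntervalMs }
);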

Create a streaming locator to publish HLS and DASH manifests

Note

When your Media Services account is created, a default streaming endpoint is added to your account in the stopped state. To start streaming your content and take advantage of dynamic packaging and dynamic encryption, the streaming endpoint from which you want to stream content has to be in the running state.

When you publish the asset by using a streaming locator, the live event (up to the DVR window length) will continue to be viewable until the streaming locator's expiration or deletion, whichever comes first. This is how you make the virtual "tape" recording available for your viewing audience to see live and on demand. The same URL can be used to watch the live event, the DVR window, or the on-demand asset when the recording is complete (when the live output is deleted).

async function createStreamingLocator(assetName: string, locatorName: string) {
    let streamingLocator = {
        assetName: assetName,
        streamingPolicyName: "Predefined_ClearStreamingOnly"  // no DRM or AES128 encryption protection on this asset. Clear means un-encrypted.
    };

    let locator = await mediaServicesClient.streamingLocators.create(
        resourceGroup,
        accountName,
        locatorName,
        streamingLocator);

    return locator;
}
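As a usage sketch (not the exact code in the sample), you might call this function and make sure the default streaming endpoint is running before sharing playback URLs. The endpoint name "default" matches the streaming endpoint created with a new account; the locator name is illustrative.

// Start the default streaming endpoint if it isn't running yet, so published locators are reachable.
let streamingEndpoint = await mediaServicesClient.streamingEndpoints.get(resourceGroup, accountName, "default");
if (streamingEndpoint.resourceState !== "Running") {
    await mediaServicesClient.streamingEndpoints.beginStartAndWait(resourceGroup, accountName, "default");
}

// Publish the live archive asset so players can reach it through the streaming endpoint.
let locator = await createStreamingLocator(assetName, streamingLocatorName);
console.log(`Streaming locator created: ${locator.name}`);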

Build the paths to the HLS and DASH manifests

The buildManifestPaths method in the sample shows how to deterministically create the streaming paths to use for HLS or DASH delivery to various clients and player frameworks.


// This method builds the manifest URL from the static values used during creation of the Live Output.
// This allows you to have a deterministic manifest path. <streaming endpoint hostname>/<streaming locator ID>/manifestName.ism/manifest(<format string>)
async function buildManifestPaths(scheme: string, hostname: string | undefined, streamingLocatorId: string | undefined, manifestName: string) {
    const hlsFormat: string = "format=m3u8-cmaf";
    const dashFormat: string = "format=mpd-time-cmaf";

    let manifestBase = `${scheme}://${hostname}/${streamingLocatorId}/${manifestName}.ism/manifest`
    let hlsManifest = `${manifestBase}(${hlsFormat})`;
    console.log(`The HLS (MP4) manifest URL is : ${hlsManifest}`);
    console.log("Open the following URL to playback the live stream in an HLS compliant player (HLS.js, Shaka, ExoPlayer) or directly in an iOS device");
    console.log(`${hlsManifest}`);
    console.log();

    let dashManifest = `${manifestBase}(${dashFormat})`;
    console.log(`The DASH manifest URL is : ${dashManifest}`);
    console.log("Open the following URL to playback the live stream from the LiveOutput in the Azure Media Player");
    console.log(`https://ampdemo.azureedge.net/?url=${dashManifest}&heuristicprofile=lowlatency`);
    console.log();
}

Watch the event

To watch the event, copy the streaming URL that you got when you ran the code to create a streaming locator. You can use a media player of your choice. Azure Media Player is available to test your stream at the Media Player demo site.

A live event automatically converts events to on-demand content when it's stopped. Even after you stop and delete the event, users can stream your archived content as a video on demand for as long as you don't delete the asset. An asset can't be deleted if an event is using it; the event must be deleted first.

Clean up resources in your Media Services account

If you run the application all the way through, it will automatically clean up all of the resources used in the cleanUpResources function. Make sure that the application or debugger runs all the way to completion, or you might leak resources and end up with running live events in your account. Double check in the Azure portal to confirm that all resources are cleaned up in your Media Services account.

In the sample code, refer to the cleanUpResources method for details.
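At a high level, the cleanup performs calls along these lines. This is a simplified sketch, under the assumption that stopping with removeOutputsOnStop removes any remaining live outputs, and not the exact implementation of cleanUpResources.

// Stop the live event; removeOutputsOnStop deletes any remaining live outputs for you.
await mediaServicesClient.liveEvents.beginStopAndWait(
    resourceGroup,
    accountName,
    liveEventName,
    { removeOutputsOnStop: true },
    { updateIntervalInMs: longRunningOperationUpdateIntervalMs }
);

// Delete the live event itself so it no longer incurs charges. The archived asset and its locators remain until you delete them.
await mediaServicesClient.liveEvents.beginDeleteAndWait(
    resourceGroup,
    accountName,
    liveEventName,
    { updateIntervalInMs: longRunningOperationUpdateIntervalMs }
);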

Important

Leaving the live event running incurs billing costs. Be aware that if the project or program stops responding or is closed out for any reason, it might leave the live event running in a billing state.

Ask questions, give feedback, get updates

Check out the Azure Media Services community article to see different ways you can ask questions, give feedback, and get updates about Media Services.

More developer documentation for Node.js on Azure

Get help and support

You can contact Media Services with questions or follow our updates by one of the following methods: