How to generate thumbnails using Media Encoder Standard with .NET

This topic shows how to use the Media Services .NET SDK to encode an asset and generate thumbnails using Media Encoder Standard. It defines XML and JSON thumbnail presets that you can use to create a task that encodes and generates thumbnails at the same time, and describes the elements that these presets use.

Make sure to review the Considerations section.


The following code example uses Media Services .NET SDK to perform the following tasks:

  • Create an encoding job.
  • Get a reference to the Media Encoder Standard encoder.
  • Load the preset XML or JSON that contains the encoding settings, as well as the information needed to generate thumbnails. You can save this XML or JSON in a file and use the following code to load the file.

          // Load the XML (or JSON) from the local file.
          string configuration = File.ReadAllText(fileName);  
  • Add a single encoding task to the job.
  • Specify the input asset to be encoded.
  • Create an output asset that will contain the encoded asset.
  • Add an event handler to check the job progress.
  • Submit the job.

      using System;
      using System.Collections.Generic;
      using System.Configuration;
      using System.IO;
      using System.Linq;
      using System.Net;
      using System.Security.Cryptography;
      using System.Text;
      using System.Threading.Tasks;
      using Microsoft.WindowsAzure.MediaServices.Client;
      using Newtonsoft.Json.Linq;
      using System.Threading;
      using Microsoft.WindowsAzure.MediaServices.Client.ContentKeyAuthorization;
      using Microsoft.WindowsAzure.MediaServices.Client.DynamicEncryption;
      using System.Web;
      using System.Globalization;
      namespace EncodeAndGenerateThumbnails
      {
          class Program
          {
              // Read values from the App.config file.
              private static readonly string _mediaServicesAccountName =
                  ConfigurationManager.AppSettings["MediaServicesAccountName"];
              private static readonly string _mediaServicesAccountKey =
                  ConfigurationManager.AppSettings["MediaServicesAccountKey"];

              // Field for service context.
              private static CloudMediaContext _context = null;
              private static MediaServicesCredentials _cachedCredentials = null;

              private static readonly string _mediaFiles =
                  Path.GetFullPath(@"..\..\Media");
              private static readonly string _singleMP4File =
                  Path.Combine(_mediaFiles, @"BigBuckBunny.mp4");

              static void Main(string[] args)
              {
                  // Create and cache the Media Services credentials in a static class variable.
                  _cachedCredentials = new MediaServicesCredentials(
                      _mediaServicesAccountName,
                      _mediaServicesAccountKey);

                  // Use the cached credentials to create the CloudMediaContext.
                  _context = new CloudMediaContext(_cachedCredentials);

                  // Get an uploaded asset.
                  var asset = _context.Assets.FirstOrDefault();

                  // Encode and generate the thumbnails.
                  EncodeToAdaptiveBitrateMP4Set(asset);
              }

              static public IAsset EncodeToAdaptiveBitrateMP4Set(IAsset asset)
              {
                  // Declare a new job.
                  IJob job = _context.Jobs.Create("Media Encoder Standard Job");

                  // Get a media processor reference, and pass to it the name of the
                  // processor to use for the specific task.
                  IMediaProcessor processor = GetLatestMediaProcessorByName("Media Encoder Standard");

                  // Load the XML (or JSON) from the local file.
                  string configuration = File.ReadAllText("ThumbnailPreset_JSON.json");

                  // Create a task.
                  ITask task = job.Tasks.AddNew("Media Encoder Standard encoding task",
                      processor,
                      configuration,
                      TaskOptions.None);

                  // Specify the input asset to be encoded.
                  task.InputAssets.Add(asset);

                  // Add an output asset to contain the results of the job.
                  // This output is specified as AssetCreationOptions.None, which
                  // means the output asset is not encrypted.
                  task.OutputAssets.AddNew("Output asset",
                      AssetCreationOptions.None);

                  // Add an event handler to check the job progress, then submit
                  // the job and wait for it to complete.
                  job.StateChanged += new EventHandler<JobStateChangedEventArgs>(JobStateChanged);
                  job.Submit();
                  job.GetExecutionProgressTask(CancellationToken.None).Wait();

                  return job.OutputMediaAssets[0];
              }

              private static void JobStateChanged(object sender, JobStateChangedEventArgs e)
              {
                  Console.WriteLine("Job state changed event:");
                  Console.WriteLine("  Previous state: " + e.PreviousState);
                  Console.WriteLine("  Current state: " + e.CurrentState);

                  switch (e.CurrentState)
                  {
                      case JobState.Finished:
                          Console.WriteLine("Job is finished. Please wait while local tasks or downloads complete...");
                          break;
                      case JobState.Canceling:
                      case JobState.Queued:
                      case JobState.Scheduled:
                      case JobState.Processing:
                          Console.WriteLine("Please wait...\n");
                          break;
                      case JobState.Canceled:
                      case JobState.Error:
                          // Cast sender as a job.
                          IJob job = (IJob)sender;
                          // Display or log error details as needed.
                          break;
                      default:
                          break;
                  }
              }

              private static IMediaProcessor GetLatestMediaProcessorByName(string mediaProcessorName)
              {
                  var processor = _context.MediaProcessors.Where(p => p.Name == mediaProcessorName).
                      ToList().OrderBy(p => new Version(p.Version)).LastOrDefault();

                  if (processor == null)
                      throw new ArgumentException(
                          string.Format("Unknown media processor: {0}", mediaProcessorName));

                  return processor;
              }
          }
      }

Thumbnail JSON preset

For information about schema, see this topic.

  {
    "Version": 1.0,
    "Codecs": [
      {
        "KeyFrameInterval": "00:00:02",
        "SceneChangeDetection": "true",
        "H264Layers": [
          {
            "Profile": "Auto",
            "Level": "auto",
            "Bitrate": 4500,
            "MaxBitrate": 4500,
            "BufferWindow": "00:00:05",
            "Width": 1280,
            "Height": 720,
            "ReferenceFrames": 3,
            "EntropyMode": "Cabac",
            "AdaptiveBFrame": true,
            "Type": "H264Layer",
            "FrameRate": "0/1"
          }
        ],
        "Type": "H264Video"
      },
      {
        "JpgLayers": [
          {
            "Quality": 90,
            "Type": "JpgLayer",
            "Width": 640,
            "Height": 360
          }
        ],
        "Start": "{Best}",
        "Type": "JpgImage"
      },
      {
        "PngLayers": [
          {
            "Type": "PngLayer",
            "Width": 640,
            "Height": 360
          }
        ],
        "Start": "00:00:01",
        "Step": "00:00:10",
        "Range": "00:00:58",
        "Type": "PngImage"
      },
      {
        "BmpLayers": [
          {
            "Type": "BmpLayer",
            "Width": 640,
            "Height": 360
          }
        ],
        "Start": "10%",
        "Step": "10%",
        "Range": "90%",
        "Type": "BmpImage"
      },
      {
        "Channels": 2,
        "SamplingRate": 48000,
        "Bitrate": 128,
        "Type": "AACAudio"
      }
    ],
    "Outputs": [
      {
        "FileName": "{Basename}_{Index}{Extension}",
        "Format": {
          "Type": "JpgFormat"
        }
      },
      {
        "FileName": "{Basename}_{Index}{Extension}",
        "Format": {
          "Type": "PngFormat"
        }
      },
      {
        "FileName": "{Basename}_{Index}{Extension}",
        "Format": {
          "Type": "BmpFormat"
        }
      },
      {
        "FileName": "{Basename}_{Width}x{Height}_{VideoBitrate}.mp4",
        "Format": {
          "Type": "MP4Format"
        }
      }
    ]
  }

Thumbnail XML preset

For information about schema, see this topic.

<?xml version="1.0" encoding="utf-16"?>
<Preset xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" Version="1.0" xmlns="http://www.windowsazure.com/media/encoding/Preset/2014/01">
  <Encoding>
    <!-- The H264Video and AACAudio codec settings (equivalent to those in the JSON preset) go here. -->
    <JpgImage Start="{Best}">
      <JpgLayers>
        <JpgLayer>
          <Quality>90</Quality>
          <Width>640</Width>
          <Height>360</Height>
        </JpgLayer>
      </JpgLayers>
    </JpgImage>
    <BmpImage Start="10%" Step="10%" Range="90%">
      <BmpLayers>
        <BmpLayer>
          <Width>640</Width>
          <Height>360</Height>
        </BmpLayer>
      </BmpLayers>
    </BmpImage>
    <PngImage Start="00:00:01" Step="00:00:10" Range="00:00:58">
      <PngLayers>
        <PngLayer>
          <Width>640</Width>
          <Height>360</Height>
        </PngLayer>
      </PngLayers>
    </PngImage>
  </Encoding>
  <Outputs>
    <Output FileName="{Basename}_{Width}x{Height}_{VideoBitrate}.mp4">
      <MP4Format />
    </Output>
    <Output FileName="{Basename}_{Index}{Extension}">
      <JpgFormat />
    </Output>
    <Output FileName="{Basename}_{Index}{Extension}">
      <BmpFormat />
    </Output>
    <Output FileName="{Basename}_{Index}{Extension}">
      <PngFormat />
    </Output>
  </Outputs>
</Preset>


Considerations

The following considerations apply:

  • The use of explicit timestamps for Start/Step/Range assumes that the input source is at least 1 minute long.
  • Jpg/Png/BmpImage elements have Start, Step, and Range string attributes, which can be interpreted as:

    • A frame number if they are non-negative integers, for example "Start": "120",
    • Relative to source duration if expressed as a %-suffixed value, for example "Start": "15%", or
    • A timestamp if expressed in HH:MM:SS… format, for example "Start": "00:01:00".

      You can mix and match these notations as you please.

      Additionally, Start supports a special macro, {Best}, which attempts to determine the first "interesting" frame of the content. Note that Step and Range are ignored when Start is set to {Best}.

    • The default is Start = {Best}.
  • The output format needs to be explicitly provided for each image format: JpgFormat, PngFormat, or BmpFormat. When present, MES matches JpgVideo to JpgFormat, and so on. The output format introduces a new, image-codec-specific macro, {Index}, which needs to be present (once and only once) in the file name for image output formats.
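As an illustration of mixing these notations, the following hypothetical PngImage entry (not part of the preset above; the values are only examples) uses a frame number for Start, a percentage for Step, and a timestamp for Range:

```json
{
  "PngLayers": [
    {
      "Type": "PngLayer",
      "Width": 640,
      "Height": 360
    }
  ],
  "Start": "120",
  "Step": "15%",
  "Range": "00:00:30",
  "Type": "PngImage"
}
```

Here thumbnail generation would begin at frame 120, advance in steps of 15% of the source duration, and stop 30 seconds after the starting point.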

Media Services learning paths

You can view Azure Media Services learning paths in the Azure Media Services documentation.

Provide feedback

Use the User Voice forum to provide feedback and make suggestions on how to improve Azure Media Services.

See Also

Media Services Encoding Overview