Screen capture to video

This article describes how to encode frames captured from the screen with the Windows.Graphics.Capture APIs into a video file. For information on capturing still images from the screen, see Screen capture. For a simple end-to-end sample app that uses the concepts and techniques shown in this article, see SimpleRecorder.

Overview of the video capture process

This article provides a walkthrough of an example app that records the contents of a window to a video file. While it may seem like there is a lot of code required to implement this scenario, the high-level structure of a screen recorder app is fairly simple. The screen capture process uses three primary UWP features:

  • The Windows.Graphics.Capture APIs do the work of actually grabbing the pixels from the screen.
  • The MediaStreamSource class generates a video stream from the captured frames.
  • The MediaTranscoder class encodes the video stream into a video file.

The example code shown in this article can be categorized into a few different tasks:

  • Initialization - This includes configuring the UWP classes described above, initializing the graphics device interfaces, picking a window to capture, and setting up the encoding parameters such as resolution and frame rate.
  • Event handlers and threading - The primary driver of the main capture loop is the MediaStreamSource which requests frames periodically through the SampleRequested event. This example uses events to coordinate the requests for new frames between the different components of the example. Synchronization is important to allow frames to be captured and encoded simultaneously.
  • Copying frames - Frames are copied from the capture frame buffer into a separate Direct3D surface that can be passed to the MediaStreamSource so that the resource isn't overwritten while being encoded. Direct3D APIs are used to perform this copy operation quickly.

About the Direct3D APIs

As stated above, the copying of each captured frame is probably the most complex part of the implementation shown in this article. At a low level, this operation is done using Direct3D. For this example, we use the SharpDX library to perform the Direct3D operations from C#. This library is no longer officially supported, but it was chosen because its performance for low-level copy operations is well-suited to this scenario. We have tried to keep the Direct3D operations as discrete as possible to make it easier for you to substitute your own code or other libraries for these tasks.

Setting up your project

The example code in this walkthrough was created using the Blank App (Universal Windows) C# project template in Visual Studio 2019. In order to use the Windows.Graphics.Capture APIs in your app, you must include the Graphics Capture capability in the Package.appxmanifest file for your project. This example saves generated video files to the Videos Library on the device. To access this folder you must include the Videos Library capability.

To install the SharpDX NuGet package, in Visual Studio select Manage NuGet Packages. On the Browse tab, search for the "SharpDX.Direct3D11" package and click Install.

Note that in order to reduce the size of the code listings in this article, the code in the walkthrough below omits explicit namespace references and the declaration of MainPage class member variables which are named with a leading underscore, "_".
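
For reference, the following is a minimal sketch of those omitted member declarations, with types inferred from the code listings in this article. This reconstruction is an assumption for convenience, not part of the original sample, and it presumes using directives for namespaces such as Windows.Graphics.Capture, Windows.Graphics.DirectX.Direct3D11, Windows.Media.Core, Windows.Media.MediaProperties, Windows.Media.Transcoding, and System.Threading.

// MainPage member variables inferred from the listings below (assumed reconstruction).
private IDirect3DDevice _device;
private SharpDX.Direct3D11.Device _sharpDxD3dDevice;
private GraphicsCaptureItem _captureItem;
private SharpDX.Direct3D11.Texture2D _composeTexture;
private SharpDX.Direct3D11.RenderTargetView _composeRenderTargetView;
private MediaEncodingProfile _encodingProfile;
private VideoStreamDescriptor _videoDescriptor;
private MediaStreamSource _mediaStreamSource;
private MediaTranscoder _transcoder;
private SharpDX.Direct3D11.Multithread _multithread;
private ManualResetEvent _frameEvent;
private ManualResetEvent _closedEvent;
private ManualResetEvent[] _events;
private Direct3D11CaptureFramePool _framePool;
private GraphicsCaptureSession _session;
private Direct3D11CaptureFrame _currentFrame;
private bool _isRecording;
private bool _closed;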

Setup for encoding

The SetupEncoding method described in this section initializes some of the main objects that will be used to capture and encode video frames and sets up the encoding parameters for captured video. This method could be called programmatically or in response to a user interaction like a button click. The code listing for SetupEncoding is shown below after the descriptions of the initialization steps.

  • Check for capture support. Before beginning the capture process, you need to call GraphicsCaptureSession.IsSupported to make sure that the screen capture feature is supported on the current device.

  • Initialize Direct3D interfaces. This sample uses Direct3D to copy the pixels captured from the screen into a texture that is encoded as a video frame. The helper methods used to initialize the Direct3D interfaces, CreateD3DDevice and CreateSharpDXDevice, are shown later in this article.

  • Initialize a GraphicsCaptureItem. A GraphicsCaptureItem represents an item on the screen that is going to be captured, either a window or the entire screen. Allow the user to pick an item to capture by creating a GraphicsCapturePicker and calling PickSingleItemAsync.

  • Create a composition texture. Create a texture resource and an associated render target view that will be used to copy each video frame. This texture can't be created until the GraphicsCaptureItem has been created and we know its dimensions. See the description of WaitForNewFrame later in this article to see how this composition texture is used. The helper method for creating this texture is also shown later in this article.

  • Create a MediaEncodingProfile and VideoStreamDescriptor. An instance of the MediaStreamSource class will take images captured from the screen and encode them into a video stream. Then, the video stream will be transcoded into a video file by the MediaTranscoder class. A VideoStreamDescriptor provides encoding parameters, such as resolution and frame rate, for the MediaStreamSource. The video file encoding parameters for the MediaTranscoder are specified with a MediaEncodingProfile. Note that the size used for video encoding doesn't have to be the same as the size of the window being captured, but to keep this example simple, the encoding settings are hard-coded to use the capture item's actual dimensions.

  • Create the MediaStreamSource and MediaTranscoder objects. As mentioned above, the MediaStreamSource object encodes individual frames into a video stream. Call the constructor for this class, passing in the MediaEncodingProfile created in the previous step. Set the buffer time to zero and register handlers for the Starting and SampleRequested events, which will be shown later in this article. Next, construct a new instance of the MediaTranscoder class and enable hardware acceleration.

  • Create an output file. The final step in this method is to create a file to which the video will be transcoded. For this example, we will just create a uniquely named file in the Videos Library folder on the device. Note that in order to access this folder, your app must specify the "Videos Library" capability in the app manifest. Once the file has been created, open it for read and write, and pass the resulting stream into the EncodeAsync method, which is shown next.

private async Task SetupEncoding()
{
    if (!GraphicsCaptureSession.IsSupported())
    {
        // Show message to user that screen capture is unsupported
        return;
    }

    // Create the D3D device and SharpDX device
    if (_device == null)
    {
        _device = Direct3D11Helpers.CreateD3DDevice();
    }
    if (_sharpDxD3dDevice == null)
    {
        _sharpDxD3dDevice = Direct3D11Helpers.CreateSharpDXDevice(_device);
    }

    try
    {
        // Let the user pick an item to capture
        var picker = new GraphicsCapturePicker();
        _captureItem = await picker.PickSingleItemAsync();
        if (_captureItem == null)
        {
            return;
        }

        // Initialize a blank texture and render target view for copying frames, using the same size as the capture item
        _composeTexture = Direct3D11Helpers.InitializeComposeTexture(_sharpDxD3dDevice, _captureItem.Size);
        _composeRenderTargetView = new SharpDX.Direct3D11.RenderTargetView(_sharpDxD3dDevice, _composeTexture);

        // This example encodes video using the item's actual size.
        var width = (uint)_captureItem.Size.Width; 
        var height = (uint)_captureItem.Size.Height;

        // Make sure the dimensions are even. Required by some encoders.
        width = (width % 2 == 0) ? width : width + 1;
        height = (height % 2 == 0) ? height : height + 1;

        var temp = MediaEncodingProfile.CreateMp4(VideoEncodingQuality.HD1080p);
        var bitrate = temp.Video.Bitrate;
        uint framerate = 30;

        _encodingProfile = new MediaEncodingProfile();
        _encodingProfile.Container.Subtype = "MPEG4";
        _encodingProfile.Video.Subtype = "H264";
        _encodingProfile.Video.Width = width;
        _encodingProfile.Video.Height = height;
        _encodingProfile.Video.Bitrate = bitrate;
        _encodingProfile.Video.FrameRate.Numerator = framerate;
        _encodingProfile.Video.FrameRate.Denominator = 1;
        _encodingProfile.Video.PixelAspectRatio.Numerator = 1;
        _encodingProfile.Video.PixelAspectRatio.Denominator = 1;

        var videoProperties = VideoEncodingProperties.CreateUncompressed(MediaEncodingSubtypes.Bgra8, width, height);
        _videoDescriptor = new VideoStreamDescriptor(videoProperties);

        // Create our MediaStreamSource
        _mediaStreamSource = new MediaStreamSource(_videoDescriptor);
        _mediaStreamSource.BufferTime = TimeSpan.FromSeconds(0);
        _mediaStreamSource.Starting += OnMediaStreamSourceStarting;
        _mediaStreamSource.SampleRequested += OnMediaStreamSourceSampleRequested;

        // Create our transcoder
        _transcoder = new MediaTranscoder();
        _transcoder.HardwareAccelerationEnabled = true;

        // Create a destination file - access to the Videos Library requires the "Videos Library" capability
        var folder = KnownFolders.VideosLibrary;
        var name = DateTime.Now.ToString("yyyyMMdd-HHmm-ss");
        var file = await folder.CreateFileAsync($"{name}.mp4");

        using (var stream = await file.OpenAsync(FileAccessMode.ReadWrite))
        {
            await EncodeAsync(stream);
        }
    }
    catch (Exception ex)
    {
        // Log the failure; a shipping app should surface the error to the user.
        Debug.WriteLine(ex.Message);
        return;
    }
}
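
As a usage sketch, you could call SetupEncoding from a button's Click handler. The RecordButton name below is an assumption for illustration, not part of the sample.

private async void RecordButton_Click(object sender, RoutedEventArgs e)
{
    // Kick off capture and encoding; this returns when setup fails or encoding finishes.
    await SetupEncoding();
}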

Start encoding

Now that the main objects have been initialized, the EncodeAsync method can be called to kick off the capture operation. This method first checks to make sure we aren't already recording, and if not, it calls the helper method StartCapture to begin capturing frames from the screen. This method is shown later in this article. Next, PrepareMediaStreamSourceTranscodeAsync is called to get the MediaTranscoder ready to transcode the video stream produced by the MediaStreamSource object to the output file stream, using the encoding profile we created in the previous section. Once the transcoder has been prepared, call TranscodeAsync to start transcoding. For more information on using the MediaTranscoder, see Transcode media files.

private async Task EncodeAsync(IRandomAccessStream stream)
{
    if (!_isRecording)
    {
        _isRecording = true;

        StartCapture();

        var transcode = await _transcoder.PrepareMediaStreamSourceTranscodeAsync(_mediaStreamSource, stream, _encodingProfile);

        await transcode.TranscodeAsync();
    }
}
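
TranscodeAsync returns an IAsyncActionWithProgress<double>, so instead of awaiting it directly you could attach a progress handler first. The following is a minimal sketch of that variation; note that for a live MediaStreamSource capture like this one, the reported percentage may not advance meaningfully because the stream's duration isn't known in advance.

// Alternative to awaiting TranscodeAsync directly: observe transcoding progress.
var transcodeOp = transcode.TranscodeAsync();
transcodeOp.Progress = (info, percent) => Debug.WriteLine($"Transcode progress: {percent}%");
await transcodeOp;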

Handle MediaStreamSource events

The MediaStreamSource object takes frames that we capture from the screen and transforms them into a video stream that can be saved to a file using the MediaTranscoder. We pass the frames to the MediaStreamSource via handlers for the object's events.

The SampleRequested event is raised when the MediaStreamSource is ready for a new video frame. After making sure we are currently recording, the helper method WaitForNewFrame is called to get a new frame captured from the screen. This method, shown later in this article, returns an IDirect3DSurface object containing the captured frame. For this example, we wrap the IDirect3DSurface in a helper class that also stores the system time at which the frame was captured. Both the frame and the system time are passed into the MediaStreamSample.CreateFromDirect3D11Surface factory method, and the resulting MediaStreamSample is assigned to the MediaStreamSourceSampleRequest.Sample property of the MediaStreamSourceSampleRequestedEventArgs. This is how the captured frame is provided to the MediaStreamSource.

private void OnMediaStreamSourceSampleRequested(MediaStreamSource sender, MediaStreamSourceSampleRequestedEventArgs args)
{
    if (_isRecording && !_closed)
    {
        try
        {
            using (var frame = WaitForNewFrame())
            {
                if (frame == null)
                {
                    args.Request.Sample = null;
                    Stop();
                    Cleanup();
                    return;
                }

                var timeStamp = frame.SystemRelativeTime;

                var sample = MediaStreamSample.CreateFromDirect3D11Surface(frame.Surface, timeStamp);
                args.Request.Sample = sample;
            }
        }
        catch (Exception e)
        {
            Debug.WriteLine(e.Message);
            Debug.WriteLine(e.StackTrace);
            Debug.WriteLine(e);
            args.Request.Sample = null;
            Stop();
            Cleanup();
        }
    }
    else
    {
        args.Request.Sample = null;
        Stop();
        Cleanup();
    }
}

In the handler for the Starting event, we also call WaitForNewFrame, but here we only pass the system time at which the frame was captured to the MediaStreamSourceStartingRequest.SetActualStartPosition method, which the MediaStreamSource uses to properly encode the timing of the subsequent frames.

private void OnMediaStreamSourceStarting(MediaStreamSource sender, MediaStreamSourceStartingEventArgs args)
{
    using (var frame = WaitForNewFrame())
    {
        args.Request.SetActualStartPosition(frame.SystemRelativeTime);
    }
}

Start capturing

The StartCapture method shown in this step is called from the EncodeAsync helper method shown in a previous step. First, this method initializes a set of event objects that are used to control the flow of the capture operation.

  • _multithread is a helper class wrapping the SharpDX library's Multithread object that will be used to make sure that no other threads access the SharpDX texture while it's being copied.
  • _frameEvent is used to signal that a new frame has been captured and can be passed to the MediaStreamSource.
  • _closedEvent signals that recording has stopped and that we shouldn't wait for any new frames.

The frame event and closed event are added to an array so we can wait for either one of them in the capture loop.

The rest of the StartCapture method sets up the Windows.Graphics.Capture APIs that will do the actual screen capturing. First, a handler is registered for the CaptureItem.Closed event. Next, a Direct3D11CaptureFramePool is created, which allows multiple captured frames to be buffered at a time. The CreateFreeThreaded method is used to create the frame pool so that the FrameArrived event is raised on the pool's own worker thread rather than on the app's main thread. Next, a handler is registered for the FrameArrived event. Finally, a GraphicsCaptureSession is created for the selected CaptureItem, and the capture of frames is initiated by calling StartCapture.

public void StartCapture()
{
    _multithread = _sharpDxD3dDevice.QueryInterface<SharpDX.Direct3D11.Multithread>();
    _multithread.SetMultithreadProtected(true);
    _frameEvent = new ManualResetEvent(false);
    _closedEvent = new ManualResetEvent(false);
    _events = new[] { _closedEvent, _frameEvent };

    _captureItem.Closed += OnClosed;
    _framePool = Direct3D11CaptureFramePool.CreateFreeThreaded(
        _device,
        DirectXPixelFormat.B8G8R8A8UIntNormalized,
        1,
        _captureItem.Size);
    _framePool.FrameArrived += OnFrameArrived;
    _session = _framePool.CreateCaptureSession(_captureItem);
    _session.StartCapture();
}

Handle graphics capture events

In the previous step we registered two handlers for graphics capture events and set up some events to help manage the flow of the capture loop.

The FrameArrived event is raised when the Direct3D11CaptureFramePool has a new captured frame available. In the handler for this event, call TryGetNextFrame on the sender to get the next captured frame. After the frame is retrieved, we set the _frameEvent so that our capture loop knows there is a new frame available.

private void OnFrameArrived(Direct3D11CaptureFramePool sender, object args)
{
    _currentFrame = sender.TryGetNextFrame();
    _frameEvent.Set();
}

In the Closed event handler, we signal the _closedEvent so that the capture loop will know when to stop.

private void OnClosed(GraphicsCaptureItem sender, object args)
{
    _closedEvent.Set();
}

Wait for new frames

The WaitForNewFrame helper method described in this section is where the heavy lifting of the capture loop occurs. Remember, this method is called from the OnMediaStreamSourceSampleRequested event handler whenever the MediaStreamSource is ready for a new frame to be added to the video stream. At a high level, this function simply copies each screen-captured video frame from one Direct3D surface to another so that it can be passed into the MediaStreamSource for encoding while a new frame is being captured. This example uses the SharpDX library to perform the actual copy operation.

Before waiting for a new frame, the method disposes of any previous frame stored in the class variable _currentFrame and resets the _frameEvent. Then the method waits for either the _frameEvent or the _closedEvent to be signaled. If the closed event is set, then the app calls a helper method to clean up the capture resources. This method is shown later in this article.

If the frame event is set, then we know that the FrameArrived event handler defined in the previous step has been called, and we begin the process of copying the captured frame data into a Direct3D 11 surface that will be passed to the MediaStreamSource.

This example uses a helper class, SurfaceWithInfo, which simply allows us to pass the video frame and the system time of the frame - both required by the MediaStreamSource - as a single object. The first step of the frame copy process is to instantiate this class and set the system time.

The next steps are the part of this example that relies specifically on the SharpDX library. The helper functions used here are defined at the end of this article. First we use the MultithreadLock to make sure no other threads access the video frame buffer while we are making the copy. Next, we call the helper method CreateSharpDXTexture2D to create a SharpDX Texture2D object from the video frame. This will be the source texture for the copy operation.

Next, we copy from the Texture2D object created in the previous step into the composition texture we created earlier in the process. This composition texture acts as a swap buffer so that the encoding process can operate on the pixels while the next frame is being captured. To perform the copy, we clear the render target view associated with the composition texture, we define the region within the texture we want to copy (the frame's content region, clamped to the dimensions of the capture surface), and then we call CopySubresourceRegion to actually copy the pixels to the composition texture.

We create a copy of the texture description to use when we create our target texture, but the description is modified, setting the BindFlags to include RenderTarget so that the new texture has write access. Setting the CpuAccessFlags to None allows the system to optimize the copy operation. The texture description is used to create a new texture resource, and the composition texture resource is copied into this new resource with a call to CopyResource. Finally, CreateDirect3DSurfaceFromSharpDXTexture is called to create the IDirect3DSurface object that is returned from this method.

public SurfaceWithInfo WaitForNewFrame()
{
    // Dispose of the previous frame before waiting for a new one.
    _currentFrame?.Dispose();
    _frameEvent.Reset();

    var signaledEvent = _events[WaitHandle.WaitAny(_events)];
    if (signaledEvent == _closedEvent)
    {
        Cleanup();
        return null;
    }

    var result = new SurfaceWithInfo();
    result.SystemRelativeTime = _currentFrame.SystemRelativeTime;
    using (var multithreadLock = new MultithreadLock(_multithread))
    using (var sourceTexture = Direct3D11Helpers.CreateSharpDXTexture2D(_currentFrame.Surface))
    {
        // Clear the composition texture before copying the new frame into it.
        _sharpDxD3dDevice.ImmediateContext.ClearRenderTargetView(_composeRenderTargetView, new SharpDX.Mathematics.Interop.RawColor4(0, 0, 0, 1));

        var width = Math.Clamp(_currentFrame.ContentSize.Width, 0, _currentFrame.Surface.Description.Width);
        var height = Math.Clamp(_currentFrame.ContentSize.Height, 0, _currentFrame.Surface.Description.Height);
        var region = new SharpDX.Direct3D11.ResourceRegion(0, 0, 0, width, height, 1);
        _sharpDxD3dDevice.ImmediateContext.CopySubresourceRegion(sourceTexture, 0, region, _composeTexture, 0);

        var description = sourceTexture.Description;
        description.Usage = SharpDX.Direct3D11.ResourceUsage.Default;
        description.BindFlags = SharpDX.Direct3D11.BindFlags.ShaderResource | SharpDX.Direct3D11.BindFlags.RenderTarget;
        description.CpuAccessFlags = SharpDX.Direct3D11.CpuAccessFlags.None;
        description.OptionFlags = SharpDX.Direct3D11.ResourceOptionFlags.None;

        using (var copyTexture = new SharpDX.Direct3D11.Texture2D(_sharpDxD3dDevice, description))
        {
            _sharpDxD3dDevice.ImmediateContext.CopyResource(_composeTexture, copyTexture);
            result.Surface = Direct3D11Helpers.CreateDirect3DSurfaceFromSharpDXTexture(copyTexture);
        }
    }

    return result;
}

Stop capture and clean up resources

The Stop method provides a way to stop the capture operation. Your app may call this programmatically or in response to a user interaction, like a button click. This method simply sets the _closedEvent. The WaitForNewFrame method defined in the previous steps looks for this event and, if set, shuts down the capture operation.

private void Stop()
{
    _closedEvent.Set();
}
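
Like SetupEncoding, Stop could be wired to a button's Click handler; the StopButton name below is an assumption for illustration. Once the _closedEvent is set, WaitForNewFrame returns null, the SampleRequested handler assigns a null sample to end the stream, and the TranscodeAsync call in EncodeAsync completes, finalizing the output file.

private void StopButton_Click(object sender, RoutedEventArgs e)
{
    // Signal the capture loop to shut down.
    Stop();
}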

The Cleanup method is used to properly dispose of the resources that were created during the copy operation. This includes:

  • The Direct3D11CaptureFramePool object used by the capture session
  • The GraphicsCaptureSession and GraphicsCaptureItem
  • The Direct3D and SharpDX devices
  • The SharpDX texture and render target view used in the copy operation.
  • The Direct3D11CaptureFrame used for storing the current frame.

private void Cleanup()
{
    _framePool?.Dispose();
    _session?.Dispose();
    if (_captureItem != null)
    {
        _captureItem.Closed -= OnClosed;
    }
    _captureItem = null;
    _device = null;
    _sharpDxD3dDevice = null;
    _composeTexture?.Dispose();
    _composeTexture = null;
    _composeRenderTargetView?.Dispose();
    _composeRenderTargetView = null;
    _currentFrame?.Dispose();
}

Helper wrapper classes

The following helper classes were defined to help with the example code in this article.

The MultithreadLock helper class wraps the SharpDX Multithread class to make sure that other threads don't access the texture resources while they are being copied.

class MultithreadLock : IDisposable
{
    public MultithreadLock(SharpDX.Direct3D11.Multithread multithread)
    {
        _multithread = multithread;
        _multithread?.Enter();
    }

    public void Dispose()
    {
        _multithread?.Leave();
        _multithread = null;
    }

    private SharpDX.Direct3D11.Multithread _multithread;
}

SurfaceWithInfo is used to associate an IDirect3DSurface with a SystemRelativeTime, representing a captured frame and the time it was captured, respectively.

public sealed class SurfaceWithInfo : IDisposable
{
    public IDirect3DSurface Surface { get; internal set; }
    public TimeSpan SystemRelativeTime { get; internal set; }

    public void Dispose()
    {
        Surface?.Dispose();
        Surface = null;
    }
}

Direct3D and SharpDX helper APIs

The following helper APIs are defined to abstract out the creation of Direct3D and SharpDX resources. A detailed explanation of these technologies is outside the scope of this article, but the code is provided here to allow you to implement the example code shown in the walkthrough.

[ComImport]
[Guid("A9B3D012-3DF2-4EE3-B8D1-8695F457D3C1")]
[InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
[ComVisible(true)]
interface IDirect3DDxgiInterfaceAccess
{
    IntPtr GetInterface([In] ref Guid iid);
};

public static class Direct3D11Helpers
{
    internal static Guid IInspectable = new Guid("AF86E2E0-B12D-4c6a-9C5A-D7AA65101E90");
    internal static Guid ID3D11Resource = new Guid("dc8e63f3-d12b-4952-b47b-5e45026a862d");
    internal static Guid IDXGIAdapter3 = new Guid("645967A4-1392-4310-A798-8053CE3E93FD");
    internal static Guid ID3D11Device = new Guid("db6f6ddb-ac77-4e88-8253-819df9bbf140");
    internal static Guid ID3D11Texture2D = new Guid("6f15aaf2-d208-4e89-9ab4-489535d34f9c");

    [DllImport(
        "d3d11.dll",
        EntryPoint = "CreateDirect3D11DeviceFromDXGIDevice",
        SetLastError = true,
        CharSet = CharSet.Unicode,
        ExactSpelling = true,
        CallingConvention = CallingConvention.StdCall
        )]
    internal static extern UInt32 CreateDirect3D11DeviceFromDXGIDevice(IntPtr dxgiDevice, out IntPtr graphicsDevice);

    [DllImport(
        "d3d11.dll",
        EntryPoint = "CreateDirect3D11SurfaceFromDXGISurface",
        SetLastError = true,
        CharSet = CharSet.Unicode,
        ExactSpelling = true,
        CallingConvention = CallingConvention.StdCall
        )]
    internal static extern UInt32 CreateDirect3D11SurfaceFromDXGISurface(IntPtr dxgiSurface, out IntPtr graphicsSurface);

    public static IDirect3DDevice CreateD3DDevice()
    {
        return CreateD3DDevice(false);
    }

    public static IDirect3DDevice CreateD3DDevice(bool useWARP)
    {
        var d3dDevice = new SharpDX.Direct3D11.Device(
            useWARP ? SharpDX.Direct3D.DriverType.Software : SharpDX.Direct3D.DriverType.Hardware,
            SharpDX.Direct3D11.DeviceCreationFlags.BgraSupport);
        IDirect3DDevice device = null;

        // Acquire the DXGI interface for the Direct3D device.
        using (var dxgiDevice = d3dDevice.QueryInterface<SharpDX.DXGI.Device3>())
        {
            // Wrap the native device using a WinRT interop object.
            uint hr = CreateDirect3D11DeviceFromDXGIDevice(dxgiDevice.NativePointer, out IntPtr pUnknown);

            if (hr == 0)
            {
                device = Marshal.GetObjectForIUnknown(pUnknown) as IDirect3DDevice;
                Marshal.Release(pUnknown);
            }
        }

        return device;
    }

    internal static IDirect3DSurface CreateDirect3DSurfaceFromSharpDXTexture(SharpDX.Direct3D11.Texture2D texture)
    {
        IDirect3DSurface surface = null;

        // Acquire the DXGI interface for the Direct3D surface.
        using (var dxgiSurface = texture.QueryInterface<SharpDX.DXGI.Surface>())
        {
            // Wrap the native device using a WinRT interop object.
            uint hr = CreateDirect3D11SurfaceFromDXGISurface(dxgiSurface.NativePointer, out IntPtr pUnknown);

            if (hr == 0)
            {
                surface = Marshal.GetObjectForIUnknown(pUnknown) as IDirect3DSurface;
                Marshal.Release(pUnknown);
            }
        }

        return surface;
    }

    internal static SharpDX.Direct3D11.Device CreateSharpDXDevice(IDirect3DDevice device)
    {
        var access = (IDirect3DDxgiInterfaceAccess)device;
        var d3dPointer = access.GetInterface(ID3D11Device);
        var d3dDevice = new SharpDX.Direct3D11.Device(d3dPointer);
        return d3dDevice;
    }

    internal static SharpDX.Direct3D11.Texture2D CreateSharpDXTexture2D(IDirect3DSurface surface)
    {
        var access = (IDirect3DDxgiInterfaceAccess)surface;
        var d3dPointer = access.GetInterface(ID3D11Texture2D);
        var d3dSurface = new SharpDX.Direct3D11.Texture2D(d3dPointer);
        return d3dSurface;
    }

    public static SharpDX.Direct3D11.Texture2D InitializeComposeTexture(
        SharpDX.Direct3D11.Device sharpDxD3dDevice,
        SizeInt32 size)
    {
        var description = new SharpDX.Direct3D11.Texture2DDescription
        {
            Width = size.Width,
            Height = size.Height,
            MipLevels = 1,
            ArraySize = 1,
            Format = SharpDX.DXGI.Format.B8G8R8A8_UNorm,
            SampleDescription = new SharpDX.DXGI.SampleDescription()
            {
                Count = 1,
                Quality = 0
            },
            Usage = SharpDX.Direct3D11.ResourceUsage.Default,
            BindFlags = SharpDX.Direct3D11.BindFlags.ShaderResource | SharpDX.Direct3D11.BindFlags.RenderTarget,
            CpuAccessFlags = SharpDX.Direct3D11.CpuAccessFlags.None,
            OptionFlags = SharpDX.Direct3D11.ResourceOptionFlags.None
        };
        var composeTexture = new SharpDX.Direct3D11.Texture2D(sharpDxD3dDevice, description);

        using (var renderTargetView = new SharpDX.Direct3D11.RenderTargetView(sharpDxD3dDevice, composeTexture))
        {
            sharpDxD3dDevice.ImmediateContext.ClearRenderTargetView(renderTargetView, new SharpDX.Mathematics.Interop.RawColor4(0, 0, 0, 1));
        }

        return composeTexture;
    }
}
