Screen capture to video

This article describes how to encode frames captured from the screen with the Windows.Graphics.Capture APIs to a video file. For information on capturing still images from the screen, see Screen capture. For a simple end-to-end sample app that utilizes the concepts and techniques shown in this article, see SimpleRecorder.

Overview of the video capture process

This article provides a walkthrough of an example app that records the contents of a window to a video file. While it may seem like there is a lot of code required to implement this scenario, the high-level structure of a screen recorder app is fairly simple. The screen capture process uses three primary UWP features:

  • The Windows.Graphics.Capture APIs, which capture frames from the screen
  • MediaStreamSource, which produces a video stream from the captured frames
  • MediaTranscoder, which encodes the video stream to a file

The example code shown in this article can be categorized into a few different tasks:

  • Initialization - This includes configuring the UWP classes described above, initializing the graphics device interfaces, picking a window to capture, and setting up encoding parameters such as resolution and frame rate.
  • Event handlers and threading - The primary driver of the main capture loop is the MediaStreamSource, which requests frames periodically through the SampleRequested event. This example uses events to coordinate the requests for new frames between the different components of the example. Synchronization is important because frames are captured and encoded simultaneously.
  • Copying frames - Frames are copied from the capture frame buffer into a separate Direct3D surface that can be passed to the MediaStreamSource so that the resource isn't overwritten while being encoded. Direct3D APIs are used to perform this copy operation quickly.

About the Direct3D APIs

As stated above, the copying of each captured frame is probably the most complex part of the implementation shown in this article. At a low level, this operation is done using Direct3D. For this example, we use the SharpDX library to perform the Direct3D operations from C#. This library is no longer officially supported, but it was chosen because its performance at low-level copy operations is well-suited for this scenario. We have tried to keep the Direct3D operations as discrete as possible to make it easier for you to substitute your own code or other libraries for these tasks.

Setting up your project

The example code in this walkthrough was created using the Blank App (Universal Windows) C# project template in Visual Studio 2019. In order to use the Windows.Graphics.Capture APIs in your app, you must include the Graphics Capture capability in the Package.appxmanifest file for your project. This example saves generated video files to the Videos Library on the device. To access this folder, you must also include the Videos Library capability.
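The two capabilities can be declared in the Capabilities element of your Package.appxmanifest. The following is a minimal sketch assuming the standard `uap` namespace prefix; the rest of the manifest, and any capabilities your app already declares, are omitted:

```xml
<Capabilities>
  <!-- Required for the Windows.Graphics.Capture APIs -->
  <Capability Name="graphicsCapture" />
  <!-- Required to save the output file to the Videos Library -->
  <uap:Capability Name="videosLibrary" />
</Capabilities>
```

You can also add both capabilities from the Capabilities tab of the manifest designer in Visual Studio.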

To install the SharpDX NuGet package, in Visual Studio select Manage NuGet Packages. In the Browse tab, search for the "SharpDX.Direct3D11" package and click Install.

Note that in order to reduce the size of the code listings in this article, the code in the walkthrough below omits explicit namespace references and the declaration of MainPage class member variables, which are named with a leading underscore, "_".

Setup for encoding

The SetupEncoding method described in this section initializes some of the main objects that will be used to capture and encode video frames, and sets up the encoding parameters for captured video. This method could be called programmatically or in response to a user interaction like a button click. The code listing for SetupEncoding is shown below, after the descriptions of the initialization steps.

  • Check for capture support. Before beginning the capture process, call GraphicsCaptureSession.IsSupported to make sure that the screen capture feature is supported on the current device.

  • Initialize Direct3D interfaces. This sample uses Direct3D to copy the pixels captured from the screen into a texture that is encoded as a video frame. The helper methods used to initialize the Direct3D interfaces, CreateD3DDevice and CreateSharpDXDevice, are shown later in this article.

  • Initialize a GraphicsCaptureItem. A GraphicsCaptureItem represents an item on the screen that is going to be captured, either a window or the entire screen. Allow the user to pick an item to capture by creating a GraphicsCapturePicker and calling PickSingleItemAsync.

  • Create a composition texture. Create a texture resource and an associated render target view that will be used to copy each video frame. This texture can't be created until the GraphicsCaptureItem has been created and we know its dimensions. See the description of WaitForNewFrame to see how this composition texture is used. The helper method for creating this texture is also shown later in this article.

  • Create a MediaEncodingProfile and VideoStreamDescriptor. An instance of the MediaStreamSource class will take images captured from the screen and encode them into a video stream. Then, the video stream will be transcoded into a video file by the MediaTranscoder class. A VideoStreamDescriptor provides encoding parameters, such as resolution and frame rate, for the MediaStreamSource. The video file encoding parameters for the MediaTranscoder are specified with a MediaEncodingProfile. Note that the size used for video encoding doesn't have to be the same as the size of the window being captured, but to keep this example simple, the encoding settings are hard-coded to use the capture item's actual dimensions.

  • Create the MediaStreamSource and MediaTranscoder objects. As mentioned above, the MediaStreamSource object encodes individual frames into a video stream. Call the constructor for this class, passing in the VideoStreamDescriptor created in the previous step. Set the buffer time to zero and register handlers for the Starting and SampleRequested events, which will be shown later in this article. Next, construct a new instance of the MediaTranscoder class and enable hardware acceleration.

  • Create an output file. The final step in this method is to create a file to which the video will be transcoded. For this example, we will just create a uniquely named file in the Videos Library folder on the device. Note that in order to access this folder, your app must specify the "Videos Library" capability in the app manifest. Once the file has been created, open it for read and write, and pass the resulting stream into the EncodeAsync method, which is shown next.

private async Task SetupEncoding()
{
    if (!GraphicsCaptureSession.IsSupported())
    {
        // Show message to user that screen capture is unsupported
        return;
    }

    // Create the D3D device and SharpDX device
    if (_device == null)
    {
        _device = Direct3D11Helpers.CreateD3DDevice();
    }
    if (_sharpDxD3dDevice == null)
    {
        _sharpDxD3dDevice = Direct3D11Helpers.CreateSharpDXDevice(_device);
    }
    


    try
    {
        // Let the user pick an item to capture
        var picker = new GraphicsCapturePicker();
        _captureItem = await picker.PickSingleItemAsync();
        if (_captureItem == null)
        {
            return;
        }

        // Initialize a blank texture and render target view for copying frames, using the same size as the capture item
        _composeTexture = Direct3D11Helpers.InitializeComposeTexture(_sharpDxD3dDevice, _captureItem.Size);
        _composeRenderTargetView = new SharpDX.Direct3D11.RenderTargetView(_sharpDxD3dDevice, _composeTexture);

        // This example encodes video using the item's actual size.
        var width = (uint)_captureItem.Size.Width; 
        var height = (uint)_captureItem.Size.Height;

        // Make sure the dimensions are even. Required by some encoders.
        width = (width % 2 == 0) ? width : width + 1;
        height = (height % 2 == 0) ? height : height + 1;


        var temp = MediaEncodingProfile.CreateMp4(VideoEncodingQuality.HD1080p);
        var bitrate = temp.Video.Bitrate;
        uint framerate = 30;

        _encodingProfile = new MediaEncodingProfile();
        _encodingProfile.Container.Subtype = "MPEG4";
        _encodingProfile.Video.Subtype = "H264";
        _encodingProfile.Video.Width = width;
        _encodingProfile.Video.Height = height;
        _encodingProfile.Video.Bitrate = bitrate;
        _encodingProfile.Video.FrameRate.Numerator = framerate;
        _encodingProfile.Video.FrameRate.Denominator = 1;
        _encodingProfile.Video.PixelAspectRatio.Numerator = 1;
        _encodingProfile.Video.PixelAspectRatio.Denominator = 1;

        var videoProperties = VideoEncodingProperties.CreateUncompressed(MediaEncodingSubtypes.Bgra8, width, height);
        _videoDescriptor = new VideoStreamDescriptor(videoProperties);

        // Create our MediaStreamSource
        _mediaStreamSource = new MediaStreamSource(_videoDescriptor);
        _mediaStreamSource.BufferTime = TimeSpan.FromSeconds(0);
        _mediaStreamSource.Starting += OnMediaStreamSourceStarting;
        _mediaStreamSource.SampleRequested += OnMediaStreamSourceSampleRequested;

        // Create our transcoder
        _transcoder = new MediaTranscoder();
        _transcoder.HardwareAccelerationEnabled = true;


        // Create a destination file - Access to the VideosLibrary requires the "Videos Library" capability
        var folder = KnownFolders.VideosLibrary;
        var name = DateTime.Now.ToString("yyyyMMdd-HHmm-ss");
        var file = await folder.CreateFileAsync($"{name}.mp4");
        
        using (var stream = await file.OpenAsync(FileAccessMode.ReadWrite))
        {
            await EncodeAsync(stream);
        }
    }
    catch (Exception ex)
    {
        Debug.WriteLine(ex.Message);
        return;
    }
}

Start encoding

Now that the main objects have been initialized, the EncodeAsync method is implemented to kick off the capture operation. This method first checks to make sure we aren't already recording, and if not, it calls the helper method StartCapture to begin capturing frames from the screen. This method is shown later in this article. Next, PrepareMediaStreamSourceTranscodeAsync is called to get the MediaTranscoder ready to transcode the video stream produced by the MediaStreamSource object to the output file stream, using the encoding profile we created in the previous section. Once the transcoder has been prepared, call TranscodeAsync to start transcoding. For more information on using the MediaTranscoder, see Transcode media files.


private async Task EncodeAsync(IRandomAccessStream stream)
{
    if (!_isRecording)
    {
        _isRecording = true;

        StartCapture();

        var transcode = await _transcoder.PrepareMediaStreamSourceTranscodeAsync(_mediaStreamSource, stream, _encodingProfile);

        await transcode.TranscodeAsync();
    }
}

Handle MediaStreamSource events

The MediaStreamSource object takes the frames that we capture from the screen and transforms them into a video stream that can be saved to a file using the MediaTranscoder. We pass the frames to the MediaStreamSource via handlers for the object's events.

The SampleRequested event is raised when the MediaStreamSource is ready for a new video frame. After making sure we are currently recording, the helper method WaitForNewFrame is called to get a new frame captured from the screen. This method, shown later in this article, returns an IDirect3DSurface object containing the captured frame. For this example, we wrap the IDirect3DSurface interface in a helper class that also stores the system time at which the frame was captured. Both the frame and the system time are passed into the MediaStreamSample.CreateFromDirect3D11Surface factory method, and the resulting MediaStreamSample is set to the MediaStreamSourceSampleRequest.Sample property of the MediaStreamSourceSampleRequestedEventArgs. This is how the captured frame is provided to the MediaStreamSource.

private void OnMediaStreamSourceSampleRequested(MediaStreamSource sender, MediaStreamSourceSampleRequestedEventArgs args)
{
    if (_isRecording && !_closed)
    {
        try
        {
            using (var frame = WaitForNewFrame())
            {
                if (frame == null)
                {
                    args.Request.Sample = null;
                    Dispose();
                    return;
                }

                var timeStamp = frame.SystemRelativeTime;

                var sample = MediaStreamSample.CreateFromDirect3D11Surface(frame.Surface, timeStamp);
                args.Request.Sample = sample;
            }
        }
        catch (Exception e)
        {
            Debug.WriteLine(e.Message);
            Debug.WriteLine(e.StackTrace);
            Debug.WriteLine(e);
            args.Request.Sample = null;
            Stop();
            Cleanup();
        }
    }
    else
    {
        args.Request.Sample = null;
        Stop();
        Cleanup();
    }
}

In the handler for the Starting event, we call WaitForNewFrame, but only pass the system time at which the frame was captured to the MediaStreamSourceStartingRequest.SetActualStartPosition method, which the MediaStreamSource uses to properly encode the timing of the subsequent frames.

private void OnMediaStreamSourceStarting(MediaStreamSource sender, MediaStreamSourceStartingEventArgs args)
{
    using (var frame = WaitForNewFrame())
    {
        args.Request.SetActualStartPosition(frame.SystemRelativeTime);
    }
}

Start capturing

The StartCapture method shown in this step is called from the EncodeAsync helper method shown in a previous step. First, this method initializes a set of event objects that are used to control the flow of the capture operation.

  • _multithread is a helper class wrapping the SharpDX library's Multithread object that will be used to make sure that no other threads access the SharpDX texture while it's being copied.
  • _frameEvent is used to signal that a new frame has been captured and can be passed to the MediaStreamSource.
  • _closedEvent signals that recording has stopped and that we shouldn't wait for any new frames.

The frame event and closed event are added to an array so we can wait for either one of them in the capture loop.
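This coordination pattern can be sketched in isolation with plain .NET threading primitives. The following is a minimal sketch, independent of the capture APIs; the variable names mirror the member variables used below, but the signaling here is simulated rather than driven by FrameArrived:

```csharp
using System.Threading;

var frameEvent = new ManualResetEvent(false);
var closedEvent = new ManualResetEvent(false);
// Order matters only for tie-breaking: WaitAny returns the lowest
// signaled index, so the closed event is checked first.
var events = new WaitHandle[] { closedEvent, frameEvent };

// Simulate a producer (e.g. the FrameArrived handler) signaling a frame.
frameEvent.Set();

// WaitAny blocks until one handle is signaled and returns its index.
var signaled = events[WaitHandle.WaitAny(events)];
if (signaled == closedEvent)
{
    // Recording has stopped; clean up instead of waiting for frames.
}
else
{
    // A new frame is ready; reset so we can wait for the next one.
    frameEvent.Reset();
}
```

Because ManualResetEvent stays signaled until explicitly reset, the consumer must call Reset before waiting again, which is exactly what WaitForNewFrame does below.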

The rest of the StartCapture method sets up the Windows.Graphics.Capture APIs that will do the actual screen capturing. First, a handler is registered for the CaptureItem.Closed event. Next, a Direct3D11CaptureFramePool is created, which allows multiple captured frames to be buffered at a time. The CreateFreeThreaded method is used to create the frame pool so that the FrameArrived event is called on the pool's own worker thread rather than on the app's main thread. Next, a handler is registered for the FrameArrived event. Finally, a GraphicsCaptureSession is created for the selected CaptureItem, and the capture of frames is initiated by calling StartCapture.

public void StartCapture()
{

    _multithread = _sharpDxD3dDevice.QueryInterface<SharpDX.Direct3D11.Multithread>();
    _multithread.SetMultithreadProtected(true);
    _frameEvent = new ManualResetEvent(false);
    _closedEvent = new ManualResetEvent(false);
    _events = new[] { _closedEvent, _frameEvent };

    _captureItem.Closed += OnClosed;
    _framePool = Direct3D11CaptureFramePool.CreateFreeThreaded(
        _device,
        DirectXPixelFormat.B8G8R8A8UIntNormalized,
        1,
        _captureItem.Size);
    _framePool.FrameArrived += OnFrameArrived;
    _session = _framePool.CreateCaptureSession(_captureItem);
    _session.StartCapture();
}

Handle graphics capture events

In the previous step we registered two handlers for graphics capture events and set up some events to help manage the flow of the capture loop.

The FrameArrived event is raised when the Direct3D11CaptureFramePool has a new captured frame available. In the handler for this event, call TryGetNextFrame on the sender to get the next captured frame. After the frame is retrieved, we set the _frameEvent so that our capture loop knows there is a new frame available.

private void OnFrameArrived(Direct3D11CaptureFramePool sender, object args)
{
    _currentFrame = sender.TryGetNextFrame();
    _frameEvent.Set();
}

In the Closed event handler, we signal the _closedEvent so that the capture loop will know when to stop.

private void OnClosed(GraphicsCaptureItem sender, object args)
{
    _closedEvent.Set();
}

Wait for new frames

The WaitForNewFrame helper method described in this section is where the heavy lifting of the capture loop occurs. Remember, this method is called from the OnMediaStreamSourceSampleRequested event handler whenever the MediaStreamSource is ready for a new frame to be added to the video stream. At a high level, this function simply copies each screen-captured video frame from one Direct3D surface to another so that it can be passed into the MediaStreamSource for encoding while a new frame is being captured. This example uses the SharpDX library to perform the actual copy operation.

Before waiting for a new frame, the method disposes of any previous frame stored in the class variable _currentFrame and resets the _frameEvent. Then the method waits for either the _frameEvent or the _closedEvent to be signaled. If the closed event is set, the app calls a helper method to clean up the capture resources. This method is shown later in this article.

If the frame event is set, then we know that the FrameArrived event handler defined in the previous step has been called, and we begin the process of copying the captured frame data into a Direct3D 11 surface that will be passed to the MediaStreamSource.

This example uses a helper class, SurfaceWithInfo, which simply allows us to pass the video frame and the system time of the frame - both required by the MediaStreamSource - as a single object. The first step of the frame copy process is to instantiate this class and set the system time.

The next steps are the part of this example that relies specifically on the SharpDX library. The helper functions used here are defined at the end of this article. First, we use the MultithreadLock to make sure no other threads access the video frame buffer while we are making the copy. Next, we call the helper method CreateSharpDXTexture2D to create a SharpDX Texture2D object from the video frame. This will be the source texture for the copy operation.

Next, we copy from the Texture2D object created in the previous step into the composition texture we created earlier in the process. This composition texture acts as a swap buffer so that the encoding process can operate on the pixels while the next frame is being captured. To perform the copy, we clear the render target view associated with the composition texture, define the region within the texture we want to copy - the entire texture in this case - and then call CopySubresourceRegion to actually copy the pixels to the composition texture.

We create a copy of the texture description to use when we create our target texture, but the description is modified, setting the BindFlags to RenderTarget so that the new texture has write access. Setting the CpuAccessFlags to None allows the system to optimize the copy operation. The texture description is used to create a new texture resource, and the composition texture resource is copied into this new resource with a call to CopyResource. Finally, CreateDirect3DSurfaceFromSharpDXTexture is called to create the IDirect3DSurface object that is returned from this method.

public SurfaceWithInfo WaitForNewFrame()
{
    // Let's get a fresh one.
    _currentFrame?.Dispose();
    _frameEvent.Reset();

    var signaledEvent = _events[WaitHandle.WaitAny(_events)];
    if (signaledEvent == _closedEvent)
    {
        Cleanup();
        return null;
    }

    var result = new SurfaceWithInfo();
    result.SystemRelativeTime = _currentFrame.SystemRelativeTime;
    using (var multithreadLock = new MultithreadLock(_multithread))
    using (var sourceTexture = Direct3D11Helpers.CreateSharpDXTexture2D(_currentFrame.Surface))
    {

        _sharpDxD3dDevice.ImmediateContext.ClearRenderTargetView(_composeRenderTargetView, new SharpDX.Mathematics.Interop.RawColor4(0, 0, 0, 1));

        var width = Math.Clamp(_currentFrame.ContentSize.Width, 0, _currentFrame.Surface.Description.Width);
        var height = Math.Clamp(_currentFrame.ContentSize.Height, 0, _currentFrame.Surface.Description.Height);
        var region = new SharpDX.Direct3D11.ResourceRegion(0, 0, 0, width, height, 1);
        _sharpDxD3dDevice.ImmediateContext.CopySubresourceRegion(sourceTexture, 0, region, _composeTexture, 0);

        var description = sourceTexture.Description;
        description.Usage = SharpDX.Direct3D11.ResourceUsage.Default;
        description.BindFlags = SharpDX.Direct3D11.BindFlags.ShaderResource | SharpDX.Direct3D11.BindFlags.RenderTarget;
        description.CpuAccessFlags = SharpDX.Direct3D11.CpuAccessFlags.None;
        description.OptionFlags = SharpDX.Direct3D11.ResourceOptionFlags.None;

        using (var copyTexture = new SharpDX.Direct3D11.Texture2D(_sharpDxD3dDevice, description))
        {
            _sharpDxD3dDevice.ImmediateContext.CopyResource(_composeTexture, copyTexture);
            result.Surface = Direct3D11Helpers.CreateDirect3DSurfaceFromSharpDXTexture(copyTexture);
        }
    }

    return result;
}

Stop capture and clean up resources

The Stop method provides a way to stop the capture operation. Your app may call this programmatically or in response to a user interaction, like a button click. This method simply sets the _closedEvent. The WaitForNewFrame method defined in the previous steps looks for this event and, if it is set, shuts down the capture operation.

private void Stop()
{
    _closedEvent.Set();
}

The Cleanup method is used to properly dispose of the resources that were created during the copy operation. This includes:

  • The Direct3D11CaptureFramePool object used by the capture session
  • The GraphicsCaptureSession and GraphicsCaptureItem
  • The Direct3D and SharpDX devices
  • The SharpDX texture and render target view used in the copy operation
  • The Direct3D11CaptureFrame used for storing the current frame
private void Cleanup()
{
    _framePool?.Dispose();
    _session?.Dispose();
    if (_captureItem != null)
    {
        _captureItem.Closed -= OnClosed;
    }
    _captureItem = null;
    _device = null;
    _sharpDxD3dDevice = null;
    _composeTexture?.Dispose();
    _composeTexture = null;
    _composeRenderTargetView?.Dispose();
    _composeRenderTargetView = null;
    _currentFrame?.Dispose();
}

Helper wrapper classes

The following helper classes were defined to help with the example code in this article.

The MultithreadLock helper class wraps the SharpDX Multithread class to make sure that other threads don't access the texture resources while they are being copied.

class MultithreadLock : IDisposable
{
    public MultithreadLock(SharpDX.Direct3D11.Multithread multithread)
    {
        _multithread = multithread;
        _multithread?.Enter();
    }

    public void Dispose()
    {
        _multithread?.Leave();
        _multithread = null;
    }

    private SharpDX.Direct3D11.Multithread _multithread;
}

SurfaceWithInfo is used to associate an IDirect3DSurface representing a captured frame with the SystemRelativeTime at which the frame was captured.

public sealed class SurfaceWithInfo : IDisposable
{
    public IDirect3DSurface Surface { get; internal set; }
    public TimeSpan SystemRelativeTime { get; internal set; }

    public void Dispose()
    {
        Surface?.Dispose();
        Surface = null;
    }
}

Direct3D and SharpDX helper APIs

The following helper APIs are defined to abstract out the creation of Direct3D and SharpDX resources. A detailed explanation of these technologies is outside the scope of this article, but the code is provided here to allow you to implement the example code shown in the walkthrough.

[ComImport]
[Guid("A9B3D012-3DF2-4EE3-B8D1-8695F457D3C1")]
[InterfaceType(ComInterfaceType.InterfaceIsIUnknown)]
[ComVisible(true)]
interface IDirect3DDxgiInterfaceAccess
{
    IntPtr GetInterface([In] ref Guid iid);
};

public static class Direct3D11Helpers
{
    internal static Guid IInspectable = new Guid("AF86E2E0-B12D-4c6a-9C5A-D7AA65101E90");
    internal static Guid ID3D11Resource = new Guid("dc8e63f3-d12b-4952-b47b-5e45026a862d");
    internal static Guid IDXGIAdapter3 = new Guid("645967A4-1392-4310-A798-8053CE3E93FD");
    internal static Guid ID3D11Device = new Guid("db6f6ddb-ac77-4e88-8253-819df9bbf140");
    internal static Guid ID3D11Texture2D = new Guid("6f15aaf2-d208-4e89-9ab4-489535d34f9c");

    [DllImport(
        "d3d11.dll",
        EntryPoint = "CreateDirect3D11DeviceFromDXGIDevice",
        SetLastError = true,
        CharSet = CharSet.Unicode,
        ExactSpelling = true,
        CallingConvention = CallingConvention.StdCall
        )]
    internal static extern UInt32 CreateDirect3D11DeviceFromDXGIDevice(IntPtr dxgiDevice, out IntPtr graphicsDevice);

    [DllImport(
        "d3d11.dll",
        EntryPoint = "CreateDirect3D11SurfaceFromDXGISurface",
        SetLastError = true,
        CharSet = CharSet.Unicode,
        ExactSpelling = true,
        CallingConvention = CallingConvention.StdCall
        )]
    internal static extern UInt32 CreateDirect3D11SurfaceFromDXGISurface(IntPtr dxgiSurface, out IntPtr graphicsSurface);

    public static IDirect3DDevice CreateD3DDevice()
    {
        return CreateD3DDevice(false);
    }

    public static IDirect3DDevice CreateD3DDevice(bool useWARP)
    {
        var d3dDevice = new SharpDX.Direct3D11.Device(
            useWARP ? SharpDX.Direct3D.DriverType.Software : SharpDX.Direct3D.DriverType.Hardware,
            SharpDX.Direct3D11.DeviceCreationFlags.BgraSupport);
        IDirect3DDevice device = null;

        // Acquire the DXGI interface for the Direct3D device.
        using (var dxgiDevice = d3dDevice.QueryInterface<SharpDX.DXGI.Device3>())
        {
            // Wrap the native device using a WinRT interop object.
            uint hr = CreateDirect3D11DeviceFromDXGIDevice(dxgiDevice.NativePointer, out IntPtr pUnknown);

            if (hr == 0)
            {
                device = Marshal.GetObjectForIUnknown(pUnknown) as IDirect3DDevice;
                Marshal.Release(pUnknown);
            }
        }

        return device;
    }


    internal static IDirect3DSurface CreateDirect3DSurfaceFromSharpDXTexture(SharpDX.Direct3D11.Texture2D texture)
    {
        IDirect3DSurface surface = null;

        // Acquire the DXGI interface for the Direct3D surface.
        using (var dxgiSurface = texture.QueryInterface<SharpDX.DXGI.Surface>())
        {
            // Wrap the native device using a WinRT interop object.
            uint hr = CreateDirect3D11SurfaceFromDXGISurface(dxgiSurface.NativePointer, out IntPtr pUnknown);

            if (hr == 0)
            {
                surface = Marshal.GetObjectForIUnknown(pUnknown) as IDirect3DSurface;
                Marshal.Release(pUnknown);
            }
        }

        return surface;
    }



    internal static SharpDX.Direct3D11.Device CreateSharpDXDevice(IDirect3DDevice device)
    {
        var access = (IDirect3DDxgiInterfaceAccess)device;
        var d3dPointer = access.GetInterface(ID3D11Device);
        var d3dDevice = new SharpDX.Direct3D11.Device(d3dPointer);
        return d3dDevice;
    }

    internal static SharpDX.Direct3D11.Texture2D CreateSharpDXTexture2D(IDirect3DSurface surface)
    {
        var access = (IDirect3DDxgiInterfaceAccess)surface;
        var d3dPointer = access.GetInterface(ID3D11Texture2D);
        var d3dSurface = new SharpDX.Direct3D11.Texture2D(d3dPointer);
        return d3dSurface;
    }


    public static SharpDX.Direct3D11.Texture2D InitializeComposeTexture(
        SharpDX.Direct3D11.Device sharpDxD3dDevice,
        SizeInt32 size)
    {
        var description = new SharpDX.Direct3D11.Texture2DDescription
        {
            Width = size.Width,
            Height = size.Height,
            MipLevels = 1,
            ArraySize = 1,
            Format = SharpDX.DXGI.Format.B8G8R8A8_UNorm,
            SampleDescription = new SharpDX.DXGI.SampleDescription()
            {
                Count = 1,
                Quality = 0
            },
            Usage = SharpDX.Direct3D11.ResourceUsage.Default,
            BindFlags = SharpDX.Direct3D11.BindFlags.ShaderResource | SharpDX.Direct3D11.BindFlags.RenderTarget,
            CpuAccessFlags = SharpDX.Direct3D11.CpuAccessFlags.None,
            OptionFlags = SharpDX.Direct3D11.ResourceOptionFlags.None
        };
        var composeTexture = new SharpDX.Direct3D11.Texture2D(sharpDxD3dDevice, description);
       

        using (var renderTargetView = new SharpDX.Direct3D11.RenderTargetView(sharpDxD3dDevice, composeTexture))
        {
            sharpDxD3dDevice.ImmediateContext.ClearRenderTargetView(renderTargetView, new SharpDX.Mathematics.Interop.RawColor4(0, 0, 0, 1));
        }

        return composeTexture;
    }
}

See also