Windows Machine Learning

Implement machine learning in your Windows apps using Windows ML, a high-performance, reliable API for deploying hardware-accelerated ML inference on Windows devices.

(Diagram: Windows ML)

Overview

Windows ML is built into the latest versions of Windows 10 and Windows Server 2019, and is also available as a NuGet package for down-level reach to Windows 8.1. Windows ML provides developers with the following advantages:

  • Ease of development: With Windows ML built into the latest versions of Windows 10 and Windows Server 2019, all you need is Visual Studio and a trained ONNX model, which can be distributed along with the Windows application. And if you need to deliver AI-based features to older versions of Windows (down to 8.1), Windows ML is available as a NuGet package that you can distribute with your application.

  • Broad hardware support: Windows ML allows you to write your ML workload once and automatically get highly optimized performance across different hardware vendors and silicon types, such as CPUs, GPUs, and AI accelerators. In addition, Windows ML guarantees consistent behavior across the range of supported hardware.

  • Low latency, real-time results: ML models can be evaluated using the processing capabilities of the Windows device, enabling local, real-time analysis of large data volumes, such as images and video. Results are available quickly and efficiently for use in performance-intensive workloads like game engines, or background tasks such as indexing for search.

  • Increased flexibility: The option to evaluate ML models locally on Windows devices lets you address a broader range of scenarios. For example, evaluation can run while the device is offline or has only intermittent connectivity. Local evaluation also covers scenarios where not all data can be sent to the cloud due to privacy or data-sovereignty concerns.

  • Reduced operational costs: Training ML models in the cloud and then evaluating them locally on Windows devices can deliver significant savings in bandwidth costs, with only minimal data sent to the cloud, as might be needed for continual improvement of your ML model. Moreover, when deploying the model in a server scenario, developers can leverage Windows ML hardware acceleration to speed up model serving, reducing the number of machines needed to handle the workload.

Get Started

The process of incorporating trained ML models into your application code is simple, requiring just a few straightforward steps:

  1. Get a trained Open Neural Network Exchange (ONNX) model, or convert models trained in other ML frameworks into ONNX with WinMLTools.

  2. Add the ONNX model file to your application, or make it available in some other way on the target device.

  3. Integrate the model into your application code, then build and deploy the application.

(Diagram: training environment, add model reference, application, Windows ML)

To start with the in-box Windows ML, go to Integrate a model into your app with Windows ML. You can also try out the sample apps in the Windows-Machine-Learning repo on GitHub.

If you want to use the NuGet package, see Tutorial: Port an Existing WinML App to NuGet Package.

For the latest Windows ML features and fixes, see our release notes.

In-box vs. NuGet WinML solutions

The table below compares the in-box Windows ML and the Windows ML NuGet package in terms of availability, distribution, servicing, and forward compatibility.

|                       | In-Box                                             | NuGet                                              |
|-----------------------|----------------------------------------------------|----------------------------------------------------|
| Availability          | Windows 10, version 1809 or higher                 | Windows 8.1 or higher                              |
| Distribution          | Built into the Windows SDK                         | Package and distribute as part of your application |
| Servicing             | Microsoft-driven (customers benefit automatically) | Developer-driven                                   |
| Forward compatibility | Automatically rolls forward with new features      | Developer needs to update the package manually     |

When your application runs with the in-box solution, the Windows ML runtime (which contains the ONNX Model Inference Engine) evaluates the trained model on the Windows 10 device (or Windows Server 2019 if targeting a server deployment). Windows ML handles the hardware abstraction, allowing developers to target a broad range of silicon, including CPUs, GPUs, and, in the future, AI accelerators. Windows ML hardware acceleration is built on top of DirectML, a high-performance, low-level API for running ML inference that is part of the DirectX family.

(Diagram: Windows ML layers)

(Diagram: Windows ML NuGet package)

For the NuGet package, these layers appear as the binaries shown in the diagram below. Windows ML is built into Microsoft.AI.MachineLearning.dll. It does not contain an embedded ONNX runtime; instead, the ONNX runtime is built into the file onnxruntime.dll. The version included in the WindowsAI NuGet package contains a DirectML execution provider (EP) embedded inside it. The final binary, DirectML.dll, is the actual platform code for DirectML and is built on top of the Direct3D and compute drivers that are built into Windows. All three of these binaries are included in the NuGet releases for you to distribute along with your applications.

Direct access to onnxruntime.dll also allows you to target cross-platform scenarios while getting the same hardware-agnostic acceleration that scales across all Windows devices.

Other machine learning solutions from Microsoft

Microsoft offers a variety of machine learning solutions to suit your needs. These solutions run in the cloud, on-premises, and locally on the device. See What are the machine learning product options from Microsoft? for more information.

Note

Use the following resources for help with Windows ML:

  • To ask or answer technical questions about Windows ML, use the windows-machine-learning tag on Stack Overflow.
  • To report a bug, file an issue on our GitHub.
  • To request a feature, head over to Windows Developer Feedback.