Windows Machine Learning

Implement machine learning in your Windows apps using Windows ML, a high-performance, reliable API for deploying hardware-accelerated ML inference on Windows devices.

Windows ML graphic

Overview

Windows ML is built into the latest versions of Windows 10 and Windows Server 2019, and is also available as a NuGet package for down-level reach to Windows 8.1. Windows ML provides developers with the following advantages:

  • Ease of development: With Windows ML built into the latest versions of Windows 10 and Windows Server 2019, all you need is Visual Studio and a trained ONNX model, which can be distributed along with the Windows application. If you need to deliver AI-based features to older versions of Windows (down to 8.1), Windows ML is also available as a NuGet package that you can distribute with your application.

  • Broad hardware support: Windows ML allows you to write your ML workload once and automatically get highly optimized performance across different hardware vendors and silicon types, such as CPUs, GPUs, and AI accelerators. In addition, Windows ML guarantees consistent behavior across the range of supported hardware.

  • Low latency, real-time results: ML models can be evaluated using the processing capabilities of the Windows device, enabling local, real-time analysis of large data volumes, such as images and video. Results are available quickly and efficiently for use in performance-intensive workloads like game engines, or background tasks such as indexing for search.

  • Increased flexibility: The option to evaluate ML models locally on Windows devices lets you address a broader range of scenarios. For example, evaluation can run while the device is offline, or when faced with intermittent connectivity. It also lets you address scenarios where not all data can be sent to the cloud due to privacy or data sovereignty concerns.

  • Reduced operational costs: Training ML models in the cloud and then evaluating them locally on Windows devices can deliver significant savings in bandwidth costs, with only minimal data sent to the cloud, as might be needed for continual improvement of your ML model. Moreover, when deploying an ML model in a server scenario, developers can leverage Windows ML hardware acceleration to speed up model serving, reducing the number of machines needed to handle the workload.

Get Started

The process of incorporating trained ML models into your application code is simple, requiring just a few straightforward steps:

  1. Get a trained Open Neural Network Exchange (ONNX) model, or convert models trained in other ML frameworks into ONNX with WinMLTools.

  2. Add the ONNX model file to your application, or make it available in some other way on the target device.

  3. Integrate the model into your application code, then build and deploy the application.

Training environment, add model references, your application, Windows ML

To start with the in-box Windows ML, go to Integrate a model into your app with Windows ML. You can also try out the sample apps in the Windows-Machine-Learning repo on GitHub.

If you want to use the NuGet package, see Tutorial: Port an Existing WinML App to NuGet Package.

For the latest Windows ML features and fixes, see our release notes.

In-box vs. NuGet WinML solutions

The table below highlights the availability, distribution, servicing, and forward-compatibility aspects of the in-box and NuGet package versions of Windows ML.

|                       | In-box                                              | NuGet                                                |
|-----------------------|-----------------------------------------------------|------------------------------------------------------|
| Availability          | Windows 10 version 1809 or higher                   | Windows 8.1 or higher                                |
| Distribution          | Built into the Windows SDK                          | Packaged and distributed as part of your application |
| Servicing             | Microsoft-driven (customers benefit automatically)  | Developer-driven                                     |
| Forward compatibility | Automatically rolls forward with new features       | Developer needs to update the package manually       |

When your application runs with the in-box solution, the Windows ML runtime (which contains the ONNX Model Inference Engine) evaluates the trained model on the Windows 10 device (or Windows Server 2019 if targeting a server deployment). Windows ML handles the hardware abstraction, allowing developers to target a broad range of silicon, including CPUs, GPUs, and, in the future, AI accelerators. Windows ML hardware acceleration is built on top of DirectML, a high-performance, low-level API for running ML inference that is part of the DirectX family.

Windows ML layers

Windows ML NuGet package

For the NuGet package, these layers appear as the binaries shown in the diagram below. Windows ML is built into Microsoft.AI.MachineLearning.dll. It does not contain an embedded ONNX runtime; instead, the ONNX runtime is built into its own file, onnxruntime.dll. The version included in the WindowsAI NuGet packages contains the DirectML execution provider (EP) embedded inside it. The final binary, DirectML.dll, is the actual platform code for DirectML, and is built on top of the Direct3D and compute drivers that are built into Windows. All three of these binaries are included in the NuGet releases for you to distribute along with your applications.
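As a sketch of what consuming the package looks like, a project file might reference it as below. The package ID is the one named above; the version number is a placeholder, so substitute the current release.

```xml
<!-- Illustrative .csproj fragment; the Version value is a placeholder. -->
<ItemGroup>
  <PackageReference Include="Microsoft.AI.MachineLearning" Version="1.*" />
</ItemGroup>
```

Restoring the package brings Microsoft.AI.MachineLearning.dll, onnxruntime.dll, and DirectML.dll into your application's output so they ship alongside it.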

Direct access to onnxruntime.dll also allows you to target cross-platform scenarios while getting the same hardware-agnostic acceleration that scales across all Windows devices.

Other machine learning solutions from Microsoft

Microsoft offers a variety of machine learning solutions to suit your needs. These solutions run in the cloud, on-premises, and locally on the device. For more information, see What are the machine learning product options from Microsoft?

Note

Use the following resources for help with Windows ML:

  • To ask or answer technical questions about Windows ML, use the windows-machine-learning tag on Stack Overflow.
  • To report a bug, file an issue on our GitHub.
  • To request a feature, head over to Windows Developer Feedback.