Windows Copilot Runtime Overview

The Microsoft Copilot stack on Windows democratizes the ability to experiment, build, and reach people with breakthrough AI experiences, putting the developer in control. Copilot in Windows appears as a sidebar docked on the Windows desktop and is designed to help users get things done in the operating system (like changing Windows settings). See Manage Copilot in Windows to learn more about configuring Copilot in Windows for commercial environments.

Windows Copilot Runtime introduces new, AI-powered ways of interacting with the operating system, such as Phi Silica, the Small Language Model (SLM) created by Microsoft Research. Phi Silica offers many of the same capabilities found in Large Language Models (LLMs), but is compact and efficient enough to run locally on Windows.

As a developer, you can:

  • Integrate your apps with AI-powered Windows experiences such as Recall and Studio Effects.
  • Leverage new APIs powered by on-device models through the Windows Copilot Library.
  • Discover Machine Learning (ML) models to fine-tune with your own customized data using AI Toolkit in Visual Studio Code.
  • Integrate your own ML models using frameworks like ONNX Runtime, PyTorch, or WebNN.
  • Access hardware acceleration for better performance and scale through DirectML (a sketch combining ONNX Runtime and DirectML follows this list).
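
For example, an app that ships its own ONNX model can run it on-device with ONNX Runtime and request the DirectML execution provider for hardware acceleration. The sketch below is a minimal illustration under stated assumptions, not a complete app: the model file name, the input/output tensor names, and the 1x3x224x224 shape ("model.onnx", "input", "output") are placeholders you would replace with your own model's details, and it assumes you build against an ONNX Runtime package that includes the DirectML provider.

    // Minimal ONNX Runtime inference with the DirectML execution provider.
    // Assumptions: "model.onnx" takes one float input named "input" with
    // shape {1, 3, 224, 224} and produces one output named "output".
    #include <onnxruntime_cxx_api.h>
    #include <dml_provider_factory.h>
    #include <array>
    #include <vector>
    #include <cstdio>

    int main() {
        Ort::Env env(ORT_LOGGING_LEVEL_WARNING, "my-app");
        Ort::SessionOptions options;

        // Ask ONNX Runtime to use DirectML (GPU/NPU) on device 0; operators
        // DirectML cannot handle fall back to the CPU provider.
        OrtSessionOptionsAppendExecutionProvider_DML(options, 0);

        Ort::Session session(env, L"model.onnx", options);

        // Build a dummy all-zeros input tensor just to show the call pattern.
        std::array<int64_t, 4> shape{1, 3, 224, 224};
        std::vector<float> input_data(1 * 3 * 224 * 224, 0.0f);
        Ort::MemoryInfo memory_info =
            Ort::MemoryInfo::CreateCpu(OrtArenaAllocator, OrtMemTypeDefault);
        Ort::Value input_tensor = Ort::Value::CreateTensor<float>(
            memory_info, input_data.data(), input_data.size(),
            shape.data(), shape.size());

        const char* input_names[] = {"input"};
        const char* output_names[] = {"output"};
        auto outputs = session.Run(Ort::RunOptions{nullptr},
                                   input_names, &input_tensor, 1,
                                   output_names, 1);

        printf("Output tensor count: %zu\n", outputs.size());
        return 0;
    }

Because the execution provider is chosen at session creation, the same code path serves GPUs, NPUs, and CPU-only machines; only the provider registration line changes.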

Windows Copilot Runtime and Windows Copilot Library

New innovations use AI to improve and redefine the Windows experience: some are built directly into Windows itself, while others are available for app developers to integrate into their own app features. These ways of integrating AI into a Windows app form the Windows Copilot Library, a set of ready-to-use AI-backed features and APIs, including:

  • Studio Effects: Enhances the camera and audio capabilities of Windows devices with AI-powered background effects, eye contact correction, automatic framing, voice focus, blur, lighting, and creative filters. These effects run on the device's NPU to maintain fast performance.
  • Recall: Makes past activities on your Windows device searchable, so that you can pick up where you left off, whether that was using an app, editing a document, or responding to an email.
  • Phi Silica: Enables your app to connect to the on-device Phi Silica model for natural language processing tasks (chat, math, code, reasoning) using the Windows App SDK (a usage sketch follows below).
  • Text Recognition: Optical Character Recognition (OCR) enables the extraction of text from images and documents. Imagine tasks like converting a PDF, paper document, or picture of a classroom whiteboard into editable digital text (a sketch of on-device OCR appears after this list).
  • Live Caption Translations: Helps everyone on Windows, including those who are deaf or hard of hearing, better understand audio by viewing captions of spoken content, even when the audio content is in a language different from the system's preferred language.
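
To make the Text Recognition scenario concrete, the sketch below uses the long-standing Windows.Media.Ocr WinRT API from C++/WinRT to extract text from an image file entirely on-device. It illustrates the OCR concept only; the Windows Copilot Library text recognition API ships through the Windows App SDK and its exact surface may differ. The image path is a placeholder.

    // On-device OCR with the established Windows.Media.Ocr WinRT API (C++/WinRT).
    // Illustrative only; the Windows Copilot Library Text Recognition API in the
    // Windows App SDK may expose a different surface.
    #include <winrt/Windows.Foundation.h>
    #include <winrt/Windows.Graphics.Imaging.h>
    #include <winrt/Windows.Media.Ocr.h>
    #include <winrt/Windows.Storage.h>
    #include <winrt/Windows.Storage.Streams.h>
    #include <iostream>

    using namespace winrt;
    using namespace Windows::Foundation;
    using namespace Windows::Graphics::Imaging;
    using namespace Windows::Media::Ocr;
    using namespace Windows::Storage;
    using namespace Windows::Storage::Streams;

    IAsyncOperation<hstring> RecognizeTextAsync(hstring imagePath)
    {
        // Load the image and decode it into a SoftwareBitmap.
        StorageFile file = co_await StorageFile::GetFileFromPathAsync(imagePath);
        IRandomAccessStream stream = co_await file.OpenAsync(FileAccessMode::Read);
        BitmapDecoder decoder = co_await BitmapDecoder::CreateAsync(stream);
        SoftwareBitmap bitmap = co_await decoder.GetSoftwareBitmapAsync();

        // Create an OCR engine for the user's profile languages and run it.
        OcrEngine engine = OcrEngine::TryCreateFromUserProfileLanguages();
        if (!engine)
        {
            co_return L"(no OCR language pack available)";
        }
        OcrResult result = co_await engine.RecognizeAsync(bitmap);
        co_return result.Text();
    }

    int main()
    {
        init_apartment();
        // Placeholder path; point this at a real image on disk.
        hstring text = RecognizeTextAsync(L"C:\\images\\whiteboard.png").get();
        std::wcout << text.c_str() << std::endl;
    }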

Developers will be able to access these APIs as part of the Windows App SDK.
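
As one example of what that looks like, the sketch below calls the on-device Phi Silica model from C++/WinRT. Treat it as a shape sketch under assumptions, not a reference: the namespace and members shown (Microsoft.Windows.AI.Generative, LanguageModel, IsAvailable, MakeAvailableAsync, CreateAsync, GenerateResponseAsync, Response) are based on early experimental Windows App SDK releases and may differ in the version you target, so check the current Windows App SDK documentation for the shipping API surface.

    // Sketch of calling the on-device Phi Silica model through the Windows App SDK.
    // ASSUMPTION: the namespace and members below reflect early experimental
    // releases and may differ in the release you target; verify against the
    // current Windows App SDK docs before relying on these names.
    #include <winrt/Microsoft.Windows.AI.Generative.h>
    #include <winrt/Windows.Foundation.h>
    #include <iostream>

    using namespace winrt;
    using namespace winrt::Microsoft::Windows::AI::Generative;

    Windows::Foundation::IAsyncAction SummarizeAsync(hstring text)
    {
        // Make sure the on-device model is present before first use.
        if (!LanguageModel::IsAvailable())
        {
            co_await LanguageModel::MakeAvailableAsync();
        }

        LanguageModel model = co_await LanguageModel::CreateAsync();

        // Run a natural-language task locally; no cloud call is involved.
        auto response = co_await model.GenerateResponseAsync(
            L"Summarize the following text in one sentence: " + text);

        std::wcout << response.Response().c_str() << std::endl;
    }

    int main()
    {
        init_apartment();
        SummarizeAsync(L"Windows Copilot Runtime brings on-device AI APIs to apps.").get();
    }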

In addition to the ready-to-use AI-backed APIs in the Windows Copilot Library, we have guidance on how to enhance your app using Machine Learning (ML) models. This covers topics like fine-tuning models with your own data using AI Toolkit in Visual Studio Code, running models with frameworks such as ONNX Runtime, PyTorch, or WebNN, and accelerating them with DirectML.

Leading with responsibility and examples

We have also created resources to support developers seeking to integrate AI into their Windows applications: a Samples Gallery, a guide for Responsible AI use, and high-level FAQs that help unpack some of the terminology and concepts.

Get started adding models to your Windows app