Integrate a model into your app with Windows ML (Preview)

Note

Windows ML is a preview feature which may be substantially modified before it’s officially released. Microsoft makes no warranties, express or implied, with respect to the information provided here.

Windows ML's automatic code generation creates an interface that calls the Windows ML APIs for you, allowing you to easily interact with your model. Using the interface's generated wrapper classes, you'll follow the load, bind, and evaluate pattern to integrate your ML model into your app.

(Diagram: the load, bind, and evaluate pattern)

In this article, we'll use the MNIST model from Get Started to demonstrate how to load, bind, and evaluate a model in your app.

Add the model

First, add your ONNX model to your Visual Studio project's Assets folder. If you're building a UWP app with Visual Studio (version 15.7, Preview 1), Visual Studio will automatically generate the wrapper classes in a new file when you add the model. For other workflows, you'll need to use the mlgen tool to generate the wrapper classes.

Below are the Windows ML generated wrapper classes for the MNIST model. We'll use the remainder of this article to explain how to use these classes in your app.

public sealed class MNISTModelInput
{
    public VideoFrame Input3 { get; set; }
}

public sealed class MNISTModelOutput
{
    public IList<float> Plus214_Output_0 { get; set; }
    public MNISTModelOutput()
    {
        this.Plus214_Output_0 = new List<float>();
    }
}

public sealed class MNISTModel
{
    private LearningModelPreview learningModel;
    public static async Task<MNISTModel> CreateMNISTModel(StorageFile file)
    {
        LearningModelPreview learningModel = await LearningModelPreview.LoadModelFromStorageFileAsync(file);
        MNISTModel model = new MNISTModel();
        model.learningModel = learningModel;
        return model;
    }
    public async Task<MNISTModelOutput> EvaluateAsync(MNISTModelInput input)
    {
        MNISTModelOutput output = new MNISTModelOutput();
        LearningModelBindingPreview binding = new LearningModelBindingPreview(learningModel);
        binding.Bind("Input3", input.Input3);
        binding.Bind("Plus214_Output_0", output.Plus214_Output_0);
        LearningModelEvaluationResultPreview evalResult = await learningModel.EvaluateAsync(binding, string.Empty);
        return output;
    }
}

Note: To make sure your model is included when you compile your application, right-click the .onnx file and select Properties. For Build Action, select Content.

Load

Once you have the generated wrapper classes, you'll load the model in your app.

The MNISTModel class represents the MNIST model. To load the model, we call the CreateMNISTModel method, passing in the ONNX file as the parameter.

// Load the model
StorageFile modelFile = await StorageFile.GetFileFromApplicationUriAsync(new Uri($"ms-appx:///Assets/MNIST.onnx"));
MNISTModel model = await MNISTModel.CreateMNISTModel(modelFile);

Bind

The generated code also includes Input and Output wrapper classes. The Input class represents the model's expected inputs, and the Output class represents the model's expected outputs.

To initialize the model's input object, create a new instance of the Input class and set its fields to your application data, making sure that the data matches the input type your model expects. The MNISTModelInput class expects a VideoFrame, so we use a helper method to get a VideoFrame for the input.

//Bind the input data to the model
MNISTModelInput ModelInput = new MNISTModelInput();
ModelInput.Input3 = await helper.GetHandWrittenImage(inkGrid);
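
The GetHandWrittenImage helper isn't part of the generated code; in the Get Started sample it renders the ink strokes from inkGrid into an image and returns a VideoFrame. As a minimal sketch of producing a VideoFrame for binding, the hypothetical helper below decodes an image file instead of reading the ink canvas (the method name and asset name are assumptions for illustration):

// Hypothetical helper: decode an image file into a VideoFrame that can be
// assigned to MNISTModelInput.Input3. The asset name is an assumption.
// Requires: Windows.Graphics.Imaging, Windows.Media, Windows.Storage,
// Windows.Storage.Streams, System.Threading.Tasks.
private async Task<VideoFrame> GetInputVideoFrameAsync()
{
    StorageFile imageFile = await StorageFile.GetFileFromApplicationUriAsync(
        new Uri("ms-appx:///Assets/handwritten-digit.png"));

    using (IRandomAccessStream stream = await imageFile.OpenAsync(FileAccessMode.Read))
    {
        // Decode the image into a SoftwareBitmap...
        BitmapDecoder decoder = await BitmapDecoder.CreateAsync(stream);
        SoftwareBitmap bitmap = await decoder.GetSoftwareBitmapAsync();

        // ...and wrap it in a VideoFrame for binding.
        return VideoFrame.CreateWithSoftwareBitmap(bitmap);
    }
}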

The output object's values start out empty (the MNISTModelOutput constructor creates an empty Plus214_Output_0 list) and are populated with the model's results after the next step, Evaluate.

Evaluate

Once your input is initialized, call the model's EvaluateAsync method to evaluate your model on the input data. EvaluateAsync binds your input and output objects to the model and evaluates the model on the input.

// Evaluate the model
MNISTModelOutput ModelOutput = await model.EvaluateAsync(ModelInput);

After evaluation, the output contains the model's results, which you can now view and analyze. Since the MNIST model outputs a list of probabilities, we iterate through the list to find and display the digit with the highest probability.

// Iterate through the output to determine the highest-probability digit
float maxProb = 0;
int maxIndex = 0;
for (int i = 0; i < 10; i++)
{
    if (ModelOutput.Plus214_Output_0[i] > maxProb)
    {
        maxIndex = i;
        maxProb = ModelOutput.Plus214_Output_0[i];
    }
}
numberLabel.Text = maxIndex.ToString();

Using the Windows ML APIs directly

Although Windows ML's automatic code generator provides a convenient interface to interact with your model, you don't have to use the wrapper classes. Instead, you can call the Windows ML APIs directly in your app. If you choose to do so, you'll follow the same pattern: load your model, bind your inputs and outputs, and evaluate.
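
The sketch below strings together the same preview API calls that appear in the generated wrapper above (LoadModelFromStorageFileAsync, Bind, and EvaluateAsync); it assumes the same MNIST.onnx asset and an inputFrame VideoFrame prepared as in the Bind step:

// Load: read the ONNX file and create the model.
StorageFile modelFile = await StorageFile.GetFileFromApplicationUriAsync(
    new Uri("ms-appx:///Assets/MNIST.onnx"));
LearningModelPreview learningModel =
    await LearningModelPreview.LoadModelFromStorageFileAsync(modelFile);

// Bind: attach the input VideoFrame and an output list by feature name.
List<float> outputList = new List<float>();
LearningModelBindingPreview binding = new LearningModelBindingPreview(learningModel);
binding.Bind("Input3", inputFrame);
binding.Bind("Plus214_Output_0", outputList);

// Evaluate: run the model; outputList is populated with the results.
LearningModelEvaluationResultPreview evalResult =
    await learningModel.EvaluateAsync(binding, string.Empty);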

For more information on using the APIs, see the Windows ML API reference.