Use multiple LUIS and QnA models with Orchestrator

APPLIES TO: SDK v4

If a bot uses multiple LUIS models and QnA Maker knowledge bases, you can use Orchestrator to determine which LUIS model or QnA Maker knowledge base best matches the user input. The bf orchestrator CLI tool does this by creating an Orchestrator snapshot file that is used to route user input to the correct model at run time. For more information about Orchestrator, including its CLI commands, see the Orchestrator documentation.

Prerequisites

About this sample

This sample is based on a predefined set of LUIS and QnA Maker apps.

Code sample logic flow

OnMessageActivityAsync is called for each user input received. This module finds the top scoring user intent and passes that result to DispatchToTopIntentAsync, which in turn calls the appropriate handler:

  • ProcessSampleQnAAsync - for bot FAQ questions.
  • ProcessWeatherAsync - for weather queries.
  • ProcessHomeAutomationAsync - for home lighting commands.

The handler calls the LUIS or QnA Maker service and returns the result to the user.

Create LUIS apps and QnA knowledge base

Before you can create the Orchestrator snapshot file, you need to create and publish your LUIS apps and QnA Maker knowledge base. In this article, we'll publish the following models, which are included with the NLP With Orchestrator sample in the \CognitiveModels folder:

  • HomeAutomation - A LUIS app that recognizes a home automation intent with associated entity data.
  • Weather - A LUIS app that recognizes weather-related intents with location data.
  • QnAMaker - A QnA Maker knowledge base that provides answers to simple questions about the bot.

Create the LUIS apps

Create LUIS apps from the HomeAutomation and Weather .lu files in the CognitiveModels directory of the sample.

  1. Run the following command to import, train, and publish the apps to the production environment.

    > bf luis:build --in CognitiveModels --authoringKey <YOUR-KEY> --botName <YOUR-BOT-NAME>
    
  2. Record the application ID, display name, authoring key, and location.

For more information, see how to Create a LUIS app in the LUIS portal and Obtain values to connect to your LUIS app in Add natural language understanding to your bot, and see the LUIS documentation on how to train and publish an app to the production environment.

Create the QnA Maker knowledge base

  1. Create your QnA Maker service in the qnamaker.ai portal or in the Azure portal (https://portal.azure.com), and record the resource key for the next step.

  2. Create a QnA Maker knowledge base from the QnAMaker.qna file.

    1. Run the following command to import, train, and publish the knowledge base to the production environment.

      > bf qnamaker:build --in CognitiveModels --subscriptionKey <YOUR-KEY> --botName <YOUR-BOT-NAME>
      
    2. Record the QnA Maker knowledge base ID, hostname, and endpoint key from the output of the command above.

Create the Orchestrator snapshot file

The bf orchestrator CLI tool creates the Orchestrator snapshot file, which is used to route user input to the correct LUIS or QnA Maker app at run time.

  1. Open a command prompt or terminal window, and change directories to the sample directory.

  2. Make sure you have the current versions of npm and the Bot Framework CLI (bf) tool.

    npm i -g npm
    npm i -g @microsoft/botframework-cli
    
  3. Download the Orchestrator base model file.

    > mkdir model
    > bf orchestrator:basemodel:get --out ./model
    
  4. Create the Orchestrator snapshot file.

    > mkdir generated
    > bf orchestrator:create --hierarchical --in ./CognitiveModels --out ./generated --model ./model
    

Installing packages

Prior to running this app for the first time, ensure that the following NuGet packages are installed:

  • Microsoft.Bot.Builder
  • Microsoft.Bot.Builder.AI.Luis
  • Microsoft.Bot.Builder.AI.QnA
  • Microsoft.Bot.Builder.AI.Orchestrator
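
If you manage packages from a command prompt rather than from Visual Studio, the same packages can be added with the standard dotnet CLI. Note that the Orchestrator package may require an explicit (possibly prerelease) version; check NuGet for the version that matches your SDK.

dotnet add package Microsoft.Bot.Builder
dotnet add package Microsoft.Bot.Builder.AI.Luis
dotnet add package Microsoft.Bot.Builder.AI.QnA
dotnet add package Microsoft.Bot.Builder.AI.Orchestrator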

Manually update your appsettings.json file

Once all of your service apps are created, the information for each needs to be added to your appsettings.json file. The initial C# sample code contains an appsettings.json file whose values are empty, except for the Orchestrator section, which points to the base model folder and the generated snapshot file:

appsettings.json


"LuisHomeAutomationAppId": "",
"LuisWeatherAppId": "",
"LuisAPIKey": "",
"LuisAPIHostName": "",

"AllowedHosts": "*",

"Orchestrator": {
  "ModelFolder": ".\\model",

For each of the settings shown below, add the values you recorded earlier in these instructions:

"MicrosoftAppId": "",
"MicrosoftAppPassword": "",

"QnAKnowledgebaseId": "<knowledge-base-id>",
"QnAEndpointKey": "<qna-maker-resource-key>",
"QnAEndpointHostName": "<your-hostname>",

"LuisAppId": "<app-id-for-dispatch-app>",
"LuisAPIKey": "<your-luis-endpoint-key>",
"LuisAPIHostName": "<your-dispatch-app-region>",

When all changes are complete, save this file.

Connect to the services from your bot

To connect to the LUIS and QnA Maker services, your bot pulls information from the settings file.

In BotServices.cs, the information contained within the configuration file appsettings.json is used to connect your Orchestrator bot to the HomeAutomation, Weather, and SampleQnA services. The constructors use the values you provided to connect to these services.

BotServices.cs

{
    public class BotServices : IBotServices
    {
        public BotServices(IConfiguration configuration, OrchestratorRecognizer dispatcher)
        {
            // Read the setting for cognitive services (LUIS, QnA) from the appsettings.json
            // If includeApiResults is set to true, the full response from the LUIS api (LuisResult)
            // will be made available in the properties collection of the RecognizerResult
            LuisHomeAutomationRecognizer = CreateLuisRecognizer(configuration, "LuisHomeAutomationAppId");
            LuisWeatherRecognizer = CreateLuisRecognizer(configuration, "LuisWeatherAppId");

            Dispatch = dispatcher;

            SampleQnA = new QnAMaker(new QnAMakerEndpoint
            {
                KnowledgeBaseId = configuration["QnAKnowledgebaseId"],
                EndpointKey = configuration["QnAEndpointKey"],
                Host = configuration["QnAEndpointHostName"]
            });
        }

        public OrchestratorRecognizer Dispatch { get; private set; }
        
        public QnAMaker SampleQnA { get; private set; }
        
        public LuisRecognizer LuisHomeAutomationRecognizer { get; private set; }

        public LuisRecognizer LuisWeatherRecognizer { get; private set; }

        private LuisRecognizer CreateLuisRecognizer(IConfiguration configuration, string appIdKey)
        {
            var luisApplication = new LuisApplication(
                configuration[appIdKey],
                configuration["LuisAPIKey"],
                configuration["LuisAPIHostName"]);

            // Set the recognizer options depending on which endpoint version you want to use.
            // More details can be found in https://docs.microsoft.com/en-gb/azure/cognitive-services/luis/luis-migration-api-v3
            // (Shown here with the V3 endpoint options; use LuisRecognizerOptionsV2 to target the V2 endpoint.)
            var recognizerOptions = new LuisRecognizerOptionsV3(luisApplication);

            return new LuisRecognizer(recognizerOptions);
        }
    }
}
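
The OrchestratorRecognizer passed to the BotServices constructor (the dispatcher parameter) is supplied through dependency injection. The following is a minimal sketch, not the sample's exact Startup.cs, of one way to register it in ConfigureServices; it assumes the Orchestrator section of appsettings.json provides ModelFolder and SnapshotFile entries pointing to the base model folder and the generated snapshot file.

using Microsoft.Bot.Builder;
using Microsoft.Bot.Builder.AI.Orchestrator;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;

// Illustrative sketch only; the sample's actual Startup.cs may differ.
public class Startup
{
    public Startup(IConfiguration configuration)
    {
        Configuration = configuration;
    }

    public IConfiguration Configuration { get; }

    public void ConfigureServices(IServiceCollection services)
    {
        // Create the Orchestrator recognizer used to dispatch user utterances.
        // The model folder and snapshot paths are read from the "Orchestrator"
        // section of appsettings.json (assumed keys: ModelFolder, SnapshotFile).
        services.AddSingleton(new OrchestratorRecognizer
        {
            ModelFolder = Configuration.GetValue<string>("Orchestrator:ModelFolder"),
            SnapshotFile = Configuration.GetValue<string>("Orchestrator:SnapshotFile"),
        });

        // Register the services wrapper shown above and the bot itself.
        services.AddSingleton<IBotServices, BotServices>();
        services.AddTransient<IBot, DispatchBot>();
    }
}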

Call the services from your bot

For each input from the user, the bot logic passes the user input to the Orchestrator recognizer, finds the top returned intent, and uses that information to call the appropriate service for the input.

In the DispatchBot.cs file, whenever the OnMessageActivityAsync method is called, we run the incoming user message through the Orchestrator recognizer and get the top intent. We then pass the topIntent on to the correct method to call the service and return the result.

bots\DispatchBot.cs


protected override async Task OnMessageActivityAsync(ITurnContext<IMessageActivity> turnContext, CancellationToken cancellationToken)
{
    var dc = new DialogContext(new DialogSet(), turnContext, new DialogState());
    // The top intent tells us which cognitive service to use.
    var allScores = await _botServices.Dispatch.RecognizeAsync(dc, (Activity)turnContext.Activity, cancellationToken);
    var topIntent = allScores.Intents.First().Key;
    
    // Next, we call the dispatcher with the top intent.
    await DispatchToTopIntentAsync(turnContext, topIntent, cancellationToken);
}

Work with the recognition results

When the Orchestrator recognizer produces a result, it indicates which service can most appropriately process the utterance. The code in this bot uses the intent returned by Orchestrator to route the request to the correct LUIS model or QnA Maker service, and then summarizes the response from the called service.

bots\DispatchBot.cs

private async Task DispatchToTopIntentAsync(ITurnContext<IMessageActivity> turnContext, string intent, CancellationToken cancellationToken)
{
    switch (intent)
    {
        case "HomeAutomation":
            await ProcessHomeAutomationAsync(turnContext, cancellationToken);
            break;
        case "Weather":
            await ProcessWeatherAsync(turnContext, cancellationToken);
            break;
        case "QnAMaker":
            await ProcessSampleQnAAsync(turnContext, cancellationToken);
            break;
        default:
            _logger.LogInformation($"Dispatch unrecognized intent: {intent}.");
            await turnContext.SendActivityAsync(MessageFactory.Text($"Dispatch unrecognized intent: {intent}."), cancellationToken);
            break;
    }
}

The ProcessHomeAutomationAsync and ProcessWeatherAsync methods use the user input contained within the turn context to get the top intent and entities from the correct LUIS model.
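
For illustration, a minimal sketch of such a LUIS handler might look like the following; this is an assumption-level sketch rather than the sample's exact code. It uses the LuisWeatherRecognizer defined in BotServices and simply echoes the recognition results back to the user.

private async Task ProcessWeatherAsync(ITurnContext<IMessageActivity> turnContext, CancellationToken cancellationToken)
{
    _logger.LogInformation("ProcessWeatherAsync");

    // Run the user's utterance through the Weather LUIS model.
    var recognizerResult = await _botServices.LuisWeatherRecognizer.RecognizeAsync(turnContext, cancellationToken);

    // Report the top intent and any entity data (such as a location) that LUIS found.
    var (intent, score) = recognizerResult.GetTopScoringIntent();
    await turnContext.SendActivityAsync(MessageFactory.Text($"ProcessWeather top intent: {intent} (score: {score})."), cancellationToken);
    await turnContext.SendActivityAsync(MessageFactory.Text($"ProcessWeather entities: {recognizerResult.Entities?.ToString()}."), cancellationToken);
}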

The ProcessSampleQnAAsync method uses the user input contained within the turn context to generate an answer from the knowledge base and display that result to the user.
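
A comparable sketch for the QnA Maker handler, again illustrative rather than the sample's exact code, queries the knowledge base and returns the best answer, or a fallback message when no answer is found.

private async Task ProcessSampleQnAAsync(ITurnContext<IMessageActivity> turnContext, CancellationToken cancellationToken)
{
    _logger.LogInformation("ProcessSampleQnAAsync");

    // Query the QnA Maker knowledge base with the user's utterance.
    var results = await _botServices.SampleQnA.GetAnswersAsync(turnContext);
    if (results.Length > 0)
    {
        // Return the highest-scoring answer.
        await turnContext.SendActivityAsync(MessageFactory.Text(results[0].Answer), cancellationToken);
    }
    else
    {
        await turnContext.SendActivityAsync(MessageFactory.Text("Sorry, could not find an answer in the Q and A system."), cancellationToken);
    }
}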

Note

In a production application, this is where the selected LUIS handlers would act on the returned LUIS intent and entity data to complete the user's request, rather than just displaying the recognition results.

Test your bot

  1. Using your development environment, start the sample code. Note the localhost address shown in the address bar of the browser window opened by your App: "https://localhost:<Port_Number>".

  2. Open the Bot Framework Emulator and select the Open Bot button.

  3. In the Open a bot dialog box, enter your bot endpoint URL, such as http://localhost:3978/api/messages, and select Connect.

  4. For your reference, here are some of the questions and commands that are covered by the services built for your bot:

    • QnA Maker
      • hi, good morning
      • what are you, what do you do
    • LUIS (home automation)
      • turn on bedroom light
      • turn off bedroom light
      • make some coffee
    • LUIS (weather)
      • whats the weather in redmond washington
      • what's the forecast for london
      • show me the forecast for nebraska

Route user utterance to QnA Maker

  1. In the Emulator, enter the text hi and submit the utterance. The bot submits this query to Orchestrator and gets back a response indicating which child app should get this utterance for further processing.

  2. By selecting the Orchestrator Recognition Trace line in the log, you can see the JSON response in the Emulator. The Orchestrator result is displayed in the Inspector.

    {
    "type": "trace",
    "timestamp": "2021-05-01T06:26:04.067Z",
    "serviceUrl": "http://localhost:58895",
    "channelId": "emulator",
    "from": {
      "id": "36b2a460-aa43-11eb-920f-7da472b36492",
      "name": "Bot",
      "role": "bot"
    },
    "conversation": {
      "id": "17ef3f40-aa46-11eb-920f-7da472b36492|livechat"
    },
    "recipient": {
      "id": "5f8c6123-2596-45df-928c-566d44426556",
      "role": "user"
    },
    "locale": "en-US",
    "replyToId": "1a3f70d0-aa46-11eb-8b97-2b2a779de581",
    "label": "Orchestrator Recognition",
    "valueType": "OrchestratorRecognizer",
    "value": {
      "text": "hi",
      "alteredText": null,
      "intents": {
        "QnAMaker": {
          "score": 0.9987310956576168
        },
        "HomeAutomation": {
          "score": 0.3402091165577196
        },
        "Weather": {
          "score": 0.24092200496795158
        }
      },
      "entities": {},
      "result": [
        {
          "Label": {
            "Type": 1,
            "Name": "QnAMaker",
            "Span": {
              "Offset": 0,
              "Length": 2
            }
          },
          "Score": 0.9987310956576168,
          "ClosestText": "hi"
        },
        {
          "Label": {
            "Type": 1,
            "Name": "HomeAutomation",
            "Span": {
              "Offset": 0,
              "Length": 2
            }
          },
          "Score": 0.3402091165577196,
          "ClosestText": "make some coffee"
        },
        {
          "Label": {
            "Type": 1,
            "Name": "Weather",
            "Span": {
              "Offset": 0,
              "Length": 2
            }
          },
          "Score": 0.24092200496795158,
          "ClosestText": "soliciting today's weather"
        }
      ]
    },
    "name": "OrchestratorRecognizerResult",
    "id": "1ae65f30-aa46-11eb-8b97-2b2a779de581",
    "localTimestamp": "2021-04-30T23:26:04-07:00"
    }
    

    Because the utterance hi is part of the Orchestrator's QnAMaker intent and is selected as the top scoring intent, the bot makes a second request, this time to the QnA Maker app, with the same utterance.

  3. Select the QnAMaker Trace line in the Emulator log. The QnA Maker result displays in the Inspector.

    {
        "questions": [
            "hi",
            "greetings",
            "good morning",
            "good evening"
        ],
        "answer": "Hello!",
        "score": 1,
        "id": 96,
        "source": "QnAMaker.tsv",
        "metadata": [],
        "context": {
            "isContextOnly": false,
            "prompts": []
        }
    }