Extend your Healthcare Bot with QnA Maker

QnA Maker is a cloud-based API service that creates a conversational question-and-answer layer over your data. The QnA Maker service answers your users' natural language questions by matching them with the best possible answer. It can be used to extend the Healthcare Bot experience by connecting the bot to your knowledge base (KB), or to easily add a chit-chat data set as a starting point for your healthcare bot's personality.

In this tutorial, we will follow the steps required to extend your Healthcare Bot with a QnA Maker language model. The model will recognize when a user is asking a question covered by the semi-structured content in your KB and reply with the corresponding answer.

Create your Knowledge Base

  1. Navigate to your QnA Maker portal and create a new knowledge base.

  2. Add your content, such as frequently asked questions (FAQ) URLs, product manuals, support documents, and custom questions and answers, to your knowledge base.

  3. Select the ‘Publish’ button from the top menu to generate the knowledge-base endpoints. You will then be able to use the following endpoint details to extend your bot:

    Screen shot of QnA Maker endpoint details

    You can always find the deployment details in your service’s knowledge-base settings page.
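Once published, the knowledge base can be queried over HTTPS using the endpoint details above. As a rough illustration, the following Python sketch assembles (but does not send) a `generateAnswer` request; the host, knowledge-base ID, and endpoint key shown here are placeholders standing in for your own deployment details:

```python
import json
import urllib.request

# Placeholder deployment details -- replace with the values from
# your own knowledge-base settings page.
QNA_HOST = "https://your-resource.azurewebsites.net"
KB_ID = "your-kb-id"
ENDPOINT_KEY = "your-endpoint-key"


def build_generate_answer_request(question: str) -> urllib.request.Request:
    """Assemble (without sending) a generateAnswer request."""
    url = f"{QNA_HOST}/qnamaker/knowledgebases/{KB_ID}/generateAnswer"
    body = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Authorization": f"EndpointKey {ENDPOINT_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )


req = build_generate_answer_request("How do I reset my password?")
print(req.full_url)
```

In the steps that follow, the Health Bot Management portal handles this call for you; you only need to supply the same three endpoint values.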

Create the Healthcare Bot QnA Language model

  1. Log in to the Health Bot Management portal and navigate to Language >> Models. Here, add a new model and select the QnA Maker recognition method.

    Screen shot of language model recognition methods

  2. Provide the required name and description fields. These can be any values you like and are only used internally to identify the model.

  3. Provide the endpoint host, knowledge-base ID, and QnA endpoint key obtained in step 3 above. If the credentials have been entered correctly, the connection to the QnA endpoint will validate successfully.

    Screen shot of QnA maker endpoint details

  4. Set a unique intent name and define the triggered scenario. Use the built-in QnA scenario to reply with the top-scoring QnA answer.

    For more advanced flows, you may map the intent to your custom scenario. Your custom scenario will be triggered with the following input arguments:

    • The raw QnA endpoint response.
    • The unique intent name, which can be used to identify the model that triggered your custom scenario.

    Screen shot of QnA maker intent mapping

  5. You can now save the language model and test it. Click ‘Create’ and open the webchat in the management portal. Type one of the supported questions from your KB and see how the Healthcare Bot answers your users' natural language questions by matching them with the best possible answer from your knowledge base or chit-chat data set.

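Conceptually, a custom scenario mapped to the intent receives the two input arguments listed in step 4: the raw QnA endpoint response and the intent name. The Python sketch below illustrates that handling logic only; the argument names and response shape are assumptions modeled on a typical `generateAnswer` reply, not the portal's actual scenario variables:

```python
def handle_qna_trigger(intent_name: str, qna_response: dict) -> str:
    """Pick the top-scoring answer from a raw QnA endpoint response.

    `qna_response` is assumed to mirror the shape of a generateAnswer
    reply: {"answers": [{"answer": ..., "score": ...}, ...]}.
    """
    answers = qna_response.get("answers", [])
    if not answers:
        return "Sorry, I don't have an answer for that."
    # The intent name identifies which language model fired.
    print(f"Triggered by model: {intent_name}")
    top = max(answers, key=lambda a: a["score"])
    return top["answer"]


reply = handle_qna_trigger(
    "faq_intent",
    {"answers": [
        {"answer": "Open the settings page.", "score": 82.1},
        {"answer": "Contact support.", "score": 40.5},
    ]},
)
```

A custom scenario in the portal would typically apply the same pattern: inspect the raw response, then branch on the intent name or score before replying.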

Using multiple LUIS, RegEx and QnA models

When a user query is matched against a knowledge base, QnA Maker returns the relevant answers, along with a confidence score. This score indicates how confident the service is that the answer is the right match for the given user query. The Healthcare Bot service filters out QnA responses with a score lower than 50 (0.5 on a 0-1 scale).
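The filtering behavior described above can be sketched as a simple threshold check; this is an illustrative reimplementation of the rule, not the service's actual code:

```python
THRESHOLD = 50  # QnA Maker scores here are on a 0-100 scale


def filter_answers(answers: list[dict]) -> list[dict]:
    """Drop answers whose confidence score is below the threshold."""
    return [a for a in answers if a["score"] >= THRESHOLD]


kept = filter_answers([
    {"answer": "A", "score": 75.0},
    {"answer": "B", "score": 49.9},  # filtered out
    {"answer": "C", "score": 50.0},
])
```

Answers scoring below 50 are discarded, so only confident matches ever reach the user.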

When a user query is matched against multiple models, the service determines which language model is most appropriate to process the utterance. The bot then routes the request to the corresponding scenario, as specified in the details here.
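That routing step amounts to a dispatch on the winning model's intent name. A minimal Python sketch of the idea, with hypothetical intent and scenario names:

```python
def route(model_name: str, utterance: str, scenarios: dict) -> str:
    """Dispatch the utterance to the scenario mapped to the winning model."""
    handler = scenarios.get(model_name, scenarios["fallback"])
    return handler(utterance)


# Hypothetical intent-to-scenario mapping.
scenarios = {
    "qna_faq": lambda u: f"FAQ answer for: {u}",
    "fallback": lambda u: "Sorry, I didn't understand.",
}

result = route("qna_faq", "How do I refill a prescription?", scenarios)
```

An unrecognized model name falls through to the fallback scenario, mirroring how a bot would handle utterances no model claimed.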

Next steps

Adding chit-chat to your bot makes it more conversational and engaging. The chit-chat feature in QnA Maker lets you easily add a pre-populated set of common chit-chat questions and answers to your knowledge base (KB).

Add Chit-chat to a knowledge base