Together, Azure Bot Service and the Language Understanding service enable developers to create conversational interfaces for scenarios like banking, travel, and entertainment. For example, a hotel concierge can use a bot to enhance traditional email and phone interactions by validating customers through Azure Active Directory and by using Cognitive Services to better process customer requests contextually in text and voice. The Speech service can be added to support voice commands.
Download an SVG of this architecture.
- The customer uses your mobile app.
- The user authenticates by using Azure AD B2C.
- The user requests information by using the custom Application Bot.
- Cognitive Services helps process the natural-language request.
- The customer reviews the response and can refine the question by using natural conversation.
- Once the user is happy with the results, the Application Bot updates the customer's reservation.
- Application Insights gathers runtime telemetry to help the development team analyze bot performance and usage.
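The request-handling steps above can be sketched as an intent-dispatch loop. The following is a minimal, hypothetical sketch: the `recognize` function stands in for a call to the LUIS prediction endpoint (a real bot would POST the utterance and parse the JSON reply), and the intent names and handlers are illustrative, not part of any actual LUIS model.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class Prediction:
    """Simplified stand-in for a LUIS prediction result."""
    intent: str
    entities: Dict[str, str] = field(default_factory=dict)

def recognize(utterance: str) -> Prediction:
    """Stand-in for a LUIS prediction call. A real bot would send the
    utterance to the LUIS prediction endpoint; here we use keyword
    matching purely for illustration."""
    text = utterance.lower()
    if "book" in text or "reserve" in text:
        entities = {"room": "suite"} if "suite" in text else {}
        return Prediction("UpdateReservation", entities)
    if "hours" in text or "open" in text:
        return Prediction("AskHours")
    return Prediction("None")

# Dispatch table mapping recognized intents to handlers, mirroring the
# request/refine/update steps in the dataflow above.
HANDLERS: Dict[str, Callable[[Prediction], str]] = {
    "UpdateReservation": lambda p: (
        f"Reservation updated: {p.entities.get('room', 'standard')} room."
    ),
    "AskHours": lambda p: "The concierge desk is open 24/7.",
    "None": lambda p: "Sorry, could you rephrase that?",
}

def handle(utterance: str) -> str:
    """One turn of the conversation: recognize the intent, then dispatch."""
    prediction = recognize(utterance)
    return HANDLERS[prediction.intent](prediction)
```

In a production bot, the dispatch table would be replaced by Bot Framework dialogs, and unrecognized intents would trigger the natural-conversation refinement loop described above.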
Key technologies used to implement this architecture:
- Azure Active Directory B2C
- Azure App Service
- Azure Bot Service
- Azure Cognitive Services Language Understanding
- Azure Cognitive Services Speech Services
- Azure SQL Database
- Azure Monitor: Application Insights is a feature of Azure Monitor.
- Artificial intelligence (AI) - Architectural overview
- Choosing a Microsoft cognitive services technology
- What are Azure Cognitive Services?
- What is Language Understanding (LUIS)?
- What is the Speech service?
- What is Azure Active Directory B2C?
- Introduction to Bot Framework Composer
- What is Application Insights?
Fully deployable architectures: