AI at the edge with Azure Stack Hub

This article is a solution idea. If you'd like us to expand the content with more information, such as potential use cases, alternative services, implementation considerations, or pricing guidance, let us know by providing GitHub feedback.

This architecture shows how you can bring your trained AI model to the edge with Azure Stack Hub and integrate it with your applications for low-latency intelligence.

Architecture

Architecture diagram showing an AI-enabled application that's running at the edge with Azure Stack Hub.

Download a Visio file of this architecture.

Dataflow

  1. Azure Data Factory processes the source data.
  2. The processed data is placed into Azure Data Lake Storage for training.
  3. Data scientists train a model by using Azure Machine Learning. The model is containerized and pushed to an Azure Container Registry (a minimal training and deployment sketch follows this list).
  4. The model is deployed to a Kubernetes cluster on Azure Stack Hub.
  5. An on-premises web application scores data that end users provide against the model that's deployed in the Kubernetes cluster (see the scoring sketch after this list).
  6. End users submit data through the web application, and it's scored against the model.
  7. Insights and anomalies from scoring are placed into a queue.
  8. A function app is triggered when scoring information is placed in the queue (see the queue-triggered function sketch after this list).
  9. The function sends compliant data and anomalies to Azure Storage.
  10. Globally relevant and compliant insights are available for consumption in Power BI and a global app.
  11. Feedback loop: Model retraining can be triggered on a schedule (see the retraining-schedule sketch after this list). Data scientists work on optimizing the model. The improved model is containerized, pushed to the container registry as an update, and then redeployed.
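
The following is a minimal sketch of steps 3 and 4, assuming the Azure Machine Learning SDK v1 (azureml-core) is used to register a trained model, build a container image, and deploy it to a Kubernetes cluster attached to the workspace. The model path, scoring script, environment, and cluster name are illustrative assumptions, not details from this architecture.

```python
from azureml.core import Environment, Workspace
from azureml.core.compute import AksCompute
from azureml.core.model import InferenceConfig, Model
from azureml.core.webservice import AksWebservice

# Connect to the Azure Machine Learning workspace (assumes a local config.json).
ws = Workspace.from_config()

# Step 3: register the trained model so it can be versioned and containerized.
model = Model.register(workspace=ws,
                       model_path="outputs/model.pkl",   # hypothetical path to the trained model
                       model_name="edge-scoring-model")  # hypothetical model name

# The entry script and environment describe the container image that Azure ML builds
# and pushes to the Azure Container Registry associated with the workspace.
inference_config = InferenceConfig(
    entry_script="score.py",                             # hypothetical scoring script
    environment=Environment.get(workspace=ws, name="AzureML-Minimal"))  # illustrative environment

# Step 4: deploy the containerized model to the attached Kubernetes cluster.
aks_target = AksCompute(ws, "edge-k8s-cluster")          # hypothetical attached cluster name
service = Model.deploy(workspace=ws,
                       name="edge-scoring-service",
                       models=[model],
                       inference_config=inference_config,
                       deployment_config=AksWebservice.deploy_configuration(cpu_cores=1, memory_gb=2),
                       deployment_target=aks_target)
service.wait_for_deployment(show_output=True)
print(service.scoring_uri)
```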
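
A scoring sketch for steps 5 through 7: the on-premises web application sends end-user data to the model's scoring endpoint and places the result in a queue for downstream processing. The endpoint URI, connection string placeholder, queue name, and payload shape are assumptions for illustration.

```python
import json

import requests
from azure.storage.queue import QueueClient

# Hypothetical endpoint of the model deployed in the Kubernetes cluster on Azure Stack Hub,
# plus a hypothetical storage connection string and queue name.
SCORING_URI = "http://edge-cluster.contoso.local/api/v1/service/edge-scoring-service/score"
STORAGE_CONNECTION_STRING = "<storage-connection-string>"
QUEUE_NAME = "scoring-results"


def score_and_enqueue(user_data: dict) -> dict:
    """Score end-user data against the edge model and queue the insights (steps 5-7)."""
    # Steps 5-6: send the end user's data to the scoring endpoint.
    response = requests.post(SCORING_URI, json={"data": [user_data]}, timeout=10)
    response.raise_for_status()
    result = response.json()

    # Step 7: place insights and anomalies from scoring into a queue.
    queue = QueueClient.from_connection_string(STORAGE_CONNECTION_STRING, QUEUE_NAME)
    queue.send_message(json.dumps({"input": user_data, "result": result}))
    return result
```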
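
A queue-triggered function sketch for steps 8 and 9, assuming the Azure Functions Python v2 programming model: the function fires when a scoring result lands in the queue and writes the record, flagged as compliant or anomalous, to Azure Blob Storage. The queue name, blob path, anomaly threshold, and connection setting are assumptions.

```python
import json
import logging

import azure.functions as func

app = func.FunctionApp()


# Step 8: the function is triggered when a scoring result is placed in the queue.
@app.queue_trigger(arg_name="msg", queue_name="scoring-results",
                   connection="AzureWebJobsStorage")
@app.blob_output(arg_name="outblob", path="insights/{rand-guid}.json",
                 connection="AzureWebJobsStorage")
def forward_scoring_result(msg: func.QueueMessage, outblob: func.Out[str]) -> None:
    record = json.loads(msg.get_body().decode("utf-8"))

    # Step 9: flag anomalies, then send compliant data and anomalies to Azure Storage.
    record["is_anomaly"] = record.get("result", {}).get("anomaly_score", 0) > 0.9
    outblob.set(json.dumps(record))
    logging.info("Forwarded scoring result to storage.")
```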
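
A retraining-schedule sketch for the feedback loop in step 11, assuming retraining is implemented as a published Azure Machine Learning pipeline (azureml-pipeline-core) that runs weekly. The schedule name, experiment name, and pipeline ID placeholder are illustrative.

```python
from azureml.core import Workspace
from azureml.pipeline.core import Schedule, ScheduleRecurrence

ws = Workspace.from_config()

# Step 11: run the previously published retraining pipeline once a week.
recurrence = ScheduleRecurrence(frequency="Week", interval=1)
schedule = Schedule.create(workspace=ws,
                           name="edge-model-retraining",           # hypothetical schedule name
                           pipeline_id="<published-pipeline-id>",  # ID of the published retraining pipeline
                           experiment_name="edge-model-retraining",
                           recurrence=recurrence,
                           description="Weekly retraining for the edge scoring model")
```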

Components

Key technologies used to implement this architecture:

  • Azure Container Registry
  • Azure Kubernetes Service (AKS)
  • Azure Machine Learning
  • Azure Stack Hub

Scenario details

With Azure AI tools and the Azure edge and cloud platform, edge intelligence is possible. The next generation of AI-enabled hybrid applications can run where your data lives. With Azure Stack Hub, you can bring a trained AI model to the edge, integrate it with your applications for low-latency intelligence, and continuously feed data back to refine the AI model for improved accuracy, with no tool or process changes for local applications. This solution idea shows a connected Azure Stack Hub scenario, in which edge applications are connected to Azure. For the disconnected-edge version of this scenario, see the article AI at the edge - disconnected.

Potential use cases

A wide range of edge AI applications monitor and provide information in near real time. Areas where edge AI can help include:

  • Security camera detection processes.
  • Image and video analysis (the media and entertainment industry).
  • Transportation and traffic (the automotive and mobility industry).
  • Manufacturing.
  • Energy (smart grids).

Next steps

For more information about the featured Azure services, see the following articles and samples:

  • Azure Machine Learning documentation
  • Azure Container Registry documentation
  • Azure Kubernetes Service (AKS) documentation
  • Azure Stack Hub documentation

See the following related architectures:

  • AI at the edge with Azure Stack Hub - disconnected