AI at the Edge with Azure Stack Hub


Solution Idea


With Azure AI tools and the Azure cloud platform, the next generation of AI-enabled hybrid applications can run where your data lives. With Azure Stack Hub, you can bring a trained AI model to the edge and integrate it with your applications for low-latency intelligence, with no changes to tools or processes for local applications.


*Architecture diagram.* Download an SVG of this architecture.

Data Flow

  1. Data scientists train a model by using Azure Machine Learning Workbench and an HDInsight cluster. The model is containerized and pushed to an Azure Container Registry.
  2. The model is deployed to a Kubernetes cluster on Azure Stack Hub.
  3. End users provide data that's scored against the model.
  4. Insights and anomalies from scoring are placed into a queue.
  5. A function sends compliant data and anomalies to Azure Storage.
  6. Globally relevant and compliant insights are available in the global app.
  7. Data from edge scoring is used to improve the model.
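Steps 1 through 3 center on a containerized scoring service. The sketch below shows the common init/run entry-script pattern for such a container; the names, the trivial threshold "model," and the payload shape are illustrative assumptions, not the exact contract of any Azure SDK.

```python
import json

# Hypothetical scoring entry script for a containerized model.
# A real deployment would load a trained model artifact in init();
# here a simple threshold stands in for it.

MODEL = None

def init():
    """Load the model once when the container starts."""
    global MODEL
    MODEL = {"threshold": 0.5}

def run(raw_data):
    """Score a JSON payload and flag anomalies against the model."""
    payload = json.loads(raw_data)
    results = []
    for value in payload["values"]:
        results.append({
            "value": value,
            "anomaly": value > MODEL["threshold"],
        })
    return json.dumps({"results": results})

init()
print(run(json.dumps({"values": [0.2, 0.9]})))
```

Keeping the scoring logic behind a small, stateless entry point is what lets the same container image run unchanged in the cloud or on a Kubernetes cluster at the edge.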
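Steps 4 and 5 route scoring output through a queue to a function that separates compliant data from anomalies before writing to Azure Storage. A minimal sketch of that routing logic, with a hypothetical record shape assumed:

```python
def partition_scores(scores):
    """Split scoring results into compliant records and anomalies,
    mirroring the queue -> function step in the data flow.
    Each record is assumed to carry a boolean 'anomaly' flag."""
    compliant, anomalies = [], []
    for record in scores:
        (anomalies if record.get("anomaly") else compliant).append(record)
    return compliant, anomalies
```

In a real deployment this logic would run inside a queue-triggered function, with each partition written to its own storage container.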


Components

  • Azure Machine Learning: Build, deploy, and manage predictive analytics solutions
  • HDInsight: Provision cloud Hadoop, Spark, R Server, HBase, and Storm clusters
  • Container Registry: Store and manage container images across all types of Azure deployments
  • Azure Kubernetes Service (AKS): Simplify the deployment, management, and operations of Kubernetes
  • Storage: Durable, highly available, and massively scalable cloud storage
  • Azure Stack Hub: Build and run innovative hybrid applications across cloud boundaries
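Deploying the containerized model to the Kubernetes cluster on Azure Stack Hub (step 2 of the data flow) amounts to applying a standard Deployment manifest. The sketch below is a hypothetical example; the image name, registry, and port are placeholders for your own values.

```yaml
# Hypothetical manifest; registry, image, and port are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: model-scoring
spec:
  replicas: 2
  selector:
    matchLabels:
      app: model-scoring
  template:
    metadata:
      labels:
        app: model-scoring
    spec:
      containers:
      - name: scoring
        image: myregistry.azurecr.io/model-scoring:v1
        ports:
        - containerPort: 5001
```

Because the manifest references only the container image, the same definition works against AKS in Azure or a Kubernetes cluster on Azure Stack Hub.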

Next steps