AI at the Edge with Azure Stack - disconnected

Solution Idea


With the Azure AI tools and cloud platform, the next generation of AI-enabled hybrid applications can run where your data lives. With Azure Stack, bring a trained AI model to the edge and integrate it with your applications for low-latency intelligence, with no tool or process changes for local applications. Because the model runs locally, the solution keeps working even when it is disconnected from the internet.


Architecture diagram

Data Flow

  1. Data scientists train a model using Azure Machine Learning and an HDInsight cluster. The model is containerized and put into an Azure Container Registry.
  2. The model is deployed via an offline installer to a Kubernetes cluster on Azure Stack.
  3. End users provide data that is scored against the model.
  4. Insights and anomalies from scoring are placed into storage for later upload.
  5. Globally relevant and compliant insights are available in the global app.
  6. Data from edge scoring is used to improve the model.
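Steps 3 and 4 above can be sketched as a minimal edge-scoring loop: incoming data is scored against the locally deployed model, and insights or anomalies are queued in local storage for upload once the environment reconnects. The model logic, threshold, and queue file below are hypothetical stand-ins for illustration, not part of the reference architecture.

```python
import json
from pathlib import Path

# Hypothetical stand-in for the trained model: flag readings that deviate
# from the training mean by more than 3 standard deviations.
TRAINING_MEAN = 20.0
TRAINING_STDEV = 2.0
ANOMALY_THRESHOLD = 3.0

def score(reading: float) -> dict:
    """Step 3: score one reading against the (stand-in) trained model."""
    z = abs(reading - TRAINING_MEAN) / TRAINING_STDEV
    return {"reading": reading, "score": z, "anomaly": z > ANOMALY_THRESHOLD}

def store_for_upload(result: dict, queue_path: Path) -> None:
    """Step 4: persist insights locally for later upload to the cloud."""
    with queue_path.open("a") as f:
        f.write(json.dumps(result) + "\n")

if __name__ == "__main__":
    queue = Path("pending_insights.jsonl")
    for reading in [19.5, 20.8, 35.2]:
        result = score(reading)
        if result["anomaly"]:
            store_for_upload(result, queue)
```

Because the queue is append-only local storage, scoring continues uninterrupted while disconnected; a separate process can drain the file to cloud storage (step 5) when connectivity returns.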

Components

  • HDInsight: Provision cloud Hadoop, Spark, R Server, HBase, and Storm clusters
  • Machine Learning Studio: Easily build, deploy, and manage predictive analytics solutions
  • Virtual Machines: Provision Windows and Linux virtual machines in seconds
  • Azure Kubernetes Service (AKS): Simplify the deployment, management, and operations of Kubernetes
  • Storage: Durable, highly available, and massively scalable cloud storage
  • Azure Stack: Build and run innovative hybrid applications across cloud boundaries
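Inside the Kubernetes cluster on Azure Stack, the containerized model is typically exposed to local applications as an HTTP scoring endpoint. The following is a minimal sketch of such an endpoint using only the Python standard library; the `/score` route and the `run_model` function are illustrative assumptions, not the actual interface of the deployed container.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def run_model(inputs):
    """Hypothetical stand-in for the trained model pulled from the registry."""
    return [x * 2 for x in inputs]

class ScoreHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Accept scoring requests on an assumed /score route.
        if self.path != "/score":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"result": run_model(payload["data"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def serve(port: int = 8080) -> None:
    """Run the scoring endpoint (blocks); in the architecture this would
    live inside the container scheduled onto the AKS cluster."""
    HTTPServer(("0.0.0.0", port), ScoreHandler).serve_forever()
```

Local applications then POST JSON payloads such as `{"data": [1, 2, 3]}` to the endpoint and receive scores back, with no dependency on internet connectivity.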

Next steps