AI at the edge with Azure Stack Hub - disconnected

Solution Idea

With Azure AI tools and the Azure edge and cloud platform, edge intelligence is possible. AI-enabled hybrid applications can run where your data lives: on-premises. With Azure Stack Hub, you can bring a trained AI model to the edge and integrate it with your applications for low-latency intelligence, without changing tools or processes for local applications. Azure Stack Hub also ensures that your cloud solutions continue to work when they're disconnected from the internet.

This solution idea shows a disconnected Azure Stack Hub scenario. Latency, intermittent connectivity, or regulatory requirements might not always allow connectivity to Azure. In the disconnected scenario, data is processed locally and later aggregated in Azure for further analytics. For the connected version of this scenario, see the article AI at the edge.

Architecture

Architecture diagram: An AI-enabled application running at the edge with Azure Stack Hub.

Data flow

  1. Data scientists train a model by using Azure Machine Learning and an HDInsight cluster. The model is containerized and pushed to an Azure Container Registry (see the first sketch after this list).
  2. The model is deployed to a Kubernetes cluster on Azure Stack Hub (second sketch below).
  3. End users provide data that's scored against the model.
  4. Insights and anomalies from scoring are placed into storage for later upload (the third sketch below covers steps 3 and 4).
  5. Globally relevant and compliant insights are available in the global app.
  6. Data scientists use scoring from the edge to improve the model.
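
To make step 1 more concrete, here's a minimal sketch of the registration and packaging part, using the Azure Machine Learning Python SDK (azureml-core, v1). It registers a trained model and builds a container image that can be pushed to an Azure Container Registry. The workspace configuration, model path, scoring script, and names are illustrative assumptions, and the HDInsight-based training itself isn't shown.

```python
# Sketch only: register a trained model and package it as a container image
# with the Azure Machine Learning SDK (azureml-core, v1). All names and paths
# below are illustrative placeholders.
from azureml.core import Environment, Model, Workspace
from azureml.core.conda_dependencies import CondaDependencies
from azureml.core.model import InferenceConfig

ws = Workspace.from_config()  # uses a local config.json for an existing workspace

# Register the model artifact produced by the training run.
model = Model.register(
    workspace=ws,
    model_path="outputs/anomaly_model.pkl",   # placeholder path to the trained model
    model_name="edge-anomaly-model",
)

# Describe how the model is served: a scoring script plus its environment.
env = Environment("edge-scoring-env")
env.python.conda_dependencies = CondaDependencies.create(
    pip_packages=["scikit-learn", "azureml-defaults"]
)
inference_config = InferenceConfig(entry_script="score.py", environment=env)

# Build a container image for the model; the image lands in the Azure
# Container Registry that's associated with the workspace.
package = Model.package(ws, [model], inference_config)
package.wait_for_creation(show_output=True)
print("Container image:", package.location)
```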
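
For step 2, the following sketch creates a Kubernetes Deployment for that image on the Azure Stack Hub cluster, using the official Kubernetes Python client. The image name, port, replica count, and namespace are assumptions; in practice you might apply a manifest with kubectl or use a Helm chart instead.

```python
# Sketch only: deploy the packaged model image to a Kubernetes cluster that runs
# on Azure Stack Hub, using the official Kubernetes Python client. Pulling from
# Azure Container Registry typically also requires an image pull secret, which
# is omitted here for brevity.
from kubernetes import client, config

config.load_kube_config()  # kubeconfig pointing at the Azure Stack Hub cluster

container = client.V1Container(
    name="model-scoring",
    image="myregistry.azurecr.io/edge-anomaly-model:1",  # placeholder image name
    ports=[client.V1ContainerPort(container_port=5001)],
)
template = client.V1PodTemplateSpec(
    metadata=client.V1ObjectMeta(labels={"app": "model-scoring"}),
    spec=client.V1PodSpec(containers=[container]),
)
deployment = client.V1Deployment(
    metadata=client.V1ObjectMeta(name="model-scoring"),
    spec=client.V1DeploymentSpec(
        replicas=2,
        selector=client.V1LabelSelector(match_labels={"app": "model-scoring"}),
        template=template,
    ),
)

client.AppsV1Api().create_namespaced_deployment(namespace="default", body=deployment)
```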
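
Steps 3 and 4 amount to calling the locally hosted scoring endpoint and keeping the resulting insights in Azure Stack Hub storage until they can be uploaded to Azure. The endpoint URL, storage account URL, credential, and container name below are hypothetical placeholders; Azure Stack Hub storage may also require pinning the client to an older storage API version.

```python
# Sketch only: score a sample against the locally hosted model (step 3) and
# store the result in Azure Stack Hub blob storage for later upload (step 4).
# URLs, credentials, and names are hypothetical placeholders.
import json

import requests
from azure.storage.blob import BlobServiceClient

scoring_uri = "http://model-scoring.contoso.local/score"  # service on the edge cluster
sample = {"data": [[0.2, 1.4, 3.1]]}

response = requests.post(scoring_uri, json=sample, timeout=10)
response.raise_for_status()
result = response.json()

# Azure Stack Hub storage has its own endpoint; the upload to Azure happens
# later, once connectivity and compliance rules allow it.
blob_service = BlobServiceClient(
    account_url="https://mystorageaccount.blob.local.azurestack.external",
    credential="<account-key-or-sas-token>",
)
container = blob_service.get_container_client("scoring-insights")
container.upload_blob(name="insight-0001.json", data=json.dumps(result))
```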

Components

Key technologies used to implement this architecture:

- Azure Machine Learning
- Azure HDInsight
- Azure Container Registry
- Azure Kubernetes Service
- Azure Stack Hub
- Azure Storage
- Azure App Service
- Azure Virtual Machines

Next steps