Store data at the edge with Azure Blob Storage on IoT Edge (preview)

Azure Blob Storage on IoT Edge provides a block blob storage solution at the edge. A blob storage module on your IoT Edge device behaves like an Azure block blob service, but the block blobs are stored locally on your IoT Edge device. You can access your blobs using the same Azure storage SDK methods or block blob API calls that you're already used to.

Note

Azure Blob Storage on IoT Edge is in public preview.

This module comes with tiering and time-to-live features.

Note

Currently, the tiering and time-to-live features are available only on Linux AMD64 and Linux ARM32.

Tiering is a configurable feature that automatically uploads data from your local blob storage to Azure, with support for intermittent internet connectivity. It allows you to:

  • Turn the tiering feature on or off
  • Choose the order in which data is copied to Azure: NewestFirst or OldestFirst
  • Specify the Azure Storage account to which you want your data uploaded.
  • Specify the containers you want to upload to Azure. This module allows you to specify both source and target container names.
  • Do full blob tiering (using Put Blob operation) and block level tiering (using Put Block and Put Block List operations).

This module uses block-level tiering when your blob consists of blocks. Here are some common scenarios:

  • Your application updates some blocks of a previously uploaded blob: the module uploads only the updated blocks, not the whole blob.
  • The module is uploading a blob when the internet connection drops: when connectivity returns, it uploads only the remaining blocks, not the whole blob.

If an unexpected process termination (such as a power failure) happens during a blob upload, all blocks that were due to be uploaded are uploaded again when the module comes back online.

Time-to-live (TTL) is a configurable feature where the module automatically deletes your blobs from local storage when the specified duration (measured in minutes) expires. TTL allows you to:

  • Turn the TTL feature on or off
  • Specify the TTL in minutes

Scenarios where data such as videos, images, financial data, or hospital data needs to be stored locally, and later processed locally or transferred to the cloud, are good examples of when to use this module.

This article provides instructions for deploying an Azure Blob Storage on IoT Edge container that runs a blob service on your IoT Edge device.

Note

The terms "auto-tiering" and "auto-expiration" used in the video have been replaced by "tiering" and "time-to-live".

Watch the video for a quick introduction

Prerequisites

An Azure IoT Edge device:

  • You can use your development machine or a virtual machine as an IoT Edge device by following the steps in the quickstart for Linux or Windows devices.

  • The Azure Blob Storage on IoT Edge module supports the following device configurations:

    Operating system                            Architecture
    Ubuntu Server 16.04                         AMD64
    Ubuntu Server 18.04                         AMD64
    Windows 10 IoT Core (October update)        AMD64
    Windows 10 IoT Enterprise (October update)  AMD64
    Windows Server 2019                         AMD64
    Raspbian Stretch                            ARM32

Cloud resources:

A standard-tier IoT Hub in Azure.

Tiering and time-to-live properties

Use desired properties to set tiering and time-to-live properties. They can be set during deployment or changed later by editing the module twin, without the need to redeploy. We recommend checking the module twin for the reported configuration and configurationValidation to make sure values are propagated correctly.

Tiering properties

The name of this setting is tieringSettings.

  • tieringOn
    Possible values: true, false
    By default this is set to false. To turn tiering on, set it to true.

  • backlogPolicy
    Possible values: NewestFirst, OldestFirst
    Allows you to choose the order in which data is copied to Azure. By default this is set to OldestFirst. The order is determined by the last modified time of the blob.

  • remoteStorageConnectionString
    Possible values: "DefaultEndpointsProtocol=https;AccountName=<your Azure Storage Account Name>;AccountKey=<your Azure Storage Account Key>;EndpointSuffix=<your end point suffix>"
    A connection string that specifies the Azure Storage account to which you want your data uploaded. Specify the Azure Storage account name, account key, and endpoint suffix. Use the appropriate EndpointSuffix for the environment where the data will be uploaded; it differs for global Azure, Azure Government, and Microsoft Azure Stack.

  • tieredContainers
    Possible values:
    "<source container name1>": {"target": "<target container name>"}
    "<source container name1>": {"target": "%h-%d-%m-%c"}
    "<source container name1>": {"target": "%d-%c"}
    Allows you to specify the container names you want to upload to Azure. This module allows you to specify both source and target container names. If you don't specify the target container name, it is automatically assigned as <IoTHubName>-<IotEdgeDeviceName>-<ModuleName>-<ContainerName>. You can build template strings for the target container name from these placeholders:
    * %h -> IoT Hub name (3 to 50 characters)
    * %d -> IoT device ID (1 to 129 characters)
    * %m -> Module name (1 to 64 characters)
    * %c -> Source container name (3 to 63 characters)

The maximum size of a container name is 63 characters. If the automatically assigned target container name would exceed 63 characters, each section (IoTHubName, IotEdgeDeviceName, ModuleName, ContainerName) is trimmed to 15 characters.
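
As an illustration only, here is a minimal sketch of what the tieringSettings desired property might look like in the module twin, assuming the fields above map directly into a JSON object. The storage account values and container name are placeholders, and "%d-%c" is just one of the template options shown above:

    "tieringSettings": {
        "tieringOn": true,
        "backlogPolicy": "OldestFirst",
        "remoteStorageConnectionString": "DefaultEndpointsProtocol=https;AccountName=<your Azure Storage Account Name>;AccountKey=<your Azure Storage Account Key>;EndpointSuffix=<your end point suffix>",
        "tieredContainers": {
            "<source container name>": {
                "target": "%d-%c"
            }
        }
    }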

Time-to-live properties

The name of this setting is ttlSettings.

  • ttlOn
    Possible values: true, false
    By default this is set to false. To turn TTL on, set it to true.

  • timeToLiveInMinutes
    Possible values: <minutes>
    Specify the TTL in minutes. The module automatically deletes your blobs from local storage when the TTL expires.
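
As an illustration only, a corresponding sketch of the ttlSettings desired property, again assuming the fields above map directly into a JSON object (60 is an arbitrary example value):

    "ttlSettings": {
        "ttlOn": true,
        "timeToLiveInMinutes": 60
    }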

Configure log files

For information on configuring log files for your module, see these production best practices.

Connect to your blob storage module

You can use the account name and account key that you configured for your module to access the blob storage on your IoT Edge device.

Specify your IoT Edge device as the blob endpoint for any storage requests that you make to it. You can create a connection string for an explicit storage endpoint using the IoT Edge device information and the account name that you configured.

  • For modules that are deployed on the same device as where the Azure Blob Storage on IoT Edge module is running, the blob endpoint is: http://<module name>:11002/<account name>.
  • For modules that are deployed on a different device than the one where the Azure Blob Storage on IoT Edge module is running, the blob endpoint is one of the following, depending on your setup:
    • http://<device IP>:11002/<account name>
    • http://<IoT Edge device hostname>:11002/<account name>
    • http://<fully qualified domain name>:11002/<account name>
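
As an illustration, the following minimal Python sketch connects to the module from another module running on the same device and uploads a block blob. It assumes the legacy azure-storage-blob SDK (version 2.x or earlier, which provides BlockBlobService); the module name, account name, and account key are placeholders you replace with your own values:

    # Minimal sketch, not the documented sample: connect to the local blob storage
    # module from another module on the same device using the legacy
    # azure-storage-blob SDK (<= 2.1). Replace the placeholder values below.
    from azure.storage.blob import BlockBlobService

    MODULE_NAME = "azureblobstorageoniotedge"   # assumed module name; use your own
    ACCOUNT_NAME = "<your local account name>"
    ACCOUNT_KEY = "<your local account key>"

    connection_string = (
        "DefaultEndpointsProtocol=http;"
        f"BlobEndpoint=http://{MODULE_NAME}:11002/{ACCOUNT_NAME};"
        f"AccountName={ACCOUNT_NAME};"
        f"AccountKey={ACCOUNT_KEY}"
    )

    blob_service = BlockBlobService(connection_string=connection_string)

    # Create a container and upload a block blob to local storage.
    blob_service.create_container("localcontainer")
    blob_service.create_blob_from_text("localcontainer", "hello.txt", "Hello from the edge")

    # List the blobs to confirm the upload.
    for blob in blob_service.list_blobs("localcontainer"):
        print(blob.name)

For modules or applications on a different device, swap the module name in the BlobEndpoint for the device IP address, hostname, or fully qualified domain name listed above.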

Azure Blob Storage quickstart samples

The Azure Blob Storage documentation includes quickstarts that provide sample code in several languages. You can run these samples to test Azure Blob Storage on IoT Edge by changing the blob endpoint to connect to your blob storage module.

The following quickstarts use languages that are also supported by IoT Edge, so you could deploy them as IoT Edge modules alongside the blob storage module:

Connect to your local storage with Azure Storage Explorer

You can use Azure Storage Explorer to connect to your local storage account. This is available only with Azure Storage Explorer version 1.5.0.

Note

You might encounter errors while performing the following steps, such as when adding a connection to a local storage account or creating containers in the local storage account. Ignore these errors and refresh.

  1. Download and install Azure Storage Explorer

  2. Connect to Azure Storage using a connection string

  3. Provide connection string: DefaultEndpointsProtocol=http;BlobEndpoint=http://<host device name>:11002/<your local account name>;AccountName=<your local account name>;AccountKey=<your local account key>;

  4. Go through the steps to connect.

  5. Create a container inside your local storage account

  6. Start uploading files as block blobs.

    Note

    This module does not support Page blobs.

  7. You can also connect to the Azure storage accounts to which you are uploading data. This gives you a single view of both your local storage account and your Azure storage account.

Supported storage operations

Blob storage modules on IoT Edge use the same Azure Storage SDKs and are consistent with the 2017-04-17 version of the Azure Storage API for block blob endpoints. Support for later API versions will depend on customer needs.

Because not all Azure Blob Storage operations are supported by Azure Blob Storage on IoT Edge, this section lists the status of each.

Account

Supported:

  • List containers

Unsupported:

  • Get and set blob service properties
  • Preflight blob request
  • Get blob service stats
  • Get account information

Containers

Supported:

  • Create and delete container
  • Get container properties and metadata
  • List blobs
  • Get and set container ACL
  • Set container metadata

Unsupported:

  • Lease container

Blobs

Supported:

  • Put, get, and delete blob
  • Get and set blob properties
  • Get and set blob metadata

Unsupported:

  • Lease blob
  • Snapshot blob
  • Copy and abort copy blob
  • Undelete blob
  • Set blob tier

Block blobs

Supported:

  • Put block
  • Put and get block list

Unsupported:

  • Put block from URL
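
As an illustration of the supported block operations, here is a minimal Python sketch that stages blocks with Put Block, commits them with Put Block List, and reads them back with Get Block List. It assumes the legacy azure-storage-blob SDK (which provides BlockBlobService and BlobBlock) and a blob_service client and container created as in the earlier connection example:

    # Minimal sketch: stage blocks (Put Block), commit them (Put Block List), and
    # read the committed block list (Get Block List). Assumes the legacy
    # azure-storage-blob SDK and a "blob_service" client created as shown earlier.
    import base64
    from azure.storage.blob import BlobBlock

    container = "localcontainer"   # placeholder; assumed to already exist
    blob_name = "blocks.bin"
    chunks = [b"first block ", b"second block ", b"third block"]

    block_list = []
    for index, chunk in enumerate(chunks):
        # Use base64-encoded block IDs of equal length.
        block_id = base64.b64encode(f"{index:08d}".encode()).decode()
        blob_service.put_block(container, blob_name, chunk, block_id)
        block_list.append(BlobBlock(id=block_id))

    # Commit the staged blocks in order.
    blob_service.put_block_list(container, blob_name, block_list)

    # Retrieve the committed block list.
    for block in blob_service.get_block_list(container, blob_name).committed_blocks:
        print(block.id, block.size)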

Feedback

Your feedback is important to us in making this module and its features useful and easy to use. Please share your feedback and let us know how we can improve.

You can reach us at absiotfeedback@microsoft.com.

Next steps

Learn more about Deploy Azure Blob Storage on IoT Edge