Write directly to storage

APPLIES TO: SDK v4

You can read and write directly to your storage object without using middleware or a context object. This can be appropriate for data your bot uses to preserve a conversation, or data that comes from a source outside your bot's conversation flow. In this data storage model, data is read directly from storage instead of using a state manager. The code examples in this article show you how to read and write data to storage using memory, Cosmos DB, Azure Blob, and Azure Blob transcript storage.

Note

The Bot Framework JavaScript, C#, and Python SDKs will continue to be supported; however, the Java SDK is being retired with final long-term support ending in November 2023.

Existing bots built with the Java SDK will continue to function.

For new bot building, consider using Microsoft Copilot Studio and read about choosing the right copilot solution.

For more information, see The future of bot building.

Prerequisites

Note

You can install the templates from within Visual Studio.

  1. In the menu, select Extensions then Manage Extensions.
  2. In the Manage Extensions dialog, search for and install Bot Framework v4 SDK templates for Visual Studio.

For information about deploying .NET bots to Azure, see how to Provision and publish a bot.

About this sample

The sample code in this article begins with the structure of a basic echo bot, then extends that bot's functionality with the code provided below. This extended code creates a list to preserve user inputs as they're received. Each turn, the full list of user inputs, saved to memory, is echoed back to the user. The data structure containing this list of inputs is then modified to save to storage. Various types of storage are explored as additional functionality is added to this sample code.

Important

This article contains legacy code samples using connection strings in configuration files for internal connection to storage. Microsoft recommends that you use the most secure authentication flow available. If you're connecting to an Azure resource, Managed Identities for Azure resources is the recommended authentication method.

Memory storage

The Bot Framework SDK allows you to store user inputs using in-memory storage. Since in-memory storage is cleared each time the bot is restarted, it's best suited for testing purposes and isn't intended for production use. Persistent storage types, such as database storage, are best for production bots.
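
To illustrate the storage read/write interface the rest of this article relies on, here's a minimal sketch (not part of the sample) that writes an item to MemoryStorage and reads it back. The UtteranceLogJS key and its fields mirror the data structure used later in this article:

const { MemoryStorage } = require('botbuilder');

const storage = new MemoryStorage();

async function demo() {
    // Write a dictionary of items keyed by their storage key.
    await storage.write({ UtteranceLogJS: { UtteranceList: ['hello'], turnNumber: 1, eTag: '*' } });
    // Read items back by passing an array of keys.
    const items = await storage.read(['UtteranceLogJS']);
    console.log(items.UtteranceLogJS.UtteranceList);
}

demo();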

Build a basic bot

The rest of this article builds on an Echo bot. You can build the Echo bot sample code locally by following the quickstart instructions to Create a bot.

To use the .env configuration file, your bot needs an additional package. If not already installed, get the dotenv package from npm:

npm install --save dotenv

Modify the code in index.js that creates the main dialog. The existing line of code that creates the main dialog is const myBot = new EchoBot();. It needs to be updated to pass a storage object to the EchoBot constructor so you can store the user's input in the bot's internal memory:

First you'll need to add a reference to MemoryStorage in botbuilder:

const { MemoryStorage } = require('botbuilder');

Then create the memory storage object, and pass it to the EchoBot constructor:

const myStorage = new MemoryStorage();
const myBot = new EchoBot(myStorage);

This passes in a MemoryStorage object to the EchoBot constructor. You'll change that later to pass a Cosmos DB or Blob Storage object.

Next, replace the code in bot.js with the following code:

const { ActivityHandler, MemoryStorage } = require('botbuilder');
const restify = require('restify');

// Process incoming requests - adds storage for messages.
class EchoBot extends ActivityHandler {
    constructor(myStorage) {
        super();
        this.storage = myStorage;
        // See https://learn.microsoft.com/azure/bot-service/bot-builder-basics to learn more about the message and other activity types.
        this.onMessage(async turnContext => {
            console.log('this gets called (message)');
            await turnContext.sendActivity(`You said '${ turnContext.activity.text }'`);
            // Save updated utterance inputs.
            await logMessageText(this.storage, turnContext);
        });
        this.onConversationUpdate(async turnContext => {
            console.log('this gets called (conversation update)');
            await turnContext.sendActivity('Welcome, enter an item to save to your list.');
        });
    }
}

// This function stores new user messages. Creates new utterance log if none exists.
async function logMessageText(storage, turnContext) {
    let utterance = turnContext.activity.text;
    // debugger;
    try {
        // Read from the storage.
        let storeItems = await storage.read(["UtteranceLogJS"])
        // Check the result.
        var UtteranceLogJS = storeItems["UtteranceLogJS"];
        if (typeof (UtteranceLogJS) != 'undefined') {
            // The log exists so we can write to it.
            storeItems["UtteranceLogJS"].turnNumber++;
            storeItems["UtteranceLogJS"].UtteranceList.push(utterance);
            // Gather info for user message.
            var storedString = storeItems.UtteranceLogJS.UtteranceList.toString();
            var numStored = storeItems.UtteranceLogJS.turnNumber;

            try {
                await storage.write(storeItems)
                await turnContext.sendActivity(`${numStored}: The list is now: ${storedString}`);
            } catch (err) {
                await turnContext.sendActivity(`Write failed of UtteranceLogJS: ${err}`);
            }
        }
        else{
            await turnContext.sendActivity(`Creating and saving new utterance log`);
            var turnNumber = 1;
            storeItems["UtteranceLogJS"] = { UtteranceList: [`${utterance}`], "eTag": "*", turnNumber }
            // Gather info for user message.
            var storedString = storeItems.UtteranceLogJS.UtteranceList.toString();
            var numStored = storeItems.UtteranceLogJS.turnNumber;

            try {
                await storage.write(storeItems)
                await turnContext.sendActivity(`${numStored}: The list is now: ${storedString}`);
            } catch (err) {
                await turnContext.sendActivity(`Write failed: ${err}`);
            }
        }
    }
    catch (err){
        await turnContext.sendActivity(`Read rejected. ${err}`);
    }
}

module.exports.EchoBot = EchoBot;

Start your bot

Run your bot locally.
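
If you created the bot from the JavaScript echo bot template, you can typically start it from the project folder using the template's default start script:

npm start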

Start the Emulator and connect your bot

Install the Bot Framework Emulator. Next, start the Emulator and connect to your bot:

  1. Select the Create new bot configuration link in the Emulator Welcome tab.
  2. Fill in fields to connect to your bot, given the information on the webpage displayed when you started your bot.

Interact with your bot

Send a message to your bot. The bot will list the messages it has received.

A conversation with the bot that shows the bot keeping a list of messages from the user.

The remainder of this article will demonstrate how to save to persistent storage instead of the bot's internal memory.

Using Cosmos DB

Important

The Cosmos DB storage class has been deprecated. Containers originally created with CosmosDbStorage had no partition key set, and were given the default partition key of _/partitionKey.

Containers created with Cosmos DB storage can be used with Cosmos DB partitioned storage. Read Partitioning in Azure Cosmos DB for more information.

Also note that, unlike the legacy Cosmos DB storage, the Cosmos DB partitioned storage doesn't automatically create a database within your Cosmos DB account. You need to create a new database manually, but skip manually creating a container since CosmosDbPartitionedStorage will create the container for you.

Now that you've used memory storage, we'll update the code to use Azure Cosmos DB. Cosmos DB is Microsoft's globally distributed, multi-model database. Azure Cosmos DB enables you to elastically and independently scale throughput and storage across any number of Azure's geographic regions. It offers throughput, latency, availability, and consistency guarantees with comprehensive service level agreements (SLAs).

Set up a Cosmos DB resource

To use Cosmos DB in your bot, you'll need to create a database resource before getting into the code. For an in-depth description of Cosmos DB database and app creation, see the quickstart for .NET, Node.js, or Python.

Create your database account

  1. Go to the Azure portal to create an Azure Cosmos DB account. Search for and select Azure Cosmos DB.

  2. In the Azure Cosmos DB page, select New to bring up the Create Azure Cosmos DB Account page.

    Screenshot of creating your Cosmos DB account.

  3. Provide values for the following fields:

    1. Subscription. Select the Azure subscription that you want to use for this Azure Cosmos account.
    2. Resource group. Select an existing resource group or select Create new, and enter a name for a new resource group.
    3. Account name. Enter a name to identify your Azure Cosmos account. Because documents.azure.com is appended to the name that you provide to create your URI, use a unique name. Note the following guidelines:
      • The name must be unique across Azure.
      • The name must be between 3 and 31 characters long.
      • The name can include only lowercase letters, numbers, and the hyphen (-) character.
    4. API. Select Core (SQL).
    5. Location. Select the location closest to your users to give them the fastest access to the data.
  4. Select Review + Create.

  5. Once validated, select Create.

The account creation takes a few minutes. Wait for the portal to display the Congratulations! Your Azure Cosmos DB account was created page.

Add a database

Note

Don't create the container yourself. Your bot will create it for you when creating its internal Cosmos DB client, ensuring it's configured correctly for storing bot state.

  1. Navigate to the Data Explorer page within your newly created Cosmos DB account, then choose New Database from the New Container drop-down. A panel will then open on the right-hand side of the window, where you can enter the details for the new database.

    Screenshot of creating your Cosmos DB database.

  2. Enter an ID for your new database and, optionally, set the throughput (you can change this later) and finally select OK to create your database. Make a note of this database ID for use later on when configuring your bot.

  3. Now that you've created a Cosmos DB account and a database, you need to copy over some of the values for integrating your new database into your bot. To retrieve these, navigate to the Keys tab within the database settings section of your Cosmos DB account. From this page, you'll need your URI (Cosmos DB endpoint) and your PRIMARY KEY (authorization key).

You should now have a Cosmos DB account with a database and the following values ready to use in your bot settings.

  • URI
  • Primary Key
  • Database ID

Add Cosmos DB configuration information

Use the details you made a note of in the previous part of this article to set your endpoint, authorization key, and database ID. Finally, choose an appropriate name for the container that will be created within your database to store your bot state. In the example below, the Cosmos DB container that's created is named "bot-storage".

Add the following information to your .env file.

.env

CosmosDbEndpoint="<your-CosmosDb-URI>"
CosmosDbAuthKey="<your-primary-key>"
CosmosDbDatabaseId="<your-database-id>"
CosmosDbContainerId="bot-storage"

Installing Cosmos DB packages

Make sure you have the packages necessary for Cosmos DB.

Add a reference to botbuilder-azure using npm.

npm install --save botbuilder-azure

If not already installed, get the dotenv package from npm in order to access your .env file settings.

npm install --save dotenv

Cosmos DB implementation

Note

Version 4.6 introduced a new Cosmos DB storage provider, the Cosmos DB partitioned storage class, and the original Cosmos DB storage class is deprecated. Containers created with Cosmos DB storage can be used with Cosmos DB partitioned storage. Read Partitioning in Azure Cosmos DB for more information.

Unlike the legacy Cosmos DB storage, the Cosmos DB partitioned storage doesn't automatically create a database within your Cosmos DB account. You need to create a new database manually, but skip manually creating a container since CosmosDbPartitionedStorage will create the container for you.

First, add code to your index.js file to enable you to access the values from your .env file that you entered previously:

// initialized to access values in .env file.
const ENV_FILE = path.join(__dirname, '.env');
require('dotenv').config({ path: ENV_FILE });

Next, you'll need to make changes to index.js to use Cosmos DB partitioned storage instead of the Bot Framework's internal storage. All the code changes in this section are made in index.js.

First, add a reference to botbuilder-azure in index.js. This will give you access to the CosmosDbPartitionedStorage class.

const { CosmosDbPartitionedStorage } = require('botbuilder-azure');

Next, create a new CosmosDbPartitionedStorage object.

const myStorage = new CosmosDbPartitionedStorage({
    cosmosDbEndpoint: process.env.CosmosDbEndpoint,
    authKey: process.env.CosmosDbAuthKey,
    databaseId: process.env.CosmosDbDatabaseId,
    containerId: process.env.CosmosDbContainerId,
    compatibilityMode: false
});

You can now comment out or remove the myStorage const declaration that you previously added, since you're no longer saving user input to the Bot Framework's internal storage, but instead to Cosmos DB:

//const myStorage = new MemoryStorage();
const myBot = new EchoBot(myStorage);

Start your Cosmos DB bot

Run your bot locally.

Test your Cosmos DB bot with Bot Framework Emulator

Now start the Bot Framework Emulator and connect to your bot:

  1. Select the create a new bot configuration link in the Emulator Welcome tab.
  2. Fill in fields to connect to your bot, given the information on the webpage displayed when you started your bot.

Interact with your Cosmos DB bot

Send a message to your bot, and the bot will list the messages it received.

A conversation with the bot that shows the bot keeping a list of messages from the user.

View your Cosmos DB data

After you've run your bot and saved your information, you can view the data stored in the Azure portal under the Data Explorer tab.

Screenshot of the Data Explorer in the Azure portal.

Using Blob storage

Azure Blob storage is Microsoft's object storage solution for the cloud. Blob storage is optimized for storing massive amounts of unstructured data, such as text or binary data. This section explains how to create an Azure blob storage account and container, then how to reference your blob storage container from your bot.

For more information on Blob Storage, see What is Azure Blob storage?

Create your Blob storage account

To use Blob storage in your bot, you'll need to get a few things set up before getting into the code.

  1. In the Azure portal, select All services.

  2. In the Featured section of the All services page, select Storage accounts.

  3. In the Storage accounts page, select New.

    Screenshot of creating an Azure Storage account.

  4. In the Subscription field, select the subscription in which to create the storage account.

  5. In the Resource group field, select an existing resource group or select Create new, and enter a name for the new resource group.

  6. In the Storage account name field, enter a name for the account. Note the following guidelines:

    • The name must be unique across Azure.
    • The name must be between 3 and 24 characters long.
    • The name can include only numbers and lowercase letters.
  7. In the Location field, select a location for the storage account, or use the default location.

  8. In the Instance details section of the Create storage account page, select values for Account kind and Replication, or accept the defaults.

  9. Select Review + create to review the storage account settings.

  10. Once validated, select Create.

Create Blob storage container

Once your Blob storage account is created, open it, then:

  1. Select Storage Explorer (Preview).

  2. Right-click on BLOB CONTAINERS.

  3. Select Create blob container from the drop-down list.

    Screenshot of creating a blob container.

  4. Enter a name in the New container form. You'll use this name for the value of your "blob container name" to provide access to your Blob storage account. Note the following guidelines:

    • This name may only contain lowercase letters, numbers, and hyphens.
    • This name must begin with a letter or a number.
    • Each hyphen must be preceded and followed by a valid non-hyphen character.
    • The name must be between 3 and 63 characters long.

Add Blob storage configuration information

Find the Blob storage keys you need to configure Blob storage for your bot:

  1. In the Azure portal, open your Blob storage account and select Access keys in the Settings section.
  2. To configure your bot to access your Blob storage account, use the Connection string value as your blob connection string.

Add the following information to your .env file.

.env

BlobConnectionString="<your-blob-connection-string>"
BlobContainerName="<your-blob-container-name>"

Installing Blob storage packages

If not previously installed, install the following packages.

Add references to botbuilder-azure-blobs in your project via npm.

Note

This npm package relies on an installation of Python on your development machine. If you haven't previously installed Python, you can find installation resources for your machine at Python.org.

npm install --save botbuilder-azure-blobs

If not already installed, get the dotenv package from npm in order to access your .env file settings.

npm install --save dotenv

Blob storage implementation

Blob storage is used to store bot state.

All the code changes in this section are made in index.js.

First, add code to enable you to access the values from your .env file that you entered previously:

// initialized to access values in .env file.
const ENV_FILE = path.join(__dirname, '.env');
require('dotenv').config({ path: ENV_FILE });

Next, you'll need to make changes to use Blob storage instead of the Bot Framework's internal storage.

Add a reference to botbuilder-azure-blobs. This will give you access to the Blob Storage API:

const { BlobsStorage } = require("botbuilder-azure-blobs");

To use the BlobsStorage class, modify your myStorage declaration as follows:

const myStorage = new BlobsStorage(
    process.env.BlobConnectionString,
    process.env.BlobContainerName
);

Once your storage is set to point to your Blob Storage account, your bot code will now store and retrieve data from Blob storage.


Start your Blob storage bot

Run your bot locally.

Start the Emulator and connect your Blob storage bot

Next, start the Emulator and then connect to your bot in the Emulator:

  1. Select the Create new bot configuration link in the Emulator "Welcome" tab.
  2. Fill in fields to connect to your bot, given the information on the webpage displayed when you started your bot.

Interact with your Blob storage bot

Send a message to your bot, and the bot will list the messages it receives.

A conversation with the bot that shows the bot keeping a list of messages from the user.

View your Blob storage data

After you've run your bot and saved your information, you can view the data under the Storage Explorer tab in the Azure portal.

Blob transcript storage

Azure blob transcript storage provides a specialized storage option that allows you to easily save and retrieve user conversations in the form of a recorded transcript. Azure blob transcript storage is useful for automatically capturing user inputs to examine while debugging your bot's performance.

Note

Python doesn't currently support Azure Blob transcript storage. While JavaScript supports Blob transcript storage, the following directions are for C# only.
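
While the directions that follow are for C#, JavaScript bots can use the BlobsTranscriptStore class from the botbuilder-azure-blobs package. The following is a minimal sketch; the BlobTranscriptContainerName setting name is an assumption, not part of the sample:

const { BlobsTranscriptStore } = require('botbuilder-azure-blobs');

// Create the transcript store from your storage connection string and a transcript container name.
const myTranscriptStore = new BlobsTranscriptStore(
    process.env.BlobConnectionString,
    process.env.BlobTranscriptContainerName
);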

Set up a Blob transcript storage container

Azure blob transcript storage can use the same blob storage account created by following the steps in the "Create your Blob storage account" and "Add Blob storage configuration information" sections above. This section adds a container to hold the transcripts.

Screenshot of creating a blob container to use as a transcript store.

  1. Open your Azure blob storage account.
  2. Select Storage Explorer.
  3. Right-click BLOB CONTAINERS and select Create blob container.
  4. Enter a name for your transcript container and then select OK. (We entered mybottranscripts.)

Blob transcript storage implementation

The following code connects the transcript storage pointer _myTranscripts to your new Azure blob transcript storage account. If a container named <your-blob-transcript-container-name> doesn't already exist, it's created within Blob storage to hold your transcript files.

Blob transcript storage is designed to store bot transcripts.

Note

As of version 4.10, Microsoft.Bot.Builder.Azure.AzureBlobTranscriptStore is deprecated. Use the new Microsoft.Bot.Builder.Azure.Blobs.BlobsTranscriptStore in its place.

echoBot.cs

using Microsoft.Bot.Builder.Azure.Blobs;

public class EchoBot : ActivityHandler
{
   ...

   private readonly BlobsTranscriptStore _myTranscripts = new BlobsTranscriptStore("<your-azure-storage-connection-string>", "<your-blob-transcript-container-name>");

   ...
}

Store user conversations in Azure blob transcripts

After a blob container is available to store transcripts, you can begin to preserve your users' conversations with your bot. These conversations can later be used as a debugging tool to see how users interact with your bot. Each conversation restarted in the Emulator initiates the creation of a new transcript. The following code preserves user conversation inputs within a stored transcript file.

  • The current transcript is saved using LogActivityAsync.
  • Saved transcripts are retrieved using ListTranscriptsAsync. In this sample code, the ID of each stored transcript is saved into a list named "storedTranscripts". This list is later used to manage the number of stored blob transcripts we retain.

echoBot.cs


protected override async Task OnMessageActivityAsync(ITurnContext<IMessageActivity> turnContext, CancellationToken cancellationToken)
{
    await _myTranscripts.LogActivityAsync(turnContext.Activity);

    List<string> storedTranscripts = new List<string>();
    PagedResult<Microsoft.Bot.Builder.TranscriptInfo> pagedResult = null;
    var pageSize = 0;
    do
    {
       pagedResult = await _myTranscripts.ListTranscriptsAsync("emulator", pagedResult?.ContinuationToken);
       pageSize = pagedResult.Items.Count();

       // transcript item contains ChannelId, Created, Id.
       // Save the transcript Ids found by "ListTranscriptsAsync" to a local list.
       foreach (var item in pagedResult.Items)
       {
          storedTranscripts.Add(item.Id);
       }
    } while (pagedResult.ContinuationToken != null);

    ...
}

Manage stored blob transcripts

While stored transcripts can be used as a debugging tool, over time the number of stored transcripts can grow larger than you care to preserve. The additional code included below uses DeleteTranscriptAsync to remove all but the last three retrieved transcript items from your blob transcript store.

echoBot.cs


protected override async Task OnMessageActivityAsync(ITurnContext<IMessageActivity> turnContext, CancellationToken cancellationToken)
{
    await _myTranscripts.LogActivityAsync(turnContext.Activity);

    List<string> storedTranscripts = new List<string>();
    PagedResult<Microsoft.Bot.Builder.TranscriptInfo> pagedResult = null;
    var pageSize = 0;
    do
    {
       pagedResult = await _myTranscripts.ListTranscriptsAsync("emulator", pagedResult?.ContinuationToken);
       pageSize = pagedResult.Items.Count();

       // transcript item contains ChannelId, Created, Id.
       // Save the transcript Ids found by "ListTranscriptsAsync" to a local list.
       foreach (var item in pagedResult.Items)
       {
          storedTranscripts.Add(item.Id);
       }
    } while (pagedResult.ContinuationToken != null);

    // Manage the size of your transcript storage.
    for (int i = 0; i < pageSize; i++)
    {
       // Remove older stored transcripts, save just the last three.
       if (i < pageSize - 3)
       {
          string thisTranscriptId = storedTranscripts[i];
          try
          {
             await _myTranscripts.DeleteTranscriptAsync("emulator", thisTranscriptId);
           }
           catch (System.Exception ex)
           {
              await turnContext.SendActivityAsync("Debug Out: DeleteTranscriptAsync had a problem!");
              await turnContext.SendActivityAsync("exception: " + ex.Message);
           }
       }
    }
    ...
}

For more information about the class, see Azure Blob Transcript Storage.

Additional Information

Manage concurrency using eTags

In our bot code example, we set the eTag property of each IStoreItem to *. The eTag (entity tag) member of your store object is used within Cosmos DB to manage concurrency. The eTag tells your database what to do if another instance of the bot has changed the object in the same storage that your bot is writing to.

Last write wins - allow overwrites

An eTag property value of asterisk (*) indicates that the last writer wins. When creating a new data store, you can set the eTag property to * to indicate that you haven't previously saved the data you're writing, or that you want the last writer to overwrite any previously saved data. If concurrency isn't an issue for your bot, setting the eTag property to * for any data you write enables overwrites.

Maintain concurrency and prevent overwrites

When storing your data into Cosmos DB, use a value other than * for the eTag if you want to prevent concurrent access to a property and avoid overwriting changes from another instance of the bot. The bot receives an error response with the message etag conflict key= when it attempts to save state data and the eTag isn't the same value as the eTag in storage.

By default, the Cosmos DB store checks the eTag property of a storage object for equality every time a bot writes to that item, and then updates it to a new unique value after each write. If the eTag property on write doesn't match the eTag in storage, it means another bot or thread changed the data.

For example, let's say you want your bot to edit a saved note, but you don't want your bot to overwrite changes that another instance of the bot has done. If another instance of the bot has made edits, you want the user to edit the version with the latest updates.

Add a helper function to the end of your bot that writes a sample note to a data store. First, create a myNoteData object.

bot.js

// Helper function for writing a sample note to a data store
async function createSampleNote(storage, context) {
    var myNoteData = {
        name: "Shopping List",
        contents: "eggs",
        // If any Note file is already stored, the eTag field
        // must be set to "*" in order to allow writing without first reading the stored eTag
        // otherwise you'll likely get an exception indicating an eTag conflict.
        eTag: "*"
    }
}

Within the createSampleNote helper function, initialize a changes object and add your notes to it, then write it to storage.

bot.js

// Write the note data to the "Note" key
var changes = {};
changes["Note"] = myNoteData;
// Creates a file named Note, if it doesn't already exist.
// specifying eTag= "*" will overwrite any existing contents.
// The act of writing to the file automatically updates the eTag property
// The first time you write to Note, the eTag is changed from *, and file contents will become:
//    {"name":"Shopping List","contents":"eggs","eTag":"1"}
try {
     await storage.write(changes);
     var list = changes["Note"].contents;
     await context.sendActivity(`Successfully created a note: ${list}`);
} catch (err) {
     await context.sendActivity(`Could not create note: ${err}`);
}

The helper function is accessed from within your bot's logic by adding the following call:

bot.js

// Save a note with etag.
await createSampleNote(storage, turnContext);

Once the note is created, you can retrieve and update it later. Create another helper function that can be accessed when the user types in "update note".

bot.js

async function updateSampleNote(storage, context) {
    try {
        // Read in a note
        var note = await storage.read(["Note"]);
        console.log(`note.eTag=${note["Note"].eTag}\n note=${JSON.stringify(note)}`);
        // update the note that we just read
        note["Note"].contents += ", bread";
        console.log(`Updated note=${JSON.stringify(note)}`);

        try {
             await storage.write(note); // Write the changes back to storage
             var list = note["Note"].contents;
             await context.sendActivity(`Successfully updated note: ${list}`);
        } catch (err) {
             console.log(`Write failed: ${err}`);
        }
    }
    catch (err) {
        await context.sendActivity(`Unable to read the Note: ${err}`);
    }
}

This helper function is accessed from within your bot's logic by adding the following call:

bot.js

// Update a note with etag.
await updateSampleNote(storage, turnContext);

If the note was updated in the store by another user before you attempted to write back your changes, the eTag value will no longer match and the call to write will throw an exception.

To maintain concurrency, always read a property from storage, then modify the property you read, so that the eTag is maintained. If you read user data from the store, the response will contain the eTag property. If you change the data and write updated data to the store, your request should include the eTag property that specifies the same value as you read earlier. However, writing an object with its eTag set to * will allow the write to overwrite any other changes.
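
The following helper is a minimal sketch (not part of the sample) of this read-modify-write pattern: it re-reads the item and retries when a write fails, for example because of an eTag conflict. It assumes the same storage read/write interface used earlier in this article, and the function name is hypothetical:

async function appendToNoteWithRetry(storage, context, textToAppend, maxAttempts = 3) {
    for (let attempt = 1; attempt <= maxAttempts; attempt++) {
        // Read the current item so its stored eTag is included in the object.
        const items = await storage.read(["Note"]);
        items["Note"].contents += `, ${ textToAppend }`;
        try {
            // Write back with the eTag just read; a concurrent change will cause this write to fail.
            await storage.write(items);
            await context.sendActivity(`Note is now: ${ items["Note"].contents }`);
            return;
        } catch (err) {
            // Another instance changed the note; loop to re-read the latest version and try again.
            if (attempt === maxAttempts) {
                await context.sendActivity(`Unable to update the note after ${ maxAttempts } attempts: ${ err }`);
            }
        }
    }
}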

Next steps

Now that you know how to read and write directly from storage, let's take a look at how you can use the state manager to do that for you.