Quickstart: Upload, download, and list blobs using Node.js

In this quickstart, you learn how to use Node.js to upload, download, and list blobs and manage containers with Azure Blob storage.

To complete this quickstart, you need an Azure subscription.

To create a general-purpose v2 storage account in the Azure portal, follow these steps:

  1. In the Azure portal, expand the menu on the left side to open the menu of services, and choose All services. Then, scroll down to Storage, and choose Storage accounts. On the Storage Accounts window that appears, choose Add.
  2. Select the subscription in which to create the storage account.
  3. Under the Resource group field, click Create new. Enter a name for your new resource group, as shown in the following image.

    Screen shot showing how to create a resource group in the portal

  4. Next, enter a name for your storage account. The name you choose must be unique across Azure, must be between 3 and 24 characters in length, and may contain numbers and lowercase letters only.

  5. Select a location for your storage account, or use the default location.
  6. Leave these fields set to their default values:

    • The Deployment model field is set to Resource manager by default.
    • The Performance field is set to Standard by default.
    • The Account kind field is set to StorageV2 (general-purpose v2) by default.
    • The Replication field is set to Locally-redundant storage (LRS) by default.
    • The Access tier is set to Hot by default.
  7. Click Review + Create to review your storage account settings and create the account.

For more information about types of storage accounts and other storage account settings, see Azure storage account overview. For more information on resource groups, see Azure Resource Manager overview.

Download the sample application

The sample application in this quickstart is a simple Node.js console application. To begin, clone the repository to your machine using the following command:

git clone https://github.com/Azure-Samples/storage-blobs-node-quickstart.git

To open the application, look for the storage-blobs-node-quickstart folder and open it in your favorite code editing environment.

Copy your credentials from the Azure portal

The sample application needs to authenticate access to your storage account. To authenticate, you provide the application with your storage account credentials in the form of a connection string. To view your storage account credentials:

  1. Navigate to the Azure portal.
  2. Locate your storage account.
  3. In the Settings section of the storage account overview, select Access keys. Your account access keys appear, as well as the complete connection string for each key.
  4. Find the Connection string value under key1, and click the Copy button to copy the connection string. You will add the connection string value to an environment variable in the next step.

    Screen shot showing how to copy a connection string from the Azure portal

Configure your storage connection string

Before running the application, you must provide the connection string for your storage account. The sample repository includes a file named .env.example. You can rename this file by removing the .example extension, which results in a file named .env. Inside the .env file, add your connection string value after the AZURE_STORAGE_CONNECTION_STRING key.
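For example, after renaming, the .env file contains a single line of this form (the placeholder stands for your own connection string; keep the real value out of source control):

```
AZURE_STORAGE_CONNECTION_STRING=<your storage connection string>
```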

Install required packages

In the application directory, run npm install to install the required packages for the application.

npm install

Run the sample

Now that the dependencies are installed, you can run the sample by issuing the following command:

npm start

The output from the script will be similar to the following:

Containers:
 - container-one
 - container-two
Container "demo" is created
Blob "quickstart.txt" is uploaded
Local file "./readme.md" is uploaded
Blobs in "demo" container:
 - quickstart.txt
 - readme.md
Downloaded blob content: "hello Node SDK"
Blob "quickstart.txt" is deleted
Container "demo" is deleted
Done

Note that if you are using a new storage account for this quickstart, you may not see any container names listed under the label "Containers".

Understanding the code

The first expression loads values into environment variables.

if (process.env.NODE_ENV !== 'production') {
    require('dotenv').load();
}

The dotenv module loads environment variables when running the app locally for debugging. Values are defined in a file named .env and loaded into the current execution context. In production, the server configuration provides these values, which is why this code runs only when the script is not running in a "production" context.

const path = require('path');
const storage = require('azure-storage');

The purpose of these modules is as follows:

  • path is required to determine the absolute file path of the file to upload to blob storage
  • azure-storage is the Azure Storage SDK module for Node.js

Next, the blobService variable is initialized as a new instance of the Azure Blob service.

const blobService = storage.createBlobService();

In the following implementation, each of the blobService functions is wrapped in a Promise, which allows use of JavaScript's async functions and await operator to streamline the callback style of the Azure Storage API. When a successful response returns for each function, the promise resolves with relevant data along with a message specific to the action.

List containers

The listContainers function calls listContainersSegmented, which returns collections of containers in groups.

const listContainers = async () => {
    return new Promise((resolve, reject) => {
        blobService.listContainersSegmented(null, (err, data) => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `${data.entries.length} containers`, containers: data.entries });
            }
        });
    });
};

The size of the groups is configurable via ListContainersOptions. Calling listContainersSegmented returns container metadata as an array of ContainerResult instances. Results are returned in batches (segments) of 5,000. If there are more than 5,000 containers in the account, then the results include a value for continuationToken. To list subsequent segments, you can pass the continuation token back into listContainersSegmented as the first argument.
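To sketch that loop: the hypothetical helper below (not part of the sample) keeps calling listContainersSegmented, feeding each continuationToken back in, until no token remains. The service parameter is assumed to be any object exposing the azure-storage listContainersSegmented(token, callback) signature, such as blobService:

```javascript
// Hypothetical helper: drain every segment by following continuationToken.
const listAllContainers = (service) => {
    return new Promise((resolve, reject) => {
        const entries = [];
        const nextSegment = (token) => {
            service.listContainersSegmented(token, (err, data) => {
                if (err) {
                    reject(err);
                    return;
                }
                entries.push(...data.entries);
                if (data.continuationToken) {
                    nextSegment(data.continuationToken); // more segments remain
                } else {
                    resolve(entries); // no token: every segment has been read
                }
            });
        };
        nextSegment(null); // a null token requests the first segment
    });
};
```

With the quickstart's service object, this would be called as `const containers = await listAllContainers(blobService);`.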

Create a container

The createContainer function calls createContainerIfNotExists and sets the appropriate access level for the blob.

const createContainer = async (containerName) => {
    return new Promise((resolve, reject) => {
        blobService.createContainerIfNotExists(containerName, { publicAccessLevel: 'blob' }, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Container '${containerName}' created` });
            }
        });
    });
};

The second parameter (options) for createContainerIfNotExists accepts a value for publicAccessLevel. The value blob specifies that data in individual blobs is publicly readable. This setting is in contrast to container-level access, which also grants the ability to list the contents of the container.

The use of createContainerIfNotExists allows the application to run the createContainer command multiple times without returning errors when the container already exists. In a production environment, you often only call createContainerIfNotExists once as the same container is used throughout the application. In these cases, you can create the container ahead of time through the portal or via the Azure CLI.
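For example, with the Azure CLI, the container used by this sample could be created once ahead of time (the placeholder stands for your own connection string from the portal):

```
az storage container create --name demo --connection-string "<your connection string>"
```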

Upload text

The uploadString function calls createBlockBlobFromText to write (or overwrite) an arbitrary string to the blob container.

const uploadString = async (containerName, blobName, text) => {
    return new Promise((resolve, reject) => {
        blobService.createBlockBlobFromText(containerName, blobName, text, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Text "${text}" is written to blob storage` });
            }
        });
    });
};

Upload a local file

The uploadLocalFile function uses createBlockBlobFromLocalFile to upload and write (or overwrite) a file from the file system into blob storage.

const uploadLocalFile = async (containerName, filePath) => {
    return new Promise((resolve, reject) => {
        const fullPath = path.resolve(filePath);
        const blobName = path.basename(filePath);
        blobService.createBlockBlobFromLocalFile(containerName, blobName, fullPath, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Local file "${filePath}" is uploaded` });
            }
        });
    });
};

Other approaches available to upload content into blobs include working with text and streams. To verify the file is uploaded to your blob storage, you can use the Azure Storage Explorer to view the data in your account.
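One of those stream-based approaches is createWriteStreamToBlockBlob, which returns a writable stream you can pipe into. A minimal sketch, assuming service is any object with that azure-storage method (the helper itself is hypothetical, written in the same promise-wrapper style as the sample):

```javascript
// Sketch: upload by piping a readable stream into the blob's write stream.
const uploadStream = (service, containerName, blobName, readStream) => {
    return new Promise((resolve, reject) => {
        const writeStream = service.createWriteStreamToBlockBlob(containerName, blobName, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Stream uploaded to blob "${blobName}"` });
            }
        });
        readStream.pipe(writeStream); // pipe ends the write stream when the source ends
    });
};
```

With the quickstart's service object, this might be called as `await uploadStream(blobService, containerName, 'readme.md', require('fs').createReadStream('./readme.md'));`.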

List the blobs

The listBlobs function calls the listBlobsSegmented method to return a list of blob metadata in a container.

const listBlobs = async (containerName) => {
    return new Promise((resolve, reject) => {
        blobService.listBlobsSegmented(containerName, null, (err, data) => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `${data.entries.length} blobs in '${containerName}'`, blobs: data.entries });
            }
        });
    });
};

Calling listBlobsSegmented returns blob metadata as an array of BlobResult instances. Results are returned in batches (segments) of 5,000. If there are more than 5,000 blobs in a container, then the results include a value for continuationToken. To list subsequent segments from the container, you can pass the continuation token back into listBlobsSegmented as the second argument.

Download a blob

The downloadBlob function uses getBlobToText to return the contents of the blob as a string.

const downloadBlob = async (containerName, blobName) => {
    return new Promise((resolve, reject) => {
        blobService.getBlobToText(containerName, blobName, (err, data) => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Blob downloaded "${data}"`, text: data });
            }
        });
    });
};

The implementation shown here returns the contents of the blob as a string. You can also download the blob as a stream, or directly to a local file.
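For the local-file case, azure-storage exposes getBlobToLocalFile. A promise wrapper in the same style as the sample's other functions might look like the following sketch; the service parameter is any object with that method, so the quickstart's blobService fits:

```javascript
// Sketch of a promise wrapper over getBlobToLocalFile, matching the
// style of the other wrappers in this sample.
const downloadBlobToFile = (service, containerName, blobName, filePath) => {
    return new Promise((resolve, reject) => {
        service.getBlobToLocalFile(containerName, blobName, filePath, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Blob "${blobName}" saved to "${filePath}"` });
            }
        });
    });
};
```

A call such as `await downloadBlobToFile(blobService, 'demo', 'quickstart.txt', './quickstart.downloaded.txt');` would then write the blob contents to disk.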

Delete a blob

The deleteBlob function calls the deleteBlobIfExists function. As the name implies, this function does not return an error if the blob does not exist.

const deleteBlob = async (containerName, blobName) => {
    return new Promise((resolve, reject) => {
        blobService.deleteBlobIfExists(containerName, blobName, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Block blob '${blobName}' deleted` });
            }
        });
    });
};

Delete a container

Containers are deleted by calling the deleteContainer method on the blob service and passing in the container name.

const deleteContainer = async (containerName) => {
    return new Promise((resolve, reject) => {
        blobService.deleteContainer(containerName, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Container '${containerName}' deleted` });
            }
        });
    });
};

Calling code

To support JavaScript's async/await syntax, all the calling code is wrapped in a function named execute, which is then invoked and handled as a promise.

async function execute() {
    // commands 
}

execute().then(() => console.log("Done")).catch((e) => console.log(e));

All of the following code runs inside the execute function where the // commands comment is placed.

First, the relevant variables are declared to hold the container and blob names, the sample content, and the path of the local file to upload to Blob storage.

const containerName = "demo";
const blobName = "quickstart.txt";
const content = "hello Node SDK";
const localFilePath = "./readme.md";
let response;

To list the containers in the storage account, the listContainers function is called and the returned list of containers is logged to the output window.

console.log("Containers:");
response = await listContainers();
response.containers.forEach((container) => console.log(` -  ${container.name}`));

Once the list of containers is available, you can use the Array findIndex method to check whether the container you want to create already exists. If it does not exist, the container is created.

const containerDoesNotExist = response.containers.findIndex((container) => container.name === containerName) === -1;

if (containerDoesNotExist) {
    await createContainer(containerName);
    console.log(`Container "${containerName}" is created`);
}

Next, a string and a local file are uploaded to Blob storage.

await uploadString(containerName, blobName, content);
console.log(`Blob "${blobName}" is uploaded`);

response = await uploadLocalFile(containerName, localFilePath);
console.log(response.message);

The process to list blobs is the same as listing containers. The call to listBlobs returns an array of blobs in the container, which are logged to the output window.

console.log(`Blobs in "${containerName}" container:`);
response = await listBlobs(containerName);
response.blobs.forEach((blob) => console.log(` - ${blob.name}`));

To download a blob, the response is captured and used to access the value of the blob. The response's text property holds the blob contents as a string, which is logged to the output window.

response = await downloadBlob(containerName, blobName);
console.log(`Downloaded blob content: "${response.text}"`);

Finally, the blob and container are deleted from the storage account.

await deleteBlob(containerName, blobName);
console.log(`Blob "${blobName}" is deleted`);

await deleteContainer(containerName);
console.log(`Container "${containerName}" is deleted`);

Clean up resources

All data written to the storage account is automatically deleted at the end of the code sample.

Resources for developing Node.js applications with blobs

See these additional resources for Node.js development with Blob storage:

Binaries and source code

Client library reference and samples

Next steps

This quickstart demonstrates how to transfer files between a local disk and Azure Blob storage using Node.js. To learn more about working with Blob storage, continue to the GitHub repository.