Quickstart: Upload, download, and list blobs using Node.js

In this quickstart, you learn how to use Node.js to upload, download, and list block blobs in a container using Azure Blob storage.

To complete this quickstart, you need an Azure subscription.

Create a storage account by using the Azure portal

First, create a new general-purpose storage account to use for this quickstart.

  1. Go to the Azure portal and sign in by using your Azure account.
  2. Enter a unique name for your storage account. Keep these rules in mind for naming your storage account:
    • The name must be 3 to 24 characters in length.
    • The name can contain numbers and lowercase letters only.
  3. Select your subscription.
  4. Create a new Resource group and give it a unique name.
  5. Select the Location to use for your storage account.
  6. Leave other fields set to their default values.
  7. Select Pin to dashboard and select Create to create your storage account.

After your storage account is created, it's pinned to the dashboard of the Azure portal. Select the storage account to open it. Under Settings, select Access keys. Select the primary account access key and click the Copy button to copy the associated connection string to the clipboard. Then paste the string into a text editor for later use.

Download the sample application

The sample application in this quickstart is a simple Node.js console application. To begin, clone the repository to your machine using the following command:

git clone https://github.com/Azure-Samples/storage-blobs-node-quickstart.git

To open the application, look for the storage-blobs-node-quickstart folder and open it in your favorite code editing environment.

Copy your credentials from the Azure portal

The sample application needs to authenticate access to your storage account. To authenticate, you provide the application with your storage account credentials in the form of a connection string. To view your storage account credentials:

  1. Navigate to the Azure portal.
  2. Locate your storage account.
  3. In the Settings section of the storage account overview, select Access keys. Your account access keys appear, as well as the complete connection string for each key.
  4. Find the Connection string value under key1, and click the Copy button to copy the connection string. You will add the connection string value to an environment variable in the next step.

    Screen shot showing how to copy a connection string from the Azure portal

Configure your storage connection string

Before running the application, you must provide the connection string for your storage account. The sample repository includes a file named .env.example. You can rename this file by removing the .example extension, which results in a file named .env. Inside the .env file, add your connection string value after the AZURE_STORAGE_CONNECTION_STRING key.
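As a sketch, the resulting .env file contains a single line like the following, where the value is the connection string you copied from the portal (the placeholders shown here are hypothetical, not real credentials):

```
AZURE_STORAGE_CONNECTION_STRING=DefaultEndpointsProtocol=https;AccountName=<youraccountname>;AccountKey=<youraccountkey>;EndpointSuffix=core.windows.net
```

The dotenv module reads this file at startup, so the application can access the value as process.env.AZURE_STORAGE_CONNECTION_STRING.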

Install required packages

In the application directory, run npm install to install the required packages for the application.

npm install

Run the sample

Now that the dependencies are installed, you can run the sample by passing commands to the script. For instance, to create a blob container, run the following command:

node index.js --command createContainer

Commands available include:

Command          Description
createContainer  Creates a container named test-container (succeeds even if the container already exists)
upload           Uploads the example.txt file to the test-container container
download         Downloads the contents of the example blob to example.downloaded.txt
delete           Deletes the example blob
list             Lists the contents of the test-container container to the console

Understanding the sample code

This code sample uses a few modules to interface with the file system and the command line.

if (process.env.NODE_ENV !== 'production') {
    require('dotenv').config();
}
const path = require('path');
const args = require('yargs').argv;
const storage = require('azure-storage');

The purpose of the modules is as follows:

  • dotenv loads environment variables defined in a file named .env into the current execution context
  • path is required in order to determine the absolute file path of the file to upload to blob storage
  • yargs exposes a simple interface to access command-line arguments
  • azure-storage is the Azure Storage SDK module for Node.js

Next, a series of variables are initialized:

const blobService = storage.createBlobService();
const containerName = 'test-container';
const sourceFilePath = path.resolve('./example.txt');
const blobName = path.basename(sourceFilePath, path.extname(sourceFilePath));

The variables are set to the following values:

  • blobService is set to a new instance of the Azure Blob service
  • containerName is set to the name of the container
  • sourceFilePath is set to the absolute path of the file to upload
  • blobName is created by taking the file name and removing the file extension

In the following implementation, each of the blobService functions is wrapped in a Promise, which allows access to JavaScript's async function and await operator to streamline the callback nature of the Azure Storage API. When a successful response returns for each function, the promise resolves with relevant data along with a message specific to the action.
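This pattern is not specific to the Azure Storage API. As a minimal sketch, any Node.js-style callback function can be wrapped the same way; the delayedUpper function below is a hypothetical stand-in for a callback-based SDK call:

```javascript
// Hypothetical callback-style function standing in for an Azure Storage call.
const delayedUpper = (text, callback) => {
    setImmediate(() => callback(null, text.toUpperCase()));
};

// Wrap the callback API in a Promise: reject on error, resolve with the result.
const delayedUpperAsync = text => {
    return new Promise((resolve, reject) => {
        delayedUpper(text, (err, result) => {
            if (err) {
                reject(err);
            } else {
                resolve(result);
            }
        });
    });
};

// The wrapped function can now be consumed with async/await.
const run = async () => {
    const result = await delayedUpperAsync('hello');
    console.log(result); // HELLO
};

run();
```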

Create a blob container

The createContainer function calls createContainerIfNotExists and sets the appropriate access level for the blob.

const createContainer = () => {
    return new Promise((resolve, reject) => {
        blobService.createContainerIfNotExists(containerName, { publicAccessLevel: 'blob' }, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Container '${containerName}' created` });
            }
        });
    });
};

The second parameter (options) of createContainerIfNotExists accepts a value for publicAccessLevel. The value blob specifies that the data of individual blobs is exposed to the public. This setting is in contrast to container-level access, which also grants the ability to list the contents of the container.

The use of createContainerIfNotExists allows the application to run the createContainer command multiple times without returning errors when the container already exists. In a production environment, you often only call createContainerIfNotExists once as the same container is used throughout the application. In these cases, you can create the container ahead of time through the portal or via the Azure CLI.

Upload a blob to the container

The upload function uses the createBlockBlobFromLocalFile function to upload a file from the file system into blob storage, overwriting any existing blob with the same name.

const upload = () => {
    return new Promise((resolve, reject) => {
        blobService.createBlockBlobFromLocalFile(containerName, blobName, sourceFilePath, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Upload of '${blobName}' complete` });
            }
        });
    });
};

In the context of the sample application, the file named example.txt is uploaded to a blob named example inside a container named test-container. Other approaches available for uploading content to blobs include working with text and streams.

To verify the file is uploaded to your blob storage, you can use the Azure Storage Explorer to view the data in your account.

List the blobs in a container

The list function calls the listBlobsSegmented method to return a list of blob metadata in a container.

const list = () => {
    return new Promise((resolve, reject) => {
        blobService.listBlobsSegmented(containerName, null, (err, data) => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Items in container '${containerName}':`, data: data });
            }
        });
    });
};

Calling listBlobsSegmented returns blob metadata as an array of BlobResult instances. Results are returned in batches (segments) of up to 5,000 blobs. If there are more than 5,000 blobs in a container, then the results include a value for continuationToken. To list subsequent segments from the blob container, you can pass the continuation token back into listBlobsSegmented as the second argument.
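To make the paging behavior concrete, the following hedged sketch drains all segments by passing the continuation token back in until none is returned. The mockService object is a hypothetical stand-in for blobService that simulates a container whose listing spans two segments, and the listAllBlobs helper is not part of the sample:

```javascript
// Hypothetical stand-in for blobService: returns results in two segments.
const segments = [
    { entries: [{ name: 'a' }, { name: 'b' }], continuationToken: 'next' },
    { entries: [{ name: 'c' }], continuationToken: null }
];
const mockService = {
    listBlobsSegmented: (container, token, callback) => {
        const index = token === 'next' ? 1 : 0;
        setImmediate(() => callback(null, segments[index]));
    }
};

// Keep requesting segments until the response has no continuation token.
const listAllBlobs = (service, container, token, entries = []) => {
    return new Promise((resolve, reject) => {
        service.listBlobsSegmented(container, token, (err, data) => {
            if (err) {
                reject(err);
            } else {
                const all = entries.concat(data.entries);
                if (data.continuationToken) {
                    resolve(listAllBlobs(service, container, data.continuationToken, all));
                } else {
                    resolve(all);
                }
            }
        });
    });
};

listAllBlobs(mockService, 'test-container', null)
    .then(blobs => console.log(blobs.map(b => b.name).join(','))); // a,b,c
```

The same loop shape works against the real blobService, since its listBlobsSegmented callback has the same (err, data) signature.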

Download a blob from the container

The download function uses getBlobToLocalFile to download the contents of the blob to the given absolute file path.

const download = () => {
    const downloadFilePath = sourceFilePath.replace('.txt', '.downloaded.txt');
    return new Promise((resolve, reject) => {
        blobService.getBlobToLocalFile(containerName, blobName, downloadFilePath, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Download of '${blobName}' complete` });
            }
        });
    });
};

The implementation shown here changes the source file path to append .downloaded.txt to the file name. In real-world contexts, you can change the location as well as the file name when selecting a download destination.

Delete blobs in the container

The deleteBlock function (aliased as the delete console command) calls the deleteBlobIfExists function. As the name implies, this function does not return an error if the blob has already been deleted.

const deleteBlock = () => {
    return new Promise((resolve, reject) => {
        blobService.deleteBlobIfExists(containerName, blobName, err => {
            if (err) {
                reject(err);
            } else {
                resolve({ message: `Block blob '${blobName}' deleted` });
            }
        });
    });
};

Upload and list

One of the benefits of using promises is being able to chain commands together. The uploadAndList function demonstrates how easy it is to list the contents of a container directly after uploading a file.

const uploadAndList = () => {
    return _module.upload().then(_module.list);
};

Calling code

To expose the functions implemented to the command line, each of the functions is mapped to an object literal.

const _module = {
    "createContainer": createContainer,
    "upload": upload,
    "download": download,
    "delete": deleteBlock,
    "list": list,
    "uploadAndList": uploadAndList
};

With _module now in place, each of the commands is available from the command line.

const commandExists = () => !!_module[args.command];

If a given command does not exist, then _module's properties are rendered to the console as help text for the user.

The function executeCommand is an async function, which calls the given command using the await operator and logs any messages and data to the console.

const executeCommand = async () => {
    const response = await _module[args.command]();

    console.log(response.message);

    if (response.data) {
        response.data.entries.forEach(entry => {
            console.log('Name:', entry.name, ' Type:', entry.blobType);
        });
    }
};

Finally, the executing code first calls commandExists to verify a known command is passed into the script. If an existing command is selected, then the command is run and any errors are logged to the console.

try {
    const cmd = args.command;

    console.log(`Executing '${cmd}'...`);

    if (commandExists()) {
        executeCommand();
    } else {
        console.log(`The '${cmd}' command does not exist. Try one of these:`);
        Object.keys(_module).forEach(key => console.log(` - ${key}`));
    }
} catch (e) {
    console.log(e);
}

Clean up resources

If you do not plan on using the data or accounts created in this article, you may want to delete them in order to avoid any undesired billing. To delete the blob and containers, you can use the deleteBlobIfExists and deleteContainerIfExists methods. You can also delete the storage account through the portal.

Resources for developing Node.js applications with blobs

See these additional resources for Node.js development with Blob storage:

Binaries and source code

Client library reference and samples

Next steps

This quickstart demonstrates how to transfer files between a local disk and Azure Blob storage by using Node.js. To learn more about working with Blob storage, continue to the Blob storage How-to.

For the Node.js reference for Azure Storage, see azure-storage package.