Azure Functions Storage table bindings

This article explains how to configure and code Azure Storage table bindings in Azure Functions. Azure Functions supports input and output bindings for Azure Storage tables.

The Storage table binding supports the following scenarios:

  • Read a single row in a C# or Node.js function - Set partitionKey and rowKey. The filter and take properties are not used in this scenario.
  • Read multiple rows in a C# function - The Functions runtime provides an IQueryable<T> object bound to the table. Type T must derive from TableEntity or implement ITableEntity. The partitionKey, rowKey, filter, and take properties are not used in this scenario; you can use the IQueryable object to do any filtering required.
  • Read multiple rows in a Node function - Set the filter and take properties. Don't set partitionKey or rowKey.
  • Write one or more rows in a C# function - The Functions runtime provides an ICollector<T> or IAsyncCollector<T> bound to the table, where T specifies the schema of the entities you want to add. Typically, type T derives from TableEntity or implements ITableEntity, but it doesn't have to. The partitionKey, rowKey, filter, and take properties are not used in this scenario.

This is reference information for Azure Functions developers. If you're new to Azure Functions, start with the Azure Functions developer reference.

Storage table input binding

The Azure Storage table input binding enables you to use a storage table in your function.

The Storage table input to a function uses the following JSON objects in the bindings array of function.json:

{
    "name": "<Name of input parameter in function signature>",
    "type": "table",
    "direction": "in",
    "tableName": "<Name of Storage table>",
    "partitionKey": "<PartitionKey of table entity to read - see below>",
    "rowKey": "<RowKey of table entity to read - see below>",
    "take": "<Maximum number of entities to read in Node.js - optional>",
    "filter": "<OData filter expression for table input in Node.js - optional>",
    "connection": "<Name of app setting - see below>",
}

Note the following:

  • Use partitionKey and rowKey together to read a single entity. These properties are optional.
  • connection must contain the name of an app setting that contains a storage connection string. In the Azure portal, the standard editor in the Integrate tab configures this app setting for you when you create a Storage account or select an existing one. You can also configure this app setting manually.

Input usage

In C# functions, you bind to the input table entity (or entities) by using a named parameter in your function signature, like <T> <name>, where T is the data type that you want to deserialize the data into, and name is the name you specified in the input binding. In Node.js functions, you access the input table entity (or entities) by using context.bindings.<name>.

The input data can be deserialized in Node.js or C# functions. The deserialized objects have RowKey and PartitionKey properties.

In C# functions, you can also bind to any of the following types, and the Functions runtime will attempt to deserialize the table data using that type:

  • Any type that implements ITableEntity
  • IQueryable<T>

Input sample

Suppose you have the following function.json, which uses a queue trigger to read a single table row. The JSON specifies the partition key and the row key. "rowKey": "{queueTrigger}" indicates that the row key comes from the queue message string.

{
  "bindings": [
    {
      "queueName": "myqueue-items",
      "connection": "MyStorageConnection",
      "name": "myQueueItem",
      "type": "queueTrigger",
      "direction": "in"
    },
    {
      "name": "personEntity",
      "type": "table",
      "tableName": "Person",
      "partitionKey": "Test",
      "rowKey": "{queueTrigger}",
      "connection": "MyStorageConnection",
      "direction": "in"
    }
  ],
  "disabled": false
}

See the language-specific sample that reads a single table entity.

Input sample in C#

public static void Run(string myQueueItem, Person personEntity, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
    log.Info($"Name in Person entity: {personEntity.Name}");
}

public class Person
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public string Name { get; set; }
}

Input sample in F#

[<CLIMutable>]
type Person = {
  PartitionKey: string
  RowKey: string
  Name: string
}

let Run(myQueueItem: string, personEntity: Person, log: TraceWriter) =
    log.Info(sprintf "F# Queue trigger function processed: %s" myQueueItem)
    log.Info(sprintf "Name in Person entity: %s" personEntity.Name)

Input sample in Node.js

module.exports = function (context, myQueueItem) {
    context.log('Node.js queue trigger function processed work item', myQueueItem);
    context.log('Person entity name: ' + context.bindings.personEntity.Name);
    context.done();
};
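To read multiple rows in a Node.js function, set the filter and take properties instead of partitionKey and rowKey, as described at the start of this article. The following function.json is a minimal sketch of that configuration, not one of the official samples; the queue trigger, table name, connection setting, filter expression, and take value are illustrative assumptions. The matching entities arrive as an array on context.bindings.peopleEntities.

{
  "bindings": [
    {
      "queueName": "myqueue-items",
      "connection": "MyStorageConnection",
      "name": "myQueueItem",
      "type": "queueTrigger",
      "direction": "in"
    },
    {
      "name": "peopleEntities",
      "type": "table",
      "tableName": "Person",
      "filter": "PartitionKey eq 'Test'",
      "take": 50,
      "connection": "MyStorageConnection",
      "direction": "in"
    }
  ],
  "disabled": false
}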

Storage table output binding

The Azure Storage table output binding enables you to write entities to a Storage table in your function.

The Storage table output for a function uses the following JSON objects in the bindings array of function.json:

{
    "name": "<Name of input parameter in function signature>",
    "type": "table",
    "direction": "out",
    "tableName": "<Name of Storage table>",
    "partitionKey": "<PartitionKey of table entity to write - see below>",
    "rowKey": "<RowKey of table entity to write - see below>",
    "connection": "<Name of app setting - see below>",
}

Note the following:

  • Use partitionKey and rowKey together to write a single entity. These properties are optional. You can also specify PartitionKey and RowKey when you create the entity objects in your function code.
  • connection must contain the name of an app setting that contains a storage connection string. In the Azure portal, the standard editor in the Integrate tab configures this app setting for you when you create a Storage account or select an existing one. You can also configure this app setting manually.

Output usage

In C# functions, you bind to the table output by using a named out parameter in your function signature, like out <T> <name>, where T is the data type that you want to serialize the data into, and name is the name you specified in the output binding. In Node.js functions, you access the table output by using context.bindings.<name>.

You can serialize objects in Node.js or C# functions. In C# functions, you can also bind to the following types:

  • Any type that implements ITableEntity
  • ICollector<T> (to output multiple entities; see the output sample below)
  • IAsyncCollector<T> (async version of ICollector<T>; see the sketch after the output sample)
  • CloudTable (to call the Azure Storage SDK directly; see the sketch after this list)
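When you bind to CloudTable, you can call the Azure Storage SDK directly from your function code. The run.csx below is a minimal sketch rather than an official sample: it assumes a function.json like the one in the output sample that follows, with the table binding named tableBinding, and it assumes a Person entity type that derives from TableEntity.

#r "Microsoft.WindowsAzure.Storage"

using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Table;

// Sketch: insert a single entity through the Storage SDK instead of a collector.
public static async Task Run(string input, CloudTable tableBinding, TraceWriter log)
{
    // The trigger payload is used as the row key here purely for illustration.
    var person = new Person { PartitionKey = "Test", RowKey = input, Name = "Name " + input };
    await tableBinding.ExecuteAsync(TableOperation.Insert(person));
    log.Info($"Inserted entity with RowKey {person.RowKey}");
}

public class Person : TableEntity
{
    public string Name { get; set; }
}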

Output sample

The following function.json and code samples show how to write multiple table entities.

{
  "bindings": [
    {
      "name": "input",
      "type": "manualTrigger",
      "direction": "in"
    },
    {
      "tableName": "Person",
      "connection": "MyStorageConnection",
      "name": "tableBinding",
      "type": "table",
      "direction": "out"
    }
  ],
  "disabled": false
}

See the language-specific sample that creates multiple table entities.

Output sample in C#

public static void Run(string input, ICollector<Person> tableBinding, TraceWriter log)
{
    for (int i = 1; i < 10; i++)
    {
        log.Info($"Adding Person entity {i}");
        tableBinding.Add(
            new Person() {
                PartitionKey = "Test",
                RowKey = i.ToString(),
                Name = "Name" + i.ToString() }
            );
    }
}

public class Person
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public string Name { get; set; }
}
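The same entities can be written without blocking by binding to IAsyncCollector<Person> instead of ICollector<Person>. The following run.csx is a minimal sketch of that variant, reusing the Person class from the sample above; it is an illustrative adaptation rather than an official sample.

public static async Task Run(string input, IAsyncCollector<Person> tableBinding, TraceWriter log)
{
    for (int i = 1; i < 10; i++)
    {
        log.Info($"Adding Person entity {i}");
        // AddAsync hands the entity to the binding, which writes it to the table.
        await tableBinding.AddAsync(
            new Person() {
                PartitionKey = "Test",
                RowKey = i.ToString(),
                Name = "Name" + i.ToString() }
            );
    }
}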

Output sample in F#

[<CLIMutable>]
type Person = {
  PartitionKey: string
  RowKey: string
  Name: string
}

let Run(input: string, tableBinding: ICollector<Person>, log: TraceWriter) =
    for i = 1 to 10 do
        log.Info(sprintf "Adding Person entity %d" i)
        tableBinding.Add(
            { PartitionKey = "Test"
              RowKey = i.ToString()
              Name = "Name" + i.ToString() })

Output sample in Node.js

module.exports = function (context) {

    context.bindings.tableBinding = [];

    for (var i = 1; i < 10; i++) {
        context.bindings.tableBinding.push({
            PartitionKey: "Test",
            RowKey: i.toString(),
            Name: "Name " + i
        });
    }

    context.done();
};

Sample: Read multiple table entities in C#

The following function.json and C# code example reads entities for a partition key that is specified in the queue message.

{
  "bindings": [
    {
      "queueName": "myqueue-items",
      "connection": "MyStorageConnection",
      "name": "myQueueItem",
      "type": "queueTrigger",
      "direction": "in"
    },
    {
      "name": "tableBinding",
      "type": "table",
      "connection": "MyStorageConnection",
      "tableName": "Person",
      "direction": "in"
    }
  ],
  "disabled": false
}

The C# code adds a reference to the Azure Storage SDK so that the entity type can derive from TableEntity.

#r "Microsoft.WindowsAzure.Storage"
using Microsoft.WindowsAzure.Storage.Table;

public static void Run(string myQueueItem, IQueryable<Person> tableBinding, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed: {myQueueItem}");
    foreach (Person person in tableBinding.Where(p => p.PartitionKey == myQueueItem).ToList())
    {
        log.Info($"Name: {person.Name}");
    }
}

public class Person : TableEntity
{
    public string Name { get; set; }
}

Next steps

For information about other bindings and triggers for Azure Functions, see the Azure Functions triggers and bindings developer reference.