Azure Functions C# developer reference

The C# experience for Azure Functions is based on the Azure WebJobs SDK. Data flows into your C# function via method arguments. Argument names are specified in function.json, and there are predefined names for accessing things like the function logger and cancellation tokens.
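
For example, the name property of a binding in function.json determines which method argument receives that binding's data. A minimal function.json sketch for a blob trigger (the container path shown is only an illustration):

{
  "bindings": [
    {
      "type": "blobTrigger",
      "direction": "in",
      "name": "myBlob",
      "path": "samples-workitems/{name}",
      "connection": "AzureWebJobsStorage"
    }
  ],
  "disabled": false
}

Here, the binding named myBlob is delivered to a method argument also named myBlob.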

This article assumes that you've already read the Azure Functions developer reference.

How .csx works

The .csx format allows you to write less "boilerplate" and focus on writing just a C# function. For Azure Functions, you include any assembly references and namespaces you need up top, as usual, and instead of wrapping everything in a namespace and class, you can just define your Run method. If you need to include any classes, for instance to define Plain Old CLR Object (POCO) types, you can include them in the same file.
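
For example, a minimal run.csx sketch of this structure, using a hypothetical queue-triggered function and a WorkItem POCO:

using System;   // namespace imports go at the top, as usual

// No namespace or class wrapper is required; just define Run.
public static void Run(string myQueueItem, TraceWriter log)
{
    var item = new WorkItem { Id = myQueueItem };
    log.Info($"Created work item {item.Id} at {DateTime.Now}");
}

// Additional classes, such as POCO definitions, can live in the same file.
public class WorkItem
{
    public string Id { get; set; }
}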

Binding to arguments

The various bindings are bound to a C# function via the name property in the function.json configuration. Each binding has its own set of supported types, which are documented for each binding; for instance, a blob trigger can bind to a string, a POCO, or several other types. Use the type that best suits your needs. A POCO must have a getter and setter defined for each property.

public static void Run(string myBlob, out MyClass myQueueItem, TraceWriter log)
{
    log.Verbose($"C# Blob trigger function processed: {myBlob}");
    myQueueItem = new MyClass() { Id = "myid" };
}

public class MyClass
{
    public string Id { get; set; }
}
Tip

If you plan to use the HTTP or WebHook bindings, we suggest reading this best practices document on HttpClient to avoid the port exhaustion that improper HttpClient instantiation can cause.
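
In short, that guidance recommends reusing a single HttpClient instance across invocations instead of creating a new one per request. A minimal sketch of that pattern in a run.csx, assuming an HTTP-triggered function and a placeholder downstream URL:

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

// Reuse one HttpClient across invocations to avoid exhausting outbound connections.
private static HttpClient httpClient = new HttpClient();

public static async Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)
{
    // Placeholder endpoint, shown only for illustration.
    var body = await httpClient.GetStringAsync("https://example.com/api/status");
    log.Info($"Downstream call returned {body.Length} characters.");
    return req.CreateResponse(HttpStatusCode.OK);
}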

Logging

To log output to your streaming logs in C#, include an argument of type TraceWriter. We recommend that you name it log. Avoid using Console.Write in Azure Functions.

public static void Run(string myBlob, TraceWriter log)
{
    log.Info($"C# Blob trigger function processed: {myBlob}");
}

Async

To make a function asynchronous, use the async keyword and return a Task object.

public static async Task ProcessQueueMessageAsync(
    string blobName,
    Stream blobInput,
    Stream blobOutput)
{
    await blobInput.CopyToAsync(blobOutput, 4096);
}

Cancellation token

In certain cases, you may have operations that are sensitive to being shut down. While it's always best to write code that can tolerate crashes, you can handle graceful shutdown requests by defining an argument of type CancellationToken. A CancellationToken is provided when a host shutdown is triggered.

public static async Task ProcessQueueMessageAsyncCancellationToken(
    string blobName,
    Stream blobInput,
    Stream blobOutput,
    CancellationToken token)
{
    await blobInput.CopyToAsync(blobOutput, 4096, token);
}

Importing namespaces

If you need to import namespaces, you can do so as usual, with the using directive.

using System.Net;
using System.Threading.Tasks;

public static Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)

The following namespaces are automatically imported and are therefore optional:

  • System
  • System.Collections.Generic
  • System.IO
  • System.Linq
  • System.Net.Http
  • System.Threading.Tasks
  • Microsoft.Azure.WebJobs
  • Microsoft.Azure.WebJobs.Host

Referencing external assemblies

For framework assemblies, add references by using the #r "AssemblyName" directive.

#r "System.Web.Http"

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;

public static Task<HttpResponseMessage> Run(HttpRequestMessage req, TraceWriter log)

The following assemblies are automatically added by the Azure Functions hosting environment:

  • mscorlib
  • System
  • System.Core
  • System.Xml
  • System.Net.Http
  • Microsoft.Azure.WebJobs
  • Microsoft.Azure.WebJobs.Host
  • Microsoft.Azure.WebJobs.Extensions
  • System.Web.Http
  • System.Net.Http.Formatting

In addition, the following assemblies are special-cased and may be referenced by simple name (for example, #r "Newtonsoft.Json"):

  • Newtonsoft.Json
  • Microsoft.WindowsAzure.Storage
  • Microsoft.ServiceBus
  • Microsoft.AspNet.WebHooks.Receivers
  • Microsoft.AspNet.WebHooks.Common
  • Microsoft.Azure.NotificationHubs
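
For example, a minimal sketch that references Newtonsoft.Json by its simple name and deserializes a hypothetical queue message:

#r "Newtonsoft.Json"

using Newtonsoft.Json;

public class Item
{
    public string Id { get; set; }
}

public static void Run(string myQueueItem, TraceWriter log)
{
    // Assumes the queue message is a JSON payload such as {"Id":"1234"}.
    var item = JsonConvert.DeserializeObject<Item>(myQueueItem);
    log.Info($"Deserialized item {item.Id}");
}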

If you need to reference a private assembly, you can upload the assembly file into a bin folder relative to your function and reference it by using the file name (e.g. #r "MyAssembly.dll"). For information on how to upload files to your function folder, see the following section on package management.

Package management

To use NuGet packages in a C# function, upload a project.json file to the function's folder in the function app's file system. Here is an example project.json file that adds a reference to Microsoft.ProjectOxford.Face version 1.1.0:

{
  "frameworks": {
    "net46":{
      "dependencies": {
        "Microsoft.ProjectOxford.Face": "1.1.0"
      }
    }
   }
}

Only the .NET Framework 4.6 is supported, so make sure that your project.json file specifies net46 as shown here.

When you upload a project.json file, the runtime gets the packages and automatically adds references to the package assemblies. You don't need to add #r "AssemblyName" directives. Just add the required using statements to your run.csx file to use the types defined in the NuGet packages.
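
For instance, once the project.json shown earlier is deployed, a run.csx sketch like the following can use the package's types directly; the FaceServiceClient constructor call and the FaceApiKey app setting name are illustrative assumptions, not part of the Functions runtime:

using System;
using Microsoft.ProjectOxford.Face;   // resolved from project.json; no #r directive is needed

public static void Run(string myQueueItem, TraceWriter log)
{
    // Illustrative only: create the client with a key stored in an app setting.
    var faceClient = new FaceServiceClient(Environment.GetEnvironmentVariable("FaceApiKey"));
    log.Info($"Face API client created; processing message: {myQueueItem}");
}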

In the Functions runtime, NuGet restore works by comparing project.json and project.lock.json. If the date and time stamps of the two files don't match, a restore runs and NuGet downloads the updated packages. If the time stamps match, NuGet skips the restore. For this reason, project.lock.json should not be deployed: its presence causes NuGet to skip the restore, and the function won't get the required packages. To avoid deploying the lock file, add project.lock.json to your .gitignore file.

How to upload a project.json file

  1. Begin by making sure your function app is running, which you can do by opening your function in the Azure portal.

    This also gives you access to the streaming logs, where package installation output is displayed.

  2. To upload a project.json file, use one of the methods described in the How to update function app files section of the Azure Functions developer reference topic.
  3. After the project.json file is uploaded, you see output like the following example in your function's streaming log:
2016-04-04T19:02:48.745 Restoring packages.
2016-04-04T19:02:48.745 Starting NuGet restore
2016-04-04T19:02:50.183 MSBuild auto-detection: using msbuild version '14.0' from 'D:\Program Files (x86)\MSBuild\14.0\bin'.
2016-04-04T19:02:50.261 Feeds used:
2016-04-04T19:02:50.261 C:\DWASFiles\Sites\facavalfunctest\LocalAppData\NuGet\Cache
2016-04-04T19:02:50.261 https://api.nuget.org/v3/index.json
2016-04-04T19:02:50.261 
2016-04-04T19:02:50.511 Restoring packages for D:\home\site\wwwroot\HttpTriggerCSharp1\Project.json...
2016-04-04T19:02:52.800 Installing Newtonsoft.Json 6.0.8.
2016-04-04T19:02:52.800 Installing Microsoft.ProjectOxford.Face 1.1.0.
2016-04-04T19:02:57.095 All packages are compatible with .NETFramework,Version=v4.6.
2016-04-04T19:02:57.189 
2016-04-04T19:02:57.189 
2016-04-04T19:02:57.455 Packages restored.

Environment variables

To get an environment variable or an app setting value, use System.Environment.GetEnvironmentVariable, as shown in the following code example:

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    log.Info($"C# Timer trigger function executed at: {DateTime.Now}");
    log.Info(GetEnvironmentVariable("AzureWebJobsStorage"));
    log.Info(GetEnvironmentVariable("WEBSITE_SITE_NAME"));
}

public static string GetEnvironmentVariable(string name)
{
    return name + ": " + 
        System.Environment.GetEnvironmentVariable(name, EnvironmentVariableTarget.Process);
}

Reusing .csx code

You can use classes and methods defined in other .csx files in your run.csx file. To do that, use #load directives in your run.csx file. In the following example, a logging routine named MyLogger is shared in mylogger.csx and loaded into run.csx by using the #load directive:

Example run.csx:

#load "mylogger.csx"

public static void Run(TimerInfo myTimer, TraceWriter log)
{
    log.Verbose($"Log by run.csx: {DateTime.Now}"); 
    MyLogger(log, $"Log by MyLogger: {DateTime.Now}");
}

Example mylogger.csx:

public static void MyLogger(TraceWriter log, string logtext)
{
    log.Verbose(logtext); 
}

Using a shared .csx file is a common pattern when you want to strongly type the arguments passed between functions by using a POCO object. In the following simplified example, an HTTP trigger and a queue trigger share a POCO object named Order to strongly type the order data:

Example run.csx for HTTP trigger:

#load "..\shared\order.csx"

using System.Net;

public static async Task<HttpResponseMessage> Run(Order req, IAsyncCollector<Order> outputQueueItem, TraceWriter log)
{
    log.Info("C# HTTP trigger function received an order.");
    log.Info(req.ToString());
    log.Info("Submitting to processing queue.");

    if (req.orderId == null)
    {
        return new HttpResponseMessage(HttpStatusCode.BadRequest);
    }
    else
    {
        await outputQueueItem.AddAsync(req);
        return new HttpResponseMessage(HttpStatusCode.OK);
    }
}

Example run.csx for queue trigger:

#load "..\shared\order.csx"

using System;

public static void Run(Order myQueueItem, out Order outputQueueItem, TraceWriter log)
{
    log.Info($"C# Queue trigger function processed order...");
    log.Info(myQueueItem.ToString());

    outputQueueItem = myQueueItem;
}

Example order.csx:

public class Order
{
    public string orderId { get; set; }
    public string custName { get; set; }
    public string custAddress { get; set; }
    public string custEmail { get; set; }
    public string cartId { get; set; }

    public override string ToString()
    {
        return "\n{\n\torderId : " + orderId + 
                  "\n\tcustName : " + custName +             
                  "\n\tcustAddress : " + custAddress +             
                  "\n\tcustEmail : " + custEmail +             
                  "\n\tcartId : " + cartId + "\n}";             
    }
}

You can use a relative path with the #load directive:

  • #load "mylogger.csx" loads a file located in the function folder.
  • #load "loadedfiles\mylogger.csx" loads a file located in a folder in the function folder.
  • #load "..\shared\mylogger.csx" loads a file located in a folder at the same level as the function folder, that is, directly under wwwroot.

The #load directive works only with .csx (C# script) files, not with .cs files.

Versioning

The Functions runtime runs as a site extension to your Function App. Site extensions are extensibility points that enable you to add features to an Azure App Service Web App or Function App. Kudu and Monaco are two examples of site extensions, and you can create and use custom extensions as well. You configure the version of the Functions runtime extension by using the FUNCTIONS_EXTENSION_VERSION app setting.

The FUNCTIONS_EXTENSION_VERSION setting pins only the major version of the runtime. For example, the value "~1" indicates that your Function App uses 1 as its major version. Function Apps are upgraded to each new minor version as it is released. This lets you control when to move to a new major version and avoid breaking changes.

Additionally, you may want to upgrade the runtime before it becomes the default version in the portal. Don't worry though, you can roll back at any time by reverting the FUNCTIONS_EXTENSION_VERSION setting to its old value.

To determine your Azure Function App's runtime version:

In Kudu, open the applicationhost.config file in the D:\local\Config folder. The virtualDirectory entry reveals the exact Functions runtime version:

<virtualDirectory path="/" physicalPath="D:\Program Files (x86)\SiteExtensions\Functions\0.8.10564" />

You can use this value to pin your Function App to a specific major and minor runtime version. Whenever you change the version of a Function App, you must restart it.

Advanced binding at runtime (imperative binding)

In C# and other .NET languages, you can use an imperative binding pattern, as opposed to the declarative bindings in function.json. Imperative binding is useful when binding parameters need to be computed at runtime rather than design time. With this pattern, you can bind to any number of supported input and output bindings on the fly in your function code.

Define an imperative binding as follows:

  • Do not include an entry in function.json for your desired imperative bindings.
  • Pass in an input parameter Binder binder or IBinder binder.
  • Use the following C# pattern to perform the data binding.
using (var output = await binder.BindAsync<T>(new BindingTypeAttribute(...)))
{
    ...
}

where BindingTypeAttribute is the .NET attribute that defines your binding and T is the input or output type that's supported by that binding type. T also cannot be an out parameter type (such as out JObject). For example, the Mobile Apps table output binding supports six output types, but you can only use ICollector or IAsyncCollector for T.
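
For instance, a minimal sketch, assuming a Storage queue named outqueue, that binds an IAsyncCollector<string> at run time and adds a message to it:

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host.Bindings.Runtime;

public static async Task Run(string input, Binder binder)
{
    // IAsyncCollector<T> isn't disposable, so no using block is needed here.
    var collector = await binder.BindAsync<IAsyncCollector<string>>(new QueueAttribute("outqueue"));
    await collector.AddAsync($"Processed: {input}");
}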

The following example code creates a Storage blob output binding with a blob path that's defined at run time, and then writes a string to the blob.

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host.Bindings.Runtime;

public static async Task Run(string input, Binder binder)
{
    using (var writer = await binder.BindAsync<TextWriter>(new BlobAttribute("samples-output/path")))
    {
        writer.Write("Hello World!!");
    }
}

BlobAttribute defines the Storage blob input or output binding, and TextWriter is a supported output binding type. As written, the code uses the default app setting for the Storage account connection string (AzureWebJobsStorage). You can specify a custom app setting to use by adding a StorageAccountAttribute and passing the attribute array into BindAsync<T>(). For example:

using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host.Bindings.Runtime;

public static async Task Run(string input, Binder binder)
{
    var attributes = new Attribute[]
    {    
        new BlobAttribute("samples-output/path"),
        new StorageAccountAttribute("MyStorageAccount")
    };

    using (var writer = await binder.BindAsync<TextWriter>(attributes))
    {
        writer.Write("Hello World!");
    }
}

The following table lists the .NET attributes for each binding type and the packages in which they are defined.

Binding | Attribute | Add reference
DocumentDB | Microsoft.Azure.WebJobs.DocumentDBAttribute | #r "Microsoft.Azure.WebJobs.Extensions.DocumentDB"
Event Hubs | Microsoft.Azure.WebJobs.ServiceBus.EventHubAttribute, Microsoft.Azure.WebJobs.ServiceBusAccountAttribute | #r "Microsoft.Azure.WebJobs.ServiceBus"
Mobile Apps | Microsoft.Azure.WebJobs.MobileTableAttribute | #r "Microsoft.Azure.WebJobs.Extensions.MobileApps"
Notification Hubs | Microsoft.Azure.WebJobs.NotificationHubAttribute | #r "Microsoft.Azure.WebJobs.Extensions.NotificationHubs"
Service Bus | Microsoft.Azure.WebJobs.ServiceBusAttribute, Microsoft.Azure.WebJobs.ServiceBusAccountAttribute | #r "Microsoft.Azure.WebJobs.ServiceBus"
Storage queue | Microsoft.Azure.WebJobs.QueueAttribute, Microsoft.Azure.WebJobs.StorageAccountAttribute | (referenced automatically)
Storage blob | Microsoft.Azure.WebJobs.BlobAttribute, Microsoft.Azure.WebJobs.StorageAccountAttribute | (referenced automatically)
Storage table | Microsoft.Azure.WebJobs.TableAttribute, Microsoft.Azure.WebJobs.StorageAccountAttribute | (referenced automatically)
Twilio | Microsoft.Azure.WebJobs.TwilioSmsAttribute | #r "Microsoft.Azure.WebJobs.Extensions.Twilio"

Next steps

For more information, see the following resources: