The output binding allows you to modify and delete blob storage data in an Azure Function.
The following example is a C# function that uses a blob trigger and two output blob bindings. The function is triggered by the creation of an image blob in the sample-images container. It creates small- and medium-sized copies of the image blob.
using System.Collections.Generic;
using System.IO;
using Microsoft.Azure.WebJobs;
using SixLabors.ImageSharp;
using SixLabors.ImageSharp.Formats;
using SixLabors.ImageSharp.PixelFormats;
using SixLabors.ImageSharp.Processing;

public class ResizeImages
{
    [FunctionName("ResizeImage")]
    public static void Run(
        [BlobTrigger("sample-images/{name}")] Stream image,
        [Blob("sample-images-sm/{name}", FileAccess.Write)] Stream imageSmall,
        [Blob("sample-images-md/{name}", FileAccess.Write)] Stream imageMedium)
    {
        IImageFormat format;

        using (Image<Rgba32> input = Image.Load<Rgba32>(image, out format))
        {
            ResizeImage(input, imageSmall, ImageSize.Small, format);
        }

        // Rewind the trigger stream before loading the image a second time.
        image.Position = 0;
        using (Image<Rgba32> input = Image.Load<Rgba32>(image, out format))
        {
            ResizeImage(input, imageMedium, ImageSize.Medium, format);
        }
    }

    public static void ResizeImage(Image<Rgba32> input, Stream output, ImageSize size, IImageFormat format)
    {
        var dimensions = imageDimensionsTable[size];
        input.Mutate(x => x.Resize(dimensions.Item1, dimensions.Item2));
        input.Save(output, format);
    }

    public enum ImageSize { ExtraSmall, Small, Medium }

    private static Dictionary<ImageSize, (int, int)> imageDimensionsTable = new Dictionary<ImageSize, (int, int)>() {
        { ImageSize.ExtraSmall, (320, 200) },
        { ImageSize.Small, (640, 400) },
        { ImageSize.Medium, (800, 600) }
    };
}
The following example shows blob input and output bindings in a function.json file and C# script (.csx) code that uses the bindings. The function makes a copy of a text blob. The function is triggered by a queue message that contains the name of the blob to copy. The new blob is named {originalblobname}-Copy.
In the function.json file, the queueTrigger metadata property is used to specify the blob name in the path properties:
{
  "bindings": [
    {
      "queueName": "myqueue-items",
      "connection": "MyStorageConnectionAppSetting",
      "name": "myQueueItem",
      "type": "queueTrigger",
      "direction": "in"
    },
    {
      "name": "myInputBlob",
      "type": "blob",
      "path": "samples-workitems/{queueTrigger}",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "in"
    },
    {
      "name": "myOutputBlob",
      "type": "blob",
      "path": "samples-workitems/{queueTrigger}-Copy",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "out"
    }
  ],
  "disabled": false
}
The configuration section explains these properties.
Here's the C# script code:
public static void Run(string myQueueItem, string myInputBlob, out string myOutputBlob, ILogger log)
{
    log.LogInformation($"C# Queue trigger function processed: {myQueueItem}");
    myOutputBlob = myInputBlob;
}
This section contains the following examples:
- HTTP trigger, using OutputBinding
- Queue trigger, using function return value
HTTP trigger, using OutputBinding (Java)
The following example shows a Java function that uses the HttpTrigger annotation to receive a parameter containing the name of a file in a blob storage container. The BlobInput annotation then reads the file and passes its contents to the function as a byte[]. The BlobOutput annotation binds to OutputBinding<String> outputItem, which the function then uses to write the contents of the input blob to the configured storage container.
@FunctionName("copyBlobHttp")
@StorageAccount("Storage_Account_Connection_String")
public HttpResponseMessage copyBlobHttp(
@HttpTrigger(name = "req",
methods = {HttpMethod.GET},
authLevel = AuthorizationLevel.ANONYMOUS)
HttpRequestMessage<Optional<String>> request,
@BlobInput(
name = "file",
dataType = "binary",
path = "samples-workitems/{Query.file}")
byte[] content,
@BlobOutput(
name = "target",
path = "myblob/{Query.file}-CopyViaHttp")
OutputBinding<String> outputItem,
final ExecutionContext context) {
// Save blob to outputItem
outputItem.setValue(new String(content, StandardCharsets.UTF_8));
// build HTTP response with size of requested blob
return request.createResponseBuilder(HttpStatus.OK)
.body("The size of \"" + request.getQueryParameters().get("file") + "\" is: " + content.length + " bytes")
.build();
}
Queue trigger, using function return value (Java)
The following example shows a Java function that uses the QueueTrigger annotation to receive a message containing the name of a file in a blob storage container. The BlobInput annotation then reads the file and passes its contents to the function as a String. The BlobOutput annotation binds to the function return value, which the runtime then uses to write the contents of the input blob to the configured storage container.
@FunctionName("copyBlobQueueTrigger")
@StorageAccount("Storage_Account_Connection_String")
@BlobOutput(
name = "target",
path = "myblob/{queueTrigger}-Copy")
public String copyBlobQueue(
@QueueTrigger(
name = "filename",
dataType = "string",
queueName = "myqueue-items")
String filename,
@BlobInput(
name = "file",
path = "samples-workitems/{queueTrigger}")
String content,
final ExecutionContext context) {
context.getLogger().info("The content of \"" + filename + "\" is: " + content);
return content;
}
In the Java functions runtime library, use the @BlobOutput annotation on function parameters whose value would be written to an object in blob storage. The parameter type should be OutputBinding<T>, where T is any native Java type or a POJO.
The following example shows blob input and output bindings in a function.json file and JavaScript code that uses the bindings. The function makes a copy of a blob. The function is triggered by a queue message that contains the name of the blob to copy. The new blob is named {originalblobname}-Copy.
In the function.json file, the queueTrigger metadata property is used to specify the blob name in the path properties:
{
  "bindings": [
    {
      "queueName": "myqueue-items",
      "connection": "MyStorageConnectionAppSetting",
      "name": "myQueueItem",
      "type": "queueTrigger",
      "direction": "in"
    },
    {
      "name": "myInputBlob",
      "type": "blob",
      "path": "samples-workitems/{queueTrigger}",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "in"
    },
    {
      "name": "myOutputBlob",
      "type": "blob",
      "path": "samples-workitems/{queueTrigger}-Copy",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "out"
    }
  ],
  "disabled": false
}
The configuration section explains these properties.
Here's the JavaScript code:
module.exports = function(context) {
    context.log('Node.js Queue trigger function processed', context.bindings.myQueueItem);
    context.bindings.myOutputBlob = context.bindings.myInputBlob;
    context.done();
};
The following example demonstrates how to create a copy of an incoming blob as the output from a PowerShell function.
In the function's configuration file (function.json), the trigger metadata property is used to specify the output blob name in the path properties.
Note
To avoid infinite loops, make sure your input and output paths are different.
{
  "bindings": [
    {
      "name": "myInputBlob",
      "path": "data/{trigger}",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "in",
      "type": "blobTrigger"
    },
    {
      "name": "myOutputBlob",
      "type": "blob",
      "path": "data/copy/{trigger}",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "out"
    }
  ],
  "disabled": false
}
Here's the PowerShell code:
# Input bindings are passed in via param block.
param([byte[]] $myInputBlob, $TriggerMetadata)
Write-Host "PowerShell Blob trigger function processed blob name: $($TriggerMetadata.Name)"
Push-OutputBinding -Name myOutputBlob -Value $myInputBlob
The following example shows blob input and output bindings in a function.json file and Python code that uses the bindings. The function makes a copy of a blob. The function is triggered by a queue message that contains the name of the blob to copy. The new blob is named {originalblobname}-Copy.
In the function.json file, the queueTrigger metadata property is used to specify the blob name in the path properties:
{
  "bindings": [
    {
      "queueName": "myqueue-items",
      "connection": "MyStorageConnectionAppSetting",
      "name": "queuemsg",
      "type": "queueTrigger",
      "direction": "in"
    },
    {
      "name": "inputblob",
      "type": "blob",
      "dataType": "binary",
      "path": "samples-workitems/{queueTrigger}",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "in"
    },
    {
      "name": "outputblob",
      "type": "blob",
      "dataType": "binary",
      "path": "samples-workitems/{queueTrigger}-Copy",
      "connection": "MyStorageConnectionAppSetting",
      "direction": "out"
    }
  ],
  "disabled": false,
  "scriptFile": "__init__.py"
}
The configuration section explains these properties.
Here's the Python code:
import logging
import azure.functions as func


def main(queuemsg: func.QueueMessage, inputblob: bytes, outputblob: func.Out[bytes]):
    logging.info(f'Python Queue trigger function processed {len(inputblob)} bytes')
    outputblob.set(inputblob)
The following binding configuration properties are set in the function.json file and the Blob attribute:
- type: Must be set to blob.
- direction: Must be set to out for an output binding.
- name: The name of the variable that represents the blob in function code.
- path: The path to the blob.
- connection: The name of an app setting that contains the Storage connection string to use for this binding.
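For orientation, here's a minimal sketch of the queue-triggered copy from earlier, declared with the Blob attribute in a C# class library. The function and parameter names are illustrative, and the Connection property is optional (it defaults to the AzureWebJobsStorage app setting):

using System.IO;
using Microsoft.Azure.WebJobs;

public static class CopyBlob
{
    [FunctionName("CopyBlob")]
    public static void Run(
        [QueueTrigger("myqueue-items")] string myQueueItem,
        // Input blob: path is built from the queue message via {queueTrigger}.
        [Blob("samples-workitems/{queueTrigger}", Connection = "MyStorageConnectionAppSetting")] string myInputBlob,
        // Output blob: same name with a -Copy suffix, opened for writing.
        [Blob("samples-workitems/{queueTrigger}-Copy", FileAccess.Write, Connection = "MyStorageConnectionAppSetting")] out string myOutputBlob)
    {
        myOutputBlob = myInputBlob;
    }
}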
Default
You can bind to the following types to write blobs:
- TextWriter
- out string
- out Byte[]
- CloudBlobStream
- Stream
- CloudBlobContainer¹
- CloudBlobDirectory
- ICloudBlob²
- CloudBlockBlob²
- CloudPageBlob²
- CloudAppendBlob²

¹ Requires "in" binding direction in function.json or FileAccess.Read in a C# class library. However, you can use the container object that the runtime provides to do write operations, such as uploading blobs to the container.

² Requires "inout" binding direction in function.json or FileAccess.ReadWrite in a C# class library.
If you try to bind to one of the Storage SDK types and get an error message, make sure that you have a reference to the correct Storage SDK version.
Binding to string or Byte[] is only recommended if the blob size is small, as the entire blob contents are loaded into memory. Generally, it is preferable to use a Stream or CloudBlockBlob type. For more information, see Concurrency and memory usage earlier in this article.
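As an illustrative sketch (the names are hypothetical, and this assumes the pre-5.x Storage extension, where CloudBlockBlob lives in Microsoft.Azure.Storage.Blob), a queue-triggered function can copy through a CloudBlockBlob bound with FileAccess.ReadWrite rather than buffering the whole blob:

using System.IO;
using System.Threading.Tasks;
using Microsoft.Azure.Storage.Blob;
using Microsoft.Azure.WebJobs;

public static class StreamCopy
{
    [FunctionName("StreamCopy")]
    public static async Task Run(
        [QueueTrigger("myqueue-items")] string myQueueItem,
        [Blob("samples-workitems/{queueTrigger}", FileAccess.Read)] Stream myInputBlob,
        // CloudBlockBlob requires FileAccess.ReadWrite (see footnote 2 above).
        [Blob("samples-workitems/{queueTrigger}-Copy", FileAccess.ReadWrite)] CloudBlockBlob myOutputBlob)
    {
        // Upload reads from the input stream in chunks, so the full blob is never held in memory.
        await myOutputBlob.UploadFromStreamAsync(myInputBlob);
    }
}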
Additional types
Apps using the 5.0.0 or higher version of the Storage extension may also use types from the Azure SDK for .NET. This version drops support for the legacy CloudBlobContainer, CloudBlobDirectory, ICloudBlob, CloudBlockBlob, CloudPageBlob, and CloudAppendBlob types in favor of the following types:
- BlobClient²
- BlobBaseClient²
- BlockBlobClient²
- PageBlobClient²
- AppendBlobClient²
- BlobContainerClient¹

¹ Requires "in" binding direction in function.json or FileAccess.Read in a C# class library. However, you can use the container object that the runtime provides to do write operations, such as uploading blobs to the container.

² Requires "inout" binding direction in function.json or FileAccess.ReadWrite in a C# class library.
For examples using these types, see the GitHub repository for the extension.
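For instance, here's a rough sketch (assuming the 5.x Storage extension and illustrative names) that uploads a new blob through a bound BlobContainerClient from the Azure SDK for .NET:

using System.IO;
using System.Text;
using System.Threading.Tasks;
using Azure.Storage.Blobs;
using Microsoft.Azure.WebJobs;

public static class UploadViaContainerClient
{
    [FunctionName("UploadViaContainerClient")]
    public static async Task Run(
        [QueueTrigger("myqueue-items")] string myQueueItem,
        // Bind to the container itself, then upload a blob named after the queue message.
        [Blob("samples-workitems")] BlobContainerClient container)
    {
        using var content = new MemoryStream(Encoding.UTF8.GetBytes(myQueueItem));
        await container.UploadBlobAsync($"{myQueueItem}-Copy", content);
    }
}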
The @BlobOutput attribute binds a function parameter or the return value to the blob that the function writes. If you use a byte array with the attribute, set dataType to binary. Refer to the output example for details.
Access the blob data using context.bindings.<BINDING_NAME>, where the binding name is defined in the function.json file.
Access the blob data via a parameter that matches the name designated by the binding's name parameter in the function.json file.
You can declare function parameters as the following types to write out to blob storage:
- Strings as func.Out[str]
- Streams as func.Out[func.InputStream]

Refer to the output example for details.