Upload file to Azure Blob Storage with an Azure Function
This article shows you how to create an Azure Function API that uploads a file to Azure Storage, using an out binding to move the file contents from the API to Storage.
Solution architecture considerations
Caution
The Azure Function file upload limit is 100 MB. If you need to upload larger files, consider either a browser-based approach or a server app.
This sample:

- Uploads a file to an Azure Function.
- Uses the parse-multipart npm package to get information about the uploaded file.
- Uses @azure/storage-blob to generate a blob SAS token URL for the file. The URL should be handed back to a client or other service to read the file with authorization.
- Uses a Function App out binding to upload the file to Blob Storage. This is the easiest way to get a file into Blob Storage.

While you can replace the out binding with more code that uploads the file to Blob Storage with the SDK, no out binding can generate the SAS token URL; the SDK is required for that. As you move from beginning code for this functionality to more complex code, you'll typically replace the out binding with SDK upload calls.
Prepare your development environment
Make sure the following are installed on your local developer workstation:

- An Azure account with an active subscription that you own. Create an account for free.
  - Ownership is required to grant the correct Azure Active Directory permissions to complete these steps.
- Node.js LTS and npm - for local development.
- Visual Studio Code - to develop locally and to deploy to Azure.
- Visual Studio Code extensions: the Azure Functions extension, used throughout this article to create and deploy the Function App.
1. Create a resource group
A resource group holds both the Azure Function resource and the Azure Storage resource. Because both resources are in a single resource group, when you want to remove these resources, you remove the resource group. That action removes all resources in the resource group.
In Visual Studio Code, select the Azure explorer, then select the + (Plus/Addition) icon under Resources.
Select Create Resource Group from the list of resources.
Use the following table to finish creating the resource group:
| Prompt | Value | Notes |
|--|--|--|
| Enter the name of the new resource group. | `blob-storage-upload-function-group` | If you choose a different name, remember to use it as a replacement for this name when you see it in the rest of this article. |
| Select a location for new resources. | Select a region close to you. | |
2. Create the local Function app
Create a new folder on your local workstation, then open Visual Studio Code in this folder.
In Visual Studio Code, open the Command Palette (View -> Command Palette | Ctrl + Shift + P), then filter and select Azure Functions: Create New Project...
Use the following table to finish creating the local Azure Function project:
| Prompt | Value | Notes |
|--|--|--|
| Select the folder that will contain your function project. | Select the current folder, which is the default value. | |
| Select a language | TypeScript | |
| Select a template for your project's first function | HTTP Trigger | The API is invoked with an HTTP request. |
| Provide a function name | `upload` | The API route is `/api/upload`. |
| Authorization Level | Function | This locks the remote API to requests that pass the function key with the request. While developing locally, you won't need the function key. |

This process doesn't create a cloud-based Azure Function resource yet. That step comes later.
Return to the Visual Studio Code File Explorer.
After a few moments, Visual Studio Code completes creation of the local project, including a folder named for the function, upload, within which are three files:
| Filename | Description |
|--|--|
| `index.ts` | The source code that responds to the HTTP request. |
| `function.json` | The binding configuration for the HTTP trigger. |
| `sample.dat` | A placeholder data file to demonstrate that you can have other files in the folder. You can delete this file, if desired, as it's not used in this tutorial. |
3. Install dependencies
In Visual Studio Code, open an integrated bash terminal, Ctrl + `.
Install npm dependencies:
```bash
npm install
```
4. Install and start Azurite storage emulator
Now that the basic project folder structure and files are in place, add local storage emulation.
To emulate the Azure Storage service locally, install Azurite.
```bash
npm install azurite
```

Create a folder to hold the storage files inside your local project folder:

```bash
mkdir azureStorage
```

To start the Azurite emulator, add an npm script to the end of the `scripts` property items in the package.json file:

```json
"start-azurite": "azurite --silent --location azureStorage --debug azureStorage/debug.log"
```

This action uses the local folder `azureStorage` to hold the storage files and logs.

In a new Visual Studio Code bash terminal, start the emulator:

```bash
npm run start-azurite
```

Don't close this terminal during the article until the cleanup step.
5. Add code to manage file upload
In a new Visual Studio Code integrated bash terminal, add npm packages to handle file tasks:
```bash
npm install http-status-enum parse-multipart @types/parse-multipart
```

Leave this terminal open to use other script commands. You should have two terminal windows open: one window running the Azurite storage emulator, and this terminal for commands.
Open the `./upload/index.ts` file and replace the contents with the following code:

```typescript
import { AzureFunction, Context, HttpRequest } from "@azure/functions";
import HTTP_CODES from "http-status-enum";

// Multipart form management
import * as multipart from "parse-multipart";

// Used to get read-only SAS token URL
import { generateReadOnlySASUrl } from './azure-storage-blob-sas-url';

const httpTrigger: AzureFunction = async function (context: Context, req: HttpRequest): Promise<any> {
    context.log('upload HTTP trigger function processed a request.');

    // get connection string to Azure Storage from environment variables
    // Replace with DefaultAzureCredential before moving to production
    const storageConnectionString = process.env.AzureWebJobsStorage;
    if (!storageConnectionString) {
        context.res.body = `AzureWebJobsStorage env var is not defined - get Storage Connection string from Azure portal`;
        context.res.status = HTTP_CODES.BAD_REQUEST
    }

    // User name is the container name
    const containerName = req.query?.username;
    if (!containerName) {
        context.res.body = `username is not defined`;
        context.res.status = HTTP_CODES.BAD_REQUEST
    }

    // `filename` is required property to use multi-part npm package
    const fileName = req.query?.filename;
    if (!fileName) {
        context.res.body = `filename is not defined`;
        context.res.status = HTTP_CODES.BAD_REQUEST
    }

    // file content must be passed in as body
    if (!req.body || !req.body.length) {
        context.res.body = `Request body is not defined`;
        context.res.status = HTTP_CODES.BAD_REQUEST
    }

    // Content type is required to know how to parse multi-part form
    if (!req.headers || !req.headers["content-type"]) {
        context.res.body = `Content type is not sent in header 'content-type'`;
        context.res.status = HTTP_CODES.BAD_REQUEST
    }

    context.log(`*** Username:${req.query?.username}, Filename:${req.query?.filename}, Content type:${req.headers["content-type"]}, Length:${req.body.length}`);

    try {
        // Each chunk of the file is delimited by a special string
        const bodyBuffer = Buffer.from(req.body);
        const boundary = multipart.getBoundary(req.headers["content-type"]);
        const parts = multipart.Parse(bodyBuffer, boundary);

        // The file buffer is corrupted or incomplete?
        if (!parts?.length) {
            context.res.body = `File buffer is incorrect`;
            context.res.status = HTTP_CODES.BAD_REQUEST
        }

        // filename is a required property of the parse-multipart package
        if (parts[0]?.filename) console.log(`Original filename = ${parts[0]?.filename}`);
        if (parts[0]?.type) console.log(`Content type = ${parts[0]?.type}`);
        if (parts[0]?.data?.length) console.log(`Size = ${parts[0]?.data?.length}`);

        // Passed to Storage
        context.bindings.storage = parts[0]?.data;

        // Get SAS token
        const sasInfo = await generateReadOnlySASUrl(process.env.AzureWebJobsStorage, containerName, fileName);

        // Returned to requestor
        context.res.body = {
            fileName,
            storageAccountName: sasInfo.storageAccountName,
            containerName,
            url: sasInfo.accountSasTokenUrl,
        };
    } catch (err) {
        context.log.error(err.message);
        context.res.body = { error: `${err.message}` };
        context.res.status = HTTP_CODES.INTERNAL_SERVER_ERROR;
    }
    return context.res;
};

export default httpTrigger;
```

The `filename` query string parameter is required because the out binding needs to know the name of the file to create. The `username` query string parameter is required because it becomes the Storage container (folder) name. For example, if the user name is `jsmith` and the file name is `test-file.txt`, the Storage location is `jsmith/test-file.txt`. The line that reads the file and sends it to the out binding is `context.bindings.storage = parts[0]?.data;`.
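The multipart handling above can be illustrated with a minimal, self-contained sketch. This is a deliberate simplification of what the parse-multipart package does (the real package handles binary data and more header variants); it builds a tiny multipart body, extracts the boundary from the content type, and pulls out the first part's filename and data.

```typescript
// Minimal sketch of multipart/form-data handling, simplified from what the
// parse-multipart package does. Assumes a single, well-formed text part.

// Extract the boundary string from a Content-Type header.
function getBoundary(contentType: string): string {
  const match = contentType.match(/boundary=(?:"([^"]+)"|([^;\s]+))/);
  if (!match) throw new Error("no boundary in content-type");
  return match[1] ?? match[2] ?? "";
}

// Parse the first file part out of a multipart body: return its filename and data.
function parseFirstPart(body: string, boundary: string): { filename: string; data: string } {
  const delimiter = `--${boundary}`;
  // Each part sits between boundary delimiters.
  const segment = body.split(delimiter).find((s) => s.includes("filename="));
  if (!segment) throw new Error("no file part found");
  const nameMatch = /filename="([^"]+)"/.exec(segment);
  if (!nameMatch) throw new Error("part has no filename");
  // Part headers and data are separated by a blank line (CRLF CRLF).
  const data = segment.split("\r\n\r\n")[1] ?? "";
  return { filename: nameMatch[1], data: data.replace(/\r\n$/, "") };
}

// Build a matching multipart body to demonstrate a round trip.
const boundary = "------------------------a7f29ae099b687a4";
const contentType = `multipart/form-data; boundary=${boundary}`;
const body =
  `--${boundary}\r\n` +
  `Content-Disposition: form-data; name="filename"; filename="test-file.txt"\r\n` +
  `Content-Type: text/plain\r\n` +
  `\r\n` +
  `hello azure\r\n` +
  `--${boundary}--\r\n`;

const part = parseFirstPart(body, getBoundary(contentType));
console.log(part.filename); // test-file.txt
console.log(part.data);     // hello azure
```

This is why the function requires the `content-type` header: without the boundary advertised there, the parts of the body can't be separated.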
Create a new file named `azure-storage-blob-sas-url.ts`, then copy the following code into the file to generate a SAS token for the uploaded file.

```typescript
// Used to get read-only SAS token URL
import {
  BlobSASPermissions,
  BlobServiceClient,
  SASProtocol,
} from "@azure/storage-blob";

/**
 * Utility method for generating a secure short-lived SAS URL for a blob.
 * To learn more about SAS URLs, see: https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview
 * @param connectionString - string
 * @param containerName - string (user's alias)
 * @param filename - string
 */
export const generateReadOnlySASUrl = async (
  connectionString: string,
  containerName: string,
  filename: string
) => {
  // get storage client
  const blobServiceClient = BlobServiceClient.fromConnectionString(connectionString);

  // get container client
  const containerClient = blobServiceClient.getContainerClient(containerName);

  // connect to blob client
  const blobClient = containerClient.getBlobClient(filename);

  // Best practice: set time limits
  const SIXTY_MINUTES = 60 * 60 * 1000;
  const NOW = new Date();

  // Create SAS URL
  const accountSasTokenUrl = await blobClient.generateSasUrl({
    startsOn: NOW,
    expiresOn: new Date(NOW.valueOf() + SIXTY_MINUTES),
    permissions: BlobSASPermissions.parse("r"), // Read-only permission to the blob
    protocol: SASProtocol.Https, // Only allow HTTPS access to the blob
  });

  return {
    accountSasTokenUrl,
    storageAccountName: blobClient.accountName,
  };
};
```
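To sanity-check what a SAS URL like this encodes, you can inspect its query parameters. The following sketch (an illustration, using the standard Azure Storage SAS parameter names `sp`, `spr`, and `se`) verifies that a URL is read-only, HTTPS-only, and not yet expired:

```typescript
// Inspect a SAS URL's query parameters. Parameter names follow the standard
// Azure Storage SAS conventions: sp = permissions, spr = protocol, se = expiry.
function describeSasUrl(sasUrl: string, now: Date = new Date()) {
  const params = new URL(sasUrl).searchParams;
  const expiresOn = params.get("se");
  return {
    permissions: params.get("sp"),   // "r" means read-only
    protocol: params.get("spr"),     // "https" means HTTPS-only
    expired: !expiresOn || now >= new Date(expiresOn),
  };
}

// Example using the shape of a SAS URL like the one this function returns.
const sample =
  "https://account.blob.core.windows.net/jsmith/test-file.txt" +
  "?sv=2021-08-06&spr=https&st=2022-07-25T20%3A02%3A18Z&se=2022-07-25T21%3A02%3A18Z&sr=b&sp=r&sig=abc";

const info = describeSasUrl(sample, new Date("2022-07-25T20:30:00Z"));
console.log(info); // { permissions: 'r', protocol: 'https', expired: false }
```

The one-hour window set by `expiresOn` in the code above is why a returned URL stops working after sixty minutes: the `se` parameter is checked by the Storage service on every request.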
6. Connect Azure Function to Azure Storage
Open the `./upload/function.json` file and replace the contents with the following code:

```json
{
  "bindings": [
    {
      "authLevel": "function",
      "type": "httpTrigger",
      "direction": "in",
      "dataType": "binary",
      "name": "req",
      "methods": [
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "$return"
    },
    {
      "name": "storage",
      "type": "blob",
      "path": "{username}/{filename}",
      "direction": "out",
      "connection": "AzureWebJobsStorage"
    }
  ],
  "scriptFile": "../dist/upload/index.js"
}
```

The `dataType` property of `binary` on the HTTP trigger makes the incoming request body available as binary data. The `storage` object defines the out binding that writes the uploaded file to Blob Storage: its `path` is built from the `username` and `filename` query string parameters, and the connection string for the Storage resource is defined in the `connection` property with the `AzureWebJobsStorage` value.

Open the `./local.settings.json` file and replace the `AzureWebJobsStorage` property's value with `UseDevelopmentStorage=true` to ensure that when you develop locally, the function uses the local Azurite storage emulator:

```json
{
  "IsEncrypted": false,
  "Values": {
    "Environment": "Development",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "node"
  }
}
```
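The `UseDevelopmentStorage=true` shorthand resolves to Azurite's well-known development account. If you ever need the explicit form, for a tool that doesn't understand the shorthand, it's equivalent to a connection string like the following, using Azurite's published default account name and key:

```
DefaultEndpointsProtocol=http;AccountName=devstoreaccount1;AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;
```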
7. Run the local function
In the integrated terminal window for commands (not the terminal window running Azurite), start the function:
```bash
npm start
```

Wait until you see the URL for the function. This indicates your function started correctly.

```console
upload: [POST] http://localhost:7071/api/upload
```

Create a new file in the root of the project named `test-file.txt` and copy in the text:

```text
https://azure.microsoft.com/en-us/overview/what-is-azure/

The Azure cloud platform is more than 200 products and cloud services designed to help you bring new solutions to life—to solve today’s challenges and create the future. Build, run, and manage applications across multiple clouds, on-premises, and at the edge, with the tools and frameworks of your choice.
```

In Visual Studio Code, open a new bash terminal at the root of the project to use the function API to upload `test-file.txt`. Copy the bash script into the terminal and execute it.

```bash
#!/bin/bash

curl -X POST \
  -F 'filename=@test-file.txt' \
  -H 'Content-Type: text/plain' \
  'http://localhost:7071/api/upload?filename=test-file.txt&username=jsmith' --verbose
```

Check the response for a status code of 200:

```console
Note: Unnecessary use of -X or --request, POST is already inferred.
*   Trying 127.0.0.1:7071...
* Connected to localhost (127.0.0.1) port 7071 (#0)
> POST /api/upload?filename=test-file.txt&username=jsmith HTTP/1.1
> Host: localhost:7071
> User-Agent: curl/7.74.0
> Accept: */*
> Content-Length: 564
> Content-Type: text/plain; boundary=------------------------a7f29ae099b687a4
>
* We are completely uploaded and fine
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Content-Type: text/plain; charset=utf-8
< Date: Mon, 25 Jul 2022 20:02:18 GMT
< Server: Kestrel
< Transfer-Encoding: chunked
<
{
  "json": "{\"fileName\":\"test-file.txt\",\"storageAccountName\":\"my-devstoreaccount1\",\"containerName\":\"jsmith\",\"url\":\"http://127.0.0.1:10000/my-devstoreaccount1/jsmith/test-file.txt?sv=2021-08-06&spr=https&st=2022-07-25T20%3A02%3A18Z&se=2022-07-25T21%3A02%3A18Z&sr=b&sp=r&sig=QuydFobkmae2q%2BSt3fQQGkZW9Rt1GZfh2ooKezuVlOM%3D\"}"
}
* Connection #0 to host localhost left intact
```

In the response JSON, the `url` property is the SAS token URL for the file. It can be used to read the file.
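Any HTTP client can read the file back with that URL, since the SAS token carries the authorization. As a minimal sketch (assuming Node 18+, where `fetch` is a global), a client could read the uploaded text like this:

```typescript
// Sketch: read the uploaded file back using the SAS URL from the
// response's `url` property. The token in the URL authorizes the read,
// so no account key or credential is needed by the client.
async function readBlobViaSasUrl(sasUrl: string): Promise<string> {
  const response = await fetch(sasUrl);
  if (!response.ok) {
    throw new Error(`Blob read failed with status ${response.status}`);
  }
  return response.text();
}
```

A client would call `readBlobViaSasUrl(result.url)` with the `url` value from the upload response; after the token's one-hour expiry, the same call fails with a 403.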
In Visual Studio Code, in the file explorer, expand the azureStorage/__blobstorage__ folder and view the contents of the file.
Locally, you've called the function and uploaded the file to the storage emulator successfully.
8. Create Function App resource
In Visual Studio Code, select the Azure explorer, then right-click on Function App, then select Create Function App in Azure (Advanced).
Alternately, you can create a Function App by opening the Command Palette (F1), entering `Azure Functions:`, and running the Azure Functions: Create Function App in Azure (Advanced) command.

Use the following table to complete the prompts to create a new Azure Function resource.

| Prompt | Value | Notes |
|--|--|--|
| Select Function App in Azure | Create new Function App in Azure (Advanced) | Create a cloud-based resource for your function. |
| Enter a globally unique name for the new Function App | `blob-storage-upload-function-app-jsmith`, replacing `jsmith` with your own name if you prefer | The name becomes part of the API's URL, which is invoked with an HTTP request. Valid characters for a function app name are `a-z`, `0-9`, and `-`. |
| Select a runtime stack | Select a Node.js stack with the LTS descriptor. | LTS means long-term support. |
| Select an OS. | Windows | Windows is selected specifically for the stream logs integration in Visual Studio Code. Linux log streaming is available from the Azure portal. |
| Select a resource group for new resources. | `blob-storage-upload-function-group` | Select the resource group you created. |
| Select a location for new resources. | Select the recommended region. | |
| Select a hosting plan. | Consumption | |
| Select a storage account. | + Create new storage account | |
| Enter the name of the new storage account. | `blobstoragefunction` | |
| Select an Application Insights resource for your app. | + Create new Application Insights resource. | |
| Enter an Application Insights resource for your app. | `blob-storage-upload-function-app-insights` | |

The Visual Studio Code Azure: Activity log shows progress.
9. Deploy the Function App
In Visual Studio Code, select the Azure explorer, then right-click on your new app in the Function App resource area, then select Deploy to Function App.
Select the notification link to see the output of the deployment.
Although deployment is complete, your Function App isn't configured to use Azure Blob Storage yet.
10. Create an Azure Storage Resource
In Visual Studio Code, select the Azure explorer, then right-click on your subscription under Storage to select Create Storage Account (Advanced).
Use the following table to finish creating the Azure Storage resource:
| Prompt | Value | Notes |
|--|--|--|
| Enter a globally unique name for the new Storage resource | `blobstoragefunction` | The name must be 3 to 24 characters long, lowercase letters and numbers only. |
| Select a resource group for new resources. | `blob-storage-upload-function-group` | Select the resource group you created. |
| Would you like to enable static website hosting? | No | |
| Select a location for new resources. | Select one of the recommended locations close to you. | |
11. Set Storage connection string in Function app setting
In Visual Studio Code, select the Azure explorer, then right-click on your new storage resource, and select Copy Connection String.
Still in the Azure explorer, expand your Azure Function app, then expand the Application Settings node and right-click AzureWebJobsStorage to select Edit Setting.
Paste in the Azure Storage connection string and press enter to complete the change.
When moving to production, this connection string setting, and the environment variable that reads it in the source code, should be replaced with DefaultAzureCredential in order to use credential-less authentication.
12. Use cloud-based function
Once deployment is complete and the AzureWebJobsStorage app setting has been updated, test your Azure Function.
In Visual Studio Code, create a bash file named `upload-azure.sh` and copy the following code into the file.

```bash
#!/bin/bash

FUNCTION_URL="https://YOUR-RESOURCE-NAME.azurewebsites.net/api/upload?code=YOUR-FUNCTION-KEY"
echo "${FUNCTION_URL}"

curl -X POST \
  -F "filename=@test-file.txt" \
  -H "Content-Type: text/plain" \
  "$FUNCTION_URL&filename=test-file.txt&username=jsmith" --verbose
```

In Visual Studio Code, select the Azure explorer, then expand the node for your Function App, then expand Functions. Right-click the function name, `upload`, and select Copy Function Url.

In the `upload-azure.sh` bash file, paste your function URL value into `FUNCTION_URL`.

Execute the bash script in the terminal from the project's root folder:

```bash
bash upload-azure.sh
```

Check the response for a status code of 200:

```console
$ curl -X POST -F 'filename=@test-file.txt' 'https://blob-storage-upload-function-app-jsmith.azurewebsites.net/api/upload?code=123456&filename=test-file.txt&username=jsmith&code=abc' --verbose
Note: Unnecessary use of -X or --request, POST is already inferred.
*   Trying 20.49.104.23:443...
* Connected to blob-storage-upload-function-app-jsmith.azurewebsites.net (20.49.104.23) port 443 (#0)
* ALPN, offering h2
* ALPN, offering http/1.1
* successfully set certificate verify locations:
*  CAfile: /etc/ssl/certs/ca-certificates.crt
*  CApath: /etc/ssl/certs
* TLSv1.3 (OUT), TLS handshake, Client hello (1):
* TLSv1.3 (IN), TLS handshake, Server hello (2):
* TLSv1.2 (IN), TLS handshake, Certificate (11):
* TLSv1.2 (IN), TLS handshake, Server key exchange (12):
* TLSv1.2 (IN), TLS handshake, Server finished (14):
* TLSv1.2 (OUT), TLS handshake, Client key exchange (16):
* TLSv1.2 (OUT), TLS change cipher, Change cipher spec (1):
* TLSv1.2 (OUT), TLS handshake, Finished (20):
* TLSv1.2 (IN), TLS handshake, Finished (20):
* SSL connection using TLSv1.2 / ECDHE-RSA-AES256-GCM-SHA384
* ALPN, server accepted to use http/1.1
* Server certificate:
*  subject: C=US; ST=WA; L=Redmond; O=Microsoft Corporation; CN=*.azurewebsites.net
*  start date: Mar 14 18:39:55 2022 GMT
*  expire date: Mar  9 18:39:55 2023 GMT
*  subjectAltName: host "blob-storage-upload-function-app-jsmith.azurewebsites.net" matched cert's "*.azurewebsites.net"
*  issuer: C=US; O=Microsoft Corporation; CN=Microsoft Azure TLS Issuing CA 01
*  SSL certificate verify ok.
> POST /api/upload?filename=test-file.txt&username=jsmith&code=123456 HTTP/1.1
> Host: blob-storage-upload-function-app-jsmith.azurewebsites.net
> User-Agent: curl/7.74.0
> Accept: */*
> Content-Length: 564
> Content-Type: text/plain; boundary=------------------------7bef55872c98cf16
>
* We are completely uploaded and fine
* Mark bundle as not supporting multiuse
< HTTP/1.1 200 OK
< Content-Type: text/plain; charset=utf-8
< Date: Mon, 25 Jul 2022 20:32:59 GMT
< Server: Kestrel
< Transfer-Encoding: chunked
<
{
  "json": "{\"fileName\":\"test-file.txt\",\"storageAccountName\":\"blob-storage-upload-function-app-jsmith\",\"containerName\":\"jsmith\",\"url\":\"https://blob-storage-upload-function-app-jsmith.blob.core.windows.net/jsmith/test-file.txt?sv=2021-08-06&spr=https&st=2022-07-25T20%3A32%3A58Z&se=2022-07-25T21%3A32%3A58Z&sr=b&sp=r&sig=SuA5pXH8K9bBdtQg9Jk5MuzGKXk995JE2JG3MKEHMvI%3D\"}"
}
* Connection #0 to host blob-storage-upload-function-app-jsmith.azurewebsites.net left intact
```

In the response JSON, the `url` property is the SAS token URL for the file. It can be used to read the file.
In Visual Studio Code, open the Azure explorer, expand your Storage resource, then under the containers node, find the container whose name matches the username value in the query string.
13. Query your Azure Function logs
In Visual Studio Code, select the Azure explorer, then under Functions, right-click on your function app, then select Open in Portal.
This opens the Azure portal to your Azure Function.
Select Application Insights from the Settings, then select View Application Insights data.
This link takes you to your separate metrics resource created for you when you created your Azure Function with Visual Studio Code.
Select Logs in the Monitoring section. If a Queries pop-up window appears, select the X in the top-right corner of the pop-up to close it.
In the New Query 1 pane, on the Tables tab, double-click the traces table.
This enters the Kusto query `traces` into the query window.

Edit the query to search for the custom logs:

```kusto
traces
| where message startswith "***"
```

Select Run.
If the query doesn't return any results, it may be because there's a delay of a few minutes between the HTTP request to the Azure Function and the log availability in Kusto. Wait a few minutes and run the query again.
14. Clean up Azure resources
In Visual Studio Code, in the Azure explorer, use the Group by feature to switch the Resources view to Group by Resource Group.
Find your resource group name, such as `blob-storage-upload-function-group`, in the list.

Right-click the resource group name and select Delete Resource Group.
Troubleshooting
- SPLIT: If you try to use this sample and run into an error regarding `split` from the `parse-multipart` library, verify that you're sending the `filename` property in your multipart form data and that you're sending the `content-type` header to the function.
- Debug in Dev Container: If you run this app in its dev container in Visual Studio Code, make sure the Azure Functions extension is installed and enabled in the dev container. You may have to rebuild the container.
Next steps
- Azure Functions
- Azure Storage
- Learn how to write code with the Azure Blob Storage SDK
- Credential-less code
- Manage Azure resources with SDKs