I don't see anything wrong with the snippet you already have. If you want more examples, you can check out https://learn.microsoft.com/en-us/samples/azure/azure-sdk-for-python/storage-blob-samples/. To learn more about Azure Functions Python development, you can refer to the Python developer reference for Azure Functions on Microsoft Learn.
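If you want to enumerate files in a container yourself rather than rely on a binding, the storage-blob samples linked above show the pattern. A rough sketch under that assumption (the helper names and the connection-string parameter are illustrative, and it requires the azure-storage-blob and pandas packages at run time):

```python
import io


def is_csv(blob_name: str) -> bool:
    # Only pick up CSV files from the container listing.
    return blob_name.lower().endswith(".csv")


def process_container(conn_str: str) -> None:
    # Imports kept inside the function so the module loads even
    # where the SDK is not installed; conn_str is your storage
    # account connection string.
    import pandas as pd
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(conn_str)
    source = service.get_container_client("input-container")
    target = service.get_container_client("output-container")

    for blob in source.list_blobs():
        if not is_csv(blob.name):
            continue
        data = source.download_blob(blob.name).readall()
        df = pd.read_csv(io.BytesIO(data))
        # ... process df here ...
        # Write back under the same blob name.
        target.upload_blob(blob.name, df.to_csv(index=False), overwrite=True)
```

Note this copies every blob on each run; for processing only new files, a blob trigger is the better fit.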
How to read new CSV files from input-container, process them one by one, and write them into output-container with an Azure Function (Python Model V2)
Maryna Paluyanava · 191 Reputation points
Hello,
I am trying to read new CSV files from the Blob storage input-container one by one, process them, and write them into the Blob storage output-container using an Azure Function with Python Model V2.
I can copy one particular file, input_test.txt, from input-container into output-container as output_test.txt.
But I do not know how to iterate through all the files, read each one as a pandas DataFrame, process it, and write it into output-container under the same name. I implemented something similar with Python V1, but I am totally confused with V2.
Could you please provide a simple example of how to do that?
Many thanks
import logging

import azure.functions as func

app = func.FunctionApp()

@app.function_name(name="ProcessBlob")
@app.route(route="file")
@app.blob_input(arg_name="inputblob",
                path="input-container/input_test.txt",
                connection="AzureWebJobsStorage")
@app.blob_output(arg_name="outputblob",
                 path="output-container/output_test.txt",
                 connection="AzureWebJobsStorage")
def main(req: func.HttpRequest, inputblob: str, outputblob: func.Out[str]) -> func.HttpResponse:
    logging.info(f'Python HTTP trigger function processed {len(inputblob)} bytes')
    outputblob.set(inputblob)
    return func.HttpResponse("ok")
Accepted answer
Ryan Hill · 25,821 Reputation points · Microsoft Employee
2024-03-18T21:08:39.7633333+00:00