question

VigneshwarSB-6839 asked VigneshwarSB-6839 commented

Azure Data Factory - Copy Activity (copy Azure Function response to CSV)

Hi,

Use case: Read data from SQL Server, transform each row by passing it to an Azure Function, and then load it into Cosmos DB.

  1. SQL Server with an integration runtime is set up.

  2. Azure Function to transform each row is complete.

  3. Cosmos DB sink is also ready.

EDIT (added points):

  4. Transformed data should be upserted back into SQL Server against each row that was read in the first step.

  5. Transformed data should also be saved to a CSV file in Blob Storage.

I'm unable to figure out how to load the response (a JSON object) from the Azure Function into the Cosmos DB sink.

Pipeline:

Read data from SQL using a Lookup activity ----> Use a ForEach to iterate through each row ----> Add an Azure Function activity inside the ForEach and pass each item() ----> The Azure Function responds with the transformed data as a JSON object ----> ???? (I'm stuck here.) How do I now load it into the Cosmos DB sink?
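For context, the per-row work the Azure Function does might look like the following minimal sketch. The row shape and field names here are hypothetical; the real function would read one row (the item() passed from the ForEach) out of the request body and return the transformed record as JSON.

```python
import json

def transform_row(row: dict) -> dict:
    """Hypothetical per-row transform. The real Azure Function would
    receive one row from the ForEach's item() in the request body and
    return the transformed record as JSON."""
    return {
        "id": str(row["Id"]),  # Cosmos DB expects a string "id" property
        "fullName": f"{row['FirstName']} {row['LastName']}".strip(),
        "amount": round(float(row["Amount"]), 2),
    }

# Simulate the ForEach: transform each looked-up row.
rows = [{"Id": 1, "FirstName": "Ada", "LastName": "Lovelace", "Amount": "10.50"}]
transformed = [transform_row(r) for r in rows]
print(json.dumps(transformed))
```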

Can someone guide me out here?




Tags: azure-data-factory, azure-functions, azure-cosmos-db

You might be overthinking it. A Copy activity should be able to do what you need, depending on how your Cosmos DB and data are set up.
See this article on using a Copy activity:
https://docs.microsoft.com/en-us/azure/data-factory/connector-azure-cosmos-db

This should help for more complex transformations:
https://docs.microsoft.com/en-us/azure/cosmos-db/migrate-relational-to-cosmos-db-sql-api


@vigneshwarsb-6839 Thanks for using Microsoft Q&A!

Any specific reason you want to use Azure Functions for the transformations? You could try using a data flow for the transformation task, if that helps to keep everything on the ADF side.
Another suggestion is to ingest the data into Azure Cosmos DB from the Azure Function itself, instead of returning the transformed row to Azure Data Factory. You can refer to this tutorial on ingesting data from Azure Functions into Azure Cosmos DB.
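The "ingest from the function itself" suggestion means the function upserts each transformed row into Cosmos DB directly. A minimal sketch of that pattern: with the azure-cosmos Python SDK the container object exposes upsert_item(); an in-memory stub stands in for it here so the shape is clear without a live account.

```python
class FakeContainer:
    """Stand-in for an azure.cosmos container client (illustration only).
    The real object comes from
    cosmos_client.get_database_client(...).get_container_client(...)."""
    def __init__(self):
        self.items = {}

    def upsert_item(self, body: dict) -> dict:
        self.items[body["id"]] = body  # Cosmos upserts by the "id" property
        return body

def ingest(container, transformed_row: dict) -> None:
    # Inside the Azure Function, called once per transformed row.
    container.upsert_item(body=transformed_row)

container = FakeContainer()
ingest(container, {"id": "1", "fullName": "Ada Lovelace", "amount": 12.3})
ingest(container, {"id": "1", "fullName": "Ada L.", "amount": 12.3})  # same id: update, not duplicate
print(len(container.items))
```

Because upsert is keyed on "id", re-running the pipeline does not create duplicate documents.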



@SaurabhSharma-msft @JohnAherne-9205 Thanks for your responses. My bad, I missed two other points in my pipeline process. Apologies for that. Please have another look at the question.

The transformed output that the Azure Function produces for each row read from SQL Server has to be loaded into three places:
1. Load the JSON object into Cosmos DB. (Used the Copy Data tool inside the ForEach to do this.)
2. Upsert into SQL Server against each source row. (No idea on this.)
3. Write to a CSV file for later use. (No idea on this.)
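For point 3, the usual ADF answer is a Copy activity with a DelimitedText (CSV) sink in Blob Storage. The row-to-CSV shape such a sink would produce can be sketched with the stdlib csv module; the column names here are hypothetical.

```python
import csv
import io

def rows_to_csv(rows):
    """Serialize transformed rows (list of dicts) to CSV text, roughly
    what a DelimitedText sink in Blob Storage would write.
    Field names are hypothetical."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "fullName", "amount"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

csv_text = rows_to_csv([{"id": "1", "fullName": "Ada Lovelace", "amount": 12.3}])
print(csv_text)
```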

JohnAherne-9205 answered VigneshwarSB-6839 commented

There are a few ways you can accomplish this.

1. Before the loop, copy the data to storage in CSV format.
2. Run the Lookup against the CSV files.
3. Inside the loop, execute the function to populate Cosmos DB.
4. After the loop, copy the CSV files to a staging table in SQL.
5. Run a stored procedure to merge the staged data into the final table.

One thing to note: the Lookup activity supports a maximum of 5,000 records (or 4 MB in size), so make sure you are not trying to select more than that.
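The "copy to staging table, then merge" step above would typically be a T-SQL MERGE inside the stored procedure. The same upsert pattern is sketched here with stdlib sqlite3 standing in for SQL Server (table and column names are hypothetical); sqlite's INSERT ... ON CONFLICT expresses the equivalent logic.

```python
import sqlite3

# Stand-in for "copy staged CSV rows into SQL, then merge into the final
# table". SQL Server would use a MERGE statement in a stored procedure;
# sqlite3's INSERT ... ON CONFLICT performs the same upsert.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE final (id INTEGER PRIMARY KEY, full_name TEXT, amount REAL)")
conn.execute("INSERT INTO final VALUES (1, 'old name', 0.0)")

staged = [(1, "Ada Lovelace", 12.3), (2, "Grace Hopper", 10.5)]  # hypothetical staged rows
conn.executemany(
    """INSERT INTO final (id, full_name, amount) VALUES (?, ?, ?)
       ON CONFLICT(id) DO UPDATE SET full_name = excluded.full_name,
                                     amount = excluded.amount""",
    staged,
)
# Row 1 is updated in place, row 2 is inserted.
print(conn.execute("SELECT COUNT(*) FROM final").fetchone()[0])
```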




@vigneshwarsb-6839 Do you still need any help with this issue?


Thanks for all your suggestions.


@JohnAherne-9205 As you suggested, I have implemented the transformation. Instead of an Azure Function, I made a web service call that returns the transformed data. Can you suggest a way to update the CSV file with the transformed data?


It would be of great help.

SaurabhSharma-msft answered SaurabhSharma-msft commented

@vigneshwarsb-6839 You cannot directly use the Azure Function activity output in the pipeline. As suggested earlier, you can insert into Cosmos DB from the function itself, or you can try the following: instead of the Azure Function activity, use a Copy Data activity with a REST source, with the Azure Function URL as the linked service. You need to pass the function URL, including the function key, as the Base URL (see screenshot below).

[screenshot: REST linked service configuration]

I have tested this and it gets the data from the function and pushes it to Cosmos DB.

[screenshots: tested pipeline run]

This covers the first part of your requirement.



@vigneshwarsb-6839 Please do not forget to "Accept the answer" wherever the information provided helps you, so that it can help others in the community.
