WynnBoss-7501 asked · SaurabhSharma-msft commented

Data Factory: Copying Large SQL Database to REST API

Hello Microsoft,

I am currently attempting to move data from a large SQL database into a REST API using Azure Data Factory. The table has 100,000+ rows. My initial attempt was to use a Lookup activity to get the data into JSON format, and then use a ForEach activity to load the data into the REST API.

The API can only accept one record at a time, so parallelization is key.

However, this implementation has a limitation: the Lookup activity can only return around 5,000 records at a time, so I would need to nest that flow within another ForEach activity to iterate through the entire table. Is this optimal?

I noticed that the Copy activity recently added support for copying to a REST API, but I am running into issues pushing records to the API with this method. I am currently getting this error:
125793-image.png



Using a similar connection method works fine in a Web activity, so I am left to assume that the payload from the Copy activity is being formatted in an unexpected way. The body for each record needs to be in this format: {Column1 : Value, Column2 : Value,.......}
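Outside ADF, the expected request shape can be sketched like this (the column names and values are hypothetical). One plausible cause of the mismatch: a batch-style sink emits a JSON array of records rather than a single flat object, which an API that accepts exactly one record per request may reject:

```python
import json

# Hypothetical single record, shaped as the API expects:
# {"Column1": value, "Column2": value, ...}
row = {"Column1": "abc", "Column2": 42}

# Body for one POST: a single flat JSON object.
single_body = json.dumps(row)

# A sink that batches records typically emits a JSON array instead,
# which a one-record-per-request API could reject as a bad request.
array_body = json.dumps([row])

print(single_body)  # {"Column1": "abc", "Column2": 42}
print(array_body)   # [{"Column1": "abc", "Column2": 42}]
```

Comparing these two shapes against whatever ADF actually emits (see the suggestion below about sinking to a JSON file) is a quick way to confirm or rule out this theory.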

Can I get some assistance debugging this?

azure-data-factory

Hi @wynnboss-7501,

Thanks for using Microsoft Q&A!
I suspect you are passing the entire source result set to your REST API; however, you mentioned that your API handles only one record at a time. Can you please test by passing only a single row to the Copy activity, using a query instead of a table in your source, and check whether that works?
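For example, a single-row probe query for the Copy activity source might look like the following (SQL Server dialect; the table name is a placeholder):

```python
# dbo.SourceTable is a placeholder for the real source table.
probe_query = "SELECT TOP (1) * FROM dbo.SourceTable"
print(probe_query)
```

If this one-row copy still fails with the same error, the problem is in the payload shape rather than the volume of data.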
126085-image.png

Also, you can compare your debug runs of Lookup vs Copy activity to see what exactly is getting passed to the REST API.

Thanks
Saurabh


I am currently only passing one record by setting the write batch size parameter to 1, but even altering the query as you mentioned resulted in the same error. In the Lookup activity I am able to pass the body as { column1: value, column2 : value, column3: value}.

Is there a way to see what the POST request looks like coming out of ADF?


@wynnboss-7501 Sorry, I was not clear. You can try sending the data to a storage location as the sink (as a JSON file) and check whether the JSON is correct.
126759-image.png
126874-image.png
126760-image.png

Also, do you have any authentication set up at the REST API endpoint? What authentication type have you chosen in the linked service?
126796-image.png


Thanks
Saurabh



Hello,

Why not just use a Copy activity with your database table as the source and a REST dataset as the sink? On the sink side you can set the write batch size parameter to 1 to send just one record per request, and you can set a custom mapping to adapt the payload to your backend contract. You can also set parallelism with the "Degree of copy parallelism" option.
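As a sketch, the settings this suggests might look like the following fragments, written here as Python dicts mirroring the Copy activity pipeline JSON (property names follow the Copy activity schema; the values are assumptions to adjust, not a tested configuration):

```python
# Illustrative REST sink settings for a Copy activity:
# one record per HTTP request, posted to the REST dataset's endpoint.
rest_sink = {
    "type": "RestSink",
    "requestMethod": "POST",
    "writeBatchSize": 1,             # send one record per request
    "httpRequestTimeout": "00:01:40",
}

# Copy activity tuning: "Degree of copy parallelism" in the UI.
copy_activity_tuning = {
    "parallelCopies": 8,             # assumed value; scale up after a 1-record test succeeds
}

print(rest_sink["writeBatchSize"], copy_activity_tuning["parallelCopies"])
```

A custom mapping on the Copy activity would then rename source columns to match the {Column1 : Value, ...} contract the API expects.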

WynnBoss-7501 replied to CarlosMarioVilladiegoArroyo-3680:

This is my current implementation. However, I am blocked because the only error response I receive back from the Copy activity is a bad request. My plan was to move one record across with the write batch size set to 1, and then scale up the degree of parallelism parameter.


0 Answers