Hello Microsoft,
I am currently attempting to move data from a large SQL table into a REST API using Azure Data Factory. The table has 100,000+ rows. My initial attempt was to use a Lookup activity to get the data into JSON format, and then a ForEach activity to load each record into the REST API.
The API can only accept one record at a time, so parallelization is key.
However, this implementation has a limitation: the Lookup activity can only return around 5,000 records at a time, so I would need to nest that flow inside another ForEach activity to iterate through the entire table. Is this optimal?
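For clarity, this is roughly the logic I am trying to express with the nested loops (a Python sketch only, not ADF code; the batch size, table size, and the fetch/post functions are placeholders):

```python
import json

BATCH_SIZE = 5000  # approximate Lookup activity row limit


def fetch_batch(offset, limit):
    """Stand-in for the Lookup step: in the pipeline this would be a
    SQL query with OFFSET/FETCH. Fake rows are generated here purely
    for illustration."""
    total_rows = 12000  # pretend table size
    end = min(offset + limit, total_rows)
    return [{"Id": i, "Name": f"row-{i}"} for i in range(offset, end)]


def post_record(record):
    """Stand-in for the per-record REST call made by the inner ForEach;
    a real pipeline would POST the serialized record to the API."""
    return json.dumps(record)


def run_pipeline():
    posted = 0
    offset = 0
    while True:
        # outer loop: page through the table in ~5k-row chunks
        batch = fetch_batch(offset, BATCH_SIZE)
        if not batch:
            break
        # inner ForEach: one API request per record
        for record in batch:
            post_record(record)
            posted += 1
        offset += BATCH_SIZE
    return posted


print(run_pipeline())  # 12000
```

In ADF terms, the outer loop would page via a parameterized query and the inner ForEach would run with parallelism enabled.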
I noticed that the Copy activity recently added support for a REST sink, but I am running into issues pushing records to the API with this method. I am currently getting this error:
Using a similar connection method works fine in a Web activity, so I am left to assume that the Copy activity is formatting the payload in an unexpected way. The body for each record needs to be in this format: {Column1: Value, Column2: Value, ...}
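To illustrate the per-record body the API expects (a minimal Python sketch; the column names and values are made up):

```python
import json


def build_body(row):
    """Serialize one source row into the flat JSON object the API
    expects, i.e. {"Column1": value, "Column2": value, ...}."""
    return json.dumps(row)


row = {"Column1": "A", "Column2": 42}  # hypothetical example record
print(build_body(row))  # {"Column1": "A", "Column2": 42}
```

This is the shape I send successfully from the Web activity; the Copy activity's REST sink appears to wrap or batch records differently.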
Can I get some assistance debugging this?