Hello community,
I want to transfer data from an Azure SQL database to Dataverse via Azure Data Factory. I have two questions about this:
Which connector is the most recent, and which one should be used for the data transfer?
a) Dynamics 365 b) Dynamics CRM c) Dataverse
My first attempts were made with the Dataverse connector, and I noticed the following:
First I tried a Data Flow with no transformations at all, only a source and a sink. With this setup I could process about 20 rows per second, or roughly 80k rows per hour. After that I ran into a connection reset error.
After this, I tried a Copy activity using the same Linked Service. With it I was able to transfer 100k rows in under 9 minutes.
In my case I need a few data transformations, and also conditions, before the data is transferred to Dataverse. I would therefore prefer the Data Flow approach.
How does this performance difference come about if I am using the same connector? Is there anything I can do to improve the performance in the Data Flow? In the Copy activity I had settings such as the batch size; in the Data Flow I do not have this option.
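For context, this is where I set the batch size in the Copy activity. A minimal sketch of the sink section of the pipeline JSON, assuming the Dataverse (Common Data Service for Apps) sink type; the values shown are placeholders, not my actual settings:

```json
{
  "sink": {
    "type": "CommonDataServiceForAppsSink",
    "writeBehavior": "upsert",
    "writeBatchSize": 10000
  }
}
```

As far as I can tell, no equivalent of `writeBatchSize` is exposed on the Data Flow sink settings.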
Many thanks in advance!
Christopher
