Question

Khadam-5168 asked:

Azure Data Factory (Copy Data activity): filter rows before ingesting into the data warehouse

Hi all,


I'm using a Copy Data activity in Data Factory to connect to Azure Data Lake and load a set of CSV files into Azure SQL Data Warehouse.
My question is: what is the best way to filter a CSV file before inserting it into the database?

What are the drawbacks of inserting all of a file's rows and then using a stored procedure to do the filtering?

PS: I'd prefer not to use Data Flow.


Thanks for your help

Tags: azure-data-factory, azure-synapse-analytics, azure-data-lake-storage


1 Answer

VaibhavChaudhari answered:

I think the approach of dumping the CSV data into a temp (staging) table and then using a stored procedure to insert only the required records into the final table is a good one. I don't see any drawback here.
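The staging pattern described above could be sketched like this in T-SQL. All table, column, and procedure names here are illustrative assumptions, not part of the original thread; adapt them to your own schema:

```sql
-- Hypothetical staging pattern (table/column names are illustrative).
-- Step 1: the Copy Data activity lands raw CSV rows into a staging table.
CREATE TABLE stg.Sales
(
    SaleId  INT,
    Region  NVARCHAR(50),
    Amount  DECIMAL(18, 2)
);
GO

-- Step 2: a stored procedure moves only the rows that pass the filter
-- into the final table, then empties the staging table.
CREATE PROCEDURE dbo.LoadFilteredSales
AS
BEGIN
    INSERT INTO dbo.Sales (SaleId, Region, Amount)
    SELECT SaleId, Region, Amount
    FROM stg.Sales
    WHERE Amount > 0;          -- example filter condition

    TRUNCATE TABLE stg.Sales;  -- staging data is transient
END;
GO
```

In the pipeline, the stored procedure would typically run in a Stored Procedure activity chained after the Copy Data activity, so the filter is applied as soon as each load of staging completes.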


If the response helped, do "Accept Answer" and upvote it - Vaibhav


Hi,

thank you for your reply.

I used the same logic you suggested, but without a temp table: I loaded all the data into the final table and used a stored procedure to delete the rows I don't need.
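That delete-after-load variant might look like the following sketch (again, the table, column, and procedure names are hypothetical). One trade-off worth noting: with this approach the unwanted rows are briefly visible in the final table and the deletes are logged, which the staging-table pattern avoids:

```sql
-- Hypothetical delete-after-load variant (names illustrative).
-- The Copy Data activity loads everything into dbo.Sales first;
-- this procedure then removes the rows that fail the filter.
CREATE PROCEDURE dbo.CleanSales
AS
BEGIN
    DELETE FROM dbo.Sales
    WHERE Amount <= 0;  -- example: drop rows that fail the filter
END;
GO
```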
