Hi all,
I'm using a Copy Data activity in Azure Data Factory to connect to Azure Data Lake and load a set of CSV files into Azure SQL Data Warehouse.
My question is: what is the best way to filter a CSV file before inserting it into the database?
What are the drawbacks of inserting all of the file's rows first and then using a stored procedure to apply the filtering?
PS: I'd prefer not to use Data Flows.
Thanks for your help
I think the approach of dumping the CSV data into a temp (staging) table and then using a stored procedure to insert only the required records into the final table is a good one. I don't see any drawbacks here.
If the response helped, do "Accept Answer" and upvote it - Vaibhav
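As a minimal sketch of that pattern, the Copy activity can land the raw CSV rows in a staging table, and a stored procedure (invoked afterwards, e.g. via a Stored Procedure activity in the pipeline) moves only the filtered rows into the final table. All table, column, and procedure names below are hypothetical placeholders; the filter condition is just an example:

```sql
-- Hypothetical staging table the Copy Data activity writes the raw CSV rows into
CREATE TABLE dbo.StagingSales (
    SaleId INT,
    Region NVARCHAR(50),
    Amount DECIMAL(18, 2)
);
GO

-- Stored procedure that filters the staged rows into the (pre-existing) final table
CREATE PROCEDURE dbo.usp_LoadFilteredSales
AS
BEGIN
    -- Keep only the rows that pass the filter
    -- (example condition: non-null amounts for a single region)
    INSERT INTO dbo.FinalSales (SaleId, Region, Amount)
    SELECT SaleId, Region, Amount
    FROM dbo.StagingSales
    WHERE Amount IS NOT NULL
      AND Region = N'EMEA';

    -- Clear the staging table so the next pipeline run starts empty
    TRUNCATE TABLE dbo.StagingSales;
END;
```

The stored procedure runs set-based, so the filter is applied to all staged rows in one statement rather than row by row.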
@Khadam @Vaibhav Chaudhari
Did you insert all records in one go, or one by one? I'm looking for logic to validate all the records before inserting them into multiple tables in an on-premises SQL database.