I am loading data from CSV files (~40 MB each) into Azure Cosmos DB using Azure Data Factory. If a data mismatch occurs during a load, I want to delete the data that was inserted from that file and reload it. The data is spread across multiple partitions. I tried deleting the data with an Azure Function, but the request times out after 230 seconds.
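For context, the delete logic is roughly like the sketch below (a minimal illustration using the azure-cosmos Python SDK; the `sourceFile` property, the partition key name, and the connection details are placeholders, not my actual schema):

```python
from azure.cosmos import CosmosClient

# Placeholder connection details and names
client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
container = client.get_database_client("<database>").get_container_client("<container>")

# Find every item that was loaded from a given CSV file (assumes each item
# carries a 'sourceFile' tag and a 'partitionKey' property).
items = container.query_items(
    query="SELECT c.id, c.partitionKey FROM c WHERE c.sourceFile = @file",
    parameters=[{"name": "@file", "value": "load_2024_01.csv"}],
    enable_cross_partition_query=True,
)

# Delete items one by one; with a large file this loop runs longer than the
# 230-second HTTP response limit of the Azure Function.
for item in items:
    container.delete_item(item=item["id"], partition_key=item["partitionKey"])
```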
What is the preferred way to delete a large amount of data spread across multiple partitions? Please share any links that explain the method.