question

LittyKuriakose-7666 asked · GeethaThatipatri-MSFT edited

Deleting items in multiple partitions

I am loading data from CSV files (file size ~40 MB) into Azure Cosmos DB using Azure Data Factory. If a data mismatch happens during the load, I want to delete the data that was inserted from that file and reload it. The data is spread across multiple partitions. I tried deleting the data with Azure Functions, but the function times out after 230 seconds.

Please help me with the preferred way to delete large amounts of data from multiple partitions. Kindly share any links that explain the method.

azure-webapps · azure-functions

Does your Azure Function call a Cosmos DB stored procedure (sproc) to delete the data?


No. I am using the container.delete_item method to delete each item. If the CSV has 30,000 records we need to make 30,000 calls, as the partition key differs between the items in the file.
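For reference, a minimal sketch of that per-item approach with the azure-cosmos Python SDK (the CSV column names, file name, and connection values are placeholders, not taken from the thread):

```python
# Minimal sketch of the per-item delete approach described above.
# Assumption: the CSV has an "id" column and a "pk" column matching the
# container's partition key path; endpoint/key/names are placeholders.
import csv
from azure.cosmos import CosmosClient

client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
container = client.get_database_client("<database>").get_container_client("<container>")

with open("load_file.csv", newline="") as f:
    for row in csv.DictReader(f):
        # One round trip per record; with ~30,000 rows this is 30,000 calls,
        # which is why the function runs into the 230-second HTTP timeout.
        container.delete_item(item=row["id"], partition_key=row["pk"])
```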


Microsoft team/Experts,

Please share some insight into the optimal way to delete data that was loaded from 30-50 MB files and spans multiple partitions, as explained in the question. I am getting stuck here.






1 Answer

StephanvanRooij-6273 answered

It depends on how many partitions we are talking about.

You can create a stored procedure that takes a partition key, queries all items in that partition, and deletes them. That way you don't have to fetch all the items in the Azure Function first.

https://docs.microsoft.com/en-us/azure/cosmos-db/sql/how-to-write-stored-procedures-triggers-udfs?tabs=javascript
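A minimal sketch of how that could look with the azure-cosmos Python SDK: a simplified JavaScript stored procedure is registered from Python and then executed once per distinct partition key that the file touched. The sproc body, its name, and all connection values are illustrative placeholders, not code from the linked docs, and continuation/retry handling is trimmed for brevity.

```python
from azure.cosmos import CosmosClient

client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
container = client.get_database_client("<database>").get_container_client("<container>")

# Cosmos DB stored procedures are written in JavaScript and run scoped to a
# single partition key, so the query below only sees that partition's items.
SPROC_BODY = """
function deletePartition() {
    var coll = getContext().getCollection();
    var deleted = 0;
    queryAndDelete();
    function queryAndDelete() {
        var accepted = coll.queryDocuments(coll.getSelfLink(), 'SELECT * FROM c', {},
            function (err, docs) {
                if (err) throw err;
                if (!docs.length) { getContext().getResponse().setBody(deleted); return; }
                deleteDocs(docs, 0);
            });
        if (!accepted) getContext().getResponse().setBody(deleted);
    }
    function deleteDocs(docs, i) {
        if (i >= docs.length) { queryAndDelete(); return; }
        var ok = coll.deleteDocument(docs[i]._self, {}, function (err) {
            if (err) throw err;
            deleted++;
            deleteDocs(docs, i + 1);
        });
        if (!ok) getContext().getResponse().setBody(deleted);
    }
}
"""

# Register the sproc once (error handling / idempotence omitted).
container.scripts.create_stored_procedure(
    body={"id": "deletePartition", "body": SPROC_BODY})

# Run it once per distinct partition key value that the CSV load touched.
for pk in {"pk-value-1", "pk-value-2"}:  # placeholder partition key values
    deleted = container.scripts.execute_stored_procedure(
        sproc="deletePartition", partition_key=pk)
    print(f"Deleted {deleted} items from partition {pk}")
```

Note that a stored procedure is itself time-bounded on the server, so for a very large partition it may stop early; in that case re-invoke it for the same partition key until it reports zero deletions.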
