I recently followed the tutorial linked below, and it worked perfectly with the provided code. (link: https://docs.microsoft.com/en-us/azure/batch/tutorial-run-python-batch-azure-data-factory)
I realised that pandas writes a local copy of the output file on the VMs used by the Batch service, so I thought it might be good practice to delete that file after uploading it, and tried the following piece of code:
```python
container_client = blob_service_client.get_container_client(containerName)

with open(output_file_name, "rb") as data:
    blob_client = container_client.upload_blob(name=output_file_name, data=data)

if os.path.exists(output_file_name):
    os.remove(output_file_name)
```
However, my batch output error file (stderr.txt) returned the following error:
```
Traceback (most recent call last):
  File "main.py", line 44, in <module>
PermissionError: [WinError 32] The process cannot access the file because it is being used by another process: 'iris_setosa.csv'
```
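Since the traceback points at the delete, I wondered whether keeping the file handle open during the upload could be the culprit, and considered reading the whole file into memory first so the handle is closed before `upload_blob` runs. A rough sketch of what I mean (the helper name `load_and_delete` is mine, and the actual upload call is commented out because it needs a live `ContainerClient`):

```python
import os


def load_and_delete(path):
    """Read the file fully so the handle is closed before upload, then delete it."""
    with open(path, "rb") as f:
        payload = f.read()  # the handle is released when the with-block exits
    # In the real script the bytes would be uploaded here, e.g.:
    # container_client.upload_blob(name=os.path.basename(path), data=payload)
    os.remove(path)  # the file should no longer be locked by this process
    return payload
```

Would something like this avoid the lock, or does Batch itself hold the file open?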
What should I do? Is there a way to delete local files from the node? If not, what is the best alternative? Or am I simply misunderstanding how Azure Batch works, and the file gets deleted automatically?
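One fallback I was considering, in case the handle is only released after a short delay, is retrying the delete a few times. A rough sketch (the helper name and the retry parameters are my own, not from the tutorial):

```python
import os
import time


def remove_with_retry(path, attempts=5, delay_seconds=1.0):
    """Try to delete a local file, retrying if another process still holds it.

    Returns True once the file is gone (deleted, or it never existed).
    """
    for _ in range(attempts):
        try:
            if os.path.exists(path):
                os.remove(path)
            return True
        except PermissionError:
            # WinError 32: the file is still open elsewhere; wait and retry.
            time.sleep(delay_seconds)
    return not os.path.exists(path)
```

Is a retry loop like this a reasonable workaround, or is it just papering over the real problem?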
If I have misunderstood something, I apologise in advance; it's my first time working with the Batch service.
Thank you in advance!