I'm trying to upload a file to Blob Storage directly from the URL https://datasets.imdbws.com/title.akas.tsv.gz, without downloading it to an intermediary storage location first.
Can I accomplish that with a Databricks notebook in Python?
I looked at this link: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-python?tabs=environment-variable-linux#upload-blobs-to-a-container
but it isn't clear how to deal with a source URL.
The older SDK had BlockBlobService (the predecessor of BlobServiceClient), which offered a solution as in the example below, but I couldn't find a similar method on BlobServiceClient:
from azure.storage.blob import BlockBlobService, PublicAccess
from azure.storage.blob.models import Blob

def run_sample():
    block_blob_service = BlockBlobService(account_name='your_name', account_key='your_key')
    container_name = 't1s'
    # Server-side copy: the storage service fetches the source URL itself
    block_blob_service.copy_blob(container_name, 'remoteURL.pdf',
                                 'https://media.readthedocs.org/pdf/azure-storage/v0.20.3/azure-storage.pdf')

if __name__ == '__main__':
    run_sample()