question

MatthewCooksey-7557 asked · IlmoAnttonen-8842 commented

What are the best options for uploading and maintaining Azure Archive?

I'm about to start using Azure Archive as my disaster recovery solution for my on-prem server. I have around 500 TB of media files that will need backing up across a range of projects. The data is currently stored on a device on-prem where it is accessed via network shares.

There are a number of stages, I guess. Firstly, what is the best practice for initially uploading the bulk of the content? I was previously using CloudBerry with Glacier, which was working OK.

Secondly, am I able to upload straight to Archive rather than via the cold tier? I see the functionality mentioned in the documentation here, but there are answers on here suggesting otherwise.

Lastly, what would be the best practice for automating and maintaining the archive with new incoming projects? If possible I'd like to avoid manually archiving every project as it arrives and just push the new and updated files and folders to the archive. We'll soon be deploying a separate storage account for active projects, so potentially I can just archive straight from there as required.

I'm fairly new to this so any pointers would be hugely appreciated!


azure-blob-storage · azure-archive-storage


1 Answer

deherman-MSFT answered · IlmoAnttonen-8842 commented

@MatthewCooksey-7557
For initially uploading the data you can check out this page for suggestions. If you have limited network bandwidth, you should probably use Data Box.

You can upload directly to the archive tier, but only at the blob level, not the storage account level. If you end up using Data Box, you could set the storage account to the cold tier and then move the data to the archive tier after it has been added, using lifecycle management rules. If you are importing via AzCopy, you can specify the archive tier directly.
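As a sketch of the AzCopy route: `azcopy copy` accepts a `--block-blob-tier` flag, so an upload can land in the archive tier straight away. The account name, container, local path, and SAS token below are placeholders, not values from this thread:

```shell
# Upload a local project folder directly to the Archive tier.
# <storage-account>, the container name, the path and the SAS token
# are placeholders - substitute your own values.
azcopy copy "/mnt/media/ProjectA" \
  "https://<storage-account>.blob.core.windows.net/archive/<SAS-token>" \
  --recursive \
  --block-blob-tier=Archive
```

Blobs written this way skip the hot/cool stage entirely, so no lifecycle rule is needed for the initial bulk load.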


You could use something like `azcopy sync` to move the changed and new files to the archive storage account.
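A minimal sketch of that sync step, again with placeholder account, container, path, and SAS token: `azcopy sync` compares the source against the destination and only transfers files that are new or have changed, which matches the "just push the new and updated files" requirement above.

```shell
# Incrementally push new/changed files to the storage account.
# azcopy sync compares last-modified times and skips unchanged files.
# All names and the SAS token below are placeholders.
azcopy sync "/mnt/media/projects" \
  "https://<storage-account>.blob.core.windows.net/projects/<SAS-token>" \
  --recursive
```

Running this on a schedule (cron or Task Scheduler) would automate the weekly/incremental uploads, with a lifecycle rule handling the eventual move to the archive tier.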

Hope this helps. Let us know if you have further questions or run into any issues.



Please don't forget to "Accept the answer" and "up-vote" wherever the information provided helps you; this can be beneficial to other community members.



IlmoAnttonen-8842 commented: I have a similar question, and I believe OP also asked this but it was never elaborated on.

I have already made the initial backup transfer to Azure, and my lifecycle policy moves untouched files to the archive tier after 15 days. I have configured a weekly backup of new and updated files to be uploaded to Azure Blob Storage.
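For reference, a lifecycle management rule doing what's described above (archive blobs untouched for 15 days) might look like the following sketch; the rule name is made up and the 15-day threshold is taken from the comment:

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "archive-untouched-after-15-days",
      "type": "Lifecycle",
      "definition": {
        "actions": {
          "baseBlob": {
            "tierToArchive": { "daysAfterModificationGreaterThan": 15 }
          }
        },
        "filters": { "blobTypes": [ "blockBlob" ] }
      }
    }
  ]
}
```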

After files have been moved to the archive tier, what happens if the backup contains an update to one of the files already in archive? Will it fail? Will I get charged for early rehydration, or how does this specific scenario work? I haven't been able to find any article describing this.
