I'm about to start using the Azure Archive tier as the disaster recovery solution for my on-prem server. I have around 500 TB of media files, spread across a range of projects, that will need backing up. The data currently sits on an on-prem device and is accessed via network shares.
There are a number of stages to this, I guess. Firstly: what is the best practice for the initial bulk upload of the content? I was previously using CloudBerry with Glacier, which was working OK.
Secondly, am I able to upload straight to the Archive tier rather than going via the Cool tier first? I see the functionality mentioned in the documentation here, but there are answers on here suggesting otherwise.
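From what I can tell, AzCopy lets you set the tier at upload time with `--block-blob-tier`, which would cover both the initial bulk load and going straight to Archive in one step. A sketch of what I had in mind (the source path, storage account, container name, and `<sas-token>` are all placeholders, not my real setup):

```shell
# Bulk-upload the existing media shares straight into the Archive tier.
# Account name, container and <sas-token> below are placeholders.
azcopy copy "/mnt/media/projects" \
  "https://mymediaacct.blob.core.windows.net/dr-archive?<sas-token>" \
  --recursive \
  --block-blob-tier=Archive
```

My understanding is that AzCopy uploads large files in blocks and can resume an interrupted transfer with `azcopy jobs resume`, which seems important at this volume of data. Please correct me if this isn't the right approach for 500 TB.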
Lastly, what would be the best practice for automating and maintaining the archive as new projects come in? If possible I'd like to avoid manually archiving every project as it arrives, and instead just push new and updated files and folders to the archive. We'll soon be deploying a separate storage account for active projects, so potentially I can archive straight from there as required.
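For the incremental side, I was wondering whether running something like `azcopy sync` on a schedule would do the job, since it only pushes new and modified files (again, the path, account and `<sas-token>` are placeholders; I'm not sure whether sync can target the Archive tier directly, so this would land in the account's default tier):

```shell
# Push only new and changed files; azcopy sync compares
# source and destination last-modified times.
azcopy sync "/mnt/media/projects" \
  "https://mymediaacct.blob.core.windows.net/dr-archive?<sas-token>" \
  --recursive
```

From there, I've read that a blob lifecycle management rule can move blobs down to the Archive tier after a set number of days, but I'd welcome confirmation that that's the recommended pattern rather than tiering at upload time.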
I'm fairly new to this, so any pointers would be hugely appreciated!