question

amitawasthi asked · PramodValavala-MSFT answered

Develop the Azure processes to receive blobs of data from on-prem and persist them to storage

Designing the solution: which Azure technologies should be used, how should the Blob Storage directory hierarchy be organized, and how should housekeeping be handled while keeping costs low?


Also ensure that the solution:

- is scalable for large data volumes
- can support additional environments (e.g. multiple regions)
- can be extended to store data in additional targets (e.g. Elasticsearch)
- can be extended to support additional data types (e.g. status events, exceptions)
- is robust, monitored, and secure

azure-functions


1 Answer

PramodValavala-MSFT answered

@amitawasthi I suppose this would be a much larger discussion, but here are some things you could consider in your design.

While building your own service to handle the uploads would give you maximum control, you could offload this to Azure Blob Storage itself by having your service send the upload details (storage account and SAS token) to your clients, who then upload directly to storage where required.
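To illustrate the pattern, here is a simplified, stdlib-only sketch of how a service can hand a client a signed, time-limited upload token. This is not the real Azure SAS string-to-sign format; in practice you would call `generate_blob_sas` from the `azure-storage-blob` SDK, which handles the exact signature layout for you.

```python
import base64
import hashlib
import hmac
from datetime import datetime, timezone

# Simplified stand-in for an Azure SAS token: the service signs the blob path
# and an expiry with a shared account key, and the client presents the
# signature when uploading directly to storage. The service never touches
# the data itself, so storage scalability is inherited from Azure Storage.
def issue_upload_token(account_key: str, blob_path: str, expiry: datetime) -> dict:
    string_to_sign = f"{blob_path}\n{expiry.isoformat()}"
    signature = hmac.new(
        base64.b64decode(account_key),
        string_to_sign.encode("utf-8"),
        hashlib.sha256,
    ).digest()
    return {
        "blob_path": blob_path,
        "expiry": expiry.isoformat(),
        "sig": base64.b64encode(signature).decode("ascii"),
    }

def verify_upload_token(account_key: str, token: dict) -> bool:
    # Recompute the signature and compare in constant time.
    expected = issue_upload_token(
        account_key,
        token["blob_path"],
        datetime.fromisoformat(token["expiry"]),
    )
    return hmac.compare_digest(expected["sig"], token["sig"])
```

Because the service chooses the `blob_path` it signs, it also controls the directory hierarchy (for example `environment/data-type/yyyy/MM/dd/…`) without trusting the client.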

This should cover "is scalable for large data volumes" and "can support additional environments (e.g. multiple regions)", since Azure Storage already addresses both.

With this covered, your service could simply react to storage events to identify when an upload is complete and perform additional processing as required. Since blobs can hold any data, any data type can be uploaded, and your service can keep track of each one (since it issues the upload details, including the filename, it can use those to track uploads).
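A sketch of that reaction step, assuming the Event Grid `Microsoft.Storage.BlobCreated` event schema (the `subject` field encodes the container and blob path). The handler just extracts what downstream processing, such as indexing into Elasticsearch, would need:

```python
# Minimal handler for an Event Grid Microsoft.Storage.BlobCreated event:
# extract the container and blob name from the event subject so downstream
# steps (metadata tracking, pushing to additional targets, etc.) can run.
def handle_blob_created(event: dict) -> dict:
    if event.get("eventType") != "Microsoft.Storage.BlobCreated":
        raise ValueError("not a BlobCreated event")
    # subject format: /blobServices/default/containers/<container>/blobs/<blob path>
    _, _, rest = event["subject"].partition("/containers/")
    container, _, blob_name = rest.partition("/blobs/")
    return {
        "container": container,
        "blob_name": blob_name,
        "url": event["data"]["url"],
        "size_bytes": event["data"].get("contentLength"),
    }
```

In an Azure Function this would be the body of an Event Grid-triggered function, so new targets or data types become new branches after the parse rather than changes to the upload path.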

The service could be as simple as an Azure Function, protected with Azure AD and hooked up to Azure Monitor. It can also be deployed to multiple regions and fronted with Azure Front Door, all while maintaining the metadata in a geo-distributed Cosmos DB.
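For the Cosmos DB side, a hypothetical shape for the metadata document is sketched below (the field names and the choice of region as partition key are assumptions, not a prescribed schema). Partitioning by region keeps writes local to each deployed region, and `id` plus the partition key uniquely identifies a record:

```python
from datetime import datetime, timezone

# Hypothetical upload-metadata document for Cosmos DB
# (partition key path would be /partitionKey on the container).
def build_upload_metadata(region: str, data_type: str, blob_name: str,
                          size_bytes: int, uploaded_at: datetime) -> dict:
    return {
        "id": f"{data_type}:{blob_name}",   # unique within the partition
        "partitionKey": region,             # one logical partition per region
        "dataType": data_type,              # e.g. telemetry, status-event, exception
        "blobName": blob_name,
        "sizeBytes": size_bytes,
        "uploadedAt": uploaded_at.isoformat(),
    }
```

Keeping the data type as an explicit field means adding new types (status events, exceptions) is a data change, not a schema migration.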

