question

GargVedPrakash-6207 asked:

Hello Experts, can anyone let me know what components are required to build an Azure cloud data migration solution?

Hello Experts,

I have a requirement in my project to migrate around 40 TB of data from different source systems using Azure cloud solutions. For this I need to figure out the three queries below:
1- What initial setup is required from the Azure point of view, with the component details?
2- What will be the cost of each component that is going to be used for our project, with the subscription details?
3- What will be our total cost of data movement if we select ADF as our ETL tool, and what is the preferable storage, given that Azure has a number of options for it?

Tags: azure-data-factory, azure-storage-accounts

1 Answer

PRADEEPCHEEKATLA-MSFT answered:

Hello @GargVedPrakash-6207,

Welcome to the Microsoft Q&A platform.

You can use the Azure Storage migration tool to identify which tool to choose for transferring data to and from Azure, based on your requirements.



There are several options for transferring data to and from Azure, depending on your needs.

  • Physical transfer – Azure Import/Export service and Azure Data Box

  • Command line tools and APIs – Azure CLI, AzCopy, PowerShell, AdlCopy, PolyBase, DistCp, Sqoop, and Hadoop CLI (a programmatic sketch follows below)

  • Graphical interface – Azure portal, Azure Storage Explorer, and Azure Data Factory

  • Data pipeline – Azure Data Factory

For more details, refer to Transferring data to and from Azure.
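To give a concrete feel for the "command line tools and APIs" option, here is a minimal Python sketch that uploads a single file with the azure-storage-blob SDK. The connection string environment variable, container name, and local path are placeholders for illustration, not values taken from your environment:

    # Minimal upload sketch using the azure-storage-blob SDK (pip install azure-storage-blob).
    # Connection string, container name, and file path below are illustrative placeholders.
    import os
    from azure.storage.blob import BlobServiceClient

    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"])
    container = service.get_container_client("migration-landing")  # assumed container name

    local_path = "data/extract.csv"  # a file produced by one of your source systems
    with open(local_path, "rb") as data:
        # Upload the file as a blob, overwriting any existing blob with the same name.
        container.upload_blob(name=os.path.basename(local_path), data=data, overwrite=True)

For a 40 TB move you would normally drive AzCopy or Azure Data Factory rather than loop over files like this, but the SDK is handy for scripting smaller, repeatable pieces of the migration.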

Data transfer can be offline or over a network connection. Choose your solution depending on your:

  • Data size – size of the data intended for transfer,

  • Transfer frequency – one-time or periodic data ingestion, and

  • Network – bandwidth available for data transfer in your environment.


This article provides an overview of some of the common Azure data transfer solutions. The article also links out to recommended options depending on the network bandwidth in your environment and the size of the data you intend to transfer.
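Before choosing between offline devices (Azure Data Box / Import/Export) and a network transfer, it is worth a quick back-of-the-envelope estimate of how long 40 TB would take over your available bandwidth. The bandwidth and utilisation figures below are assumptions purely for illustration:

    # Rough transfer-time estimate for a 40 TB migration over a network link.
    # Bandwidth and utilisation values are assumptions; substitute your own numbers.
    data_tb = 40
    data_bits = data_tb * 1000**4 * 8            # 40 TB (decimal) expressed in bits

    bandwidth_mbps = 1000                        # assumed 1 Gbps link
    utilisation = 0.8                            # assume 80% of the link is usable

    seconds = data_bits / (bandwidth_mbps * 1_000_000 * utilisation)
    print(f"Estimated transfer time: {seconds / 86400:.1f} days")  # ~4.6 days with these numbers

If the estimate runs into weeks, an offline option such as Azure Data Box is usually the better fit; if it is a few days and you have a one-time migration window, a network transfer with AzCopy or Azure Data Factory can be sufficient.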

Azure Data Factory is a managed service best suited for regularly transferring files between a number of Azure services, on-premises systems, or a combination of the two. Using Azure Data Factory, you can create and schedule data-driven workflows (called pipelines) that ingest data from disparate data stores, and you can process and transform the data by using compute services such as Azure HDInsight Hadoop, Spark, Azure Data Lake Analytics, and Azure Machine Learning. These data-driven workflows orchestrate and automate data movement and data transformation.
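As a rough illustration of what a pipeline with a copy activity looks like when created programmatically, here is a sketch based on the pattern in the ADF Python quickstart (azure-mgmt-datafactory and azure-identity packages). The subscription, resource group, factory, and dataset names are placeholders, and the source/sink datasets and linked services are assumed to already exist in the factory:

    # Sketch: define a pipeline with a single copy activity via the ADF management SDK.
    # Names in angle brackets are placeholders; datasets/linked services must already exist.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient
    from azure.mgmt.datafactory.models import (
        PipelineResource, CopyActivity, DatasetReference, BlobSource, BlobSink)

    adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

    copy_activity = CopyActivity(
        name="CopySourceToLake",
        inputs=[DatasetReference(reference_name="SourceDataset")],
        outputs=[DatasetReference(reference_name="LakeDataset")],
        source=BlobSource(),   # swap for the source type that matches your data store
        sink=BlobSink())       # e.g. a Blob / Data Lake sink for the landing zone

    # Create (or update) the pipeline in the factory; run it manually or on a schedule.
    adf.pipelines.create_or_update(
        "<resource-group>", "<data-factory-name>", "MigrationCopyPipeline",
        PipelineResource(activities=[copy_activity]))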

When you say different sources, make sure these sources are supported in Azure Data Factory.

What initial setup is required from the Azure point of view, with the component details?

Once you have gone through the above articles, you will be in a position to understand which tool needs to be used for which purpose.

What will be the cost of each component that is going to be used for our project, with the subscription details?

Once you select the tool you want to use to transfer data to and from Azure, you can review its pricing model using the Azure pricing calculator.

What will be our total cost of data movement if we select ADF as our ETL tool, and what is the preferable storage, given that Azure has a number of options for it?

Azure Data Factory - Pricing page: Pipelines are control flows of discrete steps referred to as activities. You pay for data pipeline orchestration by activity run and activity execution by integration runtime hours. The integration runtime, which is serverless in Azure and self-hosted in hybrid scenarios, provides the compute resources used to execute the activities in a pipeline. Integration runtime charges are prorated by the minute and rounded up.

For example, the Azure Data Factory copy activity can move data across various data stores in a secure, reliable, performant and scalable way. As data volume or throughput needs grow, the integration runtime can scale out to meet those needs.

Note: If you run an operation that takes 2 minutes and 20 seconds, you will be billed for 3 minutes.
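To make the rounding rule concrete, here is a small worked example of the copy-activity portion of the bill. The DIU count and per-DIU-hour rate are assumptions for illustration only; take the actual rates from the Azure Data Factory pricing page or the Azure pricing calculator:

    # Illustrative cost arithmetic for one copy-activity run on the Azure integration runtime.
    # DIU count and rate are assumed values; real rates come from the ADF pricing page.
    import math

    copy_duration_seconds = 140                 # the 2 min 20 s example from the note above
    diu = 4                                     # assumed data integration units for the copy
    rate_per_diu_hour = 0.25                    # assumed $/DIU-hour, check current pricing

    billed_minutes = math.ceil(copy_duration_seconds / 60)    # prorated by minute, rounded up -> 3
    cost = diu * (billed_minutes / 60) * rate_per_diu_hour    # DIU-hours x rate
    print(f"Billed minutes: {billed_minutes}, estimated copy cost: ${cost:.4f}")  # ~$0.05

On top of the copy-activity DIU-hours you also pay a small per-activity-run orchestration charge and, for 40 TB, the storage costs of the chosen destination, which is why the pricing calculator is the safest place to add it all up.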


Hope this helps. Please let us know if you have any further queries.


Hi @PRADEEPCHEEKATLA-MSFT, thanks a lot for the exhaustive and detailed answer. It is really well written.



Thanks a lot, Pradeep, for the insights on my queries. I will definitely go and check each and every solution that you have mentioned. Thanks again for your quick response.
