Petabyte-scale ingestion with Azure Data Factory or Azure Synapse Pipeline

Beginner
Data Engineer
Data Scientist
Data Factory
Synapse Analytics

In this module, you'll learn about the methods available for ingesting data between data stores using Azure Data Factory, including the copy activity and integration runtimes.

Units in this module

  • Introduction
  • List the data factory ingestion methods
  • Describe data factory connectors
  • Exercise: Use the data factory copy activity
  • Exercise: Manage the self-hosted integration runtime
  • Exercise: Set up the Azure integration runtime
  • Understand data ingestion security considerations
  • Knowledge check
  • Summary
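The exercises above center on the copy activity, which Data Factory defines declaratively as JSON. As a rough illustration of the shape of that definition, the sketch below assembles a minimal copy-activity pipeline in Python and serializes it; the pipeline and dataset names (`IngestBlobToSqlPipeline`, `SourceBlobDataset`, `SinkSqlDataset`) are hypothetical placeholders, not names used in the module.

```python
import json

# Minimal sketch of an Azure Data Factory Copy activity, built as a
# Python dict and serialized to the JSON structure the service expects.
# The activity references an input dataset (source) and an output
# dataset (sink); the dataset names here are hypothetical placeholders.
copy_activity = {
    "name": "CopyBlobToSql",
    "type": "Copy",
    "inputs": [
        {"referenceName": "SourceBlobDataset", "type": "DatasetReference"}
    ],
    "outputs": [
        {"referenceName": "SinkSqlDataset", "type": "DatasetReference"}
    ],
    "typeProperties": {
        "source": {"type": "BlobSource"},
        "sink": {"type": "SqlSink"},
    },
}

# A pipeline is a named collection of activities.
pipeline = {
    "name": "IngestBlobToSqlPipeline",
    "properties": {"activities": [copy_activity]},
}

print(json.dumps(pipeline, indent=2))
```

In the exercises you author this definition through the Data Factory UI rather than by hand, but the underlying JSON has this structure.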

Prerequisites

Before starting this module, you should be able to:

  • Sign in to the Azure portal
  • Explain and create resource groups
  • Describe Azure Data Factory and its core components