Templates are predefined Azure Data Factory pipelines that help you get started quickly, especially when you're new to Data Factory. By reducing the development time for data integration projects, templates improve developer productivity.
## Create Data Factory pipelines from templates
You can start creating a Data Factory pipeline from a template in either of two ways:

- Select Create pipeline from template on the Overview page to open the template gallery.
- On the Author tab in Resource Explorer, select +, then Pipeline from template, to open the template gallery.
## Out-of-the-box Data Factory templates
Data Factory uses Azure Resource Manager templates to save Data Factory pipeline templates. You can see all the Resource Manager templates, along with the manifest file used for the out-of-the-box templates, in the official Azure Data Factory GitHub repo. The predefined templates provided by Microsoft include, but are not limited to, the following:
- Copy from <source> to <destination>:
  - From Amazon S3 to Azure Data Lake Storage Gen2
  - From Google BigQuery to Azure Data Lake Storage Gen2
  - From HDFS to Azure Data Lake Storage Gen2
  - From Netezza to Azure Data Lake Storage Gen1
  - From on-premises SQL Server to Azure SQL Database
  - From on-premises SQL Server to Azure SQL Data Warehouse
  - From on-premises Oracle to Azure SQL Data Warehouse
- Schedule Azure-SSIS Integration Runtime to execute SSIS packages
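To illustrate what a copy template resolves to, the sketch below assembles a minimal pipeline definition with a single Copy activity, similar in shape to what the Copy from <source> to <destination> templates generate. It builds the JSON as a plain Python dict; the pipeline and dataset names are hypothetical placeholders, not taken from an actual template.

```python
import json

def copy_pipeline(name: str, source_dataset: str, sink_dataset: str) -> dict:
    """Build a minimal Data Factory pipeline definition with one Copy activity.

    The overall shape (name / properties / activities) follows the Data Factory
    pipeline JSON schema; the specific names used here are placeholders.
    """
    return {
        "name": name,
        "properties": {
            "activities": [
                {
                    "name": "CopyData",
                    "type": "Copy",
                    "inputs": [{"referenceName": source_dataset,
                                "type": "DatasetReference"}],
                    "outputs": [{"referenceName": sink_dataset,
                                 "type": "DatasetReference"}],
                }
            ]
        },
    }

# Example: a pipeline mirroring the "From Amazon S3 to Azure Data Lake
# Storage Gen2" template, using placeholder dataset names.
pipeline = copy_pipeline("CopyFromS3ToADLSGen2",
                         "SourceS3Dataset", "SinkADLSGen2Dataset")
print(json.dumps(pipeline, indent=2))
```

When you pick a template from the gallery, the authoring UI prompts you for the source and sink connections and then generates a complete definition of this form for you.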
You can also save a pipeline as a template by selecting Save as template on the Pipeline tab.
You can view pipelines saved as templates in the My Templates section of the template gallery, and in the Templates section of the Resource Explorer.
To use the My Templates feature, you must enable Git integration. Both Azure DevOps Git and GitHub are supported.