Azure Feature Pack for Integration Services (SSIS)
The SQL Server Integration Services (SSIS) Feature Pack for Azure is an extension that provides the components listed on this page. Use these components to connect SSIS to Azure services, transfer data between Azure and on-premises data sources, and process data stored in Azure.
- For SQL Server 2019 - Microsoft SQL Server 2019 Integration Services Feature Pack for Azure
- For SQL Server 2017 - Microsoft SQL Server 2017 Integration Services Feature Pack for Azure
- For SQL Server 2016 - Microsoft SQL Server 2016 Integration Services Feature Pack for Azure
- For SQL Server 2014 - Microsoft SQL Server 2014 Integration Services Feature Pack for Azure
- For SQL Server 2012 - Microsoft SQL Server 2012 Integration Services Feature Pack for Azure
The download pages also include information about prerequisites. Make sure you install SQL Server before you install the Azure Feature Pack on a server; otherwise the components in the Feature Pack may not be available when you deploy packages to the SSIS Catalog database (SSISDB) on the server.
Components in the Feature Pack
Data Flow Components
Azure Blob, Azure Data Lake Store, and Data Lake Storage Gen2 File Enumerators. See Foreach Loop Container.
Use TLS 1.2
The TLS version used by the Azure Feature Pack follows system .NET Framework settings. To use TLS 1.2, add a REG_DWORD value named SchUseStrongCrypto with data 1 under the following two registry keys:
- HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework\v4.0.30319
- HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319
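The registry change above can be scripted. A minimal sketch using reg.exe from an elevated Command Prompt, assuming the standard .NET Framework v4 registry locations:

```shell
:: Enable strong cryptography (TLS 1.2) for 64-bit and 32-bit .NET Framework applications.
:: Run from an elevated Command Prompt; restart the SSIS service for the change to take effect.
reg add "HKLM\SOFTWARE\Microsoft\.NETFramework\v4.0.30319" /v SchUseStrongCrypto /t REG_DWORD /d 1 /f
reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\.NETFramework\v4.0.30319" /v SchUseStrongCrypto /t REG_DWORD /d 1 /f
```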
Dependency on Java
Java is required to use ORC/Parquet file formats with Azure Data Lake Store/Flat File connectors.
The architecture (32-bit or 64-bit) of the Java build must match that of the SSIS runtime in use. The following Java builds have been tested.
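One way to check the architecture of an installed Java build is to inspect its version banner; a sketch assuming JAVA_HOME is already set and a HotSpot-style banner (as in the Zulu builds below):

```shell
:: Windows Command Prompt: print the Java version banner and look for "64-Bit".
:: A 64-bit build reports e.g. "OpenJDK 64-Bit Server VM"; a 32-bit build does not.
"%JAVA_HOME%\bin\java.exe" -version 2>&1 | findstr /C:"64-Bit"
```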
Set Up Zulu's OpenJDK
- Download and extract the installation zip package.
- From the Command Prompt, run sysdm.cpl to open the System Properties dialog box.
- On the Advanced tab, select Environment Variables.
- Under the System variables section, select New.
- Enter JAVA_HOME for the Variable name.
- Select Browse Directory, navigate to the extracted folder, and select the jre subfolder. Then select OK, and the Variable value is populated automatically.
- Select OK to close the New System Variable dialog box.
- Select OK to close the Environment Variables dialog box.
- Select OK to close the System Properties dialog box.
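The same setting can also be applied from an elevated Command Prompt with setx; the extraction path below is only an example, so adjust it to point at your own extracted jre subfolder:

```shell
:: Set JAVA_HOME machine-wide, equivalent to the Environment Variables dialog steps above.
:: C:\zulu8.33.0.1-jdk8.0.192-win_x64\jre is an example path; adjust to your extracted folder.
setx /M JAVA_HOME "C:\zulu8.33.0.1-jdk8.0.192-win_x64\jre"
```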
Set Up Zulu's OpenJDK on Azure-SSIS Integration Runtime
This should be done via the custom setup interface for the Azure-SSIS Integration Runtime. In this example, zulu8.33.0.1-jdk8.0.192-win_x64.zip is used.
The blob container could be organized as follows:

```
main.cmd
install_openjdk.ps1
zulu8.33.0.1-jdk8.0.192-win_x64.zip
```
As the entry point, main.cmd triggers execution of the PowerShell script install_openjdk.ps1, which in turn extracts zulu8.33.0.1-jdk8.0.192-win_x64.zip and sets JAVA_HOME accordingly.

main.cmd:

```
powershell.exe -file install_openjdk.ps1
```

install_openjdk.ps1:

```
Expand-Archive zulu8.33.0.1-jdk8.0.192-win_x64.zip -DestinationPath C:\
[Environment]::SetEnvironmentVariable("JAVA_HOME", "C:\zulu8.33.0.1-jdk8.0.192-win_x64\jre", "Machine")
```
Set Up Oracle's Java SE Runtime Environment
- Download and run the exe installer.
- Follow the installer instructions to complete setup.
Scenario: Processing big data
Use the Azure connectors to complete the following big data processing work:
1. Use the Azure Blob Upload Task to upload input data to Azure Blob Storage.
2. Use the Azure HDInsight Create Cluster Task to create an Azure HDInsight cluster. This step is optional if you want to use your own cluster.
3. Use the Azure HDInsight Hive Task or the Azure HDInsight Pig Task to invoke a Hive or Pig job on the Azure HDInsight cluster.
4. Use the Azure HDInsight Delete Cluster Task to delete the HDInsight cluster after use, if you created an on-demand HDInsight cluster in step 2.
5. Use the Azure Blob Download Task to download the Pig/Hive output data from Azure Blob Storage.
Scenario: Managing data in the cloud
Use the Azure Blob Destination in an SSIS package to write output data to Azure Blob Storage, or use the Azure Blob Source to read data from Azure Blob Storage.
Use the Foreach Loop Container with the Azure Blob Enumerator to process data in multiple blob files.
- In certain cases, package execution reports "Error: Could not load file or assembly 'Newtonsoft.Json, Version=9.0.0.0, Culture=neutral, PublicKeyToken=30ad4fe6b2a6aeed' or one of its dependencies."
- Add delete folder/file operation to the Flexible File Task
- Add External/Output data type conversion in the Flexible File Source
- In certain cases, test connection malfunctions for Data Lake Storage Gen2 with the error message "Attempted to access an element as a type incompatible with the array"
- Bring back support for Azure storage emulator