Copy data from Google Cloud Storage using Azure Data Factory

This article outlines how to copy data from Google Cloud Storage. To learn about Azure Data Factory, read the introductory article.

Supported capabilities

This Google Cloud Storage connector is supported for the following activities:

  • Copy activity
  • Lookup activity
  • GetMetadata activity
  • Delete activity

Specifically, this Google Cloud Storage connector supports copying files as-is or parsing files with the supported file formats and compression codecs.

Note

Copying data from Google Cloud Storage leverages the Amazon S3 connector with a corresponding custom S3 endpoint, because Google Cloud Storage provides S3-compatible interoperability.

Required permissions

To copy data from Google Cloud Storage, make sure you have been granted the following permissions:

  • For copy activity execution: s3:GetObject and s3:GetObjectVersion for object operations.
  • For Data Factory GUI authoring: s3:ListAllMyBuckets and s3:ListBucket/s3:GetBucketLocation for bucket operations are additionally required for operations like test connection and browsing/navigating file paths. If you don't want to grant these permissions, skip the test connection step on the linked service creation page and specify the path directly in the dataset settings.

Getting started

You can use one of the following tools or SDKs to run the copy activity with a pipeline. Select a link for step-by-step instructions:

The following sections provide details about properties that are used to define Data Factory entities specific to Google Cloud Storage.

Linked service properties

The following properties are supported for the Google Cloud Storage linked service:

Property Description Required
type The type property must be set to AmazonS3. Yes
accessKeyId ID of the secret access key. To find the access key and secret, go to Google Cloud Storage > Settings > Interoperability. Yes
secretAccessKey The secret access key itself. Mark this field as a SecureString to store it securely in Data Factory, or reference a secret stored in Azure Key Vault. Yes
serviceUrl Specify the custom S3 endpoint as https://storage.googleapis.com. Yes
connectVia The Integration Runtime to be used to connect to the data store. You can use the Azure Integration Runtime or a Self-hosted Integration Runtime (if your data store is located in a private network). If not specified, the default Azure Integration Runtime is used. No

Here is an example:

{
    "name": "GoogleCloudStorageLinkedService",
    "properties": {
        "type": "AmazonS3",
        "typeProperties": {
            "accessKeyId": "<access key id>",
            "secretAccessKey": {
                "type": "SecureString",
                "value": "<secret access key>"
            },
            "serviceUrl": "https://storage.googleapis.com"
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
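If you store the secret access key in Azure Key Vault, as mentioned in the table above, secretAccessKey can reference the vault secret instead of embedding it. A minimal sketch, assuming you have already created an Azure Key Vault linked service (all names are placeholders):

{
    "name": "GoogleCloudStorageLinkedService",
    "properties": {
        "type": "AmazonS3",
        "typeProperties": {
            "accessKeyId": "<access key id>",
            "secretAccessKey": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "<Azure Key Vault linked service name>",
                    "type": "LinkedServiceReference"
                },
                "secretName": "<secret name>"
            },
            "serviceUrl": "https://storage.googleapis.com"
        }
    }
}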

Dataset properties

Parquet, delimited text, JSON, Avro and binary format dataset

To copy data from Parquet, delimited text, JSON, Avro and binary format, refer to the Parquet format, Delimited text format, JSON format, Avro format, and Binary format articles on format-based datasets and supported settings. The following properties are supported for Google Cloud Storage under location settings in a format-based dataset:

Property Description Required
type The type property under location in dataset must be set to AmazonS3Location. Yes
bucketName The S3 bucket name. Yes
folderPath The path to the folder under the given bucket. If you want to use a wildcard to filter folders, skip this setting and specify it in the activity source settings. No
fileName The file name under the given bucket + folderPath. If you want to use a wildcard to filter files, skip this setting and specify it in the activity source settings. No

Note

The AmazonS3Object type dataset with Parquet/Text format mentioned in the next section is still supported as-is for the Copy/Lookup/GetMetadata activity for backward compatibility. We suggest that you use the new model going forward; the ADF authoring UI has switched to generating these new types.

Example:

{
    "name": "DelimitedTextDataset",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "<Google Cloud Storage linked service name>",
            "type": "LinkedServiceReference"
        },
        "schema": [ < physical schema, optional, auto retrieved during authoring > ],
        "typeProperties": {
            "location": {
                "type": "AmazonS3Location",
                "bucketName": "bucketname",
                "folderPath": "folder/subfolder"
            },
            "columnDelimiter": ",",
            "quoteChar": "\"",
            "firstRowAsHeader": true,
            "compressionCodec": "gzip"
        }
    }
}
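The same location model applies to the other format-based dataset types mentioned above. For example, a minimal sketch of a Binary dataset (for as-is binary copy) over the same location, with placeholder names:

{
    "name": "BinaryDataset",
    "properties": {
        "type": "Binary",
        "linkedServiceName": {
            "referenceName": "<Google Cloud Storage linked service name>",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "AmazonS3Location",
                "bucketName": "bucketname",
                "folderPath": "folder/subfolder"
            }
        }
    }
}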

Other format dataset

To copy data from Google Cloud Storage in ORC format, the following properties are supported:

Property Description Required
type The type property of the dataset must be set to: AmazonS3Object Yes
bucketName The S3 bucket name. Wildcard filter is not supported. Yes for Copy/Lookup activity, No for GetMetadata activity
key The name or wildcard filter of the S3 object key under the specified bucket. Applies only when the "prefix" property is not specified.

The wildcard filter is supported for both the folder part and the file name part. Allowed wildcards are: * (matches zero or more characters) and ? (matches zero or a single character).
- Example 1: "key": "rootfolder/subfolder/*.csv"
- Example 2: "key": "rootfolder/subfolder/???20180427.txt"
See more examples in Folder and file filter examples. Use ^ to escape if your actual folder/file name has a wildcard or this escape char inside.
No
prefix Prefix for the S3 object key. Objects whose keys start with this prefix are selected. Applies only when the "key" property is not specified. No
version The version of the S3 object, if S3 versioning is enabled. No
modifiedDatetimeStart Files filter based on the attribute Last Modified. The files are selected if their last modified time is within the time range between modifiedDatetimeStart and modifiedDatetimeEnd. The time is applied to the UTC time zone in the format "2018-12-01T05:00:00Z".

The properties can be NULL, which means no file attribute filter is applied to the dataset. When modifiedDatetimeStart has a datetime value but modifiedDatetimeEnd is NULL, the files whose last modified attribute is greater than or equal to the datetime value are selected. When modifiedDatetimeEnd has a datetime value but modifiedDatetimeStart is NULL, the files whose last modified attribute is less than the datetime value are selected.
No
modifiedDatetimeEnd Same as above. No
format If you want to copy files as-is between file-based stores (binary copy), skip the format section in both input and output dataset definitions.

If you want to parse or generate files with a specific format, the following file format types are supported: TextFormat, JsonFormat, AvroFormat, OrcFormat, ParquetFormat. Set the type property under format to one of these values. For more information, see Text Format, Json Format, Avro Format, Orc Format, and Parquet Format sections.
No (only for binary copy scenario)
compression Specify the type and level of compression for the data. For more information, see Supported file formats and compression codecs.
Supported types are: GZip, Deflate, BZip2, and ZipDeflate.
Supported levels are: Optimal and Fastest.
No

Tip

  • To copy all files under a folder, specify bucketName for the bucket and prefix for the folder part.
  • To copy a single file with a given name, specify bucketName for the bucket and key for the folder part plus the file name.
  • To copy a subset of files under a folder, specify bucketName for the bucket and key for the folder part plus a wildcard filter.

Example: using prefix

{
    "name": "GoogleCloudStorageDataset",
    "properties": {
        "type": "AmazonS3Object",
        "linkedServiceName": {
            "referenceName": "<linked service name>",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "bucketName": "testbucket",
            "prefix": "testFolder/test",
            "modifiedDatetimeStart": "2018-12-01T05:00:00Z",
            "modifiedDatetimeEnd": "2018-12-01T06:00:00Z",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ",",
                "rowDelimiter": "\n"
            },
            "compression": {
                "type": "GZip",
                "level": "Optimal"
            }
        }
    }
}
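Example: using key and wildcard filter

The tip above describes copying a subset of files with key plus a wildcard. A minimal sketch of such a dataset, with placeholder bucket and folder names:

{
    "name": "GoogleCloudStorageDataset",
    "properties": {
        "type": "AmazonS3Object",
        "linkedServiceName": {
            "referenceName": "<linked service name>",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "bucketName": "testbucket",
            "key": "testFolder/*.csv",
            "format": {
                "type": "TextFormat",
                "columnDelimiter": ",",
                "rowDelimiter": "\n"
            }
        }
    }
}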

Copy activity properties

For a full list of sections and properties available for defining activities, see the Pipelines article. This section provides a list of properties supported by the Google Cloud Storage source.

Google Cloud Storage as source

Parquet, delimited text, JSON, Avro and binary format source

To copy data from Parquet, delimited text, JSON, Avro and binary format, refer to the Parquet format, Delimited text format, JSON format, Avro format, and Binary format articles on the format-based copy activity source and supported settings. The following properties are supported for Google Cloud Storage under storeSettings in a format-based copy source:

Property Description Required
type The type property under storeSettings must be set to AmazonS3ReadSetting. Yes
recursive Indicates whether the data is read recursively from the subfolders or only from the specified folder. Note that when recursive is set to true and the sink is a file-based store, an empty folder or subfolder isn't copied or created at the sink. Allowed values are true (default) and false. No
prefix Prefix for the S3 object key under the given bucket configured in the dataset to filter source objects. Objects whose keys start with this prefix are selected. Applies only when the wildcardFolderPath and wildcardFileName properties are not specified. No
wildcardFolderPath The folder path with wildcard characters under the given bucket configured in the dataset to filter source folders.
Allowed wildcards are: * (matches zero or more characters) and ? (matches zero or a single character); use ^ to escape if your actual folder name has a wildcard or this escape char inside.
See more examples in Folder and file filter examples.
No
wildcardFileName The file name with wildcard characters under the given bucket + folderPath/wildcardFolderPath to filter source files.
Allowed wildcards are: * (matches zero or more characters) and ? (matches zero or a single character); use ^ to escape if your actual file name has a wildcard or this escape char inside. See more examples in Folder and file filter examples.
Yes if fileName in dataset and prefix are not specified
modifiedDatetimeStart Files filter based on the attribute Last Modified. The files are selected if their last modified time is within the time range between modifiedDatetimeStart and modifiedDatetimeEnd. The time is applied to the UTC time zone in the format "2018-12-01T05:00:00Z".
The properties can be NULL, which means no file attribute filter is applied to the dataset. When modifiedDatetimeStart has a datetime value but modifiedDatetimeEnd is NULL, the files whose last modified attribute is greater than or equal to the datetime value are selected. When modifiedDatetimeEnd has a datetime value but modifiedDatetimeStart is NULL, the files whose last modified attribute is less than the datetime value are selected.
No
modifiedDatetimeEnd Same as above. No
maxConcurrentConnections The number of connections to connect to the storage store concurrently. Specify a value only when you want to limit concurrent connections to the data store. No

Note

For the Parquet/delimited text format, the FileSystemSource type copy activity source mentioned in the next section is still supported as-is for backward compatibility. We suggest that you use the new model going forward; the ADF authoring UI has switched to generating these new types.

Example:

"activities":[
    {
        "name": "CopyFromGoogleCloudStorage",
        "type": "Copy",
        "inputs": [
            {
                "referenceName": "<Delimited text input dataset name>",
                "type": "DatasetReference"
            }
        ],
        "outputs": [
            {
                "referenceName": "<output dataset name>",
                "type": "DatasetReference"
            }
        ],
        "typeProperties": {
            "source": {
                "type": "DelimitedTextSource",
                "formatSettings":{
                    "type": "DelimitedTextReadSetting",
                    "skipLineCount": 10
                },
                "storeSettings":{
                    "type": "AmazonS3ReadSetting",
                    "recursive": true,
                    "wildcardFolderPath": "myfolder*A",
                    "wildcardFileName": "*.csv"
                }
            },
            "sink": {
                "type": "<sink type>"
            }
        }
    }
]
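If you filter by prefix instead of wildcards, only the source's storeSettings section changes. A minimal sketch of the source, with a placeholder prefix value:

"source": {
    "type": "DelimitedTextSource",
    "storeSettings":{
        "type": "AmazonS3ReadSetting",
        "recursive": true,
        "prefix": "testFolder/test"
    }
}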

Other format source

To copy data from Google Cloud Storage in ORC format, the following properties are supported in the copy activity source section:

Property Description Required
type The type property of the copy activity source must be set to: FileSystemSource Yes
recursive Indicates whether the data is read recursively from the subfolders or only from the specified folder. Note that when recursive is set to true and the sink is a file-based store, an empty folder or subfolder isn't copied or created at the sink.
Allowed values are: true (default), false
No
maxConcurrentConnections The number of connections to connect to the storage store concurrently. Specify a value only when you want to limit concurrent connections to the data store. No

Example:

"activities":[
    {
        "name": "CopyFromGoogleCloudStorage",
        "type": "Copy",
        "inputs": [
            {
                "referenceName": "<input dataset name>",
                "type": "DatasetReference"
            }
        ],
        "outputs": [
            {
                "referenceName": "<output dataset name>",
                "type": "DatasetReference"
            }
        ],
        "typeProperties": {
            "source": {
                "type": "FileSystemSource",
                "recursive": true
            },
            "sink": {
                "type": "<sink type>"
            }
        }
    }
]

Folder and file filter examples

This section describes the resulting behavior of the folder path and file name with wildcard filters.

bucket key recursive Source folder structure and filter result (files in bold are retrieved)
bucket Folder*/* false bucket
    FolderA
        **File1.csv**
        **File2.json**
        Subfolder1
            File3.csv
            File4.json
            File5.csv
    AnotherFolderB
        File6.csv
bucket Folder*/* true bucket
    FolderA
        **File1.csv**
        **File2.json**
        Subfolder1
            **File3.csv**
            **File4.json**
            **File5.csv**
    AnotherFolderB
        File6.csv
bucket Folder*/*.csv false bucket
    FolderA
        **File1.csv**
        File2.json
        Subfolder1
            File3.csv
            File4.json
            File5.csv
    AnotherFolderB
        File6.csv
bucket Folder*/*.csv true bucket
    FolderA
        **File1.csv**
        File2.json
        Subfolder1
            **File3.csv**
            File4.json
            **File5.csv**
    AnotherFolderB
        File6.csv

Lookup activity properties

To learn details about the properties, check Lookup activity.
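A minimal sketch of a Lookup activity that reads the first row of a delimited text dataset on Google Cloud Storage, with placeholder names:

{
    "name": "LookupFromGoogleCloudStorage",
    "type": "Lookup",
    "typeProperties": {
        "source": {
            "type": "DelimitedTextSource",
            "storeSettings": {
                "type": "AmazonS3ReadSetting",
                "recursive": false
            }
        },
        "dataset": {
            "referenceName": "<Delimited text dataset name>",
            "type": "DatasetReference"
        },
        "firstRowOnly": true
    }
}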

GetMetadata activity properties

To learn details about the properties, check GetMetadata activity.
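A minimal sketch of a GetMetadata activity that lists child items and the last modified time for a Google Cloud Storage dataset, with placeholder names:

{
    "name": "GetMetadataFromGoogleCloudStorage",
    "type": "GetMetadata",
    "typeProperties": {
        "dataset": {
            "referenceName": "<Google Cloud Storage dataset name>",
            "type": "DatasetReference"
        },
        "fieldList": [ "childItems", "lastModified" ]
    }
}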

Delete activity properties

To learn details about the properties, check Delete activity.
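A minimal sketch of a Delete activity over a Google Cloud Storage dataset, with placeholder names. Note that deleting objects requires delete permission on the bucket, beyond the read-only permissions listed in Required permissions:

{
    "name": "DeleteFromGoogleCloudStorage",
    "type": "Delete",
    "typeProperties": {
        "dataset": {
            "referenceName": "<Google Cloud Storage dataset name>",
            "type": "DatasetReference"
        },
        "recursive": true
    }
}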

Next steps

For a list of data stores that are supported as sources and sinks by the copy activity in Azure Data Factory, see supported data stores.