Copy data from PostgreSQL by using Azure Data Factory

This article outlines how to use the copy activity in Azure Data Factory to copy data from a PostgreSQL database. It builds on the copy activity overview article, which presents a general overview of the copy activity.

Note

This article applies to version 2 of Data Factory, which is currently in preview. If you are using version 1 of the Data Factory service, which is generally available (GA), see PostgreSQL connector in V1.

Supported capabilities

You can copy data from a PostgreSQL database to any supported sink data store. For a list of data stores that the copy activity supports as sources and sinks, see the Supported data stores table.

Specifically, this PostgreSQL connector supports PostgreSQL version 7.4 and above.

Prerequisites

To use this PostgreSQL connector, you need to set up a Self-hosted Integration Runtime that can connect to your PostgreSQL database. The linked service then references this runtime through its connectVia property, as described in Linked service properties.

Getting started

You can create a pipeline with the copy activity by using one of the tools or SDKs that Data Factory supports. See the tutorials for step-by-step instructions to create a pipeline with a copy activity.

The following sections provide details about properties that are used to define Data Factory entities specific to the PostgreSQL connector.

Linked service properties

The following properties are supported for the PostgreSQL linked service:

| Property | Description | Required |
| --- | --- | --- |
| type | The type property must be set to: PostgreSql | Yes |
| server | Name of the PostgreSQL server. | Yes |
| database | Name of the PostgreSQL database. | Yes |
| schema | Name of the schema in the database. The schema name is case-sensitive. | No |
| username | Specify the user name to connect to the PostgreSQL database. | Yes |
| password | Specify the password for the user account you specified for the username. Mark this field as a SecureString to store it securely in Data Factory, or reference a secret stored in Azure Key Vault (see the sketch after the example below). | Yes |
| connectVia | The Integration Runtime to be used to connect to the data store. A Self-hosted Integration Runtime is required, as mentioned in Prerequisites. | Yes |

Example:

{
    "name": "PostgreSqlLinkedService",
    "properties": {
        "type": "PostgreSql",
        "typeProperties": {
            "server": "<server>",
            "database": "<database>",
            "username": "<username>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
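
As noted in the table, the password can reference a secret stored in Azure Key Vault instead of an inline SecureString. A minimal sketch of that variant; the linked service name AzureKeyVaultLinkedService and the secret name PostgreSqlPassword are hypothetical placeholders:

{
    "name": "PostgreSqlLinkedService",
    "properties": {
        "type": "PostgreSql",
        "typeProperties": {
            "server": "<server>",
            "database": "<database>",
            "username": "<username>",
            "password": {
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "AzureKeyVaultLinkedService",
                    "type": "LinkedServiceReference"
                },
                "secretName": "PostgreSqlPassword"
            }
        },
        "connectVia": {
            "referenceName": "<name of Integration Runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}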

Dataset properties

For a full list of sections and properties available for defining datasets, see the datasets article. This section provides a list of properties supported by the PostgreSQL dataset.

To copy data from PostgreSQL, set the type property of the dataset to RelationalTable. The following properties are supported:

| Property | Description | Required |
| --- | --- | --- |
| type | The type property of the dataset must be set to: RelationalTable | Yes |
| tableName | Name of the table in the PostgreSQL database (see the second example below). | No (if "query" in activity source is specified) |

Example:

{
    "name": "PostgreSQLDataset",
    "properties":
    {
        "type": "RelationalTable",
        "linkedServiceName": {
            "referenceName": "<PostgreSQL linked service name>",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {}
    }
}
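
The example above leaves typeProperties empty, so the copy activity source must supply a query. If you instead want the dataset itself to point at a table, specify tableName; a sketch with a hypothetical table name MyTable:

{
    "name": "PostgreSQLDataset",
    "properties":
    {
        "type": "RelationalTable",
        "linkedServiceName": {
            "referenceName": "<PostgreSQL linked service name>",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "tableName": "MyTable"
        }
    }
}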

Copy activity properties

For a full list of sections and properties available for defining activities, see the Pipelines article. This section provides a list of properties supported by the PostgreSQL source.

PostgreSQL as source

To copy data from PostgreSQL, set the source type in the copy activity to RelationalSource. The following properties are supported in the copy activity source section:

| Property | Description | Required |
| --- | --- | --- |
| type | The type property of the copy activity source must be set to: RelationalSource | Yes |
| query | Use the custom SQL query to read data. For example: "query": "SELECT * FROM \"MySchema\".\"MyTable\"". | No (if "tableName" in dataset is specified) |

Note

Schema and table names are case-sensitive. Enclose them in "" (double quotes) in the query.

Example:

"activities":[
    {
        "name": "CopyFromPostgreSQL",
        "type": "Copy",
        "inputs": [
            {
                "referenceName": "<PostgreSQL input dataset name>",
                "type": "DatasetReference"
            }
        ],
        "outputs": [
            {
                "referenceName": "<output dataset name>",
                "type": "DatasetReference"
            }
        ],
        "typeProperties": {
            "source": {
                "type": "RelationalSource",
                "query": "SELECT * FROM \"MySchema\".\"MyTable\""
            },
            "sink": {
                "type": "<sink type>"
            }
        }
    }
]
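
The activities fragment above sits inside a pipeline definition. For context, a minimal sketch of the full pipeline resource; the pipeline name, the AzureBlobDataset output dataset, and the BlobSink sink type are assumptions for illustration (BlobSink applies when the sink is Azure Blob storage):

{
    "name": "CopyPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopyFromPostgreSQL",
                "type": "Copy",
                "inputs": [
                    {
                        "referenceName": "PostgreSQLDataset",
                        "type": "DatasetReference"
                    }
                ],
                "outputs": [
                    {
                        "referenceName": "AzureBlobDataset",
                        "type": "DatasetReference"
                    }
                ],
                "typeProperties": {
                    "source": {
                        "type": "RelationalSource",
                        "query": "SELECT * FROM \"MySchema\".\"MyTable\""
                    },
                    "sink": {
                        "type": "BlobSink"
                    }
                }
            }
        ]
    }
}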

Data type mapping for PostgreSQL

When copying data from PostgreSQL, the following mappings are used from PostgreSQL data types to Azure Data Factory interim data types. See Schema and data type mappings to learn about how copy activity maps the source schema and data type to the sink.

| PostgreSQL data type | PostgreSQL aliases | Data Factory interim data type |
| --- | --- | --- |
| abstime | | Datetime |
| bigint | int8 | Int64 |
| bigserial | serial8 | Int64 |
| bit [ (n) ] | | Byte[], String |
| bit varying [ (n) ] | varbit | Byte[], String |
| boolean | bool | Boolean |
| box | | Byte[], String |
| bytea | | Byte[], String |
| character [ (n) ] | char [ (n) ] | String |
| character varying [ (n) ] | varchar [ (n) ] | String |
| cid | | String |
| cidr | | String |
| circle | | Byte[], String |
| date | | Datetime |
| daterange | | String |
| double precision | float8 | Double |
| inet | | Byte[], String |
| intarray | | String |
| int4range | | String |
| int8range | | String |
| integer | int, int4 | Int32 |
| interval [ fields ] [ (p) ] | | Timespan |
| json | | String |
| jsonb | | Byte[] |
| line | | Byte[], String |
| lseg | | Byte[], String |
| macaddr | | Byte[], String |
| money | | Decimal |
| numeric [ (p, s) ] | decimal [ (p, s) ] | Decimal |
| numrange | | String |
| oid | | Int32 |
| path | | Byte[], String |
| pg_lsn | | Int64 |
| point | | Byte[], String |
| polygon | | Byte[], String |
| real | float4 | Single |
| smallint | int2 | Int16 |
| smallserial | serial2 | Int16 |
| serial | serial4 | Int32 |
| text | | String |
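
If a sink cannot accept the interim type chosen for a PostgreSQL-specific column (for example, Byte[] for inet or cidr), one possible workaround is to cast the column to text in the source query so that it arrives as a String. A sketch of a copy activity source doing this; the table "MySchema"."Hosts" and the inet column addr are hypothetical:

"source": {
    "type": "RelationalSource",
    "query": "SELECT id, CAST(addr AS text) AS addr FROM \"MySchema\".\"Hosts\""
}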

Next steps

For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see supported data stores.