AmazonRedshiftSource Class
A copy activity source for Amazon Redshift.
All required parameters must be populated in order to send to Azure.
- Inheritance
-
azure.mgmt.datafactory.models._models_py3.TabularSource
Constructor
AmazonRedshiftSource(*, additional_properties: Optional[Dict[str, Any]] = None, source_retry_count: Optional[Any] = None, source_retry_wait: Optional[Any] = None, max_concurrent_connections: Optional[Any] = None, disable_metrics_collection: Optional[Any] = None, query_timeout: Optional[Any] = None, additional_columns: Optional[Any] = None, query: Optional[Any] = None, redshift_unload_settings: Optional[_models.RedshiftUnloadSettings] = None, **kwargs)
Variables
- additional_properties
- dict[str, any]
Unmatched properties from the message are deserialized to this collection.
- type
- str
Required. Copy source type. Constant filled by server.
- source_retry_count
- any
Source retry count. Type: integer (or Expression with resultType integer).
- source_retry_wait
- any
Source retry wait. Type: string (or Expression with resultType string), pattern: ((\d+)\.)?(\d\d):(60|([0-5][0-9])):(60|([0-5][0-9])).
- max_concurrent_connections
- any
The maximum concurrent connection count for the source data store. Type: integer (or Expression with resultType integer).
- disable_metrics_collection
- any
If true, disable data store metrics collection. Default is false. Type: boolean (or Expression with resultType boolean).
- query_timeout
- any
Query timeout. Type: string (or Expression with resultType string), pattern: ((\d+)\.)?(\d\d):(60|([0-5][0-9])):(60|([0-5][0-9])).
- additional_columns
- any
Specifies the additional columns to be added to the source data. Type: array of objects (AdditionalColumns) (or Expression with resultType array of objects).
- query
- any
Database query. Type: string (or Expression with resultType string).
- redshift_unload_settings
- RedshiftUnloadSettings
The Amazon S3 settings needed for the interim Amazon S3 store when copying from Amazon Redshift with unload. With this option, data from the Amazon Redshift source is first unloaded into S3 and then copied from the interim S3 store into the targeted sink.
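The timespan pattern documented for `source_retry_wait` and `query_timeout` can be checked client-side before a value is sent; this is a small stdlib sketch of that pattern (the helper name is illustrative, not part of the SDK).

```python
import re

# The documented pattern for source_retry_wait / query_timeout, anchored so
# partial matches such as "1:2" are rejected. Format: [days.]hh:mm:ss.
TIMESPAN = re.compile(r"^((\d+)\.)?(\d\d):(60|([0-5][0-9])):(60|([0-5][0-9]))$")

def is_valid_timespan(value: str) -> bool:
    """Return True if value matches the [days.]hh:mm:ss timespan format."""
    return TIMESPAN.match(value) is not None
```

For example, `is_valid_timespan("02:00:00")` and `is_valid_timespan("7.00:30:00")` hold, while `is_valid_timespan("00:99:00")` does not.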