DatabricksSparkPythonActivity Class

DatabricksSparkPython activity, which runs a Python file on an Azure Databricks cluster.

All required parameters must be populated in order to send to Azure.

Inheritance
azure.mgmt.datafactory.models._models_py3.ExecutionActivity
DatabricksSparkPythonActivity

Constructor

DatabricksSparkPythonActivity(*, name: str, python_file: Any, additional_properties: Optional[Dict[str, Any]] = None, description: Optional[str] = None, depends_on: Optional[List[_models.ActivityDependency]] = None, user_properties: Optional[List[_models.UserProperty]] = None, linked_service_name: Optional[_models.LinkedServiceReference] = None, policy: Optional[_models.ActivityPolicy] = None, parameters: Optional[List[Any]] = None, libraries: Optional[List[Dict[str, Any]]] = None, **kwargs)
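As an illustration, the keyword-only constructor above maps onto the pipeline-activity JSON that Data Factory stores. A minimal sketch of that payload, with hypothetical names, paths, and linked-service reference:

```python
import json

# Hypothetical JSON payload corresponding to a DatabricksSparkPythonActivity.
# All names, paths, and the linked service below are illustrative only.
activity = {
    "name": "RunEtlScript",           # required: activity name
    "type": "DatabricksSparkPython",  # constant filled by server
    "typeProperties": {
        "pythonFile": "dbfs:/scripts/etl.py",   # required: URI of the Python file
        "parameters": ["--env", "prod"],        # optional command-line parameters
        "libraries": [{"pypi": {"package": "requests"}}],  # optional cluster libraries
    },
    "linkedServiceName": {
        "referenceName": "AzureDatabricksLS",   # hypothetical linked service
        "type": "LinkedServiceReference",
    },
}
print(json.dumps(activity, indent=2))
```

The same values would be passed as the `name`, `python_file`, `parameters`, `libraries`, and `linked_service_name` keyword arguments when constructing the model object directly.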

Variables

additional_properties
dict[str, any]

Unmatched properties from the message are deserialized to this collection.

name
str

Required. Activity name.

type
str

Required. Type of activity. Constant filled by server.

description
str

Activity description.

depends_on
list[ActivityDependency]

Activity depends on condition.

user_properties
list[UserProperty]

Activity user properties.

linked_service_name
LinkedServiceReference

Linked service reference.

policy
ActivityPolicy

Activity policy.

python_file
any

Required. The URI of the Python file to be executed. DBFS paths are supported. Type: string (or Expression with resultType string).
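Because python_file is typed as any rather than str, it can hold either a literal path or a Data Factory expression object that resolves to a string at run time. A brief sketch of both forms (the pipeline parameter name pyFile is hypothetical):

```python
# Literal DBFS path to the script.
python_file_literal = "dbfs:/scripts/etl.py"

# Data Factory expression resolving to a string at run time.
python_file_expression = {
    "value": "@pipeline().parameters.pyFile",  # hypothetical pipeline parameter
    "type": "Expression",
}
```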

parameters
list[any]

Command-line parameters that will be passed to the Python file.

libraries
list[dict[str, any]]

A list of libraries to be installed on the cluster that will execute the job.
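Each entry in libraries is a dict whose single key names the library kind. The shapes below follow the Databricks library-specification convention; the specific packages and paths are illustrative only:

```python
# Illustrative library specifications for the job cluster.
# Packages and DBFS paths here are hypothetical.
libraries = [
    {"pypi": {"package": "simplejson"}},                  # PyPI package
    {"jar": "dbfs:/mnt/libraries/library.jar"},           # JAR on DBFS
    {"egg": "dbfs:/mnt/libraries/library.egg"},           # Python egg on DBFS
    {"maven": {"coordinates": "org.jsoup:jsoup:1.7.2"}},  # Maven coordinates
]
```

This list would be passed as the `libraries` keyword argument of the constructor.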