BigDataPoolResourceInfo Class

A Big Data pool.

Variables are only populated by the server, and will be ignored when sending a request.

All required parameters must be populated in order to send to Azure.

Inheritance
azure.mgmt.synapse.models._models_py3.TrackedResource
BigDataPoolResourceInfo

Constructor

BigDataPoolResourceInfo(
    *,
    location: str,
    tags: Optional[Dict[str, str]] = None,
    provisioning_state: Optional[str] = None,
    auto_scale: Optional[azure.mgmt.synapse.models._models_py3.AutoScaleProperties] = None,
    creation_date: Optional[datetime.datetime] = None,
    auto_pause: Optional[azure.mgmt.synapse.models._models_py3.AutoPauseProperties] = None,
    is_compute_isolation_enabled: Optional[bool] = None,
    session_level_packages_enabled: Optional[bool] = None,
    cache_size: Optional[int] = None,
    dynamic_executor_allocation: Optional[azure.mgmt.synapse.models._models_py3.DynamicExecutorAllocation] = None,
    spark_events_folder: Optional[str] = None,
    node_count: Optional[int] = None,
    library_requirements: Optional[azure.mgmt.synapse.models._models_py3.LibraryRequirements] = None,
    custom_libraries: Optional[List[azure.mgmt.synapse.models._models_py3.LibraryInfo]] = None,
    spark_config_properties: Optional[azure.mgmt.synapse.models._models_py3.LibraryRequirements] = None,
    spark_version: Optional[str] = None,
    default_spark_log_folder: Optional[str] = None,
    node_size: Optional[Union[str, azure.mgmt.synapse.models._synapse_management_client_enums.NodeSize]] = None,
    node_size_family: Optional[Union[str, azure.mgmt.synapse.models._synapse_management_client_enums.NodeSizeFamily]] = None,
    **kwargs
)

Parameters

tags
dict[str, str]
Optional

Resource tags.

location
str
Required

The geo-location where the resource lives.

provisioning_state
str
Optional

The state of the Big Data pool.

auto_scale
AutoScaleProperties
Optional

Auto-scaling properties.

creation_date
datetime
Optional

The time when the Big Data pool was created.

auto_pause
AutoPauseProperties
Optional

Auto-pausing properties.

is_compute_isolation_enabled
bool
Optional

Whether compute isolation is required.

session_level_packages_enabled
bool
Optional

Whether session-level packages are enabled.

cache_size
int
Optional

The cache size.

dynamic_executor_allocation
DynamicExecutorAllocation
Optional

Dynamic executor allocation settings.

spark_events_folder
str
Optional

The Spark events folder.

node_count
int
Optional

The number of nodes in the Big Data pool.

library_requirements
LibraryRequirements
Optional

Library version requirements.

custom_libraries
list[LibraryInfo]
Optional

List of custom libraries/packages associated with the Spark pool.

spark_config_properties
LibraryRequirements
Optional

Spark configuration file to specify additional properties.

spark_version
str
Optional

The Apache Spark version.

default_spark_log_folder
str
Optional

The default folder where Spark logs will be written.

node_size
str or NodeSize
Optional

The level of compute power that each node in the Big Data pool has. Possible values include: "None", "Small", "Medium", "Large", "XLarge", "XXLarge", "XXXLarge".

node_size_family
str or NodeSizeFamily
Optional

The kind of nodes that the Big Data pool provides. Possible values include: "None", "MemoryOptimized".

Variables

id
str

Fully qualified resource ID for the resource. Ex - /subscriptions/{subscriptionId}/resourceGroups/{resourceGroupName}/providers/{resourceProviderNamespace}/{resourceType}/{resourceName}.

name
str

The name of the resource.

type
str

The type of the resource. E.g. "Microsoft.Compute/virtualMachines" or "Microsoft.Storage/storageAccounts".

last_succeeded_timestamp
datetime

The time when the Big Data pool was updated successfully.