SparkSessionOperations Class

Async operations for managing Spark sessions.

You should not instantiate this class directly. Instead, create a Client instance; it will instantiate this class for you and attach it as an attribute.
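For illustration, a minimal sketch of how the attached operation group is typically reached and awaited. The client class, constructor arguments, and attribute name in the comments are assumptions based on the azure-synapse-spark package, not verified here; an AsyncMock stands in for the operation group so the call shape can be shown without a live workspace.

```python
# On a real workspace you would write something like (names assumed):
#
#     from azure.identity.aio import DefaultAzureCredential
#     from azure.synapse.spark.aio import SparkClient
#
#     client = SparkClient(
#         credential=DefaultAzureCredential(),
#         endpoint="https://<workspace>.dev.azuresynapse.net",
#         spark_pool_name="<pool>",
#     )
#     ops = client.spark_session  # SparkSessionOperations, attached for you
import asyncio
from unittest.mock import AsyncMock

# Stand-in for the attached SparkSessionOperations group.
ops = AsyncMock()
ops.get_spark_session.return_value = {"id": 42, "state": "idle"}

async def main():
    # Every operation in this group is a coroutine and must be awaited.
    return await ops.get_spark_session(session_id=42, detailed=True)

session = asyncio.run(main())
print(session["state"])  # idle
```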

Inheritance
builtins.object
SparkSessionOperations

Constructor

SparkSessionOperations(client, config, serializer, deserializer)

Parameters

client
Required

Client for service requests.

config
Required

Configuration of the service client.

serializer
Required

An object model serializer.

deserializer
Required

An object model deserializer.

Variables

models

Alias to model classes used in this operation group.

Methods

cancel_spark_session

Cancels a running Spark session.

cancel_spark_statement

Cancels a statement within a Spark session.

create_spark_session

Creates a new Spark session.

create_spark_statement

Creates a statement within a Spark session.

get_spark_session

Gets a single Spark session.

get_spark_sessions

Lists all Spark sessions running under a particular Spark pool.

get_spark_statement

Gets a single statement within a Spark session.

get_spark_statements

Gets a list of statements within a Spark session.

reset_spark_session_timeout

Sends a keep-alive call to the current session to reset the session timeout.

cancel_spark_session

Cancels a running Spark session.

async cancel_spark_session(session_id: int, **kwargs: Any) -> None

Parameters

session_id
int
Required

Identifier for the session.

cls
callable

A custom type or function that will be passed the direct response.

Returns

None, or the result of cls(response)

Return type

None

Exceptions

cancel_spark_statement

Cancels a statement within a Spark session.

async cancel_spark_statement(session_id: int, statement_id: int, **kwargs: Any) -> azure.synapse.spark.models._models_py3.SparkStatementCancellationResult

Parameters

session_id
int
Required

Identifier for the session.

statement_id
int
Required

Identifier for the statement.

cls
callable

A custom type or function that will be passed the direct response.

Returns

SparkStatementCancellationResult, or the result of cls(response)

Return type

SparkStatementCancellationResult

Exceptions

create_spark_session

Creates a new Spark session.

async create_spark_session(spark_session_options: azure.synapse.spark.models._models_py3.SparkSessionOptions, detailed: Optional[bool] = None, **kwargs: Any) -> azure.synapse.spark.models._models_py3.SparkSession

Parameters

spark_session_options
SparkSessionOptions
Required

Livy-compatible batch job request payload.

detailed
bool
default value: None

Optional query parameter specifying whether a detailed response is returned beyond the plain Livy response.

cls
callable

A custom type or function that will be passed the direct response.

Returns

SparkSession, or the result of cls(response)

Return type

SparkSession

Exceptions

create_spark_statement

Creates a statement within a Spark session.

async create_spark_statement(session_id: int, spark_statement_options: azure.synapse.spark.models._models_py3.SparkStatementOptions, **kwargs: Any) -> azure.synapse.spark.models._models_py3.SparkStatement

Parameters

session_id
int
Required

Identifier for the session.

spark_statement_options
SparkStatementOptions
Required

Livy-compatible batch job request payload.

cls
callable

A custom type or function that will be passed the direct response.

Returns

SparkStatement, or the result of cls(response)

Return type

SparkStatement

Exceptions
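Taken together, create_spark_session and create_spark_statement form the usual submit flow: create the session, then run statements inside it. A hedged sketch of that flow; an AsyncMock stands in for the operation group, plain dicts stand in for the SparkSessionOptions and SparkStatementOptions models, and the field names shown in the comments are assumptions.

```python
import asyncio
from unittest.mock import AsyncMock

# Stand-in for SparkSessionOperations with canned service responses.
ops = AsyncMock()
ops.create_spark_session.return_value = {"id": 7, "state": "starting"}
ops.create_spark_statement.return_value = {"id": 0, "state": "waiting"}

async def submit(code: str):
    # 1. Create the session from Livy-compatible options.  With the real
    #    SDK this would be a models.SparkSessionOptions(name=..., ...)
    #    instance; a plain dict stands in here.
    session = await ops.create_spark_session(
        {"name": "demo-session"}, detailed=True
    )
    # 2. Run a statement inside that session.  With the real SDK this
    #    would be models.SparkStatementOptions(code=code, kind="pyspark").
    statement = await ops.create_spark_statement(
        session["id"], {"code": code, "kind": "pyspark"}
    )
    return session, statement

session, statement = asyncio.run(submit("spark.range(10).count()"))
print(session["id"], statement["state"])  # 7 waiting
```

The statement comes back in a pending state; get_spark_statement (below) is the natural way to poll it to completion.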

get_spark_session

Gets a single Spark session.

async get_spark_session(session_id: int, detailed: Optional[bool] = None, **kwargs: Any) -> azure.synapse.spark.models._models_py3.SparkSession

Parameters

session_id
int
Required

Identifier for the session.

detailed
bool
default value: None

Optional query parameter specifying whether a detailed response is returned beyond the plain Livy response.

cls
callable

A custom type or function that will be passed the direct response.

Returns

SparkSession, or the result of cls(response)

Return type

SparkSession

Exceptions

get_spark_sessions

Lists all Spark sessions running under a particular Spark pool.

async get_spark_sessions(from_parameter: Optional[int] = None, size: Optional[int] = None, detailed: Optional[bool] = None, **kwargs: Any) -> azure.synapse.spark.models._models_py3.SparkSessionCollection

Parameters

from_parameter
int
default value: None

Optional parameter specifying the index at which the list should begin.

size
int
default value: None

Optional parameter specifying the size of the returned list. The default is 20, which is also the maximum.

detailed
bool
default value: None

Optional query parameter specifying whether a detailed response is returned beyond the plain Livy response.

cls
callable

A custom type or function that will be passed the direct response.

Returns

SparkSessionCollection, or the result of cls(response)

Return type

SparkSessionCollection

Exceptions
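Because size caps each response at 20, listing more sessions than that means paging with from_parameter. A sketch of the loop; an AsyncMock stands in for the operation group, and the assumption that the returned SparkSessionCollection exposes its items as a sessions list is for illustration only.

```python
import asyncio
from unittest.mock import AsyncMock

PAGE_SIZE = 20  # documented default and maximum for `size`

ops = AsyncMock()
# Pretend the pool has 45 running sessions: pages of 20, 20, then 5.
ops.get_spark_sessions.side_effect = [
    {"sessions": [{"id": i} for i in range(0, 20)]},
    {"sessions": [{"id": i} for i in range(20, 40)]},
    {"sessions": [{"id": i} for i in range(40, 45)]},
]

async def list_all_sessions():
    out, start = [], 0
    while True:
        page = await ops.get_spark_sessions(
            from_parameter=start, size=PAGE_SIZE
        )
        out.extend(page["sessions"])
        if len(page["sessions"]) < PAGE_SIZE:
            return out  # short page: nothing more to fetch
        start += PAGE_SIZE

sessions = asyncio.run(list_all_sessions())
print(len(sessions))  # 45
```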

get_spark_statement

Gets a single statement within a Spark session.

async get_spark_statement(session_id: int, statement_id: int, **kwargs: Any) -> azure.synapse.spark.models._models_py3.SparkStatement

Parameters

session_id
int
Required

Identifier for the session.

statement_id
int
Required

Identifier for the statement.

cls
callable

A custom type or function that will be passed the direct response.

Returns

SparkStatement, or the result of cls(response)

Return type

SparkStatement

Exceptions
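get_spark_statement is the natural polling target after submitting a statement: loop until the reported state indicates completion. A sketch under stated assumptions; an AsyncMock stands in for the operation group, and the state values ("waiting", "running", "available") are drawn from Livy's statement lifecycle rather than verified against this SDK.

```python
import asyncio
from unittest.mock import AsyncMock

ops = AsyncMock()
# Simulate a statement that finishes on the third poll.
ops.get_spark_statement.side_effect = [
    {"id": 3, "state": "waiting"},
    {"id": 3, "state": "running"},
    {"id": 3, "state": "available"},  # Livy's "finished" state
]

async def wait_for_statement(session_id: int, statement_id: int,
                             poll_seconds: float = 0.0):
    while True:
        stmt = await ops.get_spark_statement(session_id, statement_id)
        if stmt["state"] in ("available", "error", "cancelled"):
            return stmt
        # Real code would sleep a few seconds between polls.
        await asyncio.sleep(poll_seconds)

final = asyncio.run(wait_for_statement(7, 3))
print(final["state"])  # available
```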

get_spark_statements

Gets a list of statements within a Spark session.

async get_spark_statements(session_id: int, **kwargs: Any) -> azure.synapse.spark.models._models_py3.SparkStatementCollection

Parameters

session_id
int
Required

Identifier for the session.

cls
callable

A custom type or function that will be passed the direct response.

Returns

SparkStatementCollection, or the result of cls(response)

Return type

SparkStatementCollection

Exceptions

reset_spark_session_timeout

Sends a keep-alive call to the current session to reset the session timeout.

async reset_spark_session_timeout(session_id: int, **kwargs: Any) -> None

Parameters

session_id
int
Required

Identifier for the session.

cls
callable

A custom type or function that will be passed the direct response.

Returns

None, or the result of cls(response)

Return type

None

Exceptions
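A long-lived session can be kept from timing out by calling reset_spark_session_timeout periodically from a background task. A sketch of that pattern; an AsyncMock stands in for the operation group, and the interval and beat count are arbitrary illustrations, not documented values.

```python
import asyncio
from unittest.mock import AsyncMock

ops = AsyncMock()

async def keep_alive(session_id: int, interval: float, beats: int):
    # Real code would loop until cancelled; a fixed beat count keeps
    # this sketch finite.
    for _ in range(beats):
        await ops.reset_spark_session_timeout(session_id)
        await asyncio.sleep(interval)

async def main():
    # Run the keep-alive as a task alongside the "real" work.
    task = asyncio.create_task(keep_alive(7, interval=0.01, beats=3))
    await task

asyncio.run(main())
print(ops.reset_spark_session_timeout.await_count)  # 3
```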

Attributes

models

models = <module 'azure.synapse.spark.models'>