Job Class

Information about a Job.

Variables are only populated by the server, and will be ignored when sending a request.

Inheritance
azure.mgmt.batchai.models._models_py3.ProxyResource
Job

Constructor

Job(
    *,
    scheduling_priority: Optional[Union[str, azure.mgmt.batchai.models._batch_ai_enums.JobPriority]] = None,
    cluster: Optional[azure.mgmt.batchai.models._models_py3.ResourceId] = None,
    mount_volumes: Optional[azure.mgmt.batchai.models._models_py3.MountVolumes] = None,
    node_count: Optional[int] = None,
    container_settings: Optional[azure.mgmt.batchai.models._models_py3.ContainerSettings] = None,
    tool_type: Optional[Union[str, azure.mgmt.batchai.models._batch_ai_enums.ToolType]] = None,
    cntk_settings: Optional[azure.mgmt.batchai.models._models_py3.CNTKsettings] = None,
    py_torch_settings: Optional[azure.mgmt.batchai.models._models_py3.PyTorchSettings] = None,
    tensor_flow_settings: Optional[azure.mgmt.batchai.models._models_py3.TensorFlowSettings] = None,
    caffe_settings: Optional[azure.mgmt.batchai.models._models_py3.CaffeSettings] = None,
    caffe2_settings: Optional[azure.mgmt.batchai.models._models_py3.Caffe2Settings] = None,
    chainer_settings: Optional[azure.mgmt.batchai.models._models_py3.ChainerSettings] = None,
    custom_toolkit_settings: Optional[azure.mgmt.batchai.models._models_py3.CustomToolkitSettings] = None,
    custom_mpi_settings: Optional[azure.mgmt.batchai.models._models_py3.CustomMpiSettings] = None,
    horovod_settings: Optional[azure.mgmt.batchai.models._models_py3.HorovodSettings] = None,
    job_preparation: Optional[azure.mgmt.batchai.models._models_py3.JobPreparation] = None,
    std_out_err_path_prefix: Optional[str] = None,
    input_directories: Optional[List[azure.mgmt.batchai.models._models_py3.InputDirectory]] = None,
    output_directories: Optional[List[azure.mgmt.batchai.models._models_py3.OutputDirectory]] = None,
    environment_variables: Optional[List[azure.mgmt.batchai.models._models_py3.EnvironmentVariable]] = None,
    secrets: Optional[List[azure.mgmt.batchai.models._models_py3.EnvironmentVariableWithSecretValue]] = None,
    constraints: Optional[azure.mgmt.batchai.models._models_py3.JobPropertiesConstraints] = None,
    execution_info: Optional[azure.mgmt.batchai.models._models_py3.JobPropertiesExecutionInfo] = None,
    **kwargs
)

Parameters

scheduling_priority
str or <xref:batch_ai.models.JobPriority>
Optional

Scheduling priority associated with the job. Possible values include: "low", "normal", "high".

cluster
<xref:batch_ai.models.ResourceId>
Optional

Resource ID of the cluster associated with the job.

mount_volumes
<xref:batch_ai.models.MountVolumes>
Optional

Collection of mount volumes available to the job during execution. These volumes are mounted before the job execution starts and unmounted after the job completes. The volumes are mounted at the location specified by the $AZ_BATCHAI_JOB_MOUNT_ROOT environment variable.

node_count
int
Optional

Number of compute nodes to run the job on. The job will be gang scheduled on that many compute nodes.

container_settings
<xref:batch_ai.models.ContainerSettings>
Optional

Docker container settings for the job. If the container image was downloaded as part of cluster setup, the same image will be used. If not provided, the job will run directly on the VM.

tool_type
str or <xref:batch_ai.models.ToolType>
Optional

The toolkit type of the job. Possible values include: "cntk", "tensorflow", "caffe", "caffe2", "chainer", "pytorch", "horovod", "custommpi", "custom".

cntk_settings
<xref:batch_ai.models.CNTKsettings>
Optional

CNTK (aka Microsoft Cognitive Toolkit) job settings.

py_torch_settings
<xref:batch_ai.models.PyTorchSettings>
Optional

PyTorch job settings.

tensor_flow_settings
<xref:batch_ai.models.TensorFlowSettings>
Optional

TensorFlow job settings.

caffe_settings
<xref:batch_ai.models.CaffeSettings>
Optional

Caffe job settings.

caffe2_settings
<xref:batch_ai.models.Caffe2Settings>
Optional

Caffe2 job settings.

chainer_settings
<xref:batch_ai.models.ChainerSettings>
Optional

Chainer job settings.

custom_toolkit_settings
<xref:batch_ai.models.CustomToolkitSettings>
Optional

Custom toolkit job settings.

custom_mpi_settings
<xref:batch_ai.models.CustomMpiSettings>
Optional

Custom MPI job settings.

horovod_settings
<xref:batch_ai.models.HorovodSettings>
Optional

Horovod job settings.

job_preparation
<xref:batch_ai.models.JobPreparation>
Optional

Job preparation actions to perform before the job starts. The specified actions will run on all the nodes that are part of the job.

std_out_err_path_prefix
str
Optional

The path where the Batch AI service stores stdout, stderr, and the execution log of the job.

input_directories
list[<xref:batch_ai.models.InputDirectory>]
Optional

A list of input directories for the job.

output_directories
list[<xref:batch_ai.models.OutputDirectory>]
Optional

A list of output directories for the job.

environment_variables
list[<xref:batch_ai.models.EnvironmentVariable>]
Optional

A collection of user-defined environment variables to be set up for the job.

secrets
list[<xref:batch_ai.models.EnvironmentVariableWithSecretValue>]
Optional

A collection of user-defined environment variables with secret values to be set up for the job. The server will never report the values of these variables back.

constraints
<xref:batch_ai.models.JobPropertiesConstraints>
Optional

Constraints associated with the Job.

execution_info
<xref:batch_ai.models.JobPropertiesExecutionInfo>
Optional

Information about the execution of a job.

Variables

id
str

The ID of the resource.

name
str

The name of the resource.

type
str

The type of the resource.

job_output_directory_path_segment
str

A segment of the job's output directories path created by Batch AI. Batch AI creates the job's output directories under a unique path to avoid conflicts between jobs. This value contains a path segment generated by Batch AI to make the path unique; it can be used to find the output directory on the node or on the mounted filesystem.

creation_time
datetime

The creation time of the job.

provisioning_state
str or <xref:batch_ai.models.ProvisioningState>

The provisioning state of the Batch AI job. Possible values include: "creating", "succeeded", "failed", "deleting".

provisioning_state_transition_time
datetime

The time at which the job entered its current provisioning state.

execution_state
str or <xref:batch_ai.models.ExecutionState>

The current state of the job. Possible values are:

  • queued - The job is queued and able to run. A job enters this state when it is created, or when it is awaiting a retry after a failed run.
  • running - The job is running on a compute cluster. This includes job-level preparation such as downloading resource files or setting up the container specified on the job; it does not necessarily mean that the job command line has started executing.
  • terminating - The job is being terminated by the user; the terminate operation is in progress.
  • succeeded - The job completed running successfully and exited with exit code 0.
  • failed - The job finished unsuccessfully (failed with a non-zero exit code) and has exhausted its retry limit. A job is also marked as failed if an error occurred launching the job.

Possible values include: "queued", "running", "terminating", "succeeded", "failed".
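Since "succeeded" and "failed" are the only terminal values, client code often polls a fetched job until one of them is reached. A minimal sketch; the helper name is ours, not part of the SDK:

```python
# Terminal execution states, per the list above.
TERMINAL_STATES = {"succeeded", "failed"}

def is_finished(job) -> bool:
    """Return True once a server-fetched Job has stopped running.

    execution_state is populated by the server, so it is None (and this
    returns False) on a locally constructed Job model. Membership works
    for both plain strings and str-based enum members.
    """
    return job.execution_state in TERMINAL_STATES
```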
execution_state_transition_time
datetime

The time at which the job entered its current execution state.