az dt job import

Note

This reference is part of the azure-iot extension for the Azure CLI (version 2.30.0 or higher). The extension will automatically install the first time you run an az dt job import command. Learn more about extensions.

Manage and configure jobs for importing model, twin, and relationship data to a digital twins instance.

Commands

az dt job import create

Create and execute a data import job on a digital twins instance.

az dt job import delete

Delete a data import job executed on a digital twins instance.

az dt job import list

List all data import jobs executed on a digital twins instance.

az dt job import show

Show details of a data import job executed on a digital twins instance.

az dt job import create

Create and execute a data import job on a digital twins instance.

The command requires an import data file (in .ndjson format) to be present in the input blob container. Additionally, the digital twins instance must have the 'Storage Blob Data Contributor' role assigned on the input storage account. Once the job completes, an output file containing the job's logs and errors will be created.
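The role assignment prerequisite can be satisfied with standard Azure CLI commands. The sketch below assumes hypothetical resource names (myAdtInstance, mystorageacct, myRg) and a digital twins instance with a system-assigned managed identity; adapt it to your environment.

```shell
# Hypothetical names; assumes the instance has a system-assigned identity.
principalId=$(az dt show -n myAdtInstance --query "identity.principalId" -o tsv)
storageId=$(az storage account show -n mystorageacct -g myRg --query id -o tsv)

# Grant the instance read/write access to blobs in the input storage account.
az role assignment create --assignee "$principalId" \
    --role "Storage Blob Data Contributor" \
    --scope "$storageId"
```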

az dt job import create --data-file
                        --dt-name
                        --ibc
                        --input-storage-account
                        [--job-id]
                        [--obc]
                        [--of]
                        [--osa]
                        [--resource-group]

Examples

Creates a job for importing a data file stored in an Azure Storage container. The import job's output file is created in the input file's blob container.

az dt job import create -n {instance_or_hostname} --data-file {data_file_name} --input-blob-container {input_blob_container_name} --input-storage-account {input_storage_account_name} --output-file {output_file_name}

Creates a job for importing a data file stored in an Azure Storage container. The import job's output file is created in a user-defined storage account and blob container.

az dt job import create -n {instance_or_hostname} --data-file {data_file_name} --input-blob-container {input_blob_container_name} --input-storage-account {input_storage_account_name} --output-file {output_file_name} --output-blob-container {output_blob_container_name} --output-storage-account {output_storage_account_name}

Required Parameters

--data-file --df

Name of the data file input to the bulk import job. The file must be in 'ndjson' format. Sample input data file: https://github.com/Azure/azure-iot-cli-extension/tree/dev/docs/samples/adt-bulk-import-data-sample.ndjson.
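As an illustrative sketch only (the linked sample file is the authoritative format), an ndjson input file is a sequence of JSON lines grouped into sections. The model ID, twin IDs, and relationship below are hypothetical placeholders:

```shell
# Write a minimal, hypothetical bulk-import data file: one line per JSON
# object, with "Section" marker lines separating header, models, twins,
# and relationships.
cat > bulk-import-data.ndjson <<'EOF'
{"Section": "Header"}
{"fileVersion": "1.0.0", "author": "contoso", "organization": "contoso"}
{"Section": "Models"}
{"@id": "dtmi:com:example:Room;1", "@type": "Interface", "@context": "dtmi:dtdl:context;2", "contents": [{"@type": "Relationship", "name": "contains"}]}
{"Section": "Twins"}
{"$dtId": "room1", "$metadata": {"$model": "dtmi:com:example:Room;1"}}
{"$dtId": "room2", "$metadata": {"$model": "dtmi:com:example:Room;1"}}
{"Section": "Relationships"}
{"$dtId": "room1", "$relationshipId": "rel1", "$targetId": "room2", "$relationshipName": "contains"}
EOF
```

This file, once uploaded to the input blob container, is what `--data-file` refers to by name.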

--dt-name --dtn -n

Digital Twins instance name or hostname. If an instance name is provided, the user subscription is first queried for the target instance to retrieve the hostname. If a hostname is provided, the subscription query is skipped and the provided value is used for subsequent interaction.

--ibc --input-blob-container

Name of Azure Storage blob container which contains the bulk import data file.

--input-storage-account --isa

Name of Azure Storage account containing blob container which stores the bulk import data file.

Optional Parameters

--job-id -j

ID of bulk import job. A system generated ID is assigned when this parameter is omitted during job creation.

--obc --output-blob-container

Name of Azure Storage blob container where the bulk import job's output file will be created. If not provided, the input blob container is used.

--of --output-file

Name of the bulk import job's output file. This file will contain logs as well as error information. It is created automatically once the job finishes and overwritten if it already exists. If not provided, the output file is created with the name: <job_id>_output.txt.

--osa --output-storage-account

Name of Azure Storage account containing the blob container where the bulk import job's output file will be created. If not provided, the input storage account is used.

--resource-group -g

Digital Twins instance resource group. You can configure the default group using az configure --defaults group=<name>.

az dt job import delete

Delete a data import job executed on a digital twins instance.

az dt job import delete --dt-name
                        --job-id
                        [--resource-group]
                        [--yes]

Examples

Delete a data import job by job id.

az dt job import delete -n {instance_or_hostname} -j {job_id}

Required Parameters

--dt-name --dtn -n

Digital Twins instance name or hostname. If an instance name is provided, the user subscription is first queried for the target instance to retrieve the hostname. If a hostname is provided, the subscription query is skipped and the provided value is used for subsequent interaction.

--job-id -j

ID of bulk import job. A system generated ID is assigned when this parameter is omitted during job creation.

Optional Parameters

--resource-group -g

Digital Twins instance resource group. You can configure the default group using az configure --defaults group=<name>.

--yes -y

Do not prompt for confirmation.

az dt job import list

List all data import jobs executed on a digital twins instance.

az dt job import list --dt-name
                      [--resource-group]

Examples

List all data import jobs on a target digital twins instance.

az dt job import list -n {instance_or_hostname}
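The list output can be trimmed with the standard `--query` JMESPath argument common to all Azure CLI commands. The property names below (`id`, `status`) are an assumption about the job object's shape; inspect the raw JSON output first to confirm them:

```shell
# Hypothetical sketch: tabulate job IDs and statuses, assuming the job
# objects expose 'id' and 'status' properties.
az dt job import list -n {instance_or_hostname} \
    --query "[].{id:id, status:status}" -o table
```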

Required Parameters

--dt-name --dtn -n

Digital Twins instance name or hostname. If an instance name is provided, the user subscription is first queried for the target instance to retrieve the hostname. If a hostname is provided, the subscription query is skipped and the provided value is used for subsequent interaction.

Optional Parameters

--resource-group -g

Digital Twins instance resource group. You can configure the default group using az configure --defaults group=<name>.

az dt job import show

Show details of a data import job executed on a digital twins instance.

az dt job import show --dt-name
                      --job-id
                      [--resource-group]

Examples

Show details of a data import job by job id.

az dt job import show -n {instance_or_hostname} -j {job_id}
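Since import jobs run asynchronously, a common pattern is polling `show` until the job reaches a terminal state. This is a sketch, not an official idiom; it assumes the job object exposes a `status` property and that the terminal values include the ones matched below:

```shell
# Hypothetical polling loop; status property and terminal values assumed.
while true; do
    status=$(az dt job import show -n {instance_or_hostname} -j {job_id} \
        --query status -o tsv)
    echo "Job status: $status"
    case "$status" in
        succeeded|failed|cancelled) break ;;
    esac
    sleep 10
done
```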

Required Parameters

--dt-name --dtn -n

Digital Twins instance name or hostname. If an instance name is provided, the user subscription is first queried for the target instance to retrieve the hostname. If a hostname is provided, the subscription query is skipped and the provided value is used for subsequent interaction.

--job-id -j

ID of bulk import job. A system generated ID is assigned when this parameter is omitted during job creation.

Optional Parameters

--resource-group -g

Digital Twins instance resource group. You can configure the default group using az configure --defaults group=<name>.