Rebuild a Studio (classic) experiment in Azure Machine Learning

Important

Support for Machine Learning Studio (classic) will end on 31 August 2024. We recommend you transition to Azure Machine Learning by that date.

Beginning 1 December 2021, you will not be able to create new Machine Learning Studio (classic) resources. Through 31 August 2024, you can continue to use the existing Machine Learning Studio (classic) resources.

ML Studio (classic) documentation is being retired and may not be updated in the future.

In this article, you learn how to rebuild an ML Studio (classic) experiment in Azure Machine Learning. For more information on migrating from Studio (classic), see the migration overview article.

Studio (classic) experiments are similar to pipelines in Azure Machine Learning. However, in Azure Machine Learning, pipelines are built on the same back end that powers the SDK. This means you have two options for machine learning development: the drag-and-drop designer or the code-first SDKs.

For more information on building pipelines with the SDK, see What are Azure Machine Learning pipelines?
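If you choose the code-first route, you define the pipeline with the Azure Machine Learning Python SDK (v1). The following is a minimal sketch, assuming a workspace config.json, a training script train.py in a ./src folder, and an existing compute cluster named cpu-cluster; all of these names are placeholders for your own assets.

```python
# Minimal code-first pipeline sketch (Azure Machine Learning Python SDK v1).
# "train.py", "./src", and "cpu-cluster" are placeholders for your own assets.
from azureml.core import Workspace
from azureml.pipeline.core import Pipeline
from azureml.pipeline.steps import PythonScriptStep

ws = Workspace.from_config()  # reads the config.json downloaded from the portal

train_step = PythonScriptStep(
    name="train-model",
    script_name="train.py",
    source_directory="./src",
    compute_target="cpu-cluster",  # name of an existing compute target
    allow_reuse=True,
)

pipeline = Pipeline(workspace=ws, steps=[train_step])
pipeline.validate()  # checks the graph before you submit it
```

The rest of this article uses the designer; the SDK objects above are referenced again in the later code sketches.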

Prerequisites

  • An Azure Machine Learning workspace.
  • A Studio (classic) dataset that you migrated to Azure Machine Learning. For more information, see Migrate dataset.
Rebuild the pipeline

After you migrate your dataset to Azure Machine Learning, you're ready to recreate your experiment.
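If you registered the migrated dataset in your workspace, you can also retrieve it from code by its registered name. This is a small sketch with the SDK (v1); the name my-migrated-dataset is a placeholder.

```python
# Retrieve a registered (migrated) dataset by name - SDK v1 sketch.
# "my-migrated-dataset" is a placeholder for your dataset's registered name.
from azureml.core import Workspace, Dataset

ws = Workspace.from_config()
dataset = Dataset.get_by_name(ws, name="my-migrated-dataset")
print(dataset.name, dataset.version)
```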

In Azure Machine Learning, the visual graph is called a pipeline draft. In this section, you recreate your classic experiment as a pipeline draft.

  1. Go to Azure Machine Learning studio (ml.azure.com).

  2. In the left navigation pane, select Designer > Easy-to-use prebuilt modules.

  3. Manually rebuild your experiment with designer modules.

    Consult the module-mapping table to find replacement modules. Many of the most popular Studio (classic) modules have identical versions in the designer.

    Important

    If your experiment uses the Execute R Script module, you need to perform additional steps to migrate your experiment. For more information, see Migrate R Script modules.

  4. Adjust parameters.

    Select each module and adjust the parameters in the module settings panel to the right. Use the parameters to recreate the functionality of your Studio (classic) experiment. For more information on each module, see the module reference.

Submit a run and check results

After you recreate your Studio (classic) experiment, it's time to submit a pipeline run.

A pipeline run executes on a compute target attached to your workspace. You can set a default compute target for the entire pipeline, or you can specify compute targets on a per-module basis.

Once you submit a run from a pipeline draft, it turns into a pipeline run. Each pipeline run is recorded and logged in Azure Machine Learning.

To set a default compute target for the entire pipeline:

  1. Select the Gear icon in the designer, next to the pipeline name.
  2. Select Select compute target.
  3. Select an existing compute target, or create a new one by following the on-screen instructions. (A code-first sketch for creating a compute cluster follows these steps.)
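If you prefer to manage compute from code, the following sketch creates (or reuses) an Azure Machine Learning compute cluster with the SDK (v1). The cluster name cpu-cluster and the VM size are placeholders; raising min_nodes above 0 keeps nodes allocated and avoids the cold-start delay described later in this section.

```python
# Create or reuse an AmlCompute cluster - SDK v1 sketch.
# "cpu-cluster" and the VM size are placeholders for your own choices.
from azureml.core import Workspace
from azureml.core.compute import AmlCompute, ComputeTarget
from azureml.core.compute_target import ComputeTargetException

ws = Workspace.from_config()
cluster_name = "cpu-cluster"

try:
    compute_target = ComputeTarget(workspace=ws, name=cluster_name)
    print("Found existing compute target.")
except ComputeTargetException:
    config = AmlCompute.provisioning_configuration(
        vm_size="STANDARD_DS3_V2",
        min_nodes=0,  # set to 1 or greater to keep nodes allocated between runs
        max_nodes=4,
    )
    compute_target = ComputeTarget.create(ws, cluster_name, config)
    compute_target.wait_for_completion(show_output=True)
```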

Now that your compute target is set, you can submit a pipeline run:

  1. At the top of the canvas, select Submit.

  2. Select Create new to create a new experiment.

    Experiments organize similar pipeline runs together. If you run a pipeline multiple times, you can select the same experiment for successive runs. This is useful for logging and tracking.

  3. Enter an experiment name. Then, select Submit.

    The first run may take up to 20 minutes. Because the default compute settings have a minimum node size of 0, the designer must allocate resources after the compute has been idle. Successive runs take less time because the nodes are already allocated. To speed up the run time, create a compute resource with a minimum node size of 1 or greater. (A code-first sketch of the same submission follows these steps.)
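For reference, this is roughly what the same submission looks like in code with the SDK (v1). It reuses the pipeline object from the earlier sketch, and the experiment name is a placeholder; submitting successive runs under the same experiment name groups them together.

```python
# Submit the pipeline under an experiment name - SDK v1 sketch.
# "pipeline" is the Pipeline object from the earlier sketch; the experiment
# name is a placeholder. Reusing the same name groups successive runs.
from azureml.core import Experiment, Workspace

ws = Workspace.from_config()
experiment = Experiment(workspace=ws, name="rebuilt-classic-experiment")
pipeline_run = experiment.submit(pipeline)
pipeline_run.wait_for_completion(show_output=True)
```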

After the run finishes, you can check the results of each module in the designer (or retrieve them from code, as shown in the sketch after this list):

  1. Right-click the module whose output you want to see.

  2. Select either Visualize, View Output, or View Log.

    • Visualize: Preview the results dataset.
    • View Output: Open a link to the output storage location. Use this to explore or download the output.
    • View Log: View driver and system logs. Use the 70_driver_log to see information related to your user-submitted script, such as errors and exceptions.
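If you want to pull results down from code instead, the sketch below shows one way to download a step's output and logs from a completed pipeline run with the SDK (v1). The step name train-model and the output port name output are placeholders, and pipeline_run is the object returned by experiment.submit() in the earlier sketch.

```python
# Download a step's output and logs from a completed pipeline run - SDK v1 sketch.
# "train-model" and "output" are placeholder step and port names; pipeline_run
# comes from the earlier submission sketch.
step_run = pipeline_run.find_step_run("train-model")[0]

# Download the files written to a named output port, if the step defines one.
port_data = step_run.get_output_data("output")
port_data.download(local_path="./outputs")

# Download all log files, including the driver log, for troubleshooting.
step_run.get_all_logs(destination="./logs")
```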

Important

Designer modules use open-source Python packages, whereas Studio (classic) modules use C# packages. As a result, module output may vary slightly between the designer and Studio (classic).

Next steps

In this article, you learned how to rebuild a Studio (classic) experiment in Azure Machine Learning. The next step is to rebuild web services in Azure Machine Learning.

See the other articles in the Studio (classic) migration series:

  1. Migration overview.
  2. Migrate dataset.
  3. Rebuild a Studio (classic) training pipeline.
  4. Rebuild a Studio (classic) web service.
  5. Integrate an Azure Machine Learning web service with client apps.
  6. Migrate Execute R Script.