ExperimentBase<TMetrics,TExperimentSettings>.Execute Method

Overloads

Execute(IDataView, ColumnInformation, IEstimator<ITransformer>, IProgress<RunDetail<TMetrics>>)

Executes an AutoML experiment.

Execute(IDataView, IDataView, ColumnInformation, IEstimator<ITransformer>, IProgress<RunDetail<TMetrics>>)

Executes an AutoML experiment.

Execute(IDataView, IDataView, String, IEstimator<ITransformer>, IProgress<RunDetail<TMetrics>>)

Executes an AutoML experiment.

Execute(IDataView, String, String, IEstimator<ITransformer>, IProgress<RunDetail<TMetrics>>)

Executes an AutoML experiment.

Execute(IDataView, UInt32, ColumnInformation, IEstimator<ITransformer>, IProgress<CrossValidationRunDetail<TMetrics>>)

Executes an AutoML experiment.

Execute(IDataView, UInt32, String, String, IEstimator<ITransformer>, IProgress<CrossValidationRunDetail<TMetrics>>)

Executes an AutoML experiment.

Execute(IDataView, ColumnInformation, IEstimator<ITransformer>, IProgress<RunDetail<TMetrics>>)

Executes an AutoML experiment.

public virtual Microsoft.ML.AutoML.ExperimentResult<TMetrics> Execute (Microsoft.ML.IDataView trainData, Microsoft.ML.AutoML.ColumnInformation columnInformation, Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> preFeaturizer = default, IProgress<Microsoft.ML.AutoML.RunDetail<TMetrics>> progressHandler = default);
abstract member Execute : Microsoft.ML.IDataView * Microsoft.ML.AutoML.ColumnInformation * Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> * IProgress<Microsoft.ML.AutoML.RunDetail<'Metrics>> -> Microsoft.ML.AutoML.ExperimentResult<'Metrics (requires 'Metrics : null)>
override this.Execute : Microsoft.ML.IDataView * Microsoft.ML.AutoML.ColumnInformation * Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> * IProgress<Microsoft.ML.AutoML.RunDetail<'Metrics>> -> Microsoft.ML.AutoML.ExperimentResult<'Metrics (requires 'Metrics : null)>
Public Overridable Function Execute (trainData As IDataView, columnInformation As ColumnInformation, Optional preFeaturizer As IEstimator(Of ITransformer) = Nothing, Optional progressHandler As IProgress(Of RunDetail(Of TMetrics)) = Nothing) As ExperimentResult(Of TMetrics)

Parameters

trainData
IDataView

The training data to be used by the AutoML experiment.

columnInformation
ColumnInformation

Column information for the dataset.

preFeaturizer
IEstimator<ITransformer>

Pre-featurizer that AutoML will apply to the data during an experiment. (The pre-featurizer is fit only on the training data split to produce a trained transform, which is then applied to both the training data split and the corresponding validation data split.)

progressHandler
IProgress<RunDetail<TMetrics>>

A user-defined object that implements the IProgress<T> interface. AutoML invokes its Report(T) method after each model it produces during the experiment.

Returns

The experiment result.

Remarks

Depending on the size of your data, the AutoML experiment could take a long time to execute.
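
The following is a minimal sketch, not taken from the reference itself, of calling this overload through a concrete regression experiment created with mlContext.Auto().CreateRegressionExperiment. The HouseData class, the house-prices.csv file, and all column names are hypothetical.

using System;
using Microsoft.ML;
using Microsoft.ML.AutoML;
using Microsoft.ML.Data;

var mlContext = new MLContext();
IDataView trainData = mlContext.Data.LoadFromTextFile<HouseData>(
    "house-prices.csv", hasHeader: true, separatorChar: ',');

// Describe the columns instead of relying on the default "Label" column name.
var columnInformation = new ColumnInformation { LabelColumnName = "Price" };
columnInformation.CategoricalColumnNames.Add("Neighborhood");

// Report each candidate model as AutoML produces it.
var progress = new Progress<RunDetail<RegressionMetrics>>(run =>
    Console.WriteLine($"{run.TrainerName}  R^2: {run.ValidationMetrics?.RSquared:F4}"));

var experiment = mlContext.Auto().CreateRegressionExperiment(maxExperimentTimeInSeconds: 60);
ExperimentResult<RegressionMetrics> result =
    experiment.Execute(trainData, columnInformation, progressHandler: progress);

Console.WriteLine($"Best trainer: {result.BestRun.TrainerName}");

// Hypothetical input schema for house-prices.csv.
public class HouseData
{
    [LoadColumn(0)] public float Price;
    [LoadColumn(1)] public string Neighborhood;
    [LoadColumn(2)] public float Size;
}

Because this overload takes no validation set, AutoML handles validation splitting of the training data internally.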

Applies to

Execute(IDataView, IDataView, ColumnInformation, IEstimator<ITransformer>, IProgress<RunDetail<TMetrics>>)

Executes an AutoML experiment.

public virtual Microsoft.ML.AutoML.ExperimentResult<TMetrics> Execute (Microsoft.ML.IDataView trainData, Microsoft.ML.IDataView validationData, Microsoft.ML.AutoML.ColumnInformation columnInformation, Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> preFeaturizer = default, IProgress<Microsoft.ML.AutoML.RunDetail<TMetrics>> progressHandler = default);
abstract member Execute : Microsoft.ML.IDataView * Microsoft.ML.IDataView * Microsoft.ML.AutoML.ColumnInformation * Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> * IProgress<Microsoft.ML.AutoML.RunDetail<'Metrics>> -> Microsoft.ML.AutoML.ExperimentResult<'Metrics (requires 'Metrics : null)>
override this.Execute : Microsoft.ML.IDataView * Microsoft.ML.IDataView * Microsoft.ML.AutoML.ColumnInformation * Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> * IProgress<Microsoft.ML.AutoML.RunDetail<'Metrics>> -> Microsoft.ML.AutoML.ExperimentResult<'Metrics (requires 'Metrics : null)>
Public Overridable Function Execute (trainData As IDataView, validationData As IDataView, columnInformation As ColumnInformation, Optional preFeaturizer As IEstimator(Of ITransformer) = Nothing, Optional progressHandler As IProgress(Of RunDetail(Of TMetrics)) = Nothing) As ExperimentResult(Of TMetrics)

Parameters

trainData
IDataView

The training data to be used by the AutoML experiment.

validationData
IDataView

The validation data to be used by the AutoML experiment.

columnInformation
ColumnInformation

Column information for the dataset.

preFeaturizer
IEstimator<ITransformer>

Pre-featurizer that AutoML will apply to the data during an experiment. (The pre-featurizer is fit only on the training data split to produce a trained transform, which is then applied to both the training data split and the corresponding validation data split.)

progressHandler
IProgress<RunDetail<TMetrics>>

A user-defined object that implements the IProgress<T> interface. AutoML invokes its Report(T) method after each model it produces during the experiment.

Returns

The experiment result.

Remarks

Depending on the size of your data, the AutoML experiment could take a long time to execute.
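
As an illustrative sketch, reusing the hypothetical HouseData class, usings, and column names from the sketch under the first overload, an explicit validation set can be produced with TrainTestSplit and passed to this overload:

var mlContext = new MLContext();
IDataView allData = mlContext.Data.LoadFromTextFile<HouseData>(
    "house-prices.csv", hasHeader: true, separatorChar: ',');

// Hold out 20% of the rows as an explicit validation set.
DataOperationsCatalog.TrainTestData split = mlContext.Data.TrainTestSplit(allData, testFraction: 0.2);

var columnInformation = new ColumnInformation { LabelColumnName = "Price" };

var experiment = mlContext.Auto().CreateRegressionExperiment(maxExperimentTimeInSeconds: 60);
ExperimentResult<RegressionMetrics> result =
    experiment.Execute(split.TrainSet, split.TestSet, columnInformation);

Console.WriteLine($"Best R^2: {result.BestRun.ValidationMetrics.RSquared:F4}");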

Applies to

Execute(IDataView, IDataView, String, IEstimator<ITransformer>, IProgress<RunDetail<TMetrics>>)

Executes an AutoML experiment.

public virtual Microsoft.ML.AutoML.ExperimentResult<TMetrics> Execute (Microsoft.ML.IDataView trainData, Microsoft.ML.IDataView validationData, string labelColumnName = "Label", Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> preFeaturizer = default, IProgress<Microsoft.ML.AutoML.RunDetail<TMetrics>> progressHandler = default);
abstract member Execute : Microsoft.ML.IDataView * Microsoft.ML.IDataView * string * Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> * IProgress<Microsoft.ML.AutoML.RunDetail<'Metrics>> -> Microsoft.ML.AutoML.ExperimentResult<'Metrics (requires 'Metrics : null)>
override this.Execute : Microsoft.ML.IDataView * Microsoft.ML.IDataView * string * Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> * IProgress<Microsoft.ML.AutoML.RunDetail<'Metrics>> -> Microsoft.ML.AutoML.ExperimentResult<'Metrics (requires 'Metrics : null)>
Public Overridable Function Execute (trainData As IDataView, validationData As IDataView, Optional labelColumnName As String = "Label", Optional preFeaturizer As IEstimator(Of ITransformer) = Nothing, Optional progressHandler As IProgress(Of RunDetail(Of TMetrics)) = Nothing) As ExperimentResult(Of TMetrics)

Parameters

trainData
IDataView

The training data to be used by the AutoML experiment.

validationData
IDataView

The validation data to be used by the AutoML experiment.

labelColumnName
String

The name of the label column.

preFeaturizer
IEstimator<ITransformer>

Pre-featurizer that AutoML will apply to the data during an experiment. (The pre-featurizer is fit only on the training data split to produce a trained transform, which is then applied to both the training data split and the corresponding validation data split.)

progressHandler
IProgress<RunDetail<TMetrics>>

A user-defined object that implements the IProgress<T> interface. AutoML invokes its Report(T) method after each model it produces during the experiment.

Returns

The experiment result.

Remarks

Depending on the size of your data, the AutoML experiment could take a long time to execute.
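
A sketch of this overload under the same assumptions as the earlier examples (hypothetical HouseData class, usings, mlContext, and allData), this time also passing a pre-featurizer; ReplaceMissingValues is just one possible choice of estimator:

var split = mlContext.Data.TrainTestSplit(allData, testFraction: 0.2);

// The pre-featurizer is fit on the training split and then applied to both splits.
IEstimator<ITransformer> preFeaturizer = mlContext.Transforms.ReplaceMissingValues("Size");

var experiment = mlContext.Auto().CreateRegressionExperiment(maxExperimentTimeInSeconds: 60);
var result = experiment.Execute(split.TrainSet, split.TestSet,
    labelColumnName: "Price", preFeaturizer: preFeaturizer);

Console.WriteLine($"Best trainer: {result.BestRun.TrainerName}");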

Applies to

Execute(IDataView, String, String, IEstimator<ITransformer>, IProgress<RunDetail<TMetrics>>)

Executes an AutoML experiment.

public virtual Microsoft.ML.AutoML.ExperimentResult<TMetrics> Execute (Microsoft.ML.IDataView trainData, string labelColumnName = "Label", string samplingKeyColumn = default, Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> preFeaturizer = default, IProgress<Microsoft.ML.AutoML.RunDetail<TMetrics>> progressHandler = default);
abstract member Execute : Microsoft.ML.IDataView * string * string * Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> * IProgress<Microsoft.ML.AutoML.RunDetail<'Metrics>> -> Microsoft.ML.AutoML.ExperimentResult<'Metrics (requires 'Metrics : null)>
override this.Execute : Microsoft.ML.IDataView * string * string * Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> * IProgress<Microsoft.ML.AutoML.RunDetail<'Metrics>> -> Microsoft.ML.AutoML.ExperimentResult<'Metrics (requires 'Metrics : null)>
Public Overridable Function Execute (trainData As IDataView, Optional labelColumnName As String = "Label", Optional samplingKeyColumn As String = Nothing, Optional preFeaturizer As IEstimator(Of ITransformer) = Nothing, Optional progressHandler As IProgress(Of RunDetail(Of TMetrics)) = Nothing) As ExperimentResult(Of TMetrics)

Parameters

trainData
IDataView

The training data to be used by the AutoML experiment.

labelColumnName
String

The name of the label column.

samplingKeyColumn
String

The name of the sampling key column. See SamplingKeyColumnName for more information.

preFeaturizer
IEstimator<ITransformer>

Pre-featurizer that AutoML will apply to the data during an experiment. (The pre-featurizer is fit only on the training data split to produce a trained transform, which is then applied to both the training data split and the corresponding validation data split.)

progressHandler
IProgress<RunDetail<TMetrics>>

A user-defined object that implements the IProgress<T> interface. AutoML invokes its Report(T) method after each model it produces during the experiment.

Returns

The experiment result.

Remarks

Depending on the size of your data, the AutoML experiment could take a long time to execute.
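
A sketch of this overload under the same assumptions; the HouseholdId column is hypothetical and is shown only to illustrate how a sampling key keeps related rows together when AutoML splits the training data internally:

var experiment = mlContext.Auto().CreateRegressionExperiment(maxExperimentTimeInSeconds: 60);

// Rows that share a HouseholdId value end up in the same internal split.
var result = experiment.Execute(trainData, labelColumnName: "Price", samplingKeyColumn: "HouseholdId");

Console.WriteLine($"Best MAE: {result.BestRun.ValidationMetrics.MeanAbsoluteError:F2}");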

Applies to

Execute(IDataView, UInt32, ColumnInformation, IEstimator<ITransformer>, IProgress<CrossValidationRunDetail<TMetrics>>)

Executes an AutoML experiment.

public virtual Microsoft.ML.AutoML.CrossValidationExperimentResult<TMetrics> Execute (Microsoft.ML.IDataView trainData, uint numberOfCVFolds, Microsoft.ML.AutoML.ColumnInformation columnInformation = default, Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> preFeaturizer = default, IProgress<Microsoft.ML.AutoML.CrossValidationRunDetail<TMetrics>> progressHandler = default);
abstract member Execute : Microsoft.ML.IDataView * uint32 * Microsoft.ML.AutoML.ColumnInformation * Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> * IProgress<Microsoft.ML.AutoML.CrossValidationRunDetail<'Metrics>> -> Microsoft.ML.AutoML.CrossValidationExperimentResult<'Metrics (requires 'Metrics : null)>
override this.Execute : Microsoft.ML.IDataView * uint32 * Microsoft.ML.AutoML.ColumnInformation * Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> * IProgress<Microsoft.ML.AutoML.CrossValidationRunDetail<'Metrics>> -> Microsoft.ML.AutoML.CrossValidationExperimentResult<'Metrics (requires 'Metrics : null)>
Public Overridable Function Execute (trainData As IDataView, numberOfCVFolds As UInteger, Optional columnInformation As ColumnInformation = Nothing, Optional preFeaturizer As IEstimator(Of ITransformer) = Nothing, Optional progressHandler As IProgress(Of CrossValidationRunDetail(Of TMetrics)) = Nothing) As CrossValidationExperimentResult(Of TMetrics)

Parameters

trainData
IDataView

The training data to be used by the AutoML experiment.

numberOfCVFolds
UInt32

The number of cross-validation folds into which the training data should be divided when fitting a model.

columnInformation
ColumnInformation

Column information for the dataset.

preFeaturizer
IEstimator<ITransformer>

Pre-featurizer that AutoML will apply to the data during an experiment. (The pre-featurizer is fit only on the training data split to produce a trained transform, which is then applied to both the training data split and the corresponding validation data split.)

progressHandler
IProgress<CrossValidationRunDetail<TMetrics>>

A user-defined object that implements the IProgress<T> interface. AutoML invokes its Report(T) method after each model it produces during the experiment.

Returns

The cross-validation experiment result.

Remarks

Depending on the size of your data, the AutoML experiment could take a long time to execute.
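
A sketch of the cross-validation overload under the same assumptions (plus a using System.Linq directive for the average over folds):

var columnInformation = new ColumnInformation { LabelColumnName = "Price" };

var experiment = mlContext.Auto().CreateRegressionExperiment(maxExperimentTimeInSeconds: 60);
CrossValidationExperimentResult<RegressionMetrics> cvResult =
    experiment.Execute(trainData, 5, columnInformation);

// The best run carries one result per fold; average the per-fold metrics.
double averageRSquared = cvResult.BestRun.Results.Average(r => r.ValidationMetrics.RSquared);
Console.WriteLine($"{cvResult.BestRun.TrainerName} average R^2: {averageRSquared:F4}");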

Applies to

Execute(IDataView, UInt32, String, String, IEstimator<ITransformer>, IProgress<CrossValidationRunDetail<TMetrics>>)

Executes an AutoML experiment.

public virtual Microsoft.ML.AutoML.CrossValidationExperimentResult<TMetrics> Execute (Microsoft.ML.IDataView trainData, uint numberOfCVFolds, string labelColumnName = "Label", string samplingKeyColumn = default, Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> preFeaturizer = default, IProgress<Microsoft.ML.AutoML.CrossValidationRunDetail<TMetrics>> progressHandler = default);
abstract member Execute : Microsoft.ML.IDataView * uint32 * string * string * Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> * IProgress<Microsoft.ML.AutoML.CrossValidationRunDetail<'Metrics>> -> Microsoft.ML.AutoML.CrossValidationExperimentResult<'Metrics (requires 'Metrics : null)>
override this.Execute : Microsoft.ML.IDataView * uint32 * string * string * Microsoft.ML.IEstimator<Microsoft.ML.ITransformer> * IProgress<Microsoft.ML.AutoML.CrossValidationRunDetail<'Metrics>> -> Microsoft.ML.AutoML.CrossValidationExperimentResult<'Metrics (requires 'Metrics : null)>
Public Overridable Function Execute (trainData As IDataView, numberOfCVFolds As UInteger, Optional labelColumnName As String = "Label", Optional samplingKeyColumn As String = Nothing, Optional preFeaturizer As IEstimator(Of ITransformer) = Nothing, Optional progressHandler As IProgress(Of CrossValidationRunDetail(Of TMetrics)) = Nothing) As CrossValidationExperimentResult(Of TMetrics)

Parameters

trainData
IDataView

The training data to be used by the AutoML experiment.

numberOfCVFolds
UInt32

The number of cross-validation folds into which the training data should be divided when fitting a model.

labelColumnName
String

The name of the label column.

samplingKeyColumn
String

The name of the sampling key column.

preFeaturizer
IEstimator<ITransformer>

Pre-featurizer that AutoML will apply to the data during an experiment. (The pre-featurizer is fit only on the training data split to produce a trained transform, which is then applied to both the training data split and the corresponding validation data split.)

progressHandler
IProgress<CrossValidationRunDetail<TMetrics>>

A user-defined object that implements the IProgress<T> interface. AutoML invokes its Report(T) method after each model it produces during the experiment.

Returns

The cross-validation experiment result.

Remarks

Depending on the size of your data, the AutoML experiment could take a long time to execute.
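
Finally, a sketch of the string-based cross-validation overload, again with hypothetical column names and the same surrounding setup, reporting each candidate model through a progress handler:

var progress = new Progress<CrossValidationRunDetail<RegressionMetrics>>(run =>
    Console.WriteLine($"Trained across folds: {run.TrainerName}"));

var experiment = mlContext.Auto().CreateRegressionExperiment(maxExperimentTimeInSeconds: 60);
var cvResult = experiment.Execute(trainData, 5, labelColumnName: "Price",
    samplingKeyColumn: "HouseholdId", progressHandler: progress);

Console.WriteLine($"Best trainer: {cvResult.BestRun.TrainerName}");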

Applies to