MagnusEschment-2555 asked Jaceklusarczyk-3236 edited

Hyperparameter optimization in Azure AutoML

Which type of hyperparameter optimization does Azure Automated Machine Learning (the studio, not the SDK) use by default: grid search, random search, or Bayesian? In the SDK you can specify this, but in the Automated ML section you cannot, and there is no further information on it.


1 Answer

ramr-msft answered Jaceklusarczyk-3236 edited

@MagnusEschment-2555 Thanks for the question. Can you please add more details about what you are trying to do?
To clarify, what exactly are you optimizing? Are you tuning the hyperparameters of the ML model to maximize model accuracy? If so:
• A random forest is fairly lightweight, so you may be able to brute-force a grid search to find the best model.
• If that cost is too high, consider Bayesian (or similar) methods for tuning hyperparameters.
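To make the brute-force option concrete, here is a minimal grid-search sketch over a random forest using scikit-learn. This illustrates the general technique, not Azure AutoML's internal tuning; the dataset and parameter grid are made-up placeholders.

```python
# Illustrative grid search over a small random forest hyperparameter grid.
# Synthetic data and grid values are placeholders for illustration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

param_grid = {
    "n_estimators": [50, 100],
    "max_depth": [3, 5, None],
}
# Exhaustively evaluates every grid combination with 3-fold cross-validation
search = GridSearchCV(RandomForestClassifier(random_state=0), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)
```

Because the grid is exhaustive, cost grows multiplicatively with each added parameter, which is why Bayesian methods become attractive for heavier models.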
You can select the algorithm name of a completed model to explore its performance details. Please follow the documentation to explore models.
Set up AutoML with Python: the primary metric parameter determines the metric used for optimization during model training. Azure AutoML supports a specific list of primary metrics per ML task, which are defined in the docs: Set up AutoML with Python - Azure Machine Learning | Microsoft Docs.
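For reference, a minimal sketch of setting the primary metric through the Python SDK (v1) might look like the following; the dataset and column names are placeholders, not from this thread.

```python
# Sketch of an Azure ML SDK v1 AutoML configuration (placeholder values).
from azureml.train.automl import AutoMLConfig

automl_config = AutoMLConfig(
    task="classification",
    primary_metric="AUC_weighted",   # the metric AutoML optimizes during training
    training_data=train_dataset,     # an Azure ML TabularDataset (placeholder)
    label_column_name="label",       # placeholder label column
    n_cross_validations=5,
)
```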


Also, please follow the tutorial Train a classification model with no-code AutoML in the Azure Machine Learning studio.


Thanks for the quick response! I am talking about the "Automated ML" section of Azure Machine Learning, not the "Notebooks" section. In the "Notebooks" section you can specify the hyperparameter optimization technique, but not in the "Automated ML" section, right?

I have a classification problem and use accuracy and AUC weighted as primary metrics. Does the hyperparameter optimization technique also depend on the primary metric? Or does it depend on the model being used? In other words, might the techniques differ between XGBoostClassifier and LightGBM?

I have already looked at the details of each model/algorithm. You can see the hyperparameters there, but not the technique that was used.

ramr-msft replied to MagnusEschment-2555:

@MagnusEschment-2555 Thanks for the details. Regarding being able to provide specific algorithm hyperparameters when training a model: we are currently building a related feature, training code generation. Once AutoML has created or selected a specific model, you can generate its training code (featurization plus algorithm training code) for algorithms such as XGBoost, other scikit-learn algorithms, LightGBM, AutoArima/Prophet, etc. Once you have the code for that specific model, you can customize its hyperparameters and train again, independently of AutoML.
This feature is currently in development, but we will release an early private preview in the near future, so if you are interested in trying it and providing feedback, please let me know.
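The workflow described above (take the generated training code, edit the hyperparameters, retrain outside AutoML) can be sketched roughly as follows. This is a hypothetical illustration, not the actual generated code; scikit-learn's GradientBoostingClassifier stands in for XGBoost/LightGBM, and all data and hyperparameter values are placeholders.

```python
# Hypothetical sketch: retraining a boosted-tree model with customized
# hyperparameters, independently of AutoML. Placeholder data and values.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# Hyperparameters edited by hand, replacing whatever AutoML had chosen
model = GradientBoostingClassifier(
    n_estimators=200, learning_rate=0.05, max_depth=3, random_state=0
)
# Evaluate with the same kind of metric used as the AutoML primary metric
scores = cross_val_score(model, X, y, cv=3, scoring="roc_auc")
print(scores.mean())
```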
