ModelProfile class

Definition

Contains the results of a profiling run.

A model profile is a resource requirement recommendation for a model. A ModelProfile object is returned from the profile(workspace, profile_name, models, inference_config, input_dataset, cpu=None, memory_in_gb=None, description=None) method of the Model class.

ModelProfile(workspace, name)
Inheritance
azureml.core.profile._ModelEvaluationResultBase
ModelProfile

Parameters

workspace
Workspace

The workspace object containing the model.

name
str

The name of the profile to create and retrieve.

Remarks

The following example shows how to return a ModelProfile object.


   profile = Model.profile(ws, "profilename", [model], inference_config, input_dataset=dataset)
   profile.wait_for_completion(show_output=True)
   profiling_details = profile.get_details()
   print(profiling_details)

Methods

get_details()

Get the details of the profiling result.

Returns the observed metrics (various latency percentiles, maximum utilized CPU and memory, and so on) and the recommended resource requirements if the profiling run succeeded.

serialize()

Convert this Profile into a JSON serialized dictionary.

wait_for_completion(show_output=False)

Wait for the model to finish profiling.

get_details()

Get the details of the profiling result.

Returns the observed metrics (various latency percentiles, maximum utilized CPU and memory, and so on) and the recommended resource requirements if the profiling run succeeded.

get_details()

Returns

A dictionary of the observed metrics and recommended resource requirements.

Return type

dict
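The dictionary returned by get_details() can be inspected like any other. A minimal sketch of reading it follows; the key names used here (requestedCpu, requestedMemoryInGB, averageLatencyInMs) are assumptions for illustration, not a guaranteed schema:

```python
# Illustrative stand-in for the dictionary returned by get_details().
# The key names below are assumptions, not a documented schema.
profiling_details = {
    "requestedCpu": 1.0,
    "requestedMemoryInGB": 0.5,
    "maxUtilizedCpu": 0.8,
    "maxUtilizedMemoryInGB": 0.35,
    "averageLatencyInMs": 42.1,
}

def summarize_profile(details):
    """Pick out the fields most useful for sizing a deployment."""
    return {
        "cpu": details.get("requestedCpu"),
        "memory_in_gb": details.get("requestedMemoryInGB"),
        "avg_latency_ms": details.get("averageLatencyInMs"),
    }

print(summarize_profile(profiling_details))
```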

serialize()

Convert this Profile into a JSON serialized dictionary.

serialize()

Returns

The JSON representation of this Profile.

Return type

dict
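Because serialize() produces a JSON-serializable dictionary, the result can be persisted or logged with the standard json module. A short sketch with a hypothetical payload:

```python
import json

# Hypothetical stand-in for the dictionary produced by ModelProfile.serialize().
serialized = {"name": "profilename", "state": "Succeeded"}

# A JSON-serializable dict round-trips cleanly through json.dumps/json.loads,
# so it can be written straight to a file or a log.
payload = json.dumps(serialized, indent=2)
restored = json.loads(payload)
assert restored == serialized
```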

wait_for_completion(show_output=False)

Wait for the model to finish profiling.

wait_for_completion(show_output=False)

Parameters

show_output
bool
default value: False

Boolean option to print more verbose output. Defaults to False.
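Conceptually, wait_for_completion blocks the caller until the profiling run reaches a terminal state, optionally printing progress. The generic polling loop below sketches that behavior under stated assumptions; wait_for_state, the state names, and the callback are hypothetical and not part of the SDK:

```python
import itertools
import time

def wait_for_state(get_state, terminal_states=("Succeeded", "Failed"),
                   poll_interval=0.0, show_output=False):
    """Generic polling loop in the spirit of wait_for_completion():
    block until the reported state reaches a terminal value."""
    for _ in itertools.count():
        state = get_state()
        if show_output:
            print(f"current state: {state}")
        if state in terminal_states:
            return state
        time.sleep(poll_interval)

# Simulated profiling run that finishes on the third poll.
states = iter(["Queued", "Running", "Succeeded"])
final = wait_for_state(lambda: next(states), show_output=True)
print(final)  # → Succeeded
```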