MimicExplainer class

Definition

MimicExplainer(model, initialization_examples, explainable_model, explainable_model_args=None, is_function=False, augment_data=True, max_num_of_augmentations=10, explain_subset=None, features=None, classes=None, transformations=None, allow_all_transformations=False, shap_values_output=<ShapValuesOutput.DEFAULT: 'default'>, categorical_features=None, model_task=<ModelTask.Unknown: 'unknown'>, reset_index=False, **kwargs)
Inheritance
interpret_community.common.base_explainer.BaseExplainer
interpret_community.common.blackbox_explainer.BlackBoxExplainer
MimicExplainer
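MimicExplainer works by training an interpretable surrogate model (the explainable_model) to mimic the blackbox model's predictions, then reading explanations off the surrogate. The core idea can be sketched with scikit-learn alone; this is a hand-rolled illustration of the technique, not the library's implementation, and the model choices here are arbitrary:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier

# A blackbox model we want to explain.
X, y = make_classification(n_samples=500, n_features=6, random_state=0)
blackbox = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Surrogate: an interpretable model trained to mimic the blackbox's
# predictions (its outputs, not the ground-truth labels).
surrogate = DecisionTreeClassifier(max_depth=4, random_state=0)
surrogate.fit(X, blackbox.predict(X))

# Fidelity: how often the surrogate agrees with the blackbox.
fidelity = (surrogate.predict(X) == blackbox.predict(X)).mean()

# Global feature importances read off the surrogate, in the spirit of
# explain_global() when no evaluation_examples are given.
importances = surrogate.feature_importances_
```

Everything downstream (explain_global, explain_local) queries the trained surrogate rather than the blackbox itself, which is what makes the explainer model-agnostic.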

Methods

explain_global(evaluation_examples=None, include_local=True, batch_size=100)

Globally explains the blackbox model using the surrogate model.

If evaluation_examples are unspecified, retrieves global feature importances from the explainable surrogate model. Note that this will not include per-class feature importances. If evaluation_examples are specified, aggregates the local explanations of the given evaluation_examples into a global explanation, which yields both global and per-class feature importances.

explain_local(evaluation_examples)

Locally explains the blackbox model using the surrogate model.

explain_global(evaluation_examples=None, include_local=True, batch_size=100)

Globally explains the blackbox model using the surrogate model.

If evaluation_examples are unspecified, retrieves global feature importances from the explainable surrogate model. Note that this will not include per-class feature importances. If evaluation_examples are specified, aggregates the local explanations of the given evaluation_examples into a global explanation, which yields both global and per-class feature importances.

explain_global(evaluation_examples=None, include_local=True, batch_size=100)

Parameters

evaluation_examples
numpy.ndarray or pandas.DataFrame or scipy.sparse.csr_matrix

A matrix of feature vector examples (# examples x # features) on which to explain the model's output. If specified, computes feature importances through aggregation.

default value: None
include_local
bool

Whether to include the local explanations in the returned global explanation. If evaluation_examples are specified and include_local is False, the local explanations are streamed in batches and aggregated into the global explanation.

default value: True
batch_size
int

If include_local is False, specifies the batch size for aggregating local explanations to global.

default value: 100

Returns

A model explanation object. It is guaranteed to be a GlobalExplanation. If evaluation_examples are passed in, it will also have the properties of a LocalExplanation. If the model is a classifier (has predict_proba), it will have the properties of ClassesMixin, and if evaluation_examples were passed in it will also have the properties of PerClassMixin.

Return type

DynamicGlobalExplanation
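When evaluation_examples are supplied, the global importances come from aggregating per-example local importances; with include_local=False the aggregation is done incrementally in chunks of batch_size. A minimal numpy sketch of that aggregation, using mean absolute importance as the aggregate (illustrative only, with made-up data; not the library's code):

```python
import numpy as np

rng = np.random.default_rng(0)
# Pretend local importances: one row per example, one column per feature.
local_importances = rng.normal(size=(1000, 5))

# Global importance per feature: mean of absolute local importances.
global_full = np.abs(local_importances).mean(axis=0)

# Streaming variant: aggregate batch by batch without holding all local
# explanations at once, as with include_local=False and a batch_size.
batch_size = 100
total = np.zeros(5)
n = 0
for start in range(0, len(local_importances), batch_size):
    batch = local_importances[start:start + batch_size]
    total += np.abs(batch).sum(axis=0)
    n += len(batch)
global_streamed = total / n  # identical result, bounded memory
```

The streamed and full aggregations agree exactly; the batch_size parameter only trades memory for the number of passes over the surrogate.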

explain_local(evaluation_examples)

Locally explains the blackbox model using the surrogate model.

explain_local(evaluation_examples)

Parameters

evaluation_examples
numpy.ndarray or pandas.DataFrame or scipy.sparse.csr_matrix

A matrix of feature vector examples (# examples x # features) on which to explain the model's output.

Returns

A model explanation object. It is guaranteed to be a LocalExplanation. If the model is a classifier, it will have the properties of the ClassesMixin.

Return type

DynamicLocalExplanation
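For a local explanation, the surrogate is evaluated at the given examples and its structure attributed back to the features. With a linear surrogate, for instance, the per-example importance of feature j is simply coef_j * x_ij. A toy sketch of that idea, with a made-up blackbox function (not the library's code):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))

def blackbox_predict(X):
    # Stand-in for an opaque model's prediction function.
    return 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.5 * X[:, 2]

# Fit a linear surrogate to the blackbox's outputs.
surrogate = LinearRegression().fit(X, blackbox_predict(X))

# Local importances for evaluation examples: per-feature contribution
# coef_j * x_ij for each example (# examples x # features).
X_eval = rng.normal(size=(4, 3))
local_importances = surrogate.coef_ * X_eval
```

Because the toy blackbox is exactly linear here, the surrogate recovers its coefficients and the attributions sum (plus intercept) to the surrogate's prediction for each example.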

Attributes

available_explanations

available_explanations = ['global', 'local']

explainer_type

explainer_type = 'blackbox'