probly.quantification.classification
Collection of uncertainty quantification measures for classification settings.
Functions

- aleatoric_uncertainty_distance(probs) – Compute the aleatoric uncertainty using samples from a second-order distribution.
- conditional_entropy(probs[, base]) – Compute conditional entropy as the aleatoric uncertainty.
- epistemic_uncertainty_distance(probs) – Compute the epistemic uncertainty using samples from a second-order distribution.
- evidential_uncertainty(evidences) – Compute the evidential uncertainty given the evidences based on [SKK18].
- expected_conditional_variance(probs) – Compute the aleatoric uncertainty using variance-based measures.
- expected_divergence(probs, loss_fn) – Compute the expected divergence to the mean of the second-order distribution.
- expected_entropy(probs, loss_fn) – Compute the expected entropy of the second-order distribution.
- expected_loss(probs, loss_fn) – Compute the expected loss of the second-order distribution.
- generalized_hartley(probs[, base]) – Compute the generalized Hartley measure as defined in [AbellanKM06].
- lower_entropy(probs[, base, n_jobs]) – Compute the lower entropy of an interval-based credal set.
- lower_entropy_convex_hull(probs[, base, n_jobs]) – Compute the lower entropy of a credal set given by its extreme points.
- mutual_information(probs[, base]) – Compute the mutual information as epistemic uncertainty.
- total_entropy(probs[, base]) – Compute the total entropy as the total uncertainty based on [DHernandezLobatoDoshiVelezU18].
- total_uncertainty_distance(probs) – Compute the total uncertainty using samples from a second-order distribution.
- total_variance(probs) – Compute the total uncertainty using variance-based measures.
- upper_entropy(probs[, base, n_jobs]) – Compute the upper entropy of an interval-based credal set.
- upper_entropy_convex_hull(probs[, base, n_jobs]) – Compute the upper entropy of a credal set given by its extreme points.
- variance_conditional_expectation(probs) – Compute the epistemic uncertainty using variance-based measures.
- probly.quantification.classification.aleatoric_uncertainty_distance(probs)[source]
Compute the aleatoric uncertainty using samples from a second-order distribution.
The measure of aleatoric uncertainty is from [SBCHullermeier24].
- Parameters:
probs (ndarray) – numpy.ndarray of shape (n_instances, n_samples, n_classes)
- Returns:
Aleatoric uncertainty values of shape (n_instances,).
- Return type:
numpy.ndarray
- probly.quantification.classification.conditional_entropy(probs, base=2)[source]
Compute conditional entropy as the aleatoric uncertainty.
The computation is based on samples from a second-order distribution.
- probly.quantification.classification.epistemic_uncertainty_distance(probs)[source]
Compute the epistemic uncertainty using samples from a second-order distribution.
The measure of epistemic uncertainty is from [SBCHullermeier24].
- Parameters:
probs (ndarray) – numpy.ndarray of shape (n_instances, n_samples, n_classes)
- Returns:
Epistemic uncertainty values of shape (n_instances,).
- Return type:
numpy.ndarray
- probly.quantification.classification.evidential_uncertainty(evidences)[source]
Compute the evidential uncertainty given the evidences based on [SKK18].
- Parameters:
evidences (ndarray) – Evidence values of shape (n_instances, n_classes).
- Returns:
Evidential uncertainty values of shape (n_instances,).
- Return type:
numpy.ndarray
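For intuition, the evidential uncertainty of [SKK18] corresponds to the Dirichlet vacuity u = K / S, where K is the number of classes and S = Σ_k (e_k + 1) is the total Dirichlet strength. A minimal sketch under that assumption (illustrative, not the library's implementation):

```python
import numpy as np

def evidential_uncertainty_sketch(evidences: np.ndarray) -> np.ndarray:
    """Dirichlet vacuity u = K / S with S = sum_k (e_k + 1)."""
    n_classes = evidences.shape[-1]
    strength = np.sum(evidences + 1.0, axis=-1)  # total Dirichlet strength S
    return n_classes / strength

# Zero evidence -> maximal uncertainty (1.0); strong evidence -> small uncertainty.
ev = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0]])
print(evidential_uncertainty_sketch(ev))
```

With no evidence the result is exactly 1; it decays toward 0 as total evidence grows.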
- probly.quantification.classification.expected_conditional_variance(probs)[source]
Compute the aleatoric uncertainty using variance-based measures.
The computation is based on samples from a second-order distribution.
- Parameters:
probs (ndarray) – numpy.ndarray of shape (n_instances, n_samples, n_classes)
- Returns:
Expected conditional variance values of shape (n_instances,).
- Return type:
numpy.ndarray
- probly.quantification.classification.expected_divergence(probs, loss_fn)[source]
Compute the expected divergence to the mean of the second-order distribution.
The computation is based on samples from a second-order distribution.
- Parameters:
probs (ndarray) – numpy.ndarray of shape (n_instances, n_samples, n_classes)
loss_fn (Callable[[np.ndarray, np.ndarray | None], np.ndarray]) – Loss function applied to each sampled distribution.
- Returns:
Expected divergence values of shape (n_instances,).
- Return type:
numpy.ndarray
- probly.quantification.classification.expected_entropy(probs, loss_fn)[source]
Compute the expected entropy of the second-order distribution.
The computation is based on samples from a second-order distribution.
- Parameters:
probs (ndarray) – numpy.ndarray of shape (n_instances, n_samples, n_classes)
loss_fn (Callable[[np.ndarray, np.ndarray | None], np.ndarray]) – Loss function applied to each sampled distribution.
- Returns:
Expected entropy values of shape (n_instances,).
- Return type:
numpy.ndarray
- probly.quantification.classification.expected_loss(probs, loss_fn)[source]
Compute the expected loss of the second-order distribution.
The computation is based on samples from a second-order distribution.
- Parameters:
probs (ndarray) – numpy.ndarray of shape (n_instances, n_samples, n_classes)
loss_fn (Callable[[np.ndarray, np.ndarray | None], np.ndarray]) – Loss function applied to each sampled distribution.
- Returns:
Expected loss values of shape (n_instances,).
- Return type:
numpy.ndarray
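The sampling-based pattern shared by expected_loss, expected_entropy, and expected_divergence can be sketched as: apply a per-distribution loss to each second-order sample, then average over the sample axis. The helper below is hypothetical (it uses a simplified one-argument loss, unlike the two-argument Callable in the signatures above) and is meant only to illustrate the Monte Carlo expectation:

```python
import numpy as np

def expected_loss_sketch(probs: np.ndarray, loss_fn) -> np.ndarray:
    """Average loss_fn over the n_samples axis of (n_instances, n_samples, n_classes)."""
    losses = loss_fn(probs)      # per-sample losses, shape (n_instances, n_samples)
    return losses.mean(axis=1)   # Monte Carlo expectation over samples

def shannon_entropy(p: np.ndarray, base: float = 2.0) -> np.ndarray:
    """Entropy along the last (class) axis; 0 * log(0) treated as 0."""
    logp = np.where(p > 0, np.log(p), 0.0) / np.log(base)
    return -(p * logp).sum(axis=-1)

probs = np.array([[[0.5, 0.5], [1.0, 0.0]]])  # 1 instance, 2 samples, 2 classes
print(expected_loss_sketch(probs, shannon_entropy))  # mean of H=1 and H=0, i.e. 0.5
```

With entropy as the loss, this sketch reduces to the expected-entropy measure.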
- probly.quantification.classification.generalized_hartley(probs, base=2)[source]
Compute the generalized Hartley measure as defined in [AbellanKM06].
The generalized Hartley measure is computed from the extreme points of a credal set.
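For intuition (a sketch, not the library's extreme-point-based implementation): given a mass assignment m over subsets A of the classes, the generalized Hartley measure is GH(m) = Σ_A m(A) · log_base |A|. It vanishes for a precise distribution (all mass on singletons) and is maximal under total ignorance (all mass on the full class set):

```python
import math

def generalized_hartley_sketch(masses: dict[frozenset, float], base: float = 2.0) -> float:
    """GH(m) = sum over focal sets A of m(A) * log_base(|A|)."""
    return sum(m * math.log(len(a), base) for a, m in masses.items() if len(a) > 0)

# Total ignorance over 4 classes: GH = log2(4) = 2 bits.
print(generalized_hartley_sketch({frozenset({0, 1, 2, 3}): 1.0}))  # → 2.0
# Precise distribution (singletons only): GH = 0.
print(generalized_hartley_sketch({frozenset({0}): 0.7, frozenset({1}): 0.3}))  # → 0.0
```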
- probly.quantification.classification.lower_entropy(probs, base=2, n_jobs=None)[source]
Compute the lower entropy of a credal set.
Given the probs array, lower and upper class probabilities are computed, and the credal set is taken to be the convex set of all probability distributions whose class probabilities lie in the interval [lower, upper]. The lower entropy of this set is computed.
- Parameters:
probs (ndarray) – numpy.ndarray of shape (n_instances, n_samples, n_classes)
base (int) – Base of the logarithm (default: 2).
n_jobs (int | None) – Number of parallel jobs (default: None).
- Returns:
Lower entropy values of shape (n_instances,).
- Return type:
numpy.ndarray
- probly.quantification.classification.lower_entropy_convex_hull(probs, base=2, n_jobs=None)[source]
Compute the lower entropy of a credal set.
Given probs, the convex hull defined by the extreme points in probs is considered. The lower entropy of this set is computed.
- Parameters:
probs (ndarray) – numpy.ndarray of shape (n_instances, n_samples, n_classes)
base (int) – Base of the logarithm (default: 2).
n_jobs (int | None) – Number of parallel jobs (default: None).
- Returns:
Lower entropy values of shape (n_instances,).
- Return type:
numpy.ndarray
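Because Shannon entropy is concave, its minimum over a convex hull is always attained at an extreme point, so the convex-hull lower entropy can be sketched as a per-instance minimum over the sample entropies (an illustration, not the library's implementation; the upper entropy has no such shortcut, since its maximizer may lie in the hull's interior and requires optimization):

```python
import numpy as np

def lower_entropy_hull_sketch(probs: np.ndarray, base: float = 2.0) -> np.ndarray:
    """Min entropy over the extreme points in probs, shape (n_instances, n_samples, n_classes)."""
    logp = np.where(probs > 0, np.log(probs), 0.0) / np.log(base)
    entropies = -(probs * logp).sum(axis=-1)  # shape (n_instances, n_samples)
    return entropies.min(axis=1)              # concavity: minimum sits at an extreme point

probs = np.array([[[0.5, 0.5], [0.9, 0.1]]])  # two extreme points of one credal set
print(lower_entropy_hull_sketch(probs))       # entropy of the less uniform point, ~0.469
```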
- probly.quantification.classification.mutual_information(probs, base=2)[source]
Compute the mutual information as epistemic uncertainty.
The computation is based on samples from a second-order distribution.
- probly.quantification.classification.total_entropy(probs, base=2)[source]
Compute the total entropy as the total uncertainty based on [DHernandezLobatoDoshiVelezU18].
The computation is based on samples from a second-order distribution.
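These three entropy-based measures fit the additive decomposition TU = AU + EU: the entropy of the mean prediction (total entropy) splits into the mean entropy of the samples (conditional entropy) plus the mutual information. A minimal numpy sketch of that identity (illustrative, not the library's code):

```python
import numpy as np

def entropy(p: np.ndarray, base: float = 2.0) -> np.ndarray:
    """Entropy along the last (class) axis; 0 * log(0) treated as 0."""
    logp = np.where(p > 0, np.log(p), 0.0) / np.log(base)
    return -(p * logp).sum(axis=-1)

def entropy_decomposition(probs: np.ndarray):
    """probs: (n_instances, n_samples, n_classes) samples from a second-order distribution."""
    tu = entropy(probs.mean(axis=1))   # total entropy: entropy of the mean prediction
    au = entropy(probs).mean(axis=1)   # conditional entropy: mean entropy of the samples
    eu = tu - au                       # mutual information
    return tu, au, eu

rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(3), size=(4, 100))  # 4 instances, 100 samples, 3 classes
tu, au, eu = entropy_decomposition(probs)
assert np.all(eu >= -1e-9)            # Jensen: entropy of mean >= mean entropy
```

By concavity of entropy, the mutual-information term is always non-negative.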
- probly.quantification.classification.total_uncertainty_distance(probs)[source]
Compute the total uncertainty using samples from a second-order distribution.
The measure of total uncertainty is from [SBCHullermeier24].
- Parameters:
probs (ndarray) – numpy.ndarray of shape (n_instances, n_samples, n_classes)
- Returns:
Total uncertainty values of shape (n_instances,).
- Return type:
numpy.ndarray
- probly.quantification.classification.total_variance(probs)[source]
Compute the total uncertainty using variance-based measures.
The computation is based on samples from a second-order distribution.
- Parameters:
probs (ndarray) – numpy.ndarray of shape (n_instances, n_samples, n_classes)
- Returns:
Total variance values of shape (n_instances,).
- Return type:
numpy.ndarray
- probly.quantification.classification.upper_entropy(probs, base=2, n_jobs=None)[source]
Compute the upper entropy of a credal set.
Given the probs array, lower and upper class probabilities are computed, and the credal set is taken to be the convex set of all probability distributions whose class probabilities lie in the interval [lower, upper]. The upper entropy of this set is computed.
- Parameters:
probs (ndarray) – numpy.ndarray of shape (n_instances, n_samples, n_classes)
base (int) – Base of the logarithm (default: 2).
n_jobs (int | None) – Number of parallel jobs (default: None).
- Returns:
Upper entropy values of shape (n_instances,).
- Return type:
numpy.ndarray
- probly.quantification.classification.upper_entropy_convex_hull(probs, base=2, n_jobs=None)[source]
Compute the upper entropy of a credal set.
Given probs, the convex hull defined by the extreme points in probs is considered. The upper entropy of this set is computed.
- Parameters:
probs (ndarray) – numpy.ndarray of shape (n_instances, n_samples, n_classes)
base (int) – Base of the logarithm (default: 2).
n_jobs (int | None) – Number of parallel jobs (default: None).
- Returns:
Upper entropy values of shape (n_instances,).
- Return type:
numpy.ndarray
- probly.quantification.classification.variance_conditional_expectation(probs)[source]
Compute the epistemic uncertainty using variance-based measures.
The computation is based on samples from a second-order distribution.
- Parameters:
probs (ndarray) – numpy.ndarray of shape (n_instances, n_samples, n_classes)
- Returns:
Variance of the conditional expectation, of shape (n_instances,).
- Return type:
numpy.ndarray
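The three variance-based measures mirror the law of total variance applied per class and summed over classes: total_variance decomposes into expected_conditional_variance (aleatoric) plus variance_conditional_expectation (epistemic). A sketch of that identity under the usual (n_instances, n_samples, n_classes) layout (illustrative, not the library's code):

```python
import numpy as np

def variance_decomposition(probs: np.ndarray):
    """Law of total variance per class, summed over classes.

    probs: (n_instances, n_samples, n_classes) samples from a second-order distribution.
    """
    mean_p = probs.mean(axis=1)                             # (n_instances, n_classes)
    tv = (mean_p * (1.0 - mean_p)).sum(axis=-1)             # total variance of the outcome
    au = (probs * (1.0 - probs)).mean(axis=1).sum(axis=-1)  # expected conditional variance
    eu = probs.var(axis=1).sum(axis=-1)                     # variance of conditional expectation
    return tv, au, eu

rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(3), size=(4, 100))
tv, au, eu = variance_decomposition(probs)
assert np.allclose(tv, au + eu)   # TU = AU + EU holds exactly
```

The identity is exact because E[p](1 − E[p]) = E[p(1 − p)] + Var(p) per class.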