tfma.metrics.SpecificityAtSensitivity

Computes the best specificity where sensitivity is >= a specified value.

Inherits From: Metric

Sensitivity measures the proportion of actual positives that are correctly identified as such (tp / (tp + fn)). Specificity measures the proportion of actual negatives that are correctly identified as such (tn / (tn + fp)).

The threshold for the given sensitivity value is computed and used to evaluate the corresponding specificity.
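As a concrete illustration of this selection rule, here is a minimal NumPy sketch that sweeps thresholds and keeps the best specificity among those meeting the sensitivity constraint. The helper name and the brute-force sweep are for illustration only, not the TFMA implementation, which accumulates confusion matrices over the evaluation data.

```python
import numpy as np

def specificity_at_sensitivity(labels, scores, target_sensitivity,
                               num_thresholds=1000):
    # Hypothetical helper for illustration; not the TFMA implementation.
    labels = np.asarray(labels, dtype=bool)
    scores = np.asarray(scores, dtype=float)
    best_specificity = 0.0
    for t in np.linspace(0.0, 1.0, num_thresholds):
        preds = scores >= t
        tp = np.sum(preds & labels)
        fn = np.sum(~preds & labels)
        tn = np.sum(~preds & ~labels)
        fp = np.sum(preds & ~labels)
        sensitivity = tp / max(tp + fn, 1)  # tp / (tp + fn)
        specificity = tn / max(tn + fp, 1)  # tn / (tn + fp)
        if sensitivity >= target_sensitivity:
            best_specificity = max(best_specificity, specificity)
    return best_specificity

# Only thresholds that still recover at least half of the positives are
# considered; among those, the best specificity is 2/3.
print(specificity_at_sensitivity([0, 0, 0, 1, 1],
                                 [0.0, 0.3, 0.8, 0.3, 0.8], 0.5))
```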

If sample_weight is None, weights default to 1. Use sample_weight of 0 to mask values.
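The weighting behavior can be seen with the equivalent Keras metric, which tfma.metrics.SpecificityAtSensitivity mirrors. The unweighted example below is taken from the Keras documentation; the weighted variant is an assumed illustration of masking.

```python
import tensorflow as tf

m = tf.keras.metrics.SpecificityAtSensitivity(0.5)
m.update_state([0, 0, 0, 1, 1], [0.0, 0.3, 0.8, 0.3, 0.8])
print(m.result().numpy())  # 0.6666667

# A sample_weight of 0 masks an example: dropping the negative scored at
# 0.8 removes the only confident false positive, so specificity rises.
m.reset_state()
m.update_state([0, 0, 0, 1, 1], [0.0, 0.3, 0.8, 0.3, 0.8],
               sample_weight=[1.0, 1.0, 0.0, 1.0, 1.0])
print(m.result().numpy())  # 1.0
```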

For additional information about specificity and sensitivity, see https://en.wikipedia.org/wiki/Sensitivity_and_specificity.

Args

sensitivity: A scalar value or a list of scalar values in the range [0, 1].
num_thresholds: (Optional) Defaults to 1000. The number of thresholds to use for matching the given sensitivity.
class_id: (Optional) Used with a multi-class model to specify which class to compute the confusion matrix for. When class_id is used, metrics_specs.binarize settings must not be present. Only one of class_id or top_k should be configured.
name: (Optional) String name of the metric instance.
top_k: (Optional) Used with a multi-class model to specify that the top-k values should be used to compute the confusion matrix. The net effect is that the non-top-k values are set to -inf and the matrix is then constructed from the average TP, FP, TN, and FN across the classes. When top_k is used, metrics_specs.binarize settings must not be present. Only one of class_id or top_k should be configured. When top_k is set, the default thresholds are [float('-inf')].
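As a sketch of how these arguments are typically supplied in practice, the metric can be wrapped into metrics specs for an evaluation via tfma.metrics.specs_from_metrics; the sensitivity value and label key below are placeholder choices, not defaults.

```python
import tensorflow_model_analysis as tfma

metrics_specs = tfma.metrics.specs_from_metrics([
    tfma.metrics.SpecificityAtSensitivity(sensitivity=0.8,
                                          num_thresholds=1000),
])
eval_config = tfma.EvalConfig(
    model_specs=[tfma.ModelSpec(label_key='label')],  # placeholder label key
    metrics_specs=metrics_specs,
)
```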

Attributes

compute_confidence_interval: Whether to compute confidence intervals for this metric. Note that setting this to False may not completely remove the computational overhead involved in computing a given metric; the flag is only respected by the jackknife confidence interval method.
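For context, a hedged sketch of enabling jackknife confidence intervals through the evaluation config; the option and field names follow the TFMA config proto in recent releases and should be verified against the installed version.

```python
import tensorflow_model_analysis as tfma
from google.protobuf import text_format

# Assumed proto field names; check tensorflow_model_analysis/proto/config.proto
# for your TFMA version.
eval_config = text_format.Parse("""
  options {
    compute_confidence_intervals { value: true }
    confidence_intervals { method: JACKKNIFE }
  }
""", tfma.EvalConfig())
```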

Methods

computations

Creates the computations associated with the metric.

from_config

Instantiates the metric from a config dictionary, as returned by get_config.

get_config

Returns serializable config.
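A minimal round trip, assuming the standard Keras-style config semantics where from_config re-creates the instance from the dictionary produced by get_config:

```python
import tensorflow_model_analysis as tfma

metric = tfma.metrics.SpecificityAtSensitivity(sensitivity=0.8)
config = metric.get_config()  # serializable dict of constructor arguments
restored = tfma.metrics.SpecificityAtSensitivity.from_config(config)
assert restored.get_config() == config
```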