langtest.utils.util_metrics.calculate_f1_score_multi_label
- calculate_f1_score_multi_label(y_true: List[Set[str | int]], y_pred: List[Set[str | int]], average: str = 'macro', zero_division: int = 0) → float
Calculate the F1 score for multi-label classification using binarized labels.
- Parameters:
y_true (List[Set[Union[str, int]]]) – List of sets of true labels.
y_pred (List[Set[Union[str, int]]]) – List of sets of predicted labels.
average (str, optional) – Method to calculate F1 score, can be ‘micro’, ‘macro’, or ‘weighted’. Defaults to ‘macro’.
zero_division (int, optional) – Value to return when there is a zero division case. Defaults to 0.
- Returns:
Calculated F1 score for multi-label classification.
- Return type:
float
- Raises:
AssertionError – If the lengths of y_true and y_pred are not equal.
ValueError – If an invalid averaging method is provided.
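A minimal usage sketch; the labels and samples below are illustrative only and not taken from the library's test suite:

```python
from langtest.utils.util_metrics import calculate_f1_score_multi_label

# Illustrative multi-label data: each sample is a set of labels.
y_true = [{"sports", "politics"}, {"tech"}, {"sports"}]
y_pred = [{"sports"}, {"tech", "politics"}, {"sports"}]

# Macro averaging (the default) weights every label equally.
macro_f1 = calculate_f1_score_multi_label(y_true, y_pred, average="macro")

# Micro averaging pools true/false positives and negatives across all labels.
micro_f1 = calculate_f1_score_multi_label(y_true, y_pred, average="micro")

print(f"macro F1: {macro_f1:.3f}, micro F1: {micro_f1:.3f}")
```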