Interpreting output

GA      CA [Back]   CA [Sclera]   CA [Iris]   CA [Pupil]   Precision   Recall   F1      IoU
0.991   0.997       0.946         0.957       0.956        0.992       0.991    0.991   0.939

Could I get some help interpreting the output? What are GA and CA, and which numbers should I average together to get an mIoU of 0.939?


Thanks for asking this important question. Please see the clarifications below:

GA: Global Pixel Accuracy
CA: Mean Class Accuracy, reported separately for each of the following classes (see the sketch after this list for how GA and the per-class accuracies are typically computed):

  • Back: Background (non-eye part of peri-ocular region)
  • Sclera: Sclera
  • Iris: Iris
  • Pupil: Pupil
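
For intuition, here is a minimal sketch of how GA and the per-class accuracies are typically computed from segmentation masks. The helper name accuracy_metrics and the class-ID assignment (0=Back, 1=Sclera, 2=Iris, 3=Pupil) are assumptions for illustration, so this may differ from the exact evaluation code:

import numpy as np


def accuracy_metrics(pred, label, num_classes=4):
    '''
    Sketch of GA (global pixel accuracy) and per-class accuracy (CA).
    pred, label: integer arrays of the same shape with values in
    0..num_classes-1 (assumed 0=Back, 1=Sclera, 2=Iris, 3=Pupil).
    '''
    pred = pred.ravel()
    label = label.ravel()

    # GA: fraction of all pixels predicted correctly
    ga = np.mean(pred == label)

    # CA: for each class, the fraction of that class's ground-truth
    # pixels that are predicted as that class
    ca = np.zeros(num_classes)
    for c in range(num_classes):
        mask = label == c
        ca[c] = np.mean(pred[mask] == c) if mask.any() else np.nan
    return ga, ca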

Precision: Computed using sklearn.metrics.precision_score(pred, gt, average='weighted')
Recall: Computed using sklearn.metrics.recall_score(pred, gt, average='weighted')
F1: Computed using sklearn.metrics.f1_score(pred, gt, average='weighted')
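
As a rough, runnable illustration of those weighted scores: the flat_gt / flat_pred arrays below are made-up placeholders, and the snippet follows scikit-learn's documented argument order with the ground truth first, so it may not match the challenge's exact call:

import numpy as np
from sklearn.metrics import f1_score, precision_score, recall_score

# made-up flattened masks with class IDs 0=Back, 1=Sclera, 2=Iris, 3=Pupil
flat_gt = np.array([0, 0, 1, 1, 2, 2, 3, 3])
flat_pred = np.array([0, 0, 1, 2, 2, 2, 3, 3])

# 'weighted' averages the per-class scores, weighting each class by its
# number of ground-truth pixels (its support)
precision = precision_score(flat_gt, flat_pred, average='weighted')
recall = recall_score(flat_gt, flat_pred, average='weighted')
f1 = f1_score(flat_gt, flat_pred, average='weighted')
print(precision, recall, f1)
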
IoU: Computed using the function below

import numpy as np


def compute_mean_iou(flat_pred, flat_label):
    '''
    compute mean intersection over union (IOU) over all classes
    :param flat_pred: flattened prediction matrix
    :param flat_label: flattened label matrix
    :return: mean IOU
    '''
    unique_labels = np.unique(flat_label)
    num_unique_labels = len(unique_labels)

    Intersect = np.zeros(num_unique_labels)
    Union = np.zeros(num_unique_labels)

    for index, val in enumerate(unique_labels):
        pred_i = flat_pred == val
        label_i = flat_label == val

        # accumulate per-class intersection and union pixel counts
        Intersect[index] = float(np.sum(np.logical_and(label_i, pred_i)))
        Union[index] = float(np.sum(np.logical_or(label_i, pred_i)))

    mean_iou = np.mean(Intersect / Union)
    return mean_iou
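
So the reported IoU is the mean of the per-class IoUs over the classes present in the label, rather than an average of the other columns in the results row. A made-up usage sketch for the function above:

import numpy as np

# made-up example: 2-D segmentation maps with class IDs
# 0=Back, 1=Sclera, 2=Iris, 3=Pupil
rng = np.random.default_rng(0)
label = rng.integers(0, 4, size=(400, 640))
pred = label.copy()
pred[:50, :50] = 0  # corrupt a patch so the prediction is imperfect

print(compute_mean_iou(pred.flatten(), label.flatten()))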