FYI on Semantic Segmentation


#1
  1. Please find the evaluation metrics for semantic segmentation below.
    The winning entry will be evaluated using a performance metric M (0 < M <= 100), defined as a function of two quantities: P (0 < P <= 1), which measures model accuracy as the unweighted mean intersection-over-union (mIoU) score over all classes on the test set, and S (S > 0), which measures model complexity as the number of model parameters, expressed as model size in MB. More precisely, model size in MB is computed as follows: S = number of learned model parameters x 4 (bytes per float32) / (1024 * 1024)
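The model-size term S can be computed directly from the parameter count. A minimal sketch (the function name `model_size_mb` is ours; `num_params` is whatever parameter count your framework reports):

```python
def model_size_mb(num_params):
    """Model size S in MB: 4 bytes per float32 parameter."""
    return num_params * 4 / (1024 * 1024)

# A model with 262,144 learned parameters occupies exactly 1 MB:
print(model_size_mb(262144))  # -> 1.0
```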

  2. Please find common submission errors and corresponding solutions below.

  • assert pred.shape == gt.shape
    This happens when the number of segmentation masks in your submission differs from the number of ground-truth masks, i.e., the masks you generated do not match the ground truth one-to-one. Please go back and check your results.
  • raise ValueError("Invalid submission zip!") ValueError: Invalid submission zip!
    This has been fixed; please re-submit your results for evaluation.
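To catch the shape-mismatch error above before uploading, it may help to sanity-check predictions against the ground truth locally. A sketch only (`check_submission` and the dict-of-arrays layout are our assumptions, not the official submission format):

```python
import numpy as np

def check_submission(preds, gts):
    """Compare predicted masks to ground truth before submitting.

    preds, gts: dicts mapping image id -> mask array.
    Returns a list of error messages (empty if everything matches).
    """
    errors = []
    missing = set(gts) - set(preds)
    if missing:
        errors.append(f"missing predictions for: {sorted(missing)}")
    for key in sorted(set(preds) & set(gts)):
        if preds[key].shape != gts[key].shape:
            errors.append(f"{key}: shape {preds[key].shape} != {gts[key].shape}")
    return errors
```

An empty return value means the mask counts and shapes line up with the ground truth.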
  1. Challenge participation deadline: September 15, 2019
  2. Clarifications on the metrics below:

GA: Global Pixel Accuracy
CA: Mean Class Accuracy for different classes

  • Back: Background (non-eye part of peri-ocular region)
  • Sclera: Sclera
  • Iris: Iris
  • Pupil: Pupil

Precision: Computed using sklearn.metrics.precision_score(gt, pred, average='weighted')
Recall: Computed using sklearn.metrics.recall_score(gt, pred, average='weighted')
F1: Computed using sklearn.metrics.f1_score(gt, pred, average='weighted')
IoU: Computed using the function below

import numpy as np

def compute_mean_iou(flat_pred, flat_label):
    '''
    Compute the mean intersection over union (IoU) over all classes.
    :param flat_pred: flattened prediction array
    :param flat_label: flattened label array
    :return: mean IoU
    '''
    unique_labels = np.unique(flat_label)
    num_unique_labels = len(unique_labels)

    intersect = np.zeros(num_unique_labels)
    union = np.zeros(num_unique_labels)

    for index, val in enumerate(unique_labels):
        pred_i = flat_pred == val
        label_i = flat_label == val

        # accumulate per-class counts inside the loop
        intersect[index] = float(np.sum(np.logical_and(label_i, pred_i)))
        union[index] = float(np.sum(np.logical_or(label_i, pred_i)))

    mean_iou = np.mean(intersect / union)
    return mean_iou
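As a quick sanity check, the per-class intersection/union logic can be verified on a tiny example. A self-contained sketch (the toy arrays are ours, purely illustrative):

```python
import numpy as np

def mean_iou(flat_pred, flat_label):
    """Same per-class intersection/union computation as the function above."""
    ious = []
    for c in np.unique(flat_label):
        pred_c = flat_pred == c
        label_c = flat_label == c
        inter = np.logical_and(pred_c, label_c).sum()
        union = np.logical_or(pred_c, label_c).sum()
        ious.append(inter / union)
    return float(np.mean(ious))

pred = np.array([0, 1, 1, 1])
label = np.array([0, 0, 1, 1])
# class 0: IoU = 1/2, class 1: IoU = 2/3, mean = 7/12
print(mean_iou(pred, label))
```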

Question about Evaluation Criteria
#2

Submission issue
#9

Regarding the evaluation metrics, please find our comments below:

Our interest in designing the measure is to meet a practical consideration, i.e., any model with a size of 1 MB or less is good enough for our purposes.
Amongst all models that satisfy the 1 MB model-size constraint, mIoU becomes the key deciding factor.
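One way to read the criteria above (a sketch of the stated ranking logic, not the official scoring code; `rank_entries` and the tuple layout are our assumptions):

```python
def rank_entries(entries):
    """entries: list of (name, miou, size_mb) tuples.

    Discard entries over 1 MB, then rank the rest by mIoU, highest first.
    """
    eligible = [e for e in entries if e[2] <= 1.0]
    return sorted(eligible, key=lambda e: e[1], reverse=True)

entries = [("A", 0.95, 1.4), ("B", 0.91, 0.8), ("C", 0.89, 0.5)]
# "A" is excluded (1.4 MB > 1 MB); "B" ranks first with the best mIoU
print(rank_entries(entries))
```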


#10

======== Notification of Paper Submission ========

This is a notification that the deadline of paper submission to ICCV workshop regarding OpenEDS challenges has changed.

Please note that there is a two-tier submission process; the major difference is that papers accepted by the early deadline will appear in the ICCV workshop proceedings. See the table below for the two tiers.

                                         Tier 1                         Tier 2
Paper submission deadline                Aug 19th                       Aug 31st
Paper acceptance notification deadline   Aug 25th                       Sept 15th
Camera-ready deadline                    Aug 30th                       Sept 27th
Publication                              ICCV workshop proceedings;     IEEE Xplore;
                                         IEEE Xplore; CVF open access   CVF open access

OpenEDS Challenge Team


#11

======== Common questions regarding paper submission ========

  1. The challenge deadline remains the same and will close on Sept 15th.

  2. Paper submission deadlines are stated in the table above.
    Specifically, (1) the first deadline is for those who want to submit a manuscript and are interested in seeing their paper (if accepted) appear in the ICCV workshop proceedings; (2) the second deadline is for those who are not ready to submit yet or do not need their paper to appear in the ICCV workshop proceedings.

  3. Challenge participants are welcome to submit papers to meet either of the two deadlines.

  4. For winners, we expect the authors to submit a paper, which is considered accepted, i.e., challenge-winning papers will not undergo formal review if submitted after the challenge ends.

======== How to submit paper ========

  1. Please submit papers at: OpenEDS2019. This link is also provided on the official Workshop page under Submissions: https://research.fb.com/programs/the-2019-openeds-workshop-eye-tracking-for-vr-and-ar/

OpenEDS Challenge Team


#12
  1. How many top positions will get the chance to present their paper?
  2. If we submit the paper after the end of the challenge, will it be considered a 'tier 1' or 'tier 2' paper?
  3. What would be the deadline for submitting the paper after the end of the challenge?

#14

Hi,

Please see responses to your questions below:

All submitted papers will be considered for publication. Accepted papers will also have an opportunity to present.
With regard to challenge-specific submissions, only the top-spot winner will be asked to present their work.

Papers submitted after the Tier 1 deadline (Aug 19th) will be considered Tier 2 submissions as long as they meet the Aug 31st submission deadline.

Only the top-place winner will have the opportunity to submit their camera-ready version of the paper after the challenge deadline.


#15

Might there be any opportunity to extend the Aug 19th deadline by 1 day?
We have been having trouble submitting our latest model over the last few days…