With this challenge, we have made available a large dataset of 4800 annotated AS-OCT images. In addition, an evaluation framework has been designed so that all submitted results can be evaluated and compared with one another in a uniform manner.

The AGE Challenge consists of two tasks:

  1. Angle closure classification
  2. Scleral spur localization

Imaging Data

All AS-OCT images are stored as JPG files.

Reference Standard

Tasks: Angle closure classification and scleral spur localization


For the training data, angle structure labels and scleral spur locations for both the left and right angles are provided in a single XLSX file, whose columns are:

  1. Filename of the AS-OCT image (including the extension ".jpg")
  2. Angle structure label of the left angle (1 for angle closure, 0 for others)
  3. X-coordinate of the left scleral spur
  4. Y-coordinate of the left scleral spur
  5. Angle structure label of the right angle (1 for angle closure, 0 for others)
  6. X-coordinate of the right scleral spur
  7. Y-coordinate of the right scleral spur

NOTE THAT if a scleral spur location cannot be determined, due to reasons like obstruction, both its X-coordinate and Y-coordinate are set to -1 in the annotation file. Participants are recommended to exclude these annotations during training.
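As a rough sketch of how to use this file, the snippet below loads the annotations with pandas and masks out the undetermined locations. The file name, the column names, and the assumption that the sheet has no header row are placeholders rather than part of the challenge specification:

    # Sketch: load the training annotations and drop (-1, -1) spur locations.
    # "Training_Annotations.xlsx" and the column names are assumptions.
    import pandas as pd

    cols = ["filename", "left_label", "left_x", "left_y",
            "right_label", "right_x", "right_y"]
    ann = pd.read_excel("Training_Annotations.xlsx", header=None, names=cols)

    # Undetermined scleral spur locations are marked with -1 coordinates;
    # exclude them from localization training as recommended above.
    left_valid = (ann["left_x"] != -1) & (ann["left_y"] != -1)
    right_valid = (ann["right_x"] != -1) & (ann["right_y"] != -1)
    print(left_valid.sum(), "valid left spurs,", right_valid.sum(), "valid right spurs")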

An AS-OCT image with annotated scleral spur locations (the red crosses), where the angle closure classification labels are 0 for both the left and right angles.

An AS-OCT image with annotated scleral spur locations (the red crosses), where the angle closure classification labels are 1 for both the left and right angles.

Submission Guidelines

General Guidelines

When submitting your results, please prepare a single ZIP file. The ZIP must contain a CSV file named "Classification_Results.csv" with the angle closure classification results, and a CSV file named "Localization_Results.csv" with the scleral spur localization results. Please note that submitting unorganized files may lead to incorrect evaluation.

Challenge Task 1: Angle closure classification

The classification results should be provided in a single CSV file, named "Classification_Results.csv", with the first column corresponding to the filename of the test AS-OCT image (including the extension ".jpg"), and the second and third columns containing the corresponding estimated angle closure classification results (a positive value indicates angle closure; a non-positive value indicates otherwise). NOTE THAT the ranks of the estimated results also matter, as AUC will be computed as an evaluation metric for this task.
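As a sketch only, the following snippet writes such a file with Python's csv module; the filenames and scores are invented placeholders, and whether a header row is expected should be checked against the example submission file below:

    # Sketch: write the classification submission file.
    # The rows below are invented placeholders.
    import csv

    # (filename, score for one angle, score for the other angle);
    # higher scores should indicate a higher likelihood of angle closure.
    results = [("T0001.jpg", 0.93, -0.12),
               ("T0002.jpg", -0.48, 0.77)]

    with open("Classification_Results.csv", "w", newline="") as f:
        csv.writer(f).writerows(results)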

Challenge Task 2: Scleral spur localization

The localization results should be provided in a single CSV file, named "Localization_Results.csv", with the first column corresponding to the filename of the test AS-OCT image (including the extension ".jpg"), the second and third columns containing the estimated X-coordinate and Y-coordinate of the scleral spur of the left angle, respectively, and the fourth and fifth columns containing the estimated X-coordinate and Y-coordinate of the scleral spur of the right angle, respectively.
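A minimal sketch of writing this file, with invented placeholder coordinates:

    # Sketch: write the localization submission file.
    import csv

    # (filename, left_x, left_y, right_x, right_y) -- placeholder values
    rows = [("T0001.jpg", 412.0, 653.5, 1703.0, 660.0)]

    with open("Localization_Results.csv", "w", newline="") as f:
        csv.writer(f).writerows(rows)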


The ZIP file should therefore contain exactly the two CSV files described above.


The name of the ZIP file must be your Team Name, which will also be used for the on-site challenge. You can choose one when you submit results for the first time.
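For example, a hypothetical team named "TeamName" could package its submission like this (a sketch only):

    # Sketch: package the two result files into a ZIP named after the team.
    # "TeamName" is a placeholder.
    import zipfile

    with zipfile.ZipFile("TeamName.zip", "w") as zf:
        zf.write("Classification_Results.csv")
        zf.write("Localization_Results.csv")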

Here we give you a submission file example, which you can download.

Submission file example

Link for Mainland China (PWD: 13of)

Attention: when you submit the results, please follow the rules strictly. If you want to submit results for only one task, replace the corresponding file in the example, then follow the same rules to submit.

Evaluation Framework

This challenge evaluates the performance of the algorithms for: (1) angle closure classification, and (2) scleral spur localization. Thus, there will be two main leaderboards.

Classification results will be compared to the reference standard. A receiver operating characteristic (ROC) curve will be created across all the test set images and the area under the curve (AUC) will be calculated. Each team receives a rank R_AUC (1 = best) based on the obtained AUC value. In addition, both the sensitivity and specificity will be computed from the classification results, and each team receives two further ranks, R_Se and R_Sp (1 = best), based on these two values. The final score for the classification task is computed as a weighted sum of these three ranks,

    Score = w_AUC · R_AUC + w_Se · R_Se + w_Sp · R_Sp,

which then determines the ranking of the angle closure classification leaderboard. The team with the lowest score will be ranked #1.
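As an illustration of these metrics (not the organizers' exact evaluation code), the sketch below computes AUC, sensitivity, and specificity with scikit-learn and NumPy; binarizing predictions at score > 0 follows the positive/non-positive convention of the submission format and is an assumption:

    # Sketch: classification metrics. Labels and scores are placeholders.
    import numpy as np
    from sklearn.metrics import roc_auc_score

    y_true = np.array([1, 0, 1, 0, 1])               # reference labels
    y_score = np.array([0.9, -0.3, 0.4, 0.1, -0.2])  # submitted scores

    auc = roc_auc_score(y_true, y_score)
    y_pred = (y_score > 0).astype(int)               # assumed threshold at 0
    sensitivity = (y_pred[y_true == 1] == 1).mean()
    specificity = (y_pred[y_true == 0] == 0).mean()
    print(auc, sensitivity, specificity)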

Submitted scleral spur localization results will be compared to the reference standard. The evaluation criterion is the Average Euclidean Distance between the estimated and ground-truth locations (the lower, the better). NOTE THAT, due to reasons like obstruction by the eyelashes, a few scleral spur locations in the test AS-OCT images may not be well determined. We will exclude these locations from the evaluation procedure. Each team receives a rank (1 = best) based on the obtained measure. This ranking forms the scleral spur localization leaderboard.
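The metric itself is straightforward to reproduce; the sketch below computes the mean Euclidean distance with NumPy, skipping reference points marked (-1, -1). The array contents and shapes are assumptions:

    # Sketch: Average Euclidean Distance, excluding undetermined locations.
    import numpy as np

    pred = np.array([[410.0, 650.0], [1700.0, 662.0]])  # (N, 2) predictions
    gt = np.array([[412.0, 653.5], [-1.0, -1.0]])       # (N, 2) reference

    valid = ~np.all(gt == -1, axis=1)                   # drop (-1, -1) points
    dists = np.linalg.norm(pred[valid] - gt[valid], axis=1)
    print("Average Euclidean Distance:", dists.mean())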

The rule for the overall leaderboard is pending.

MICCAI 2019 and On-site part of the Challenge at OMIA Workshop

The AGE challenge will be hosted at the MICCAI 2019 conference in conjunction with the OMIA workshop. There will also be an on-site part of the challenge, in which the second part of the test set will be released. Participants will have 1 hour on the day of the challenge to provide results on this "on-site test set". Papers submitted to the AGE challenge will be automatically considered for the challenge part of the OMIA workshop unless otherwise stated by the participants. Each team can have up to 2 names appear in the author list of the challenge review paper (edited by the organizers based on the papers submitted by the participants).

Teams intending to attend the on-site challenge must submit a paper together with their off-site validation results by Aug 31.

The camera-ready paper (max. 8 pages, PDF in Springer LNCS format) must be submitted by 30 Sep 2019 via email.

In the manuscript, please describe the methods used, the novelty of the methodology and how it fits with the state of the art, and provide a qualitative and quantitative analysis of results on the training/validation data (and on other reasonable settings).