Now that the submission pipeline somewhat works, I was wondering if you have any plans on “enforcing” submissions to be published there. As of right now, there is no incentive to publish submission results on the leaderboard, which means it is impossible to tell how well a specific approach performs compared to the other participants’ approaches. Given the framework of a competition, it would be nice to know what we are competing against.
You can use the leaderboard to see how your submissions compare to other participants and to the baseline.
Yeah, I saw the leaderboard. However, when uploading submissions, I can choose to make them private (and so can everybody else), so that they won’t show on the leaderboard. Therefore, there is no incentive to publish on the leaderboard, and I cannot know how many private submissions there are or how well they perform. That’s why I was wondering if it would be possible to enforce public submissions, or at least provide some method that lets everybody know how well the actual current best performs.
That is a good point. By default, the challenge configuration makes submissions public, but I don’t know whether EvalAI can enforce that they stay public. I’ll ask them.