This is regarding the LVIS Challenge 2021, but it could apply to other challenges as well.
The evaluation servers are meant to help us get scores for our technical reports and give us some confidence that we are submitting files in the right format.
We can see that multiple people are facing the issue of submissions not being evaluated or showing “None”.
Is it not possible to have a size limit on the submission file? Or any other guideline we can follow that assures us our submission will be evaluated within x time?
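Even a simple client-side check before uploading would help. Here is a minimal sketch of the kind of check I mean (the size threshold and field names below are just my own assumptions based on the standard LVIS/COCO results format, not official limits):

```python
import json
import os

# Illustrative pre-upload sanity check. The 1 GB threshold is only a
# placeholder guess, not an official EvalAI or LVIS limit.
MAX_SIZE_GB = 1.0

def check_submission(path):
    size_gb = os.path.getsize(path) / (1024 ** 3)
    print(f"File size: {size_gb:.2f} GB")
    if size_gb > MAX_SIZE_GB:
        print("Warning: large file, evaluation will likely be slow.")

    with open(path) as f:
        results = json.load(f)  # fails loudly if the JSON is malformed

    # LVIS results are a list of COCO-style detection dicts.
    assert isinstance(results, list), "results file must be a JSON list"
    required = {"image_id", "category_id", "score"}
    for det in results[:1000]:  # spot-check the first 1000 entries
        missing = required - det.keys()
        assert not missing, f"detection is missing keys: {missing}"
        assert "bbox" in det or "segmentation" in det, \
            "each detection needs a 'bbox' or 'segmentation' field"
    print(f"Checked {min(len(results), 1000)} of {len(results)} detections.")

check_submission("lvis_results.json")
```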
I feel this is a critical issue. If you have ever faced this issue as a participant, can you comment here? It would help get more support for this issue.
@xcdh @dasdx @waizei I believe you are facing similar issues? @ram81 you do help out every time someone creates this issue, but can’t we have a permanent solution?
I have fixed the issue for now. Your submissions should finish evaluating soon.
you do help out every time someone creates this issue, but can’t we have a permanent solution?
Yes, we are working on it. We know why this issue is happening, and we will make a fix as soon as possible and update you. The issue is partly due to large input files and partly due to a bug in our cleanup setup. Once we deploy the fix, we will let you know.
I am not sure if restricting the input size is possible; you’ll have to ask the challenge hosts about that. We can try to fix the issue at our end, but because of the large submission files, the evaluation time for the LVIS challenge is higher than usual even with the maximum compute resources we support at the moment.
Hi @kamathsutra @dasdx, we had to re-run all pending submissions after the fix; the temporary fix I made didn’t work. I am looking into the issue and will update the thread once I have more details.
@kamathsutra I have prioritized submissions from your team. You’ll see a bunch of cancelled submissions; please ignore them. I am closely monitoring our evaluation setup to make sure all submissions finish evaluating soon. If I find any other issues that could cause delays, I’ll update this thread. So far the setup looks good, and you should receive your results in a couple of hours (the LVIS evaluation script takes ~40-60 minutes per submission based on our data).
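For participants who want a rough sense of that timing before uploading, a minimal local run with the official lvis-api looks something like the sketch below. The file paths are placeholders, and this is only an approximation, not necessarily the exact configuration the evaluation workers run:

```python
import time
from lvis import LVIS, LVISResults, LVISEval

ANN_PATH = "lvis_v1_val.json"        # placeholder: ground-truth annotation file
RESULTS_PATH = "lvis_results.json"   # placeholder: your submission file

start = time.time()
gt = LVIS(ANN_PATH)
dt = LVISResults(gt, RESULTS_PATH)
evaluator = LVISEval(gt, dt, iou_type="segm")  # or "bbox"
evaluator.run()            # evaluate, accumulate, summarize
evaluator.print_results()  # prints AP and the per-frequency breakdowns
print(f"Local evaluation took {(time.time() - start) / 60:.1f} min")
```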
Hi @ram81, our first result was submitted 7 hours ago and is still running. Since we are about to submit the second one, can you help us check whether the first is running normally and when it will finish?
Hi @flytocc, there was an issue with the previous run of your submission on the old setup. I have requeued it now; it should finish evaluating in a couple of hours.
Hi @ram81, we have submitted two results: one has been running for about 1 hour, and the other was submitted just now. Please help us speed up the evaluation, thanks!
Hi @kamathsutra, only the challenge hosts can update the challenge end date; we only manage the EvalAI infrastructure. If the end date has changed, then the challenge hosts must have modified it.