We are not able to evaluate VQA1 on the CloudCV/EvalAI server; it only hosts the evaluation code for VQA2.
I have attached the JSON file for VQA1. Could you let us know how you reported your evaluation numbers? https://drive.google.com/drive/folders/1Moq6qOimMpyCmT499KQgwoONGHaXVTgK?usp=sharing
We evaluated it here: http://evalai.cloudcv.org/
For VQA1:
Following the same procedure with the VQA2 JSON file, we get ~90% accuracy.
@monajalal https://competitions.codalab.org/competitions/6961#participate-submit_results
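Before uploading to the CodaLab server linked above, it can save a round trip to sanity-check the results file locally. The VQA challenge expects a JSON list of `{"question_id": <int>, "answer": <str>}` entries; the sketch below validates that shape (the function name and the exact checks are illustrative, not part of the official VQA toolkit):

```python
import json
import tempfile

def check_vqa_results(path):
    """Sanity-check a VQA results JSON file before submission.
    Illustrative helper, not part of the official evaluation code."""
    with open(path) as f:
        results = json.load(f)
    # The submission format is a JSON list of
    # {"question_id": <int>, "answer": <str>} entries.
    if not isinstance(results, list):
        raise ValueError("top-level structure must be a list")
    for entry in results:
        if set(entry) != {"question_id", "answer"}:
            raise ValueError(f"unexpected keys: {sorted(entry)}")
        if not isinstance(entry["question_id"], int):
            raise ValueError(f"question_id must be an int: {entry}")
        if not isinstance(entry["answer"], str):
            raise ValueError(f"answer must be a string: {entry}")
    return len(results)

# Example: write a two-entry results file and validate it.
sample = [
    {"question_id": 1, "answer": "yes"},
    {"question_id": 2, "answer": "two"},
]
with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
    json.dump(sample, f)
    path = f.name

n_entries = check_vqa_results(path)
print(n_entries)  # 2
```

A mismatch between question IDs in the results file and the test split (e.g. VQA1 vs. VQA2 IDs) is a common cause of server-side failures, so checking the file shape locally first narrows down where the problem lies.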