
Update config to increase mAP for model; result: mAP increased by 87.5% #4

Open · wants to merge 3 commits into base: denis-update-to-setup-guide

Conversation

interactivetech

Updated the const.yaml file to improve the performance of the FasterRCNN model. Teammate @denisabrantes pointed out that the model's mAP after finetuning was 0.08. That level of performance could leave the customer without confidence in the trained model. Investigating the issue surfaced the following insights:

  • Model finetuning uses MultiStepLR decay, where the learning rate starts at 0.02 and decays to 0.0002.
  • The original const.yaml held the settings for pretraining. The pretraining dataset was much larger and required a higher learning rate.
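To make the decay behavior above concrete, here is a minimal sketch of how a MultiStepLR schedule takes a learning rate from 0.02 down to 0.0002. The milestones and decay factor (gamma=0.1) are assumptions for illustration; the PR only states the start and end rates.

```python
# Sketch of MultiStepLR decay. Milestones and gamma are assumed values;
# the PR description only gives the start (0.02) and end (0.0002) rates.

def multistep_lr(base_lr, milestones, gamma, step):
    """Return the learning rate at `step` under MultiStepLR decay:
    the base rate is multiplied by `gamma` once per milestone passed."""
    decays = sum(1 for m in milestones if step >= m)
    return base_lr * (gamma ** decays)

# With gamma=0.1 and two milestones, 0.02 decays through 0.002 to 0.0002:
schedule = [multistep_lr(0.02, milestones=[8, 11], gamma=0.1, step=s)
            for s in (0, 9, 12)]
print(schedule)
```

With a larger pretraining dataset, the schedule spends many steps at the high 0.02 rate; on a small fine-tuning set the same settings overshoot, which is what motivated the config change.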

I completed the following updates:

  • The fine-tuning dataset is much smaller, so I updated the config to reflect appropriate training settings.
  • I ran a hyperparameter search over 100 trials and found the learning rate and warmup ratio that best improve performance.
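The search step above can be sketched as a simple random search. The objective function, search ranges, and sampling scheme here are illustrative assumptions; the actual search evaluated mAP from real FasterRCNN fine-tuning runs.

```python
# Minimal random-search sketch over learning rate and warmup ratio.
# run_trial is a stand-in objective; a real search would launch a
# fine-tuning job with these hyperparameters and read back its mAP.
import random

random.seed(0)

def run_trial(learning_rate, warmup_ratio):
    # Hypothetical smooth objective, peaked at an arbitrary point,
    # used here only so the sketch runs end to end.
    return -(learning_rate - 0.001) ** 2 - (warmup_ratio - 0.1) ** 2

best = None
for _ in range(100):  # 100 trials, as in the PR description
    lr = 10 ** random.uniform(-5, -2)   # log-uniform learning-rate range (assumed)
    warmup = random.uniform(0.0, 0.3)   # warmup-ratio range (assumed)
    score = run_trial(lr, warmup)
    if best is None or score > best[0]:
        best = (score, lr, warmup)

print(f"best lr={best[1]:.5f}, warmup={best[2]:.3f}")
```

Sampling the learning rate log-uniformly is the usual choice, since useful values span several orders of magnitude.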

After training for 10 batches, the model should reach around 0.14 mAP and 0.28 mAP50, a big improvement!
