
Fixes a couple of issues to add fp16 training support #488

Merged 3 commits into main on Jan 20, 2023

Conversation

RangiLyu
Owner

No description provided.

crisp-snakey and others added 3 commits on January 20, 2023 at 15:42
* Add half precision support to `nanodet_plus` head

Moves the explicit `sigmoid` calculation inside the `dsl_assigner` so
that `binary_cross_entropy_with_logits` can be used. This allows the use
of `autocast` to support training with `fp16` precision. Without this
change, `torch` complains that `binary_cross_entropy` is numerically
unstable under `fp16` and refuses to train the model in `fp16`
precision.
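The shapes and tensor names below are illustrative, not taken from the nanodet code, but the sketch shows the core of the change: keep the head outputs as raw logits and let `binary_cross_entropy_with_logits` apply the sigmoid internally, which is the form PyTorch's autocast accepts.

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Illustrative stand-ins for the head's raw classification outputs and
# the assigner's soft targets (names and shapes are assumptions).
logits = torch.randn(4, 80)   # raw scores, pre-sigmoid
targets = torch.rand(4, 80)   # soft assignment targets in [0, 1]

# Safe path: pass logits directly; the sigmoid is fused into the loss
# via a numerically stable log-sum-exp, so autocast allows it.
loss = F.binary_cross_entropy_with_logits(logits, targets)

# The two-step equivalent that CUDA autocast rejects in fp16: an fp16
# sigmoid can saturate to exactly 0 or 1, making log() blow up.
probs = torch.sigmoid(logits)
loss_unsafe = F.binary_cross_entropy(probs, targets)

# Outside autocast, in fp32, the two formulations agree numerically.
assert torch.allclose(loss, loss_unsafe, atol=1e-5)
```

Under `torch.autocast` on CUDA, the second form raises a runtime error naming `binary_cross_entropy` as unsafe to autocast, which is exactly the complaint the commit message describes.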

* Add model precision settings to config

Allows for setting the model precision during training using the config
system.

Co-authored-by: RangiLyu <lyuchqi@gmail.com>
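As a sketch of what such a config option might look like, assuming the setting lives under the `device` section and is forwarded to the trainer (the key name and placement here are assumptions, not confirmed by the diff):

```yaml
device:
  gpu_ids: [0]
  # Assumed key added by this PR: numeric precision for training.
  # 32 = full precision (default), 16 = fp16 mixed precision.
  precision: 16
```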
@codecov

codecov bot commented Jan 20, 2023

Codecov Report

Merging #488 (e5beeb6) into main (d8ba391) will increase coverage by 0.04%.
The diff coverage is 100.00%.

@@            Coverage Diff             @@
##             main     #488      +/-   ##
==========================================
+ Coverage   74.65%   74.70%   +0.04%     
==========================================
  Files          70       70              
  Lines        4600     4601       +1     
  Branches      716      716              
==========================================
+ Hits         3434     3437       +3     
+ Misses        975      974       -1     
+ Partials      191      190       -1     
| Flag | Coverage Δ |
| --- | --- |
| unittests | 74.70% <100.00%> (+0.04%) ⬆️ |

Flags with carried forward coverage won't be shown.

| Impacted Files | Coverage Δ |
| --- | --- |
| nanodet/model/head/nanodet_plus_head.py | 73.87% <ø> (ø) |
| nanodet/model/head/assigner/dsl_assigner.py | 83.33% <100.00%> (ø) |
| nanodet/util/config.py | 84.00% <100.00%> (+0.66%) ⬆️ |
| nanodet/data/transform/warp.py | 84.44% <0.00%> (+1.11%) ⬆️ |


RangiLyu merged commit a59db3c into main on Jan 20, 2023 and deleted the crisp-snakey/add-fp16-support branch at 09:13.

2 participants