Add nn.SiLU inplace in attempt_load() #1940
Conversation
👋 Hello @1991wangliang, thank you for submitting a 🚀 PR! To allow your work to be integrated as seamlessly as possible, we advise you to:
- ✅ Verify your PR is up-to-date with origin/master. If your PR is behind origin/master update by running the following, replacing 'feature' with the name of your local branch:
git remote add upstream https://github.com/ultralytics/yolov5.git
git fetch upstream
git checkout feature # <----- replace 'feature' with local branch name
git rebase upstream/master
git push -u origin -f
- ✅ Verify all Continuous Integration (CI) checks are passing.
- ✅ Reduce changes to the absolute minimum required for your bug fix or feature addition. "It is not daily increase but daily decrease, hack away the unessential. The closer to the source, the less wastage there is." - Bruce Lee
@1991wangliang hi there. Could you explain the purpose behind the sleep call?
Perhaps the CPU performance is too poor, or it may be blocked under multi-threading; it gets stuck when loading data.
Sleeping a few seconds before create_dataloader lets it run.
@1991wangliang I'm not sure the sleep term brings any benefits. Your system may be low on resources. Perhaps you could try training with less. I've removed the sleep term and added a SiLU inplace change to this PR in its place.
Thanks, I now know why it can lock. When train.py executes once, a labels.cache file is created in the training data path. train.py then locks while loading that data. If I remove the file, it can run.
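The workaround described above (deleting a stale labels.cache so the dataloader regenerates it instead of hanging) can be sketched as follows. This is an illustrative helper, not code from the repository; the cache filename matches the one mentioned in the discussion, but the function name and directory layout are assumptions.

```python
from pathlib import Path


def clear_label_cache(data_dir):
    """Delete a stale labels.cache in data_dir so the next run rebuilds it.

    Returns True if a cache file was found and removed, False otherwise.
    Hypothetical helper illustrating the manual workaround from this thread.
    """
    cache = Path(data_dir) / "labels.cache"
    if cache.exists():
        cache.unlink()  # force the dataloader to regenerate the cache
        return True
    return False
```

Deleting the cache trades a one-time re-scan of the labels for avoiding the lock, which is why the sleep-based workaround was ultimately dropped from this PR.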
* sleep 3 sec to load data
* Update train.py
* Add nn.SiLU inplace in attempt_load()

Co-authored-by: wangliang <wangliang@codingapi.com>
Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
🛠️ PR Summary
Made with ❤️ by Ultralytics Actions
🌟 Summary
Enhancement of activation function compatibility in model loading.

📊 Key Changes
- Added nn.SiLU to the list of activation functions whose inplace attribute is set to True during model loading.

🎯 Purpose & Impact
- Allows models using SiLU activation functions to be loaded without issues.
- Users with models using SiLU activation will experience smoother integrations and upgrades.
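The change this PR describes (setting inplace=True on activation modules, including nn.SiLU, while loading a checkpoint) can be sketched as a walk over the module tree. This is a minimal illustration of the technique, not the exact attempt_load() code; the function name and the set of activation classes shown are assumptions.

```python
import torch.nn as nn


def set_activation_inplace(model: nn.Module) -> nn.Module:
    """Set inplace=True on supported activations in a loaded model.

    Older serialized checkpoints may carry activation modules whose
    inplace flag is unset or False; walking model.modules() and setting
    it explicitly restores in-place behavior after loading.
    Illustrative sketch; nn.SiLU is included per this PR.
    """
    for m in model.modules():
        if isinstance(m, (nn.Hardswish, nn.LeakyReLU, nn.ReLU, nn.ReLU6, nn.SiLU)):
            m.inplace = True
    return model
```

In-place activations avoid allocating a second tensor for the activation output, which slightly reduces memory use during inference; the compatibility loop makes that hold even for models saved before nn.SiLU was handled.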