update detect.py in order to support torch script #5109

Conversation
This change assumes the torchscript file was previously saved with `export.py`
👋 Hello @andreiionutdamian, thank you for submitting a 🚀 PR! To allow your work to be integrated as seamlessly as possible, we advise you to:
- ✅ Verify your PR is up-to-date with origin/master. If your PR is behind origin/master an automatic GitHub actions rebase may be attempted by including the /rebase command in a comment body, or by running the following code, replacing 'feature' with the name of your local branch:
git remote add upstream https://github.com/ultralytics/yolov5.git
git fetch upstream
git checkout feature # <----- replace 'feature' with local branch name
git merge upstream/master
git push -u origin -f
- ✅ Verify all Continuous Integration (CI) checks are passing.
- ✅ Reduce changes to the absolute minimum required for your bug fix or feature addition. "It is not daily increase but daily decrease, hack away the unessential. The closer to the source, the less wastage there is." -Bruce Lee
Simple update for torchscript support. Assumes the torchscript file has been generated with `export.py`
@andreiionutdamian thanks for the PR! I think this covers model loading, but we need additions in a second place, inference, right? Otherwise you are attempting to use a torchscript model for normal pytorch inference, which should not be possible (probably the source of the CI failures): Lines 142 to 145 in 276b674
@glenn-jocher it should work just fine. The whole intuition is to have exactly the same functionality but in a more graph/serialized way. Tried this already on other models and it works quite nicely. So basically what happens after And one more thing: thank you sir for all your replies and real hands-on involvement.
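The claim above, that a TorchScript model reproduces the eager model's behavior at inference time, can be sketched with a toy module. The model, file name, and tolerances below are illustrative only, not YOLOv5 code; `export.py` performs the equivalent serialization step for the real model:

```python
import torch
import torch.nn as nn

# Toy stand-in model (illustrative only, not the YOLOv5 architecture)
model = nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU()).eval()

# Serialize to TorchScript, the same kind of artifact export.py produces
scripted = torch.jit.script(model)
scripted.save("toy.torchscript.pt")

# Reload the graph/serialized form and compare against eager inference
loaded = torch.jit.load("toy.torchscript.pt").eval()
x = torch.randn(1, 3, 32, 32)
with torch.no_grad():
    same = torch.allclose(model(x), loaded(x), atol=1e-6)
print(same)
```

Because the serialized graph encodes the same operations as the eager module, the loaded model can be called with the same input tensors and produces matching outputs, which is why `detect.py` needs no second change at the inference call site.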
@andreiionutdamian oh, got it! Good to know. Can you debug the updates then to resolve the CI failures? Then we should be all set. Thanks!
update `detect.py` for torchscript support
Sorry, forgot to merge the fixes for the sloppy code spacing done directly in the GitHub editor.
* update detect.py in order to support torch script — This change assumes the torchscript file was previously saved with `export.py`
* update `detect.py` for torchscript support — Simple update for torchscript support. Assumes the torchscript file has been generated with `export.py`
* Cleanup

Co-authored-by: Glenn Jocher <glenn.jocher@ultralytics.com>
This change assumes the torchscript file was previously saved with `export.py`. Based on the issue #5070
🛠️ PR Summary
Made with ❤️ by Ultralytics Actions
🌟 Summary
Added TorchScript compatibility to model loading in `detect.py`.

📊 Key Changes

`detect.py` now includes a conditional to check for 'torchscript' in the model file name. If found, it uses `torch.jit.load()` to load the model; otherwise, it uses the existing `attempt_load()` method.

🎯 Purpose & Impact
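The filename-based conditional described in the summary can be sketched as a small dispatch helper. `choose_loader` is a hypothetical name for illustration only; in `detect.py` the branch calls `torch.jit.load()` or `attempt_load()` directly rather than returning a label:

```python
def choose_loader(weights):
    # export.py writes TorchScript weights with 'torchscript' in the file name,
    # so the name alone is enough to pick the loading path
    if 'torchscript' in str(weights):
        return 'torch.jit.load'   # TorchScript archive
    return 'attempt_load'         # regular PyTorch checkpoint

print(choose_loader('yolov5s.torchscript.pt'))  # → torch.jit.load
print(choose_loader('yolov5s.pt'))              # → attempt_load
```

Keying the decision off the file name keeps the change minimal, at the cost of relying on the naming convention that `export.py` guarantees.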