Remove use of deepcopy #163
Conversation
Codecov Report
```diff
@@            Coverage Diff             @@
##           master     #163     +/-   ##
=========================================
  Coverage   95.99%   95.99%
=========================================
  Files         168       84     -84
  Lines        5144     2572   -2572
=========================================
- Hits         4938     2469   -2469
+ Misses        206      103    -103
```
Flags with carried forward coverage won't be shown.
LGTM!
What does this PR do?
After profiling some metrics, I saw that `reset` spends a lot of time calling `deepcopy`. Since the recommended method for cloning a tensor is `.detach().clone()`, and it is a bit faster (measured over 5 repetitions of 1000 forward calculations; `reset` is called once for every `forward` call, which is why such a small optimization may matter in the long run), this PR replaces `deepcopy` with `.detach().clone()`.
PR review
Anyone in the community is free to review the PR once the tests have passed.
If we didn't discuss your PR in Github issues there's a high chance it will not be merged.
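The trade-off this PR describes can be sketched with a quick timing comparison (a minimal sketch assuming PyTorch; the tensor size and iteration counts below are illustrative, not the PR's actual benchmark):

```python
import copy
import timeit

import torch

# A stand-in for a metric's internal state tensor.
state = torch.zeros(1000)

# Old approach: generic deep copy of the state.
t_deepcopy = timeit.timeit(lambda: copy.deepcopy(state), number=1000)

# New approach: the recommended way to clone a tensor.
t_clone = timeit.timeit(lambda: state.detach().clone(), number=1000)

print(f"deepcopy: {t_deepcopy:.4f}s  detach().clone(): {t_clone:.4f}s")

# Both produce an independent copy with the same values,
# so the replacement does not change reset's semantics.
assert torch.equal(copy.deepcopy(state), state.detach().clone())
```

`.detach().clone()` also avoids `deepcopy`'s generic object traversal and keeps the copy outside the autograd graph, which is what a metric reset wants.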
Did you have fun?
Make sure you had fun coding 🙃