[Memory Leak] Revert "[TE][Fix] Comparison of the output tensor" #10540
Conversation
I agree that the PR should be reverted. To compare two tensors, we need to use deep comparison, the …
cc @leeexyz
@AndrewZhaoLuo Sorry for the issue I introduced; I did not think it through. I am just wondering whether the original idea here also has a cyclic dependency. @tqchen I agree with your suggestion that using … Thanks for your kind help.
I'm sorry that I didn't find the problem during review. Please fix the CI and merge it as soon as possible.
@leeexyz In the old design, the tensor has a reference to the operator, but the operator does not have a reference to the tensor, so we do not have cyclic references.
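The one-directional design described above can be sketched with a small Python example (the `Operation` and `Tensor` classes here are hypothetical stand-ins, not TVM's actual API). With only a `Tensor -> Operation` reference, plain reference counting is enough to free both objects:

```python
import gc
import weakref

# Hypothetical stand-ins for TVM's Operation and Tensor objects.
class Operation:
    pass

class Tensor:
    def __init__(self, op):
        self.op = op  # one-directional reference: Tensor -> Operation only

gc.disable()  # simulate a pure reference-counting runtime (no cycle collector)

op = Operation()
t = Tensor(op)
t_ref = weakref.ref(t)
op_ref = weakref.ref(op)

del t                       # tensor refcount drops to zero -> freed immediately
freed_tensor = t_ref() is None

del op                      # now nothing references the operation either
freed_op = op_ref() is None

gc.enable()
```

Because no cycle exists, every object is freed the moment its last reference goes away, even with the cycle collector disabled.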
This reverts commit 73cf51b.
Reverts #9829
The above behavior avoids having TE output `Tensor`s change in address every time by caching the tensors instead of reallocating them on every call. However, in doing so it introduces a cyclic dependency: each `Operation` has a list of `Tensor`s, and each `Tensor` has a reference back to the `Operation`. As a result, each will always hold a reference to the other and will never be deleted under TVM's reference-counting system.

Possible alternatives:
I'm reverting for now though.
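The leak described above can be reproduced in miniature with Python (again using hypothetical stand-in classes, not TVM's API). Python normally has a cycle collector, so we disable it to mimic a pure reference-counting runtime like TVM's:

```python
import gc
import weakref

# Hypothetical stand-ins for TVM's Operation and Tensor objects.
class Operation:
    def __init__(self):
        self.outputs = []   # Operation -> list of Tensors (the cached outputs)

class Tensor:
    def __init__(self, op):
        self.op = op        # Tensor -> Operation (back-reference)

gc.disable()  # simulate a pure reference-counting runtime (no cycle collector)

op = Operation()
t = Tensor(op)
op.outputs.append(t)        # cycle: op -> t -> op

op_ref = weakref.ref(op)
del op, t                   # drop the only external references

# Under pure refcounting the cycle keeps both objects alive: a memory leak.
leaked = op_ref() is not None

gc.enable()
gc.collect()                # Python's cycle collector breaks the cycle; TVM has no equivalent
collected = op_ref() is None
```

After the external references are dropped, the pair is still alive because each object keeps the other's refcount above zero; only Python's cycle collector (which TVM's runtime lacks) reclaims them.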
Thanks to @mbrookhart for help in finding this memory error!