Fix complex tensor printing #38031
Conversation
[ghstack-poisoned]
torch/_tensor_str.py (outdated):

```
@@ -280,7 +280,9 @@ def _str(self):
            or (self.device.type == 'cuda' and torch.cuda.current_device() != self.device.index):
        suffixes.append('device=\'' + str(self.device) + '\'')

    has_default_dtype = self.dtype in (torch.get_default_dtype(), torch.int64, torch.bool)
    # TODO: add an API to map real -> complex dtypes
    _default_complex_dtype = torch.cfloat if torch.get_default_dtype() == torch.float else torch.cdouble
```
So if the default dtype is half, this maps to cdouble?
Good point! No, it should be cdouble if the default is double, else cfloat, similar to the logic here.
Will update! Updated.
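The correction agreed on above can be sketched as follows. This is a hypothetical standalone illustration, not PyTorch's actual helper (which operates on `torch.dtype` objects); dtype names are plain strings here so the sketch runs without PyTorch installed. The original diff mapped every non-float default to cdouble, while the corrected logic promotes to cdouble only when the default is double.

```python
# Hypothetical sketch of the real -> complex default-dtype mapping
# discussed above, using dtype names as strings for illustration only.

def buggy_default_complex_dtype(default_real: str) -> str:
    # Original diff: anything other than float falls through to cdouble,
    # so a default of half would wrongly map to cdouble.
    return "cfloat" if default_real == "float" else "cdouble"

def fixed_default_complex_dtype(default_real: str) -> str:
    # Reviewer's correction: only double promotes to cdouble;
    # float, half, etc. all map to cfloat.
    return "cdouble" if default_real == "double" else "cfloat"

print(buggy_default_complex_dtype("half"))    # -> cdouble (wrong)
print(fixed_default_complex_dtype("half"))    # -> cfloat
print(fixed_default_complex_dtype("double"))  # -> cdouble
```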
💊 CI failures summary and remediations

As of commit 8b2acc1 (more details on the Dr. CI page):

Extra GitHub checks: 1 failed
- ci.pytorch.org: 1 failed

This comment was automatically generated by Dr. CI and has been revised 6 times.
Test please? :)
Differential Revision: [D21502915](https://our.internmc.facebook.com/intern/diff/D21502915)
@anjali411 merged this pull request in 375ddb0.
Stack from ghstack:
Differential Revision: D21502915