AT_DISPATCH_FLOATING_TYPES_AND2 fails with ScalarType::Byte #34826
Comments
You probably also need to add a ScalarTypeToCPPType mapping from Byte to uint8_t here (lines 100 to 109 in e93e7b2).
If that's not sufficient, there might be other similar mappings missing.
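For illustration, here is a minimal sketch of what such a specialization might look like, mirroring how the existing Half/Bool/BFloat16 specializations are written; the namespace, header location, and the static member are assumptions, not the actual patch:

```cpp
#include <cstdint>
#include <c10/core/ScalarType.h>

namespace c10 {
namespace impl {

// Hypothetical sketch: map ScalarType::Byte to uint8_t the same way the
// neighboring specializations map Half/Bool/BFloat16 to their C++ types.
template <>
struct ScalarTypeToCPPType<c10::ScalarType::Byte> {
  using type = uint8_t;

  // The existing specializations keep a static member as a workaround for
  // an nvcc issue with decltype() in device lambdas; mirrored here.
  static type t;
};

} // namespace impl
} // namespace c10
```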
Hi @colesbury, …
PyTorch devs: we might want to ensure that the AT_DISPATCH macros work with all the built-in scalar types.
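As a rough illustration of that suggestion, here is a hedged sketch of a compile-time smoke test; the function name and lambda body are placeholders, not an actual PyTorch test:

```cpp
#include <ATen/ATen.h>
#include <ATen/Dispatch.h>

// Hypothetical smoke test: if any ScalarTypeToCPPType mapping required by
// the macro is missing, instantiating the lambda body should fail to compile.
void dispatch_smoke_test(const at::Tensor& t) {
  AT_DISPATCH_ALL_TYPES_AND2(
      at::ScalarType::Half, at::ScalarType::Bool, t.scalar_type(),
      "dispatch_smoke_test", [&] {
        scalar_t zero = static_cast<scalar_t>(0);  // scalar_t is defined by the macro
        (void)zero;
      });
}
```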
Hi @colesbury, I'll submit the fix for …
Notes: Due to a bug in AT_DISPATCH_FLOATING_TYPES_AND2 (see pytorch#34826), I used AT_DISPATCH_ALL_TYPES_AND.
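As a hedged illustration of that workaround (the wrapper function and kernel name are placeholders), dispatching over all standard types plus Half covers Byte without going through the broken AT_DISPATCH_FLOATING_TYPES_AND2 path:

```cpp
#include <ATen/ATen.h>
#include <ATen/Dispatch.h>

// Workaround sketch: AT_DISPATCH_ALL_TYPES_AND already includes Byte among
// the standard types, so only Half needs to be added explicitly.
void upsample_dispatch_workaround(const at::Tensor& input) {
  AT_DISPATCH_ALL_TYPES_AND(
      at::ScalarType::Half, input.scalar_type(),
      "upsample_nearest2d_cuda", [&] {
        auto* data = input.data_ptr<scalar_t>();
        (void)data;  // actual kernel launch elided
      });
}
```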
…pes." Fixes: #34826 [ghstack-poisoned]
…pes." Fixes: #34826 Differential Revision: [D21476009](https://our.internmc.facebook.com/intern/diff/D21476009) [ghstack-poisoned]
🐛 Bug

Using AT_DISPATCH_FLOATING_TYPES_AND2 in CUDA code causes a compilation error. I am trying to enable uint8 support for nearest neighbor upsampling, so I replaced AT_DISPATCH_FLOATING_TYPES_AND_HALF with AT_DISPATCH_FLOATING_TYPES_AND2(at::ScalarType::Half, at::ScalarType::Byte, ...), which caused a compilation error. On the other hand, AT_DISPATCH_ALL_TYPES_AND(at::ScalarType::Half, ...) works fine.

To Reproduce
Steps to reproduce the behavior: build commit bdd7dbfd4b, which uses AT_DISPATCH_FLOATING_TYPES_AND2 as described above. A minimal sketch of the failing call pattern is shown below.
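This sketch assumes a hypothetical wrapper function; the kernel name and lambda body are placeholders rather than the actual upsampling code:

```cpp
#include <ATen/ATen.h>
#include <ATen/Dispatch.h>

// Fails to compile at the time of this report: the Byte -> uint8_t
// ScalarTypeToCPPType mapping needed by the macro was missing.
void upsample_byte_repro(const at::Tensor& input) {
  AT_DISPATCH_FLOATING_TYPES_AND2(
      at::ScalarType::Half, at::ScalarType::Byte, input.scalar_type(),
      "upsample_nearest2d_cuda", [&] {
        auto* data = input.data_ptr<scalar_t>();
        (void)data;  // actual kernel launch elided
      });
}
```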
Expected behavior

My understanding is that the code should compile, since the same dispatch using AT_DISPATCH_ALL_TYPES_AND compiles.

cc @yf225