
example_input_array dtype #2286

Closed
hjalmarlucius opened this issue Jun 19, 2020 · 4 comments · Fixed by #2510
Labels: bug (Something isn't working), discussion (In a discussion stage)

Comments

@hjalmarlucius commented Jun 19, 2020

Currently it is assumed that the example_input_array dtype is equal to the model dtype. This is not necessarily correct, e.g. if the input is a vector of ints.

https://github.com/PyTorchLightning/pytorch-lightning/blob/7dc58bd286b1e81ca4d293f05bddff5e93361020/pytorch_lightning/core/memory.py#L192
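
For context, the summary's forward pass behaves roughly like the following sketch (simplified, not the exact library code): the example input is moved to the model's device and cast to the model's dtype before being fed through.

```python
import torch

def _forward_example_input_sketch(model):
    """Hedged sketch of the summary's forward pass, not the actual library code."""
    input_ = model.example_input_array
    # The problematic step: the input is cast to the model's dtype, so an
    # integer example_input_array silently becomes a float tensor.
    input_ = input_.to(device=model.device, dtype=model.dtype)
    with torch.no_grad():
        model(input_)
```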

@awaelchli (Contributor) commented Jun 23, 2020

Hi, I don't understand. Does it throw an error or does it display nothing? Could you clarify?
I don't think we can very accurately define the "input shape" for anything other than tensors.
For this reason we exclude things like dicts from the overview, because it is not very practical to visualize this in a table.

awaelchli added the 'discussion (In a discussion stage)' label on Jun 23, 2020
awaelchli self-assigned this on Jun 23, 2020
@hjalmarlucius (Author) commented

Hi, currently the model is run with input_ as its input. If the model expects a tensor of ints, then it will crash when floats come in. I encountered this issue when pretraining an ALBERT-like model, which takes token indices as inputs; these have to be integers since they go into an nn.Embedding.
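
To make the failure concrete, here is a minimal standalone reproduction (illustrative sizes, not the reporter's actual model): nn.Embedding only accepts integer index tensors, so casting the indices to the model's float dtype breaks the lookup.

```python
import torch
import torch.nn as nn

embedding = nn.Embedding(num_embeddings=1000, embedding_dim=64)
token_ids = torch.randint(0, 1000, (2, 16))  # integer token indices

embedding(token_ids)  # works: indices are int64

try:
    # Simulates the summary's cast to the model dtype (float32):
    embedding(token_ids.float())
except RuntimeError as err:
    # nn.Embedding requires integer indices, so this raises.
    print(err)
```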

@awaelchli (Contributor) commented

Okay, I see, so we should not change the dtype given by example_input_array.
I can't recall why I added this conversion; maybe it was because of AMP and the half-precision conversions. I'll have a closer look, thanks for bringing it up.

@awaelchli (Contributor) commented

As a workaround until it is fixed, cast your input to int before feeding it to the embedding layer, or don't use example_input_array.
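
A minimal sketch of that workaround (illustrative module and sizes, assuming the usual pytorch_lightning import; not the reporter's actual code): cast the incoming tensor back to an integer dtype at the top of forward, so the summary's float cast becomes harmless.

```python
import torch
import torch.nn as nn
import pytorch_lightning as pl

class TokenModel(pl.LightningModule):
    def __init__(self, vocab_size=1000, dim=64):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, dim)
        # Integer token indices as the example input used for the summary.
        self.example_input_array = torch.randint(0, vocab_size, (2, 16))

    def forward(self, token_ids):
        # Workaround: undo any float cast applied before the summary's forward pass.
        token_ids = token_ids.long()
        return self.embedding(token_ids)
```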
