This repository was archived by the owner on Feb 12, 2022. It is now read-only.
Model crashes under pytorch 0.4 #39
Comments
@zou3519 @Smerity I get this error once I attempt to correct the bug:
Here is how I attempted to fix the broken method:
Note that I tried this for every dimension size. P.S. You can reuse/modify/license my pasted code above without any restrictions whatsoever.
Some fixes in #43. This is how I fixed the issue with `repackage_hidden`:

```python
def repackage_hidden(h):
    """Wraps hidden states in new Tensors,
    to detach them from their history."""
    if isinstance(h, torch.Tensor):
        return h.detach()
    else:
        return tuple(repackage_hidden(v) for v in h)
```

For the issue in the embedding code, I replaced:

```python
X = embed._backend.Embedding.apply(words, masked_embed_weight,
    padding_idx, embed.max_norm, embed.norm_type,
    embed.scale_grad_by_freq, embed.sparse
)
```

with:

```python
X = F.embedding(
    words, masked_embed_weight,
    padding_idx,
    embed.max_norm, embed.norm_type,
    embed.scale_grad_by_freq, embed.sparse
)
```
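To show how the recursive detach in the fix above walks an LSTM's `(h, c)` hidden-state tuple, here is a minimal sketch. `FakeTensor` is a hypothetical stand-in for `torch.Tensor` so the example runs without PyTorch installed; the recursion itself mirrors the fixed `repackage_hidden`.

```python
# FakeTensor is a hypothetical stand-in for torch.Tensor, used so this
# sketch runs without PyTorch. detach() returns a new tensor that shares
# data but carries no autograd history, like torch.Tensor.detach().

class FakeTensor:
    def __init__(self, data, requires_grad=False):
        self.data = data
        self.requires_grad = requires_grad

    def detach(self):
        # Detached copy: same data, no gradient tracking.
        return FakeTensor(self.data, requires_grad=False)


def repackage_hidden(h, tensor_type=FakeTensor):
    """Wraps hidden states in new tensors, detaching them from history."""
    if isinstance(h, tensor_type):
        return h.detach()
    else:
        return tuple(repackage_hidden(v, tensor_type) for v in h)


# An LSTM's hidden state is an (h, c) tuple; stacked layers nest further.
hidden = (FakeTensor([1.0], requires_grad=True),
          FakeTensor([2.0], requires_grad=True))
detached = repackage_hidden(hidden)
print(all(not t.requires_grad for t in detached))  # True
```

The `isinstance` check (rather than an exact `type(h) ==` comparison) is what lets the same function accept both a bare tensor and arbitrarily nested tuples of tensors.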
It works, thanks @shawntan!
Thanks for this! Let me look at this carefully and merge it once I run some tests.
Hi,
The folks over at pytorch are working on cutting a new 0.4 release. We'd like to make the transition as smooth as possible (if you were planning on upgrading), so we've been testing a number of community repos.
I ran a model and it errors out due to a change in pytorch. Minimal repro:
Stack trace: https://gist.github.com/zou3519/142d48df1c03db9fe9c11717ad9a59f2
PyTorch 0.4 adds zero-dimensional tensors that cannot be iterated over, which seems to be what the error is complaining about. Changing this line should fix it:

awd-lstm-lm/utils.py, line 8 (in f2e8867)
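A hypothetical reconstruction of the failure mode (not the actual utils.py code): the pre-0.4 helper dispatched on an exact type check against `Variable`, and anything that failed the check fell into the tuple branch and was iterated. Once 0.4 merged `Variable` into `Tensor`, a zero-dimensional tensor misses the type check and the iteration raises. The stand-in classes below are assumptions so the sketch runs without PyTorch.

```python
# OldVariable plays the role of the pre-0.4 torch.autograd.Variable;
# ZeroDimTensor plays the role of a PyTorch 0.4 zero-dimensional
# tensor, which is not iterable. Both are stand-ins for illustration.

class OldVariable:
    pass

class ZeroDimTensor:
    def detach(self):
        return self

def old_repackage(h):
    # Pre-0.4 style: exact type check, then assume anything else is a
    # tuple of hidden states and iterate over it.
    if type(h) == OldVariable:
        return h  # (re-wrapping detail elided)
    else:
        return tuple(old_repackage(v) for v in h)

try:
    old_repackage(ZeroDimTensor())
except TypeError as e:
    # Iterating a non-iterable object raises TypeError, mirroring the
    # crash on 0.4's zero-dimensional tensors.
    print("TypeError:", e)
```

This is why the merged fix above switches to `isinstance(h, torch.Tensor)`: the tensor is caught and detached before the code ever tries to iterate it.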
cc @soumith