Hi,

I noticed that in main.py, you zero out the embeddings for the special words if they are absent from the vocabulary:
# zero out the embeddings for padding and other special words if they are absent in vocab
for idx, item in enumerate([Constants.PAD_WORD, Constants.UNK_WORD, Constants.BOS_WORD, Constants.EOS_WORD]):
    emb[idx].zero_()
Is there any reason for doing so? Why not use random normal vectors?
Thanks.
As far as I remember, when initialising the embeddings I realised that the PAD_WORD embedding needs to be zeroed out. At the time I was unsure what to do with the other special words, so I left them zeroed out to start with as well. I believe you can try initialising them normally; it should be fine.
Do let me know if you get a chance to try out random normal initialization!
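For reference, a minimal sketch of what that could look like, assuming PyTorch and the same token order as the snippet in the question. The `Constants` class, vocab size, embedding dimension, and std value below are placeholders, not the repository's actual values:

```python
import torch

class Constants:
    # hypothetical stand-in for the project's Constants module
    PAD_WORD, UNK_WORD, BOS_WORD, EOS_WORD = "<pad>", "<unk>", "<s>", "</s>"

vocab_size, emb_dim = 10000, 300
emb = torch.randn(vocab_size, emb_dim)  # placeholder for the loaded embedding matrix

special_tokens = [Constants.PAD_WORD, Constants.UNK_WORD,
                  Constants.BOS_WORD, Constants.EOS_WORD]

for idx, token in enumerate(special_tokens):
    if token == Constants.PAD_WORD:
        emb[idx].zero_()                     # padding should contribute nothing
    else:
        emb[idx].normal_(mean=0.0, std=0.1)  # random-normal init for UNK/BOS/EOS
```

The PAD embedding is kept at zero either way, since padded positions should not add anything to the representation; only UNK/BOS/EOS get random vectors here.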