
question about position embedding. #24

Open
Jingyilang opened this issue Dec 7, 2018 · 1 comment

@Jingyilang

Thanks for your code, but I have a question about the implementation of the position embedding. It seems the position encoding is randomly initialized and updated during training, just like the token embeddings. What confuses me is: how does this approach learn specific position information?

@liuliu8622 commented May 13, 2019

This is a little complex to explain, although it is my research field (I am not the author of the project, by the way).
Roughly, the model learns how position matters by observing what role the word at a given position plays across many different sentences, and it stores that information in the hidden_dim trainable floats assigned to that position.
I ran into quite a few problems while debugging this project, but it works in the end; I will write them up separately in another post.
Although I have complained about the trouble this project caused me, I still have to commend the author's work. Thank you for the big help your project has been.
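
For anyone with the same question, here is a minimal, hypothetical sketch (PyTorch-style, not the project's actual code) of a learned position embedding: a second embedding table indexed by position, randomly initialized and trained by backpropagation exactly like the token embedding table. The class and argument names (`LearnedPositionalEmbedding`, `max_len`, `hidden_dim`) are illustrative.

```python
import torch
import torch.nn as nn

class LearnedPositionalEmbedding(nn.Module):
    def __init__(self, vocab_size, max_len, hidden_dim):
        super().__init__()
        # Token embedding table: one trainable vector per vocabulary item.
        self.token_emb = nn.Embedding(vocab_size, hidden_dim)
        # Position embedding table: one trainable vector of hidden_dim floats
        # per position index, randomly initialized like any other weight.
        self.pos_emb = nn.Embedding(max_len, hidden_dim)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len)
        seq_len = token_ids.size(1)
        positions = torch.arange(seq_len, device=token_ids.device)
        # Sum token and position embeddings. Gradients flow into both tables,
        # so each position vector is shaped by how words at that position
        # behave across the training data.
        return self.token_emb(token_ids) + self.pos_emb(positions)
```

The position vectors carry no built-in notion of order; they only become meaningful because the same vector is added at the same position in every sentence, so the training signal pushes it toward whatever representation helps the model at that slot.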
