BERT vs Word2vec #362
Hello All,
Can you please help me out in getting similar words from the BERT model, as we do in Word2Vec?
Best regards,
Vikas

Comments
@Tina-19 does that repo just do encoding?
Yes, it provides fixed-length vectors for sentences using BERT that can be used instead of Word2Vec.
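For anyone who wants to try this directly, here is a minimal sketch of producing a fixed-length sentence vector from BERT by mean-pooling the final hidden states. It assumes the Hugging Face transformers library and bert-base-uncased, neither of which is named in this thread, so treat it as one possible approach rather than the API of the repo discussed above.

```python
# Sketch: fixed-length sentence vectors from BERT via mean pooling.
# Assumes Hugging Face `transformers` and bert-base-uncased (not
# specified in this thread).
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def sentence_vector(sentence: str) -> torch.Tensor:
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Mean-pool the last hidden states into a single fixed-length vector.
    return outputs.last_hidden_state.mean(dim=1).squeeze(0)

print(sentence_vector("I like apples").shape)  # torch.Size([768]) for bert-base
```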
Is it possible to find similar words with BERT? For example, if I search for "radar sensor companies", can I get the similar words related to that query? @Tina-19 @andrewluchen @jacobdevlin-google
I think this link can help you.
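Related to the question above: BERT has no built-in most_similar() lookup the way Word2Vec does, so one workaround is to embed a candidate vocabulary yourself and rank it against the query by cosine similarity. A rough sketch, again assuming Hugging Face transformers, with a made-up query and candidate list:

```python
# Sketch: rank candidate terms by similarity to a query using
# mean-pooled BERT vectors. The query and candidates below are
# invented for illustration; in practice you would search your own
# vocabulary of words or phrases.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(text: str) -> torch.Tensor:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs).last_hidden_state
    return out.mean(dim=1).squeeze(0)  # one fixed-length vector

query = "radar sensor companies"
candidates = ["lidar manufacturers", "semiconductor firms", "apple orchards"]

q = embed(query)
ranked = sorted(
    candidates,
    key=lambda c: torch.cosine_similarity(q, embed(c), dim=0).item(),
    reverse=True,
)
print(ranked)  # candidates ordered by cosine similarity to the query
```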
One thing to realise is that word2vec provides context-free (static) embeddings, whereas BERT gives contextualised (dynamic) embeddings. For instance, take these two sentences: "I like apples" and "I like apple MacBooks". Word2vec will give the same embedding for the word apple in both sentences, whereas BERT will give a different one depending on the context. Now, coming back to your question, here is a step-by-step tutorial I wrote on obtaining contextualised embeddings from BERT:
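To make the apple example concrete, here is a small sketch that extracts the contextualised vector of "apple" from two different sentences and compares them. The sentences are adapted slightly so the same token appears in both, and it assumes Hugging Face transformers and bert-base-uncased; the commenter's own tutorial may differ in the details.

```python
# Sketch: the same word "apple" gets different contextualised vectors
# in different sentences. Sentences adapted from the comment above so
# the identical token "apple" appears in both.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

def token_embedding(sentence: str, word: str) -> torch.Tensor:
    # Return the contextualised vector of `word`'s first occurrence.
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state.squeeze(0)
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

a = token_embedding("I like apple pie", "apple")
b = token_embedding("I like apple MacBooks", "apple")
# Same word, different contexts: cosine similarity is below 1.0,
# unlike word2vec, where both occurrences share one static vector.
print(torch.cosine_similarity(a, b, dim=0).item())
```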