Domain Specific Pre-training Model #4
Hi, the run_pretraining.py script is exactly the same as the one in https://github.com/google-research/bert, and you can get help from there. We used our modified version of the script (which is not shared) to handle multi-gpu and server-specific issues for saving the models, so the result might be quite different from what you'll get using the original script. Thank you.
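For orientation, a typical invocation of the upstream run_pretraining.py looks like the following. All file paths and hyperparameter values below are placeholders for illustration, not the settings used for this model:

```shell
python run_pretraining.py \
  --input_file=/path/to/tf_examples.tfrecord \
  --output_dir=/path/to/pretraining_output \
  --do_train=True \
  --do_eval=True \
  --bert_config_file=/path/to/bert_config.json \
  --init_checkpoint=/path/to/bert_model.ckpt \
  --train_batch_size=32 \
  --max_seq_length=128 \
  --max_predictions_per_seq=20 \
  --num_train_steps=100000 \
  --learning_rate=2e-5
```

Passing `--init_checkpoint` continues pre-training from an existing BERT checkpoint (domain-adaptive pre-training); omitting it trains from scratch.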
Hi, Thanks
Hi, is there any way to load this model with tensorflow_hub's hub.Module()? If not, how can we use the model to get the embeddings? Please suggest a way forward.
Hi, @abhinandansrivastava,
Hi @Sriyella,
If it's not related to the pre-trained weights of BioBERT, please report BioBERT-related issues at https://github.com/dmis-lab/biobert, or BERT-related issues at https://github.com/google-research/bert.
I am also looking for the same. How can I use the pre-trained weights to get the embeddings?
This might help! |
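If you only need contextual embeddings from a checkpoint, the upstream google-research/bert repo ships an extract_features.py script that writes per-token activations from selected layers to a JSON-lines file. A sketch with placeholder paths (the vocab.txt, bert_config.json, and checkpoint must all come from the same model):

```shell
python extract_features.py \
  --input_file=input.txt \
  --output_file=embeddings.jsonl \
  --vocab_file=/path/to/vocab.txt \
  --bert_config_file=/path/to/bert_config.json \
  --init_checkpoint=/path/to/bert_model.ckpt \
  --layers=-1,-2,-3,-4 \
  --max_seq_length=128
```

`--layers=-1,-2,-3,-4` requests the last four Transformer layers; many users average or concatenate them per token.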
Hi @jhyuklee, do we need to create our own vocab.txt after pre-training a domain-specific model? The model saved after the pre-training process does not include vocab.txt or bert_config.json files. If yes, then how? Thanks
You don't have to create your own vocab.txt if you used the same vocab.txt and bert_config.json during pre-training. See #1. Thanks.
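The reason the original vocab.txt must be kept with the checkpoint is that the model's embedding matrix is indexed by that exact WordPiece vocabulary; tokenizing with a different vocabulary would map text to the wrong rows. A toy sketch of greedy longest-match-first WordPiece tokenization (toy vocabulary and simplified algorithm for illustration, not BERT's actual implementation) shows how the output depends entirely on the vocabulary:

```python
def wordpiece_tokenize(word, vocab, unk="[UNK]"):
    """Greedy longest-match-first split of a single word into WordPieces."""
    tokens, start = [], 0
    while start < len(word):
        end, cur = len(word), None
        # Try the longest remaining substring first, shrinking from the right.
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub  # non-initial pieces carry the '##' prefix
            if sub in vocab:
                cur = sub
                break
            end -= 1
        if cur is None:
            return [unk]  # no piece matched: the whole word becomes [UNK]
        tokens.append(cur)
        start = end
    return tokens

# Toy vocabulary (hypothetical; a real vocab.txt has ~30k entries).
vocab = {"bio", "##bert", "##medical", "em", "##bed", "##ding", "##s"}

print(wordpiece_tokenize("biobert", vocab))     # -> ['bio', '##bert']
print(wordpiece_tokenize("embeddings", vocab))  # -> ['em', '##bed', '##ding', '##s']
print(wordpiece_tokenize("xyz", vocab))         # -> ['[UNK]']
```

Swapping in a different vocabulary changes both the token IDs and the segmentation, which is why the checkpoint, vocab.txt, and bert_config.json must always travel together.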
Embedding-related issues are tracked at dmis-lab/biobert#23. Closing this issue.
Hi @jhyuklee,
Hi @abhinandansrivastava,
Hi,
I have run the run_pretraining.py script on my domain-specific data.
It seems that only checkpoints are saved; I got two files, 0000020.params and 0000020.states.
How can I save the model, or build a model from the .params and .states files in the checkpoint folder, so that I can use it to get contextual embeddings?
Can someone please help me with this?