
Question about pretraining and special tokens #2

Open
King-of-Infinite-Space opened this issue Mar 1, 2023 · 0 comments

Comments

@King-of-Infinite-Space

Thanks for releasing this model. I hope the authors can provide more information regarding the following questions:

  1. Was this model trained in the same way as the original BERT paper, i.e., with masked language modeling (MLM) and next sentence prediction (NSP)?
  2. What was the format of input sequences used in training? Were they complete sentences (e.g. couplets)?
  3. What is the meaning of the tokens * and # in the vocabulary? (One way to list them is sketched below.)
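
For reference on question 3, here is a minimal sketch of how the vocabulary entries containing * and # could be listed, assuming the checkpoint loads with the Hugging Face transformers library (the path below is a placeholder, not the actual model name):

```python
from transformers import AutoTokenizer

# Placeholder path; substitute the directory of the released checkpoint.
tokenizer = AutoTokenizer.from_pretrained("./released-checkpoint")

# Print every vocabulary entry containing "*" or "#" together with its index,
# to see how these tokens appear (e.g. as standalone symbols or subword prefixes).
for token, idx in sorted(tokenizer.get_vocab().items(), key=lambda kv: kv[1]):
    if "*" in token or "#" in token:
        print(idx, repr(token))
```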