Is there an intuitive way to think about predicting a center word's context from the center word itself? CBOW, which predicts the center word from its context, is easy to picture and understand, but I can't come up with a similarly intuitive mental model for skip-gram. Any thoughts or reading suggestions?
To get the discussion going, I'll take a stab at answering my own question. Here's an analogy: being given a handful of words and asked to expand them into a sentence is roughly what skip-gram does, except the analogy has no notion of a window and involves more than one center word. A related model is skip-thought, which is used to train sentence vectors: given a sentence, it generates the preceding sentence and the following one.
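To make the directional difference concrete, here is a minimal sketch (not from the original thread; the sentence and window size are made up for illustration) showing how CBOW and skip-gram slice the same text into training examples. CBOW builds many-to-one pairs (context → center), while skip-gram explodes each position into several one-to-one pairs (center → one context word):

```python
sentence = "the quick brown fox jumps over the lazy dog".split()
window = 2  # hypothetical window size for illustration

cbow_pairs = []      # (context words, center word): many-to-one
skipgram_pairs = []  # (center word, one context word): one-to-many

for i, center in enumerate(sentence):
    # collect the words within `window` positions of the center word
    context = [sentence[j]
               for j in range(max(0, i - window), min(len(sentence), i + window + 1))
               if j != i]
    cbow_pairs.append((context, center))
    skipgram_pairs.extend((center, c) for c in context)

print(cbow_pairs[3])       # (['quick', 'brown', 'jumps', 'over'], 'fox')
print(skipgram_pairs[:4])  # [('the', 'quick'), ('the', 'brown'),
                           #  ('quick', 'the'), ('quick', 'brown')]
```

Both models see the same co-occurrence information; skip-gram just turns each window into several independent prediction tasks, which is one way to read the "expand a word into its surroundings" intuition above.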