Questions about pretraining #36

Open
Rong609 opened this issue Apr 24, 2024 · 1 comment

Comments


Rong609 commented Apr 24, 2024

Hello, I have three questions:
1. While running your code, I would like to run the downstream task directly, without pretraining, in order to compare against the effect of pretraining. Is this possible, and if so, how should I modify the code?
2. Can I use the standard bert-base-uncased as the pretrained model for the pretraining step and then run the downstream task, so as to compare it with the VLP-MABSA pretrained model?
3. I changed parameters such as mlm_enabled and mrm_enabled in the pretraining file. After pretraining finished, two folders, model0 and model40, were generated. I set --model_config and --checkpoint in the downstream script 15_pretrain_full.sh to the corresponding files (the .bin file and config.json under the model0 and model40 folders), but when I run 15_pretrain_full.sh the results do not change at all. Is my modification wrong, or is there some other reason?
(three screenshots attached)
If possible, I would appreciate it if you could take a look. Many thanks!

@lyhuohuo
Collaborator

If you don't want to use the pretrained model, just replace the checkpoint in the downstream task script with the original BART model, or simply delete the checkpoint line from the script.
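
For illustration, a minimal sketch of the two options applied to 15_pretrain_full.sh. Only the --model_config and --checkpoint flags come from this thread; the train.py entry point, the facebook/bart-base identifier, and everything else are assumptions, not the repo's actual script contents:

```sh
# Hypothetical excerpt of 15_pretrain_full.sh -- train.py and the
# facebook/bart-base path are illustrative placeholders.

# Option 1: point --checkpoint at the original BART weights instead of
# the VLP-MABSA pretrained checkpoint.
python train.py \
  --model_config facebook/bart-base \
  --checkpoint facebook/bart-base

# Option 2: remove the --checkpoint line entirely, so the downstream
# task starts from the base initialization with no pretrained weights loaded.
python train.py \
  --model_config facebook/bart-base
```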
