After I downloaded the pretrained model, I found `config.json`, `weights.th`, and a `vocabulary` folder. However,
```python
import torch
from transformers import BertModel, BertTokenizer

# Load the tokenizer and model (the original mistakenly used BertModel for both).
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
model = BertModel.from_pretrained('bert-base-uncased')

device = torch.device(f'cuda:{args.gpu_number}' if torch.cuda.is_available() else 'cpu')
torch.cuda.set_device(device)  # change allocation of current GPU

print("\n\nuse trained model")
model.load_state_dict(torch.load('weights.th', map_location='cuda:0'))
print("no error!")
```
this doesn't work. How can I load the pre-trained model?
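A common cause of this kind of failure is mismatched state-dict keys: a checkpoint saved by a training framework (such as an AllenNLP-style `weights.th`) often wraps the parameter names in a prefix that HuggingFace's `BertModel` does not expect. A minimal sketch of the general debugging pattern, inspecting the checkpoint's keys, stripping the wrapper prefix, and loading with `strict=False` (the `wrapper.` prefix and `TinyModel` here are illustrative stand-ins; check your own checkpoint's actual keys):

```python
import torch
import torch.nn as nn

# Stand-in for the real model; the same pattern applies to BertModel.
class TinyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 2)

model = TinyModel()

# Simulate a checkpoint whose parameter names carry an extra wrapper prefix.
# In practice: checkpoint = torch.load("weights.th", map_location="cpu")
checkpoint = {"wrapper." + k: v for k, v in model.state_dict().items()}

print(list(checkpoint.keys()))  # always inspect the keys first

# Strip the prefix so the keys match the target model's state dict.
remapped = {k.removeprefix("wrapper."): v for k, v in checkpoint.items()}

# strict=False reports leftover mismatches instead of raising on them.
missing, unexpected = model.load_state_dict(remapped, strict=False)
print("missing:", missing, "unexpected:", unexpected)
```

If `missing` or `unexpected` is non-empty after remapping, the remaining key names tell you exactly which prefix or naming scheme still differs between the checkpoint and the model.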
Hi, I rewrote the script to train on every MRQA dataset; it is compatible with any type of transformer model on any hardware: https://github.com/lucadiliello/mrqa-lightning