Tried to run:
!OMP_NUM_THREADS=8 ./train_gpt2
and it failed with the following output:
[GPT-2]
max_seq_len: 1024
vocab_size: 50257
num_layers: 12
num_heads: 12
channels: 768
num_parameters: 124439808
train dataset num_batches: 1192
val dataset num_batches: 128
num_activations: 73323776
val loss 5.252006
step 0: train loss 5.356172 (took 26772.736942 ms)
Error: must forward with targets before backward
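For readers hitting the same message: the sketch below is a generic, self-contained illustration of the check this error implies, not the actual train_gpt2 source. It assumes (as an illustration only) a model that records a mean loss only when the forward pass is given targets, and a backward pass that refuses to run and prints this error when no loss has been computed.

// Hypothetical sketch, not llm.c code: why "must forward with targets before backward" can fire.
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    float mean_loss;   // sentinel -1.0f means "no loss has been computed yet"
} Model;

void model_forward(Model *m, const int *inputs, const int *targets) {
    (void)inputs;              // the real code would run the network on inputs here
    if (targets != NULL) {
        m->mean_loss = 1.0f;   // placeholder standing in for the cross-entropy loss
    } else {
        m->mean_loss = -1.0f;  // inference-only forward: nothing to backpropagate
    }
}

void model_backward(Model *m) {
    if (m->mean_loss == -1.0f) {
        printf("Error: must forward with targets before backward\n");
        exit(1);
    }
    // ... backpropagate d(mean_loss)/d(params) ...
}

int main(void) {
    Model m = { .mean_loss = -1.0f };
    int inputs[4] = {0}, targets[4] = {0};
    model_forward(&m, inputs, targets);  // targets present, so the loss is set
    model_backward(&m);                  // ok; without targets it would exit with the error above
    return 0;
}

In other words, the backward pass needs a loss to differentiate, so the forward call immediately before it must have been given targets.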
This is already being discussed in #19; there are a few workarounds for this issue at the moment.