examples/docs: caveat that PL examples don't work on TPU (#8309)
sshleifer authored Nov 9, 2020
1 parent 76e7a44 commit ebde57a
Showing 1 changed file with 2 additions and 1 deletion.
3 changes: 2 additions & 1 deletion examples/README.md
@@ -62,7 +62,8 @@ When using PyTorch, we support TPUs thanks to `pytorch/xla`. For more context and
 very detailed [pytorch/xla README](https://github.com/pytorch/xla/blob/master/README.md).
 
 In this repo, we provide a very simple launcher script named [xla_spawn.py](https://github.com/huggingface/transformers/tree/master/examples/xla_spawn.py) that lets you run our example scripts on multiple TPU cores without any boilerplate.
-Just pass a `--num_cores` flag to this script, then your regular training script with its arguments (this is similar to the `torch.distributed.launch` helper for torch.distributed).
+Just pass a `--num_cores` flag to this script, then your regular training script with its arguments (this is similar to the `torch.distributed.launch` helper for torch.distributed).
+Note that this approach does not work for examples that use `pytorch-lightning`.
 
 For example for `run_glue`:

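The hunk shown above is cut off just before the README's `run_glue` launch example. As a rough sketch only (the script path and the `run_glue.py` arguments below are illustrative placeholders, not taken from this commit), launching an example on 8 TPU cores through `xla_spawn.py` looks roughly like this:

```bash
# Illustrative sketch: the exact script location and run_glue.py arguments vary
# by transformers version; only --num_cores comes from the README text above.
python examples/xla_spawn.py --num_cores 8 \
    text-classification/run_glue.py \
    --model_name_or_path bert-base-cased \
    --task_name MRPC \
    --do_train \
    --do_eval \
    --output_dir /tmp/mrpc_tpu
```

Per the note added in this commit, this launcher route does not apply to the examples that use `pytorch-lightning`.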
