🚀 Feature
It would be nice to expose a default random generator for XLA devices, just like the CPU (at torch.default_generator) and GPU (at torch.cuda.default_generators) devices in PyTorch. The default generators could live in, e.g., torch_xla.core.random.default_generators, and torch_xla.core.random could additionally provide methods to get, set, and seed the random number generator state (e.g. seed, manual_seed, get_rng_state, set_rng_state).
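For illustration, here is a minimal sketch of how the proposed module could mirror the existing CPU/GPU API. The torch and torch.cuda calls shown first exist today; the torch_xla.core.random calls are hypothetical and use only the names suggested in this request.

```python
import torch

# Existing behavior on CPU and GPU: per-device default generators whose
# state can be seeded, saved, and restored.
torch.manual_seed(42)                    # seeds torch.default_generator (CPU)
cpu_state = torch.get_rng_state()        # snapshot the CPU generator state
torch.set_rng_state(cpu_state)           # restore it later for an exact replay

if torch.cuda.is_available():
    torch.cuda.manual_seed_all(42)       # seeds torch.cuda.default_generators

# Hypothetical usage of the proposed module; the names below are taken
# from this request and do not exist yet.
import torch_xla.core.random as xrandom

xrandom.manual_seed(42)                  # seed the XLA default generators
xla_state = xrandom.get_rng_state()      # snapshot the XLA RNG state
# ... run a model on the TPU ...
xrandom.set_rng_state(xla_state)         # restore to reproduce the run
gen = xrandom.default_generators[0]      # per-device generator, as on CUDA
```

Keeping the semantics identical to torch.cuda (a sequence of per-device generators plus module-level seed/state helpers) would let existing reproducibility code work on TPUs with only an import change.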
Motivation
This would be helpful for reproducing neural network results when running on TPUs.