You can use PyTorch Lightning with TPUs, but running TPUs in a local environment is not straightforward. TPUs are primarily available on Google Cloud Platform (GCP), Google Colab, and Kaggle; these platforms provide the infrastructure and setup needed to use TPUs with PyTorch Lightning[1][2].
Using TPUs locally would require physical access to a TPU device, which is generally not feasible outside a cloud setup. However, you can develop and test your models locally on other accelerators, such as GPUs, before deploying them to TPUs in the cloud, as in the sketch below.
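As a concrete illustration, here is a minimal sketch of that local smoke-test workflow. The model and data names (`LitModel`, the random `TensorDataset`) are placeholders, not from the cited sources, and the example assumes a recent Lightning release where the package is imported as `lightning.pytorch` (older versions use `pytorch_lightning`):

```python
import torch
import lightning.pytorch as pl
from torch.utils.data import DataLoader, TensorDataset

class LitModel(pl.LightningModule):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(32, 2)

    def training_step(self, batch, batch_idx):
        x, y = batch
        return torch.nn.functional.cross_entropy(self.layer(x), y)

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# Placeholder data: 64 random samples with 32 features and 2 classes.
dataset = TensorDataset(torch.randn(64, 32), torch.randint(0, 2, (64,)))

# fast_dev_run pushes a single batch through the training loop, which is
# enough to catch shape and device bugs before paying for cloud TPUs;
# accelerator="auto" picks a local GPU if one is available, else the CPU.
trainer = pl.Trainer(accelerator="auto", devices=1, fast_dev_run=True)
trainer.fit(LitModel(), DataLoader(dataset, batch_size=16))
```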
If you want to use TPUs, the most common approach is to set up a project on Google Cloud and use its TPU services. PyTorch Lightning supports TPUs through the PyTorch/XLA integration, which compiles high-level PyTorch operations into operations optimized for TPU hardware[4][5]. Google Colab offers a free and accessible way to get started, providing a Jupyter notebook environment with TPU support[1][3].
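Once the model runs locally, moving to a TPU is, in principle, a change to the `Trainer` flags. This sketch reuses the hypothetical `LitModel` and `dataset` from above and assumes a TPU runtime with `torch_xla` installed; the flag names match recent Lightning releases, whereas the 1.x docs cited above used `tpu_cores=8` instead:

```python
# On a TPU runtime (Colab, Kaggle, or a Cloud TPU VM) with torch_xla
# available, the same LitModel trains unchanged; Lightning delegates
# device management to PyTorch/XLA under the hood.
trainer = pl.Trainer(accelerator="tpu", devices=8, max_epochs=1)
trainer.fit(LitModel(), DataLoader(dataset, batch_size=16))
```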
In summary, you can't easily use TPUs in a purely local environment, but cloud services like Google Colab or Google Cloud make it practical to run PyTorch Lightning on TPUs.
Citations:
[1] https://lightning.ai/docs/pytorch/1.5.9/advanced/tpu.html
[2] https://pytorch-lightning.readthedocs.io/en/1.0.8/tpu.html
[3] https://stackoverflow.com/questions/75693020/how-to-set-up-tpu-on-google-colab-for-pytorch-and-pytorch-lightning
[4] https://cloud.google.com/blog/products/ai-machine-learning/train-ml-models-with-pytorch-lightning-on-tpus
[5] https://pytorch.org/xla/release/2.2/index.html
[6] https://github.com/Lightning-AI/pytorch-lightning/issues/19274
[7] https://www.datacamp.com/tutorial/pytorch-lightning-tutorial
[8] https://github.com/Lightning-AI/lightning/issues/16464