To install PyTorch/XLA on Python 3.9, you can follow these steps:
1. Ensure Python 3.9 is Supported: PyTorch/XLA currently supports Python versions up to 3.11, so Python 3.9 falls within the supported range[3][7].
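As a quick sanity check before installing anything, you can confirm which interpreter is active (a minimal sketch; the 3.9–3.11 range is taken from the compatibility note above):

```python
import sys

# torch_xla wheels target a bounded range of CPython versions;
# the 3.9-3.11 window below is assumed from the compatibility note above.
major, minor = sys.version_info[:2]
print(f"Running Python {major}.{minor}")
if not (3, 9) <= (major, minor) <= (3, 11):
    print("Warning: there may be no matching torch_xla wheel for this interpreter")
```

If you manage multiple interpreters, run this with the same `python3` that you will use for `pip install`.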
2. Install PyTorch: First, install a PyTorch build compatible with Python 3.9 using pip:

```bash
pip install torch~=2.6.0 torchvision
```
3. Install PyTorch/XLA: Next, install PyTorch/XLA. If you are using TPUs, you can install the TPU-specific version:
```bash
pip install 'torch_xla[tpu]~=2.6.0' \
  -f https://storage.googleapis.com/libtpu-releases/index.html \
  -f https://storage.googleapis.com/libtpu-wheels/index.html
```
If pip reports that it cannot find a matching version, make sure pip itself is up to date and that the version specifier and both index URLs are spelled correctly.
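A common cause of "no matching distribution found" errors[3] is a platform mismatch rather than a typo: torch_xla is generally published as Linux x86_64 wheels (an assumption based on the PyPI listing, not official documentation), so a quick environment report can save debugging time:

```python
import platform
import sys

# Print the attributes pip uses when selecting a wheel.
# The Linux/x86_64 expectation below is an assumption, not official doc.
print("OS:", platform.system())
print("Arch:", platform.machine())
print("Python:", f"{sys.version_info.major}.{sys.version_info.minor}")
if platform.system() != "Linux":
    print("Note: torch_xla wheels may not exist for this OS")
```

On a TPU VM this should report Linux/x86_64; anywhere else, a missing wheel (rather than a bad URL) may be the real problem.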
4. Optional: Install Pallas Dependencies: If you need custom kernels, you might also want to install Pallas dependencies:
```bash
pip install 'torch_xla[pallas]' \
  -f https://storage.googleapis.com/jax-releases/jax_nightly_releases.html \
  -f https://storage.googleapis.com/jax-releases/jaxlib_nightly_releases.html
```
5. Verify Installation: After installation, verify that PyTorch/XLA is working by creating an XLA tensor:
```python
import torch
import torch_xla.core.xla_model as xm

# Allocate a tensor directly on the XLA device.
t = torch.randn(2, 2, device=xm.xla_device())
print(t.device)  # should report an XLA device, e.g. xla:0
print(t)
```
This should help you set up PyTorch/XLA on Python 3.9 for use with TPUs or other XLA devices.
Citations:
[1] https://pypi.org/project/torch-xla/
[2] https://pytorch.org/xla/master/gpu.html
[3] https://stackoverflow.com/questions/79314991/why-am-i-getting-no-matching-distribution-found-for-torch-xla-2-5-0-when-inst
[4] https://pytorch.org/xla/release/r2.4/index.html
[5] https://pytorch.org/get-started/locally/
[6] https://github.com/pytorch/xla
[7] https://github.com/pytorch/xla/issues/3662
[8] https://forums.developer.nvidia.com/t/error-pytorch-2-1-0-with-python-3-9-wheel-file-not-supported-on-this-platform-error/290713