Training on TPU
Welcome to the 🤗 Optimum-TPU training guide! This section covers how to fine-tune models using Google Cloud TPUs.
Supported Models
Optimum-TPU ships training examples for models such as Gemma and LLaMA; see the Supported Models page for the current list.
Getting Started
Prerequisites
Before starting the training process, ensure you have:
- A configured Google Cloud TPU instance (see the Deployment Guide)
- Optimum-TPU installed with PyTorch/XLA support:
```bash
# Install Optimum-TPU with PyTorch/XLA support
pip install optimum-tpu -f https://storage.googleapis.com/libtpu-releases/index.html
# Point PyTorch/XLA at the TPU via the PJRT runtime
export PJRT_DEVICE=TPU
```
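Once the environment variable is set, you can verify that PyTorch/XLA can reach the TPU. Below is a minimal sanity-check sketch using the standard `torch_xla` API; the exact device string it prints (e.g. `xla:0`) depends on your runtime.

```python
# Minimal TPU sanity check via the public torch_xla API.
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()  # resolves to the TPU when PJRT_DEVICE=TPU
print(f"XLA device: {device}")

# Run a small op on the device to confirm compilation and execution work;
# printing the result forces the lazy XLA graph to execute.
x = torch.randn(2, 2, device=device)
print(x @ x)
```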
Example Training Scripts
We provide several example scripts to help you get started:
Gemma Fine-tuning:
- See our Gemma fine-tuning notebook for a step-by-step guide; a condensed sketch of the training flow follows below
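To give a feel for the overall flow, here is a condensed, illustrative sketch of causal-LM fine-tuning on TPU using the standard `transformers` Trainer, which picks up the XLA device automatically once PJRT_DEVICE is set. The checkpoint, dataset, and hyperparameters below are placeholder assumptions, not the notebook's exact settings, and the notebook covers TPU-specific optimizations omitted here.

```python
# Illustrative fine-tuning sketch; the model, dataset, and hyperparameters
# are assumptions for demonstration, not the notebook's exact settings.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_id = "google/gemma-2b"  # assumed checkpoint; swap in the one you use
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Small example dataset; replace with your own corpus.
dataset = load_dataset("Abirate/english_quotes")

def tokenize(batch):
    return tokenizer(batch["quote"], truncation=True, max_length=128)

train_data = dataset["train"].map(
    tokenize, batched=True, remove_columns=dataset["train"].column_names
)

args = TrainingArguments(
    output_dir="./gemma-finetuned",
    per_device_train_batch_size=8,   # tune to your TPU's memory
    num_train_epochs=1,
    learning_rate=2e-5,
    optim="adafactor",               # memory-friendly optimizer choice
    logging_steps=10,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```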
LLaMA Fine-tuning:
- Check our LLaMA fine-tuning script for detailed instructions; a LoRA setup sketch follows below
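Fine-tuning a full LLaMA checkpoint can be memory-intensive, so a common variant is parameter-efficient fine-tuning with LoRA: wrap the base model in an adapter before handing it to the trainer. Below is a sketch using the `peft` library; the model id, adapter rank, and target modules are illustrative assumptions, not the script's settings.

```python
# LoRA setup sketch with the peft library; the model id and adapter
# hyperparameters are illustrative assumptions, not the script's values.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-2-7b-hf")

lora_config = LoraConfig(
    r=8,                                  # adapter rank (assumption)
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # LLaMA attention projections
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # only adapter weights remain trainable
# The wrapped model can then be passed to Trainer as in the sketch above.
```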