
Training on TPU

Welcome to the 🤗 Optimum-TPU training guide! This section covers how to fine-tune models using Google Cloud TPUs.

Supported Models

See Supported Models.

Getting Started

Prerequisites

Before starting the training process, ensure you have:

  1. A configured Google Cloud TPU instance (see Deployment Guide)
  2. Optimum-TPU installed with PyTorch/XLA support (a quick sanity check follows this list):

     pip install optimum-tpu -f https://storage.googleapis.com/libtpu-releases/index.html
     export PJRT_DEVICE=TPU
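
After installation, you can confirm that PyTorch/XLA sees the TPU. The snippet below is a minimal sanity check, not part of the official guide; it only assumes torch and torch_xla are installed and PJRT_DEVICE=TPU is exported:

```python
# Minimal sanity check (a sketch): confirm PyTorch/XLA can reach the TPU.
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()        # resolves to the TPU when PJRT_DEVICE=TPU
x = torch.randn(2, 2, device=device)
y = x @ x                       # a tiny computation executed on the TPU
print(device, y.sum().item())   # expect a device such as xla:0
```

If this prints an XLA device and a number without errors, the instance is ready for training.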

Example Training Scripts

We provide several example scripts to help you get started (a generic sketch of such a script follows this list):

  1. Gemma Fine-tuning: see the Gemma fine-tuning tutorial for a step-by-step walkthrough.

  2. LLaMA Fine-tuning: see the LLaMA fine-tuning tutorial for detailed instructions.
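
While those tutorials are the authoritative references, the sketch below shows the general shape of a fine-tuning script on TPU. It uses the standard transformers Trainer, which runs on XLA devices when PJRT_DEVICE=TPU is set. The model name, dataset, and hyperparameters are placeholder assumptions, and the official tutorials may use additional Optimum-TPU helpers (for example, FSDP v2 utilities) that are not shown here:

```python
# Illustrative fine-tuning sketch; model, dataset, and hyperparameters are
# placeholders, not the official tutorial's choices.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_id = "google/gemma-2b"  # placeholder; any supported causal LM works
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Placeholder dataset: a small public text corpus, tokenized for causal LM.
dataset = load_dataset("Abirate/english_quotes", split="train")

def tokenize(batch):
    return tokenizer(batch["quote"], truncation=True, max_length=128)

tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="./finetuned-model",
        per_device_train_batch_size=8,
        num_train_epochs=1,
        optim="adafactor",  # a memory-friendly optimizer choice on TPU
        logging_steps=10,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

Adafactor is chosen here only because it keeps optimizer memory low on TPU; the tutorials may configure sharding and optimizers differently.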