TPU Research Cloud

Accelerate your cutting-edge machine learning research with free Cloud TPUs.

The TPU Research Cloud (TRC) program was previously known as the TensorFlow Research Cloud (TFRC). Anyone enrolled in TFRC is automatically included in the TRC program.

Learn more about the TRC program

TRC enables researchers to apply for access to a cluster of more than 1,000 Cloud TPUs. Researchers accepted into the TRC program have access to Cloud TPU v2 and v3 devices at no charge and can use a variety of frameworks, including TensorFlow, PyTorch, Julia, and JAX, to accelerate the next wave of open research breakthroughs.
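As a small illustration of the framework flexibility mentioned above, here is a minimal JAX sketch: the same jit-compiled function runs unchanged whether XLA targets a Cloud TPU (as on a TRC-provided VM), a GPU, or a plain CPU. The function and array shapes here are arbitrary examples, not part of the TRC program itself.

```python
import jax
import jax.numpy as jnp

# jax.jit compiles this function with XLA for whichever accelerator is
# available -- a Cloud TPU on a TRC VM, or GPU/CPU elsewhere. No code
# changes are needed to move between backends.
@jax.jit
def predict(w, x):
    # A toy dense layer with a tanh activation, purely for illustration.
    return jnp.tanh(jnp.dot(x, w))

key = jax.random.PRNGKey(0)
w = jax.random.normal(key, (128, 128))
x = jnp.ones((8, 128))

out = predict(w, x)
print(out.shape)  # (8, 128)
```

The point of the sketch is portability: under TRC you would provision a TPU VM and run exactly this code, and XLA handles the device-specific compilation.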

Participants in the TRC program are expected to share their TRC-supported research with the world through peer-reviewed publications, open-source code, blog posts, or other means. They should also be willing to share detailed feedback with Google to help us improve the TRC program and the underlying Cloud TPU platform over time. In addition, participants accept Google's Terms and Conditions, acknowledge that their information will be used in accordance with our Privacy Policy, and agree to conduct their research in accordance with Google's AI Principles.

Machine learning researchers around the world have done amazing things with the limited computational resources they currently have available. We'd like to empower researchers from many different backgrounds to think even bigger and tackle exciting new challenges that would be inaccessible otherwise.

Use Cloud TPUs for free, right in your browser

If you’d like to get started with Cloud TPUs right away, you can access them for free in your browser using Google Colab. Colab is a Jupyter notebook environment that requires no setup to use.
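Once a notebook is running, you can check which backend it is attached to. Below is a hedged sketch using JAX (one of the frameworks named above); in a Colab notebook with the TPU runtime selected, `jax.default_backend()` reports `'tpu'`, while on an ordinary machine it falls back to `'cpu'` or `'gpu'`.

```python
import jax

# Report the backend the runtime is attached to. In a Colab TPU runtime
# this prints 'tpu'; without an accelerator it prints 'cpu'.
backend = jax.default_backend()
print(f"Running on: {backend}")

# Each TPU board exposes multiple cores, so the device count on a TPU
# runtime is typically greater than one.
print(f"Device count: {jax.device_count()}")
```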

We’re excited to help researchers and students everywhere expand the machine learning frontier by making Cloud TPUs available for free.

To get started, try one of these TPU-compatible notebook examples:


Hello, TPU in Colab

Fashion MNIST with Keras and TPUs

Predict Shakespeare with Cloud TPUs and Keras

BERT on Cloud TPUs (state-of-the-art NLP: paper, code)

Cloud TPUs: Built to train and run ML models

Cloud TPU hardware accelerators are designed from the ground up to speed up the training and serving of machine learning models. The TPU Research Cloud (TRC) provides researchers with access to more than 1,000 Cloud TPUs, each of which delivers up to 180 teraflops (v2) or 420 teraflops (v3) of ML acceleration.