How does the TRC program work?
Anyone can apply for the program by signing up at sites.research.google/trc. Invitations are sent out to approved applicants on a rolling basis. When an invitation is accepted, free Cloud TPU quota is granted to the invitee's Google Cloud Platform project on a temporary basis and is ready to use within minutes.
Is it really free?
While Cloud TPUs are free to use for TRC participants, other GCP services are not. Participants can expect to use small VM instances (n1-standard-2) to drive their TPUs as well as Google Cloud Storage (GCS) buckets to hold training data. These costs are generally minimal. If you have questions about TRC-related charges, please contact us.
What are the program requirements?
Participants in the TRC program are expected to:
- Share their TRC-supported research with the world through peer-reviewed publications, open source code, blog posts, or other means
- Share detailed feedback with Google to help us improve the TRC program and the underlying Cloud TPU platform over time
- Agree to conduct their research in accordance with the Google AI Principles
- Accept Google's Terms and Conditions
I need time to prepare - can I defer my free TPU start date?
Once you've been accepted to the TRC program, the choice of when to start your free TPU access is up to you, subject to availability. Whenever you are ready, simply follow the instructions in the introductory email.
I'm ready - where do I start?
You can access Cloud TPUs for free in your browser using Google Colab, a Jupyter notebook environment that requires no setup to use.
To get started right away, try one of these TPU-compatible notebook examples:
- Hello, TPU in Colab
- Fashion MNIST with Keras and TPUs
- Predict Shakespeare with Cloud TPUs and Keras
- BERT on Cloud TPUs (state-of-the-art NLP: paper, code)
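Connecting a notebook to a TPU follows the same pattern in each of these examples. The sketch below shows the standard TensorFlow 2.x setup; the fallback branch is an assumption added here so the snippet also runs on a machine without a TPU attached:

```python
import tensorflow as tf

try:
    # In a Colab TPU runtime, tpu='' resolves the attached TPU automatically.
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu='')
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)
    strategy = tf.distribute.TPUStrategy(resolver)
except (ValueError, tf.errors.OpError):
    # No TPU available: fall back to the default CPU/GPU strategy.
    strategy = tf.distribute.get_strategy()

# Models built inside strategy.scope() are replicated across TPU cores.
print("Number of replicas:", strategy.num_replicas_in_sync)
```

With a v3-8 TPU attached, `num_replicas_in_sync` reports 8; any model you build inside `strategy.scope()` is then trained across all cores.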
Where can I find sample TPU code?
Can TRC quota be used with CMLE / CAIP / Vertex?
Cloud TPU quota granted under the TRC program is not compatible with Cloud ML Engine (CMLE), AI Platform (CAIP), or Vertex AI-based workflows; we recommend starting from the Google Compute Engine (GCE) tutorials instead.
Where can I get technical help?
I'm ready to publish - how can I acknowledge the program?
Thanks for asking - we'd recommend "Research supported with Cloud TPUs from Google's TPU Research Cloud (TRC)" or similar.
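In a LaTeX paper, that acknowledgement might look like the following sketch (section name and placement depend on your venue's style):

```latex
% Unnumbered acknowledgements section with the suggested wording:
\section*{Acknowledgements}
Research supported with Cloud TPUs from Google's TPU Research Cloud (TRC).
```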
Can I have more?
Our goal is to accelerate open machine learning research. If you have a proposal for using large numbers of Cloud TPU devices and/or specific TPU generations, please contact us with additional information about your project's goals, needs, and timeline. For example, the open source MiniGo project successfully used 640 Cloud TPUs simultaneously via GKE.
Have a question that isn't answered here?
Email the team at email@example.com