Former OpenAI executive Mira Murati’s startup Thinking Machines Lab has signed a new multibillion-dollar deal to expand its use of Google Cloud’s AI infrastructure, including systems powered by Nvidia’s latest GPUs, TechCrunch can exclusively reveal.
The deal is worth billions of dollars and includes infrastructure services to support model training and deployment, as well as access to Google’s latest AI systems built on Nvidia’s new GB300 chips, according to people familiar with the matter.
Google has been actively signing cloud deals with AI developers, bundling compute with other services such as storage, Google Kubernetes Engine, and its database product Spanner. Earlier this month, Anthropic signed deals with Google and Broadcom for gigawatts of tensor processing unit (TPU) capacity; TPUs are custom AI chips Google designed for machine learning workloads.
However, competition is fierce. Just this week, Anthropic signed a new deal with Amazon that secures up to 5 gigawatts of capacity for Claude training and deployment.
Earlier this year, Thinking Machines partnered with Nvidia in a deal that included an investment from the chipmaker, but this is the first time the startup has signed a contract with a cloud service provider. The deal isn’t exclusive, so Thinking Machines may use multiple cloud providers in the future, but it’s still a sign that Google wants to lock in fast-growing frontier labs early.
Murati left her position as OpenAI’s chief technology officer and founded Thinking Machines in February 2025. The company, which raised a $2 billion seed round at a $12 billion valuation soon after, has kept its work closely guarded, launching its first product only in October. Called Tinker, it’s a tool that automates the creation of custom frontier AI models.
Wednesday’s deal offers some insight into what Thinking Machines is developing. In a press release, Google said its infrastructure supports the reinforcement learning workloads that Tinker’s architecture relies on. Reinforcement learning is the training approach behind recent advances at labs like DeepMind and OpenAI, and the size of the Google Cloud deal reflects how computationally expensive that work is.
According to Google, Thinking Machines was one of the first Google Cloud customers to gain access to GB300-powered systems, which deliver training and processing speeds up to 2x faster than previous-generation GPUs.
“Google Cloud allows us to operate at record speeds with the reliability we demand,” Miles Ott, founding researcher at Thinking Machines, said in a statement.
