Former OpenAI executive Mira Murati’s startup, Thinking Machines Lab, has signed a new multi-billion-dollar agreement to expand its use of Google Cloud’s AI infrastructure, including systems powered by Nvidia’s latest GPUs.
The deal is valued in the single-digit billions, according to a source familiar with the matter, and includes access to Google’s latest AI systems built atop Nvidia’s GB300 chips, alongside infrastructure services to support model training and deployment.
Google has been actively pursuing cloud deals with AI developers as it aims to bundle its cloud offerings with other services like storage, a Kubernetes engine, and Spanner, its database product. Earlier this month, Anthropic signed an agreement with Google and Broadcom for more than one gigawatt of tensor processing unit (TPU) capacity; TPUs are Google’s custom-designed AI chips for machine learning workloads.
But the competition is fierce. Just this week, Anthropic also signed a new agreement with Amazon to secure up to 5 gigawatts of capacity for training and deploying Claude.
Earlier this year, Thinking Machines partnered with Nvidia in a deal that included an investment from the chipmaker. But this is the first time the lab has struck a deal with a cloud services provider. The deal is not exclusive, so Thinking Machines may use multiple cloud providers over time, but it’s nonetheless a sign that Google is looking to lock in fast-growing frontier labs early.
Murati left her job as OpenAI’s chief technologist and founded Thinking Machines in February 2025. The company, which soon afterwards raised a $2 billion seed round at a $12 billion valuation, has stayed notably secretive, but released its first product in October. Dubbed Tinker, it’s a tool that automates the creation of custom frontier AI models.
Wednesday’s deal provided some insight into what Thinking Machines is developing. In a press release, Google said its infrastructure can support the startup’s reinforcement learning workloads, which Tinker’s architecture is built on. Reinforcement learning is a training technique that has underpinned recent breakthroughs at labs including DeepMind and OpenAI, and the scale of the Google Cloud deal reflects how computationally expensive that work can get.
Thinking Machines is among the first Google Cloud customers to access its GB300-powered systems, which offer a 2x improvement in training and serving speed compared to prior-generation GPUs, per Google.
“Google Cloud got us running at record speed with the reliability we need,” Myle Ott, a founding researcher at Thinking Machines, said in a statement.