Meta is in talks to use Google’s chips in challenge to Nvidia

Meghan Bobrowsky, Katherine Blunt and Robbie Whelan, The Wall Street Journal
3 min read · 26 Nov 2025, 06:33 AM IST
Summary
A deal to use Google’s TPUs for Meta’s AI models could be worth billions and eat into Nvidia’s dominant market share.

Meta Platforms is in talks to use chips made by Google in its artificial-intelligence efforts, a step toward diversifying away from its reliance on Nvidia, according to people familiar with the matter.

A deal could be worth billions of dollars, but the talks are continuing and may not result in an agreement. It is still up in the air whether Meta would use the chips, known as tensor processing units or TPUs, to train its AI models or to do inference, one of the people said. Inference, the process a trained model uses to generate a response to a query, requires less computational power than training.

Google has been working for years to refine its chips and scale up that part of its business. A significant deal with Meta would represent a potential crack in Nvidia’s market dominance for Google and other chip makers to exploit. Nvidia’s shares sank 7% Tuesday morning following a report on the talks.

Google said its Google Cloud is experiencing “accelerating demand” for both its custom TPUs and Nvidia GPUs and that the company is “committed to supporting both, as we have for years.” Nvidia declined to comment. Meta didn’t respond to requests for comment.

“The biggest story in AI right now is that Google and Nvidia are being extraordinarily competitive,” said Adam Sullivan, chief executive of data-center operator Core Scientific. “They’re in a race to secure as much data-center capacity as they can.”

Both Nvidia and Google are courting potential customers and offering them financing arrangements to help ease the purchase of their chips.

“They don’t care about how much revenue they generate,” Sullivan said. “This is about who gets to [artificial general intelligence] first.”

The tech news site The Information first reported on the talks between Google and Meta on Monday night.

On Tuesday morning, after its shares fell in early trading, Nvidia posted a statement on X: “We’re delighted by Google’s success—they’ve made great advances in AI and we continue to supply to Google. NVIDIA is a generation ahead of the industry—it’s the only platform that runs every AI model and does it everywhere computing is done.”

Google first began using its TPU chips about a decade ago, initially for internal purposes, such as making its search engine more efficient. In 2018, it began offering its cloud customers the opportunity to use TPUs for their training and inference needs.

More recently, Google has used the chips to train and operate its own Gemini large language models and sold them to customers including Anthropic, developer of the Claude AI model. Last month, Anthropic announced that starting next year it would spend tens of billions of dollars to buy up to one million Google TPUs—enough to supply roughly 1 gigawatt of computing capacity—to power more AI research and help Anthropic serve rising customer demand for its enterprise AI tools.

In recent years, most large language models have been trained using Nvidia’s GPUs, or graphics processing units.

Originally used to improve graphics for videogames, GPUs pack billions of transistors onto tiny slices of silicon and can run billions of computations simultaneously, an ability that is crucial to training and running AI models. Nvidia built itself into the world’s most valuable company after realizing the chips were useful for AI computing.

Nvidia’s chips are used by thousands of different app developers, who access them either directly or via cloud service providers that install them in servers inside massive data centers. Google’s TPUs, by contrast, are what is known as application-specific integrated circuits, or ASICs, meaning they were designed for a particular computing task, allowing them to be more energy-efficient.

Investors, analysts and data-center operators say Google’s TPUs represent one of the biggest threats to Nvidia’s dominance in the AI computing market, but to challenge Nvidia, Google must start selling the chips more widely to external customers.

Write to Meghan Bobrowsky at meghan.bobrowsky@wsj.com, Katherine Blunt at katherine.blunt@wsj.com and Robbie Whelan at robbie.whelan@wsj.com
