OpenAI, AMD announce massive computing deal, marking new phase of AI boom

OpenAI CEO Sam Altman. OpenAI committed to purchasing 6 GW worth of AMD’s chips, starting with the MI450 chip next year. (File Photo: Reuters)
Summary

Five-year agreement will challenge Nvidia’s market dominance, as OpenAI plans deployment of AMD’s new MI450 chips.

Sam Altman, CEO of OpenAI, and Lisa Su, CEO of Advanced Micro Devices, at a Senate Commerce, Science, and Transportation Committee hearing in Washington in May.

OpenAI and chip-designer Advanced Micro Devices announced a multibillion-dollar partnership to collaborate on AI data centers that will run on AMD processors, one of the most direct challenges yet to industry leader Nvidia.

Under the terms of the deal, OpenAI committed to purchasing 6 gigawatts worth of AMD’s chips, starting with the MI450 chip next year. The ChatGPT maker will buy the chips either directly or through its cloud computing partners. AMD chief Lisa Su said in an interview Sunday that the deal will result in tens of billions of dollars in new revenue for the chip company over the next half-decade.

The two companies didn’t disclose the plan’s expected overall cost, but AMD said it costs tens of billions of dollars per gigawatt of computing capacity.

OpenAI will receive warrants for up to 160 million AMD shares, roughly 10% of the chip company, exercisable at 1 cent per share and awarded in phases as OpenAI hits certain deployment milestones. AMD’s stock price also has to rise for the warrants to be exercised.


The deal is AMD’s biggest win in its quest to disrupt Nvidia’s dominance among AI semiconductor companies. AMD’s processors are widely used for gaming, in personal computers and traditional data center servers, but it hasn’t made as much of a dent in the fast-growing market for the pricier supercomputing chips needed by advanced AI systems.

OpenAI plans to use the AMD chips for inference functions, or the computations that allow AI applications such as chatbots to respond to user queries. As large language models and other tools have proliferated, demand for inference computing has skyrocketed, OpenAI Chief Executive Sam Altman said in a joint interview with Su.

“It’s hard to overstate how difficult it’s become” to get enough computing power, Altman said. “We want it super fast, but it takes some time.”

The two CEOs said the deal will tie their companies together and give them incentives to commit to the AI infrastructure boom. “It’s a win for both of our companies, and I’m glad that OpenAI’s incentives are tied to AMD’s success and vice versa,” Su said.

Nvidia remains the preferred chip supplier among AI companies, but it is also facing competition from almost every corner of the market. Cloud giants such as Google and Amazon design and sell their own AI chips, and OpenAI recently signed a $10 billion deal with Broadcom to build its own in-house chip. Nvidia is releasing its highly anticipated Vera Rubin chip next year, promising that it will be more than twice as powerful as its current generation, known as Grace Blackwell.

OpenAI will begin using 1 gigawatt worth of the MI450 chip starting in the second half of next year to run its AI models.

Altman said the fate of many companies will increasingly be linked as demand for AI services, along with the computing and infrastructure needs that accompany them, is set to far outstrip supply.

“We are in a phase of the buildout where the entire industry’s got to come together and everybody’s going to do super well,” Altman said. “You’ll see this on chips. You’ll see this on data centers. You’ll see this lower down the supply chain.”

Altman has been on a dealmaking spree over the past month, at times using creative financing structures to secure hundreds of billions of dollars worth of computing power. His aim is to lock up enough data center capacity to win the race to develop superintelligence, or AI systems that rival humans for reasoning and intuition.

In late September, Nvidia announced that it would invest $100 billion in OpenAI over the next decade. Under the terms of the circular arrangement, OpenAI plans to use the cash from Nvidia to buy Nvidia’s chips and deploy up to 10 gigawatts of computing power in AI data centers. The deal highlighted how the market’s seemingly endless enthusiasm for Nvidia’s stock is providing a financial backstop for the entire AI market.

The Nvidia deal isn’t finalized yet. The two companies have signed a letter of intent and have yet to disclose specific terms in a regulatory filing.

Nvidia GPUs on stage after Nvidia CEO Jensen Huang spoke at Computex 2025 in Taipei in May.

OpenAI and AMD described Monday’s announcement as “definitive,” and planned to immediately file details with securities regulators, according to people familiar with the matter.

Altman also recently signed a $300 billion megadeal with Oracle, the software company founded by multibillionaire Larry Ellison, to purchase another 4.5 gigawatts of cloud computing power over five years.

“The thing you have to believe if you are us, or our whole industry, is that given everything we’re seeing in our research and in our product metrics, the demand for AI at a reasonable revenue rate is going to continue to steeply increase,” Altman said.

The dealmaking frenzy, which has drawn much of the technology industry into the maelstrom, has contributed to growing fears that a bubble is building in AI infrastructure. Companies such as OpenAI, Meta, Alphabet and Microsoft are spending money on chips, data centers and electrical power at levels that dwarf the largest build-outs in history, including the 19th century railroad boom and the construction of the modern electrical and fiber-optic grids.

“I’m far more worried about us failing because of too little compute than too much,” said Greg Brockman, OpenAI’s president and co-founder.

In late September, OpenAI and Oracle executives gathered in Abilene, Texas, to lay out their vision to spend trillions of dollars on AI data centers that they said would help satisfy the explosive demand for ChatGPT, which has 700 million weekly users.

It is unclear how OpenAI will pay for the hundreds of billions of dollars worth of infrastructure investments to which it has committed. The startup recently told investors that it was likely to spend around $16 billion to rent computing servers alone this year, and that the number could rise to $400 billion in 2029, The Wall Street Journal reported. OpenAI is on pace to generate $13 billion in revenue this year, and Altman said the company is focusing more on profitable tasks that can be accomplished using its tools.

Mizuho Securities estimates that Nvidia controls more than 70% of the market for AI chips, though AMD and other rivals have sought in recent years to offer more affordable alternatives. Nvidia’s most powerful combination chips for use in AI data centers can cost $60,000 each.

Write to Robbie Whelan at robbie.whelan@wsj.com and Berber Jin at berber.jin@wsj.com
