Nvidia’s $100bn bet on OpenAI raises more questions than it answers
What if OpenAI hits a roadblock?
ONE THING is clear about the announcement on September 22nd that Nvidia may invest up to $100bn in OpenAI to help the maker of ChatGPT buy 4m-5m of Nvidia’s artificial-intelligence (AI) chips: Silicon Valley is becoming more incestuous than ever.
Just days after Nvidia announced a $5bn investment in Intel, as part of a deal to help boost the business of its beleaguered American chipmaking rival, the proposed partnership between Nvidia and OpenAI, due to start in the second half of next year, is yet more eye-popping. It makes today’s AI-driven stockmarket rally increasingly dependent on the intertwined fortunes of the world’s most valuable firm and America’s biggest private tech firm. For good measure, OpenAI is also deeply entangled with Microsoft, the world’s second-most-valuable firm.
Nvidia’s shares climbed by almost 4% after it announced the letter of intent with OpenAI, lifting the firm’s market value to nearly $4.5trn. Jensen Huang, Nvidia’s chief executive, spoke of the deal as additive to its sales of graphics processing units (GPUs), which probably buoyed the stock. He also noted that selling as many as 5m extra GPUs would roughly match Nvidia’s entire GPU shipments this year. There was another, unspoken benefit: the deal would make OpenAI more dependent on Nvidia’s chips, reducing its incentive to build its own.
It was also apparent that Nvidia would help fund the GPU sales via the $100bn it is proposing to invest in OpenAI, to be disbursed in $10bn increments for every gigawatt (GW) of Nvidia-supported data-centre capacity that OpenAI builds, up to 10GW. Some Nvidia bulls celebrated the proposed investment as an expedient way for the chipmaker to fund sales. In effect, said Pierre Ferragu of New Street Research, a firm of IT analysts, Nvidia would invest $10bn for every $35bn of GPUs OpenAI buys from it, meaning OpenAI would pay roughly 71% in cash and 29% in shares.
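For readers who want to check the split, a minimal back-of-envelope sketch follows, using only the figures quoted above; the $35bn of GPU purchases matched to each $10bn tranche is New Street Research’s estimate, not a disclosed term of the deal.

```python
# Back-of-envelope check of the cash/shares split cited above, using only
# the figures quoted in the article. The $35bn of GPU purchases per $10bn
# tranche is an analyst's estimate, not a disclosed term of the deal.
investment_per_tranche_bn = 10    # Nvidia's equity investment per gigawatt tranche
gpu_spend_per_tranche_bn = 35     # estimated GPU purchases tied to that tranche

share_funded = investment_per_tranche_bn / gpu_spend_per_tranche_bn
cash_funded = 1 - share_funded

print(f"paid, in effect, in shares: {share_funded:.0%}")   # ~29%
print(f"paid in cash:               {cash_funded:.0%}")    # ~71%

# At the full 10GW, the $100bn investment would sit against roughly
# 10 x $35bn = $350bn of GPU purchases, on the same assumption.
print(f"implied GPU spend at 10GW: ${10 * gpu_spend_per_tranche_bn}bn")
```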
But some also expressed concerns about the transaction. In an interview with CNBC, Stacy Rasgon of Bernstein, an investment firm, acknowledged that it would exacerbate worries about the “circular dynamics” of Nvidia investing in companies that it supplies with GPUs. The size of the deal will “clearly start to raise some questions”, he said.
Moreover, OpenAI’s use of its privately held shares as currency may also deepen concerns about its cash constraints as it makes ever-bigger spending pledges. It has reportedly struck a $300bn deal with Oracle, a cloud-computing firm, to build 4.5GW of data-centre capacity over five years starting in 2027, which was the main contributor to Oracle’s blowout earnings projections earlier this month. That deal is connected to the “Stargate” project President Donald Trump announced at the White House in January.
How OpenAI finances such expenditures remains an open question, however. Though ChatGPT has more than 700m weekly active users, making it by far the most popular AI application, the response to GPT-5, the research lab’s latest family of models, has been underwhelming. For now, the sums OpenAI is promising to spend dwarf its revenues, which run at close to $13bn a year.
Cash is not its only constraint. The extra 10GW of power capacity is almost half of the 22GW of utility-scale generating capacity added in America in the first half of this year, or the equivalent of ten nuclear power plants. Even with a laxer infrastructure-permitting regime, that could take years to bring online.
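A rough scale check of those power figures is sketched below; the assumption that a typical large nuclear reactor supplies about 1GW is ours, not a figure from the announcement.

```python
# Rough scale check of the power figures mentioned above. The ~1GW output
# of a typical large nuclear reactor is an assumption, not a figure from
# the announcement.
extra_capacity_gw = 10      # capacity OpenAI would need under the Nvidia deal
us_h1_additions_gw = 22     # utility-scale capacity added in America in H1

print(f"share of H1 additions: {extra_capacity_gw / us_h1_additions_gw:.0%}")  # ~45%, i.e. almost half
print(f"equivalent reactors:   {round(extra_capacity_gw / 1)}")                # ~10
```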
Sam Altman, OpenAI’s boss, acknowledged three difficulties in particular that must be overcome as he announced the Nvidia partnership. One was pushing the frontiers of AI research. The second was building products that entice users. The third was the “unprecedented infrastructure challenge” of obtaining chips and power. A lot of interconnected wealth is riding on the hope that he can solve all three simultaneously. None is proving as easy as getting his well-heeled friends in Silicon Valley to believe his promises.
