Nvidia nears $2 trillion valuation on insatiable AI chip demand

Summary
The chips that perform artificial-intelligence calculations are so valuable that they are delivered in armored cars. It took Nvidia 24 years as a public company for its valuation to reach the rarefied air of $1 trillion. Thanks to the chip maker’s role in powering the AI revolution, the company is closing in on adding a second trillion in just eight months.
The journey to become one of the three most-valuable U.S. companies might have started at a Denny’s in 1993, but it has been fast-tracked by Nvidia’s dominance of GPUs, or graphics processing units. These chips, worth tens of thousands of dollars each, have become a scarce, treasured commodity the likes of which Silicon Valley has seldom seen, and Nvidia is estimated to have more than 80% of the market.
Voracious demand has outpaced production and spurred competitors to develop rival chips. The ability to secure GPUs governs how quickly companies can develop new artificial-intelligence systems. Companies tout their access to GPUs to recruit AI workers, and the chips have been used as collateral to back billions of dollars in borrowing.
The chips are so valuable that they are delivered to the networking company Cisco Systems by armored car, said Fletcher Previn, Cisco’s chief information officer, at The Wall Street Journal’s CIO Network Summit this month.
On Wednesday, after Nvidia turned in a third straight quarter of forecast-beating results, company executives said that supplies were still tight and that a new generation of AI chips expected to be launched this year will be supply-constrained.
The design of the chips makes them critical parts for training the large language models that underpin generative AI bots such as OpenAI’s ChatGPT. Much of the AI spending by such tech giants as Microsoft, Alphabet and Amazon.com has gone to GPUs.
Jensen Huang, Nvidia’s chief executive officer and co-founder, said generative AI is kicking off a wave of investment worth trillions of dollars, which he believed would double the amount of data centers in the world in the next five years and deliver market opportunities for Nvidia.
“A whole new industry is being formed, and that’s driving our growth,” he said on the company’s earnings call. Nvidia on Wednesday reported quarterly sales of $22.1 billion and forecast another $24 billion for its current quarter, each more than triple what was posted a year earlier and ahead of Wall Street’s bullish expectations.
The results lifted Nvidia shares Thursday to their highest-ever close of $785.38, valuing the company at $1.96 trillion. The stock has jumped 59% so far this year after more than tripling in 2023.
Founded more than 30 years ago with an initial focus on computer graphics chips for PC gaming, Nvidia latched on early to AI.
Huang, one of the tech industry’s longest-tenured CEOs, owns 86.6 million Nvidia shares, according to FactSet, valued at about $68 billion.
Huang laid the groundwork for Nvidia’s AI rise in 2006 when he opened up its chips for purposes beyond computer graphics. Engineers soon started to use them for AI calculations, where they proved especially proficient. The kind of math needed to build complex AI systems suits the way graphics chips work, performing a multitude of calculations at once, better than the way traditional central processing units operate.
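The contrast can be sketched with a toy example, which assumes nothing about Nvidia’s actual software: AI training is dominated by matrix multiplications, which decompose into many independent multiply-adds that hardware can perform simultaneously, whereas a traditional CPU-style loop steps through them one at a time.

```python
import numpy as np

# Toy illustration (not Nvidia code): a matrix multiply is the core
# operation in AI training, and it is "embarrassingly parallel."

a = np.random.rand(128, 128)
b = np.random.rand(128, 128)

# GPU-style: one bulk operation computes all 128*128 output entries,
# each of which is an independent dot product that could run at once.
c_parallel = a @ b

# CPU-style sketch: the same math as an explicit sequential loop,
# one output entry at a time.
c_loop = np.zeros((128, 128))
for i in range(128):
    for j in range(128):
        c_loop[i, j] = np.dot(a[i, :], b[:, j])

# Both routes produce the same answer; only the degree of
# parallelism differs.
assert np.allclose(c_parallel, c_loop)
```

The vectorized form maps naturally onto a GPU’s thousands of simple cores, which is why graphics chips turned out to be so well suited to this workload.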
Tens of thousands of Nvidia’s most advanced GPUs, called H100s, are commonly used in the creation of the most sophisticated AI systems. And they are pricey, going for around $25,000 each, according to analyst estimates.
Analysts estimate Nvidia can make around 1.2 million of the chips a year, but meeting demand has become difficult. Nvidia designs the chips and contracts out their production to Taiwan Semiconductor Manufacturing Co., which has run into a bottleneck in later steps of the chip-making process where pieces of silicon are assembled into a final chip. TSMC is aiming to double capacity in these later steps this year.
Surging demand has led competitors to develop their own AI-focused chips. Advanced Micro Devices has started selling chips that aim to compete with Nvidia’s offerings and projects sales of those chips at more than $3.5 billion this year. The British chip designer Arm Holdings has touted the usefulness of its chips for AI, and Intel has started selling central processing units that can handle AI calculations.
There is also a raft of startups making AI chips. And big cloud-computing companies such as Google and Amazon are building up internal AI chip development efforts. Microsoft unveiled its first AI chip, called the Maia 100, in November.
Meanwhile, startups and big tech companies alike have been touting how many of Nvidia’s chips they have amassed. Last month Meta Platforms Chief Executive Mark Zuckerberg said on Instagram that his company plans to have 350,000 of Nvidia’s H100 chips by the end of this year—a setup that would cost at least several billion dollars at current prices for the chips.
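The “at least several billion dollars” figure follows directly from numbers quoted earlier in the piece; a back-of-the-envelope check, using the analyst-estimated price of around $25,000 per H100:

```python
# Back-of-the-envelope check using figures quoted in the article.
chips = 350_000          # Meta's stated H100 target for year-end
price_per_chip = 25_000  # analyst-estimated price per chip, in dollars

total = chips * price_per_chip
print(f"${total / 1e9:.2f} billion")  # prints "$8.75 billion"
```

Actual spending would differ with volume discounts and system-level costs, but the order of magnitude matches the article’s characterization.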
CoreWeave, which counts Nvidia as an investor, in August secured $2.3 billion of financing backed by its Nvidia H100s. The effective interest rate was high, reflecting its risk, according to a person familiar with the deal.
Even some universities are touting H100 inventories for recruiting and bragging rights. Princeton’s Language and Intelligence Initiative has “state-of-the-art computational infrastructure with 300 Nvidia H100 GPUs,” its director, Sanjeev Arora, said in a message last year on the website for the group, which was recruiting a software engineer and research scientist.
Google has set up an executive committee to decide how to divide computing resources between the company’s internal and external users. Microsoft has instituted a similar rationing program, called GPU councils, in which executives determine how available computing resources are divided among Microsoft’s internal projects.
Many analysts and industry executives say Nvidia’s advantages can’t easily be eroded by the competition, thanks to the depth and complexity of the software it has spent years building around its chips.
But Andrew Ng, an artificial-intelligence pioneer who runs AI Fund, said AMD and Intel have made significant headway in developing competing software systems to go with AI-powering chips.
“I think in a year or so the semiconductor shortage will feel much better,” he said at the Journal’s CIO conference.
Write to Asa Fitch at asa.fitch@wsj.com