Nvidia’s Stunning Ascent Has Also Made It a Giant Target
Summary
In the AI-driven bonanza, even the chip maker’s own customers are looking to move in on its business.

Jensen Huang likes to say he doesn’t have a long-term plan. He doesn’t wear a watch, he says, because “now is the most important time.”
For the Nvidia CEO, now is a moment of enormous triumph—and more than a little peril.
After an AI-driven business bonanza that has sent Nvidia’s valuation to the $2 trillion stratosphere, Huang is the king of the tech universe. He has managed to adapt his company’s semiconductor designs from computer graphics to training artificial-intelligence systems, making his chips essential for tech companies from Microsoft to Tesla. In the process Huang has earned a fortune estimated at more than $68 billion.
It has been a stunning ascent for the Taiwanese-born immigrant who famously mapped out plans for his company at a Denny’s restaurant more than 30 years ago. He now hobnobs with world leaders, holds court with fellow billionaires who are hooked on his chips and holds forth for hours on his AI views at Nvidia’s annual conference, wearing his signature black jacket, black jeans and black sneakers.
It has also made Nvidia a giant target for a host of companies that want to diminish its dominance—including both rivals and some of the customers that are forking over billions of dollars a year for the must-have chips.
Intel and Advanced Micro Devices have accelerated their AI chip offerings. Amazon.com, Google and Microsoft—huge buyers of Nvidia’s AI-training chips—all have or are developing their own in-house designs. On Wednesday, the same day Nvidia unveiled its latest blockbuster quarterly results, Microsoft and Intel announced a deal for the software giant to build custom chips using Intel’s manufacturing operation.
As much as Huang talks about the importance of the present, he’s also anxious about securing the company’s future. He is developing uses for Nvidia chips in other industries, investing in startups building their businesses around his technology, and pitching it to governments worldwide to build their own AI infrastructure.
A fan of “Only the Paranoid Survive,” a book by legendary Intel CEO Andy Grove about converting near-failure into lasting success, the 61-year-old Huang encourages a startup-like culture where his 30,000 employees act as though they’re a month away from going out of business. He has said he searches for hard problems, solves them, then tries to forget how difficult it all was.
“It must be the feeling that marathoners go through,” he told an audience of Indian technology graduates in Santa Clara in September. “I used to use this phrase—I use it less now—but the entertainment value of pain and suffering can’t be understated.”
That Nvidia has stayed ahead in the AI race has surprised many of Huang’s competitors and detractors. Nvidia’s revenue in the latest quarter was more than 3.5 times the level a year earlier. Profit rose nearly ninefold. Its stock is up sixfold in 16 months. Expectations are high.
A Grand Slam breakfast
Huang has been Nvidia’s chief executive since starting it more than three decades ago, a tenure almost unheard of in fast-moving Silicon Valley. He and his co-founders, fellow engineers Chris Malachowsky and Curtis Priem, formulated their plan at a Denny’s in San Jose. The booth where they talked got a plaque last September—now outdated—after Nvidia hit a $1 trillion valuation.
Nvidia’s foundational idea was to target so-called accelerated computing. The chips at the heart of every computer—central processing units, or CPUs—are digital jacks-of-all-trades, capable of doing a variety of calculations reasonably well. Accelerated computing was the notion that specialist chips could do better at some tasks. Nvidia focused on improving computer graphics.
The company made billions of dollars selling graphics processing units, or GPUs, in its first couple of decades, catering to PC gamers who prized sharper resolution and faster-refreshing screens.
In 2006, after noticing that medical researchers were starting to use Nvidia graphics cards, Huang opened up the GPUs for anyone.
Several years later, researchers outside the company discovered that Nvidia’s GPUs were excellent for AI computation. The math needed to build complex AI systems dovetails with the way graphics chips work—by doing a multitude of calculations at once.
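The overlap between graphics and AI comes down to one operation: multiplying large grids of numbers, where every cell of the result can be computed independently of the others. A toy sketch below makes that explicit — it is illustrative only, written in plain Python, and has nothing to do with Nvidia’s actual chip or software designs.

```python
def matmul(a, b):
    """Multiply two matrices one multiply-accumulate at a time, the way a
    single CPU core would. Each cell of the result depends only on one row
    of `a` and one column of `b`, so a GPU can compute thousands of cells
    simultaneously -- the trick behind both 3-D graphics and AI training."""
    rows, inner, cols = len(a), len(b), len(b[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            for k in range(inner):
                out[i][j] += a[i][k] * b[k][j]
    return out

# A tiny stand-in for one "layer" of a neural network: inputs times weights.
inputs = [[1.0, 2.0]]
weights = [[0.5, -1.0],
           [0.25, 0.75]]
print(matmul(inputs, weights))  # [[1.0, 0.5]]
```

Real AI models repeat this operation across matrices with billions of entries, which is why chips built to do many such multiplications at once became the bottleneck resource.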
The artificial-intelligence systems behind the current boom in generative AI—language models like the ones that power OpenAI’s ChatGPT—are especially dependent on Nvidia’s GPUs. Training these systems involves crunching data on a scale with little precedent. ChatGPT was built using tens of thousands of Nvidia’s GPUs.
OpenAI’s release of ChatGPT in late 2022 launched a tech-industry arms race. Production of Nvidia GPUs couldn’t keep up with demand. The company on Wednesday said its supply remained constrained, though the situation is improving.
Shaping the world
Suddenly, Nvidia became the gravitational center of the tech universe. Tech industry leaders bemoaned the difficulty finding H100s—Nvidia’s advanced GPUs.
“GPUs at this point are considerably harder to get than drugs,” Elon Musk quipped last May, as he was developing AI capabilities at Tesla and building a separate company called xAI.
Larry Ellison, founder and chairman of business-software giant Oracle, at a company conference in September recounted a dinner that he and Musk had with Huang at the luxe Japanese restaurant Nobu in Palo Alto, Calif. “Elon and I were begging, I guess is the best way to describe it,” Ellison recalled. “An hour of sushi and begging.”
Meta Platforms CEO Mark Zuckerberg last month talked up plans to spend what will amount to billions more dollars this year for his growing AI ambitions. “By the end of this year, we’re going to have 350,000 Nvidia H100s,” he said.
The shortage of Nvidia’s chips has given Huang immense power. How Nvidia allocates its limited supplies could influence who wins or loses in the AI race. Nvidia has said little about how it decides. Asked by an analyst on Wednesday about how he does that, Huang said the company tries to allocate fairly but avoids selling chips to people who won’t immediately use them.
Huang added that Nvidia often brings customers to cloud-computing companies and ties in an allocation of chips to support them. Huang cast those moves as beneficial for the cloud companies.
More than a year into the AI craze, Nvidia is estimated to account for more than 80% of the market for AI chips.
Customers’ efforts to develop their own chips for AI could endanger billions of dollars of Nvidia’s revenue if they’re successful. Google and Amazon are improving in-house AI chips that they started making years ago. Microsoft and Meta have more recently entered the fray.
Nvidia is hugely dependent on the biggest buyers of its chips. Just one of them, which it didn’t name, accounted for nearly a fifth of its sales in its last fiscal year, or more than $11 billion.
Nvidia had sales of more than $9.2 billion from cloud-computing companies like Google, Microsoft and Amazon in its latest quarter alone—more than a quarter of those three companies’ roughly $35 billion in capital spending in the same period.
Microsoft said in a statement that its custom chips were complementary to Nvidia’s, not replacements, and allow customers to choose the optimal price and performance. Google offers its own chips alongside Nvidia’s to give customers a wide menu of chips based on their budget and technical needs, according to a person familiar with the company’s chip strategy. Even as it develops homegrown options, Amazon pointed to its long-term partnership with Nvidia, expanded last year, and said it offered the widest array of the company’s chips in its cloud service.
Nvidia’s gross profit margins, meanwhile, are expected to be about 76% in its current quarter, the company said Wednesday—what amounts to a juicy target for customers and competitors. Analyst Srini Pajjuri of Raymond James estimated that it costs Nvidia just over $3,000 to make one of its advanced H100 chips, which sells for about $25,000.
Advanced Micro Devices is already selling chips that aim to compete with Nvidia’s. British chip designer Arm Holdings and Intel also are touting their chips for AI, and shifts in the market could play to their strengths. While Nvidia’s chips are unmatched at training AI models, other chips from Intel, AMD and a raft of AI chip startups are in some cases equally adept at deploying AI systems after they are trained.
Growth quest
To build future business, Huang has recently been courting potential government customers. He has urged officials around the world to keep their data and computing infrastructure local, instead of farming out AI development to outsiders—to build “sovereign AI,” as he called it at a government conference in Dubai this month.
Huang must navigate geopolitical concerns about Nvidia’s chips, though, especially related to China. U.S. authorities in the past two years have placed restrictions on shipments to China of AI chips from Nvidia and others out of concern that the Chinese military could use them to make advanced weapons and conduct cyberwarfare. Nvidia reported a sharp falloff in its revenue from China in its last quarter.
With Nvidia’s dominant market position, competition regulators in China, the U.K., France and the European Union have launched probes into Nvidia’s business practices, although none of them have resulted in sanctions against the company.
Huang has feverishly invested in startups as the company has risen to prominence, focusing on generative AI, robotics, automation and healthcare companies developing technology around Nvidia’s chips. The company grew the value of its holdings in other companies more than fivefold in its last fiscal year, to about $1.55 billion at the end of January. In all, it invested in some three dozen startups in 2023, according to Dealogic figures, more than tripling its activity from the previous year.
Umesh Padval, a managing director at venture-capital firm Thomvest Ventures who has known Huang for decades, said the Nvidia investment strategy fits with its co-founder’s tendency to make bets based on a vision 10 years into the future. Thomvest participated with Nvidia in a funding round last year for Cohere, a Canadian AI company.
“He started aggressively saying, look, what do I need to do to get the ecosystem built such that I can sell more of my chips and systems?” he said.
Another Nvidia investment was in a cloud-computing company called CoreWeave. It operates large data centers filled with Nvidia’s AI chips and rents out their computing power, putting it in competition with giants like Amazon and Microsoft.
CoreWeave, unlike its larger rivals, isn’t working to develop its own chips and has acquired a large fleet of Nvidia’s most advanced chips. Those chips are such a valuable asset that CoreWeave used them as collateral in a $2.3 billion financing deal last year.
Huang is also relying on the stickiness of Nvidia’s software, called CUDA, a digital doorway for using its GPUs that also has become deeply enmeshed in the AI boom. Millions of lines of code have been written over the years for CUDA that have made developing new AI applications easier when using Nvidia’s chips. Competitors have developed their own AI software in an attempt to dislodge Nvidia, but have yet to gain wide traction.
So valuable is access to Nvidia’s chips that some customers are fearful of being punished for dealing with the company’s competitors. Jonathan Ross, CEO of chip startup Groq, said that fear was palpable among customers who wanted alternatives but were wary of poking the beast.
“A lot of people that we meet with say that if Nvidia were to hear that we were meeting, they would disavow it,” he said. “The problem is you have to pay Nvidia a year in advance, and you may get your hardware in a year or it may take longer, and it’s, ‘Aw shucks, you’re buying from someone else and I guess it’s going to take a little longer.’”
The next frontier
Huang’s life has changed along with Nvidia’s growth. He is driven around in a black Mercedes EQS electric car—he doesn’t drive himself any more, he says, for security reasons.
One of his long-term focuses outside of AI is the notion that his chips could help in drug discovery and computational biology. He usually brings it up when asked what he’s most excited about in the future. Nvidia has invested in it, too, including a $50 million stake in drug-discovery startup Recursion last year.
To many of the chip-industry people he interacts with, it’s much like AI might have been a decade ago: a far-off possibility with a limited market for now, one of what Huang dubs his “zero-billion dollar markets.”
In January, he had a discussion at a J.P. Morgan healthcare conference with the chairman of Recursion in which he discussed cryo-electron microscopy, X-ray crystallography and structure prediction—all arcane industry terms.
“I don’t mean to show off, but what other chip CEO would talk like that?” he said. “Along the entire pipeline of the discovery of drugs and medicine, we have algorithms and we have mathematics and we have expertise that we can be a partner to you, so please do reach out to us.”
Getting into new industries like AI early on, Huang says, gives him more time to figure out complex problems.
“I’m not very fast, but I’m very persistent,” he told a Chinese semiconductor professionals group recently. “I work on something for a very long time. So it allows me, if I choose problems and our company chooses problems that are very hard to do, we have a long time to work on them.”
Write to Asa Fitch at asa.fitch@wsj.com