AMD bets big on 'pervasive AI' to keep pace with the chip race

Last May, AMD's CEO entrusted president Victor Peng (above) with the task of executing the company's broad AI strategy and accelerating this key part of its business.

Summary

  • AMD’s aim is to capture a bigger share of the AI chip market from Nvidia
  • 'Pervasive AI' encompasses not just data centre graphics processing units but also applications including servers, networking and edge computing

Advanced Micro Devices, Inc. (AMD), the US fabless chipmaker, is betting big on artificial intelligence (AI) to not only power the chips used in data centres but also those used in personal computers or PCs, mobile devices, cars, factories and smart cities. The aim is to capture a bigger share of the AI chip market from Nvidia, which controls about 80% of the market, on the back of escalating demand to implement and scale up AI and generative AI projects.

"AI represents an unprecedented opportunity for AMD," AMD CEO Lisa Su told investors during the first-quarter earnings call this April.

Last May, she entrusted president Victor Peng with the task of executing the company's "broad AI strategy" and "significantly accelerating this key part of our business," for which she merged AMD's multiple AI teams into a single entity under his leadership. Peng asserted that "the concept of 'pervasive AI' is central to AMD's vision."

"It encompasses not just data centre graphic processing units (GPUs) but a broad spectrum of applications including servers, networking, edge computing, and various AI-enabled devices in smart cities, smart factories, and automotive industries," he said in a video interview from his US office last week.

Peng, who rejoined AMD in February 2022 after the company completed its $35 billion acquisition of Xilinx, where he was CEO, is responsible for shaping AMD's AI strategy across its product range. That range includes Ryzen processors for AI PCs, the Radeon RX 7000 Series graphics cards for gaming, customisable chips, integrated system chips, AI accelerators like the MI325X and the MI300 and MI350 GPUs, EPYC (pronounced 'epic') server central processing units (CPUs), neural processing units (NPUs) and data processing units (DPUs).

Dedicated accelerators

According to Peng, the April 2022 acquisition of data centre optimisation startup Pensando for $1.9 billion is also "enhancing its (AMD's) capabilities in networking and integrating AI technologies across AMD's product range". This integration is evident in the development of the first x86 AI PC with a "fully integrated monolithic NPU, termed XDNA", he said.

AI-capable PCs use dedicated AI accelerators such as NPUs to handle AI tasks more efficiently and consume less power. An estimated 48 million AI-capable PCs will ship worldwide in 2024, predicts Canalys, representing 18% of total PC shipments. The market research firm expects vendors to ship 205 million AI-capable PCs in 2028. Other than AMD, Intel, Nvidia and Qualcomm too provide chips for AI PCs.


Peng pointed out that "pervasive AI" technology is not confined to PCs but extends to various edge devices, including cars and smart infrastructure. AMD's technology, for instance, powers not only the computational needs for in-car experiences, as it does for Tesla, but also the communication infrastructure that supports autonomous driving and other AI-driven functionalities.

A distinctive feature of AMD's AI architecture, according to Peng, "is its focus on low latency and power efficiency, crucial for applications like autonomous driving, where real-time processing is critical."

AMD has 11,492 patents globally, of which 7,923 have been granted, most of them in the US, according to a 25 April report by 'Insights' from innovation consulting firm GreyB. Of these patents, 572 have been filed from India.

Peng underscored AMD's significant presence and investment in India, highlighting the country's young demographic and vast pool of talented engineers. In July 2023, AMD said it plans to invest $400 million over the next five years, a plan that includes setting up an AMD campus in Bengaluru to serve as the company's largest design centre and adding about 3,000 engineering roles by the end of 2028.

Having set up operations in India in 2001, AMD has grown to more than 7,000 employees in the country after acquiring Xilinx and Pensando, both of which had local operations. India now accounts for a quarter of AMD's global workforce, and the local unit is "heavily involved in mission-critical R&D (research and development)" and in partnerships with the Tata Group, Reliance Jio and institutions like the Indian Space Research Organisation (Isro) to drive technological advancements, according to Peng.

Intense rivalry

But AMD faces stiff competition from Intel and Nvidia. The market for data centre AI chips alone was worth $17.7 billion in 2023, with Nvidia accounting for a 65% share, according to market research firm TechInsights. Intel was a distant second with a 22% share, while AMD held 11%.

For 2023, AMD reported revenue of $22.7 billion, a 3.9% decline from 2022. Intel's revenue fell 14% from 2022 to $54.2 billion over the same period. Nvidia's revenue, however, rose 126% to $60.9 billion, overtaking Intel's, on the back of growth in "accelerated computing and generative AI" projects.


In terms of market capitalisation, though, while Intel is valued at a little over $130 billion, AMD stands at $253.97 billion. However, Nvidia is leagues ahead at $3.25 trillion, just trailing Microsoft ($3.33 trillion) and Apple ($3.35 trillion).

That said, AMD, based in Santa Clara, California, offers business leaders and CXOs several advantages, according to Peng. First, its hardware solutions are not only high-performance but also power-efficient, addressing the significant challenge of power scarcity in data centres.

He explained that this efficiency is demonstrated by AMD's technology powering the world's first exascale supercomputer, Frontier, "which not only leads in performance but also in energy efficiency". Exascale systems (supercomputers and high-performance computing or HPC systems), which can execute a quintillion operations per second (in the US, a quintillion is equal to a million trillion), are typically used to simulate the complex interactions behind weather, genomics, physics, and beyond.

Peng said business leaders implementing AI projects should focus on the "total cost of ownership (TCO) and performance per watt, areas where AMD excels." He also advocated the adoption of an open-ecosystem approach to leverage rapid innovations and avoid getting locked into a single solution.


Given that AI development is still at an early stage, with most enterprises in the proof-of-concept phase, staying adaptable and embracing best-in-class technologies is crucial, according to Peng. In this context, he said, AMD's commitment to open-source software and collaborative development sets it apart.

He cited the example of AMD's ROCm platform, which, in partnership with Hugging Face and other open-source initiatives, provides developers with a flexible and powerful environment to build AI solutions. "This avoids vendor lock-in and enables innovation," Peng insisted.

Productivity enhancer

Addressing concerns about AI's impact on jobs, Peng said he believes in a "copilot" model where AI enhances human productivity rather than replaces jobs outright. While some jobs might be automated, the interplay between humans and AI can create more value.

Peng cited the examples of LinkedIn and Expedia to illustrate how new technologies often enhance rather than eliminate jobs, turning skilled users into power users. He emphasised the need for upskilling and reskilling workers to integrate them into the evolving technological landscape, ensuring they can contribute meaningfully to the new economy.

Regarding the future of AI, Peng dismissed the notion of an impending "AI winter" suggested by a section of industry experts, asserting that the technology is "real and transformative". He acknowledged that while there might be cycles of overhype and correction, AI's potential is comparable to electricity in its wide-ranging impact.

Peng, though, advocated thoughtful, focused regulations that consider marginal risks without stifling innovation. He cautioned against creating a "spaghetti code" of regulations that could slow progress and increase societal costs.
