New Delhi: In early January, Advanced Micro Devices Inc. (AMD) unveiled next-generation chips that are more efficient and boost performance at lower power. These 7 nanometre (nm) chips are aimed at competing with bigger rivals Intel Corp. and Nvidia Corp.

In a phone interview from AMD’s US headquarters, Mark Papermaster, the firm’s chief technology officer, talked about the significance of the chips and shared emerging technology trends. Edited excerpts:

What do these 7nm chips mean for the industry?

There’s an insatiable demand for more computational power, whether it’s running higher-resolution video displays or IoT (Internet of Things) devices where there’s so much more information to process. And, of course, in data centres. Hence, there’s a need to keep technology on the traditional Moore’s Law pace (doubling performance every 18-24 months). We still get excellent benefits from each new node, but we don’t get the speed boost we used to.

Long story short, 7nm is a very important step: it combines the benefits of the semiconductor node with the design elements, resulting in significant performance gains.

Further, because this new technology node allows significantly denser packing of transistors, there will be a very noticeable improvement. So, if you’re running a mobile device (whether a smartphone or a PC), you will get more performance with longer battery life.

How will these chips benefit data centres?

Data centres today are limited by power. In the data centre and the cloud, you will see a doubling of performance (of server racks) with the same amount of power that you were spending before.

Will these chips allow new form factors?

PCs are already becoming thinner and lighter with every new generation, and 7nm will accelerate that trend.

As CPUs (central processing units) become more efficient, do you think they could replace GPUs (graphics processing units) in data centres?

If you look at machine learning algorithms, GPUs are a great way to accelerate this workload. We started shipping our 7nm GPUs recently. What we have found is that traditional algorithms still want to be on the CPU, while emerging algorithms can take advantage of the GPU. In fact, you will see combinations of the two; it’s the combination that can provide even more processing power if you make it easy to move work back and forth.

What are the biggest trends in technology today?

There’s a confluence of disruptions going on now. There is the IoT revolution, and there are better displays: not only two-dimensional displays but also virtual and augmented reality.

How we even interface with computing will change quite disruptively. Machine learning is the biggest disruptive technology. I believe that the new algorithms that allow us to leverage data and predict results based on that data will be the single biggest disruption.

Devices are getting smarter. You see it in voice-controlled devices, smart appliances, etc. And it goes beyond your home. Think about factories, where all of the manufacturing equipment is becoming smart, embedded with many sensors. It’s all about how you can leverage all of this data.

What are your thoughts on AI and robotics taking away people’s jobs?

I think people jump to the conclusion that sentient computers are just around the corner. There’s much more research and technology development required to even come close to the capabilities the human brain has today. As excited as I am about the advancements in machine learning, I think people are being overoptimistic about how quickly the gains could be made to replace true human sentient thought.

We’ve seen a lot of GPU usage in the crypto space over the past year or so. Do you see this as a whole new market to target?

Cryptocurrency is one application of blockchain. We do believe blockchain is a long-term market; cryptocurrency is a piece of that, and we’re happy to service it. But we’re focused on our broader GPU markets of gaming, servers and blockchain as a whole.