You may wonder why a company would want to use supercomputers, or high-performance computing (HPC), to make shampoos. But that’s exactly what a European cosmetics company did to get the right mix of materials that would make the shampoo smooth.
“Nobody wants to buy shampoo that is not stable—one in which ingredients get segregated like in a salad dressing so you have to shake it before you can use it,” said Dave Turek, who leads HPC strategy at International Business Machines (IBM) Corp., which worked with the shampoo maker on devising a cost-efficient HPC solution. According to Turek, one would ordinarily start the shampoo-making process by getting a laboratory to mix the ingredients (water, detergent, thickeners, etc.), testing them and checking the mixture’s properties, such as how it “behaves, sitting on the shelf” of a bathroom. “This will teach you something about the ratio of materials, and then you do another set of experiments with another ratio and so on.”
But doing these iterative experiments in a physical lab was proving time-consuming and expensive for the cosmetics company, which selected IBM to simulate them on a computer. Further, said Turek, “We decided to use cognitive methods to orchestrate the way simulations are run”, which involved using machine learning algorithms to train the HPC system as more data was fed into it and to interpret the output in a constantly improving way. “The results were quite astounding: we solved the problem with one-third the compute power estimated to be required and ten thousand times the fidelity or precision of what the result was—in this case, the texture of the shampoo,” he said.
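Turek did not describe IBM’s workflow in detail, but the general pattern he alludes to, a model that learns from completed simulation runs and decides which run to launch next, can be sketched in a few lines. In the hypothetical Python sketch below, `run_simulation` is a stand-in for an expensive HPC simulation of mixture stability, and a simple quadratic surrogate model picks the next ingredient ratio to try; the function, the ratios and the numbers are illustrative assumptions, not details from the article.

```python
import numpy as np

# Hypothetical sketch of ML-guided orchestration of simulations: a cheap
# surrogate model learns from completed "simulations" and proposes which
# ingredient ratio to simulate next. Not IBM's actual workflow.
rng = np.random.default_rng(1)

def run_simulation(ratio):
    """Stand-in for an expensive HPC simulation of mixture stability.
    Returns a stability score; the peak near 0.35 is made up."""
    return -(ratio - 0.35) ** 2 + 0.01 * rng.normal()

# Start with a few exploratory simulations across the ratio range.
ratios = list(np.linspace(0.05, 0.95, 4))
scores = [run_simulation(r) for r in ratios]

for _ in range(10):
    # Fit a cheap surrogate (a quadratic) to the results seen so far.
    coeffs = np.polyfit(ratios, scores, deg=2)
    candidates = np.linspace(0.0, 1.0, 201)
    predicted = np.polyval(coeffs, candidates)

    # Run the real (expensive) simulation only at the most promising point.
    next_ratio = float(candidates[np.argmax(predicted)])
    ratios.append(next_ratio)
    scores.append(run_simulation(next_ratio))

best = ratios[int(np.argmax(scores))]
print(f"Most stable mixture found at ingredient ratio {best:.2f}")
```

Each round replaces a batch of blind, lab-style trials with one targeted simulation, which is broadly the kind of saving in compute and experiments that Turek describes.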
HPC, more popularly known as supercomputing, has come a long way since the late American engineer Seymour Roger Cray pioneered the field in the 1960s and went on to build the eponymous Cray supercomputers in the 1970s. Initially built as monolithic machines for highly niche applications such as weather forecasting, drug discovery and academic research, HPC systems are now routinely used in industries ranging from manufacturing and retailing to financial services and e-commerce. HPC systems are also no longer single, bulky machines. They now comprise multiple computers working together as a single unit in what is called a “computer cluster” (bit.ly/2oTRLtE).
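The “computer cluster” idea, many independent machines cooperating on one job, can be illustrated in miniature. The sketch below (an illustration by assumption, not something described in the article) splits a single computation into chunks handled by parallel worker processes on one machine; a real HPC cluster applies the same divide-and-combine pattern across many networked nodes, typically coordinated by MPI libraries or a job scheduler.

```python
from concurrent.futures import ProcessPoolExecutor
import math

# Miniature illustration of the cluster idea: one job is split into chunks
# that independent workers compute in parallel, and the partial results are
# then combined. A real HPC cluster spreads this pattern over many machines.
def partial_sum(bounds):
    lo, hi = bounds
    return sum(math.sqrt(i) for i in range(lo, hi))

if __name__ == "__main__":
    n, workers = 10_000_000, 4
    step = n // workers
    chunks = [(i * step, (i + 1) * step) for i in range(workers)]

    with ProcessPoolExecutor(max_workers=workers) as pool:
        total = sum(pool.map(partial_sum, chunks))

    print(f"Sum of square roots of integers below {n}: {total:.1f}")
```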
“A couple of changes have occurred in the past several years that have changed how HPC is used,” said S. Sadagopan, director of the International Institute of Information Technology, Bengaluru. One, he said, is that data is now available everywhere instantly and in “machine-readable form”. For instance, you can get all the purchasing history and browsing data from Flipkart or Amazon, and the traffic patterns at different times of the day for cell towers in specific areas. “So, data has become more ‘interesting’ than it ever was. Not only that, all this data is now available on all kinds of mobile devices, including cellphones and iPads,” he said. This availability of data is giving solution providers “a wider swathe of opportunities” in HPC usage.
Experts say that big data (which includes structured data from company databases as well as unstructured text, voice, video and social media conversations) and emerging technologies such as artificial intelligence (AI) are shaping HPC and extending its use to everyday applications, a phenomenon termed the ‘democratization of HPC’.
“Earlier, the emphasis was on peak power or sheer computer power and the applications were mainly restricted to areas such as forecasting for earthquakes and high-resolution computation for drug discovery,” said Sadagopan.
Now, the expanded scope of applications is fuelling growth of the HPC market. Market intelligence firm Intersect360 Research projects that total HPC revenue worldwide, including sales of servers, storage systems and software, will grow at a compound annual growth rate (CAGR) of 5.2% over 2015-2020, reaching $36.9 billion in 2020 from $28.6 billion in 2015. The growth rate may appear modest at first glance, but it is significant considering that overall server revenue has been declining globally: it fell 1.9% in the fourth quarter of 2016 and 5.8% in the third quarter of the same year, according to research firm Gartner Inc.
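As a quick sanity check of those figures (assuming the $28.6 billion base refers to 2015), compounding 5.2% annual growth over five years does land close to the projected 2020 number:

```python
# Rough sanity check of the projected HPC revenue figure, assuming a
# 2015 base of $28.6 billion and 5.2% growth compounded over five years.
base_2015 = 28.6   # revenue in $ billion
cagr = 0.052       # 5.2% compound annual growth rate
years = 5          # 2015 -> 2020

projected_2020 = base_2015 * (1 + cagr) ** years
print(f"Projected 2020 revenue: ${projected_2020:.1f} billion")
# ~ $36.8 billion, consistent with the ~$36.9 billion projection
```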
Prakash Mallya, managing director, South Asia, Intel Corp., said that Intel has “a strong play in democratizing HPC”. He said that HPC, big data and AI are “all interlinked”, and that the advances made in HPC and big data will “help propel the AI space in India and the world.” According to information available on its website, Intel collaborates with the Centre for Development of Advanced Computing (C-DAC), a research and development organization under the ministry of electronics and information technology (MeitY), as well as with the Indian Institute of Science, Bangalore, on several HPC-related initiatives.
Graphics processing unit (GPU) maker Nvidia Corp., which competes with Intel in supplying processors used to build HPC systems, is of the view that the Indian government’s National Supercomputing Mission (NSM) will have “a transformative impact” on research quality and quantity. It will facilitate the training of Indian scientists and the development of “home-grown applications” in medicine, agriculture and technology, according to Vishal Dhupar, managing director, South Asia, Nvidia Corp. Under the NSM, the government announced a fund of Rs4,500 crore in March 2015 to install 73 supercomputers in different parts of the country over a seven-year period (bit.ly/2pF80bI).
“For 50 years HPC infrastructure has been used to forecast climate changes, understand natural resources in the sea, build large flyovers, and for bioinformatics, etc., and today we rely on HPC systems for our daily use—to search (on the web), to find an old friend (on social media) and to create special effects in movies, among other applications,” said Dhupar. He added that the need for HPC is “directly affected” by the accessibility factor. “When we talk about democratizing HPC, we are really talking about the increasing availability of the HPC infrastructure and its usage to fuel innovations in numerous research areas,” he said.
IBM’s Turek said that “the lines between HPC, big data and AI are certainly blurring” and everything you could do in the real world can now be done “digitally through HPC simulation”.
The rapid pace at which AI is growing is expected to accelerate demand for HPC systems that can handle the machine learning and deep learning algorithms used in AI. Machine learning algorithms are software programs that allow a computer to learn from data without being explicitly programmed for each task. Deep learning is a sub-domain of machine learning inspired by the structure and interconnectivity of neurons in the brain. Research firm International Data Corporation estimates that worldwide spending on AI will grow to $47 billion in 2020, up from $8 billion in 2016.
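As a minimal, self-contained illustration of what “learning from data” means in practice (a toy example, not any system mentioned in this article), the sketch below trains a tiny two-layer neural network on the XOR problem using plain NumPy. Scaling this same pattern of layered weights and repeated gradient updates to millions of parameters and billions of examples is what drives the demand for HPC.

```python
import numpy as np

# Toy deep-learning sketch: a two-layer neural network learns the XOR
# function from four examples via repeated gradient-descent updates.
rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # targets (XOR)

W1 = rng.normal(size=(2, 8))   # input -> hidden weights
b1 = np.zeros((1, 8))
W2 = rng.normal(size=(8, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for _ in range(5000):
    # Forward pass through the two layers.
    hidden = sigmoid(X @ W1 + b1)
    out = sigmoid(hidden @ W2 + b2)

    # Backward pass: gradients of the squared error w.r.t. each weight.
    d_out = (out - y) * out * (1 - out)
    d_hidden = (d_out @ W2.T) * hidden * (1 - hidden)

    W2 -= lr * hidden.T @ d_out
    b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_hidden
    b1 -= lr * d_hidden.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # should be close to [0, 1, 1, 0] after training
```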
In a March 2016 interview with Fortune, Jen-Hsun Huang, the CEO of Nvidia, said, “Two years ago we were talking to 100 companies interested in using deep learning. This year we’re supporting 3,500. We’re talking about medical imaging, financial services, advertising, energy discovery, automotive applications. In two years’ time there has been 35x growth.” (for.tn/1VzYK3Y)
All these developments indicate the future applications of HPC. “... you will see a chair designed specifically to suit your back or a cycle that is custom-built for you (using HPC and big data analytics). So even ordinary folks will be able to enjoy the benefits of great design,” said Sadagopan.