Photo: iStock

Supercomputers lend speed to researchers fighting covid-19

High-performance machines come in handy as scientists race against time to find solutions

With the world desperately looking for a vaccine or drug that can put a halt to the covid-19 pandemic, the need to crunch more numbers and run tougher simulations has increased. While studies are using conventional and time-consuming methods such as X-ray crystallography and nuclear magnetic resonance (NMR) to study the structure of virus proteins, there is a growing shift to using computer simulations.

However, carrying out these simulations requires a lot of compute power, far beyond the realm of an ordinary computer. That is where supercomputers such as IBM’s Summit can accelerate research by providing a million times more computing power than a regular computer. Summit, considered the fastest supercomputer in the world, is already proving useful in the fight against covid-19. It has enabled researchers from the Oak Ridge National Laboratory and The University of Tennessee to screen 8,000 compounds in a matter of days for candidates that could restrict the growth of the virus. They have identified 77 small-molecule compounds that could potentially disrupt covid-19’s ability to infect host cells.

Similarly, a team of researchers from the National Institute of Technology (NIT), Warangal, is using IBM Research’s covid-19 supercomputing resources to study how the structure and dynamics of the virus depend on atmospheric temperature and humidity. The team is led by Soumya Lipsa Rath, assistant professor in the department of biotechnology at NIT Warangal. The study could help characterize future classes of viruses in the coronavirus family, and aid in categorizing them and designing drugs against them.

IBM isn’t the only big tech company in this space either. Researchers at the University of Arkansas leveraged supercomputing resources at the Texas Advanced Computing Center (TACC) to study the differences between SARS-CoV and the much deadlier SARS-CoV-2. The simulations were run on two supercomputers, Frontera and Longhorn.

The former offers a compute speed of 23.5 Linpack petaflops and the latter can muster 2.3 Linpack petaflops, where one petaflops (pflops) is equivalent to one quadrillion (10 to the power 15) floating point operations per second. One pflops is equivalent to 1,000 teraflops (tflops), and “flops" stands for floating point operations per second.
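These unit conversions can be sketched in a few lines of Python. The machine names and petaflops figures are the ones quoted above; the script itself is only illustrative arithmetic, not anything from the benchmark suites:

```python
# Peak Linpack throughput of the two TACC machines, in petaflops (from the article).
FRONTERA_PFLOPS = 23.5
LONGHORN_PFLOPS = 2.3

PFLOPS = 10**15   # 1 petaflops = one quadrillion floating point operations per second
TFLOPS = 10**12   # 1 teraflops, so 1 pflops = 1,000 tflops

frontera_flops = FRONTERA_PFLOPS * PFLOPS
print(f"Frontera: {frontera_flops:.3e} flop/s")   # prints 2.350e+16
print(f"1 pflops = {PFLOPS // TFLOPS} tflops")    # prints 1000
print(f"Frontera is about {FRONTERA_PFLOPS / LONGHORN_PFLOPS:.1f}x Longhorn")
```

At these scales the ratio between the two machines, roughly a factor of ten, matters more to researchers than the absolute numbers.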


“Supercomputers are 100 times faster than a decade ago. Covid-19 spike protein simulations in various scenarios can run in one day on current supercomputers, rather than weeks and months. Next year, we will have exascale supercomputers, which will be 100 times faster than today. We have to keep investing in high-end computing as it is a critical tool in many areas of technology and critical health-related research," said Priyadarshan Vashishta, dean’s professor of biomedical engineering and computer science, University of Southern California.

Closer to home, the Indian Institute of Technology (IIT), Delhi, is letting government and private researchers use its supercomputer PADUM for covid-19 research. The institute intends to make it available to researchers for a duration of three months, giving them ₹1 crore worth of computational time.

“High-performance computing has become an integral part of all research. Covid-19 is not an exception. Considering the need to come up with some solutions to fight covid-19 very fast, several institutes and consortiums are offering free high-performance computing (HPC) resources to the research community working on it," said Suryachandra Rao, chief scientist at the Indian Institute of Tropical Meteorology, which has an HPC named Pratyush. “As far as I know, artificial intelligence/machine learning is extensively used to enhance our understanding of how this new virus behaves and spreads," he said.

There is no doubt that supercomputers can be very effective, but they are expensive and use up a lot of power. That is why many researchers have decided to pool their resources and build a crowdsourced equivalent of a supercomputer. Stanford University’s Folding@Home and the University of Washington’s Rosetta@Home are two such distributed computing projects being actively used for covid-19 research.

Unlike supercomputers, which are housed in one location, these projects pool the compute power of thousands, even lakhs, of personal computers and smartphones and use them to run their simulations.

“Over the last two months, we have gone from having around 30,000 to 4 million devices running Folding@Home," Dr. Greg Bowman, director, Folding@Home, told Which-50 Media last week. The projects use volunteers’ systems only during downtime, when the CPU and GPU are not being fully utilized. Some even let users choose when to allocate their resources. Rosetta@Home already has 100,000 host devices at its beck and call, with an estimated compute power of 1.26 petaflops.
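The idle-time approach described above can be sketched in a few lines of Python. This is a simplified illustration, not the actual Folding@Home or Rosetta@Home client logic: the 0.5 load threshold is a hypothetical cutoff, and `work_unit` stands in for fetching and simulating one protein job:

```python
import os
import time

def cpu_mostly_idle(threshold=0.5):
    """Return True when the 1-minute load average per core is below
    `threshold` (a hypothetical cutoff, not one any real project publishes)."""
    load_1min, _, _ = os.getloadavg()   # standard library, Unix-only
    cores = os.cpu_count() or 1
    return load_1min / cores < threshold

def run_when_idle(work_unit, max_units, threshold=0.5, poll_seconds=60):
    """Crunch up to `max_units` work units, but only while the host looks idle."""
    done = 0
    while done < max_units:
        if cpu_mostly_idle(threshold):
            work_unit()                 # e.g. simulate one folding trajectory
            done += 1
        else:
            time.sleep(poll_seconds)    # back off while the owner needs the machine
    return done
```

Real clients add scheduling preferences on top of this, such as the user-chosen time windows the article mentions, but the core idea is the same: only borrow cycles the owner isn’t using.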
