Shared supercomputing and everyday research
Updated: 24 Nov 2009, 12:16 AM IST
Portland, Oregon: For decades, the world’s supercomputers have been the tightly guarded property of universities and governments. But what would happen if regular folks could get their hands on one?
The price of supercomputers is dropping quickly, in part because they are often built with the same off-the-shelf parts found in PCs, as a supercomputing conference here last week made clear. Just about any organization with a few million dollars can now buy or assemble a top-flight machine.
These advances are pulling down the high walls around computing-intensive research. A result could be a democratization that gives ordinary people with a novel idea a chance to explore their curiosity with heavy computing firepower, and maybe find something unexpected. The trend has spurred some of the world’s top computing experts and scientists to work towards freeing valuable stores of information. The goal is to fill big computers with scientific data and then let anyone in the world with a personal computer, including amateur scientists, tap into these systems.
“It’s a good call to arms,” said Mark Barrenechea, the chief executive of Silicon Graphics, which sells computing systems to labs and businesses. “The technology is there. The need is there. This could exponentially increase the amount of science done across the globe.”
The notion of top research centres sharing information is hardly new. Some of the earliest incarnations of what we now know as the World Wide Web came to life so that physicists and other scientists could tap into large data stores from afar. In addition, universities and government labs were early advocates of what became popularized as grid computing, where shared networks were created to shuttle data about.
The current thinking, however, is that the labs can accomplish far more than was previously practical by piggybacking on some of the trends sweeping the technology industry. And, this time around, research bodies big and small, along with brainy individuals, can participate in the sharing agenda.
For inspiration, scientists are looking at cloud computing services such as Google’s online office software, photo-sharing sites and Amazon.com’s data centre rental programme. They are trying to bring that type of Web-based technology into their labs and make it handle enormous volumes of data.
“You’ve seen these desktop applications move into the cloud,” said Pete Beckman, the director of the Argonne Leadership Computing Facility in Illinois. “Now science is on that same track. This helps democratize science and good ideas.”
With $32 million from the US Department of Energy, Argonne has set to work on Magellan, a project to explore the creation of a cloud-computing infrastructure that scientists around the globe can use.
Beckman argued that such a system would reduce the need for smaller universities and labs to spend money on their own computing infrastructure.
Another benefit is that researchers would not need to spend days downloading huge data sets so that they could perform analysis on their own computers. Instead, they could send requests to Magellan and just receive the answers.
Even curious individuals on the fringe of academia may have a chance to delve into things like climate change and protein analysis.
“Some mathematician in Russia can say, ‘I have an idea’,” Beckman said. “The barrier to entry is so low for him to try out that idea. So, this really broadens the number of discoverers and, hopefully, discoveries.”
The computing industry has made such a discussion possible. Historically, the world’s top supercomputers relied on expensive, proprietary components. Government laboratories paid vast sums of money to use these systems for classified projects. But over the last 10 years, the vital innards of supercomputers have become more mainstream, and a wide variety of organizations have bought them.
At the conference, undergraduate students competed in a contest to build affordable mini-supercomputers on the fly. And a supercomputer called Jaguar at the Oak Ridge National Laboratory in Tennessee officially became the world’s fastest machine. It links thousands of mainstream chips from Advanced Micro Devices (AMD).
Seven of the world’s top 10 supercomputers use standard chips from AMD and Intel, as do about 90% of the 500 fastest machines. “I think this says that supercomputing technology is affordable,” said Margaret Lewis, an AMD director. “We are kind of getting away from this ivory tower.”

While Magellan and similar projects are encouraging signs, researchers have warned that much work lies ahead to free what they consider valuable information for broader analysis.

At the Georgia Institute of Technology, for example, researchers have developed software that can evaluate scans of the brain and heart and identify anomalies that might indicate problems. To advance such techniques, the researchers need to train their software by testing it on thousands of body scans. But it is hard to find a repository of such scans that a hospital or a government organization such as the National Institutes of Health is willing to share, even if personal information can be stripped away, said George Biros, a professor at the institute. “Medical schools don’t make this information available,” he said.
Bill Howe, a senior scientist at the eScience Institute at the University of Washington, has urged research organizations to reveal their information. “All the data that we collect in science should be accessible, and that’s just not the way it works today,” he said.
Howe said high school students and so-called citizen scientists could make new discoveries if given the chance.
“Let’s see what happens when classrooms of students explore this information,” he said.
©2009/THE NEW YORK TIMES