A security officer stands watch at a data centre in Las Vegas. Photo: Ethan Pines/NYT

Power, pollution and the Internet

Data centres waste vast amounts of energy, belying the information industry’s eco-friendly image

Santa Clara, California: Jeff Rothschild’s machines at Facebook had a problem he knew he had to solve immediately. They were about to melt.

The company had been packing a 40-by-60-foot rental space here with racks of computer servers that were needed to store and process information from members’ accounts. The electricity pouring into the computers was overheating Ethernet sockets and other crucial components.

Thinking fast, Rothschild, the company’s engineering chief, took some employees on an expedition to buy every fan they could find—“We cleaned out all of the Walgreens in the area,” he said—to blast cool air at the equipment and prevent the website from going down.

That was in early 2006, when Facebook had a quaint 10 million or so users and a single main server site. Today, the information generated by nearly 1 billion people requires outsize versions of these facilities, called data centres, with rows and rows of servers spread over hundreds of thousands of square feet, and all with industrial cooling systems.

They are a mere fraction of the tens of thousands of data centres that now exist to support the overall explosion of digital information. Stupendous amounts of data are set in motion each day as, with an innocuous click or tap, people download movies on iTunes, check credit card balances on Visa’s website, send Yahoo email with files attached, buy products on Amazon, post on Twitter or read newspapers online.

A yearlong examination by The New York Times has revealed that this foundation of the information industry is sharply at odds with its image of sleek efficiency and environmental friendliness.

Most data centres, by design, consume vast amounts of energy in an incongruously wasteful manner, interviews and documents show. Online companies typically run their facilities at maximum capacity around the clock, whatever the demand. As a result, data centres can waste 90% or more of the electricity they pull off the grid, The Times found.

To guard against a power failure, they further rely on banks of generators that emit diesel exhaust. The pollution from data centres has increasingly been cited by the authorities for violating clean air regulations, documents show. In Silicon Valley, many data centres appear on the state government’s Toxic Air Contaminant Inventory, a roster of the area’s top stationary diesel polluters.

Worldwide, the digital warehouses use about 30 billion watts of electricity, roughly equivalent to the output of 30 nuclear power plants, according to estimates industry experts compiled for The Times. Data centres in the US account for one-quarter to one-third of that load, the estimates show.
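The comparison is simple arithmetic. As a rough sketch, assuming the commonly cited figure of about 1 gigawatt of output per nuclear plant (a figure not given in the article):

```python
# Back-of-the-envelope check of the scale cited above.
# Assumption (not from the article): a typical nuclear plant produces
# roughly 1 gigawatt (1e9 watts), a commonly used round figure.

DATA_CENTRE_LOAD_WATTS = 30e9   # ~30 billion watts worldwide
NUCLEAR_PLANT_WATTS = 1e9       # assumed ~1 GW per plant

plants_equivalent = DATA_CENTRE_LOAD_WATTS / NUCLEAR_PLANT_WATTS
print(f"Equivalent to ~{plants_equivalent:.0f} nuclear plants")

# The article's US share: one-quarter to one-third of the total.
us_low = DATA_CENTRE_LOAD_WATTS * 0.25
us_high = DATA_CENTRE_LOAD_WATTS / 3
print(f"US share: {us_low / 1e9:.1f} to {us_high / 1e9:.1f} billion watts")
```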

“It’s staggering for most people, even people in the industry, to understand the numbers, the sheer size of these systems,” said Peter Gross, who helped design hundreds of data centres. “A single data centre can take more power than a medium-size town.”

Energy efficiency varies widely from company to company. But at the request of The Times, the consulting firm McKinsey and Co. analysed energy use by data centres and found that, on average, they were using only 6-12% of the electricity powering their servers to perform computations. The rest was essentially used to keep servers idling and ready in case of a surge in activity that could slow or crash their operations.

A server is a sort of bulked-up desktop computer, minus a screen and keyboard, that contains chips to process data. The study sampled some 20,000 servers in about 70 large data centres spanning the commercial gamut: drug companies, military contractors, banks, media companies and government agencies.

“This is an industry dirty secret, and no one wants to be the first to say mea culpa,” said a senior industry executive who asked not to be identified to protect his company’s reputation. “If we were a manufacturing industry, we’d be out of business straightaway.”

These physical realities of data are far from the mythology of the Internet, where lives are lived in the “virtual” world and all manner of memory is stored in “the cloud”.

The inefficient use of power is largely driven by a symbiotic relationship between users who demand an instantaneous response to the click of a mouse and companies that put their business at risk if they fail to meet that expectation.

Even running electricity at full throttle has not been enough to satisfy the industry. In addition to generators, most large data centres contain banks of huge, spinning flywheels or thousands of lead-acid batteries—many of them similar to automobile batteries—to power the computers in case of a grid failure as brief as a few hundredths of a second, an interruption that could crash the servers.

“It’s a waste,” said Dennis P. Symanski, a senior researcher at the Electric Power Research Institute, a non-profit industry group. “It’s too many insurance policies.”

At least a dozen major data centres have been cited for violations of air quality regulations in Virginia and Illinois alone, according to state records. Amazon was cited for more than 24 violations over a three-year period in Northern Virginia, including running some of its generators without a basic environmental permit.

A few companies say they are using extensively re-engineered software and cooling systems to decrease wasted power. Among them are Facebook and Google, which also have redesigned their hardware. Still, according to recent disclosures, Google’s data centres consume nearly 300 million watts and Facebook’s about 60 million watts.

Many of these solutions are readily available, but in a risk-averse industry, most companies have been reluctant to make wholesale change, according to industry experts.

Improving or even assessing the field is complicated by the secretive nature of an industry that is largely built around accessing other people’s personal data.

For security reasons, companies typically do not even reveal the locations of their data centres, which are housed in anonymous buildings and vigilantly protected. Companies also guard their technology for competitive reasons, said Michael Manos, a longtime industry executive.

“All of those things play into each other to foster this closed, members-only kind of group,” he said.

That secrecy often extends to energy use. To further complicate any assessment, no single government agency has the authority to track the industry. In fact, the federal government was unable to determine how much energy its own data centres consume, according to officials involved in a survey completed last year.

The survey did discover that the number of federal data centres grew from 432 in 1998 to 2,094 in 2010.

To investigate the industry, The Times obtained thousands of pages of local, state and federal records, some through freedom of information laws, that are kept on industrial facilities that use large amounts of energy. Copies of permits for generators and information about their emissions were obtained from environmental agencies, which helped pinpoint some data centre locations and details of their operations.

In addition to reviewing records from electrical utilities, The Times also visited data centres across the country and conducted hundreds of interviews with current and former employees and contractors.

Some analysts warn that as the amount of data and energy use continue to rise, companies that do not alter their practices could eventually face a shake-up in an industry that has been prone to major upheavals, including the bursting of the first Internet bubble in the late 1990s.

“It’s just not sustainable,” said Mark Bramfitt, a former utility executive who now consults for the power and information technology industries. “They’re going to hit a brick wall.”

Wearing an FC Barcelona T-shirt and plaid Bermuda shorts, Andre Tran strode through a Yahoo data centre in Santa Clara where he was the site operations manager. Tran’s domain—there were servers assigned to fantasy sports and photo sharing, among other things—was a fair sample of the countless computer rooms where the planet’s sloshing tides of data pass through or come to rest.

Aisle after aisle of servers, with amber, blue and green lights flashing silently, sat on a white floor punctured with small round holes that spit out cold air. Within each server were the spinning hard drives that store the data. The only hint that the centre was run by Yahoo, whose name was nowhere in sight, could be found in a tangle of cables coloured in the company’s signature purple and yellow.

“There could be thousands of people’s emails on these,” Tran said, pointing to one storage aisle. “People keep old emails and attachments forever, so you need a lot of space.”

This is the mundane face of digital information—player statistics flowing into servers that calculate fantasy points and league rankings, snapshots from nearly forgotten vacations kept forever in storage devices. It is only when the repetitions of those and similar transactions are added up that they start to become impressive.

Each year, chips in servers get faster, and storage media get denser and cheaper, but the furious rate of data production goes a notch higher.

Jeremy Burton, an expert in data storage, said that when he worked at a computer technology company 10 years ago, the most data-intensive customer he dealt with had about 50,000 gigabytes in its entire database. (Data storage is measured in bytes. The letter N, for example, takes one byte to store. A gigabyte is 1 billion bytes of information.)

Today, roughly 1 million gigabytes are processed and stored in a data centre during the creation of a single 3-D animated movie, said Burton, now at EMC, a company focused on the management and storage of data.

Just one of the company’s clients, the New York Stock Exchange, produces up to 2,000 gigabytes of data per day that must be stored for years, he added.

EMC and International Data Corp. together estimated that more than 1.8 trillion gigabytes of digital information were created globally last year.
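The unit arithmetic behind these figures can be laid out in a few lines; the decimal definition (1 gigabyte = 1 billion bytes) follows the article’s own usage:

```python
# Unit arithmetic behind the figures above (decimal units, as in the
# article: 1 gigabyte = 1e9 bytes).

GB = 10**9  # bytes per gigabyte

largest_db_decade_ago = 50_000 * GB   # ~50,000 GB, most data-intensive customer
one_3d_movie = 1_000_000 * GB         # ~1 million GB per animated film today
nyse_per_day = 2_000 * GB             # up to 2,000 GB/day, stored for years
global_total = 1.8e12 * GB            # ~1.8 trillion GB created in a year

# A single modern film's data versus the largest database of a decade prior:
print(one_3d_movie / largest_db_decade_ago)
```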

“It is absolutely a race between our ability to create data and our ability to store and manage data,” Burton said.

Some three-quarters of that data, EMC estimated, was created by ordinary consumers.

With no sense that data is physical or that storing it uses up space and energy, those consumers have developed the habit of sending huge data files back and forth, like videos and mass emails with photo attachments. Even seemingly mundane actions like running an app to find an Italian restaurant in Manhattan or a taxi in Dallas require servers to be turned on and ready to process the information instantaneously.

The complexity of a basic transaction is a mystery to most users: Sending a message with photographs to a neighbour could involve a trip through hundreds or thousands of miles of Internet conduits and multiple data centres before the email arrives across the street.

“If you tell somebody they can’t access YouTube or download from Netflix, they’ll tell you it’s a God-given right,” said Bruce Taylor, vice-president of the Uptime Institute, a professional organization for companies that use data centres.

To support all that digital activity, there are now more than 3 million data centres of widely varying sizes worldwide, according to figures from International Data Corp.

Nationwide, data centres used about 76 billion kilowatt-hours in 2010, or roughly 2% of all electricity used in the country that year, based on an analysis by Jonathan G. Koomey, a research fellow at Stanford University who has been studying data centre energy use for more than a decade. Datacenter Dynamics, a London-based firm, derived similar figures.
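The 2% figure checks out against the national total, assuming roughly 3,800 billion kilowatt-hours of total US electricity use in 2010 (a number not in the article, but close to the EIA’s reported figure):

```python
# Sanity check of the ~2% figure cited above.
# Assumption (not from the article): total US electricity consumption
# in 2010 was roughly 3,800 billion kWh, approximately the EIA figure.

data_centre_kwh = 76e9      # ~76 billion kWh used by data centres in 2010
us_total_kwh = 3_800e9      # assumed national total for that year

share = data_centre_kwh / us_total_kwh
print(f"{share:.1%}")
```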

Engineers at Viridity Software, a startup that helped companies manage energy resources, were not surprised by what they discovered on the floor of a sprawling data centre near Atlanta.

Viridity had been brought on board to conduct basic diagnostic testing. The engineers found that the facility, like dozens of others they had surveyed, was using the majority of its power on servers that were doing little except burning electricity, said Michael Rowan, who was Viridity’s chief technology officer.

A senior official at the data centre already suspected that something was amiss. He had previously conducted his own informal survey, putting red stickers on servers he believed to be “comatose”—the term engineers use for servers that are plugged in and using energy even as their processors are doing little if any computational work.

“At the end of that process, what we found was our data centre had a case of the measles,” the official, Martin Stephens, said during a Web seminar with Rowan. “There were so many red tags out there it was unbelievable.”

The Viridity tests backed up Stephens’ suspicions: In one sample of 333 servers monitored in 2010, more than half were found to be comatose. All told, nearly three-quarters of the servers in the sample were using less than 10% of their computational brainpower, on average, to process data.
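A survey like Viridity’s can be sketched as a single pass over per-server utilization readings. The sketch below is illustrative only: the 10% threshold mirrors the article, but the comatose cutoff, the field names and the classification scheme are assumptions, not Viridity’s actual method.

```python
# Illustrative sketch of a "comatose server" audit like the one described
# above. The under-10% bucket mirrors the article's finding; treating a
# server as comatose below 1% average utilization is an assumed cutoff.

def audit(avg_cpu_utilization: dict[str, float]) -> dict[str, list[str]]:
    """Classify servers by average CPU utilization (0.0 to 1.0)."""
    report = {"comatose": [], "underused": [], "active": []}
    for server, util in avg_cpu_utilization.items():
        if util < 0.01:        # plugged in, doing almost no work
            report["comatose"].append(server)
        elif util < 0.10:      # under 10% of capacity, on average
            report["underused"].append(server)
        else:
            report["active"].append(server)
    return report

sample = {"srv-001": 0.003, "srv-002": 0.06, "srv-003": 0.45}
print(audit(sample))
```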

The data centre’s operator was not some seat-of-the-pants app developer or online gambling company, but LexisNexis, the database giant. And it was hardly unique.

In many facilities, servers are loaded with applications and left to run indefinitely, even after nearly all users have vanished or new versions of the same programs are running elsewhere.

“You do have to take into account that the explosion of data is what aids and abets this,” said Taylor of the Uptime Institute. “At a certain point, no one is responsible anymore, because no one, absolutely no one, wants to go in that room and unplug a server.”

Kenneth Brill, an engineer who in 1993 founded the Uptime Institute, said low utilization began with the field’s “original sin”.

In the early 1990s, Brill explained, software operating systems that would now be considered primitive crashed if they were asked to do too many things, or even if they were turned on and off. In response, computer technicians seldom ran more than one application on each server and kept the machines on around the clock, no matter how sporadically that application might be called upon.

So as government energy watchdogs urged consumers to turn off computers when they were not being used, the prime directive at data centres became running computers at all costs.

A crash or a slowdown could end a career, said Michael Tresh, formerly a senior official at Viridity. A field born of cleverness and audacity is now ruled by something else: fear of failure.

 ©2012/The New York Times
