Saturday, 19 April 2014

Resource record:

Source:

Original link at businessweek.com
TOM VANDERBILT

Publication date:

Thursday, 18 June 2009

Last updated:

Thursday, 18 June 2009

Entered in the observatory:

Thursday, 18 June 2009

Language:

English

Data Center Overload

It began with an Xbox game.

On a recent rainy evening in Brooklyn, I was at a friend’s house playing (a bit sheepishly, given my incipient middle age) Call of Duty: World at War. Scrolling through the game’s menus, I noticed a screen for Xbox Live, which allows you to play against remote users via broadband. The number of Call of Duty players online at that moment? More than 66,000.

Walking home, I ruminated on the number. Sixty-six thousand is the population of a small city — Muncie, Ind., for one. Who and where was this invisible metropolis? What infrastructure was needed to create this city of ether?

We have an almost inimical incuriosity when it comes to infrastructure. It tends to feature in our thoughts only when it’s not working. The Google search results that are returned in 0.15 seconds were once a stirring novelty but soon became just another assumption in our lives, like the air we breathe. Yet whose day would proceed smoothly without the computing infrastructure that increasingly makes it possible to navigate the world and our relationships within it?

Much of the daily material of our lives is now dematerialized and outsourced to a far-flung, unseen network. The stack of letters becomes the e-mail database on the computer, which gives way to Hotmail or Gmail. The clipping sent to a friend becomes the attached PDF file, which becomes a set of shared bookmarks, hosted offsite. The photos in a box are replaced by JPEGs on a hard drive, then a hosted sharing service like Snapfish. The tilting CD tower gives way to the MP3-laden hard drive, which itself yields to a service like Pandora: music that is always “there,” waiting to be heard.

But where is “there,” and what does it look like?

“There” is nowadays likely to be increasingly large, powerful, energy-intensive, always-on and essentially out-of-sight data centers. These centers run enormously scaled software applications with millions of users. To appreciate the scope of this phenomenon, and its crushing demands on storage capacity, let me sketch just the iceberg’s tip of one average individual digital presence: my own. I have photos on Flickr (which is owned by Yahoo, so they reside in a Yahoo data center, probably the one in Wenatchee, Wash.); the Wikipedia entry about me dwells on a database in Tampa, Fla.; the video on YouTube of a talk I delivered at Google’s headquarters might dwell in any one of Google’s data centers, from The Dalles in Oregon to Lenoir, N.C.; my LinkedIn profile most likely sits in an Equinix-run data center in Elk Grove Village, Ill.; and my blog lives at Modwest’s headquarters in Missoula, Mont. If one of these sites happened to be down, I might have Twittered a complaint, my tweet paying a virtual visit to (most likely) NTT America’s data center in Sterling, Va. And in each of these cases, there would be at least one mirror data center somewhere else — the built-environment equivalent of an external hard drive, backing things up.

Small wonder that this vast, dispersed network of interdependent data systems has lately come to be referred to by an appropriately atmospheric — and vaporous — metaphor: the cloud. Trying to chart the cloud’s geography can be daunting, a task that is further complicated by security concerns. “It’s like ‘Fight Club,’ ” says Rich Miller, whose Web site, Data Center Knowledge, tracks the industry. “The first rule of data centers is: Don’t talk about data centers.”
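A concrete, if partial, way to glimpse that geography is simply to ask the Domain Name System where a service answers from. The following is a minimal Python sketch; the hostnames are services mentioned in this article, the addresses returned will vary with where and when you ask, and mapping an address to a physical data center would additionally require IP-geolocation data not shown here:

    import socket

    # First step in charting the cloud's geography: resolve a service's
    # hostname to the IP addresses currently answering for it. Large sites
    # answer from many regional data centers, so results depend on your
    # vantage point and the moment you ask.
    for host in ["www.flickr.com", "www.youtube.com", "www.linkedin.com"]:
        try:
            infos = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)
            addresses = sorted({info[4][0] for info in infos})
            print(f"{host}: {', '.join(addresses)}")
        except socket.gaierror as err:
            print(f"{host}: lookup failed ({err})")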

Yet as data centers increasingly become the nerve centers of business and society — even the storehouses of our fleeting cultural memory (that dancing cockatoo on YouTube!) — the demand for bigger and better ones increases: there is a growing need to produce the most computing power per square foot at the lowest possible cost in energy and resources. All of which is bringing a new level of attention, and challenges, to a once rather hidden phenomenon. Call it the architecture of search: the tens of thousands of square feet of machinery, humming away 24/7, 365 days a year — often built on, say, a former bean field — that lie behind your Internet queries.
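To make "lowest possible cost in energy" concrete: the yardstick the industry settled on around this time is power usage effectiveness, or PUE, the ratio of everything a facility draws to the power that actually reaches the computers. A back-of-envelope sketch, with all figures assumed for illustration rather than taken from this article:

    # PUE = total facility power / IT equipment power. A perfect facility
    # would score 1.0; every point above that is cooling and electrical
    # overhead. All figures below are assumed for illustration.
    it_load_kw = 10_000            # servers, storage, network gear
    cooling_kw = 4_000             # chillers, fans, pumps
    electrical_losses_kw = 1_000   # UPS conversion losses, lighting, etc.

    total_kw = it_load_kw + cooling_kw + electrical_losses_kw
    pue = total_kw / it_load_kw
    print(f"PUE = {pue:.2f}")  # 1.50: each watt of computing costs half a watt more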

INSIDE THE CLOUD

Microsoft’s data center in Tukwila, Wash., sits amid a nondescript sprawl of beige boxlike buildings. As I pulled up to it in a Prius with Michael Manos, who was then Microsoft’s general manager of data-center services, he observed that while “most people wouldn’t be able to tell this wasn’t just a giant warehouse,” an experienced eye could discern revelatory details. “You would notice the plethora of cameras,” he said. “You could follow the power lines.” He gestured to a series of fluted silver pipes along one wall. “Those are chimney stacks, which probably tells you there’s generators behind each of those stacks.” The generators, like the huge banks of U.P.S. (uninterruptible power supply) batteries, ward against surges and power failures to ensure that the data center always runs smoothly.

After submitting to biometric hand scans in the lobby and passing through a sensor-laden multidoor man trap, Manos and I entered a bright, white room filled with librarylike rows of hulking, black racks of servers — the dedicated hardware that drives the Internet. The Tukwila data center happens to be one of the global homes of Microsoft’s Xbox Live: within those humming machines exists my imagined city of ether. Like most data centers, Tukwila comprises a sprawling array of servers, load balancers, routers, firewalls, tape-backup libraries and database machines, all resting on a raised floor of removable white tiles, beneath which run neatly arrayed bundles of power cabling. To help keep servers cool, Tukwila, like most data centers, has a system of what are known as hot and cold aisles: cold air that seeps from perforated tiles in front is sucked through the servers by fans, expelled into the space between the backs of the racks and then ventilated from above. The collective din suggests what it must be like to stick your head in a Dyson Airblade hand dryer.
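The hot-aisle/cold-aisle arrangement exists because removing heat means moving air, and the quantities involved help explain that din. A back-of-envelope sketch using the standard sensible-heat relation Q = m·c_p·ΔT, with all figures assumed for illustration:

    # How much air a single rack needs: the heat it dissipates must be
    # carried off by air warming up as it passes from the cold aisle to
    # the hot aisle. All figures are assumed for illustration.
    rack_heat_kw = 8.0   # heat dissipated by one rack of servers
    delta_t_c = 12.0     # air temperature rise across the rack, deg C
    c_p_air = 1.005      # specific heat of air, kJ/(kg*K)
    air_density = 1.2    # kg/m^3 near room conditions

    mass_flow = rack_heat_kw / (c_p_air * delta_t_c)  # kg/s of air required
    volume_flow = mass_flow / air_density             # m^3/s
    cfm = volume_flow * 2118.88                       # cubic feet per minute
    print(f"{mass_flow:.2f} kg/s = {volume_flow:.2f} m^3/s "
          f"= {cfm:.0f} CFM per rack")

On these assumptions a single rack demands well over a thousand cubic feet of air per minute, which is why a room of them sounds the way it does.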

(Continue reading at nytimes.com)