Author Topic: INSIDE THE ARCTIC CIRCLE, WHERE YOUR FACEBOOK DATA LIVES  (Read 875 times)

Offline Reginald Hudlin

  • Landlord
  • Honorary Wakandan
  • Posts: 9884
INSIDE THE ARCTIC CIRCLE, WHERE YOUR FACEBOOK DATA LIVES
« on: October 10, 2013, 02:12:13 pm »
Every year, computing giants including Hewlett-Packard (HPQ), Dell (DELL), and Cisco Systems (CSCO) sell north of $100 billion in hardware. That’s the total for the basic iron—servers, storage, and networking products. Add in specialized security, data analytics systems, and related software, and the figure gets much, much larger. So you can understand the concern these companies must feel as they watch Facebook (FB) publish more efficient equipment designs that directly threaten their business. For free.

The Dells and HPs of the world exist to sell and configure data-management gear to companies, or rent it out through cloud services. Facebook’s decision to publish its data center designs for anyone to copy could embolden others to bypass U.S. tech players and use low-cost vendors in Asia to supply and bolt together the systems they need.

Instead of buying server racks from the usual suspects, Facebook designs its own systems and outsources the manufacturing work. In April 2011, the social networking company began publishing its hardware blueprints as part of its so-called Open Compute Project, which lets other companies piggyback on the work of its engineers. The project now sits at the heart of the data center industry’s biggest shift in more than a decade. “There is this massive transition taking place toward what the new data center of tomorrow will look like,” says Peter Levine, a partner at venture capital firm Andreessen Horowitz. “We’re talking about hundreds of billions if not trillions of dollars being shifted from the incumbents to new players coming in with Facebook-like technology.” (Bloomberg LP, which owns Bloomberg Businessweek, is an investor in Andreessen Horowitz.)

The heart of Facebook’s experiment lies just south of the Arctic Circle, in the Swedish town of Luleå. In the middle of a forest at the edge of town, the company in June opened its latest megasized data center, a giant building that comprises thousands of rectangular metal panels and looks like a wayward spaceship. By all public measures, it’s the most energy-efficient computing facility ever built, a colossus that helps Facebook process 350 million photographs, 4.5 billion “likes,” and 10 billion messages a day. While an average data center needs 3 watts of energy for power and cooling to produce 1 watt for computing, the Luleå facility runs nearly three times leaner, at a ratio of 1.04 to 1. “What Facebook has done to the hardware market is dramatic,” says Tom Barton, the former chief executive officer of server maker Rackable Systems (SGI). “They’re putting pressure on everyone.”
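
That 1.04-to-1 figure is what the data center industry calls power usage effectiveness, or PUE: total power drawn by the facility divided by the power that actually reaches the computing gear. Here is a minimal sketch of the arithmetic in Python, using the article’s numbers (the function and variable names are mine, for illustration):

    # Power usage effectiveness: total facility power / power delivered to the servers.
    def pue(total_facility_watts, it_watts):
        return total_facility_watts / it_watts

    typical = pue(3.0, 1.0)   # an "average" data center: 3 W in for every 1 W of computing
    lulea = pue(1.04, 1.0)    # Facebook's Lulea facility: 1.04 W in per 1 W of computing

    # Overhead (cooling, power conversion, lighting, etc.) per watt of computing:
    print(f"typical overhead: {typical - 1:.2f} W")           # 2.00
    print(f"Lulea overhead:   {lulea - 1:.2f} W")             # 0.04
    print(f"Lulea is roughly {typical / lulea:.1f}x leaner")  # 2.9x

In other words, almost every watt the Luleå building draws ends up doing computation rather than running chillers.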

The location has a lot to do with the system’s efficiency. Sweden has a vast supply of cheap, reliable power produced by its network of hydroelectric dams. Just as important, Facebook has engineered its data center to turn the frigid Swedish climate to its advantage. Instead of relying on enormous air-conditioning units and power systems to cool its tens of thousands of computers, Facebook allows the outside air to enter the building and wash over its servers, after the building’s filters clean it and misters adjust its humidity. Unlike a conventional, warehouse-style server farm, the whole structure functions as one big device.
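
Read as a control problem, that design comes down to a few decisions per pass of intake air: is it cool enough to use directly, is it dry enough to need misting, and should warm server exhaust be mixed back in? The sketch below is a hypothetical simplification under my own assumptions; the thresholds and the cooling_mode function are illustrative, not Facebook’s actual building controls.

    # Hypothetical free-air cooling decision. Thresholds are illustrative only;
    # a real facility tunes them to its hardware's allowed inlet-temperature
    # and humidity ranges.
    def cooling_mode(outside_temp_f, relative_humidity_pct):
        if outside_temp_f > 85:
            # Too warm for outside air alone; fall back to evaporative or mechanical cooling.
            return "mechanical or evaporative assist"
        if relative_humidity_pct < 20:
            # Intake air too dry for the electronics; run the misters to add moisture.
            return "free cooling + misting"
        if outside_temp_f < 40:
            # Very cold intake is tempered by mixing in warm server exhaust.
            return "free cooling + exhaust recirculation"
        return "free cooling"

    print(cooling_mode(28, 55))   # a cold Lulea day -> "free cooling + exhaust recirculation"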

To simplify its servers, which are used mostly to create Web pages, Facebook’s engineers stripped away typical components such as extra memory slots, cables, and protective plastic cases. The servers are basically slimmed-down, exposed motherboards that slide into a fridge-size rack. The engineers say this design allows better airflow over each server. The systems also require less cooling, because with fewer components they can function at temperatures as high as 85°F. (Most servers are expected to keel over at 75°F.)

When Facebook started to outline its ideas, traditional data center experts were skeptical, especially of hotter-running servers. “People run their data centers at 60 or 65 degrees with 35-mile-per-hour wind gusts going through them,” says Frank Frankovsky, Facebook’s vice president of hardware design and supply chain operations, who heads the Open Compute Project. Facebook’s more efficient designs have given the company the freedom to place data centers beyond the Arctic. The next one will go online in Iowa, where cheap wind power is plentiful. The company has also begun designing its own storage and networking systems. Frankovsky describes the reaction from hardware suppliers as “Oh my gosh, you stole my cheese!”