Here's A Tour Of Facebook's Revolutionary New Data Centre That It's Using To Clobber Google


Photo: Courtesy of Open Compute Project

Facebook just opened a futuristic, energy-efficient data centre that cost tens of millions of dollars to build. The data centre is the company’s first — Facebook previously leased data centre space — and it’s located in Prineville, Oregon.

It’s 38% more energy efficient than the standard data centre and cost 24% less to build.

The best trick: Facebook released all the specs for the data centre as part of its new Open Compute Project.

The goal is to move the entire data centre industry forward — and gradually reduce one of Google’s big advantages.

A high efficiency electrical system wastes less power.

The data centre uses special power supplies that can accept 277 volts of AC power instead of the usual 208 volts. This lets power enter the building and go directly to the servers, instead of going through various AC-to-DC conversions that waste power.
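The saving comes from multiplying fewer lossy conversion stages together. As a rough illustration — using made-up per-stage efficiencies, not Facebook's published figures — a sketch of the comparison:

```python
# Hypothetical per-stage efficiencies for illustration only;
# these are NOT Facebook's published numbers.

def delivered_fraction(stage_efficiencies):
    """Fraction of input power that survives a chain of conversion stages."""
    frac = 1.0
    for eff in stage_efficiencies:
        frac *= eff
    return frac

# Traditional chain: UPS double conversion -> 208 V PDU transformer -> server PSU
traditional = delivered_fraction([0.94, 0.97, 0.90])

# Simplified chain: 277 V AC straight into a high-efficiency server power supply
simplified = delivered_fraction([0.95])

print(f"traditional: {traditional:.1%}, simplified: {simplified:.1%}")
```

Every stage you remove is loss you never incur, which is why fewer AC-to-DC hops beats making each individual hop slightly better.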

The chassis for each server is perfectly smooth.

The case for each server is as simple as possible: no screws, no sharp corners, and simple ways to swap out components like motherboards. It's also taller than normal so it can fit a taller heat sink and a bigger fan inside -- which means less outside air is needed for cooling. Data centre workers can access cabling from the 'cool aisle' at the front instead of the 'hot aisle' at the back, where temperatures are usually around 100 F.

The actual guts of the servers are bare-bones motherboards.

Inside each server, motherboards from Intel and AMD were stripped down to the bare necessities to save money. They look complicated, but many features found on typical motherboards are missing.

The servers are racked into triplet racks.

Each of the three columns holds 30 servers, for a total of 90 per rack.

The battery cabinet provides backup power.

Between the triplets, a set of uninterruptible power supplies (UPSs -- think big batteries) provides backup in case of a power outage.

The cooling system uses outside air -- there's no air conditioning.

Cooling is usually one of the biggest costs in a data centre. Facebook is using 100% outside air, which means no power has to be spent on air conditioning. The air enters through a group of vents on the second floor, then passes through filters and a 'misting chamber' that adds humidity if necessary. The entire cooling system is on the second floor, and air is blown down directly onto the servers -- no ducts required.

The Oregon high desert has a great climate for a data centre.

Facebook studied many possible locations before deciding on Prineville. It's high desert -- this is the landscape outside of town -- which means it's very dry and seldom gets very hot. The all-time high temperature there is 105 F, and Facebook has designed its data centre to handle temperatures of 110 F.

One area where Facebook DIDN'T save money was the cool blue light.

Facebook chose blue LEDs because they look cool and match the Facebook logo, even though they cost $0.07 apiece. Green ones would have cost only $0.02 each.

Efficiency not only saves money -- it's also good for the earth!

With all of these advances, the Facebook data centre has a power usage effectiveness (PUE) of 1.073. That means 93% of the power that comes into the data centre actually makes it to the servers. Normal data centres have a PUE of around 1.5. Less power used means fewer fossil fuels burned, which means you don't have to feel quite so guilty for spending all day playing FarmVille.
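PUE is total facility power divided by power delivered to the IT equipment, so the fraction reaching the servers is simply its reciprocal. A quick sanity check of the figures above:

```python
def server_power_fraction(pue):
    """Fraction of total facility power that reaches the IT equipment: 1 / PUE."""
    return 1.0 / pue

print(f"Prineville (PUE 1.073): {server_power_fraction(1.073):.1%}")  # 93.2%
print(f"Typical    (PUE 1.5):   {server_power_fraction(1.5):.1%}")    # 66.7%
```

A PUE of exactly 1.0 would mean every watt drawn from the grid powers a server, with nothing lost to cooling, lighting, or power conversion.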
