Facebook just opened a futuristic, energy-efficient data centre that cost tens of millions of dollars to build. The data centre is the company's first — Facebook previously leased data centre space — and it's located in Prineville, Oregon.
It’s 38% more energy efficient than the standard data centre and cost 24% less to build.
The best trick: Facebook released all the specs for the data centre as part of a new Open Compute Project.
The goal is to move the entire data centre industry forward — and gradually reduce one of Google’s big advantages.
The data centre uses special power supplies that can accept 277 volts of AC power instead of the usual 208 volts. This lets power enter the building and go directly to the servers, instead of going through various AC-to-DC conversions that waste power.
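The savings from skipping conversion stages compound multiplicatively: every stage wastes a slice of whatever power reaches it. Here's a toy calculation of that effect (the stage efficiencies are illustrative assumptions, not Facebook's published figures):

```python
def chain_efficiency(stage_efficiencies):
    """Overall efficiency after power passes through a series of
    conversion stages, each expressed as a fraction between 0 and 1."""
    eff = 1.0
    for stage in stage_efficiencies:
        eff *= stage
    return eff

# Illustrative numbers only. A traditional path might go through a
# UPS double-conversion, a step-down transformer, and finally the
# server's own AC-to-DC power supply.
traditional = chain_efficiency([0.94, 0.97, 0.90])  # ~0.82

# 277 V delivered straight to a high-efficiency server power supply.
direct_277v = chain_efficiency([0.945])             # 0.945
```

Even if each stage looks efficient on its own, three stages in a row can waste nearly a fifth of the incoming power, which is why the direct 277 V path matters.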
The case for each server is as simple as possible: no screws, no sharp corners, and easy swaps for components like motherboards. It's also taller than normal so it can fit a taller heat sink and bigger fan inside -- which means less outside air is needed for cooling. Data centre workers can access cabling from the 'cool aisle' at the front instead of the 'hot aisle' at the back, where temperatures are usually around 100 F.
Inside the server, motherboards from Intel and AMD were stripped down to the bare necessities to save money. It looks complicated, but in fact a lot of features found in typical motherboards are missing.
Between each 'triplet' of server racks, a set of uninterruptible power supplies (UPSs -- think big batteries) provides backup in case of a power outage.
Cooling is usually one of the biggest costs in a data centre. Facebook is using 100% outside air, which means no power has to be used on air conditioning. The air enters through a group of vents on the second floor, then is passed through filters and a 'misting chamber' which adds humidity if necessary. The entire cooling system is on the second floor, and air is blown down directly on to the servers -- no ducts required.
Facebook studied many possible locations before deciding on Prineville. It's high desert -- this is the landscape outside of town -- which means it's very dry and seldom gets very hot. The all-time high temperature there is 105 F, and Facebook has designed its data centre to handle temperatures of 110 F.
Facebook decided to use blue LEDs because they're cool looking and match the Facebook logo, even though they cost $0.07 apiece. Green ones would have cost only $0.02 each.
With all of these advances, the Facebook data centre has a power usage effectiveness (PUE) of 1.073. That means about 93% of the power that comes into the data centre actually makes it to the servers. Normal data centres have a PUE of around 1.5. Less power used means fewer fossil fuels burned, which means you don't have to feel quite as guilty for spending all day playing FarmVille.
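The PUE arithmetic is simple enough to check yourself -- it's just total facility power divided by the power reaching the IT gear. A quick sketch (the kilowatt figures are hypothetical; only the PUE values come from the article):

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total power entering the facility
    divided by the power that actually reaches the IT equipment."""
    return total_facility_kw / it_equipment_kw

# Hypothetical load: drawing 1073 kW to deliver 1000 kW to the servers.
facebook_pue = pue(1073, 1000)            # 1.073
to_servers = 1 / facebook_pue             # ~0.932, i.e. about 93%

# A typical data centre at PUE 1.5 delivers only ~67% to the servers.
typical_to_servers = 1 / pue(1500, 1000)  # ~0.667
```

A perfect facility would have a PUE of exactly 1.0, so the gap between 1.073 and 1.5 is the air conditioning, conversion losses, and other overhead that Facebook's design avoids.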