Facebook announced the Open Compute Project in 2011 as a way to openly share the designs for its data centers — “to spark a collaborative dialogue … [and] collectively develop the most efficient computing infrastructure possible.”
Starting in 2009, three Facebook employees dedicated themselves to custom-designing servers, server racks, power supplies, UPS units, and battery backup systems for the company’s first data center in Prineville, Oregon.
By 2011, Facebook’s data center in Prineville used 38% less energy to do the same work as the company’s other data centers at the time, while costing 24% less.
Since then, Facebook has improved on its designs and last summer opened another data center in Lulea, Sweden.
With the help of Facebook and photographer Alan Brandt, we compiled some photos to show what Facebook’s data centers look like from the inside and outside. And they are some gorgeous-looking facilities.
The interior of Facebook's data center in Forest City, North Carolina. The company announced this center in 2010.
Inside the Lulea data center, you can see Facebook's 'vanity free' approach to design: there are no plastic bezels in front of its servers (something commonly found in other data centers), which allows the servers to draw in more air.
In the Lulea data center, web server and storage designs use snaps and spring-loaded catches to hold components in place.
Lulea's rapid deployment data center (RDDC) design is all about being lean, which allows Facebook to deploy two data halls in the time it previously took to deploy one, thus reducing the cost of construction.
This is Facebook's data center in Prineville, Oregon, which is the first data center deployed using the company's Open Compute Project designs.
Facebook used 1,560 tons of steel to build its Prineville data center, equivalent to the weight of 900 mid-size cars.
Facebook's Prineville data center also uses a lot of wires and cables. In fact, there are 950 miles' worth of wires and cables in this data center alone, roughly the distance between Boston and Indianapolis.
The Prineville data center also has a ton of concrete: 14,254 cubic yards, to be exact. Imagine a sidewalk that's 24.3 miles long.
Thanks to Facebook's unique server design, technicians like this one working in Prineville don't have to spend time hunting for the right tools and unscrewing multiple components every time they need to replace a failed part.
With the efficiency gains afforded by the unique server designs, Facebook has reduced the average repair time to swap parts by more than 50%.
Facebook's rapid data center deployment is similar to assembling a car: the structural frame is built first, the components are attached on an assembly line in a factory, and the entire structure is then driven to the building site on a truck.
Here you can see technicians delivering server racks to the Lulea data center. Facebook takes 'the Ikea approach,' in which pieces of the finished structure are packed together in a flat box, simplifying the assembly process and helping avoid mistakes.
As a result of these unique data centers, Facebook can handle 6 billion daily 'Likes,' as well as the 400 billion photos and 7.8 trillion messages that have been sent since Facebook was founded a decade ago.