On Thursday, Facebook’s industry-changing Open Compute Project (OCP) took on one of the biggest, most important new fields in tech: machine learning and artificial intelligence.
Facebook is giving away, for free, hardware designs that help make computers smarter and able to learn.
While most artificial intelligence is powered by software, as that software grows more sophisticated, it needs more powerful and specialised hardware to run.
In a blog post from Facebook’s engineering team, engineers Kevin Lee and Serkan Piantino explain:
Over the past several years, Facebook has open sourced infrastructure components and designs, and has developed software that can read stories, answer questions about scenes, play games, and even learn unspecified tasks. Facebook’s AI work, which it’s done in the open, has (until today) consisted of research papers and code; now, the company is adding hardware to this list for the first time ever, providing a blueprint for how other organisations can set up AI-specific infrastructure — whether they’re startups, academics or large companies.
In other words, Facebook has already freely given away much of the machine-learning software it created. Now it’s handing a key hardware design from a project code-named Big Sur to OCP.
OCP is an organisation Facebook created to do for hardware what the Linux operating system did for software: make it free and open source. That means there are no guarded trade secrets. Engineers are free to collaborate, to see each other’s hardware designs, use them, change them, and share their changes.
Facebook + Nvidia
Big Sur is a system built around graphics processing units (GPUs), designed for “training” smart computers and networks of smart computers (also known as “neural networks”).
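“Training” a neural network just means repeatedly adjusting numeric weights to shrink prediction error — dense, repetitive arithmetic, which is exactly the kind of work GPUs excel at. As a toy illustration (this is a hypothetical single-neuron sketch in plain Python, not Facebook’s code, and real systems run this kind of maths across millions of parameters on GPU hardware):

```python
import math

def train_neuron(samples, epochs=2000, lr=0.5):
    """Train a single sigmoid neuron on (inputs, target) pairs
    using plain gradient descent."""
    w = [0.0, 0.0]   # weights, adjusted a little on every example
    b = 0.0          # bias term
    for _ in range(epochs):
        for (x1, x2), target in samples:
            z = w[0] * x1 + w[1] * x2 + b
            out = 1.0 / (1.0 + math.exp(-z))   # sigmoid activation
            err = out - target                 # how wrong was the guess?
            grad = err * out * (1.0 - out)     # chain rule for sigmoid
            w[0] -= lr * grad * x1             # nudge weights to reduce error
            w[1] -= lr * grad * x2
            b -= lr * grad
    return w, b

def predict(w, b, x1, x2):
    """Run the trained neuron on a new input."""
    return 1.0 / (1.0 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))

# Teach the neuron the logical OR function from four examples.
data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_neuron(data)
```

After training, the neuron outputs a value near 0 for (0, 0) and near 1 for the other inputs. Systems like Big Sur apply the same principle at vastly larger scale, which is why raw GPU throughput translates directly into faster training.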
Facebook created this GPU system with the biggest gun in the GPU world, Nvidia. Big Sur uses Nvidia’s new Tesla M40 GPU accelerators, introduced last month. Facebook is the first company to use the new accelerators to train neural networks, Nvidia says.
That doesn’t mean that the Nvidia chip’s design is being given away for free. Companies wanting to build a neural network using Facebook’s design would still need to buy the chips from Nvidia.
As with all OCP projects, though, engineers can swap out components and pair the Big Sur design with parts from vendors other than Nvidia. Still, by using Nvidia’s latest tech, Facebook says Big Sur is “twice as fast as our previous generation, which means we can train twice as fast and explore networks twice as large.”
What does Facebook get out of this deal? Any engineer anywhere in the world with an idea can help Facebook build better tech for lower costs.
In 2015, Facebook estimated that it had saved $2 billion on its data centres from the OCP project.
As engineers Lee and Piantino write:
We want to make it a lot easier for AI researchers to share techniques and technologies. As with all hardware systems that are released into the open, it’s our hope that others will be able to work with us to improve it. We believe that this open collaboration helps foster innovation for future designs, putting us all one step closer to building complex AI systems that bring this kind of innovation to our users and, ultimately, help us build a more open and connected world.