Google’s artificial intelligence division has created a computer that can learn how to play video games and eventually beat humans at them.
Google is working on a lot of advanced research projects, but AI is particularly important for the company because it could help improve search results — Google’s core business. CEO Larry Page has said that he wants search to be able to anticipate what a person wants before they even know they want it.
Last year, Google bought an AI company called DeepMind for a reported $US628 million, and this project comes out of that group. The researchers shared their results in the journal Nature this morning.
Researchers showed the computer 49 games on the Atari 2600, a simple game console that was popular in the 1980s. They gave the computer no instructions on how to play the games; instead, it had to watch the screen and learn on its own. They set up a system that “rewarded” the computer for playing well, so it knew when it was improving.
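The reward loop described above is the core of reinforcement learning. DeepMind's actual system paired it with a deep neural network reading raw screen pixels; the sketch below is only a minimal tabular version on a made-up toy task (all names and the environment are illustrative, not from the paper), showing how an agent with no instructions can still improve purely from a reward signal:

```python
import random

def train(n_states=5, n_actions=2, episodes=500, alpha=0.5, gamma=0.9, eps=0.1):
    """Tabular Q-learning sketch: the agent is told nothing about the task,
    only whether its last action earned a reward."""
    q = [[0.0] * n_actions for _ in range(n_states)]
    for _ in range(episodes):
        s = 0
        while s < n_states - 1:
            # Mostly act greedily, occasionally explore at random.
            if random.random() < eps:
                a = random.randrange(n_actions)
            else:
                a = max(range(n_actions), key=lambda x: q[s][x])
            # Toy environment (an assumption, not an Atari game):
            # action 1 moves toward the goal, action 0 stays put.
            s2 = s + 1 if a == 1 else s
            r = 1.0 if s2 == n_states - 1 else 0.0  # "reward" only at the goal
            # Update the value estimate from the reward — this is how the
            # agent "knows when it is improving".
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

random.seed(0)
q = train()
# After training, the goal-seeking action scores higher in every state.
assert all(q[s][1] > q[s][0] for s in range(4))
```

The same idea scales up in the published system: the table is replaced by a neural network that estimates action values directly from pixels.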
The computer matched or outperformed an expert human player in 29 of the games, and beat every other known computer algorithm in 43. It did particularly well at simple games like Breakout, where you move a paddle to hit a ball and try to break down bricks.
The one gap, according to the MIT Technology Review, was in games like Ms. Pac-Man, where the computer had to plan ahead to figure out how to clear the last dots from the maze. Computers have trouble planning even a few seconds ahead, and this one could only look at the immediate past — the previous 1/15th of a second — and learn from its mistakes in that window.
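That 1/15th of a second corresponds to roughly four frames of an Atari game running at 60 frames per second: the published system's entire view of the past was a sliding stack of the most recent screen frames. A minimal sketch of that window (the integer "frames" here are placeholders for real pixel data):

```python
from collections import deque

WINDOW = 4  # ~4 frames at 60 fps is about 1/15th of a second

# Older frames fall off the back automatically; the agent has no
# longer-term memory to plan with.
history = deque(maxlen=WINDOW)
for frame_id in range(10):   # placeholder frames; real input is pixels
    history.append(frame_id)

state = tuple(history)
# Only the most recent four frames survive as the agent's "state".
assert state == (6, 7, 8, 9)
```

This is why long-horizon games like Ms. Pac-Man were hard: anything that happened more than a fraction of a second ago is simply gone from the agent's input.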