Google on Tuesday announced a new tool for app developers that’s absolutely amazing and also a little disturbing: it lets apps, robots and drones “see.”
The tech is called the Google Cloud Vision API, and it allows any app developer to tap into Google’s “machine learning” service that can identify objects, including faces and emotions.
Cloud Vision tackles the hard computing problem of “seeing.” For instance, your computer can scan and reproduce an image of a mountain or an image of a baby, but to the computer they look the same: a bunch of pixels. Your computer can’t sift through your photos and find the baby photos for you unless you’ve tagged them “baby.”
Google Photos, on the other hand, can find the baby photos.
And now Google is making that technology available to programmers to add to their apps.
Cloud Vision can even detect various emotions on a face such as a happy smile or an angry frown, Google says.
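For developers curious what using the service looks like, here’s a minimal sketch in Python. The request shape follows the Cloud Vision API’s documented `images:annotate` endpoint, and face annotations in its responses report emotions as likelihood strings (e.g. `joyLikelihood`); the helper function, variable names and placeholder image bytes are illustrative assumptions, not Google’s own sample code:

```python
import base64

# A real call would POST this JSON to
# https://vision.googleapis.com/v1/images:annotate along with an API key.
def build_annotate_request(image_bytes, max_results=10):
    """Build a request asking for face detection (which includes emotion
    likelihoods) and generic label detection on one image."""
    return {
        "requests": [
            {
                "image": {"content": base64.b64encode(image_bytes).decode("ascii")},
                "features": [
                    {"type": "FACE_DETECTION", "maxResults": max_results},
                    {"type": "LABEL_DETECTION", "maxResults": max_results},
                ],
            }
        ]
    }

# Each face annotation in the response carries fields like
# {"joyLikelihood": "VERY_LIKELY", "angerLikelihood": "VERY_UNLIKELY"}.
# This helper (an assumption, not part of the API) picks the strongest one.
def dominant_emotion(face_annotation):
    order = ["VERY_UNLIKELY", "UNLIKELY", "POSSIBLE", "LIKELY", "VERY_LIKELY"]
    emotions = ["joy", "sorrow", "anger", "surprise"]
    return max(
        emotions,
        key=lambda e: order.index(
            face_annotation.get(e + "Likelihood", "VERY_UNLIKELY")
        ),
    )
```

So a photo of a grinning face would come back with `joyLikelihood` near the top of the scale, which is how a robot (or app) can tell a happy smile from an angry frown.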
To demonstrate the power of Cloud Vision, Google built a cute little robot from a DIY robot kit known as GoPiGo.
At 1:14 in the video below, the robot demonstrates how it can follow faces and detect emotions. At 2:08, it demonstrates how it can detect and identify other objects, like glasses, a banana and money.
As for the terrifying part …
While the demo robot is adorbs, it doesn’t take much imagination to see less cuddly uses for computer vision.
For instance, at least one drone maker has been testing Google Vision out: Aerosense, owned by Sony Mobile Communications.
“We have drones that take thousands of photos per flight. We find the Google Cloud Vision API is the best way to turn those huge numbers of automatically produced photos into meaningful insights,” Aerosense General Manager Tomoaki Kobayakawa says on Google’s blog.
And given Amazon’s recent news that its drone delivery project is progressing, we can’t help thinking of that Audi commercial featuring people-seeking delivery drones run amok.