How Google's self-driving cars see the world

Google’s self-driving cars are already in a few cities.

Google’s self-driving cars are travelling more naturally than they ever have before.

Decked out with GPS, sensors, cameras, radar, and lasers, cars from Alphabet (Google’s new parent company) can gather tons of data about their environment from a 360-degree perspective, allowing them to operate seamlessly in constantly changing surroundings.

According to Google’s Self-Driving Car Project website, sensors on the car can detect objects up to two football fields away, including people, vehicles, construction zones, birds, cyclists, and more.

But the data collected by each vehicle does more than allow it to respond in the moment. All of the data each car collects is used to constantly improve the software, so that all cars can learn from one vehicle’s experience.

Given that Google’s self-driving cars have driven more than 1.2 million miles in autonomous mode since 2009, the software knows how to react in a lot of different situations.

Chris Urmson, the head of technical development for the project, gave a thorough look at how its cars operate in real-life scenarios during a TED Talk presentation in June.

“We can take all of the data that the vehicles see over time, the hundreds and thousands of pedestrians, cyclists, and vehicles that have been out there, and understand what they look like, and use that to infer what other vehicles should look like and other pedestrians should look like,” Urmson said.

Here’s a breakdown of how Google’s self-driving cars see the world around them and how they are using real-time data to respond to a wide range of scenarios.

Google's self-driving vehicles first establish their location by using mapping and sensor data.

Then they use sensor data to understand what they see in the moment. The software processes all of the data and classifies objects based on size, shape, and movement patterns.
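To make that idea concrete, here is a rough, hypothetical sketch of what classifying objects by size, shape, and movement might look like. The thresholds and class names are illustrative assumptions, not Google's actual software:

```python
# Hypothetical object classifier using size, shape, and movement,
# as the article describes. All thresholds are made-up assumptions.

from dataclasses import dataclass

@dataclass
class Detection:
    length_m: float   # bounding-box length from the laser scan
    width_m: float    # bounding-box width
    speed_mps: float  # speed estimated across successive sensor frames

def classify(d: Detection) -> str:
    """Very rough rule-of-thumb classification of a detected object."""
    if d.length_m > 3.0 and d.width_m > 1.5:
        return "vehicle"      # large footprint
    if d.length_m > 1.2 and d.speed_mps > 2.0:
        return "cyclist"      # long, narrow, and moving briskly
    if d.speed_mps < 3.0:
        return "pedestrian"   # small and slow
    return "unknown"

print(classify(Detection(4.5, 1.8, 13.0)))  # car-sized, fast -> vehicle
print(classify(Detection(1.7, 0.6, 5.0)))   # long, narrow    -> cyclist
print(classify(Detection(0.5, 0.5, 1.2)))   # small, slow     -> pedestrian
```

In practice the real system learns these distinctions from data rather than hand-written rules, as Urmson's quote above suggests.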

In this particular image, the car sees other vehicles, which are represented by the purple figures, a cyclist, which is outlined in red, and some orange cones in the top left corner.

But the vehicle doesn't just need to know about its surroundings; it also needs to make predictions about how the vehicles, people, and objects around it will move. This is especially true on city streets, where there are far more pedestrians and significantly more traffic.

So the vehicle takes its collected data and determines how to make moment-to-moment decisions, like choosing the correct trajectory, or whether it should slow down or speed up.
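As a hedged sketch of how prediction and a speed decision could fit together, the snippet below projects each tracked object forward along its current velocity and slows down if any predicted position falls inside the car's planned lane. The geometry, horizon, and lane width are illustrative assumptions only:

```python
# Hypothetical sketch: predict where tracked objects will be, then decide
# whether to slow down. Not Google's actual logic; numbers are invented.

def predict_position(pos, vel, dt):
    """Constant-velocity prediction of an (x, y) position after dt seconds."""
    return (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)

def speed_decision(ego_lane_y, tracked_objects, horizon_s=2.0, lane_half_width=1.5):
    """Return 'slow_down' if any object is predicted to enter our lane ahead."""
    for pos, vel in tracked_objects:
        future_x, future_y = predict_position(pos, vel, horizon_s)
        # Predicted inside our lane and in front of the car?
        if abs(future_y - ego_lane_y) < lane_half_width and future_x > 0:
            return "slow_down"
    return "maintain"

# A cyclist at (10, 4) drifting toward our lane (y = 0) at 1.5 m/s sideways:
print(speed_decision(0.0, [((10.0, 4.0), (3.0, -1.5))]))  # -> slow_down
```

This mirrors the behaviour described below: when a cyclist is predicted to merge into the car's lane, the car yields rather than holding its speed.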

The car is programmed to understand and respond to things like the flashing lights of a police car or when a school bus is stopped in front of it.

It can also determine when a police officer has halted traffic or when the car is being signalled to move forward.

When a cyclist is trying to merge into a lane, the vehicle also knows to slow down and let the cyclist enter.

While Google's cars can anticipate a lot of things by using this collected data, there are still going to be situations that arise that have never happened before.

For example, Urmson said that one time, when one of its test cars was driving through Mountain View, it came across a woman in an electric wheelchair chasing a duck in circles on the road. The vehicle was still able to adjust and slow down to avoid her.

Check out more of what Google's self-driving car sees and Urmson's full presentation below.
