Artificial intelligence and autonomous cars are no longer exclusive to science fiction. Google’s self-driving cars have driven more than five million kilometres, while some Teslas can drive themselves under certain conditions, and Singapore could see a fully autonomous taxi hit the streets by the end of the year.
While all these initiatives are focused on what’s outside the car, on equipping and teaching machines to understand and react to the outside world, it’s also important to understand the people inside.
That’s the mission of Canberra-based Seeing Machines. The solution they created – a camera system that can detect when drivers are fatigued or distracted – has helped save thousands of lives.
The company traces its origins to a group of roboticists at the Australian National University in 1997. According to Adrian Dean, director of marketing at Seeing Machines, the researchers were imagining the “car of the future” and realised that an autonomous car would need to interact with passengers.
“The future car would need to understand and relate to the occupants and the driver, in a similar way that a humanoid robot would need to,” says Dean.
“Understanding the driver was the next step in marrying the two together. Because it is about how the driver is interacting with the robot vehicle, but also so that the autonomous car can understand what the occupants are doing when it’s driving.”
That an autonomous car should know what its occupants are up to is especially important in the handover period between human and autonomous drivers. Many of the recent laws allowing autonomous cars require a person to sit behind the wheel – Tesla’s Autopilot requires the driver to touch the wheel regularly. But for this to be useful, the car needs to know that the human is capable of taking over, that they aren’t asleep, texting or otherwise distracted.
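The handover decision described above can be thought of as a simple gate: control only passes back to the human once the monitoring system is satisfied they are awake, attentive and physically ready. A minimal sketch of that logic, with illustrative inputs and thresholds that are assumptions rather than any manufacturer's actual criteria:

```python
def handover_ready(eyes_on_road_secs, hands_on_wheel, asleep):
    """Toy readiness check before an autonomous system hands control back.

    eyes_on_road_secs: how long the driver's gaze has been on the road ahead.
    hands_on_wheel:    whether wheel-touch sensors report contact.
    asleep:            whether the driver-monitoring camera flags sleep.

    The 2-second continuous-gaze requirement is an illustrative assumption.
    """
    return (not asleep) and hands_on_wheel and eyes_on_road_secs >= 2.0
```

In practice a production system would weigh many more signals, but the shape is the same: the car refuses to hand over until every readiness condition is met.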
To understand the passengers inside a car, Seeing Machines married hardware and software. They created a camera to sit on the dashboard, and developed algorithms to process the information. The cameras can collect a whole series of data points – from the positioning of the eyes and the face, to detecting heart rate through the skin. They can use this to infer a great deal about the state of the driver.
“We can measure eyelid closure very accurately, which lets us measure and intervene in micro sleeps. There’s also the head pose — whether it’s tilted a certain way. You can understand if someone is looking down at their phone,” says Dean.
“What we can also detect is where you are looking – using infra-red we detect the glint off your eye to interpret your region of interest. The technology works even if the driver is wearing glasses.”
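The eyelid-closure measurement Dean describes is commonly quantified in the driver-monitoring literature as PERCLOS: the fraction of recent time the eyes are mostly closed. A minimal sketch of that idea, assuming a camera pipeline already supplies a per-frame eye-openness value; the window size and thresholds are illustrative, not Seeing Machines’ actual parameters:

```python
from collections import deque

class DrowsinessMonitor:
    """Toy PERCLOS-style monitor: tracks the fraction of recent frames in
    which the eyes are mostly closed. All thresholds are illustrative."""

    def __init__(self, window_frames=90, closed_threshold=0.2, perclos_alarm=0.4):
        self.window = deque(maxlen=window_frames)  # rolling window of frame flags
        self.closed_threshold = closed_threshold   # openness below this = "closed"
        self.perclos_alarm = perclos_alarm         # alarm when closed-fraction exceeds this

    def update(self, eye_openness):
        """eye_openness: 0.0 (fully closed) to 1.0 (fully open) for one frame.
        Returns True when the rolling closed-eye fraction suggests a micro-sleep."""
        self.window.append(eye_openness < self.closed_threshold)
        if len(self.window) < self.window.maxlen:
            return False  # not enough history yet to judge
        perclos = sum(self.window) / len(self.window)
        return perclos >= self.perclos_alarm

# A short run with a 5-frame window: the eyes close and stay closed.
monitor = DrowsinessMonitor(window_frames=5, closed_threshold=0.2, perclos_alarm=0.6)
alerts = [monitor.update(o) for o in [0.9, 0.1, 0.05, 0.1, 0.08, 0.07]]
```

The rolling window matters: a single blink should not trigger an alert, but eyes that stay shut across most of the window should.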
In the future, these capabilities will be vital for autonomous cars to understand the state and intention of their passengers. But they also have very real uses right now, and the technology has already been adopted by many in the mining, aviation and transport industries.
Truck drivers and pilots often work long shifts, operating complex and potentially deadly equipment. Pilots can spend hours on end monitoring instrument clusters, trying to work out what is going on. These are conditions ripe for distraction and micro-sleeps, both major causes of accidents.
Seeing Machines has sold more than 4,000 units worldwide to mining companies. The devices monitor driver fatigue and distraction, send out alerts in the case of a micro-sleep, and record the data so companies can react to new information.
The company is doing similar things for truck drivers, partnering with fleet owners like Toll and Linfox to monitor and alert drivers when they are in dangerous situations. There’s also extra information they can add into the mix, like GPS, to infer even more about what drivers are experiencing. Altogether, these technologies allow companies to know more about what’s going on at the front line, and to react and reduce risk where they can.
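Fusing a fatigue or distraction event with context such as GPS speed lets a fleet system grade how serious each event is. A hypothetical sketch of that kind of fusion; the speed bands and response labels are assumptions for illustration, not Seeing Machines’ actual logic:

```python
def grade_event(fatigue_event, speed_kmh):
    """Toy risk grading: the same fatigue event is treated as more severe
    at highway speed than while stationary. Bands are illustrative only."""
    if not fatigue_event:
        return "none"
    if speed_kmh >= 80:
        return "critical"   # e.g. wake the driver and notify the fleet operator
    if speed_kmh > 0:
        return "warning"    # alert the driver in-cab
    return "log_only"       # vehicle stationary; record for later review
```

Layering in more context (time of day, hours driven, road type) follows the same pattern: each extra signal refines what the system knows about the driver's situation.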
“We’ve saved the lives of thousands of drivers, whether they are on the mine sites or the public roads,” says Dean.
“It’s quite a difficult technology to sell to a truck driver, but then they get home to their kids because we woke them up before they had an accident. At its core, that’s probably the biggest impact that we’ve had.”