- AImotive is developing self-driving car technology that relies on inexpensive cameras, rather than a pricey Lidar system.
- The company recently began testing a car equipped with its technology in Silicon Valley.
- I got a test ride, and it was an unnerving experience.
MOUNTAIN VIEW, Calif. – Covering the development of self-driving cars has offered some unique experiences.
I’ve ridden in a car with no one in the driver’s seat. I’ve ridden in one that had the driver’s seat turned around so that it faced the backseat. And I’ve ridden in one that piloted itself around the narrow, hilly streets of San Francisco.
But until last week, I’d never ridden in a self-driving car that sensed the world around it entirely with cameras. And I’ve got to say it wasn’t the most reassuring experience.
The car – a late-model Toyota Prius – was a test vehicle equipped with technology from AImotive, a Hungarian startup.
AImotive is developing autonomous vehicle software that relies on cameras rather than laser-based Lidar arrays to detect vehicles, pedestrians, and other obstacles. Cameras cost a small fraction of Lidar systems, and the company is betting that by using them instead, its partners will be able to deploy self-driving cars faster and at a lower price.
Even at this early stage in the era of self-driving cars, however, the sight of a vehicle without Lidar is almost as jarring as that of a car without a steering wheel.
Our ride began like many test drives in autonomous cars, with a human driver behind the wheel and another human in the passenger’s seat monitoring the car’s sensors and autonomous driving systems. But this ride was a bit different in that the human driver actually piloted the vehicle for a while.
AImotive is only testing its cars on highways right now
At the moment, AImotive has only developed its technology enough to road test it on highways; it won’t begin to test it on city streets (a more challenging environment for autonomous cars) until early next year. So AImotive’s driver had to steer us from the company’s house-like office at the end of a dead-end street in an industrial area here to the entrance to Highway 101, Silicon Valley’s north-south artery. AImotive’s system didn’t kick in until we were already on the freeway, heading south.
Unlike other autonomous vehicles I’ve ridden in, this one lets you know when the computer is in control – it has a box on its dash that illuminates the word “self-drive.” The AImotive engineers turned on the system, and we were under robot control.
The first thing I noticed was that the car kept drifting to the right side of the lane we were in. It never crossed the line, but it repeatedly got uncomfortably close. It was particularly disquieting near the beginning of my ride, when a semi-truck with a trailer was immediately to the right of us. A part of me was wishing AImotive’s human driver would take control and steer us back to the center of our lane and away from the truck.
Laszlo Kishonti, AImotive’s CEO, said it wasn’t clear why our car was drifting to the right of its lane. It’s possible, he and his colleagues suggested, that it had to do with the road being banked, but they also said it was likely just “a maths problem.”
“We are trying to find that reason,” Kishonti said. He continued: “I think we’ll solve this in the next two weeks.”
The startup has a lot of work ahead
But it wasn’t just the drifting that was unnerving. As we headed south, another car cut right in front of us. When it did, the human driver immediately took control and applied the brakes. The system could have handled the situation, but likely would have slowed down much more abruptly than the human driver did, because it hasn’t yet been tuned for such scenarios, Kishonti said.
The test drive took place around 1 p.m. on a bright and clear day. Bence Varga, AImotive’s head of European sales, who rode in the front-passenger seat, had a computer monitor in front of him that showed what the car’s cameras and systems could see. The display showed the view from the various cameras, labeled the lane lines in front of us, and identified the cars around us.
Lane drifting and sudden stopping aside, the big question for AImotive is how well its system will do at night, in the fog, or in other low-visibility conditions. Unlike Lidar or radar systems, regular cameras have similar limitations to human eyes in such situations – they just can’t see very far. Kishonti said AImotive’s system will respond to them the way human drivers do – by slowing down.
It remains to be seen if that will be good enough for consumers, AImotive’s customers, or regulators. Regardless, my ride seemed to indicate that even on the basics of highway driving, the company still has a lot of work ahead of it.