When Henry Ford developed the assembly line technique that would lead to mass vehicle production, he could never have anticipated the modern problems cars have caused: congested streets, air pollution, and angry commuters sitting in traffic.
It was an unintended consequence of innovation, said Alysha Naples, former senior director of user interaction and experience at Magic Leap, the secretive billion-dollar Google-backed startup building technology that “augments” human vision with digital imagery.
Over the weekend it was revealed that Magic Leap is scrambling to finish a working prototype of its portable augmented reality device to present to its board this week.
Business Insider reported on the prototype, described to it as “PEQ0”.
Speaking at Pause Fest 2017 in Melbourne last week, Naples had a stern warning for technology creators: Slow down, or risk being defined by your mistakes.
“Technology and innovation is helping us scale so much faster,” she said.
“But looking at the history of technology, and what we are doing in innovation, mistakes have been made.
“(For example), we form lines of angry people sitting in cars, wishing they can be somewhere else,” she said.
“This is not what Henry Ford and his employees worked to enable. This is an unintended consequence of a very well-intentioned technological invention.”
She reflected on the failures of Microsoft’s AI chatbot “Tay”, which turned into a genocidal racist, and Google Photos’ facial recognition technology, which allegedly tagged two black people’s faces with the word “gorillas”.
While it was “tremendous work” and a great idea, Naples said innovation sometimes means an idea can go wrong, “and when it does, it’s really not OK”.
The same issue can be seen in Facebook’s fake news problem.
During the US presidential election, teenagers in Macedonia created fake, pro-Trump news stories designed to go viral on Facebook — and they succeeded.
The issue for Naples is that in all these circumstances the human element, the emotion, was taken out of the process.
“In the hopes of creating an algorithm to remove the bias that people bring to curating and reporting, (Facebook removed) the editorial staff from the trending feature,” she said.
“Because we all have bias… they thought it would be more fair to let an algorithm (do the work).”
The problem was that the algorithm could not tell truth from lies; people found a hole in the technology, manipulated the system, and disaster struck.
While Facebook had the best intentions, she said “by not thinking about how it can be exploited or twisted, they left themselves vulnerable”.
This is an example of how “we are defined by our mistakes more than by our successes,” she said.
“It’s now almost impossible to get the cat back into the bag.”
Naples has spent nearly two decades working on the future. While admitting she is an optimist, she also has serious concerns when she sees designers not thinking things through.
“Being good is not good enough,” she said.
“I use these cases to show you that even with the best of intentions you can create a product which is harmful.
“Technology is not about breakthroughs. Innovation is about the unintended long-term consequences of the decisions that we are making.
“We have to approach this from a holistic viewpoint, because if we don’t we’ll end up in a world of hurt.
“We need to slow down. Make the effort to anticipate the issues that your technological invention may be creating.”
The key, Naples said, was to make yourself the villain in your own story.
“You have to say, ‘OK, I’m the bad guy, how can I twist this to my will?’”
But if things go badly, she said, you must own your mistakes.
“Admit it and fix it.”
“Right now we have opportunity to make these changes and make them better.
“If we take the time to think about connection and belonging… we will be on the way to a better future”.
*The author travelled to Melbourne as a guest of Pause Fest 2017.