Siri represents a mainstream success in getting people to communicate with their electronics, but it’s hardly the first time someone talked to a computer. Siri has plenty of ancestors dating all the way back to the 1960s, when the term “artificial intelligence” was only a decade old.
Did you know that the first program that understood typed English was created as a means to provide therapy to the user?
How intelligent can software become? Can it ever be “alive”?
The ability to talk to machines raises all of these questions and more. We caught an excellent episode of Radiolab on WNYC that tackles all of them.
Language is totally nuts. Consider things like grammar, syntax, tone, and sarcasm. In order for a computer to communicate seamlessly with a user, it should understand all these aspects of language.
And here's the scary thing: the best programmers can make this happen.
It's pretty amazing that a computer can turn something as complex as human speech into ones and zeroes.
Even the most basic human actions are made up of countless subroutines. If we want to put on a hat, for example, we just put it on. But for a computer to understand 'put on a hat,' it has to know what a hat is. Then it has to 'locate the hat,' 'pick up the hat,' and 'place hat on head.'
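The decomposition above can be sketched in a few lines of code. This is a toy illustration, not anyone's actual system: the lookup table and step names are invented to show how one high-level command might expand into primitives.

```python
def decompose(action):
    # Toy table mapping a high-level action to its primitive steps
    # (the table and step names are invented for illustration).
    primitives = {
        "put on a hat": [
            "know what a hat is",
            "locate the hat",
            "pick up the hat",
            "place hat on head",
        ],
    }
    # Unknown actions are treated as already-atomic steps.
    return primitives.get(action, [action])

for step in decompose("put on a hat"):
    print(step)
```

A human performs all of those primitives without thinking; a machine has to be told about each one explicitly.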
Well-programmed chatbots can understand loads of these primitive elements of language.
The idea of personally conversing with a computer took off in 1966.
MIT professor Joseph Weizenbaum became aware of 'non-directive Rogerian therapy.' It was a system in which a therapist would identify key words that a patient used and repeat them back.
For example, a patient might say, 'I'm feeling depressed today.' The therapist would say, 'I'm sorry to hear you're feeling depressed.'
Weizenbaum thought that behaviour would be easy enough to program, so he did. He called it ELIZA, and it changed everything.
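That reflect-the-keyword behaviour really is easy to program. Here is a minimal ELIZA-style sketch, assuming a couple of simple keyword-and-template rules; the real ELIZA used a much richer script of ranked patterns.

```python
import re

# Each rule pairs a keyword pattern with a response template.
# These two rules are invented for illustration.
RULES = [
    (re.compile(r"i'?m feeling (\w+)", re.I),
     "I'm sorry to hear you're feeling {0}."),
    (re.compile(r"i (?:want|need) (\w+)", re.I),
     "Why do you want {0}?"),
]

def respond(utterance):
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            # Reflect the captured key word back at the user.
            return template.format(*match.groups())
    return "Please tell me more."  # fallback when no keyword matches

print(respond("I'm feeling depressed today."))
```

Run it and the program plays therapist: tell it you're feeling depressed, and it reflects the word right back.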
Weizenbaum left ELIZA running on a computer for anyone to use. He found that people wanted to be left alone while using it, as if they were having a private conversation.
In one incident, his assistant poured out her heart to ELIZA, revealing all kinds of personal details to the program. Weizenbaum became distraught -- he was so disturbed by what was happening that, in an odd 180-degree turn, he spent the rest of his career crusading against artificial intelligence.
Mathematician Alan Turing raised two important questions: Can computers think? And how will we know when they can?
He proposed his famous Turing test as a means to figure this out. A person would have text-based conversations via computer -- half of them would be with a real person, the other half with software. At the end of the test, the person in the hot seat would have to guess which were people and which were software.
Turing's magic number was 30%: when software could pass as human in 3 out of 10 conversations, it could be said to be intelligent. (You might think that 51% is a more suitable number, but that would have a scary connotation -- at 51%, the software would be more convincingly human than a human.)
Rollo Carpenter created a piece of chat software called Cleverbot. He wanted a way to make the machine learn, letting it grow a little bit every time you talked to it.
It learns every phrase you type to it, then attempts to pick the best response based on its saved catalogue of words and phrases.
It grew very slowly at first, but Carpenter put it online for public use and it exploded, learning 5 million phrases in 10 years. He says he still gets several emails a day insinuating that there must be real people operating it, not software.
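The learn-from-every-phrase idea can be sketched with a toy scheme: remember every prompt-and-reply pair the bot sees, then answer a new prompt with the stored reply whose prompt shares the most words with it. This is an invented simplification -- Cleverbot's actual matching is far more sophisticated.

```python
class LearningBot:
    def __init__(self):
        self.memory = []  # list of (prompt, reply) pairs seen so far

    def learn(self, prompt, reply):
        # Every exchange grows the bot's catalogue of phrases.
        self.memory.append((prompt.lower(), reply))

    def respond(self, prompt):
        words = set(prompt.lower().split())
        # Pick the remembered prompt with the most words in common.
        best = max(
            self.memory,
            key=lambda pair: len(words & set(pair[0].split())),
            default=None,
        )
        return best[1] if best else "Hello!"  # nothing learned yet

bot = LearningBot()
bot.learn("how are you", "I'm fine, thanks")
bot.learn("what is your name", "I'm Cleverbot")
print(bot.respond("how are you today"))
```

Even this crude word-overlap trick produces plausible replies once the memory is large, which hints at why a version trained on millions of real phrases can feel eerily human.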
You can talk to Cleverbot for yourself on its website.
Can software ever be truly alive? Caleb Chung, designer of the famous Furby toys, says it can. He suggests there are three elements to creating something that appears alive to a human:
- The ability to feel and show emotions
- The ability to be aware of itself and its environment
- The ability to change over time
Under Chung's outline, Siri is almost alive. It can change over time as your schedule and reminders change. With its geofence-based reminders, it's certainly aware of its environment. But we have yet to see Siri display emotion.