Apple's iPhone 4S features Siri, a personal assistant app that brings natural language processing to the smartphone. It seems like a winning combination, but similar winning combinations have been complete flops before. Is the time now right for the technology?
Who wouldn't like to be able to ask questions such as "What's today's weather going to be like in Seattle?" or to ask for help with tasks such as making a restaurant reservation? Well, Apple's iPhone 4S is going to have this help built in, courtesy of Siri, its personal assistant app.
See it in action in this demo:
Siri's voice recognition component is powered by software from Nuance, the company that also produces the Dragon NaturallySpeaking voice recognition software for Windows PCs.
Apple bought Siri in April 2010, and some users have already been using it on the iPad. As of October 14th, when it launches in beta with the debut of the iPhone 4S, it will reach a much larger audience. Initially it will support English, French and German, with more languages and more features being added over time, according to Apple. Apple has no plans to bring it to any prior models, nor to any other iOS device.
According to New Scientist, Siri's roots can be traced back to 2003, when DARPA funded a research program called "Cognitive Assistant that Learns and Organizes", or CALO. The intention was to create an automated assistant that could learn from the user and handle a variety of tasks. The project was led by SRI International, a California-based research institute. When it came to an end, a number of people who had worked on CALO founded Siri, the company that was snapped up by Apple.
In 1987, then Apple CEO John Sculley described a device known as the “Knowledge Navigator” in his book Odyssey. Described in its simplest terms, it was a personal assistant that allowed the user to navigate information in an interactive way. The user would be able to speak in natural language, and the artificial intelligence would reason out the intention of the user. Watch it in action in this vintage Apple-produced video:
Of course, the basic idea goes back a lot further than this, perhaps to Vannevar Bush and the memex. Apple certainly didn't think up the idea of augmenting human intelligence with a little bit of AI and a natural input mode. The trick is not to think up such a device - that is just applied science fiction; the trick is to make it work. An earlier step too far worth remembering was the Apple Newton, which featured handwriting recognition before its time. The technology wasn't that bad, but it was bad enough to matter. After a few hilarious misrecognitions, most users decided that the keyboard was more reliable.
Q: How many Apple Newton users does it take to screw in a lightbulb? A: Only one, tharks to the extnq-producilve handwritling processcr.
The project was cancelled, but the whole tablet format eventually led to the iPhone and iPad.
In this case there are two really big questions.
The first is: do we want to talk to our iPhones and iPads, and if we do, will the results be acceptable or laughable?
All it will take is a few really good, but true, jokes about asking for a light snack only to be directed to light bulb manufacturers. Even better, all we need is one truly horrible error on Siri's part and the technology will be derided, even if there is some good in it. In the real world it is not the raw error rates that matter but the stupidity of the errors that are made.
There is a long history of technologies that work well in the lab or the development environment but do not fare well once they get out into the real world. Real users will expect things to work like magic and, if the technology falls short, will soon abandon it. AI is moving fast and developments are exciting and very real, but there is still room to fail.
Siri on the iPhone 4S launches in beta next week. Let's wait and see what happens.