Artificial intelligence promised us great technology. But has it delivered?
Stanford University computer science professor John McCarthy coined the term in 1956 to mean "the science and engineering of making intelligent machines." In the early years of the artificial intelligence movement, enthusiasm ran high and artificial intelligence pioneers made some bold predictions.
In 1965, artificial intelligence innovator Herbert Simon said that "machines will be capable, within 20 years, of doing any work a man can do."
Two years later, MIT researcher Marvin Minsky predicted, "Within a generation ... the problem of creating 'artificial intelligence' will substantially be solved."
Popular culture jumped onto the artificial intelligence bandwagon and gave us Rosie the Robot from The Jetsons, HAL from the movie 2001: A Space Odyssey and R2-D2 from Star Wars.
Yet here we are, decades later, and what has artificial intelligence done for us lately? If you define artificial intelligence as self-aware, self-learning, mobile systems, then it has been a huge disappointment.
On the other hand, every time you search the Web, get a movie recommendation from Netflix, or speak to a telephone voice-recognition system, tools developed in pursuit of the great promise of intelligent machines do the work. In other words, we may not have fully functioning robots that cater to our every need, but artificial intelligence is embedded in our everyday lives.
"Once tools get far enough out of the lab, they're no longer AI, just common computer science," says Professor George Luger of the University of New Mexico. "AI just went to work."
One of the biggest boosts to artificial intelligence is Moore's Law, because artificial intelligence needs CPU power. "It took 20 years to go from a 5MHz chip to a 500MHz chip, but only eight months after that to get to a 1GHz chip," says futurist Daniel Burrus, author of the best seller Technotrends: How to Use Technology to Go Beyond Your Competition and founder of Burrus Research.
"The new Sony PlayStation came out a year ago," says Burrus, "but if it had come out five years earlier it would have been considered a supercomputer." Burrus likens the growth of processing power on a graph to a hockey stick. "In the '90s, the graph was still low. In 2000, the graph started up a little. In 2008, we're getting on the handle of the hockey stick."
Burrus listed multiple uses of artificial intelligence and expert systems that work behind the scenes. "The first application of successful AI was in the financial services industry for loan qualifications. Loan qualification went from one to two weeks down to minutes." Other examples include systems that help Navy pilots land jets on aircraft carriers.
His personal favorite is the use of an expert system to manage room service orders at Marriott hotels. "AI tells them when to start cooking and when to deliver. Marriott tells me exactly when breakfast will be delivered while others give me a 15-minute window. That's a competitive advantage for Marriott."
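The core idea behind such a scheduling system, working backward from a promised delivery time to a start-cooking time, can be sketched in a few lines. This is a minimal illustration, not Marriott's actual system: the prep times, the delivery buffer, and the function name are all hypothetical.

```python
from datetime import datetime, timedelta

# Hypothetical prep times per item, in minutes. The real system's rules
# are not public; these values are illustrative only.
PREP_MINUTES = {"omelette": 12, "pancakes": 9, "coffee": 3}
DELIVERY_MINUTES = 7  # assumed travel time from kitchen to guest room


def schedule_order(items, promised_delivery):
    """Work backward from the promised delivery time to a start-cooking time."""
    # Assume items cook in parallel, so the longest prep time dominates.
    prep = max(PREP_MINUTES[item] for item in items)
    return promised_delivery - timedelta(minutes=prep + DELIVERY_MINUTES)


# A guest asks for breakfast at 8:00 a.m.
delivery = datetime(2008, 6, 1, 8, 0)
print(schedule_order(["omelette", "coffee"], delivery))  # 2008-06-01 07:41:00
```

Because the system commits to an exact delivery time rather than a window, the kitchen's start time falls out deterministically from the rules, which is what lets the hotel make the precise promise Burrus describes.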
While energy prices soar, Burrus noted, the cost of intelligence keeps falling. "Maybe we can offset the energy trend as we make appliances more intelligent."