When IBM’s Watson computer triumphed over human champions in the quiz show “Jeopardy!,” it was a stunning achievement that suggested limitless horizons for artificial intelligence.
Soon after, IBM’s leaders moved to convert Watson from a celebrated science project into a moneymaking business, starting with health care.
Yet the years that followed its game show win proved humbling for Watson. Today, IBM executives candidly admit that medicine proved far more difficult than they had anticipated. Costs and frustration mounted on Watson’s early projects, which were scaled back, refocused and occasionally shelved.
IBM’s early struggles with Watson point to the sobering fact that commercializing new technology, however promising, typically comes in short steps rather than giant leaps.
Despite IBM’s own challenges, Watson’s TV victory — five years ago this month — has helped fuel interest in A.I. from the public and the rest of the tech industry. Venture capital investors have poured money into A.I. start-ups, and large corporations like Google, Facebook, Microsoft and Apple have been buying fledgling A.I. companies. That investment reached $8.5 billion last year, more than three and a half times the level in 2010, according to Quid, a data analysis firm.
And software engineers with A.I. skills are treated like star athletes, prompting bidding wars for their services.
“We’re definitely at a peak of excitement now,” said Jerry Kaplan, a computer scientist, entrepreneur and author, who was a co-founder of a long-forgotten A.I. start-up in the 1980s. “Expectations are way ahead of reality.”
A.I. has long been a staple of science fiction, whether as machines that think for themselves and help humankind or as ungrateful creations that try to wipe us out. Or so the thinking at the movies goes.
The reality, however, is a little less dramatic. The automated voice on your smartphone that tries to answer your questions? That’s a type of A.I. So are features of Google’s search engine. The technology is also being applied to complex business problems like finding trends in cancer research.
The field of artificial intelligence goes back to the beginning of the computer age, and it has rolled through cycles of optimism and disillusion ever since, encouraged by a few movie robots and one very successful game show contestant.
The history of tech tells A.I. backers to hang in there. Silicon Valley veterans argue that people routinely overestimate what can be done with new technology in three years, yet underestimate what can be done in 10 years.
Predictions made in the ’90s that the new World Wide Web would shake the foundations of the media, advertising and retailing industries did prove true, for example. But the upheaval came a decade later, years after the dot-com bust.
Today’s A.I., even optimists say, is early in that cycle.
“I think future generations are going to look back on the A.I. revolution and compare its impact to the steam engine or electricity,” said Erik Brynjolfsson, director of the Initiative on the Digital Economy at the Massachusetts Institute of Technology’s Sloan School of Management. “But, of course, it is going to take decades for this technology to really come to fruition.”
There are reasons for enthusiasm. Computers continue to get cheaper even as they get more powerful, making it easier than ever to crunch vast amounts of data in an instant. Also, sensors, smartphones and other tech devices are all over the place, feeding more and more information into computers that are learning more and more about us.
Just in the last year or two, researchers have made rapid gains using a machine-learning technique called deep learning to improve the performance of software that recognizes images, translates languages and understands speech. That work has been done at start-ups and at big companies like Google, Facebook and Microsoft, as well as at universities and private research centers like the Allen Institute for Artificial Intelligence.
“There’s been surprising progress in the problems of perception, seeing, hearing and language,” said Peter Lee, corporate vice president for Microsoft Research.
At Enlitic, a start-up in San Francisco, Jeremy Howard, the founder and chief executive, believes that A.I. can transform the huge industry of health care, saving lives and money — an ambition similar to IBM’s. “But that’s a 25-year project,” he said.
Enlitic is concentrating first on radiology. Medical images are nearly all in digital form, Mr. Howard notes, and the tireless scanning for telltale signs of abnormal tissue is a task for which deep-learning image recognition technology is well suited.
Enlitic has tested its software against a database of 6,000 lung cancer diagnoses, both positive and negative, made by professional radiologists. In research soon to be published, its algorithm was 50 percent more accurate than human radiologists, Enlitic said.
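That kind of screening is, at bottom, an image classification problem: a model looks at a scan and outputs a probability that it shows abnormal tissue. As a rough illustration only, and not a description of Enlitic’s actual system, a minimal deep-learning classifier of this sort might be sketched in Python with PyTorch as follows (the tiny architecture, the 128-by-128 grayscale input and the random stand-in data are all placeholder assumptions):

    # Illustrative sketch of a binary "abnormal tissue" image classifier.
    # This is not Enlitic's system; architecture and input size are assumptions.
    import torch
    import torch.nn as nn

    class TinyScanClassifier(nn.Module):
        def __init__(self):
            super().__init__()
            # Two small convolution/pooling stages extract visual features.
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            # A single linear layer maps the pooled features to one score.
            self.head = nn.Linear(32 * 32 * 32, 1)  # assumes 128x128 grayscale input

        def forward(self, x):
            x = self.features(x)
            # Sigmoid turns the score into a probability of "abnormal."
            return torch.sigmoid(self.head(x.flatten(1)))

    model = TinyScanClassifier()
    scans = torch.randn(4, 1, 128, 128)  # stand-in for four grayscale scans
    print(model(scans))                  # four probabilities between 0 and 1

In practice such a model would be trained on thousands of labeled scans and then judged, as Enlitic describes, by how its error rate compares with that of human radiologists.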
“You have to take technology that works and apply it to a known problem,” Mr. Howard said. “Innovation alone is a mistake.”
No company has made as big or as broad a bid to commercialize its A.I. technology as IBM has with Watson. It set up Watson as its own business in 2014, and invested billions to accelerate the development and adoption of the technology, including buying several companies. The Watson unit now has 7,000 employees.
The Watson technology has been totally revamped. In its “Jeopardy!” days, Watson was a room-size computer. Today, it is so-called cloud software, delivered over the Internet from remote data centers. The Watson software itself has been carved up into dozens of separate A.I. components, including a language classifier, text-to-speech conversion and image recognition.
IBM is trying to position Watson as the equivalent of an A.I. operating system, a software platform others use to build applications. Nearly 80,000 developers have downloaded and tried out the software. IBM now has more than 500 industry partners, from big companies to start-ups, in industries including health care, financial services, retailing, consumer products and legal services.
At IBM, Watson’s early struggles in health care are viewed as a learning experience. The IBM teams, the executives say, underestimated the difficulty of grappling with messy data like faxes and handwritten notes and failed to understand how physicians make decisions.
“There were a lot of challenges with the early customers,” said John Kelly, the senior vice president who oversees Watson, adding that the business was “taking off” now.
IBM does not break out financial results for Watson. It describes the business as “large and growing,” contributing to the company’s $18 billion a year in revenue from business analytics.
At the University of Texas MD Anderson Cancer Center in Houston, Watson technology is one ingredient in an automated expert adviser for cancer care. The University of Texas health system is also using Watson in software to help diabetes patients and caregivers manage the disease, in a project that is expected to be introduced by the end of this year.
“It was a lot harder than we thought,” said Dr. Lynda Chin, the chief innovation officer for the university health system. “But our experience has convinced me that we can build an A.I. engine that improves care.”