There’s a data expert making a name for himself in the corporate world today, and he’s attracting a lot of attention. He’s a lightning-fast learner, he speaks eight languages and he’s considered an expert in multiple fields. He’s got an exemplary work ethic, is a speed reader and finds insights no one else can. On a personal note, he’s a mean chef and even offers good dating advice.
The name of this new paragon? Watson. IBM Watson.
Named after IBM’s first CEO, Thomas J. Watson, Watson was born back in 2007 as part of an effort by IBM Research to develop a question-answering system that could compete on the American quiz show “Jeopardy!” Since trouncing its human opponents on the show in 2011, it has expanded considerably. What started as a system focused on a single core capability — answering questions posed by humans in natural language — now includes dozens of services spanning language, speech, vision and data analysis.
Watson uses some 50 technologies today, tapping artificial-intelligence techniques such as machine learning, deep learning, neural networks, natural language processing, computer vision, speech recognition and sentiment analysis. But IBM considers Watson more than just AI, preferring the term “cognitive” instead. Whereas existing computers must be programmed, Watson understands the world in the way that humans do: through senses, learning and experience, IBM says.
“When we say ‘cognitive,’ we mean that it can learn, understand, reason and interact,” said Steve Abrams, director of the IBM Watson Platform. “Watson can do each of those things with people, data or other systems.”
With the ability to read more than 800 million pages per second, Watson can analyze vast volumes of data — including the unstructured kind — by understanding natural language, generating hypotheses based on evidence, and learning as it goes.
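The question-answering loop described above (generate candidate hypotheses, score them against evidence, keep the most confident) can be sketched in miniature. The toy Python example below is purely illustrative; the evidence passages and the scoring heuristic are invented for demonstration and bear no relation to IBM’s actual DeepQA implementation:

```python
# Toy illustration of hypothesis generation and evidence scoring:
# propose candidate answers, score each against evidence passages,
# and return the highest-confidence hypothesis.
EVIDENCE = [
    "Thomas J. Watson served as chairman and CEO of IBM.",
    "IBM Research built the Watson question-answering system.",
    "The Jeopardy! match aired in 2011.",
]

def score(candidate, question, evidence):
    """Crude confidence: how often candidate and question terms appear in evidence."""
    terms = set(question.lower().split()) | {candidate.lower()}
    return sum(
        sum(t in passage.lower() for t in terms) for passage in evidence
    )

def answer(question, candidates, evidence=EVIDENCE):
    """Return the candidate hypothesis with the best evidence score."""
    return max(candidates, key=lambda c: score(c, question, evidence))

print(answer("who built the watson system", ["IBM Research", "NASA"]))
```

A real system would use far more sophisticated scorers (syntactic, semantic, temporal) and combine their outputs into a calibrated confidence, but the generate-score-select shape is the same.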
It’s tempting to imagine Watson as some giant “brain” churning away behind a curtain in the core of IBM’s research facilities, but the reality is very different.
“It’s an oversimplification to call Watson a cognitive computer,” said Roger Kay, principal analyst at Endpoint Technologies. “What it does is marshal domain-specific resources and make that information available to humans through a natural-language interface.”
The “cognitive” part is that Watson can “flash through its knowledge base for potential answers to users’ questions by employing AI and machine-learning algorithms,” Kay added. “What IBM has done is create a huge engine for this sort of analysis and put together a fairly simple means to program it as well as a straightforward human interface for end users.”
Today, Watson can be viewed as a cloud utility, he said: “a powerful capability run by IBM that can be accessed via the web.”
In 2014 IBM established a dedicated Watson Group with a global headquarters in New York City to propel and commercialize the technology. A Boston-based health unit and an IoT headquarters in Germany followed the next year. Today, Watson is available to partners and developers via the cloud and some 30 application programming interfaces (APIs). Hundreds of IBM clients and partners across 36 countries and more than 29 industries now have active projects underway with Watson, IBM says.
The Watson developer community comprises more than 550 developers across 17 industries and disciplines, and more than 100 of them have already introduced commercial “cognitive” apps, products and services. More than a million developers globally are using the Watson Developer Cloud on IBM’s Bluemix platform, meanwhile, to pilot, test and deploy new business ideas. IBM has allocated $100 million in venture investments to support this community.
OmniEarth is an environmental analytics company that recently partnered with IBM to leverage Watson’s visual-recognition services to decipher and classify physical features in aerial and satellite images, and it’s using those analyses to help tackle California’s ongoing drought.
“Fundamentally, we’re looking for what we can learn about outdoor water use to anticipate how much water a particular parcel of land might need,” said OmniEarth Lead Data Scientist Shay Strong.
It can take inordinate amounts of time and expertise to manually examine aerial photographs and satellite images to identify swimming pools and other pertinent landscape features on a particular lot, Strong said.
Now, OmniEarth uses a variety of machine-learning algorithms to do it — some home-grown, and some that are part of Watson. Vast amounts of data are involved — close to a terabyte for Los Angeles alone, Strong said — but machine learning speeds up the process enormously. OmniEarth can now process aerial images 40 times faster than it could before, for example, tackling 150,000 images in just 12 minutes rather than several hours.
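The classification step Strong describes can be imagined as a nearest-centroid labeler over per-tile image features. The sketch below is a deliberately simplified stand-in; the feature values, class labels and centroid method are all invented for illustration, and OmniEarth’s real pipeline uses far richer imagery features and models:

```python
# Illustrative sketch: label aerial-image tiles by comparing simple color
# features against per-class centroids learned from hand-labeled examples.
from math import dist

# Hypothetical training features: (mean_blue, mean_green) for labeled tiles.
TRAINING = {
    "pool":  [(0.85, 0.55), (0.90, 0.60), (0.80, 0.50)],
    "lawn":  [(0.30, 0.80), (0.25, 0.75), (0.35, 0.85)],
    "paved": [(0.50, 0.50), (0.55, 0.45), (0.45, 0.55)],
}

def centroid(points):
    """Mean of a list of feature vectors."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

CENTROIDS = {label: centroid(pts) for label, pts in TRAINING.items()}

def classify_tile(features):
    """Return the label whose centroid is nearest to the tile's features."""
    return min(CENTROIDS, key=lambda label: dist(features, CENTROIDS[label]))

print(classify_tile((0.88, 0.57)))  # a blue-heavy tile classifies as "pool"
```

Because each tile is classified independently, the work parallelizes trivially, which is one reason a machine-learning pipeline can churn through 150,000 images in minutes rather than hours.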
“It buys us incredible efficiency,” Strong said.
It also enables better planning and budgeting. Whereas previously water districts like the City of Folsom and the East Bay Municipal Utility District often used statewide averages to gauge their upcoming needs, OmniEarth’s AI-based analyses allow them to create much more accurate forecasts. Watson is also helping regional utilities and conservation groups such as the Inland Empire Utilities Agency and the Santa Ana Watershed Project Authority fine-tune their outreach programs to better educate families about modifying their water usage.
Macy’s is another recent Watson user. This past summer the retail giant launched a Watson-based web service designed to help customers navigate its stores while they shop. Delivered through location-based engagement software from IBM partner Satisfi, Macy’s On-Call allows customers to input questions in natural language about each participating store’s unique product assortment, services and facilities, and then receive a customized response. Macy’s is currently piloting the new tool in 10 store locations across the United States.
“We have a long list of other potential uses,” said Serena Potter, the retailer’s group vice president of digital media. “Our ultimate goal is to implement additional cognitive services in the future.”
Other uses of Watson so far include “Olli,” a self-driving vehicle; numerous efforts in healthcare; and Deep Thunder, a hyperlocal weather-forecasting tool for businesses that stems from IBM’s recent acquisition of The Weather Company.
In making all this happen, one of IBM’s goals is to shield partners and developers as much as possible from the hairy mathematics that underlie AI. “We want to make this very consumable for developers, and most are not machine-learning experts,” Abrams said.
Watson’s natural language classifier, for instance, is designed to understand the intent of human statements, and it can be trained for specific purposes. One example might be answering questions at a hotel’s front desk, such as via the Watson-based “Connie” robot Hilton Hotels began pilot testing earlier this year. “Watson can understand ‘what time are you going to kick me out of here?’ as ‘what time is checkout time?'” Abrams explained.
Rather than offer the capability in a general-purpose fashion, though, IBM asks developers to upload a spreadsheet listing the human utterances they want Watson to understand. IBM then works behind the scenes to train the model accordingly.
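That spreadsheet-driven workflow (rows of example utterances mapped to intents, from which a classifier is trained) can be mimicked with a toy bag-of-words model. The sketch below is a hypothetical illustration in plain Python, not Watson’s natural language classifier; the training rows and the scoring heuristic are invented:

```python
# Illustrative sketch: train a toy intent classifier from (utterance, intent)
# rows, the kind of data a developer might upload in a spreadsheet.
from collections import Counter, defaultdict

# Hypothetical front-desk training rows.
TRAINING_ROWS = [
    ("what time is checkout", "checkout_time"),
    ("when do i need to leave my room", "checkout_time"),
    ("what time are you going to kick me out of here", "checkout_time"),
    ("when does breakfast start", "breakfast_hours"),
    ("what time is breakfast served", "breakfast_hours"),
    ("is there a pool in the hotel", "amenities"),
    ("do you have a gym", "amenities"),
]

def tokenize(text):
    return text.lower().split()

# Count how often each word appears in each intent's training utterances.
word_counts = defaultdict(Counter)
for utterance, intent in TRAINING_ROWS:
    word_counts[intent].update(tokenize(utterance))

def classify(utterance):
    """Pick the intent whose training vocabulary best overlaps the utterance."""
    words = tokenize(utterance)
    def overlap(intent):
        counts = word_counts[intent]
        total = sum(counts.values())
        return sum(counts[w] / total for w in words)
    return max(word_counts, key=overlap)

print(classify("what time do i have to get out of here"))
```

Even this crude model maps “what time do i have to get out of here” to the checkout-time intent, because its words overlap the checkout training rows far more than the others; a production classifier would add real semantics, but the train-then-classify contract a developer sees is the same.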
“We’ve used world-class machine-learning techniques to create the classifier, but I don’t want developers to worry about that,” Abrams said. “From my chair, Watson is a collection of services that make it possible for developers to build cognitive systems.”
Looking ahead, IBM aims to advance the science underlying Watson as well as simplify and scale its associated services, he added.
IBM has placed big bets on Watson and cognitive computing, and it now considers the technology not just the next frontier in computing but a core part of its business.
“When everyone’s digital, and every industry has its Uber, its Tesla, its whatever, what differentiates you?” said IBM chairman, president and CEO Ginni Rometty in a recent interview. “The thing that can most differentiate your company is cognitive.”
That emphasis could pay off handsomely. According to Allied Market Research, the cognitive computing market is expected to generate $13.7 billion in revenue by 2020.
Of course, IBM isn’t the only company vying for a piece of that pie.
“The techniques used by Watson are being replicated everywhere by other companies,” said Endpoint’s Kay. But “of all general AI/machine-learning tools, Watson is the most developed and longest-standing.”
Said Abrams, “It’s our objective to make cognitive the premier paradigm for computing, and to make Watson the premier platform on which that’s done.”