Microsoft’s Cortana AI gives programmable bots listening and decision-making skills
Remember the Cortana-powered prototype that was at last year's Build? It's baaack.
By Mark Hachman
PCWorld | May 6, 2019 9:27 am PDT
Image: Mark Hachman / IDG
Cortana used to represent Microsoft’s user-facing AI and reside largely in Windows. But times have changed, and she’s now evolved from one persona into the technology powering hundreds of automated bots, architected by Microsoft’s customers—meaning you could meet a Cortana-like intelligence in a lot more places.
That includes the Cortana prototype device that Microsoft demonstrated at last year’s Microsoft Build conference, which Microsoft showed off on stage at Build 2019 once again.
In advance of its Microsoft Build developer conference, held May 6-8 in Seattle, Microsoft has announced new cloud-based cognitive services that will benefit business users, such as transcribing meetings and extracting text from forms.
Remember Microsoft’s “conference room of the future” demo from last year’s Build? A Cortana-powered device (housed within a tabletop unit similar to an Amazon Echo or Google Home smart speaker) recognized, assisted, and recorded a meeting. To do that, Cortana (or another agent) needed to be able to sense the world around it.
At Build 2019, Microsoft is making some of those capabilities available to third-party agents via its Cognitive Services. Specifically, the company is adding conversation transcription, so that a bot will be able to “hear” a conversation, transcribe it in real time, and attribute each utterance to a particular speaker. Microsoft is also building improved “sight” capabilities: the ability to recognize digital ink, and a service it calls Form Recognizer, which extracts the data a user has entered into a form. Rounding things out, QnA Maker lets a bot engage users in preset dialogue trees.
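To make “preset dialogue trees” concrete, here is a minimal sketch of the idea: a bot that walks a fixed tree of prompts, matching each user reply against predefined options. The tree contents, node names, and matching logic are invented for illustration; they are not Microsoft’s API or data format.

```python
# Illustrative sketch only: a tiny preset dialogue tree of the kind a
# QnA Maker-style bot follows. Nodes, prompts, and keywords are invented.
DIALOGUE_TREE = {
    "root": {
        "prompt": "How can I help you?",
        "options": {"reset password": "reset", "billing": "billing"},
    },
    "reset": {
        "prompt": "Is this a work or personal account?",
        "options": {"work": "reset_work", "personal": "reset_personal"},
    },
    "reset_work": {"prompt": "Contact your IT admin to reset a work password.", "options": {}},
    "reset_personal": {"prompt": "Use the account recovery page to reset it.", "options": {}},
    "billing": {"prompt": "Opening a ticket with the billing team.", "options": {}},
}

def step(node_id: str, user_text: str) -> str:
    """Advance one step: match the user's reply against the node's preset options."""
    node = DIALOGUE_TREE[node_id]
    for keyword, next_id in node["options"].items():
        if keyword in user_text.lower():
            return next_id
    return node_id  # no match: stay put so the bot can re-prompt

# One scripted exchange down the tree.
state = step("root", "I need to reset password")      # follows the "reset password" branch
state = step(state, "it's a personal account")        # follows the "personal" branch
print(DIALOGUE_TREE[state]["prompt"])
```

The real service adds natural-language matching on top of the tree; the point here is only the structure a bot author defines up front.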
The point of artificial intelligence, however, is that Microsoft’s services should go beyond recording input to analyzing it and providing assistance. A new Cognitive Services component, called Decision, can take that input and act upon it. A related service, Personalizer, tries to interpret those results in the context of a particular user.
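The loop a Personalizer-style service exposes is “rank an action for this user, then report back how well it worked.” The sketch below is a toy version of that loop using a simple explore/exploit rule over per-user average rewards; the class name, method names, and simulated user are all invented for illustration, not Microsoft’s actual API or algorithm.

```python
import random

class ToyPersonalizer:
    """Toy rank/reward loop, loosely in the shape of a personalization service.
    Keeps a running average reward per (context, action) pair and usually
    picks the best-known action, exploring a small fraction of the time."""

    def __init__(self, epsilon=0.1, seed=0):
        self.epsilon = epsilon          # fraction of calls spent exploring
        self.rng = random.Random(seed)
        self.stats = {}                 # (context, action) -> [total_reward, count]

    def rank(self, context, actions):
        """Choose one action for this context."""
        if self.rng.random() < self.epsilon:
            return self.rng.choice(actions)     # explore
        def avg(action):
            total, count = self.stats.get((context, action), [0.0, 0])
            return total / count if count else 0.0
        return max(actions, key=avg)            # exploit the best average so far

    def reward(self, context, action, value):
        """Report how well the chosen action worked (e.g. 1.0 = clicked)."""
        total, count = self.stats.get((context, action), [0.0, 0])
        self.stats[(context, action)] = [total + value, count + 1]

# Simulate a user who only ever clicks "sports" recommendations.
p = ToyPersonalizer()
for _ in range(200):
    choice = p.rank("user_42", ["sports", "finance", "weather"])
    p.reward("user_42", choice, 1.0 if choice == "sports" else 0.0)
```

After the loop, the learner has settled on the action that earned reward for that user, which is the behavior Personalizer automates at service scale (with contextual-bandit learning rather than this toy averaging).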
Microsoft wants Azure, its cloud service, to be the engine behind all of this, and is positioning it as the “cloud for AI.” Microsoft says more than 1.3 million developers are using Azure Cognitive Services, with 3,000 new bots being created each week. In all, there are almost 400,000 digital agents, Microsoft says.
Microsoft is also leaning on specialized hardware to accelerate the training of these machine-learning (ML) models. In 2016 the company began talking about putting field-programmable gate arrays (FPGAs) inside its servers to increase performance. A general-purpose CPU like an Intel Xeon can be programmed to run any algorithm, while a dedicated fixed-function ASIC (application-specific integrated circuit) is generally the fastest implementation of one particular algorithm, but an ASIC can’t be changed once it’s fabricated. An FPGA splits the difference between performance and flexibility: its logic can be reprogrammed as the underlying models evolve. Now, Microsoft said it’s making generally available the hardware-accelerated Azure Machine Learning models that run on FPGAs.
Updated at 9:22 AM on May 6, noting that the Cortana-powered prototype device was at the 2019 Build, too.