Behind the scenes, something’s going on with Intel PCs. They’re becoming smarter. Intel envisions a future where your PC will simply anticipate your habits and act accordingly. But it’s not clear when that future will arrive, how realistic that vision will be, or whether consumers will tolerate a computer that predicts their every move.
What we know is this: Intel is building a future version of its tiny desktop PCs, the NUCs, with Amazon’s Alexa assistant built in. The Intel “Bean Canyon” NUC—“Bean” as in “coffee bean,” a nod to the “Coffee Lake” chip inside it—will arrive later this year. (Intel is currently working with Amazon to obtain certification.) Meanwhile, Intel is adapting its Movidius chips into “AI chips” that will power these intelligent future experiences.
At Computex, Intel NUC marketing manager Bruce Patterson showed off Movidius Compute Sticks plugged into a NUC. Those Sticks were running a new version of Movidius’ Myriad X, a vision processing unit with an embedded compute engine inside it. In a demonstration, the Compute Stick showed off its ability to recognize faces as a movie trailer played. The software was scanning for and identifying faces as soon as they appeared on screen, yet the main CPU was running at less than 20 percent utilization, because the recognition work was offloaded to the Myriad X itself.
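The pattern the demo illustrates is a familiar one to developers: hand each video frame to an accelerator and keep the host CPU’s loop light. Here is a minimal, purely hypothetical sketch of that dispatch pattern in Python—`detect_faces_stub` is a stand-in invented for illustration, not Intel’s actual software stack, which would call into the Myriad X through a vendor SDK.

```python
# Hypothetical sketch of accelerator offload (not Intel's actual stack):
# the host loop submits frames to a worker that stands in for the VPU,
# so the "CPU side" does little more than collect results.
from concurrent.futures import ThreadPoolExecutor

def detect_faces_stub(frame):
    # Stand-in for a Myriad X inference call. A real application would send
    # the frame to the VPU and receive bounding boxes back.
    return [(10, 10, 64, 64)] if frame.get("has_face") else []

def process_stream(frames, detector=detect_faces_stub):
    # Dispatch each frame's detection to the worker, mirroring how the demo
    # keeps host-CPU utilization low by pushing the heavy math elsewhere.
    with ThreadPoolExecutor(max_workers=1) as vpu:
        futures = [vpu.submit(detector, f) for f in frames]
        return [f.result() for f in futures]

frames = [{"has_face": True}, {"has_face": False}]
results = process_stream(frames)
```

The single-worker executor here is only a metaphor for the dedicated chip; the point is that the expensive per-frame work never runs on the main thread.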
If you’ve never heard of Movidius or the Myriad X before, don’t worry. Intel bought Movidius in 2016, when the company was publicly designing computer-vision chips that could interpret feeds from several cameras scattered around a car, a precursor to autonomous vehicles. Since the acquisition, though, Intel has begun combining Movidius’ ability to “see” with intelligence that it’s adding behind the scenes.
“I expect it’s going to become so normal that you’re just going to put it on the motherboard,” Patterson said of the Myriad X. (Asus has apparently already done so, the company said earlier this week, though it’s not clear whether the chip is actively doing anything.)
So what else needs to happen? According to Intel, developers need to support Windows ML, the machine-learning API that Microsoft recently debuted within Windows 10. That technology is still in its infancy: so far, even Microsoft’s own product teams have only shown off how Windows ML could eventually be used to “read” text in the real world and import it into apps like OneNote.
For now, AI in the PC experience will appear mostly in the retail space, where Intel is engaging with a number of software developers. Once developers start tapping into Windows ML later this year, “that’s when we’re going to see more consumer experiences start to happen,” Patterson said.
Imagine, Patterson said, if your computer could see who you are, or simply know that you check Outlook every day at 9 a.m. “So the idea is that you’d sit down at 9 a.m., and Outlook would already be open,” he said. Your PC could “see” you sitting down, recognize the time, and so on. Because it understands your habits, your PC could anticipate them.
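At its simplest, the habit-learning Patterson describes is just frequency counting: log which app opens at which hour, then pre-launch the most likely one. This toy sketch is entirely hypothetical—nothing here is Intel’s software—but it shows how little machinery the basic idea needs.

```python
# Toy illustration of habit anticipation (hypothetical, not Intel's software):
# count app launches per hour of day, then predict the likeliest app for an hour.
from collections import Counter, defaultdict

class HabitModel:
    def __init__(self):
        # hour of day -> counts of apps launched at that hour
        self.launches = defaultdict(Counter)

    def observe(self, hour, app):
        self.launches[hour][app] += 1

    def predict(self, hour):
        # The app most often opened at this hour, or None with no history.
        if not self.launches[hour]:
            return None
        return self.launches[hour].most_common(1)[0][0]

model = HabitModel()
for _ in range(5):
    model.observe(9, "Outlook")   # checked email at 9 a.m. five days running
model.observe(9, "Spotify")       # one exception
print(model.predict(9))           # → Outlook
```

A real system would of course need richer signals (the camera “seeing” you sit down, calendar context) and a confidence threshold before it acted, but the prediction step itself can start this simple.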
But it’s unclear how Intel plans to actually architect this future. Though Intel clearly influences hardware design, from the depth cameras used by Windows Hello to new dual-screen prototypes, it’s struggled to push consumer-facing software and services like True Key into the mainstream. Microsoft would clearly have something to say about how consumers should be using AI to benefit their lives. And with companies like Facebook in the news, again, for sharing data without user knowledge, consumers will have something to say, too.
Nevertheless, the Myriad X has taken the first steps down a tried-and-true path toward integration: first a discrete chip mounted on a card or external device, then a chip mounted directly on the motherboard, then a chipset component or even dedicated functionality within the CPU. It’s happened before with audio and basic integrated graphics. Intel is clearly eyeing a future where a more intelligent PC is powered by Intel Inside.