Anyone concerned that Microsoft is evolving into a more accessible version of IBM, rather than the consumer company many would like it to be, isn’t going to feel any better after the company’s Build developer conference, which kicks off Monday, May 7, in Seattle. Two expected moves will reinforce that enterprise direction: a Kinect sensor for Azure, and two HoloLens apps being adapted for businesses using mixed reality.
Microsoft chief executive Satya Nadella is expected to open Build on Monday by describing the “intelligent cloud and the intelligent edge,” which has been Microsoft’s unofficial mantra for about a year. Microsoft plans to define what it means by intelligent edge: By 2020, there will be about 30 billion connected devices, each generating about 1.5GB of data per day. Smart buildings and connected factories will add to that. Expect to hear quite a bit about the Internet of Things on Monday—“the world is a computer,” Nadella is expected to say.
But technology once enjoyed by consumers is now being extended into businesses, where margins are higher and customers will pay for subscriptions. Project Kinect for Azure will debut in 2019, repackaging what was once the eyes of the Xbox as a product for developers. Remote Assist, one of the original demos attached to the HoloLens, is also being pitched at “frontline workers” who could use mixed reality for training or collaboration.
Windows users will be tossed a bone, though: Microsoft plans to announce a partnership with DJI, the world’s largest drone company, on an SDK for Windows PCs. The SDK means that Windows users will essentially be able to remote-pilot drones right from their PC, including flight control and real-time data transfer.
Why this matters: If you’re a Windows fan, a consumer, or even an office worker who uses Windows, you may not hear much that interests you early in the day at Microsoft Build. Microsoft has moved on from its arms race for Windows app developers and is now engaged in an arms race on the AI front, competing with Google, Amazon, and others to build for and with AI.

A partnership between DJI and Microsoft could make controlling drones from a PC simpler.
The vision beyond Windows is AI…and vision
One of Nadella’s key announcements will be AI for Accessibility, a $25 million, five-year fund aimed at using AI to improve the lives of users with disabilities. To Microsoft’s credit, the company has invested heavily in making Windows accessible, with assistive technologies ranging from audible Narrator cues to dictation and eye tracking.

Microsoft has pitched HoloLens as a business solution for almost as long as it’s been available, but it’s unclear how many have signed on.
Microsoft sees AI as a technology that lives both in the cloud and locally on devices, where it can make decisions even when a device can’t reach the cloud. An example Microsoft is expected to unveil is Custom Vision, a service that will run on what Microsoft calls Azure IoT Edge.
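To make that cloud-versus-edge split concrete, here is a minimal sketch of what local decision-making can look like in practice: a device posts a captured camera frame to a vision model running on the device itself (for example, a Custom Vision model exported as a container and deployed as an Azure IoT Edge module) rather than to a cloud endpoint. The localhost address, port, and /image scoring route below are illustrative assumptions, not details Microsoft has announced.

```python
# Minimal sketch: querying a vision model running locally on the device
# instead of a cloud endpoint. The host, port, and /image route are
# hypothetical stand-ins for an edge-deployed scoring container.
import requests

LOCAL_SCORING_URL = "http://localhost:8080/image"  # assumed local endpoint

def classify_frame(image_path: str) -> dict:
    """Send a captured frame to the local model and return its predictions."""
    with open(image_path, "rb") as f:
        response = requests.post(
            LOCAL_SCORING_URL,
            data=f.read(),
            headers={"Content-Type": "application/octet-stream"},
            timeout=5,  # the decision is made on-device; no cloud round trip
        )
    response.raise_for_status()
    # Typical shape: {"predictions": [{"tagName": "...", "probability": 0.97}]}
    return response.json()

if __name__ == "__main__":
    print(classify_frame("frame.jpg"))
```

Running inference this way keeps latency predictable and keeps working when connectivity drops, which is exactly the scenario Microsoft’s intelligent-edge pitch is built around.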
Custom Vision sounds like it will work hand in hand with Project Kinect for Azure, the ongoing evolution of the Kinect sensor into something more powerful and more portable. Kinect debuted on the Xbox 360 as a standalone sensor bar that perched atop your TV, but by the Xbox One era, camera-enabled games had fallen out of favor, and the Kinect was eventually discontinued.

Project Kinect for Azure
A version of Kinect was later incorporated into the HoloLens, but that device, too, has resided mainly in Microsoft’s development labs since 2016. In a sense, HoloLens gave way to mixed reality, a key feature of the Windows 10 Fall Creators Update. Unfortunately, that release came and went with little more than a few mentions of Microsoft’s mixed-reality vision.
Project Kinect for Azure is described as a “package of sensors,” including Microsoft’s depth camera and additional sensors in a “small, power-efficient form factor.” “It can input fully articulated hand tracking and high fidelity spatial mapping, enabling a new level of precision solutions,” Microsoft said in a statement.
Because Microsoft describes Project Kinect for Azure as enabling “new scenarios for developers working with ambient intelligence,” it’s likely that Kinect for Azure won’t yield a consumer product, at least from Microsoft, but will instead serve as a component of sorts for third-party hardware. In a demonstration, Microsoft will fly drones across the stage, using machine vision to identify defective pipes.

It’s unclear how many mixed-reality devices have been sold. Given Microsoft’s relative silence on the subject, probably not many. Valve’s Steam service, though, is now live on Windows Mixed Reality headsets, Microsoft said last week.
Vision, especially augmented vision that could be shared, was one of the incredible features shown off in the original 2015 HoloLens demo in Redmond, where testers (including me) were offered the chance to rewire a live light switch, assisted by a remote user who could “see” through the headset’s front-facing camera via Skype. One of the new features of mixed reality will be Microsoft Remote Assist, which lets mixed-reality users (and presumably HoloLens users, too) share what they see with any member of their Teams contact list while keeping their hands free to fix the problem.
HoloLens users will also be able to construct holograms and 3D models of room layouts with what Microsoft calls Microsoft Layout. The two apps, Microsoft Remote Assist and Microsoft Layout, will be available for the HoloLens on May 22.
Nadella and Scott Guthrie, executive vice president of the Cloud and AI Group, are expected to talk for about three hours Monday morning, describing their vision for the intelligent edge, intelligent cloud, and AI in more detail. Tuesday has been given over to Joe Belfiore, head of the Devices and Experiences Group, who is expected to lay out more of Microsoft’s plans to enhance Windows experiences on smartphones.
This story was updated at 9:52 AM with additional detail.