Microsoft’s research division has developed a keyboard that can interpret basic hand gestures, potentially bridging a gap between touch devices and more traditional input methods.
Presented at the Computer-Human Interaction (CHI) conference in Toronto, the prototype keyboard has 64 sensors that detect the movement of hands as they brush over the top of the keyboard. Swiping a hand over the left or right side, for instance, can bring up the left and right edge menus in Windows 8.
The main goal is for users to be able to keep their hands on or very close to the keyboard while typing and using input gestures, said Stuart Taylor, a Microsoft senior research engineer.
Some of the gestures can replace existing keyboard shortcuts, like the Alt and Tab combination for switching between applications.
“What we’ve found is that for some of the more complicated keyboard shortcut combinations, performing gestures seems to be a lot less overhead for the user,” he said.
Gesture control on touchscreens is commonplace for tasks like flicking through photos or pulling up menus. Even some mice can interpret gestures, but keyboards have largely stuck to their traditional input method.
Taylor said Microsoft’s keyboard can interpret a number of gestures, though only a few were working at the conference in Toronto. He also said it’s not designed to replace a mouse.
“It’s less about fine-grain navigation, which would still be performed with a mouse or touchpad,” he said.
The team has been working on the project for about a year and a half and will continue to refine the gesture interpretation. The sensors on the keyboard are arranged in pairs, with one sensor emitting infrared light and the other reading the light reflected back. It’s not unlike the technology in Microsoft’s Kinect motion sensor.
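To give a rough sense of how reflected-light readings might become gestures, here is a minimal illustrative sketch, not Microsoft’s implementation. It assumes the 64 sensor pairs form an 8x8 grid, that each frame is an 8x8 array of reflected-light intensities in [0, 1], and that a reading above a threshold means a hand is over that sensor; the threshold and travel distance are invented for illustration.

```python
# Hypothetical sketch: classifying a horizontal swipe from frames of
# IR proximity readings. The 8x8 layout, THRESHOLD, and MIN_TRAVEL
# are assumptions, not details from Microsoft's prototype.

THRESHOLD = 0.5   # readings above this count as "hand present"
MIN_TRAVEL = 3    # columns the hand must cross to count as a swipe

def centroid_column(frame):
    """Mean column index of sensors covered by the hand, or None."""
    total, count = 0.0, 0
    for row in frame:
        for col, reading in enumerate(row):
            if reading > THRESHOLD:
                total += col
                count += 1
    return total / count if count else None

def classify_swipe(frames):
    """Return 'right', 'left', or None for a sequence of frames."""
    path = [c for c in map(centroid_column, frames) if c is not None]
    if len(path) < 2:
        return None
    travel = path[-1] - path[0]
    if travel >= MIN_TRAVEL:
        return "right"
    if travel <= -MIN_TRAVEL:
        return "left"
    return None
```

For example, a hand whose sensed footprint drifts from the left columns to the right columns over successive frames would classify as a rightward swipe, which an application could map to opening a right-side menu.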
Since it is still a research project, there are no immediate plans for commercialization, but technology like this could give Microsoft a much-needed leg up in the computing race.