Two Microsoft Research projects presented at the Conference on Human Factors in Computing Systems used novel methods, entirely free of cameras, to sense gestures.
Called Soundwave, one project used a laptop’s standard speakers and microphone to detect the motions users were making with their hands. The system emitted an inaudible ultrasonic tone that bounced off a user’s hand. Because of the Doppler effect, the reflected sound waves shifted in frequency, and the microphone picked up that shift. The computer then interpreted the shift and turned it into an action, explained Sidhant Gupta, a former Microsoft Research intern.
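The physics behind that shift is straightforward to sketch. The snippet below is not Microsoft's code; it is a minimal illustration of the Doppler arithmetic, with the 18 kHz pilot tone and hand speed chosen as assumptions for the example.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def doppler_shift(tone_hz: float, hand_velocity: float) -> float:
    """Frequency shift of a tone reflected off a hand moving at
    hand_velocity m/s (positive = toward the microphone).

    The reflected wave is shifted twice -- once with the hand acting
    as a moving receiver, once as a moving transmitter -- giving
    approximately 2 * v / c * f0 for speeds far below c.
    """
    return 2.0 * hand_velocity / SPEED_OF_SOUND * tone_hz

# A hand moving toward the laptop at 0.5 m/s shifts an 18 kHz tone
# up by roughly 52 Hz -- small, but easy to spot in a spectrum.
shift = doppler_shift(18_000, 0.5)
```

A hand moving away produces a negative shift, which is how the system can tell direction of motion, not just that motion occurred.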
To see Soundwave and another project called Humantenna, watch a video on YouTube.
Gupta has mapped five gestures to computer actions, but thinks that the system could support up to 10.
“There’s a fundamental limit to what gestures you can perform,” he said. “The Doppler Effect only senses motion, so if you simply keep your hand in front of it [the computer], it is something we simply cannot detect.”
The system samples its surroundings every 100 milliseconds and adapts if, for example, a user enters a loud room.
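One simple way such adaptation could work is to track the ambient level with a running average and adjust the detection threshold to follow it. This is a hypothetical sketch, not Soundwave's actual algorithm; the function name, smoothing factor and 100 ms period are assumptions for illustration.

```python
import time

def adaptive_noise_floor(read_level, alpha=0.2, period_s=0.1, steps=10):
    """Hypothetical calibration loop: sample the ambient level every
    period_s seconds and track it with an exponential moving average,
    so the detection threshold follows the room (e.g. when a user
    walks into a loud environment).

    read_level is any callable returning the current ambient level.
    """
    floor = read_level()
    for _ in range(steps - 1):
        time.sleep(period_s)
        floor = (1 - alpha) * floor + alpha * read_level()
    return floor
```

With `alpha` near 1 the floor reacts quickly but jitters; near 0 it is smooth but slow to notice a change in the room, which is the usual trade-off for this kind of filter.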
Gupta said that one potential application for Soundwave could be in cars, where the system is immune to changes in lighting, unlike some vision-sensing machines. He said it could also be complementary to vision systems.
“Soundwave could be used for detecting if a user is performing a gesture and then turn on some of the other systems to do some more fine grain gesture sensing,” he said.
Another Microsoft Research project, called Humantenna, used electromagnetic and radio interference to sense gestures.
A device a bit larger than a wristwatch could be attached to a user. When the user waves an arm, kicks a leg or stomps a foot, a computer can be trained to pair those gestures with actions.
“We’re simply measuring the voltage on the body,” said Gabe Cohn, a former Microsoft Research intern. “And part of that is a low-frequency component that moves because I’m moving, and there’s a high-frequency component that’s changing because I’m changing my proximity to noise sources in the environment, like power lines and appliances.”
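Splitting a signal into those two components is a standard filtering step. The sketch below uses a plain moving average as the low-pass filter; it is an assumption-laden illustration of the idea Cohn describes, not the Humantenna implementation.

```python
def split_components(samples, window=8):
    """Split a voltage trace into a low-frequency part (dominated by
    body motion) and a high-frequency residual (dominated by changing
    proximity to noise sources such as power lines).

    Uses a simple moving average as the low-pass filter; the residual
    after subtracting it is the high-frequency component.
    """
    low = []
    for i in range(len(samples)):
        start = max(0, i - window + 1)
        low.append(sum(samples[start:i + 1]) / (i - start + 1))
    high = [s - l for s, l in zip(samples, low)]
    return low, high
```

A real system would use a properly designed digital filter with a tuned cutoff, but the decomposition itself (slow trend for motion, fast residual for environmental interference) is the same.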
Traditionally, for a device to detect what users are doing with their legs or feet, accelerometers would need to be attached to those limbs. With Cohn’s project, only the small wrist-worn device is needed.
It’s not perfect, though. The system worked only intermittently during a demonstration, because of what Cohn described as “a small training set.”
He said that the system had achieved more than 90 percent accuracy during a study period.
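The training Cohn mentions amounts to pairing recorded signal features with gesture labels and matching new samples against them. A nearest-neighbor rule is the simplest way to sketch this; the source doesn't say which classifier Humantenna uses, so treat everything here as an assumed example.

```python
def classify(sample, training_set):
    """Minimal nearest-neighbor sketch: return the label of the
    training example whose feature vector is closest (squared
    Euclidean distance) to the new sample.

    training_set is a list of (feature_vector, label) pairs. With a
    small training set -- as in the demo -- results are brittle,
    since one odd example can dominate its neighborhood.
    """
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(training_set, key=lambda ex: sq_dist(ex[0], sample))[1]
```

Adding more labeled examples per gesture smooths out that brittleness, which is consistent with the small-training-set explanation for the demo's intermittent behavior.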
While both Humantenna and Soundwave offer alternative approaches to gesture sensing, they are only research projects, and there are no immediate plans for commercialization.
Nick Barber covers general technology news in both text and video for IDG News Service. E-mail him at Nick_Barber@idg.com and follow him on Twitter at @nickjb.