I’ve wanted a Minority Report-esque way to control my computer for years, especially since much of my astronomy image-processing workflow involves countless mouse clicks that can’t be automated. But I shouldn’t have to wait much longer, thanks to a team at MIT that has built a hand-detection system using Microsoft’s Kinect camera. By tracking where your palms and fingers are, the system lets you scroll through images, select them, and even enlarge them at will, using one or both hands. No clumsy gloves required!
The interface was designed by members of the Learning and Intelligent Systems group and the Robot Locomotion Group in the Computer Science and Artificial Intelligence Laboratory at MIT. If you want to try your hand (pun intended) at coding up a similar gesture-detecting Kinect system, the researchers have posted the code for this hack for your perusal.
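To give a flavor of the kind of processing involved, here's a toy sketch (not the MIT team's actual code) of one common starting point for Kinect hand detection: assume the user's hand is the closest thing to the camera, and segment out the nearest blob in the depth frame. The function name, the depth band, and the use of a NumPy array as the depth frame are all my own assumptions for illustration.

```python
import numpy as np

def find_hand(depth_frame, band_mm=120):
    """Return the (row, col) centroid of the nearest blob in a depth frame.

    A toy stand-in for hand detection: assumes the user's hand is the
    closest object to the camera, as it usually is when gesturing.
    depth_frame: 2D array of depth readings in millimetres (0 = no reading).
    band_mm: how far behind the nearest surface to keep, roughly the
    depth extent of an outstretched hand (an illustrative guess).
    """
    valid = depth_frame > 0  # Kinect reports 0 where depth is unknown
    if not valid.any():
        return None
    nearest = depth_frame[valid].min()
    # Keep all pixels within band_mm of the nearest surface.
    mask = valid & (depth_frame <= nearest + band_mm)
    rows, cols = np.nonzero(mask)
    return rows.mean(), cols.mean()

# Example on a synthetic 640x480 frame: a flat wall at 2 m with a
# hand-sized patch at 0.8 m.
frame = np.full((480, 640), 2000, dtype=np.int32)
frame[100:150, 200:260] = 800
print(find_hand(frame))  # centroid lands inside the nearer patch
```

A real system would track that centroid frame to frame (and resolve individual fingers) to turn motion into scroll, select, and zoom gestures.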
If you wind up building your own Kinect-based computer interface, let us know in the comments!