Touchscreens are already pretty effective with all the multi-touch swiping, tapping, and pinching gestures we can use on them. But why stop there? Researchers at Carnegie Mellon University are looking to expand the capability of touchscreens to detect exactly what has touched them.
The new system, called TapSense, was developed by Human-Computer Interaction Institute students Chris Harrison and Julia Schwartz and professor Scott Hudson. TapSense can distinguish whether it has been tapped with a finger pad, nail, tip, or even a knuckle. It does this with a microphone attached to the screen, which listens to the distinct sound each kind of touch makes. This could open up a host of new tap-specific gestures, such as capitalizing letters with a fingernail tap or toggling between erasing and painting by swiping with different parts of your finger.
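The article doesn't detail how TapSense's classifier actually works, but the basic idea, telling tap types apart by how they sound, can be sketched in a few lines. Everything below is an illustrative assumption: the synthetic "taps", the two acoustic features, and the nearest-centroid matching are stand-ins, not the researchers' method.

```python
import numpy as np

SR = 44100                       # sample rate (Hz)
t = np.linspace(0, 0.02, 882)    # a 20 ms clip: 882 samples at 44.1 kHz
rng = np.random.default_rng(0)

def synth_tap(freq):
    """Stand-in for a recorded tap: a damped sinusoid plus sensor noise.
    A dull knuckle thud is modeled as low-pitched, a nail click as high."""
    return (np.sin(2 * np.pi * freq * t) * np.exp(-200 * t)
            + 0.01 * rng.standard_normal(t.size))

def tap_features(clip):
    """Two crude acoustic features: power-weighted spectral centroid
    and zero-crossing rate (our choice, not necessarily TapSense's)."""
    power = np.abs(np.fft.rfft(clip)) ** 2
    freqs = np.fft.rfftfreq(len(clip), 1 / SR)
    centroid = np.sum(freqs * power) / np.sum(power)
    zcr = np.mean(np.abs(np.diff(np.sign(clip)))) / 2
    return np.array([centroid, zcr])

# "Train" on a handful of examples of each tap type.
train = {"knuckle": [synth_tap(300) for _ in range(5)],
         "fingernail": [synth_tap(4000) for _ in range(5)]}
labels = list(train)
centroids = [np.mean([tap_features(c) for c in train[l]], axis=0)
             for l in labels]

def classify(clip):
    """Assign the label of the nearest feature centroid."""
    f = tap_features(clip)
    return labels[int(np.argmin([np.linalg.norm(f - c) for c in centroids]))]

print(classify(synth_tap(350)))   # low-pitched tap  -> knuckle
print(classify(synth_tap(3800)))  # high-pitched tap -> fingernail
```

A real system would need to segment taps out of a continuous audio stream and would likely use richer features and a trained classifier, but the pipeline is the same: record the impact sound, extract features, and match it to the closest known tap type.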
“TapSense basically doubles the input bandwidth for a touchscreen,” said Chris Harrison, who also worked on OmniTouch featured on GeekTech last week. “This is particularly important for smaller touchscreens, where screen real estate is limited. If we can remove mode buttons from the screen, we can make room for more content or can make the remaining buttons larger.”
The researchers found that TapSense could “distinguish between the four types of finger inputs with 95 percent accuracy, and could distinguish between a pen and a finger with 99 percent accuracy.”
Unfortunately, the researchers also say the system requires an external microphone, because the ones built into smartphones are optimized for voice recognition. But we imagine it would be easy to recalibrate them through firmware for other uses when you aren't taking a call.