New Interfaces Challenge Touch
After showing a project that let users control a music player by moving their eyes, Japanese mobile phone operator NTT DoCoMo said there were no plans to add it to any products. Shown at Ceatec 2009 near Tokyo and again at Mobile World Congress in Barcelona in 2010, it was a crowd-pleaser, but in an August 2011 email NTT DoCoMo spokesman Yoshifumi Kuroda said: "The research is ongoing, but there are currently no plans to use this technology in any products."
The prototype includes earbuds that measure the changes in electrical state when a user's eye moves. Those impulses could then be translated into actions like skipping to the next track or turning up the volume.
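The idea of translating eye-movement signals into player commands can be sketched roughly as below. This is a hypothetical illustration only: the thresholds, the gesture-to-action mapping, and the function name are all assumptions, not details of NTT DoCoMo's actual prototype.

```python
# Hypothetical sketch: mapping changes in electrical potential measured
# by earbud sensors (electrooculography-style signals) to music-player
# commands. All thresholds and mappings are illustrative assumptions.

def classify_eye_gesture(delta_microvolts: float) -> str:
    """Map a change in measured eye potential to a player action."""
    if delta_microvolts > 150:       # large glance in one direction (assumed)
        return "next_track"
    if delta_microvolts < -150:      # large glance in the other direction (assumed)
        return "previous_track"
    if delta_microvolts > 50:        # smaller movement (assumed)
        return "volume_up"
    return "no_action"

print(classify_eye_gesture(200))     # a strong signal skips to the next track
```

A real system would classify filtered signal windows rather than a single voltage delta, but the principle — thresholding measured potentials into discrete commands — is the same.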
Germany's Hasso Plattner Institute took a different approach to gesture interaction. Led by Patrick Baudisch, the Berlin-based group has developed what it calls imaginary interfaces that allow users to interact with mobile devices when they're not in front of them. Imagine hearing your phone ring in your pocket, but instead of taking it out, you hold up your palm and swipe your finger across it to ignore the call.
The prototype system won't be portable anytime soon. It uses depth-sensing cameras mounted above the users, or sometimes on the users' shoulder, to locate where their fingers are and what they're touching.
Baudisch credited Apple with replacing styluses with touchscreens, but he and his team wanted to take it one step further.
"Why don't we leave this [stylus] out and retrieve no devices at all for these tiny interactions such as turning off an alarm or picking up a phone call or sending to voicebox," he said during CHI 2011. "People will interact directly on the palm of their hand."
The system could work because users can remember where roughly 70 to 80 percent of their 20 home screen icons are located, he said.
Just as there is an acclimation period when switching from a mobile device with a keyboard to one with only a touchscreen, Baudisch imagined that there would be a similar adjustment to using a device users can't see.
Touchscreens have been around for decades and they won't be replaced anytime soon, according to Gartner analyst Ken Dulaney. The real power behind touchscreens is the software with which users can interact, he said.
"Pointing to something is human nature," said Dulaney in an interview. Speech recognition isn't perfect, and if a word or two is missed, the entire context could be changed, he said.
In the short term, Dulaney said, improving the accuracy of the interfaces and reducing fingerprints will be on the minds of developers. However, he imagines that transparent displays might become popular in the future. Users could simply hold their phones up and content could be overlaid, similar to how today's augmented reality applications use a phone's camera, he said.
At Ceatec 2010 in Japan, TDK showed off transparent screens and according to a May 2011 press release, the company has begun mass production of them. Called electroluminescent displays by TDK, the screens have a resolution of 320-by-240 pixels and are "mainly intended for use as the main display panel in mobile phones and other mobile devices."
Brain control interfaces abandon touch and gesture control and rely solely on the power of thought. Researchers at Riken, Japan's government-run research body, have developed a brain-machine interface (BMI) that lets users control a wheelchair using thought. The thought patterns are picked up by electroencephalography, or EEG, sensors mounted on a user's head. The data is then relayed to a laptop, which interprets it and sends the control signals to the wheelchair.
The system needs about three hours of training per day for a week to achieve a 95-percent accuracy rate, according to Riken.
Plans to use the technology in rehabilitation and therapy are already under way, according to Andrzej Cichocki, head of the Laboratory for Advanced Brain Signal Processing at Riken.
Based on the same principles, one company showed off a BMI that let users type by just concentrating on letters they want to use. Shown at Cebit 2011, Guger Technologies presented intendiX, a system that consists of a skullcap with electrodes, a pocket-sized brainwave amplifier and a Windows application that analyzes and decodes the brain waves.
To enter a letter, the user must stare at that letter on a virtual keyboard. The software flashes the columns and rows of the keyboard and the system tries to detect a response in the brain when the desired letter is flashed. The system looks for brainwaves that are triggered 300 milliseconds after a stimulus.
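The row/column flashing scheme described above can be sketched in a few lines. This is a simplified illustration, not Guger Technologies' implementation: the scores stand in for classified EEG responses measured about 300 milliseconds after each flash, and the keyboard layout is an assumed toy grid.

```python
# Illustrative sketch of a row/column P300 speller. Each row and column
# of the virtual keyboard is flashed; the target letter sits at the
# intersection of the row and column whose flashes evoked the strongest
# P300-like response. Scores here stand in for real EEG classification.

KEYBOARD = [
    ["A", "B", "C"],
    ["D", "E", "F"],
    ["G", "H", "I"],
]

def decode_letter(row_scores, col_scores):
    """Pick the letter at the intersection of the best-scoring row and column."""
    best_row = max(range(len(row_scores)), key=lambda i: row_scores[i])
    best_col = max(range(len(col_scores)), key=lambda j: col_scores[j])
    return KEYBOARD[best_row][best_col]

# If flashing row 1 and column 2 produced the strongest responses,
# the user was staring at "F".
print(decode_letter([0.1, 0.9, 0.2], [0.3, 0.2, 0.8]))
```

In practice each row and column is flashed many times and the responses are averaged, which is why typing a few letters takes noticeable concentration and time.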
"The signal is called P300, it is just a usual signal," said Markus Bruckner, with the company. "For example, when you drive behind a car and it steps on its brakes and the red light flashes you have the same response."
It takes quite a bit of concentration and time to type out just a few letters, but for someone who has no other way of typing, it could bring new opportunities to communicate.
The company hopes to improve the response time and said it's down to one second in the lab.
Many of the prototypes from universities and research groups will never make it into commercial products.
"My job is to prove the concept, not to bring it to market," said Santiago Alfaro, an MIT Media Lab researcher who created Surround Vision.
His project extends the traditional television screen onto an iPad or similar mobile device. When users move the device around, additional content is displayed. He imagines it being used for sporting events or concerts where there are multiple camera angles.
While Alfaro said there is no link between the two, Nintendo employs similar technology in its Wii U game system.
He said there's a benefit when large companies commercialize new interface technology. "This proves that people are thinking the same way," he said. "People will become much more comfortable with the new technology."
While it might take a large corporation to commercialize a new technology, mainstream adoption will ultimately rely on whether consumers can become comfortable with it.
(Martyn Williams and Jay Alabaster in Tokyo contributed to this report.)