Google Glass competitors vie for attention as industry grows
Plenty of eyes may be focused on Google Glass as the device attracts attention in the field of “augmented reality,” but a crop of other players developing their own glasses-like products are also hoping to stand out as the industry matures.
Take Scope Technologies, which has partnered with Epson to develop computer-assisted glasses it calls the “augmented reality training system,” designed to help guide the user through complicated industrial maintenance processes like assembling fuel pumps, making changes to HVAC systems, or fixing broken car parts.
The glasses, which launched last September, use a built-in camera to provide instructions and animated diagrams to the wearer, overlaid directly on the person’s field of view as he or she completes the task. CAD models fed into a content management system supply the content for the unit.
Part of the idea behind the product is to address the problem of having to constantly stop and refer back to a manual. “It’s about going back to the ancient way of doing things to show you what to do,” said Scott Montgomerie, chief technology officer at Scope. The company hopes its device, which sells for about $700 and looks like a bulkier form of wraparound sunglasses, can be used in a variety of industries such as oil, the military and aerospace.
As for how its aesthetics compare to Google Glass? “We think it’s less geeky,” said Montgomerie.
Scope Technologies is among roughly a half dozen groups exhibiting glasses-like devices Tuesday and Wednesday at Augmented World Expo in Santa Clara, California, an annual trade show focused on showcasing “augmented experiences” across nearly all aspects of life, including commerce, education, industry, government, architecture and automotive applications. More than 50 companies and research groups are exhibiting at the show.
Another player is Seebright, which is developing a headset called Spark, designed to provide immersive experiences by connecting to existing technologies like Bluetooth sensors and gaming peripherals over Wi-Fi. The head-mounted unit lets users drop their iPhone or Android device into a slot like it’s a toaster, and uses a series of mirrors to expand and wrap the displayed image around the user.
Applications could include watching movies across the wearer’s full field of view, like a personal IMAX screen, or viewing educational content such as 3D renderings of anatomical models or human skeletons, said Seebright CEO John Murray.
The company is hoping to distinguish the product from Google’s Glass by providing content geared around shorter, specific experiences, like cooking recipes or a workout routine, rather than designing it to be worn all day, as some might with Glass.
Seebright is currently inviting coders to apply for its developer competition, which will kick off in the fall with a low-cost developer version of the device priced at about $100.
Meanwhile, Innovega was another exhibitor at the show with an even bolder idea: It is working on building a high-tech contact lens that would sit directly on the user’s eye to “enhance” normal vision by superimposing certain images on top of the user’s regular field of vision. The applications of the technology are still being worked out, but Innovega is looking to partner with the U.S. Department of Defense for military uses, the company said.
Innovega’s proposition is that devices like Google Glass, which align flat-panel displays in the frame with other optical components to focus the image, result in a small-scale viewing experience with bulky hardware. Therefore, the company is eliminating optics entirely from its device, “enabling higher performance and better style,” Innovega says.
Augmented reality is a small field still in its early days. The basic idea behind most products with that label is to provide additional information or content to “augment” or improve people’s daily experiences.
As opposed to virtual reality, which typically creates a new simulated world for the user, augmented reality works with what people can already see, the thinking goes.
But because the field’s aims are so vague, there is some debate over whether it is its own market at all, or just a sub-domain of products built around mobile devices like smartphones and existing social media. Many features on the current form of Google Glass, for instance, are built around letting wearers quickly record and then share content with their contacts on Google+.
Using Augmented World Expo, now in its fourth year, as a barometer, interest in the field is at least growing. The more than 1,000 attendees at this year’s event marked a 40 percent increase over last year, organizers said.
And there are plenty of other players looking to make waves in augmented reality that aren’t using glasses-based products to do it. ChatPerf, also exhibiting at the show, is a Japanese company that makes an iPhone adaptor that emits perfume scents. Users of the adaptor can “receive” scents by text.
And then there’s ARPool, or augmented reality pool, developed by students at the Robotics and Computer Vision Lab at Queen’s University in Ontario, which uses custom computer algorithms and a projector-camera system to display ball trajectories onto the table as the player lines up a shot. Its developers were hoping to attract a commercial partner at the trade show to take the technology further.
Even Honda is getting involved in augmented reality. At the Honda Research Institute, researchers are working to make people better drivers by using computer-generated sensory information like GPS data and video to give drivers more information about their surroundings as they drive, possibly by displaying it directly on the windshield.
While making a left-hand turn, for instance, the driver could be alerted to the speed of an oncoming car with a certain image on the windshield if there is not enough time to complete the turn, said Victor Ng-Thow-Hing, principal scientist at the institute.
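The timing judgment Ng-Thow-Hing describes can be sketched as a simple gap-acceptance check: compare the time until the oncoming car arrives with the time the turn needs. This is an illustrative sketch only; the function name, parameters, and the 2-second safety margin are assumptions, not details of Honda’s actual system.

```python
def safe_to_turn(oncoming_speed_mps: float,
                 oncoming_distance_m: float,
                 turn_duration_s: float,
                 safety_margin_s: float = 2.0) -> bool:
    """Return True if the gap before the oncoming car arrives
    exceeds the time needed to complete the left turn.

    All names and the 2-second margin are illustrative assumptions.
    """
    if oncoming_speed_mps <= 0:
        return True  # oncoming car is stopped or receding
    time_to_arrival = oncoming_distance_m / oncoming_speed_mps
    return time_to_arrival > turn_duration_s + safety_margin_s

# A car 60 m away at 15 m/s arrives in 4 s, but a 4 s turn plus the
# 2 s margin needs 6 s -- so the system would warn the driver.
```

In a real system the inputs would come from GPS, radar, or video sensing, and the warning would be rendered on the windshield rather than returned as a boolean.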
But augmented reality applications, especially when used in cars, have to be designed carefully, Ng-Thow-Hing said.
Because unlike software crashes, an automobile can crash literally. “Dying is a bad user experience,” he said.