Sputnik International

‘OmniTouch’ projection interface unveiled

Microsoft and Carnegie Mellon University’s Human-Computer Interaction Institute unveiled a new device that projects a hover-capable, motion-tracking, multi-touch interface on any surface, even your own hand.

‘OmniTouch: Wearable Multitouch Interaction Everywhere,’ a paper co-authored by Chris Harrison, a Ph.D. student at Carnegie Mellon University and a former Microsoft Research intern, together with Microsoft researchers Hrvoje Benko and Andy Wilson, describes a wearable system that enables graphical, interactive, multitouch input on arbitrary, everyday surfaces.
OmniTouch employs a short-range depth-sensing camera, similar to the one in Microsoft's Kinect, and a laser pico-projector, both mounted on the user's shoulder, to track the user's fingers on everyday surfaces. This allows users to control interactive applications by tapping or dragging their fingers, much as they would with the touchscreens found on smartphones or tablet computers.
The pico-projector superimposes keyboards, keypads and other controls onto any surface, while a 3D scanning system like that of the Xbox Kinect recognizes motion in three dimensions in front of it, automatically adjusting for the surface's shape and orientation to minimize distortion of the projected images, whether the surface is a wall, a table, an arm, a lap or a leg. The result is a mobile interface that works like a touchscreen but without a screen.
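The article does not spell out how OmniTouch performs this distortion correction, but a standard way to keep a projected image rectangular on a tilted surface is to pre-warp it with a 3x3 homography derived from the camera's model of the surface. The sketch below shows only the core projective mapping; the tilt matrix values are made up for illustration.

```python
import numpy as np

def warp_point(H: np.ndarray, x: float, y: float) -> tuple:
    """Map an image point through homography H (a projective transform)."""
    u, v, w = H @ np.array([x, y, 1.0])
    return (u / w, v / w)

# Identity homography: a surface facing the projector head-on needs no warp.
H_flat = np.eye(3)

# A mild keystone pre-distortion (assumed values) for a tilted surface:
# the bottom-row term makes the warp vary with height, as perspective does.
H_tilt = np.array([[1.0, 0.05,  0.0],
                   [0.0, 1.0,   0.0],
                   [0.0, 0.002, 1.0]])
```

In practice such a matrix would be re-estimated continuously as the surface (a hand or a notebook, say) moves relative to the shoulder-mounted unit.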
The system's software supports multi-touch input and can track digits in 3D space, differentiating between a finger that's hovering over a surface and one that's actively "clicked" an area. And, since it's shoulder-mounted, the system's first-person perspective doesn't require any user calibration or special training. A user can even transfer the interface from one surface to another; say from the back of a notebook onto a nearby wall.
"It's conceivable that anything you can do on today's mobile devices, you will be able to do on your hand using OmniTouch," said Chris Harrison, a Ph.D. student in Carnegie Mellon's Human-Computer Interaction Institute. The palm of the hand could be used as a phone keypad, or as a tablet for jotting down brief notes. Maps projected onto a wall could be panned and zoomed with the same finger motions that work with a conventional multitouch screen.
Harrison added that the device ultimately could be the size of a deck of cards, or even a matchbox, so that it could fit in a pocket, be easily wearable, or be integrated into future handheld devices.
"With OmniTouch, we wanted to capitalize on the tremendous surface area the real world provides," said Hrvoje Benko, a researcher in Microsoft Research's Adaptive Systems and Interaction group. "We see this work as an evolutionary step in a larger effort at Microsoft Research to investigate the unconventional use of touch and gesture in devices to extend our vision of ubiquitous computing even further."
