CMU Researchers Turn Any Surface Into A Touchscreen
Bob Lawrence (VE1RLL)
Posts: 1,720
Soon you, too, will be able to talk to the hand. A new system created jointly by Microsoft and Carnegie Mellon's Human-Computer Interaction Institute allows an interface to be displayed on any surface, including notebooks, body parts, and tables. The UI is fully multitouch, and the shoulder-worn rig locates the surface you're working on in 3D space, ensuring the UI is always accessible. It uses a pico projector and a 3D scanner similar to the Kinect.
The product is called OmniTouch.
Complete Story:
http://techcrunch.com/2011/10/17/cmu-researchers-turn-any-surface-into-a-touchscreen/
Demo Video:
http://www.youtube.com/watch?feature=player_embedded&v=Pz17lbjOFn8
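For anyone wondering how a Kinect-style depth camera can tell a touch from a hover, here is a rough Python sketch of the idea (my own illustration, not the actual OmniTouch code): build a per-pixel depth map of the bare surface, then call a fingertip a "touch" when its depth sits within a small noise threshold of that background. The threshold value and the fingertip-finding step are assumptions.

```python
# Sketch of depth-camera touch sensing (hypothetical, not OmniTouch's code):
# a fingertip "touches" a surface when its depth is within a few millimetres
# of the background depth at the same pixel.

import numpy as np

TOUCH_THRESHOLD_MM = 15.0  # assumed hover/contact cutoff, in millimetres


def capture_background(depth_frames):
    """Average several depth frames of the empty surface (a palm, a table)
    to build a per-pixel background depth map, in millimetres."""
    return np.mean(np.stack(depth_frames), axis=0)


def detect_touches(depth_frame, background, fingertip_pixels):
    """Return the fingertip pixels close enough to the background surface
    to count as touches.

    fingertip_pixels: (row, col) candidates from an earlier finger-tracking
    step (not shown here).
    """
    touches = []
    for (r, c) in fingertip_pixels:
        # Fingertips are nearer the camera than the surface behind them,
        # so background depth minus fingertip depth is the hover height.
        height_above_surface = background[r, c] - depth_frame[r, c]
        if 0.0 <= height_above_surface <= TOUCH_THRESHOLD_MM:
            touches.append((r, c))
    return touches


# Toy example: a 4x4 depth map where one fingertip hovers and one touches.
background = capture_background([np.full((4, 4), 800.0)] * 3)  # surface at 800 mm
frame = background.copy()
frame[1, 1] -= 60.0   # fingertip 60 mm above the surface: hover, ignored
frame[2, 2] -= 5.0    # fingertip 5 mm above the surface: counts as a touch
print(detect_touches(frame, background, [(1, 1), (2, 2)]))  # -> [(2, 2)]
```

The real system does quite a bit more (finger segmentation, surface tracking as you move, projector calibration), but the hover-vs-touch decision comes down to a depth comparison like this one.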
Comments
Or how large could a "button" be? Could a dog stand on a certain spot and have a door open? Could a person stand in front of your front door and have a doorbell ring?
Twitter users will be even more amusing and rude with this.
Point your cell phone at something and you have a regular-sized keyboard!
This is way cooler, and way more powerful. My only question is: how do we integrate this with the Propeller / P2?