A Carnegie Mellon University grad student and former Microsoft Research intern has created a way to turn your hand into a mobile handset. More specifically, your hand can take the place of a cell phone’s keypad or even any button-based computer interface.
Gesture-based, variable-surface, wearable computing isn’t new. Just think of the mind-blowing SixthSense system demoed at TED last year by Pranav Mistry as part of Pattie Maes’ Fluid Interfaces Group at the MIT Media Lab. SixthSense practically turned Mistry into a cyborg, but did so using cameras and visual tracking. Skinput works differently, sensing hand and arm vibrations through a wearable armband sensor. Taps and gestures on the fingers, hand, or arm each produce distinct vibrations the sensor can detect. These vibrations can then be mapped to different mobile phone commands, or used for general computing.
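To get a feel for that last step, here is a minimal sketch of how classified tap vibrations might be mapped to phone commands. The details of Skinput’s signal processing aren’t public here, so this assumes a classifier has already labeled each vibration with a tap location; the location names and command strings are hypothetical.

```python
# Hypothetical mapping from classified tap location to a phone command.
# (Skinput's real command set and location labels are not specified here.)
TAP_COMMANDS = {
    "thumb": "answer_call",
    "index_finger": "hang_up",
    "palm_center": "open_menu",
    "forearm_upper": "volume_up",
    "forearm_lower": "volume_down",
}

def dispatch(tap_location: str) -> str:
    """Translate a classified tap location into a command string.

    Unrecognized locations are ignored rather than raising an error,
    since a real system would see misclassifications and stray taps.
    """
    return TAP_COMMANDS.get(tap_location, "ignore")

# A classified tap on the thumb answers an incoming call:
print(dispatch("thumb"))   # answer_call
print(dispatch("wrist"))   # ignore
```

The interesting design point is that the vibration classifier and the command layer are independent: the same armband could drive a phone, a music player, or any button-based interface just by swapping the mapping table.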
Since a hand could essentially become a virtual mobile handset using Skinput, the system’s versatility means we could potentially get away from awkward keypad interfaces on our smartphones. It also means phones could get smaller, no longer constrained by the size of a keypad. Unfortunately, the system isn’t likely to see commercial use for two to seven years, but it is being trialed in Seattle, Washington, and has achieved accuracy rates as high as 96%.
An image of Skinput in use is shown below, snapped from this CNN video.