Wednesday, June 29, 2011

HCI

Skinput uses sound to turn your body into an input device

In our previous article we wrote about a new material that gives the sense of touch, and here is an invention with a different twist. As devices with ever greater computational power and capabilities become smaller, they can easily be carried on our bodies. However, their small size typically limits the available interaction space, which in turn reduces their usability and functionality.
There have been many suggestions on how to solve this problem, such as augmented reality projected onto our glasses or retinas, small projectors that can turn tables and walls into interactive surfaces, or a projector combined with a camera in one of our favorites, SixthSense. However, researchers Chris Harrison from Carnegie Mellon University and Desney Tan and Dan Morris from Microsoft Research claim there is one surface that has previously been overlooked as an input canvas: our skin.
Appropriating the human body as an input device is appealing not only because we have roughly two square meters of external surface area, but also because much of it is easily accessible by our hands (e.g., arms, upper legs, torso). Furthermore, proprioception (our sense of how our body is configured in three-dimensional space) allows us to accurately interact with our bodies in an eyes-free manner. Few external input devices can claim this accurate, eyes-free input characteristic and provide such a large interaction area.
The user wears an armband containing a very small projector, which projects a menu or keypad onto the person’s hand or forearm. The armband also contains an acoustic sensor. An acoustic sensor works here because tapping different parts of your body produces distinct sounds, shaped by the area’s bone density, soft tissue, joint proximity and other factors.
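To make that concrete, here is a minimal, hypothetical sketch in Python of how a single tap’s waveform might be reduced to a handful of frequency-band features. This is our own illustration under simple assumptions, not the authors’ actual feature set; the function name and band count are invented.

```python
import numpy as np

def tap_features(signal, n_bands=10):
    """Summarize one tap as the average energy per coarse frequency band.

    Hypothetical illustration: window the waveform, take its magnitude
    spectrum, and pool it into bands. Taps on different body locations
    should produce noticeably different band profiles.
    """
    windowed = signal * np.hanning(len(signal))       # taper to reduce leakage
    spectrum = np.abs(np.fft.rfft(windowed))          # magnitude spectrum
    bands = np.array_split(spectrum, n_bands)         # coarse frequency bands
    return np.array([band.mean() for band in bands])  # one feature per band

# Demo on a synthetic 256-sample "tap" waveform.
demo_tap = np.random.default_rng(0).normal(size=256)
print(tap_features(demo_tap))
```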
The Skinput software analyzes the sound frequencies picked up by the acoustic sensor and determines which button the user has just tapped. Bluetooth then transmits the information to the target device: if you tapped out a phone number, the data would be sent to your phone to place the call. Harrison claims they have achieved accuracies ranging from 81.5 to 96.8 percent, with enough distinct buttons to control many devices.
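As a rough sketch of that analyze-then-transmit step, the snippet below trains a support vector machine (the kind of classifier the paper reports using) on stand-in calibration data and forwards each prediction to a pluggable sender. Everything here (the data, the labels, and the send hook) is a synthetic placeholder, not the actual Skinput pipeline.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Stand-in calibration data: in practice each row would hold the acoustic
# features of a recorded tap whose true location is known, gathered in a
# short per-user calibration pass.
X_train = rng.normal(size=(30, 10))            # 30 taps x 10 features
y_train = rng.choice(["thumb", "index", "forearm"], size=30)

classifier = SVC(kernel="rbf")                 # SVM (the paper reports using one)
classifier.fit(X_train, y_train)

def on_tap(features, send):
    """Classify one tap and hand the predicted location to a transport."""
    location = classifier.predict([features])[0]
    send(location)                             # e.g., Bluetooth to the phone

# Demo: classify a fresh (synthetic) tap and "transmit" it by printing.
on_tap(rng.normal(size=10), print)
```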
We think the question is not whether to use Skinput or SixthSense; a practical interface would combine both technologies, each borrowing features from the other. While SixthSense could perform better in loud environments and offers more features, Skinput doesn’t require any markers to be worn and is more suitable for people with sight impairments, since it is much easier to operate with your eyes closed.

Sunday, June 26, 2011

PROJECT

Skinput Uses Your Body as Input Device

Chris Harrison has developed Skinput, a way for your skin to become a touch-screen device and your fingers a keypad. Harrison says that as electronics get smaller and smaller, they become better suited to being worn on our bodies, but the monitor and keypad/keyboard still have to be big enough for us to operate the equipment. This can defeat the purpose of small devices, but with clever acoustics and impact-sensing software, Harrison and his team can give your skin the same functionality as a keypad.
Chris has used tables and walls as touch screens before, but has now experimented with the surface area of our bodies, because the technology is small enough to be carried around with us and we can’t always find an appropriate surface.
Harrison, a third-year PhD student in the Human-Computer Interaction Institute at Carnegie Mellon University, says we have roughly two square meters of external surface area, and most of it is easily accessible by our hands (e.g., arms, upper legs, torso). He has taken the myriad sounds our body makes when tapped by a finger on different areas of, say, an arm, a hand or the other fingers, and married these sounds to computer functions. Our ability to aim those taps without looking relies on what is technically called “proprioception”, our sense of how our body is configured in space.
Harrison and his team have created their own bio-acoustic sensing array that is worn on the arm, meaning that no electronics are attached to the skin. Harrison explains that when a finger taps the body, bone density, soft tissue, joint proximity and other factors affect the sound the impact makes. The software he has created recognizes these different acoustic patterns and interprets them as function commands.
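Once a tap has been recognized, turning it into a function command can be as simple as a lookup table. The sketch below is hypothetical; the locations and actions are invented for illustration.

```python
# Hypothetical command table: map each recognized tap location to an action.
COMMANDS = {
    "thumb":   lambda: print("play/pause"),
    "index":   lambda: print("next track"),
    "forearm": lambda: print("volume up"),
}

def dispatch(location):
    """Run the function bound to the recognized tap location, if any."""
    action = COMMANDS.get(location)
    if action is not None:
        action()

dispatch("index")  # prints "next track"
```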
Harrison’s research paper, co-authored by Desney Tan and Dan Morris from Microsoft Research and titled *Skinput: Appropriating the Body as an Input Surface*, will appear in the Proceedings of the 28th Annual SIGCHI Conference on Human Factors in Computing Systems (Atlanta, Georgia) in April.