Tap sensor takes touch to the next level

CARNEGIE MELLON (US) — A touchscreen sensor uses sound to distinguish between the tap of a fingertip, finger pad, fingernail, and knuckle.

By taking greater advantage of the finger’s anatomy and dexterity, TapSense could change the way smartphone and tablet computer owners control their touchscreens.

While typing on a virtual keyboard, for instance, users might capitalize letters simply by tapping with a fingernail instead of a fingertip, or might switch to numerals by using the pad of a finger, rather than toggling to a different set of keys.

https://www.youtube.com/watch?v=-oN96cucBr4


Another possible use would be a painting app that uses a variety of tapping modes and finger motions to control a palette of colors, or that allows users to switch between drawing and erasing without pressing buttons.

“TapSense basically doubles the input bandwidth for a touchscreen,” says Chris Harrison, a Ph.D. student at Carnegie Mellon University. “This is particularly important for smaller touchscreens, where screen real estate is limited. If we can remove mode buttons from the screen, we can make room for more content or can make the remaining buttons larger.”

TapSense, which uses a microphone to distinguish tap sounds, was developed by Harrison, fellow Ph.D. student Julia Schwarz, and Scott Hudson, a professor in Carnegie Mellon's Human-Computer Interaction Institute (HCII). Harrison will discuss the technology at the Association for Computing Machinery's Symposium on User Interface Software and Technology in Santa Barbara, Calif.

“TapSense can tell the difference between different parts of the finger by classifying the sounds they make when they strike the touchscreen,” Schwarz says.

An inexpensive microphone could be readily attached to a touchscreen for this purpose. The microphones already in devices for phone conversations would not work well for the application, however, because they are designed to capture voices, not the sort of noise that TapSense needs to operate.
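The article doesn't detail the classifier itself, but the idea of sorting tap types by their acoustic signature can be sketched simply. The snippet below is a hypothetical illustration, not the researchers' actual method: it synthesizes two kinds of "taps" (a low, thuddy pad strike and a sharp, high-pitched nail click, both modeled as damped sinusoids), summarizes each with a single spectral-centroid feature, and classifies new taps by nearest class average. All function names, frequencies, and the feature choice here are assumptions for illustration.

```python
import numpy as np

def spectral_centroid(signal, rate=44100):
    """Frequency-weighted mean of the magnitude spectrum:
    a crude proxy for how 'sharp' a tap sounds."""
    mags = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / rate)
    return float(np.sum(freqs * mags) / np.sum(mags))

def make_tap(freq, rate=44100, dur=0.02):
    """Hypothetical tap: a short decaying sinusoid at the given frequency."""
    t = np.arange(int(rate * dur)) / rate
    return np.sin(2 * np.pi * freq * t) * np.exp(-t * 200)

# "Training": average the centroid feature over example taps of each type.
rng = np.random.default_rng(0)
class_means = {}
for label, base_freq in [("pad", 300), ("nail", 3000)]:
    feats = [spectral_centroid(make_tap(base_freq + rng.normal(0, 30)))
             for _ in range(20)]
    class_means[label] = np.mean(feats)

def classify(signal):
    """Assign the tap to the class whose mean feature is nearest."""
    f = spectral_centroid(signal)
    return min(class_means, key=lambda k: abs(class_means[k] - f))

print(classify(make_tap(2900)))  # a sharp, nail-like tap
```

A real system would use richer features (e.g., a full spectrogram) and a trained classifier, but the pipeline shape, tap detection, feature extraction, then classification, is the same.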

The technology also can use sound to discriminate between passive tools (i.e., no batteries) made from such materials as wood, acrylic and polystyrene foam. This would enable people using styluses made from different materials to collaboratively sketch or take notes on the same surface, with each person’s contributions appearing in a different color or otherwise noted.

The researchers found that their proof-of-concept system was able to distinguish between the four types of finger inputs with 95 percent accuracy, and could distinguish between a pen and a finger with 99 percent accuracy.

More news from Carnegie Mellon University: www.cmu.edu/news