Designing User-, Hand-, and Handpart-Aware Tabletop Interactions with the TOUCHID Toolkit
Recent work in multi-touch tabletop interaction introduced many novel techniques that let people manipulate digital content through touch. Yet most only detect touch blobs. This ignores richer interactions that would be possible if we could identify (1) which hand, (2) which part of the hand, (3) which side of the hand, and (4) which person is actually touching the surface. Fiduciary-tagged gloves were previously introduced as a simple but reliable technique for providing this information. The problem is that their low-level programming model hinders developers from rapidly exploring new kinds of user- and handpart-aware interactions. We contribute the TOUCHID toolkit to solve this problem. It allows rapid prototyping of expressive multi-touch interactions that exploit the aforementioned characteristics of touch input. TOUCHID provides an easy-to-use event-driven API. It also provides higher-level tools that facilitate development: a glove configurator to rapidly associate particular glove parts to handparts; and a posture configurator and gesture configurator for registering new hand postures and gestures for the toolkit to recognize. We illustrate TOUCHID's expressiveness by showing how we developed a suite of techniques (which we consider a secondary contribution) that exploits knowledge of which handpart is touching the surface.
Design, Human Factors
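To make the idea of an event-driven API over identity-rich touch input concrete, the following is a minimal sketch, not the actual TOUCHID API: all class and method names (`TouchEvent`, `TouchDispatcher`, `on_touch`, `dispatch`) are hypothetical. It shows the core pattern the abstract describes, where each touch event carries user, hand, and handpart identity, and handlers subscribe only to the combinations they care about.

```python
# Hypothetical sketch of an event-driven, identity-aware touch API.
# Names and structure are assumptions for illustration, not TOUCHID's API.
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class TouchEvent:
    user: str       # which person is touching (e.g. "alice")
    hand: str       # "left" or "right"
    handpart: str   # e.g. "index_finger", "palm", "fist"
    x: float
    y: float

class TouchDispatcher:
    def __init__(self):
        self._handlers = []

    def on_touch(self, handler: Callable[[TouchEvent], None],
                 user: Optional[str] = None,
                 hand: Optional[str] = None,
                 handpart: Optional[str] = None):
        """Register a handler; None acts as a wildcard for that field."""
        self._handlers.append((user, hand, handpart, handler))

    def dispatch(self, event: TouchEvent):
        # Fire every handler whose filter matches the event's identity.
        for user, hand, handpart, handler in self._handlers:
            if user not in (None, event.user):
                continue
            if hand not in (None, event.hand):
                continue
            if handpart not in (None, event.handpart):
                continue
            handler(event)

# Usage: only Alice's right palm triggers an eraser-style action.
dispatcher = TouchDispatcher()
erased = []
dispatcher.on_touch(lambda e: erased.append((e.x, e.y)),
                    user="alice", hand="right", handpart="palm")
dispatcher.dispatch(TouchEvent("alice", "right", "palm", 10.0, 20.0))
dispatcher.dispatch(TouchEvent("bob", "right", "palm", 5.0, 5.0))
# erased now holds only Alice's palm touch
```

The filtering step is what distinguishes this style of API from blob-only touch frameworks: a palm touch and an index-finger touch from the same person can drive entirely different behaviors.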