Air pointing: Design and evaluation of spatial target acquisition with and without visual feedback

Date
2011
Publisher
Elsevier
Abstract
Sensing technologies such as inertial tracking and computer vision enable spatial interactions where users make selections by ‘air pointing’: moving a limb, finger, or device to a specific spatial region. In addition to expanding the vocabulary of possible interactions, air pointing brings the potential benefit of enabling ‘eyes-free’ interaction, where users rely on proprioception and kinaesthesia rather than vision. This paper explores the design space for air pointing interactions and presents tangible results in the form of a framework that helps designers understand input dimensions and the resulting interaction qualities. The framework provides a set of fundamental concepts that aid in thinking about the air pointing domain, in characterizing and comparing existing solutions, and in evaluating novel techniques. We carry out an initial investigation to demonstrate the concepts of the framework by designing and comparing three air pointing techniques: one based on small angular ‘raycasting’ movements, one on large movements across a 2D plane, and one on movements within a 3D volume. Results show that large movements on the 2D plane are both rapid (selection times under 1 s) and accurate, even without visual feedback. Raycasting is rapid but inaccurate, and the 3D volume is expressive but slow, inaccurate, and effortful. Many other findings emerge, such as selection point ‘drift’ in the absence of feedback. These results and the organising framework provide a foundation for innovation in, and understanding of, air pointing interaction.