Air pointing: Design and evaluation of spatial target acquisition with and without visual feedback
SUBJECT AREAS
02 engineering and technology
0202 electrical engineering, electronic engineering, information engineering
05 social sciences
0501 psychology and cognitive sciences
DOI:
10.1016/j.ijhcs.2011.02.005
Publication Date:
2011-02-26
AUTHORS (5)
ABSTRACT
Sensing technologies such as inertial tracking and computer vision enable spatial interactions where users make selections by 'air pointing': moving a limb, finger, or device to a specific spatial region. In addition to expanding the vocabulary of available interactions, air pointing brings the potential benefit of 'eyes-free' interaction, where users rely on proprioception and kinaesthesia rather than vision. This paper explores the design space for air pointing interactions and presents a framework that helps designers understand input dimensions and the resulting interaction qualities. The framework provides a set of fundamental concepts that aid in thinking about the air pointing domain, in characterizing and comparing existing solutions, and in evaluating novel techniques. We carry out an initial investigation that demonstrates the concepts of the framework by designing and comparing three air pointing techniques: one based on small angular 'raycasting' movements, one on large movements across a 2D plane, and one on movements in a 3D volume. Results show that large movements on the 2D plane are both rapid (selection times under 1 s) and accurate, even without visual feedback. Raycasting is rapid but inaccurate, and the 3D volume is expressive but slow, inaccurate, and effortful. Many other findings emerge, such as selection-point 'drift' in the absence of feedback. These results and the organising framework provide a foundation for innovation in, and understanding of, air pointing interaction.
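ILLUSTRATIVE CODE SKETCH
To make the three technique classes concrete, the Python sketch below shows one way tracked input might be discretised into grid-based selection regions. It is a minimal illustration under assumed parameters, not the authors' apparatus: the function names, the ±15° angular range for raycasting, the 0.6 m movement extents, and the 3×3 grid sizes are all hypothetical.

# Hypothetical discretisation of tracked input into selection regions for the
# three technique classes compared in the paper. All names, grid sizes, and
# ranges are illustrative assumptions, not the authors' implementation.

def _axis(value, lo, hi, cells):
    """Clamp value to [lo, hi] and return its cell index on a 1D grid."""
    t = (value - lo) / (hi - lo)
    t = min(max(t, 0.0), 1.0 - 1e-9)  # clamp, keeping the top edge in range
    return int(t * cells)

def raycast_cell(yaw_deg, pitch_deg, cols=3, rows=3, half_fov=15.0):
    """Small angular 'raycasting' movements: map wrist yaw/pitch (degrees
    from a neutral pose) onto a cols x rows grid of angular targets."""
    return (_axis(yaw_deg, -half_fov, half_fov, cols),
            _axis(pitch_deg, -half_fov, half_fov, rows))

def plane_cell(x_m, y_m, cols=3, rows=3, extent_m=0.6):
    """Large movements across a 2D plane: map hand position (metres from a
    body-centred origin) onto a cols x rows grid of planar targets."""
    return (_axis(x_m, -extent_m / 2, extent_m / 2, cols),
            _axis(y_m, -extent_m / 2, extent_m / 2, rows))

def volume_cell(x_m, y_m, z_m, n=3, extent_m=0.6):
    """Movements in a 3D volume: map hand position onto an n x n x n lattice;
    more expressive, but a third axis must be controlled without vision."""
    return tuple(_axis(p, -extent_m / 2, extent_m / 2, n)
                 for p in (x_m, y_m, z_m))

if __name__ == "__main__":
    # A wrist rotated 10 degrees right at neutral pitch selects the
    # right-middle cell of the 3x3 angular grid.
    print(raycast_cell(10.0, 0.0))         # -> (2, 1)
    print(plane_cell(0.25, -0.05))         # -> (2, 1)
    print(volume_cell(0.25, -0.05, 0.05))  # -> (2, 1, 1)

The shared clamp-and-quantise step highlights why the techniques differ in feel rather than in mapping: raycasting quantises a small angular range (amplifying hand tremor across targets), while the plane and volume techniques quantise larger translational ranges, with the volume adding a third axis that must also be held accurately without visual feedback.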