iFAD Gestures: Understanding Users' Gesture Input Performance with Index-Finger Augmentation Devices
Radu-Daniel Vatavu
CHI 2023: The ACM CHI Conference on Human Factors in Computing Systems
Session: Pointing and Icons
We examine gestures performed with a class of input devices that occupies a distinctive position in the wearables landscape, which we call "index-Finger Augmentation Devices" (iFADs). We introduce a four-level taxonomy to characterize the diversity of iFAD gestures, evaluate iFAD gesture articulation on a dataset of 6,369 gestures collected from 20 participants, and compute recognition accuracy rates. Our findings show that iFAD gestures are fast (1.84s on average), easy to articulate (1.52 average rating on a difficulty scale from 1 to 5), and socially acceptable (81% willingness to use them in public places). We compare iFAD gestures with gestures performed using other devices (styli, touchscreens, game controllers) from several public datasets (39,263 gestures, 277 participants), and report that iFAD gestures are twice as fast as whole-body gestures and as fast as stylus and finger strokes performed on touchscreens.
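The reported articulation times (e.g., 1.84s on average for iFAD gestures) can be reproduced from timestamped gesture logs by measuring the elapsed time between the first and last sampled point of each gesture. A minimal sketch, assuming each gesture is stored as a list of (x, y, timestamp-in-milliseconds) tuples; the data layout and function names are illustrative and not the paper's actual analysis pipeline:

```python
from statistics import mean

def articulation_time_s(points):
    """Articulation time of one gesture in seconds, from first to last timestamp (ms)."""
    timestamps = [t for _, _, t in points]
    return (max(timestamps) - min(timestamps)) / 1000.0

def average_articulation_time(gestures):
    """Mean articulation time across a list of gestures."""
    return mean(articulation_time_s(g) for g in gestures)

# Illustrative data: two short gestures sampled as (x, y, t_ms) points.
gestures = [
    [(0, 0, 0), (5, 2, 450), (9, 4, 910)],     # ~0.91 s
    [(1, 1, 0), (4, 6, 600), (8, 8, 1230)],    # ~1.23 s
]
print(f"Average articulation time: {average_articulation_time(gestures):.2f} s")
```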
Web: https://programs.sigchi.org/chi/2023/...