A Gaze-Assisted Multimodal Approach to Rich and Accessible Human-Computer Interaction

Multimodal Interaction
DOI: 10.48550/arxiv.1803.04713
Publication Date: 2018-01-01
ABSTRACT
Recent advancements in eye tracking technology are driving the adoption of gaze-assisted interaction as a rich and accessible human-computer interaction paradigm. For users without disabilities, gaze-assisted interaction serves as a contextual, non-invasive, and explicit control method; for users with motor or speech impairments, text entry by gaze is often the primary means of communication. Despite these significant advantages, gaze-assisted interaction is still not widely accepted because of its inherent limitations: 1) the Midas touch problem, 2) low accuracy for mouse-like interactions, 3) the need for repeated calibration, 4) visual fatigue with prolonged usage, 5) lower typing speed, and so on. This dissertation research proposes a gaze-assisted, multimodal interaction paradigm, together with related frameworks and applications, that effectively enables gaze-assisted interactions while addressing many of the current limitations. In this regard, we present four systems that leverage gaze-assisted interaction: 1) a gaze- and foot-operated system for precise point-and-click interactions, 2) a dwell-free gaze typing system, 3) a gaze gesture-based authentication system, and 4) an interaction toolkit. In addition, we also present the goals to be achieved, the technical approach, and the overall contributions of this research.
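
As a minimal illustration of the multimodal idea behind the gaze- and foot-operated point-and-click system, the Python sketch below pairs smoothed gaze pointing with an explicit foot-switch confirmation instead of a dwell timeout, which sidesteps the Midas touch problem (merely looking at a target no longer activates it). All names, sample values, and the foot-switch flag are hypothetical placeholders, not the dissertation's actual implementation.

```python
# Sketch: gaze positions the cursor, an explicit trigger (e.g., a foot switch) clicks.
# Hypothetical simulation only; no real eye-tracker API is used.

from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple


@dataclass
class GazeSample:
    x: float  # screen x in pixels
    y: float  # screen y in pixels


def smooth_gaze(samples: List[GazeSample], window: int = 5) -> Tuple[float, float]:
    """Average the last `window` samples to reduce eye-tracker jitter."""
    recent = samples[-window:]
    n = len(recent)
    return (sum(s.x for s in recent) / n, sum(s.y for s in recent) / n)


def target_under_cursor(
    cursor: Tuple[float, float], targets: Dict[str, Tuple[float, float, float, float]]
) -> Optional[str]:
    """Return the name of the target (x, y, width, height) under the cursor, if any."""
    cx, cy = cursor
    for name, (x, y, w, h) in targets.items():
        if x <= cx <= x + w and y <= cy <= y + h:
            return name
    return None


if __name__ == "__main__":
    # Simulated gaze stream hovering near an on-screen "OK" button.
    stream = [GazeSample(400 + i % 3, 300 - i % 2) for i in range(20)]
    buttons = {"OK": (380, 280, 60, 40), "Cancel": (500, 280, 60, 40)}

    cursor = smooth_gaze(stream)   # gaze only moves the cursor
    foot_switch_pressed = True     # activation requires this explicit input
    if foot_switch_pressed:
        print("Activated:", target_under_cursor(cursor, buttons))  # -> Activated: OK
```

Because activation is decoupled from fixation, dwell duration can be reserved for other purposes (or ignored entirely), and unintended selections during visual search are avoided.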