- Tactile and Sensory Interactions
- Hand Gesture Recognition Systems
- Music and Audio Processing
- Music Technology and Sound Studies
- Gaze Tracking and Assistive Technology
META Health
2024
Enabling computing systems to understand user interactions with everyday surfaces and objects can drive a wide range of applications. However, existing vibration-based sensors (e.g., accelerometers) lack the sensitivity to detect light touch gestures or the bandwidth to recognize activity containing high-frequency components. Conversely, microphones are highly susceptible to environmental noise, degrading performance. Each time an object impacts a surface, Surface Acoustic Waves (SAWs) are generated that...
AR/VR devices have started to adopt hand tracking, in lieu of controllers, to support user interaction. However, today's input methods rely primarily on one gesture: pinch. Moreover, current motion mappings for use cases like VR locomotion and content scrolling involve larger, more complex arm motions than joystick or trackpad usage. STMG increases the gesture space by recognizing additional small thumb-based microgestures from skeletal tracking running on a headset. We take a machine learning approach to achieve...