Can You Move These Over There? An LLM-based VR Mover for Supporting Object Manipulation
FOS: Computer and information sciences
Computation and Language (cs.CL)
Artificial Intelligence (cs.AI)
Emerging Technologies (cs.ET)
Human-Computer Interaction (cs.HC)
DOI:
10.48550/arxiv.2502.02201
Publication Date:
2025-01-01
AUTHORS (7)
ABSTRACT
64 pages (30 in main text), 22 figures (19 in main text)
In our daily lives, we naturally convey instructions for the spatial manipulation of objects using words and gestures. Transposing this form of interaction to virtual reality (VR) object manipulation can be beneficial. We propose VR Mover, an LLM-empowered solution that understands and interprets the user's vocal instructions to support object manipulation. By simply pointing and speaking, the user can direct the LLM to manipulate objects without structured input. Our user study demonstrates that VR Mover improves usability, overall experience, and performance on multi-object manipulation, while also reducing workload and arm fatigue. Users prefer the proposed natural interface for broad movements and may complementarily switch to gizmos or virtual hands for finer adjustments. We believe these findings offer design implications for future LLM-based object manipulation interfaces, highlighting the potential for more intuitive and efficient user interaction in VR environments.
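The abstract describes turning a pointing gesture plus a spoken instruction into an object manipulation without structured input. One plausible way to realize this is LLM function calling: the speech is transcribed, deictic words ("this", "there") are resolved from pointing data, and the LLM emits a structured call that the VR runtime executes. The sketch below illustrates that dispatch step under stated assumptions; the `Scene` class, the `move_object` schema, and the JSON shape are hypothetical, not the paper's actual implementation.

```python
import json
from dataclasses import dataclass, field

@dataclass
class Scene:
    """Minimal stand-in for a VR scene: object id -> (x, y, z) position."""
    objects: dict = field(default_factory=dict)

    def move_object(self, object_id: str, position: list) -> None:
        # Apply a move requested via the LLM's structured function call.
        self.objects[object_id] = tuple(position)

def apply_llm_call(scene: Scene, llm_response: str) -> None:
    """Dispatch a JSON function call, as an LLM tool-use API might emit one.

    Assumed shape: {"name": ..., "arguments": {...}} (illustrative only).
    """
    call = json.loads(llm_response)
    if call["name"] == "move_object":
        args = call["arguments"]
        scene.move_object(args["object_id"], args["position"])

# Example: the user points at "chair_1" and says "move this over there";
# upstream components resolve "this" to an object id and "there" to a
# coordinate from the pointing ray, and the LLM emits the call below.
scene = Scene(objects={"chair_1": (0.0, 0.0, 0.0)})
response = json.dumps({
    "name": "move_object",
    "arguments": {"object_id": "chair_1", "position": [2.0, 0.0, 1.5]},
})
apply_llm_call(scene, response)
print(scene.objects["chair_1"])  # (2.0, 0.0, 1.5)
```

Keeping the LLM's output as a small, validated function-call vocabulary (rather than free-form scene edits) is what lets the runtime apply commands safely while the user speaks naturally.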