An Integrated Framework for Understanding Multimodal Embodied Experiences in Interactive Virtual Reality
Keywords: immersion, embodied experiences, 3D environments, user experience analysis, task modeling, interaction, navigation, scene ontology
Domain: Computer Science [cs] / Human-Computer Interaction [cs.HC]
DOI: 10.1145/3573381.3596150
Publication Date: 2023-08-29
AUTHORS (6)
ABSTRACT
Virtual Reality (VR) technology enables "embodied interactions" in realistic environments where users can move and interact freely, engaging deep physical and emotional states. However, a comprehensive understanding of the embodied user experience is currently limited by the extent to which one can make relevant observations, and by the accuracy with which those observations can be interpreted. Paul Dourish proposed a way forward through the characterisation of embodied interactions in three senses: ontology, intersubjectivity, and intentionality. In a joint effort between computer scientists and neuroscientists, we built a framework for designing studies that investigate multimodal embodied experiences in VR, and applied it to study the impact of simulated low vision on user navigation. Our methodology involves designing 3D scenarios annotated with an ontology, modelling intersubjective tasks, and correlating multimodal metrics such as gaze and physiology to derive intentions. We show how this framework enables a more fine-grained understanding of embodied interactions in behavioural research.
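To make the methodology concrete, here is a minimal, hypothetical sketch of the kind of analysis the abstract describes: scene objects are annotated with ontology labels, and a gaze metric (dwell time) is correlated with a physiological signal. All names, object identifiers, and sample values below are illustrative assumptions, not data or code from the paper.

```python
# Hypothetical sketch: joining gaze dwell time on ontology-labelled
# scene objects with a physiological signal (e.g. skin conductance),
# then computing their correlation. Values are invented for illustration.

from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Scene ontology annotation: object id -> semantic label (assumed labels)
ontology = {"obj1": "door", "obj2": "sign", "obj3": "stairs"}

# Multimodal samples: (gazed object id, dwell time [s], skin conductance)
samples = [
    ("obj1", 0.8, 0.31), ("obj2", 1.5, 0.52),
    ("obj3", 2.1, 0.77), ("obj1", 0.5, 0.25),
]

dwell = [s[1] for s in samples]
arousal = [s[2] for s in samples]
labels = [ontology[s[0]] for s in samples]

r = pearson(dwell, arousal)
print(f"gaze-physiology correlation r = {r:.2f}")
print("ontology labels gazed:", labels)
```

In an actual study the two signal streams would first be aligned on a common timeline and segmented by task; the ontology labels then let the correlation be broken down per semantic object class rather than computed globally.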
REFERENCES (53)
CITATIONS (6)