SocialMind: LLM-based Proactive AR Social Assistive System with Human-like Perception for In-situ Live Interactions
DOI:
10.1145/3712286
Publication Date:
2025-03-04T17:10:14Z
AUTHORS (7)
ABSTRACT
Social interactions are fundamental to human life. The recent emergence of large language model (LLM)-based virtual assistants has demonstrated their potential to revolutionize human work and lifestyles. However, existing assistive systems mainly provide reactive services to individual users, rather than offering in-situ assistance during live social interactions with conversational partners. In this study, we introduce SocialMind, the first LLM-based proactive AR social assistive system that provides users with in-situ social assistance. SocialMind employs human-like perception, leveraging multi-modal sensors to extract both verbal and nonverbal cues, social factors, and implicit personas, and incorporates these cues into LLM reasoning for social suggestion generation. Additionally, a multi-tier collaborative generation strategy and update mechanism display suggestions on Augmented Reality (AR) glasses, ensuring that suggestions are provided in a timely manner without disrupting the natural flow of conversation. Evaluations on three public datasets and a user study with 20 participants show that SocialMind achieves 38.3% higher engagement compared to baselines, and that 95% of participants are willing to use it in live social interactions.
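The abstract describes a perception-to-suggestion pipeline: multi-modal cues are extracted, folded into an LLM prompt, and suggestions are generated under latency constraints via a multi-tier strategy. Below is a minimal Python sketch of that flow under stated assumptions; every function and name here is hypothetical and does not reflect the paper's actual implementation.

```python
# Hypothetical sketch of the pipeline described in the abstract.
# All names (extract_cues, build_prompt, multi_tier_generate) are
# assumptions for illustration, not the paper's API.

def extract_cues(transcript, nonverbal_signals):
    """Bundle verbal and nonverbal cues from multi-modal sensing."""
    verbal = {"transcript": transcript}            # e.g., ASR output
    nonverbal = dict(nonverbal_signals or {})      # e.g., gaze, gestures
    return verbal, nonverbal

def build_prompt(verbal, nonverbal, persona):
    """Incorporate cues, social factors, and an implicit persona into a prompt."""
    return (
        f"Conversation so far: {verbal['transcript']}\n"
        f"Nonverbal cues: {nonverbal}\n"
        f"Partner persona: {persona}\n"
        "Suggest a brief, socially appropriate response."
    )

def multi_tier_generate(prompt, fast_llm, strong_llm, latency_budget_s=1.0):
    """Multi-tier generation: a fast model drafts a suggestion first;
    a stronger model refines it only when the latency budget allows."""
    draft = fast_llm(prompt)
    if latency_budget_s > 0.5:  # assumed threshold for illustration
        return strong_llm(prompt + f"\nDraft suggestion: {draft}\nRefine it.")
    return draft

# Demo with stub models standing in for real LLM calls.
if __name__ == "__main__":
    verbal, nonverbal = extract_cues("Nice to meet you!", {"gaze": "partner"})
    prompt = build_prompt(verbal, nonverbal, "friendly colleague")
    print(multi_tier_generate(prompt, lambda p: "draft reply",
                              lambda p: "refined reply"))
```

The tiering mirrors the timeliness requirement in the abstract: a cheap draft guarantees an on-time suggestion on the AR display, while refinement is opportunistic.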