Understanding Emotional Body Expressions via Large Language Models
DOI: 10.1609/aaai.v39i2.32135
Publication Date: 2025-04-11T09:37:36Z
AUTHORS (6)
ABSTRACT
Emotion recognition based on body movements is vital in human-computer interaction. However, existing emotion recognition methods predominantly focus on enhancing classification accuracy, often neglecting the provision of textual explanations to justify their classifications. In this paper, we propose an Emotion-Action Interpreter powered by Large Language Model (EAI-LLM), which not only recognizes emotions but also generates textual explanations by treating 3D movement data as unique input tokens within large language models (LLMs). Specifically, a multi-granularity skeleton tokenizer designed for LLMs separately extracts spatio-temporal tokens and semantic tokens from the movement data. This approach allows LLMs to generate more nuanced descriptions while maintaining robust classification performance. Furthermore, we treat the skeleton sequence as a specific language and propose a unified skeleton token module. This module leverages the extensive background knowledge and language processing capabilities of LLMs to address the challenges of jointly training on heterogeneous datasets, thereby significantly improving accuracy on individual datasets. Experimental results demonstrate that our model achieves recognition accuracy comparable to existing methods. More importantly, with the support of LLMs, the model can generate detailed descriptions of its results, even when trained on a limited amount of labeled data.
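The abstract describes feeding skeleton-derived tokens of two granularities (per-frame spatio-temporal tokens and clip-level semantic tokens) into an LLM alongside text. The paper's actual architecture is not reproduced here; the following is a minimal sketch of that idea under stated assumptions. All module names, dimensions, the two-branch design, and the attention-pooling step are illustrative choices, not the authors' implementation.

```python
# Minimal sketch (assumptions, not the EAI-LLM code): map a 3D skeleton clip
# (T frames, J joints, 3 coords) to LLM-sized "skeleton tokens" that could be
# prepended to embedded text before an LLM decoder.
import torch
import torch.nn as nn


class MultiGranularitySkeletonTokenizer(nn.Module):
    """Produces per-frame spatio-temporal tokens plus a few semantic summary tokens."""

    def __init__(self, num_joints: int = 25, llm_dim: int = 768,
                 num_semantic_tokens: int = 4):
        super().__init__()
        # Spatio-temporal branch: each frame's joint layout -> one token per frame.
        self.frame_proj = nn.Sequential(
            nn.Linear(num_joints * 3, 256), nn.GELU(), nn.Linear(256, llm_dim)
        )
        # Semantic branch: learned queries attend over frame tokens to yield
        # a small set of clip-level tokens.
        self.semantic_queries = nn.Parameter(torch.randn(num_semantic_tokens, llm_dim))
        self.semantic_attn = nn.MultiheadAttention(llm_dim, num_heads=8, batch_first=True)

    def forward(self, skeleton: torch.Tensor) -> torch.Tensor:
        # skeleton: (batch, T, J, 3)
        b, t, j, c = skeleton.shape
        frame_tokens = self.frame_proj(skeleton.reshape(b, t, j * c))    # (b, T, llm_dim)
        queries = self.semantic_queries.unsqueeze(0).expand(b, -1, -1)   # (b, S, llm_dim)
        semantic_tokens, _ = self.semantic_attn(queries, frame_tokens, frame_tokens)
        # Concatenate both granularities into one skeleton-token sequence.
        return torch.cat([frame_tokens, semantic_tokens], dim=1)         # (b, T+S, llm_dim)


if __name__ == "__main__":
    tokenizer = MultiGranularitySkeletonTokenizer()
    clip = torch.randn(2, 30, 25, 3)      # 2 clips, 30 frames, 25 joints
    skeleton_tokens = tokenizer(clip)
    # In an EAI-LLM-style pipeline, these tokens would be concatenated with the
    # embedded text prompt (e.g. "Describe the emotion of this movement:")
    # before running the LLM, which then emits both a label and an explanation.
    print(skeleton_tokens.shape)          # torch.Size([2, 34, 768])
```

The design choice illustrated is only the token-interface idea: the skeleton sequence is treated as another "language" whose tokens share the LLM's embedding width, which is what allows joint training across heterogeneous skeleton datasets with a single model.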