Can an android’s posture and movement discriminate against the ambiguous emotion perceived from its facial expressions?
TOPICS
Emotional expression, Expression (computer science), Modalities, Facial muscles
DOI:
10.1371/journal.pone.0254905
Publication Date:
2021-08-10T17:34:39Z
AUTHORS (4)
ABSTRACT
Expressing emotions through various modalities is a crucial function not only for humans but also for robots. The mapping method from facial expressions to basic emotions is widely used in research on robot emotional expressions. This method claims that there are specific muscle activation patterns for each emotional expression, and that people can perceive these emotions by reading those patterns. However, recent research on human behavior reveals that some expressions, such as the emotion “intense”, are difficult to judge as positive or negative from the facial expression alone. Nevertheless, it has not been fully investigated whether robots can express such ambiguous emotions with no clear valence, or whether the addition of body postures and movements can make them clearer to humans. This paper shows that an android’s emotion can be perceived more clearly by viewers when postures and movements are added to its facial expression. We conducted three experiments using online surveys among North American residents, recruiting 94 and 114 participants. In Experiment 1, by calculating entropy, we found that the emotion “intense” was difficult to judge as positive or negative when participants were shown the facial expression alone. In Experiments 2 and 3, analysis by ANOVA confirmed that participants were better at judging the valence when shown the whole body of the android, even though the facial expression was the same as in Experiment 1. These results suggest that the face and the body should be designed jointly to achieve clearer emotional communication. In order to realize smoother cooperative human-robot interaction, such as in education by robots, emotions conveyed by a combination of both the face and the body may be necessary to communicate the robot’s intentions and desires.
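The abstract describes using entropy to quantify how ambiguous viewers’ valence judgments were: when responses to an expression are evenly split between “positive” and “negative”, entropy is maximal, and it falls as viewers converge on one judgment. A minimal sketch of this idea, assuming Shannon entropy over categorical responses (the paper’s exact formulation is not given in this abstract, and the sample judgments below are hypothetical):

```python
import math
from collections import Counter

def response_entropy(labels):
    """Shannon entropy (in bits) of a distribution of categorical judgments.

    Higher entropy means viewers disagreed more, i.e. the expression was
    perceived as more ambiguous. This is a sketch of the general technique;
    the paper's exact computation is not specified in the abstract.
    """
    counts = Counter(labels)
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical valence judgments of the "intense" face alone vs. with body
face_only = ["positive", "negative", "positive", "negative"]  # split 50/50
face_body = ["negative", "negative", "negative", "positive"]  # mostly agree

print(response_entropy(face_only))  # 1.0 bit: maximally ambiguous for 2 labels
print(response_entropy(face_body))  # ~0.81 bits: less ambiguous
```

With two possible labels, entropy ranges from 0 (perfect agreement) to 1 bit (an even split), so a drop in entropy after adding posture and movement would indicate that the android’s emotion became easier to judge.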