Emotion recognition based on phoneme classes

DOI: 10.21437/interspeech.2004-322
Publication Date: 2021-08-24T07:06:48Z
ABSTRACT
Recognizing human emotions/attitudes from speech cues has gained increased attention recently. Most previous work has focused primarily on suprasegmental prosodic features computed at the utterance level; comparatively little attention has been paid to details at the segmental, phoneme level. Based on the hypothesis that different emotions affect the properties of different speech sounds in different ways, this paper investigates the usefulness of phoneme-level modeling for classifying emotional states from speech. Hidden Markov models (HMMs) based on short-term spectral features are used for this purpose, with data obtained from recordings of an actress expressing four emotional states: anger, happiness, neutral, and sadness. We designed and compared two sets of HMM classifiers: a generic set of "emotional speech" HMMs (one per emotion), and a set of broad phonetic-class based HMMs for each emotion considered. Five broad phonetic classes were used to explore the effect of emotional coloring on different phoneme classes, and the spectral properties of vowel sounds were found to be the best indicators of emotion in terms of classification performance. The experiments also showed that the phoneme-class classifiers outperform both the generic "emotional" HMM classifier and classifiers based on global prosodic features. To examine the complementary effect of the prosodic and spectral features, the two classifiers were combined at the decision level; the combination improved on the phoneme-class based HMM classifier by 0.55% absolute.
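
The paper does not include code, but the pipeline described in the abstract (per-emotion HMMs trained on short-term spectral features of one broad phonetic class, followed by decision-level fusion with a prosodic classifier) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature layout, the helper names (train_class_hmms, spectral_scores, fuse_and_decide), the use of hmmlearn's GaussianHMM, the number of HMM states, and the fusion weight are all assumptions.

# Minimal sketch (not the authors' code): one Gaussian HMM per emotion, trained on
# MFCC frames taken from a single broad phonetic class (e.g. vowels), plus a simple
# decision-level fusion with scores from a separate prosodic classifier.
import numpy as np
from hmmlearn.hmm import GaussianHMM

EMOTIONS = ["anger", "happiness", "neutral", "sadness"]

def train_class_hmms(train_segments, n_states=3):
    """train_segments: dict emotion -> list of (T_i, n_mfcc) arrays,
    each holding MFCC frames from vowel segments of that emotion (assumed format)."""
    models = {}
    for emo in EMOTIONS:
        segments = train_segments[emo]
        X = np.vstack(segments)                 # stack all segments into one matrix
        lengths = [len(s) for s in segments]    # per-segment lengths for hmmlearn
        hmm = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=20)
        hmm.fit(X, lengths)
        models[emo] = hmm
    return models

def spectral_scores(models, test_frames):
    """Per-frame average log-likelihood of the test utterance's vowel frames
    under each emotion's HMM."""
    return {emo: m.score(test_frames) / len(test_frames) for emo, m in models.items()}

def fuse_and_decide(spec_scores, prosodic_scores, w=0.8):
    """Decision-level fusion: weighted sum of z-normalised spectral and prosodic
    scores; prosodic_scores comes from a separate utterance-level classifier
    (assumed to be available). Returns the winning emotion label."""
    def znorm(d):
        v = np.array([d[e] for e in EMOTIONS])
        return dict(zip(EMOTIONS, (v - v.mean()) / (v.std() + 1e-9)))
    s, p = znorm(spec_scores), znorm(prosodic_scores)
    fused = {e: w * s[e] + (1 - w) * p[e] for e in EMOTIONS}
    return max(fused, key=fused.get)

In the paper, the vowel-class HMMs were the strongest single classifier, and the decision-level combination with global prosodic features gave the reported 0.55% absolute improvement over the phoneme-class HMM classifier alone; the fusion weight above is purely illustrative.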