arXiv:2201.07906 [cs.CV]

The Role of Facial Expressions and Emotion in ASL

Lee Kezar, Pei Zhou

Published 2022-01-19 (Version 1)

There is little prior work quantifying the relationship between facial expressions and emotionality in American Sign Language. In this final report, we provide two methods for studying this relationship: one based on probability and one on prediction. Using a large corpus of natural signing, manually annotated with facial features and paired with lexical emotion datasets, we find many associations between emotionality and the face, and show that a simple classifier can predict the broad emotional category of what someone is signing from facial features alone.
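
The abstract names two methods, probability and prediction. As a concrete illustration only, here is a minimal sketch of what each could look like, assuming binary facial-feature annotations per signing segment and three broad emotion categories; the feature count, category labels, synthetic data, and the choice of logistic regression are hypothetical stand-ins, not the authors' actual corpus or model.

```python
# Minimal sketch (not the authors' code) of the two methods in the abstract:
# conditional probabilities of facial features given an emotion, and a simple
# classifier predicting broad emotion categories from facial features alone.
# All data below is synthetic and all names are hypothetical placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Hypothetical binary facial-feature annotations per signing segment
# (e.g. brow raise, squint, mouth corners up/down).
n_segments, n_features = 500, 12
X = rng.integers(0, 2, size=(n_segments, n_features))

# Hypothetical broad emotion categories for each segment.
emotions = np.array(["positive", "negative", "neutral"])
y = emotions[rng.integers(0, len(emotions), size=n_segments)]

# "Probability" view: how often each facial feature co-occurs with
# each emotion category, i.e. an estimate of P(feature = 1 | emotion).
for emo in emotions:
    p_feat_given_emo = X[y == emo].mean(axis=0)
    print(emo, np.round(p_feat_given_emo, 2))

# "Prediction" view: a simple classifier from facial features to emotion,
# evaluated on held-out segments.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

With real annotations in place of the synthetic data, the per-category feature frequencies give the probability view and the held-out classification report gives the prediction view.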

Related articles:
arXiv:2112.00585 [cs.CV] (Published 2021-12-01, updated 2022-03-30)
Neural Emotion Director: Speech-preserving semantic control of facial expressions in "in-the-wild" videos
arXiv:1710.06836 [cs.CV] (Published 2017-10-18)
Using Deep Convolutional Networks for Gesture Recognition in American Sign Language
arXiv:2104.01291 [cs.CV] (Published 2021-04-03)
Fingerspelling Detection in American Sign Language