Researchers have found that people can judge emotions quite accurately by observing the movements of a person’s head, even without seeing their face or hearing their voice.
When talking or singing, people usually move their heads to emphasize what they are saying or singing. In crowded or loud places, seeing how a person nods or tilts their head can give us a good idea of their emotional state. In fact, the absence of these movements can be unsettling: imagine a singer on stage not moving their head at all. No matter how well they sang, you would feel that something was missing. Conversely, even if you cannot hear what is being sung but can see the performance, you can still get an idea of what kind of song it is.
Researchers believe the visual information about emotional state conveyed by head movements could help improve interaction between humans and robots. Expressive machines could enhance the human experience in settings where face-to-face communication is necessary, such as receptionist roles or care robots for the ill or elderly. A robot could better understand the emotional state of a human and, in return, display emotions through corresponding head movements. The discovery could also help automate the recognition of emotional states in crowds, or aid people with hearing impairments.