It is likely that you, a human, can tell when your fellow humans are upset based on the sound of their voice. You might even be able to tell when your non-human pet is upset. But what about non-mammals, like frogs? What about birds?
As far back as Darwin, folks have thought that the way animals vocalize could offer hints to their emotions. Past research seems to demonstrate that universal emotional signals exist among mammals, but a new study from an international team of researchers takes it even further. Its results could have important implications for the understanding of animal emotions, or even for designing artificial intelligence.
“We know animals can decode emotions in other species but the only species that we could rely on have been mammals,” study author Piera Filippi from the Artificial Intelligence Laboratory, Vrije Universiteit Brussel in Belgium told Gizmodo. In this new study, “we showed humans can understand emotional arousal in nine different species.”
Emotional arousal might sound dirty, but the researchers’ definition was simple:
“Arousal is a state of the brain or the body reflecting responsiveness to sensory stimulation. Arousal level typically ranges from low (very subdued) to high (very excited). Examples of low arousal states (i.e. of low responsiveness to sensory stimulation) are calmness or boredom. Examples of high arousal states (i.e. of high responsiveness to sensory stimulation) are anger or excitement.”
With this information in mind, 25 native English speakers, 25 native Mandarin speakers, and 25 native German speakers listened to vocalizations from nine different species: the hourglass tree frog, African bush elephant, giant panda, domestic pig, Barbary macaque, American alligator, common raven, black-capped chickadee, and human. They simply needed to compare vocalizations from members of the same species and determine which noise corresponded to a higher state of arousal.
Regardless of their native language, participants answered the prompts correctly more than 85 percent of the time for every animal except the pig, raven, and macaque (which they got right around 60 to 70 percent of the time), according to the study published today in the Proceedings of the Royal Society B.
This is a first step, of course. All of the emotional noises represented animals in a state of distress, as opposed to positive emotional arousal, Filippi pointed out. But the results are clear: you can probably tell when an animal is upset based on the sounds it’s making.
The question remains just how universal these emotional sounds are, and what they mean. Co-author Bart de Boer from the Vrije Universiteit Brussel Artificial Intelligence Lab felt differing pitches between states of emotional arousal could come down to pure physics: “higher arousal leads to more muscle tension etc. and this leads (through the physics of vocal production) to louder and higher-pitched calls,” he said in an email.
Determining to what degree humans are able to discern other species’ emotional states is a next step, said Filippi. “It would be nice to have a comparison between low and high arousal not only in negative valence calls, but in positive valence calls,” when animals are in a good mood. “This would give us a more complete picture of the evolution of emotional communication.”
De Boer mentioned a couple of limitations of the work, mainly that it didn’t take loudness into account. He commended Filippi’s work, and her push to get the large, multi-country effort off the ground.
Others were impressed with the paper’s results. “In general I can say that it is an important contribution and a big step forward in the studies of emotion communication,” Tamás Faragó from the Comparative Ethology Research Group at Eotvos Lorand University in Budapest (who was not involved with the study) told Gizmodo. “This is the first study in which a wide range of evolutionarily distant species’ calls are used, and the method seems to be well thought out and the results are pretty convincing.” He did point out that the calls come from varying social contexts that could affect what they sound like and introduce a confounding factor. But still, the work offers good evidence for the hypothesis that there exist some “acoustic universals in emotion communication.”
Understanding how animal sounds act as emotional cues could have important applications, too, said Filippi. Perhaps it could help with animal welfare, or even with artificial intelligence. AI research can overlook the importance of the emotional aspects of speech recognition, she said. She’d like to see her work applied to things like speech synthesis and improving communication between humans and computers.
And who knows, if the effect really is universal, then maybe your dog/cat/frog/bird can tell when you’re pissed off, too.