How do animals feel at any given moment? Humans have long read certain well-known behaviors, such as a cat's hiss, as warnings, but in many cases we have little clue as to what is going on in an animal's head.
We now have a better idea, thanks to a Milan-based researcher who has developed an AI model that, he claims, can detect from animals' calls whether they are expressing positive or negative emotions. The deep learning model, developed by Stavros Ntalampiras and published in Scientific Reports, can recognize emotional tones across seven species of hoofed animals, including pigs, goats, and cows. It does this by picking up features the calls share, including pitch, frequency range, and tonal quality.
The analysis showed that negative calls tended to sit in the moderate-to-high frequencies, while positive calls were spread more evenly across the spectrum. High-pitched calls were particularly informative in pigs, whereas in sheep and horses the mid-range carried more weight. This suggests that animals not only share general markers of emotion, but also express them in species-specific ways.
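To make the approach concrete, here is a minimal sketch, in Python, of how call features like pitch, frequency range and tonal quality can be extracted and fed to a simple valence classifier. It is not Ntalampiras's published model: the file names, labels and choice of classifier are placeholders, and the feature set is only illustrative.

```python
# Hypothetical sketch: extracting the kinds of acoustic features described above
# (pitch, frequency range, tonal quality) and fitting a binary valence classifier.
# This is NOT the published model; file names and labels are placeholders.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def call_features(wav_path):
    """Summarize one animal call as a small feature vector."""
    y, sr = librosa.load(wav_path, sr=None)
    f0 = librosa.yin(y, fmin=60, fmax=2000, sr=sr)           # pitch track
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)  # where the energy sits
    bandwidth = librosa.feature.spectral_bandwidth(y=y, sr=sr)  # how spread out it is
    flatness = librosa.feature.spectral_flatness(y=y)         # noisy vs tonal quality
    return np.array([
        np.nanmean(f0), np.nanstd(f0),
        centroid.mean(), bandwidth.mean(),
        flatness.mean(),
    ])

# Placeholder (path, label) pairs: 1 = positive call, 0 = negative call
dataset = [("pig_grunt_01.wav", 1), ("pig_squeal_07.wav", 0)]
X = np.vstack([call_features(p) for p, _ in dataset])
y = np.array([label for _, label in dataset])

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
print(clf.predict(X))  # in practice: evaluate on held-out calls from unseen animals
```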
For scientists who have long tried to decode animal signals, this discovery of cross-species emotional markers is the latest leap forward in a field being transformed by AI.
The implications are broad. Farmers could receive earlier warnings of livestock stress, conservationists could remotely monitor the emotional wellbeing of wild populations, and zoos could respond more quickly to subtle changes in animal welfare.
The prospect of this new layer of insight into the animal world also raises ethical questions. What responsibility do humans bear once an algorithm can reliably detect that an animal is suffering? And how do we guard against over-interpretation, assuming that every sign of arousal means the same thing in every species?
Bark and buzz
Tools like the one devised by Ntalampiras are not trained to "translate" animals in a human sense, but to detect acoustic and behavioral patterns that are too subtle for us to perceive unaided.
Similar work is under way with whales. Project CETI (Cetacean Translation Initiative), a New York-based research organization, analyzes sperm whales' patterned click sequences, known as codas. Long thought to encode social meaning, these are now being mapped at scale using machine learning, revealing patterns that may correspond to individual whales' identities, affiliations, or emotional states.
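As a rough illustration of what mapping codas at scale can mean, the sketch below represents each coda by its inter-click intervals and groups similar rhythms with k-means clustering. The coda data is invented, and Project CETI's actual pipeline is far more sophisticated; this only shows the general idea of letting structure emerge from timing patterns.

```python
# Illustrative sketch only: grouping whale codas by their rhythm with k-means.
# The coda timings below are made up, and padding intervals to a fixed length
# is a simplification, not Project CETI's method.
import numpy as np
from sklearn.cluster import KMeans

# Each coda is the sequence of gaps (seconds) between successive clicks.
codas = [
    [0.20, 0.21, 0.19, 0.40],   # placeholder rhythms
    [0.21, 0.20, 0.20, 0.41],
    [0.10, 0.10, 0.10, 0.10],
    [0.11, 0.09, 0.10, 0.11],
]

def to_vector(intervals, length=4):
    """Pad or trim the inter-click intervals so every coda has the same dimensionality."""
    v = np.zeros(length)
    v[:min(length, len(intervals))] = intervals[:length]
    return v

X = np.vstack([to_vector(c) for c in codas])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # codas with similar rhythms land in the same cluster
```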

In dogs, researchers have linked facial expressions, vocalizations, and tail-wagging patterns with emotional states. One study showed that subtle shifts in the muscles of dogs' faces correspond to fear or excitement. Another found that the direction of a tail wag differs depending on whether the dog is encountering a familiar friend or a potential threat.
Dublin City University's Insight Centre for Data Analytics is developing detection collars worn by assistance dogs trained to recognize the onset of seizures in people with epilepsy. The collars use sensors to pick up the dogs' trained alerting behaviors, such as spinning.
Funded by Research Ireland, the project aims to show how AI can harness animal communication to improve safety, support timely interventions, and enhance quality of life. In future, the goal is to train models to recognize instinctive dog behaviors such as pawing, nudging, and barking.
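To give a sense of how a collar might turn raw sensor readings into an alert, here is a hypothetical sketch that flags a sustained spin from gyroscope data. The sampling rate, threshold and duration are invented values, not details of the Insight Centre's system.

```python
# A minimal, hypothetical sketch of one way a collar might flag a trained "spin"
# alert from gyroscope data. Sampling rate, axis and thresholds are assumptions.
import numpy as np

SAMPLE_HZ = 50            # assumed sensor sampling rate
SPIN_RATE_DEG_S = 120.0   # sustained yaw rate we treat as "spinning"
MIN_DURATION_S = 2.0      # how long the rotation must last to count

def detect_spin(yaw_rate_deg_s: np.ndarray) -> bool:
    """Return True if the dog turns continuously for at least MIN_DURATION_S."""
    fast = np.abs(yaw_rate_deg_s) > SPIN_RATE_DEG_S
    needed = int(MIN_DURATION_S * SAMPLE_HZ)
    run = 0
    for flag in fast:
        run = run + 1 if flag else 0
        if run >= needed:
            return True
    return False

# Example: 3 seconds of steady rotation embedded in otherwise calm readings.
calm = np.random.normal(0, 10, 5 * SAMPLE_HZ)
spin = np.full(3 * SAMPLE_HZ, 150.0)
print(detect_spin(np.concatenate([calm, spin, calm])))  # -> True, would trigger an alert
```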
Bees, too, are under the AI lens. Their complex waggle dances – figure-of-eight movements that indicate food sources – are being decoded in real time with computer vision. These models highlight how small positional shifts affect how well other bees interpret the message.
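The geometry these vision systems recover is simple enough to sketch: the dance angle from vertical corresponds to the food's bearing relative to the sun, and the duration of the waggle run correlates with distance. In the toy function below, the meters-per-second constant is a rough illustrative figure rather than a measured one.

```python
# Toy illustration of the waggle dance geometry a computer-vision tracker recovers:
# angle from vertical -> bearing relative to the sun; waggle duration -> distance.
# The metres-per-second constant is illustrative only.

def decode_waggle(dance_angle_deg: float, sun_azimuth_deg: float,
                  waggle_duration_s: float, metres_per_second: float = 750.0):
    """Convert a tracked dance into an approximate compass bearing and distance."""
    bearing = (sun_azimuth_deg + dance_angle_deg) % 360.0
    distance_m = waggle_duration_s * metres_per_second
    return bearing, distance_m

# A dance angled 40 degrees clockwise from vertical, sun at 135 degrees,
# waggle run lasting 1.2 seconds:
print(decode_waggle(40.0, 135.0, 1.2))  # -> (175.0, 900.0)
```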
Caveats
These systems promise real benefits for animal welfare and safety. A collar that senses the first signs of stress in a working dog could spare it from being pushed to exhaustion. Dairy cows monitored by vision-based AI might be treated for illness days before a farmer would otherwise notice.
But detecting a cry of pain is not the same as understanding what it means. AI can show that two whale codas often occur together, or that pig squeals share acoustic features with goat bleats. The Milan research goes further by broadly categorizing such calls as positive or negative, but even this is pattern recognition rather than a true deciphering of emotion.
Emotion classifiers risk flattening rich behavior into a crude binary of happy/sad or calm/stressed, such as reducing a dog's tail wag to "contentment" or logging any sign of arousal as "stress". As Ntalampiras points out in his paper, pattern recognition is not the same as understanding.
One answer is for researchers to develop models that integrate vocal data with visual cues such as posture and facial expression, as well as physiological signals such as heart rate, to build more reliable indicators of how an animal is feeling. AI models are also most reliable when their output is interpreted in context, alongside the knowledge of someone experienced with the species.
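A minimal sketch of that fusion idea, assuming each modality has already been scored separately, might look like the following. The weights, score ranges and threshold are assumptions for illustration, not values from any published system.

```python
# A minimal late-fusion sketch: combine separate scores from audio, posture and
# heart-rate models into one welfare indicator. Weights and threshold are assumed.
from dataclasses import dataclass

@dataclass
class ModalityScores:
    vocal_distress: float        # 0 (calm call) .. 1 (distressed call), from an audio model
    posture_tension: float       # 0 .. 1, from a pose-estimation model
    heart_rate_elevation: float  # 0 .. 1, deviation from the animal's baseline

def welfare_alert(s: ModalityScores, weights=(0.4, 0.3, 0.3), threshold=0.6) -> bool:
    """Flag for human review only when the weighted evidence across modalities agrees."""
    combined = (weights[0] * s.vocal_distress
                + weights[1] * s.posture_tension
                + weights[2] * s.heart_rate_elevation)
    return combined >= threshold

print(welfare_alert(ModalityScores(0.9, 0.7, 0.5)))  # -> True: worth a closer look
print(welfare_alert(ModalityScores(0.9, 0.1, 0.1)))  # -> False: single cue, defer to context
```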

It is also worth remembering that listening in carries an ecological price. Running AI adds a carbon cost that, in vulnerable ecosystems, could undercut the very conservation goals it claims to serve. So it matters that the technology is used in ways that genuinely help animal welfare, rather than simply satisfying human curiosity.
Whether we welcome it or not, AI is here. Machines are now deciphering signals that animals honed long before we were listening, and they will only get better at it.
But the real test isn't how well we listen; it's what we do with what we hear. If we decode animals' signals only to misuse that information, or to manage them ever more tightly, the failure will be ours, not science's.
