The World is Heading Toward Decoding Animal Communication with AI

ESP, known for its work in decoding non-human language, has crossed yet another milestone in deciphering animal communication

Animal Farm, George Orwell's satirical allegorical novel, might have turned out differently, and ended on a happier note, had we known how animals communicate. Technology, as usual, has been put to work for years to decode the what, how, and why of animal communication. AI animal translators, rendering the oinks and woofs of our pets, have given pet owners an immense sense of connection, despite differences of opinion between biologists and linguists on animal communication. A recent attempt by the Earth Species Project (ESP), a Californian non-profit known for its work in decoding non-human language, puts AI and machine learning algorithms to use to decipher animal communication, without limiting itself to a single animal or species. According to a Guardian report, Aza Raskin, co-founder and president of ESP, says, "The goal is to unlock communication within our lifetimes. The end we are working towards is, can we decode animal communication, discover non-human language."

What is ESP aiming to achieve?

Understanding animal communication has long occupied human fascination, rooted in primitive and instinctual connections with other species. We can understand a few animal behaviors by instinct and a few more by observation, but by and large our knowledge remained observational and, despite large volumes of data from sensors and body-mounted cameras, fell short of the evidence needed to label it a language. ESP is of the firm opinion that machine learning algorithms can decode these subtle and complex signs and signals, and do so across the whole animal community, much the way real-time translation happens between people speaking different languages. Raskin says it is a long-drawn process and that ESP intends to take one step at a time. ESP earlier published a paper on the cocktail party problem, which refers to picking out individual voices in a noisy setting. The Bioacoustic Cocktail Party Problem Network (BioCPPNet), proposed by ESP, separates animal noises back into their bioacoustic sources: the deep neural network architecture successfully takes acoustic mixtures containing overlapping calls and reconstructs disentangled source signals. Through bioacoustic classification and labeling, the team achieved around 99.3% accuracy across 8 individual species.
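The core idea behind the cocktail party problem, disentangling overlapping calls from a single recording, can be illustrated with a deliberately simple toy sketch. BioCPPNet itself is a deep neural network; the snippet below is not that model, just a minimal analogue using two synthetic "calls" (pure tones, a hypothetical stand-in for real vocalizations) separated by a spectral mask, to show what "separating acoustic mixtures into disentangled sources" means:

```python
import numpy as np

# Toy analogue of bioacoustic source separation (NOT BioCPPNet itself):
# two overlapping synthetic "calls" are mixed into one signal, then
# recovered with a simple frequency-domain mask.
sr = 8000                                # sample rate (Hz), assumed
t = np.arange(sr) / sr                   # one second of audio
call_a = np.sin(2 * np.pi * 440 * t)     # low-pitched "call"
call_b = np.sin(2 * np.pi * 2000 * t)    # high-pitched "call"
mixture = call_a + call_b                # the "cocktail party" recording

spectrum = np.fft.rfft(mixture)
freqs = np.fft.rfftfreq(len(mixture), d=1 / sr)

# Mask: keep only the band each source occupies (1 kHz cutoff is a
# hypothetical choice that happens to split these two tones).
low = np.fft.irfft(np.where(freqs < 1000, spectrum, 0), n=len(mixture))
high = np.fft.irfft(np.where(freqs >= 1000, spectrum, 0), n=len(mixture))

def corr(x, y):
    """Pearson correlation between a recovered and an original call."""
    return np.corrcoef(x, y)[0, 1]

print(round(corr(low, call_a), 3), round(corr(high, call_b), 3))
```

Real animal calls overlap in frequency, which is exactly why a fixed mask fails and a learned model like BioCPPNet is needed; the sketch only conveys the shape of the task.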

Not all animal species have similar communication needs

Animal communication, as experts say, takes more complex forms than human communication. A signal or sound delivered at the same intensity and frequency might carry different meanings. There are also evolutionary limits within the animal community: life on earth has evolved largely around threats, environmental pressures, and the predatory traits of dominant species. In a species as developed as ours, therefore, communication never stops; we evolved primarily as social creatures. Prairie dogs, by contrast, talk a lot but have little need to express how they feel, because animals only develop language when there is some benefit in doing so. Animals with few natural predators and higher intelligence, like dolphins, will have more to express than the mere act of talking. To demonstrate this, Raskin refers to an experiment in which dolphins appeared to show expressive capabilities, performing a requested trick out of collective decision rather than instinctual behavior.

Machine learning is here to mend the songs

More than decoding animal language for its own sake, there is a conservational undertone to every effort in this direction. The Regent Honeyeater, a rare bird found in south-eastern Australia, is edging toward extinction. Scientists believe the birds are losing their natural songs, singing garbled tunes simply because they no longer live among their own kind. "They don't get the chance to hang around with other honeyeaters and learn what they're supposed to sound like," explained Dr. Ross Crates to BBC News. Advances in machine learning for understanding animal sounds might provide a lead in teaching captive birds their songs anew. Jussi Karlgren, a computational linguist, says, "The hope is this: That if we collect a large corpus, a large collection of dolphin whistles and click trains, [we might be] able to segment them." Raskin adds, "Along the way and equally important is that we are developing technology that supports biologists and conservation now."

Data, the quintessential oil that runs the show

Raskin is not overly optimistic about artificial intelligence helping humans decode complicated animal communication. Depending on machine learning without enough data is almost like chasing a mirage, and there is a real possibility we are already doing so. To analyze the overwhelming amount of chatter that happens among different species, we need an equally overwhelming amount of data, and the only way to be sure is to go out and collect it, opines Robert Seyfarth, a professor emeritus in biological anthropology at the University of Pennsylvania.

Analytics Insight
www.analyticsinsight.net