
Google Translate for whales? How AI is being used to speak with animals

A new wave of researchers is using machine learning to understand animals on their own terms

In 1963, a researcher named Margaret Howe Lovatt joined a NASA-funded project that attempted to communicate with dolphins. For three months, Howe Lovatt spent 24 hours a day living with a dolphin under the instruction of neuroscientist John Lilly, trying to teach it to make human-like sounds through its blowhole. Lilly’s methods grew increasingly controversial – later attempts saw him administer doses of LSD to the dolphins – and the study was ultimately unsuccessful, with funding for the experiment cut after three years. “Lilly’s research killed the field for decades, it was like dropping a bomb on marine biology,” says David Gruber, the scientist behind Project CETI, a nonprofit initiative using AI to understand the communication of sperm whales.

While past research into animal communication focused on trying to get animals to speak like humans, a new wave of research takes a biocentric approach, using AI to understand non-human communication on its own terms, while establishing new ways to help animals against the ongoing environmental crisis – climate change has already reduced bird song globally, and natural habitats are destroyed daily. “AI can be used to build more empathy and understanding between species,” says Gruber. While these ideas have long been accepted across Indigenous belief systems, the rise of AI models can help us map out new methods removed from the structures of anthropocentric (and Western) thought. “Technology that allows us to communicate with other species directly would impart a new way of being-in-the-world,” writes K McDowell in their 2023 book Air Age Blueprint.

Given that researchers have already built AI models to decipher ancient human languages, including ancient Babylonian texts, the idea that we could develop a Google Translate-like programme to decode non-human communication isn’t that far off. “It looks like the information that can be carried in sperm whale clicks is a lot greater than we thought – we’re onto something very promising,” says Gruber. Project CETI, for example, is working on decoding the language of sperm whales – a federally endangered species in the US – while the Earth Species Project (ESP) is cataloguing the calls of Hawaiian crows, attempting to build new technology that could help humans talk to animals. Elsewhere, DeepSqueak is a software tool that uses deep learning algorithms to identify, process and categorise the ultrasonic squeaks of rodents – and there’s even an open-source programme that visualises thousands of bird sounds using machine learning.

Among these advancements, the emerging field of digital bioacoustics combines computer vision with natural language processing to pick up animal patterns without disrupting the ecosystem. The technology works by attaching lightweight digital recorders to the backs of, for example, whales or turtles to detect sounds beyond the human hearing range. According to Karen Bakker, the author of the 2022 book The Sounds of Life, bioacoustics has already helped us to understand bat vocalisations – for example, it’s been revealed that bats remember favours and harbour grudges, and have specific calls that function as names.
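The basic analysis step behind this kind of work – turning a recording into a frequency picture and pulling out simple acoustic features – can be sketched in a few lines. The code below is a minimal illustration, not Project CETI’s or DeepSqueak’s actual pipeline: it builds a spectrogram with a short-time Fourier transform and finds each synthetic “call’s” dominant frequency, the sort of feature a classifier might then cluster into call types. The call names and frequencies are invented for the example.

```python
import numpy as np

def spectrogram(signal, frame_size=256, hop=128):
    """Magnitude spectrogram via a windowed short-time FFT."""
    window = np.hanning(frame_size)
    frames = [
        np.abs(np.fft.rfft(signal[i:i + frame_size] * window))
        for i in range(0, len(signal) - frame_size, hop)
    ]
    return np.array(frames)  # shape: (time_frames, freq_bins)

def dominant_frequency(signal, sample_rate, frame_size=256):
    """Frequency (Hz) of the bin with the most average energy."""
    spec = spectrogram(signal, frame_size)
    freqs = np.fft.rfftfreq(frame_size, d=1.0 / sample_rate)
    return freqs[np.argmax(spec.mean(axis=0))]

# Two synthetic "call types" at different pitches (hypothetical values)
sr = 8000
t = np.linspace(0, 1, sr, endpoint=False)
low_call = np.sin(2 * np.pi * 400 * t)    # low-frequency call
high_call = np.sin(2 * np.pi * 2000 * t)  # high-frequency call

for name, call in [("low", low_call), ("high", high_call)]:
    print(name, dominant_frequency(call, sr))
```

Real bioacoustics software adds many layers on top of this – noise filtering, call detection, and deep neural networks trained on labelled recordings – but a spectrogram like the one computed here is typically the input those models see.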

Accepting that non-human life has a language and world of its own – a quality previously assumed to exist only within humans – can bring us closer to animals and plants. This helps us to move away from man versus nature, and towards an interconnected ecology. “A lot of the motivation for doing this project is trying to find an example where it can benefit other species, other than us,” says Gruber, who points to the 1970 album Songs of the Humpback Whale, which helped launch the Save The Whales movement and the eventual moratorium on commercial whaling. “We’re also looking into the policy ramifications of understanding what they’re saying, [and] we’ll be able to hopefully draft better legislation that could protect the sperm whales from our adverse effects.”

But Gruber warns that there’s still a long way to go before we can understand non-verbal animal communication. “All these large language models that we’re hearing so much about in recent months are all based on human language – and now it’s even interesting to be like ‘well, how much of this technology is even applicable for understanding animals and whales?’, and that’s something that’s still a question mark because we don’t have any other database as extensive as the one we have of our own language,” he says.

Still, there are some ethical questions to consider, like do we have the right to eavesdrop on animals without their consent? And, even worse, what if they’re complaining about us? Or plotting a revolution? Do we even want to know? “Like, what is UX for whales? And what is the purpose of it?” asks McDowell. “These are really huge questions that aren’t answered, because we’ve been so enthralled with the idea that we can actually do this. I think it is beautiful – it comes from a good place that people want to make contact in that way, but we need to decide why we’re doing it.”
