Talking to Animals: Using AI to Decode the Language of Whales

Humanity has long dreamed of understanding what animals say to each other. AI could soon make that a reality.
Eric James Beyer

When you dive into the ocean, the physiology of your body changes. As you go deeper into the water, your heart rate slows. Blood flows from your extremities toward your vital organs, keeping your heart and brain oxygenated and your lungs from collapsing under the increasing pressure. In an environment that is seemingly hostile to its survival, the body becomes remarkably efficient at keeping you alive.

The mammalian dive reflex, more romantically termed the “Master Switch of Life” by its discoverer, the physiologist Per Scholander, helped shape how we view our relationship to the water. If our bodies were so at home in the ocean, scientists wondered, what did that say about our evolutionary history?

While covering a freediving competition called the Individual Depth World Championship in Kalamata, Greece, in 2011, journalist James Nestor saw up close just how logic-defying that reflex can appear. In deep, open water ten miles off the coast, he witnessed individuals descend 300 feet (91 m) on a single breath of air, reappearing at the surface relatively unfazed.

His fascination with the freedivers at that competition would send him on a year-and-a-half-long journey around the world, exploring humanity’s connection to the water in every way he could. 

An aerial photo of a humpback whale mother and her calf swimming next to her.
Source: Will Turner/Unsplash

“I discovered that we’re more closely connected to the ocean than most people would suspect,” he writes in Deep: Freediving, Renegade Science, and What the Ocean Tells Us About Ourselves. “It’s this connection—between the ocean and us, between us and the sea creatures with whom we share a great deal of DNA—that drew me deeper and deeper still.” Nestor was drawn so deep that he eventually found himself freediving with whales to learn more about them.

"Sperm whale behavior more closely resembles our culture and intellect than any other creature’s on the planet."

He’s not alone in being captivated by the creatures, and for good reason. Studies have shown that whales may be effectively cancer-exempt, for example. Humans also have a lot more in common with them than many realize. A look at phylogenetics, the study of the evolutionary relationships of different species, reveals that whales are more closely related to humans than they are to sharks. 

We’ve also learned a great deal about their behavior, with studies from the 1960s detailing how whales use a variety of noises to identify objects, survey their surroundings, and communicate with each other. Thanks to advances in machine learning and artificial intelligence, we’re now beginning to pull back the curtain on that communication in truly exciting ways. 

Don't call us, whale call you

In 2011, Zooniverse, a platform for conducting scientific research with the help of volunteers, paired with Scientific American to start the Whale FM project. The initiative collected over 15,000 samples of pilot and orca (killer) whale calls from off the coasts of the Bahamas, Iceland, and Norway to see whether computer analysis could decipher anything about these populations and the ways they use their sonic repertoire. 

“We wanted to analyze and profile whale communication,” explained Whale FM lead AI scientist Lior Shamir in an interview with Interesting Engineering. 

“We started with supervised machine learning, just to see if the computer could identify differences between the audio, and then we started to use unsupervised learning. I asked the computer, ‘What can you tell us about the relationships between the sets of audio samples that we have?’”

"The same species might have a different dialect based on where they live, just like people have different accents."

The AI Shamir designed had no human guidance while analyzing this audio; it knew only that different whale types existed in the dataset. After running the program across 300 computer processors for a total of seven weeks, the AI produced a map that grouped the pilot whales and orcas separately. 

“That wasn’t very surprising,” Shamir said. “They’re different species, we expect they’d speak differently. But what was interesting, inside those groups, it identified and clustered those pods of whales by where they live on the globe.” 

Shamir and the Whale FM team were surprised. Their work had uncovered evidence for different communication styles within each species. Norwegian orcas, for example, were speaking a unique dialect compared to their Icelandic relatives. The same was true for pilot whales in the Bahamas and in Norway. 

“[The results] showed that the same species might have a different dialect based on where they live, just like people have different accents. They are actually communicating, they are actually speaking with each other, and eventually we’ll figure out what they’re saying.” 
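The unsupervised side of this approach can be illustrated with a toy example. The sketch below is not Whale FM's actual pipeline (which ran over real audio on hundreds of processors); it assumes each call has already been reduced to a hypothetical 2-D feature vector, then runs a small k-means loop that groups the calls purely by similarity, with no labels saying which pod produced them.

```python
# Minimal sketch only: pretend each whale call has been reduced to a
# hypothetical 2-D feature vector (e.g. mean pitch, click rate). K-means
# then clusters the calls with no labels, analogous to how unsupervised
# learning grouped Whale FM's recordings by pod.
import math
import random

random.seed(0)

def make_calls(center, n=20, spread=0.5):
    """Synthetic 'call features' scattered around a pod-specific center."""
    cx, cy = center
    return [(random.gauss(cx, spread), random.gauss(cy, spread)) for _ in range(n)]

# Two hypothetical pods whose calls occupy different regions of feature space.
calls = make_calls((1.0, 1.0)) + make_calls((5.0, 5.0))

def kmeans(points, k=2, iters=20):
    # Deterministic start: seed the centroids with points far apart in the list.
    centroids = [points[0], points[len(points) // 2]]
    clusters = []
    for _ in range(iters):
        # Assign each call to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda j: math.dist(p, centroids[j]))
            clusters[nearest].append(p)
        # Move each centroid to the mean of its assigned calls.
        centroids = [
            (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
            for c in clusters if c
        ]
    return centroids, clusters

centroids, clusters = kmeans(calls)
```

With well-separated feature clouds, the two recovered clusters line up with the two synthetic "pods" even though the algorithm was never told they exist, which is the essence of the dialect-grouping result Shamir describes.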

Enter Project CETI

Eventually could come sooner rather than later. In April of this year, a team of experts in robotics, linguistics, and machine learning established Project CETI (Cetacean Translation Initiative), a scientific endeavor with the goal of applying new advances in machine learning to better understand sperm whale language. 

Nestor, one of the project's founding members, is convinced sperm whales are a unique species, an animal whose behavior, he writes, “more closely resembles our culture and intellect than any other creature’s on the planet.” 

Sperm whales “talk” to one another through intensely loud series of clicks, called codas. The clicks are so loud they can reach upwards of 230 decibels—louder than a rocket launch. This makes them the loudest animal on the planet and enables them to communicate with one another over distances of hundreds of miles. 


Some scientists wonder if there is a link between the codas and the startlingly familiar brain physiology that produces them. Much research supports the idea that cetaceans (whales, dolphins, and porpoises) have highly developed brains capable of complex cognition. Sperm whale brains in particular are six times the size of a human’s. They also contain rare and specialized neurons called spindle cells, which, according to New Scientist, are also found in human brains in the regions responsible for empathy, social organization, and speech. 

“We are hoping to ignite a worldwide movement to reshape our relationship with life on earth."

CETI’s first priority is to build its database. To train a reliable neural network, a type of machine learning model that “learns” to perform a task from examples, you need data, and a lot of it. One reason it is relatively new to see AI applied to whale communication is the absence of large pools of data for the technology to draw from.

This is because obtaining this information is neither cheap nor expedient, requiring scientists to spend months living on boats collecting information in logistically challenging ways. 

“You’re not going to find any marine biologists saying these animals aren’t communicating in a very sophisticated way,” Nestor said in a recent talk with Bioneers. “We’ve known that for 50 years. We just haven’t had the technology or the means to really research it.”

CETI is tackling these difficulties head-on and is poised to collect as much information as it can by building new audio and video equipment to record sperm whale calls by the millions. The team will use a combination of static sensor arrays, electronic tags attached to the whales themselves, and underwater robots to get as complete a sonic picture as possible. 

Whether these whales possess anything similar to what we might call grammar or syntax in their communication is one of the questions the project hopes to answer. 


The project’s head AI researcher is Michael Bronstein, chair of machine learning and pattern recognition at Imperial College London. While giving a talk for the Simons Institute at the University of California, Berkeley in August 2020, Bronstein noted that machine learning will be particularly helpful in identifying whether some differences in sperm whale codas are due to a fundamental change in their semantic meaning or a dialect variation, like what Shamir and his team discovered amongst orca and pilot whales. 

He also spoke on some of the key communicative areas of analysis they'll be directing AI toward. 

“We will look for evidence of displacement, capability of language to communicate about things that are not immediately present in time and in space,” Bronstein said, “which is also a feature that is believed to never [have] been observed in other animals.” 

There is a fair amount of optimism surrounding these AI analyses. Since the time of the Whale FM project, machine learning has only gotten better at what it does. Neural networks are now capable of translating between two human languages without being given dictionaries or parallel texts (identical documents that exist in each language, like a Rosetta Stone).

Given a large enough data pool, the right programming, and an almost fanciful amount of computing power, Project CETI might just bring the mystery of sperm whale communication up from the depths and into clearer waters. 

Breaching the anthropic surface

The project’s goals are admittedly ambitious. Bronstein himself has called the project “the craziest moonshot I have ever participated in.” And, while the CETI website states that it will share its data publicly with an eye towards “better understanding animals both in the ocean and on land,” its motivations stretch well beyond that.

Lead CETI researcher David Gruber is a marine biologist who has spent years introducing people to the beauty of the world’s oceans. His research includes the discovery of the first evidence of biofluorescence in Arctic snailfish, the development of the world’s first “shark-eye” camera, and warnings of mass coral reef die-offs and extinction events. 

He’s now funneling his passion for the oceans into Project CETI. “Through an understanding of a non-human species and the power of collaborative research,” he said in an interview with the Harvard School of Engineering and Applied Sciences in April, “we are hoping to ignite a worldwide movement to reshape our relationship with life on earth and connect to and learn from nature.”

That sounds grandiose, but the idea stands on solid scientific ground. In the last few decades, the increasing public awareness of human-made climate issues has paralleled an expansion of research that reveals the importance of biodiversity on our planet. Taken together, these are beginning to erode the idea of human exceptionalism.

Two orca whales break the water's surface at sunset.
Source: Bart van Meele/Unsplash

These studies instead point to the genealogical, physiological, and even behavioral kinship between ourselves and the millions of animal species we share the planet with. The ethical profundity of this evidence is likely to force humanity to rethink how we conceive of and treat the natural world, and CETI’s work could be a significant part of that reassessment. 

Shamir is one researcher who has helped bring more voices to this increasingly prominent environmental conversation. He sees himself as an example of someone who has shifted their perspective on whales due to the discoveries that he has helped bring about. 

“The fact that whales pick up a dialect, just like us humans, it gives us a social aspect [...] It goes beyond language, it goes to the social behavior of whales and its similarity to how humans behave. After you listen to so many hours of it, analyzing it, when you’re exposed to it, you have to look at whales differently. They seem more human.” 
