There is no doubt that artificial intelligence (AI) is getting better at understanding language. In fact, it improves every day as it checks the spelling in our emails and powers the voice assistants in our phones. But AI can do much more in this regard.
Now, writing in WIRED, Mary Lou Jepsen and John Ryan argue that these same techniques will soon be applied to help us understand our pets and the animal kingdom around us. "Take a leap. Imagine that whale songs are communicating in a word-like structure. Then, what if the relationships that whales have for their ideas have dimensional relationships similar to those we see in human languages?" argue the authors.
If that's the case, then we should be able to map out what they are saying and perhaps even talk back to them. Many animals have shown signs of advanced intelligence, so it is reasonable to suppose that their thinking patterns resemble ours.
There are already examples of AI doing just that. Take, for instance, software that can decode conversations between marmosets, a type of small monkey.
Marmosets have a vocabulary of 10 to 15 calls, and previous studies suggest that they learn to communicate by hearing other marmosets call to them. Although the system currently only distinguishes monkey calls from background noise, it is an excellent first step toward recognizing and eventually decoding monkey language.
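The detection step described here, separating call frames from background noise, can be sketched with a toy energy-based detector. Everything in this snippet is illustrative: the real marmoset software relies on learned acoustic features, not a fixed amplitude threshold, and the names, sample rate, and threshold below are assumptions for the sake of the sketch.

```python
import math
import random

def rms(samples):
    """Root-mean-square amplitude of a signal frame."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def classify_frame(samples, threshold=0.2):
    """Label a frame 'call' if its energy exceeds the threshold,
    else 'background'. A fixed energy gate is the crudest possible
    detector; production systems learn spectral features instead."""
    return "call" if rms(samples) > threshold else "background"

# Synthetic stand-ins: a loud 8 kHz tone plays the part of a marmoset
# call, and quiet random samples play the part of background noise
# (44.1 kHz sample rate assumed).
rate = 44100
call = [0.8 * math.sin(2 * math.pi * 8000 * t / rate) for t in range(1024)]
random.seed(0)
background = [0.05 * (random.random() - 0.5) for _ in range(1024)]

print(classify_frame(call))        # → call
print(classify_frame(background))  # → background
```

Even this crude gate shows why call detection is the natural first milestone: only once frames are reliably flagged as calls can a system move on to sorting them into the 10 to 15 call types in the marmoset vocabulary.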
There is another way AI could help us communicate with animals: by recognizing their facial expressions, which are a useful form of language in their own right.
Computer scientists at the University of Cambridge in the United Kingdom trained an AI to spot signs of pain in sheep, a task previously reserved for trained veterinarians. With such promising projects underway, how close are we to really communicating with our fellow animals?
A 2017 Amazon-sponsored report on future trends predicted that in just 10 years we'll have a translator for pets. And we hope they are right!