Even if you're not an animal person (by which we mean someone who loves animals), watch almost any well-made documentary on the evolution of human civilisation and the important role our pets and domesticated animals have played becomes quite evident. From farm work to food to cuddling with us on a bad day (or not, if you have a cat), animals have been there for us. Which explains our need to try and understand what they've been saying all these years.
As any pet owner will tell you, most animals communicate in their own particular ways. Dogs have evolved over millennia to be particularly suited as pets, even learning complex actions through training and rewards. But that still doesn't mean a dog can do much beyond wagging its tail and barking in a particular way. There remains a gap in most animal–human communication that, if bridged, would at the very least greatly help us understand what an animal goes through when we treat it the way we do. So, in today's era of artificial intelligence and flying cars, have we really come up with technology that can turn us all into Dr Dolittles for our furry and winged friends?
Do they even talk?
To understand what an animal is saying, we first need to be sure that it is actually saying something. For ages we've lived with dogs, parrots, horses and more, and we've seen them make intelligible sounds, perform actions and even pronounce words – talking parrots being the obvious example. Even orcas and elephants can seemingly pull this off. But almost all of it is trained behaviour performed in return for rewards; the animals haven't actually been trying to communicate at all.
Animals have been known to exhibit other complex behaviours. Mammals of all kinds have been shown to have what we, as humans, define as a culture. They employ deception, recognise themselves in mirrors, and are even capable of showing emotions to their fellow animals. But language is an entirely different ball game.
To put it simply, possessing a language depends on certain abilities – the ability to question (ask 'why?') and the ability to negate (say 'no') – neither of which has been displayed by animals or birds. Even famous examples of animals imitating human words or producing human-like vowel-consonant combinations come down to a behaviour called vocal imitation. There are a few exceptions, like Alex the African grey parrot, who learnt around 100 words and would perform complex interactions like wishing his trainer goodnight when she left and using a verbal phrase to indicate that he wanted to sit near a window.
So will we be able to understand them?
One of the main purposes of technology is to solve problems that humans are incapable of solving, or rather slow at. When it comes to bridging the gap in human–animal communication, technologies like wearables and AI are coming to the aid of research scientists and inventors.
Dr Con Slobodchikoff, an animal behaviour expert, has been researching prairie dogs (small rodents that live in burrows beneath the plains) and their vocalisations for more than 30 years. His research has shown that prairie dogs not only have complex warning calls, they even communicate the colour of an individual's clothing to each other. Last year, Dr Slobodchikoff and a colleague created an algorithm to parse these vocalisations into English – effectively a prairie dog translator – and started Zoolingua, a startup dedicated to decoding animal languages. He believes the results he has achieved with prairie dogs can be replicated with other species too.
His efforts involve showing his AI programme thousands of videos of dog behaviours and vocalisations to learn from. To start off, the algorithms will identify and distinguish behaviours like tail wagging, a low bark, bared teeth and more. At this stage, human intervention is needed to associate meaning with these actions. Over time, with bigger datasets to learn from, the end goal is an app: point your phone camera at a dog barking at you, and it tells you what the dog is saying, what mood it is in, and highlights signs of friendliness or danger based on the dog's posture and body movements.
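To get a feel for that human-labelling step, here is a deliberately tiny sketch – not Zoolingua's actual pipeline, and every feature name and number below is a made-up assumption – where hand-labelled behaviour measurements (bark pitch, tail-wag rate, a posture score) are mapped to meanings, and new observations are matched to the nearest labelled example:

```python
# Toy illustration of associating meaning with measured dog behaviours.
# Hypothetical hand-labelled examples: (bark pitch in Hz, tail wags/sec,
# posture relaxedness 0-1) -> a meaning label supplied by a human.
LABELLED_EXAMPLES = [
    ((220.0, 3.0, 0.9), "friendly greeting"),  # low bark, fast wag, relaxed
    ((480.0, 0.2, 0.1), "alarm/warning"),      # high-pitched bark, stiff posture
    ((150.0, 0.0, 0.3), "defensive growl"),    # very low pitch, no wagging
]

def classify(features):
    """Nearest-neighbour lookup: return the label of the closest example."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(LABELLED_EXAMPLES, key=lambda ex: sq_dist(ex[0], features))[1]

print(classify((230.0, 2.5, 0.8)))  # closest to the "friendly greeting" example
```

Real systems would learn such features from raw video and audio rather than take them as hand-picked numbers, but the principle – humans attach meaning, the machine generalises – is the same.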
And it's not just dogs. Dolphins are capable of quite a sophisticated level of communication, pretty close to human language in complexity. Swedish language technology company Gavagai AB is working with the KTH Royal Institute of Technology in Stockholm to use AI to build a dolphin language dictionary by 2021. But all of these are future goals – what about now?
While AI and machine learning take their time to help us understand our furry and feathered friends, we have other avenues. For instance, Georgia Institute of Technology researcher Melody Jackson, director of the university's BrainLab, has developed pet communication technology that could save a human life. It doesn't convert a dog's vocalisations into human speech; instead, it teaches the dog to use a device that speaks for it.
Along with Google Glass developer Thad Starner, Jackson channelled her passion for animals into an add-on for a typical dog vest that lets dogs touch their snouts to it in specific ways to play specific verbal messages. The interesting thing is how easily and quickly dogs figured out how to use it. One test subject even learnt – within 27 seconds! – that just hovering his nose over the button, without touching it, could do the trick. It's all part of a field called animal–computer interaction.
A service dog can be trained to perform other actions with the same device, like sending an SOS alert with GPS coordinates. The messages can be tailored to a host of purposes; for instance, a guide dog could tell its visually impaired owner exactly what kind of obstacle lies ahead.
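Conceptually, the vest side of this is simple: each snout-activated button maps to a preset phrase, and one of them triggers an alert that includes the wearer's location. The sketch below is a hypothetical illustration of that mapping – the button names, phrases and SOS format are all assumptions, not the BrainLab device's actual firmware:

```python
# Hypothetical button-to-message mapping for a talking dog vest.
# Each button plays a preset phrase; the SOS button also reports GPS.
MESSAGES = {
    "button_help": "My handler needs help. Please follow me!",
    "button_obstacle": "Obstacle ahead: stairs going down.",
}

def handle_press(button_id, gps_fix=None):
    """Return the phrase to speak; append coordinates for an SOS press."""
    if button_id == "button_sos":
        lat, lon = gps_fix if gps_fix else (0.0, 0.0)  # fall back if no GPS fix
        return f"SOS: handler in distress at {lat:.5f}, {lon:.5f}"
    return MESSAGES.get(button_id, "Unrecognised button")

print(handle_press("button_sos", gps_fix=(33.77577, -84.39633)))
```

The design point is that the dog only ever learns a physical gesture per button; all the language lives in the device, which is why the same hardware can serve a search-and-rescue dog and a guide dog with different message tables.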
In a report co-authored for Amazon, futurologist William Higham of Next Big Thing says he believes devices that can 'talk dog' could be less than 10 years away. And if the current applications of AI and machine learning are anything to go by, it is likely to happen sooner rather than later.
If the gap in human–animal communication is bridged, innumerable doors would open. Most would be good – your dog telling you exactly what it is afraid of, so you can help it at the right moment – but some could upset existing establishments too. If the same tech were applied to dairy and poultry livestock, would you really be comfortable consuming products from a being that can tell you exactly how much it hates being used for your food? Then again, solving such problems is exactly why we've evolved into the dominant species on the face of the Earth.