Elephant Translator: Can Technology Enable Humans to Talk to Animals?

Out of the more than eight million species that live on the planet, humans can understand the language of only one. After decades of searching for ways to communicate with animals, several scientists have turned to artificial intelligence to detect patterns in their sounds and behavior, understand their intentions, and interact with them. However, despite the promising progress of several research projects, creating translators for elephants, dogs or whales presents several challenges.

Eva Meijer, author of Animal Languages: The Secret Conversations of the Living World, explains that animals talk all the time, both among themselves and in multispecies environments, to survive, form friendships, discuss social rules, and even flirt. The scientific evidence, the expert points out, shows that they have languages, cultures and complex inner lives, and that they fall in love and mourn the loss of their partners.

As she explains in her book, dolphins call each other by name, prairie dogs describe intruders in great detail, bats love to gossip, and grammatical structures can be found in the songs of some birds. Wild chimpanzees communicate through dozens of different gestures, while bees dance to communicate and can recognize and memorize human faces.

Studying animal language and behavior is important not only for learning how they communicate with each other, but also for figuring out how they communicate with us. Some animals, like dogs, birds and horses, are even able to learn words: according to a study published in the journal Behavioural Processes, a Border Collie can remember more than a thousand. Additionally, some animals respond to tone of voice and body language, explains Melody Jackson, a professor at the Georgia Institute of Technology and an expert on dog-computer interaction: soft tones convey friendship, while hard or strong tones can be threatening. Touch can also be used as a reward with dogs and horses.


Artificial intelligence to “talk” to animals

Several scientists have turned to artificial intelligence and other technologies to understand and improve this communication. Clara Mancini, an animal-computer interaction researcher at the Open University in the UK, explains that sensors can be used to record, analyze and interpret many different animal signals, including those that may be difficult for human ears to detect.

Researchers at the Wild Dolphin Project have spent more than 30 years building a database of dolphin behaviors and the sounds that accompany them, which fall into three main types: whistles, used for long-distance communication and as contact calls between mothers and calves when they are separated; clicks, used for orientation and navigation; and so-called burst pulses, closely spaced packets of clicks produced during close-range social behavior such as fighting. The goal of the project is to create machine learning algorithms that find patterns in these sounds and to develop systems that can generate “words” for interacting with dolphins in the wild.
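
As an illustration of the kind of pattern-finding involved, the sketch below groups short recordings into a few acoustic categories using standard audio features and clustering. It is a minimal, hypothetical example: the directory name, the feature choices and the number of clusters are assumptions, not the Wild Dolphin Project’s actual pipeline.

```python
# Illustrative sketch only: unsupervised clustering of acoustic features,
# the general kind of pattern-finding described above. Paths, features and
# cluster count are hypothetical.
import glob

import librosa
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

def features_for(path):
    """Summarize one recording as the mean of its MFCC frames."""
    y, sr = librosa.load(path, sr=None)            # keep native sample rate
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)                       # one 13-dim vector per clip

clips = sorted(glob.glob("dolphin_clips/*.wav"))   # hypothetical directory
X = StandardScaler().fit_transform(np.array([features_for(p) for p in clips]))

# Group the clips into a handful of acoustic categories (e.g. whistles,
# clicks, burst pulses) and report which cluster each clip fell into.
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
for path, label in zip(clips, labels):
    print(f"{path}: cluster {label}")
```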

There are many similar projects. Elephant Voices researchers have created an online ethogram of the vocalizations and behaviors of elephants in Kenya and Mozambique, including examples such as the trumpet sounds they typically make when emerging from the water after playing. Another team has developed software, called DeepSqueak, that can automatically detect, analyze and categorize the ultrasonic vocalizations of rodents; it has also been used on lemurs, as well as on whales and other marine animals. Some scientists have developed systems to detect distress calls from chickens, while others are trying to understand dogs by using machine learning to determine whether their whines indicate sadness or happiness.
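
A rough idea of how such a classifier might work is sketched below: each labeled clip is summarized with spectrogram statistics, and a standard classifier learns to tell the categories apart. The file paths, labels and model choice are illustrative assumptions, not DeepSqueak’s or any specific team’s implementation.

```python
# Hedged sketch of a supervised vocalization classifier of the kind
# described above. Assumes labeled WAV clips exist; everything here is
# illustrative, not a real project's code.
import librosa
import numpy as np
from sklearn.svm import SVC

def summarize(path):
    """Reduce a clip to the mean and spread of its mel-spectrogram bands."""
    y, sr = librosa.load(path, sr=22050)
    mel = librosa.power_to_db(librosa.feature.melspectrogram(y=y, sr=sr))
    return np.concatenate([mel.mean(axis=1), mel.std(axis=1)])

# Hypothetical labeled clips; a real dataset would need many per class.
dataset = [
    ("whines/sad_001.wav", "sad"),
    ("whines/happy_001.wav", "happy"),
    # ...
]
X = np.array([summarize(path) for path, _ in dataset])
y = [label for _, label in dataset]

model = SVC(kernel="rbf").fit(X, y)                 # train on labeled clips
print(model.predict([summarize("whines/new_clip.wav")]))  # classify a new one
```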


The Challenges of Creating “Translators”

Although some researchers have identified the structure and part of the meaning of some animals’ vocalizations, the creation of “translators” presents several challenges. First of all, as Mancini points out, understanding the semantic and emotional meaning of what they communicate is a highly complex task: we are not inside their minds, and we do not have the same physical, sensory and cognitive properties through which they experience the world. If these differences and complexities are not taken into account, their messages risk being trivialized and misinterpreted.

Add to that the fact that current technologies require environmental or wearable sensors, which are not always practical. Sometimes the appropriate cameras are not available, and it can be very difficult to film a moving animal well enough for video analysis. Furthermore, interpreting their communication based solely on vocalizations leaves out other channels that might be relevant to understanding its meaning, such as their behavior.

Animals also communicate through their actions, their gestures, and even their facial expressions. For example, when two groups of elephants come together and fold back their ears while flapping them rapidly, they are expressing a warm greeting, part of their welcoming ceremony, according to Elephant Voices. And sheep can use facial expressions to express pain. In fact, computer scientists at the University of Cambridge have developed an artificial intelligence system that analyzes their faces to detect when they are in pain.

Some researchers study dogs’ posture and behavior to predict how they’re feeling, sometimes turning to biometrics to try to pinpoint changes in heart rate, breathing and temperature that could provide clues to their emotions, Jackson says. Some of these dog interpretation systems use body sensors to measure position and movement, while others use cameras to record and analyze video.
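
The sketch below shows one simple way such biometric streams might be monitored, flagging readings that deviate sharply from the recent past. The window size, threshold and simulated heart-rate trace are assumptions for illustration, not the design of any system described here.

```python
# Illustrative sketch only: a rolling z-score detector over a wearable
# sensor stream, one simple way to surface the heart-rate, breathing and
# temperature changes mentioned above. All parameters are hypothetical.
import numpy as np

def flag_changes(readings, window=30, threshold=3.0):
    """Return indices where a reading deviates strongly from the recent past."""
    readings = np.asarray(readings, dtype=float)
    flagged = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mean, std = recent.mean(), recent.std()
        if std > 0 and abs(readings[i] - mean) / std > threshold:
            flagged.append(i)
    return flagged

# Example: a simulated heart-rate trace (beats per minute) with a sudden jump.
heart_rate = np.concatenate([np.random.normal(80, 2, 120),
                             np.random.normal(120, 2, 30)])
print("change detected at samples:", flag_changes(heart_rate))
```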


Vests for training dogs and robotic bees

Being able to communicate with animals can be useful in several contexts. For example, Jackson’s team has developed technology that allows a human handler to remotely guide a search and rescue dog using vibrating motors attached to a vest. They have also developed wearable computers that allow a service dog to contact emergency services with a GPS location if its owner is having a seizure.
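
The sketch below illustrates the general idea of mapping a handler’s commands to vibration patterns on such a vest. The motor layout, command names and pulse timings are hypothetical, not the actual design of Jackson’s system.

```python
# Hypothetical sketch: turning directional commands into vibration patterns
# on a dog's vest. Motor positions, commands and timings are assumptions.
import time

# Assume four motors addressed by their position on the vest.
MOTORS = {"left": 0, "right": 1, "front": 2, "back": 3}

def pulse(motor_id, duration_s):
    """Placeholder for driving one motor; real hardware would use GPIO or BLE."""
    print(f"motor {motor_id} vibrating for {duration_s:.1f}s")
    time.sleep(duration_s)

COMMANDS = {
    "turn_left":  [("left", 0.5)],
    "turn_right": [("right", 0.5)],
    "stop":       [("front", 0.2), ("back", 0.2), ("front", 0.2)],
}

def send(command):
    """Play the vibration pattern associated with a handler command."""
    for position, duration in COMMANDS[command]:
        pulse(MOTORS[position], duration)

send("turn_left")
```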

Humans may never sing like a whale or hum like a bee, but machines might. In fact, a team of German researchers has developed a biomimetic robot called RoboBee, which mimics the waggle dances that bees use to communicate, and the results have been encouraging: with this robot, they claim to have recruited real bees and guided them to specific target locations.
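
The information a dancing robot has to reproduce can be summarized in a short calculation: the angle of the waggle run relative to vertical encodes the target’s bearing relative to the sun, and the duration of the run grows with distance. The sketch below uses a rough, commonly cited calibration of about one second of waggle run per kilometer; it is not RoboBee’s actual parameterization.

```python
# Sketch of the waggle-dance encoding a dancing robot would have to
# reproduce. The ~1 s per km calibration is a rough textbook figure and
# varies between colonies; this is not RoboBee's real code.
def waggle_dance(target_bearing_deg, sun_azimuth_deg, distance_m,
                 seconds_per_km=1.0):
    """Return (dance angle from vertical in degrees, waggle duration in s)."""
    angle = (target_bearing_deg - sun_azimuth_deg) % 360
    duration = (distance_m / 1000.0) * seconds_per_km
    return angle, duration

# Example: a feeding site 500 m away, 40 degrees east of the sun's direction.
angle, duration = waggle_dance(target_bearing_deg=130, sun_azimuth_deg=90,
                               distance_m=500)
print(f"waggle run at {angle:.0f} degrees from vertical for {duration:.2f}s")
```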

The progress is promising. However, it is still too early to say whether there will ever be true animal translators. Jackson believes that as computers and sensors become smaller and more powerful, tiny implantable systems will be developed that provide more clues about animal behavior and, one day, make true two-way communication possible.
