Elephant translation: Can technology allow humans to talk to animals? | Science and Technology

Of the more than eight million species that live on Earth, humans understand the language of only one: their own. After decades of searching for ways to communicate with animals, some scientists are turning to artificial intelligence to detect vocal and behavioral patterns, in an attempt to understand animals' intentions and interact with them. But despite promising progress across several research projects, creating translators for elephants, dogs, or whales still faces multiple challenges.

Eva Meijer, author of Animal Languages: The Secret Conversations of the Living World, explains that animals are constantly talking – whether among themselves or across species – to survive, make friends, negotiate social rules and even flirt. Experts point to scientific evidence that animals have languages, cultures and complex inner lives, in which they fall in love and mourn their companions.

As she explains in the book, dolphins call each other by name, prairie dogs describe intruders in great detail, bats like to gossip, and grammatical structures can be found in the songs of some birds. Wild chimpanzees understand one another through dozens of different gestures, and bees, which communicate through dance, can recognize and remember human faces.

Studying the language and behavior of animals is important not only for understanding how they communicate with each other, but also for understanding how they communicate with us. Some, like dogs, birds and horses, are even able to learn words: according to a study published in the journal Behavioural Processes, a border collie can remember more than a thousand. Some animals also respond to tone of voice and body language, explains Melody Jackson, a professor at the Georgia Institute of Technology and an expert in dog-computer interaction: a soft tone of voice conveys friendliness, while a harsh or loud tone can be threatening. Touch can also be used as a reward for dogs and horses.

Artificial intelligence and animal “dialogue”

Many scientists have turned to artificial intelligence and other technologies to understand and improve this communication. Clara Mancini, an animal-computer interaction researcher at the Open University in the UK, explains that sensors can be used to record, analyze and interpret many different animal signals, including those that may be imperceptible to the human ear.

The founders of the Wild Dolphin Project have spent more than 30 years amassing a database of dolphin behavior and sounds, which fall into three types: whistles, used for long-distance communication and as contact calls when mothers are separated from their calves; clicks, used for orientation and navigation; and so-called burst pulses, tightly spaced packets of clicks used in close-range social behaviors such as fighting. The goal of the project is to create machine learning algorithms that look for patterns in these sounds, and to develop systems that can generate “words” for interacting with dolphins in the wild.

There are many similar projects. Researchers at ElephantVoices created an online ethogram of the vocalizations and behavior of elephants in Kenya and Mozambique, including examples of the trumpets that elephants typically make when they come out of the water after playing. Another team developed software, called DeepSqueak, that automatically detects, analyzes and classifies ultrasonic vocalizations in rodents; it has also been used on lemurs, whales and other marine animals. Some scientists have developed systems to detect distress calls from chickens, while others have tried to use machine learning to determine whether a dog’s whines express sadness or happiness.

The challenge of creating a “translator”

While some researchers have identified the structure and partial meaning of some animal vocalizations, creating a “translator” presents multiple challenges. First, understanding the semantic and emotional meaning of animal communication is a very complex task, as Mancini points out: we are not inside their heads, and we do not share the physical, sensory and cognitive features through which they experience the world. Their messages can be distorted and misinterpreted if these differences and complexities are not taken into account.

Beyond that, current technologies require ambient or wearable sensors that are not always practical. Without the right cameras, it can be very difficult to capture animals in motion for video analysis. Furthermore, interpreting animal communication based solely on vocal expressions misses other channels that may be relevant to its meaning, such as behavior.

Animals also communicate through movements, gestures and even facial expressions. For example, when two groups of elephants come together and flap their ears rapidly, they are expressing a warm greeting as part of their welcome ritual, according to ElephantVoices. Sheep, meanwhile, can express pain through facial expressions; computer scientists at the University of Cambridge have developed an artificial intelligence system that analyzes their faces to detect when they are in pain.

Some researchers study dogs’ posture and behavior to predict how they are feeling, sometimes turning to biometrics to pinpoint changes in heart rate, breathing and body temperature that might provide clues to their mood, Jackson says. Some of these canine interpretation systems use body-worn sensors to measure position and movement, while others use cameras to record and analyze video.

Vests for training dogs and robotic animals

Being able to communicate with animals is useful in a variety of situations. For example, Jackson’s team has developed technology that allows human operators to remotely guide search-and-rescue dogs using vibrating motors attached to vests. They have also created wearable computers that allow service dogs to contact emergency services, sharing a GPS location, if their owners have a seizure.

Humans may never be able to sing like a whale or buzz like a bee, but maybe a machine will. In fact, a group of German researchers built a bionic robot called RoboBee that mimics the dances bees use to communicate, and they have had success: with it, they claim to have recruited real bees and guided them to a specific location.

Progress is promising, but it is too early to predict whether animal translators will ever exist. Jackson believes that as computers and sensors become smaller and more powerful, tiny implantable systems will be developed that provide more clues about animal behavior and, one day, enable true two-way communication.
