The new open-source tool harnesses artificial intelligence to simplify animal behavior analysis

These pattern images merge outlines of an organism’s position at different points in time to represent motion, providing a still image that increases LabGym’s accuracy in recognizing behavior types. Shown are the pattern images for mice (top right, bottom left) and Drosophila larvae (top left, bottom right). Photo credits: Yujia Hu and Bing Ye, UM Life Sciences Institute.

Study: LabGym: Quantifying Custom Animal Behaviors Using Learning-Based Holistic Assessment (DOI: 10.1016/j.crmeth.2023.100415)

A University of Michigan team has developed a new software tool to help life science researchers analyze animal behavior more efficiently.

The open-source software LabGym uses artificial intelligence to identify, categorize and count defined behaviors in various animal model systems.

Scientists need to measure animal behavior for a variety of reasons, from understanding all the ways a particular drug can affect an organism to mapping how circuits in the brain communicate to elicit a particular behavior.

For example, researchers in UM faculty member Bing Ye’s lab are analyzing movements and behaviors in Drosophila melanogaster — or fruit flies — as a model to study the development and functions of the nervous system. Because fruit flies and humans share many genes, these studies of fruit flies often provide insight into human health and disease.

“Behavior is a function of the brain. So analyzing animal behavior provides essential information about how the brain works and how it changes in response to disease,” said Yujia Hu, a neuroscientist in Ye’s lab at the UM Life Sciences Institute and lead author of a Cell Reports Methods study, published February 24, describing the new software.

However, manually identifying and counting animal behaviors is time-consuming, and the results are highly subjective, varying with the researcher analyzing the behavior. And while some software programs exist to automatically quantify animal behavior, they present their own challenges.

“A lot of these behavior analysis programs are based on preset definitions of a behavior,” said Ye, who is also a professor of cell and developmental biology at the UM Medical School. “For example, if a Drosophila larva rotates 360 degrees, some programs count that as a roll. But why doesn’t 270 degrees also matter? Many programs don’t necessarily have the flexibility to count that without the user knowing how to recode the program.”

Think more like a scientist

To overcome these challenges, Hu and his colleagues decided to create a new program that more closely mimics the human cognition process — which “thinks” more like a scientist — and is more user-friendly for biologists who may not have coding experience. With LabGym, researchers can enter examples of the behavior they want to analyze and teach the software what to count. The program then uses deep learning to improve its ability to recognize and quantify the behavior.
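In outline, that workflow (provide labeled examples, train a model on them, then categorize and count new observations) can be sketched as follows. This is a toy illustration rather than LabGym's actual code: LabGym uses deep neural networks, and the nearest-centroid classifier, feature vectors and behavior names below are stand-in assumptions that keep the sketch self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

# User-provided examples: one feature vector per example clip,
# grouped under user-defined behavior names (both made up here).
examples = {
    "roll":  rng.normal(loc=0.0, scale=0.3, size=(20, 4)),
    "crawl": rng.normal(loc=2.0, scale=0.3, size=(20, 4)),
}

# "Training": summarize each user-defined behavior by its centroid.
centroids = {name: feats.mean(axis=0) for name, feats in examples.items()}

def classify(feature_vec):
    # Assign the behavior whose centroid is nearest to this observation.
    return min(centroids, key=lambda n: np.linalg.norm(feature_vec - centroids[n]))

# Quantify behaviors in new, unlabeled observations.
new_obs = np.vstack([rng.normal(0.0, 0.3, (5, 4)),
                     rng.normal(2.0, 0.3, (3, 4))])
counts = {}
for vec in new_obs:
    label = classify(vec)
    counts[label] = counts.get(label, 0) + 1
```

In practice the features would be extracted from the video and pattern-image data, and the centroid classifier would be replaced by a trained neural network; the surrounding loop of label, train, classify and count stays the same.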

One recent development that gives LabGym this added flexibility is its use of both video data and a so-called “pattern image” to improve the program’s reliability. Scientists use videos of animals to analyze their behavior, but videos contain time-series data that can be challenging for AI programs to analyze.

To help the program identify behaviors more easily, Hu created a still image that captures the animal’s movement pattern by merging outlines of the animal’s position at different points in time. The team found that combining the video data with these pattern images increased the program’s accuracy in detecting behavior types.
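A pattern image of this kind can be sketched in a few lines. The masks, frame count and time-coded intensity scheme below are illustrative assumptions, not LabGym's implementation: each frame's animal silhouette is reduced to its outline, and outlines from later frames are drawn brighter, so a single still image encodes the direction of motion.

```python
import numpy as np

def outline(mask):
    # A pixel is on the outline if it is in the mask but has at least
    # one 4-neighbor outside the mask (a simple erosion via shifts).
    m = mask.astype(bool)
    interior = m.copy()
    interior[1:, :] &= m[:-1, :]
    interior[:-1, :] &= m[1:, :]
    interior[:, 1:] &= m[:, :-1]
    interior[:, :-1] &= m[:, 1:]
    return m & ~interior

def pattern_image(masks):
    # Merge outlines from successive frames into one still image;
    # later frames get brighter values, so the image encodes time.
    pattern = np.zeros(masks[0].shape, dtype=np.uint8)
    for t, mask in enumerate(masks, start=1):
        intensity = int(255 * t / len(masks))
        pattern[outline(mask)] = intensity
    return pattern

# Toy example: a square "animal" moving rightward across 3 frames.
masks = []
for t in range(3):
    m = np.zeros((10, 10), dtype=bool)
    m[3:7, 1 + 2 * t : 5 + 2 * t] = True
    masks.append(m)

img = pattern_image(masks)
```

The resulting single image can then be fed to an ordinary image classifier alongside the raw video, which is easier for many models than reasoning over a time series directly.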

LabGym is also designed to ignore irrelevant background information and consider both the animal’s overall movement and changes in position over space and time, much like a human researcher would. The program can also track multiple animals at the same time.

Species flexibility improves utility

Another key feature of LabGym is its species flexibility, Ye said. Although it was developed with Drosophila, it is not restricted to one species.

“That’s actually rare,” he said. “It was written for biologists to adapt to the species and behavior they want to study, without the need for programming skills or high-performance computers.”

After hearing a presentation on the program’s early development, UM pharmacologist Carrie Ferrario offered to help Ye and his team test and refine the program in the rodent model system she works with.

Ferrario, associate professor of pharmacology and associate professor of psychology, studies the neural mechanisms that contribute to addiction and obesity using rats as a model system. To complete the necessary observation of drug-induced behavior in the animals, she and her laboratory staff had to rely largely on manual evaluation, which is subjective and extremely time-consuming.

“I’ve been trying to solve this problem since grad school, but the technology in terms of artificial intelligence, deep learning, and computation just wasn’t there,” Ferrario said. “This program solved an existing problem for me, but it also has a really broad benefit. I see the potential that it can be useful in almost limitless conditions to analyze animal behavior.”

Next, the team plans to further refine the program to improve its performance in even more complex conditions, such as observing animals in nature.

This research was supported by the National Institutes of Health.

In addition to Ye, Hu and Ferrario, the authors of the study are: Alexander Maitland, Rita Ionides, Anjesh Ghimire, Brendon Watson, Kenichi Iwasaki, Hope White and Yitao Xi from the University of Michigan and Jie Zhou from Northern Illinois University.