AI-Powered Micro-Measurements Unlock Insights into Animal Behavior

TL;DR:

  • AI tools are transforming the study of animal behavior, supplying the data needed to simulate animals and to probe the links between behavior and the brain.
  • These tools, such as SLEAP and DeepLabCut, use deep neural networks and pose estimation to track and analyze animal movements with precision.
  • Ethologists leverage AI tools to track natural behaviors and replicate them in a lab, allowing for simultaneous measurement of neural activity and behavior.
  • Insights gained include understanding how animals catch flying insects, the neural basis for different behaviors in bullied mice, and more.
  • Behavioral data is being used to create simulated animals or “digital twins” with fully modeled limbs, skeletons, and muscles.
  • The development of these simulated animals requires a combination of AI models and new tools, contributing to the study of embodied artificial general intelligence.

Main AI News:

Cutting-edge AI tools are revolutionizing the study of animal behavior, providing scientists with valuable data to simulate animals and gain a deeper understanding of behavior and the brain. These advancements hold significant implications for fields like medicine and the development of artificial general intelligence (AGI).

Talmo Pereira, a fellow at the Salk Institute for Biological Studies, emphasizes the crucial role of the body in bridging the gap between the brain’s evolutionary purpose and its real-world operations. According to Pereira, comprehending the circuits of neurons involved in different behaviors can aid researchers in developing treatments for psychiatric and neurodegenerative diseases, where behavioral changes often serve as early symptoms.

AI methods, including SLEAP, DeepLabCut, and others, are increasingly being employed to measure animal behavior—a task that traditionally involves meticulous observation and manual annotation by researchers. These tools leverage deep neural networks and pose estimation techniques from computer vision to identify and track the joints of an animal’s body in images or videos, providing precise spatial coordinates.
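The output of such pose estimators is essentially an array of per-frame keypoint coordinates, which downstream analysis turns into behavioral measurements. A minimal sketch of that downstream step, using hypothetical keypoint data (the array shape mirrors what tools like SLEAP or DeepLabCut export, alongside confidence scores, but the numbers here are invented for illustration):

```python
import numpy as np

# Hypothetical pose-estimation output: one animal, 3 keypoints
# (nose, body centre, tail base) tracked over 5 video frames.
# Shape: (frames, keypoints, xy).
poses = np.array([
    [[10.0, 20.0], [15.0, 25.0], [20.0, 30.0]],
    [[11.0, 20.5], [16.0, 25.5], [21.0, 30.5]],
    [[13.0, 21.0], [18.0, 26.0], [23.0, 31.0]],
    [[16.0, 21.5], [21.0, 26.5], [26.0, 31.5]],
    [[20.0, 22.0], [25.0, 27.0], [30.0, 32.0]],
])

fps = 100  # assumed camera frame rate (frames per second)

# Per-frame displacement of the body-centre keypoint (index 1),
# converted to speed in pixels per second.
centre = poses[:, 1, :]
step = np.linalg.norm(np.diff(centre, axis=0), axis=1)
speed = step * fps

print(speed)  # one speed value per frame-to-frame step
```

At a 100 Hz frame rate, each speed sample corresponds to a 10 ms window, which is the kind of temporal resolution the researchers quoted below care about.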

Notably, Microsoft’s Xbox Kinect utilized pose estimation to track players’ movements using infrared sensors and translate them into in-game character actions. Besides gaming, these techniques have found applications in athlete performance analysis, monitoring the well-being of dairy cows through pose tracking, and even potential surveillance based on an individual’s gait.

Animal behavior scientists, also known as ethologists, employ AI tools to monitor the natural behaviors of single or multiple animals. The gathered information can then be replicated in a laboratory setting, enabling simultaneous measurement of neural activity and manipulation of neurons to observe their impact on behavior.

These AI tools have already yielded valuable insights. For instance, researchers have discovered how marmoset monkeys catch flying insects and identified the neural basis for different behaviors in bullied mice—whether they fight back or flee. Cory Miller, a neurobiologist at the University of California San Diego, hails these tools for their ability to quantify behavior at millisecond precision, mirroring the brain’s operational scale.

SLEAP, a tool developed by Pereira, aids in quantifying the body language of museum-goers, detecting early behavioral changes related to ALS, and studying the effects of genetic modifications on plant root systems.

Another tool called MoSeq allows researchers to identify and predict various behaviors by isolating shorter units of behavior called “syllables.” Harvard University neurobiologist Bob Datta’s team has leveraged MoSeq to study the influence of hormones on laboratory mice behavior, revealing that the behavior of female mice, often excluded from studies due to hormonal variations, is more predictable than that of males.
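MoSeq itself fits an autoregressive hidden Markov model to depth video, but the core idea of chopping a continuous behavioral trace into discrete, reusable units can be sketched far more simply. Below is a toy illustration (not MoSeq's actual algorithm) using hypothetical speed data, coarse thresholding, and run-length encoding, where each run of identical states plays the role of a "syllable":

```python
import numpy as np

# Hypothetical per-frame speed trace for one animal.
speed = np.array([0.1, 0.2, 0.1, 3.0, 3.5, 3.2, 9.0, 8.5, 0.2, 0.1])

# Discretise each frame into a coarse state (invented bin edges).
labels = np.digitize(speed, bins=[1.0, 6.0])   # 0=still, 1=walk, 2=run
names = np.array(["still", "walk", "run"])

# Run-length encode: each run of identical labels is one "syllable",
# reported as (name, start_frame, end_frame).
change = np.flatnonzero(np.diff(labels)) + 1
starts = np.concatenate(([0], change))
ends = np.concatenate((change, [len(labels)]))
syllables = [(names[labels[s]], s, e) for s, e in zip(starts, ends)]

print(syllables)
# [('still', 0, 3), ('walk', 3, 6), ('run', 6, 8), ('still', 8, 10)]
```

Once behavior is expressed as a sequence of such units, questions like "which syllable follows which" become simple statistics over discrete symbols, which is what makes prediction tractable.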

However, as Pereira notes, the brain does not generate positional coordinates. To truly understand the relationship between observed movements and the brain’s processes, researchers must establish a connection between the two.

Looking ahead, scientists are striving to create simulated bodies or animals using behavioral data. These “digital twins” of mice, rats, and flies would possess fully modeled limbs, skeletons, and muscles, enabling them to behave like their real counterparts. By comparing the behavior of these digital twins to real animals and iteratively updating the model, researchers aim to gain insights into how the brain generates behavior.
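The iterative loop described above, simulate, compare against the real animal, update the model, can be sketched in miniature. The example below is a deliberately simplified stand-in: a one-parameter "twin" and a gradient-free accept/reject update, with all names and data invented for illustration; real digital twins involve full musculoskeletal simulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Real" animal: observed stride lengths (hypothetical data, mean ~2.0).
observed = np.full(50, 2.0) + rng.normal(0, 0.05, 50)

def simulate(gain, n=50):
    """Toy digital twin: a single parameter maps to stride length."""
    return np.full(n, gain)

# Simulate, compare, update: propose a perturbed parameter and keep it
# only if the mismatch with the real animal's behaviour shrinks.
gain = 0.5
best_err = np.mean((simulate(gain) - observed) ** 2)
for _ in range(200):
    candidate = gain + rng.normal(0, 0.1)
    err = np.mean((simulate(candidate) - observed) ** 2)
    if err < best_err:
        gain, best_err = candidate, err

print(round(gain, 2))  # settles near the observed stride length (~2.0)
```

The payoff of the real version of this loop is interpretive: once the twin's parameters reproduce the animal's behavior, those parameters become hypotheses about how the brain and body generate it.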

Building these simulated animals will require combining various AI models and developing new tools, as highlighted by Datta. Ultimately, this endeavor could deepen our understanding of embodied artificial general intelligence and inform an ongoing debate among AI researchers over whether intelligence must be embodied in a physical form.

Conclusion:

The integration of AI tools into the study of animal behavior has brought significant advances. The ability to measure and analyze behavior at millisecond scale offers valuable insight into how the brain produces behavior. This has implications for healthcare and pharmaceuticals, where a better understanding of behavior can aid the development of treatments for psychiatric and neurodegenerative diseases. The creation of simulated animals could also affect the gaming industry, enabling more realistic and behaviorally accurate virtual characters. Overall, these tools open opportunities for innovation across multiple sectors, advancing both AI technology and our understanding of the brain and behavior.
