TL;DR:
- Neuroscientists at the University of Chicago have used 3D X-ray videography and machine learning to study tongue movements in non-human primates.
- The tongue’s complex movements during feeding were captured and decoded from neural activity in the brain’s sensorimotor cortex.
- This breakthrough opens possibilities for brain-computer interface-based prosthetics to restore lost functions of feeding and speech.
- Unlike limbs, whose movements are constrained by bones and joints, the tongue has functionally infinite degrees of freedom.
- The study employed novel techniques such as X-ray Reconstruction of Moving Morphology (XROMM) and deep neural networks.
- Information about tongue shape is represented in the brain much as arm movements and hand positions are.
- Previous research has successfully translated similar data to drive the movements of robotic prosthetic limbs.
- The findings hold potential for developing assistive devices for individuals with dysphagia (difficulty swallowing).
- The research demonstrates novel decoding of soft-tissue shapes, in contrast to conventional studies of rigid skeletal systems.
Main AI News:
Advancements in neuroscience have revolutionized our understanding of how the brain controls various movements, such as walking and reaching. However, decoding the mechanics behind essential behaviors like eating, drinking, and communication has proven to be more challenging due to the hidden nature of a crucial component: the tongue.
Addressing this obstacle, researchers from the University of Chicago have developed a groundbreaking approach that combines 3D X-ray videography and machine learning to capture the intricate movements of the tongue in non-human primates during feeding. Their study, published in Nature Communications, demonstrates that the 3D shape of the tongue can be accurately decoded from activity in the brain’s sensorimotor cortex. This breakthrough opens up new possibilities for brain-computer interface-based prosthetics, offering hope for restoring lost functions related to feeding and speech.
One of the primary hurdles in studying tongue movements is the tongue’s concealed position within the mouth. Unlike limb movements, which are constrained by the predictable mechanics of bones and joints, tongue movements are far harder to model. Composed entirely of muscle and soft tissue, the tongue has virtually limitless freedom of movement (unless you are one of the rare people who cannot roll their tongue into a U-shape). J.D. Laurence-Chasen, Ph.D., the lead author of the study and a former postdoctoral scholar at UChicago, now a researcher at the National Renewable Energy Laboratory, explains, “The tongue has a totally different anatomy. There are no rigid internal structures. There’s a ton of different muscles with overlapping functions, and so, it has functionally infinite degrees of freedom.”
As a Ph.D. student and later a postdoctoral scholar, Laurence-Chasen drew on his expertise in data analytics and machine learning to investigate how the brain controls the dynamic movements of the tongue and jaw, vital for feeding and speech. In collaboration with Professors Nicho Hatsopoulos, Ph.D., and Callum Ross, Ph.D., from the Department of Organismal Biology and Anatomy, he captured the tongue movements of two male rhesus macaque monkeys as they fed on grapes. To record the movements and shape of the tongue within the mouth, each monkey’s tongue was fitted with seven markers detectable by two X-ray video cameras, a technique similar to the motion capture technology used in movies and video games; a sketch of the underlying geometry follows.
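For readers curious about how two camera views yield 3D coordinates: each calibrated view contributes two linear constraints on a marker’s position, and their intersection is found by linear triangulation. The minimal Python sketch below assumes hypothetical projection matrices `P1` and `P2` from a prior calibration step; it illustrates the general technique, not the study’s actual XROMM pipeline.

```python
import numpy as np

def triangulate(P1, P2, uv1, uv2):
    """Linear triangulation of one marker from two calibrated views.

    P1, P2 : (3, 4) projection matrices for the two X-ray cameras
             (hypothetical, e.g. from a calibration routine).
    uv1, uv2 : (u, v) pixel coordinates of the same marker in each view.
    Returns the marker's 3D position in lab coordinates.
    """
    u1, v1 = uv1
    u2, v2 = uv2
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Solve A @ X = 0 by SVD; the null vector is the homogeneous 3D point.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize
```

Repeating this for all seven markers in every video frame recovers the tongue’s shape over time.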
Given the monkeys’ rapid chewing rate of two to three times per second, the researchers employed an innovative 3D imaging technology called X-ray Reconstruction of Moving Morphology (XROMM) to process the high-speed data capturing the tongue’s movements, shape changes, and deformations.
Concurrently, microelectrode arrays implanted in the motor cortex recorded neural activity during feeding. Laurence-Chasen and the team used deep neural networks, a form of machine learning, to analyze the brain activity and extract information from it. By comparing this output with the actual movements recorded by the X-ray cameras, they discovered that the motor cortex carries information about the 3D shape and movement of the tongue, enough to accurately decode and predict the tongue’s shape from neural activity alone.
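To make the decoding step concrete, here is a minimal sketch of a feedforward decoder that maps binned spike counts to 3D marker positions. The channel count, window length, and architecture are assumptions for illustration; the networks used in the study were likely more elaborate.

```python
import torch
import torch.nn as nn

# Hypothetical dimensions: 96 electrode channels of binned spike counts over
# a short history window, decoded to 7 tongue markers x 3 coordinates.
N_CHANNELS, N_BINS, N_OUTPUTS = 96, 10, 7 * 3

class TongueShapeDecoder(nn.Module):
    """Small feedforward decoder: spike-count history -> 3D marker positions."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                       # (batch, channels, bins) -> vector
            nn.Linear(N_CHANNELS * N_BINS, 256),
            nn.ReLU(),
            nn.Linear(256, N_OUTPUTS),          # x, y, z for each of 7 markers
        )

    def forward(self, spikes):
        return self.net(spikes)

# Training-loop sketch on synthetic stand-ins for the real recordings.
decoder = TongueShapeDecoder()
optimizer = torch.optim.Adam(decoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

spikes = torch.randn(64, N_CHANNELS, N_BINS)   # placeholder neural activity
markers = torch.randn(64, N_OUTPUTS)           # placeholder XROMM marker positions

for _ in range(100):
    optimizer.zero_grad()
    loss = loss_fn(decoder(spikes), markers)
    loss.backward()
    optimizer.step()
```

The key idea is regression: the network is trained so that, given only recorded spikes, its output matches the marker positions the X-ray cameras actually observed.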
Laurence-Chasen was struck by how much detail the recordings contained: “We knew from some earlier research that basic movements of the tongue involved the cortex, but we were surprised by the extent and resolution of information about the tongue shape that we could extract so readily.”
The significance of this discovery lies in the resemblance between how this data is represented in the brain and the representation of arm movements and 3D hand positions. Previous research by Hatsopoulos, along with Sliman Bensmaia, Ph.D., James and Karen Frank Family Professor of Organismal Biology and Anatomy at UChicago, has successfully translated brain signals into software algorithms that drive the movements of robotic prosthetic limbs. These advancements have allowed amputees and quadriplegics to control prosthetics using their minds while experiencing the natural sensations of touch. Although the application of this technology to tongue-related functions is not as developed, a similar approach could immensely benefit patients who have lost the ability to feed or speak.
Hatsopoulos envisions a future where decoded information about the tongue’s shape can be used to predict swallowing events. That knowledge could be built into a device that stimulates the appropriate muscles, aiding individuals with dysphagia (difficulty swallowing), particularly the elderly. Hatsopoulos emphasizes the novelty of decoding soft tissue shapes, which differ from the rigid skeletal systems typically studied: “What J.D. has been able to do here to decode the shapes of soft tissue, not a skeletal system, is novel. I think it’s super exciting.”
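As a rough illustration of that envisioned application, decoded tongue-shape features could feed a simple classifier that flags imminent swallows and gates a stimulation device. The sketch below uses placeholder data and scikit-learn’s logistic regression; it is a hypothetical design, not anything described in the study.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical dataset: one row per time bin of decoded tongue-shape features
# (7 markers x 3 coordinates = 21 values); labels mark annotated swallows.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 21))     # placeholder decoded shape features
y = rng.integers(0, 2, size=1000)   # placeholder swallow / no-swallow labels

clf = LogisticRegression(max_iter=1000).fit(X, y)
swallow_prob = clf.predict_proba(X)[:, 1]  # could trigger muscle stimulation
```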
The study, titled “Robust cortical encoding of 3D tongue shape during feeding in male macaques,” also involved Fritzie Arce-McShane from the University of Washington as a co-author. This breakthrough research offers a promising path toward developing soft prosthetics and improving the quality of life for individuals with feeding and speech impairments.
Conclusion:
The groundbreaking research conducted by neuroscientists at the University of Chicago, using 3D X-ray videography and machine learning to decode tongue movements, has significant implications for the market. The ability to accurately decipher the 3D shape and movements of the tongue from neural activity opens up possibilities for brain-computer interface-based prosthetics. This advancement paves the way for innovative solutions that can restore lost functions of feeding and speech, addressing the needs of a market segment of individuals with dysphagia, particularly the elderly.
Moreover, the successful decoding of soft tissue shapes, diverging from conventional studies focused on skeletal systems, represents a novel approach that may have far-reaching implications for future developments in the field of assistive devices and prosthetics. The market can anticipate the emergence of cutting-edge solutions driven by this groundbreaking research, improving the quality of life for individuals with speech and feeding impairments.