Study Uses Augmented Reality Tech To Learn About "Active Sensing"
A new study conducted by researchers from Johns Hopkins University and the New Jersey Institute of Technology (NJIT) has shed new light on a process called active sensing. Found across the animal kingdom, active sensing involves producing motion, sound, or other signals in order to gather sensory feedback about the environment. Until now, researchers had struggled to understand how the brain controls it.
To learn more about how the process works, the researchers used augmented reality technology to alter the link between active sensing behavior and the sensory feedback it creates. The subjects were glass knifefish, a species of weakly electric fish, whose sensory feedback was placed under experimental control. Based on the results, the study proposes that the fish use a dual-control system for processing feedback from their active sensing movements.
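As a rough illustration of what such a dual-control scheme could look like, the Python sketch below separates a station-keeping controller from an active-sensing controller. Everything here, including the function name, gains, and update rule, is a hypothetical toy model, not code or parameters from the study itself.

```python
import random

def dual_control_step(fish_pos, refuge_pos, feedback_salience,
                      tracking_gain=0.5, sensing_gain=1.0):
    """One step of a hypothetical dual controller (toy model only).

    Controller 1 (station-keeping) reduces the position error between
    fish and refuge; controller 2 (active sensing) injects extra
    movement that grows as the sensory feedback weakens.
    """
    # Station-keeping: simple proportional tracking of the refuge.
    tracking_cmd = tracking_gain * (refuge_pos - fish_pos)

    # Active sensing: a small random shimmy, scaled up as the
    # salience of the feedback (a value in [0, 1]) drops.
    sensing_cmd = sensing_gain * (1.0 - feedback_salience) * random.uniform(-1, 1)

    return tracking_cmd + sensing_cmd
```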
The team believes the findings have implications for neuroscience as well as for the engineering of new artificial systems, with potential benefits for fields like robotics and self-driving cars. The researchers also believe this is the first study to use AR to probe movement-based active sensing, a fundamental process used by most animals. In the wild, one of the glass knifefish's key defenses against predators is to stay hidden in its refuge.
To stay in a moving refuge, the fish rely on station-keeping: the ability to hold their position relative to the hideout no matter how it moves. The team ran both open-loop and closed-loop experiments and found that the fish swam the farthest to gain sensory information in closed-loop trials, in which the shuttle motion of the refuge was synced to the fish's own movement.
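To make the open-loop versus closed-loop distinction concrete, here is a minimal sketch of how the two conditions differ. The update rule, the `gain` parameter, and the function name are illustrative stand-ins under my own assumptions, not the study's actual apparatus code.

```python
def refuge_position(mode, t, fish_pos, gain=1.0, playback=None):
    """Refuge position for one time step of a hypothetical trial.

    mode "open":   the refuge follows a preset trajectory (or a
                   recorded playback), independent of the fish.
    mode "closed": the refuge's shuttle motion is driven by the
                   fish's own movement, closing the feedback loop.
    """
    if mode == "open":
        # Preset/playback stimulus: ignores what the fish is doing.
        return playback[t] if playback is not None else 0.0
    if mode == "closed":
        # Augmented-reality coupling: refuge motion is a scaled copy
        # of the fish's movement, so the fish's own swimming shapes
        # the sensory feedback it receives.
        return gain * fish_pos
    raise ValueError(f"unknown mode: {mode}")
```

In this framing, a playback condition is simply the open-loop case with a recording of an earlier trial supplied as the preset trajectory.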
Indeed, the fish reacted differently when the stimulus was controlled by their own movements than when the same stimulus was played back to them from a recording. The researchers hope that similar experiments will eventually be conducted in humans, for example to investigate active sensing in vision.