Your Ears Reveal Where Your Eyes Are Looking, New Study Finds

HHTM
December 6, 2023

DURHAM, NORTH CAROLINA — Scientists have discovered that minute sounds emitted from a person’s ears can reveal which direction their eyes are looking. The subtle noises arise as neural signals from the brain adjust the ears in step with shifts in gaze, though precisely why remains uncertain.

In a study published recently in the journal Proceedings of the National Academy of Sciences, the researchers report that a distinct pattern of ear sounds corresponds to horizontal, vertical, or diagonal eye movements. Detailed analysis of the waveforms, recorded with ordinary microphone-equipped earbuds, enabled the team to accurately track eye position across a screen.

Surprisingly, the effect also works in reverse, according to lead investigator Dr. Jennifer Groh, a Duke University professor who studies auditory neuroscience and perception. Her group could predict what the ear signal would look like based solely on the visual targets subjects were told to track.

Hearing and Vision Link

Groh’s lab first identified the acoustic-visual link in 2018 but has now decoded the ear sounds well enough to objectively determine gaze direction from them. The team hypothesizes that eye movements may directly influence structures that regulate hearing sensitivity, such as the ear’s muscles and sensory hair cells. Because both senses contribute to spatial awareness in the brain, this subtle crosstalk may help the brain match sights with sounds.

The association likely represents an adaptive mechanism that helps focus auditory attention in much the same way the eyes focus visual attention. Whether microscopic movements within the ear actively sharpen acoustic cues, however, remains unproven.

“We think this is part of a feedback system allowing the brain to synchronize where visual stimuli occur with corresponding sounds. It may help improve spatial perception when your eyes change position while the head and ears don’t otherwise shift.”

–Dr. Jennifer Groh

Precisely mapping eye orientation normally requires tracking the dark pupils with specialized infrared cameras. Capturing the subtle ear sounds as a surrogate tracking signal, by contrast, requires only earbuds fitted with small microphones.

According to lead author and graduate student Stephanie Lovich, accumulating clinical evidence shows that various ear components selectively amplify or dampen sounds. Studying their associated waveforms could therefore help assess which anatomical structures have impaired function in hearing disorders.

“If each section follows particular acoustic rules, measurements may aid diagnosing exactly which part of an individual’s ear machinery is malfunctioning,” Lovich said. “That could lead to developing novel diagnostic tests beyond standard exams.”

Participants tracked a green dot on a screen while researchers listened to the sounds made in their ear canals using mic-embedded earbuds. Image credit: Meredith Schmehl, Duke University

The research involved 16 participants with normal vision and hearing. Wearing a pupil-tracking apparatus and microphone-equipped earbuds, subjects followed a moving dot around a screen. Comparing gaze coordinates with the simultaneous ear-canal recordings let the investigators match subtle audio cues to eye position.
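To give a sense of the kind of preprocessing such an experiment involves, the sketch below cuts a simulated ear-canal recording into short windows time-locked to each target jump, so that each snippet can later be paired with the gaze shift measured by the eye tracker. The sampling rate, window length, and all variable names here are illustrative assumptions, not the authors’ actual pipeline.

```python
import numpy as np

FS_AUDIO = 44_100          # assumed microphone sampling rate (Hz)
EPOCH_MS = (-10, 80)       # assumed window around each target jump (ms)

def epoch_ear_audio(mic_signal, event_samples):
    """Cut the ear-canal recording into fixed windows time-locked to each
    target jump, so the snippets can be averaged or regressed against gaze."""
    start = int(EPOCH_MS[0] * FS_AUDIO / 1000)
    stop = int(EPOCH_MS[1] * FS_AUDIO / 1000)
    epochs = [mic_signal[s + start : s + stop]
              for s in event_samples
              if s + start >= 0 and s + stop <= len(mic_signal)]
    return np.array(epochs)            # shape: (n_events, samples_per_epoch)

# Synthetic stand-in data: 60 s of microphone noise and 100 target jumps.
rng = np.random.default_rng(0)
mic = rng.normal(scale=1e-3, size=60 * FS_AUDIO)
events = np.sort(rng.integers(FS_AUDIO, 59 * FS_AUDIO, size=100))
gaze_xy = rng.uniform(-18, 18, size=(100, 2))    # horiz./vert. shift (degrees)

epochs = epoch_ear_audio(mic, events)
print(epochs.shape, gaze_xy.shape)               # (100, 3969) (100, 2)
```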

Complex mathematical analysis then extracted distinct signatures within the largely inaudible sounds, corresponding to upward, downward, or sideways glances made without turning the head. Having cracked that code, the researchers’ models can now compute gaze angles directly from the microphone recordings alone.
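The decoding step itself can be pictured as a regression from those short waveforms to horizontal and vertical gaze displacement. The hypothetical sketch below fits an ordinary least-squares decoder on synthetic data with the same shapes as above; it illustrates the idea only and is not the authors’ published analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
n_events, n_samples = 100, 3969                  # assumed epoch count and length
epochs = rng.normal(scale=1e-3, size=(n_events, n_samples))
gaze_xy = rng.uniform(-18, 18, size=(n_events, 2))   # horiz./vert. shift (degrees)

def fit_gaze_decoder(epochs, gaze_xy):
    """Least-squares map from ear-canal epochs to 2-D gaze displacement."""
    X = np.hstack([epochs, np.ones((len(epochs), 1))])   # bias column for DC offset
    W, *_ = np.linalg.lstsq(X, gaze_xy, rcond=None)
    return W

def predict_gaze(epochs, W):
    X = np.hstack([epochs, np.ones((len(epochs), 1))])
    return X @ W

# Hold out half of the trials to check how well the decoder generalizes.
W = fit_gaze_decoder(epochs[:50], gaze_xy[:50])
pred = predict_gaze(epochs[50:], W)
print("mean absolute error (degrees):", np.abs(pred - gaze_xy[50:]).mean(axis=0))
```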

Sensory Integration

Intriguingly, sound-regulating ear mechanisms were once thought to only assist hearing by turning up faint noises or suppressing loud ones. But emerging research shows they also take cues from eye movements and shifts in attention, likely increasing perceptual sensitivity to important stimuli.

In future work, Groh aims to clarify what role the newly discovered eye-ear interplay plays in linking visual input with the acoustic environment. Differences among individuals could relate to how well people combine sight and sound when navigating the real world, and those variations may eventually prove useful for clinical diagnosis.

The research adds to a growing appreciation that the classic five senses are integrated more deeply in the brain than previously thought. Leveraging those connections may someday improve treatments ranging from neuroprosthetics to augmented-reality interfaces. But first, scientists must decode the complex choreography coordinating the sensory periphery and the cortex.

Reference:

  • “Parametric Information About Eye Movements Is Sent to the Ears,” Stephanie N. Lovich, Cynthia D. King, David L.K. Murphy, Rachel Landrum, Christopher A. Shera, Jennifer M. Groh. Proceedings of the National Academy of Sciences, Nov. 21, 2023. DOI: 10.1073/pnas.2303562120

 

Source: Duke, PNAS
