P02 · Session 2 (Tuesday 13 January 2026, 14:10-16:40)
Oculomotor tracking of selective attention in speech-in-noise listening
Background: Successful interaction with complex auditory scenes in real life requires dynamic engagement of various cognitive processes, including arousal, attention, distractor suppression, and memory. Previous research has established that such information can be uncovered from our eyes through orienting ocular responses such as phasic pupil dilation (PD), microsaccades (MS), and blinks.
Our recent studies using tone sequences have indicated that these oculomotor measures can provide instantaneous readouts of arousal and attention allocation in auditory processing. In the present experiment, we show that combining these measures provides rich information about moment-to-moment cognitive mechanisms in speech-in-noise processing.
Methods: We simultaneously recorded pupil size and oculomotor events (MS and blinks) with eye tracking during a speech-in-noise working memory task (N=30). The task required subjects to focus attention on a target speaker (and report the words they produced) while ignoring temporally interleaved words from a distractor speaker, all embedded in multi-speaker babble.
Results: Distinct response patterns to behaviourally relevant target and distractor words, reflecting fluctuations in instantaneous arousal and attentional engagement, were observed across PD, MS, and blink time series. Additional time-locked analyses revealed robust response patterns to targets vs. distractors, as well as signatures of attentional states (e.g. attention lapses) associated with behavioural outcomes.
Each measure exhibited unique temporal characteristics, suggesting that they index different facets of attentive listening. Specifically, increased pupil responses (diameter and dilation rate) indicated heightened arousal before and after target words, whereas oculomotor inhibition, observed across both MS and blink dynamics, was enhanced by the allocation of temporal attention. This pattern signals a push-and-pull interaction between the oculomotor and auditory systems, in which visual sampling is reduced when attentional resources are deployed for the auditory modality.
Discussion: These results suggest that pupil and oculomotor dynamics can track moment-to-moment attentional allocation and arousal engagement in listening, even under difficult listening conditions. In addition, the distinct patterns across different eye movements point to potentially separate underlying neural pathways.
Understanding such underpinnings of oculomotor dynamics in hearing provides deeper insight into auditory attention and cross-modal interactions, as well as potential guidance for research into effortful listening, whose challenges may often arise from breakdowns in distractor suppression in real-life complex auditory scenes.