Research

Revealing Neural Mechanisms of Auditory Selective Attention

To investigate how the brain selectively attends to specific auditory stimuli in complex acoustic environments. (The figure shows the difference in alpha and beta oscillations between weeks 1 and 4 of selective attention training, measured during the preparatory period before trial onset (Shim et al., 2023).)


Decoding Attentional States in Music and Speech Listening

Using EEG to predict which conversation a listener is focusing on in a social setting, or which instrument they are following in a musical piece. (The video shows real-time decoding of selective attention to one of two concurrent speech streams at the RIT Immersive Audio Lab.)
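A common approach to this kind of decoding is stimulus reconstruction: a linear decoder is trained to reconstruct the attended speech envelope from multichannel EEG, and the reconstruction is then correlated with each candidate stream to decide which one the listener is attending to. The sketch below illustrates the idea on fully synthetic signals (the data, channel count, and ridge parameter are illustrative assumptions, not the lab's actual pipeline).

```python
# Illustrative sketch of attention decoding via stimulus reconstruction.
# All signals are synthetic; this is not the lab's actual method or data.
import numpy as np

rng = np.random.default_rng(0)
fs, n_sec, n_ch = 64, 60, 16          # sample rate, duration (s), EEG channels
n = fs * n_sec

def envelope():
    # Smoothed rectified noise as a stand-in for a speech envelope.
    e = np.abs(rng.standard_normal(n))
    k = np.ones(fs // 4) / (fs // 4)  # ~250 ms moving average
    return np.convolve(e, k, mode="same")

env_a, env_b = envelope(), envelope()  # two competing speech streams

# Synthetic EEG: each channel mixes the *attended* envelope (stream A) + noise.
mix = rng.standard_normal(n_ch)
eeg = np.outer(env_a, mix) + 0.5 * rng.standard_normal((n, n_ch))

# Train a linear decoder (ridge regression) mapping EEG -> attended envelope
# on the first half of the data; decode on the second half.
half = n // 2
X_tr, X_te = eeg[:half], eeg[half:]
lam = 1e-3
w = np.linalg.solve(X_tr.T @ X_tr + lam * np.eye(n_ch), X_tr.T @ env_a[:half])
recon = X_te @ w

# Attention decision: correlate the reconstruction with each candidate stream.
r_a = np.corrcoef(recon, env_a[half:])[0, 1]
r_b = np.corrcoef(recon, env_b[half:])[0, 1]
attended = "A" if r_a > r_b else "B"
print(attended, round(r_a, 2), round(r_b, 2))
```

Real-time systems apply the same decision on short sliding windows, trading decoding accuracy against latency.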

Gamified Speech Perception Training in Noise

Developing algorithms to enhance speech intelligibility in noisy environments through gamified training sessions based on neural attention mechanisms. (The video shows real-time gamified selective attention training designed to enhance speech perception in noise at the RIT Immersive Audio Lab.)

Neuro-Steered Hearing Aids

Creating smart hearing aids that adapt to the user’s attentional state and acoustic environment. (The figure shows an example block diagram of neuro-steered hearing aids (https://cocoha.org/the-cocoha-project/); our own approaches and results are forthcoming.)
