Integrating real-time fNIRS with biofeedback to promote fluency in people who stutter

Liam Barrett, one of the Win a Brite winners, has been busy collecting data for his research project on using biofeedback and fNIRS to promote fluency in people who stutter. He was kind enough to share his setup and progress with us, which you can read all about in this blog post!


At the Speech Lab here at University College London, we are in the midst of data collection with the wearable fNIRS system, Brite, from Artinis. We’ve been investigating the haemodynamic biomarkers of stuttering along with cortical responses to altered feedback during speech.

We’ve tried a range of different optode templates, such as four 2x2 grids plus two short-separation channels (SSCs) over the bilateral inferior frontal gyri [the IFG is involved in language production] and bilateral postcentral gyri [which provide somatosensory feedback, important for speech motor control] (figures 1 & 2).
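To make the montage idea concrete, here is a minimal sketch of how the channels of one such grid could be enumerated. This is purely illustrative: it assumes a 2x2 grid made of two sources and two detectors where every source-detector pair forms a long channel, plus one dedicated short-distance detector for the SSC. The actual Brite template geometry and labels are not specified here.

```python
# Hypothetical sketch: enumerating fNIRS channels for one 2x2 optode grid.
# Assumes 2 sources (S1, S2) and 2 detectors (D1, D2) at the grid corners,
# with every source-detector pair in range forming a long channel.
from itertools import product

sources = ["S1", "S2"]
detectors = ["D1", "D2"]

# Long channels: every source paired with every detector in the grid
long_channels = [f"{s}-{d}" for s, d in product(sources, detectors)]

# One short-separation channel (SSC), pairing a source with a dedicated
# short-distance detector to capture superficial (scalp) haemodynamics,
# which can later be regressed out of the long channels.
ssc = ["S1-Dshort"]

channels = long_channels + ssc
print(channels)  # ['S1-D1', 'S1-D2', 'S2-D1', 'S2-D2', 'S1-Dshort']
```

In practice the SSC signal is used as a nuisance regressor, so that scalp blood flow does not masquerade as cortical activity in the long channels.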

We also got optode digitization up and running (figure 3), so that we can estimate where the fNIRS signal is originating from for each participant!

Can’t wait to plug the data into our machine learning models to try to decode different aspects of speech control from the haemodynamic signal!
— Liam Barrett

About the project

In this video, Liam Barrett, one of the Win a Brite winners, elaborates on his research and explains how he will incorporate functional near-infrared spectroscopy (fNIRS) to promote fluency in people who stutter.

