Smartphone-based ear-electroencephalography to study sound processing in everyday life

The European Journal of Neuroscience (2023)

Abstract
In everyday life, people differ in their sound perception and thus sound processing. Some people may be distracted by construction noise, while others do not even notice it. With smartphone-based mobile ear-electroencephalography (ear-EEG), we can measure and quantify sound processing in everyday life by analysing presented sounds and also naturally occurring ones. Twenty-four participants completed four controlled conditions in the lab (1 h) and one condition in the office (3 h). All conditions used the same paired-click stimuli. In the lab, participants listened to click tones under four different instructions: no task towards the sounds, reading a newspaper article, listening to an audio article or counting a rare deviant sound. In the office recording, participants followed daily activities while they were sporadically presented with clicks, without any further instruction. In the beyond-the-lab condition, in addition to the presented sounds, environmental sounds were recorded as acoustic features (i.e., loudness, power spectral density and sound onsets). We found task-dependent differences in the auditory event-related potentials (ERPs) to the presented click sounds in all lab conditions, which underline that neural processes related to auditory attention can be differentiated with ear-EEG. In the beyond-the-lab condition, we found ERPs comparable to some of the lab conditions. The N1 amplitude to the click sounds beyond the lab was dependent on the background noise, probably due to energetic masking. Contrary to our expectation, we did not find a clear ERP in response to the environmental sounds. Overall, we showed that smartphone-based ear-EEG can be used to study sound processing of well-defined stimuli in everyday life.

Participants were equipped with cEEGrids (ear-EEG electrodes), a neck speaker (to present sounds) and microphones (to record non-experimental sounds). They completed conditions beyond the lab and in the lab. We found the expected ERPs to presented sounds, but not to non-experimental, natural sounds.
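As a rough illustration of the kind of acoustic features named in the abstract (loudness, power spectral density and sound onsets), the sketch below extracts them from a single microphone recording with librosa and SciPy. This is not the authors' pipeline; the file name, library choice and all parameter values are assumptions made for the example.

```python
# Illustrative sketch only: loudness proxy, PSD and onset times from one
# ambient-sound recording. File name and parameters are assumptions, not
# taken from the paper.
import numpy as np
import librosa
import scipy.signal

# Load the microphone recording (mono, native sampling rate).
y, sr = librosa.load("office_recording.wav", sr=None, mono=True)

# Loudness proxy: short-time RMS energy per frame.
rms = librosa.feature.rms(y=y, frame_length=2048, hop_length=512)[0]

# Power spectral density via Welch's method.
freqs, psd = scipy.signal.welch(y, fs=sr, nperseg=4096)

# Sound onsets, returned in seconds.
onset_times = librosa.onset.onset_detect(y=y, sr=sr, units="time")

print(f"frames: {rms.size}, mean RMS: {rms.mean():.4f}")
print(f"PSD bins: {freqs.size}, onsets detected: {onset_times.size}")
```

Such frame-wise features could then be aligned with the EEG time axis to relate background noise levels to the click-evoked responses, as the abstract describes for the N1 amplitude.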
Keywords
auditory processing, beyond-the-lab experimentation, cEEGrid, ear-EEG, ERPs, smartphone-based