Journal Articles
Publisher: Journals Gateway
Network Neuroscience 1–12.
Published: 18 November 2024
Abstract
Understanding the complex neural mechanisms underlying speech and music perception remains a multifaceted challenge. In this study, we investigated neural dynamics using human intracranial recordings. Employing a novel approach based on low-dimensional reduction techniques, the Manifold Density Flow (MDF), we quantified the complexity of brain dynamics during naturalistic speech and music listening and during resting state. Our results reveal higher complexity in the patterns of interdependence between brain regions during speech and music listening than during rest, suggesting that the cognitive demands of listening drive brain dynamics toward states not observed at rest. Moreover, speech listening shows higher complexity than music listening, highlighting the nuanced differences in cognitive demands between these two auditory domains. Additionally, we validated the efficacy of the MDF method on a toy model and compared its ability to capture the complexity of task-induced brain dynamics with that of another established technique in the literature. Overall, our findings provide a new method to quantify the complexity of brain activity by studying its temporal evolution on a low-dimensional manifold, revealing insights into speech and music perception that are invisible to traditional methodologies.
Author Summary
Understanding the complex neural mechanisms underlying speech and music perception remains a challenging task. In this study, we used human intracranial recordings to investigate brain dynamics while participants listened to naturalistic speech and music, as well as during resting state. Using a novel approach called Manifold Density Flow (MDF), which applies manifold learning techniques, we quantified the complexity of brain dynamics across these conditions. Our findings reveal that listening to speech and music produces more complex interdependencies between brain regions than rest, with speech showing higher complexity than music. This suggests that the cognitive demands of these auditory tasks shape brain dynamics in distinct ways. Additionally, we validated MDF's ability to capture complex brain dynamics through experiments on a toy model and by comparing it with another established technique. Overall, our results offer a new way to analyze brain activity by tracking its temporal evolution in a simplified low-dimensional space, highlighting insights into speech and music perception that may be missed by traditional methods.
Includes: Supplementary data
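The abstract describes the general idea of tracking brain activity as a trajectory on a low-dimensional manifold and quantifying how the occupied region of that space changes over time. The sketch below is a loose, hypothetical illustration of that idea only, not the authors' MDF method: the surrogate data, the use of PCA as the embedding, the sliding-window kernel density estimate, and the total-variation comparison between windows are all assumptions introduced here for illustration.

```python
# Hedged sketch: NOT the authors' MDF implementation, only an illustration of the
# general idea in the abstract -- embed a high-dimensional neural time series in a
# low-dimensional space and track how the density of visited states evolves.
# Every parameter and method choice below is an assumption.
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 64))       # surrogate data: 2000 time samples x 64 channels

# Step 1: low-dimensional embedding (PCA stands in for any manifold-learning method).
Z = PCA(n_components=2).fit_transform(X)  # (2000, 2) trajectory in the embedded space

# Step 2: sliding-window density estimates of where the trajectory lives.
win, step = 400, 200
grid = np.mgrid[-4:4:40j, -4:4:40j].reshape(2, -1)   # evaluation grid for the KDE
densities = []
for start in range(0, len(Z) - win + 1, step):
    kde = gaussian_kde(Z[start:start + win].T)
    d = kde(grid)
    densities.append(d / d.sum())

# Step 3: a crude "flow" score -- how much the occupied region changes between
# consecutive windows (total-variation distance between normalized density maps).
flow = [0.5 * np.abs(a - b).sum() for a, b in zip(densities[:-1], densities[1:])]
print("mean density change between consecutive windows:", np.mean(flow))
```

Under this reading, richer dynamics (e.g., task listening versus rest in the study) would visit a wider or faster-changing set of states, which a window-to-window density comparison of this kind would register as larger change scores; the actual MDF quantification in the article may differ substantially.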