Shared and Unshared Feature Extraction in Major Depression During Music Listening Using Constrained Tensor Factorization
Xiulin Wang, Wenya Liu, Xiaoyu Wang, Zhen Mu, Jing Xu, Yi Chang, Qing Zhang, Jianlin Wu & Fengyu Cong
Frontiers in Human Neuroscience 15 (2021)
Abstract
Ongoing electroencephalography (EEG) signals are recorded as a mixture of stimulus-elicited EEG, spontaneous EEG, and noise, which poses a major challenge to current data analysis techniques, especially when different groups of participants are expected to share common or highly correlated brain activities alongside some individual dynamics. In this study, we propose a data-driven shared and unshared feature extraction framework based on nonnegative and coupled tensor factorization, aimed at group-level analysis of EEG signals from major depressive disorder (MDD) patients and healthy controls (HC) during free music listening. Constrained tensor factorization not only preserves the multilinear structure of the data but also accounts for the components that are common to and distinct between the datasets. The proposed framework, combined with music information retrieval, correlation analysis, and hierarchical clustering, enabled the simultaneous extraction of shared and unshared spatio-temporal-spectral feature patterns between and within the MDD and HC groups. We obtained two feature patterns shared between the MDD and HC groups, and three individual feature patterns in total from the two groups. The results show that the MDD and HC groups exhibited similar brain dynamics when listening to music, but that MDD patients also showed changes in brain oscillatory network characteristics during music perception. These changes may provide a basis for the clinical diagnosis and treatment of MDD.
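The paper's constrained, coupled factorization is not reproduced here, but the nonnegative CP (CANDECOMP/PARAFAC) building block that such frameworks constrain can be sketched in plain NumPy. Everything below is illustrative and not taken from the paper: the array shapes (a hypothetical channel × frequency × time tensor), the Lee-Seung-style multiplicative-update solver, and the synthetic data are all assumptions for demonstration only.

```python
import numpy as np

def unfold(T, mode):
    """Mode-n unfolding of a 3-way tensor (rows = mode-n fibers, C-order columns)."""
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(A, B):
    """Column-wise Kronecker (Khatri-Rao) product of two factor matrices."""
    return (A[:, None, :] * B[None, :, :]).reshape(-1, A.shape[1])

def nonneg_cp(T, rank, n_iter=300, eps=1e-9, seed=0):
    """Rank-R nonnegative CP decomposition of a 3-way tensor via
    multiplicative updates. Illustrative only -- not the authors' coupled algorithm."""
    rng = np.random.default_rng(seed)
    # Random strictly positive initialization keeps the updates nonnegative.
    factors = [rng.random((s, rank)) + 0.1 for s in T.shape]
    for _ in range(n_iter):
        for mode in range(3):
            others = [factors[m] for m in range(3) if m != mode]
            kr = khatri_rao(others[0], others[1])  # order matches the unfolding
            # Gram of the Khatri-Rao product, computed cheaply as a Hadamard product.
            gram = (others[0].T @ others[0]) * (others[1].T @ others[1])
            num = unfold(T, mode) @ kr
            factors[mode] *= num / (factors[mode] @ gram + eps)
    return factors

# Synthetic nonnegative rank-2 tensor standing in for channel x frequency x time data.
rng = np.random.default_rng(1)
A, B, C = (rng.random((n, 2)) for n in (8, 6, 10))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

factors = nonneg_cp(T, rank=2)
T_hat = np.einsum('ir,jr,kr->ijk', *factors)
rel_err = np.linalg.norm(T - T_hat) / np.linalg.norm(T)
```

The recovered factor matrices play the roles of spatial, spectral, and temporal signatures. In a coupled setting such as the paper describes, one would jointly factorize the MDD and HC tensors while constraining some factor columns to be equal across groups (the shared patterns) and leaving the rest free (the unshared ones); that coupling logic is omitted here.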
Similar books and articles
Illusions of intentionality, shared and unshared. Robert R. Provine - 2005 - Behavioral and Brain Sciences 28 (5):713-714.
Transformation of Nonmultiple Cluster Music Cyclic Shift Topology to Music Performance Style. Jing Li - 2021 - Complexity 2021:1-11.
Matching Subsequence Music Retrieval in a Software Integration Environment. Zhencong Li, Qin Yao & Wanzhi Ma - 2021 - Complexity 2021:1-12.
Bridge of Waves: What Music is and How Listening to It Changes the World. W. A. Mathieu - 2010 - Shambhala.
Music Listening in Times of COVID-19 Outbreak: A Brazilian Study. Fabiana Silva Ribeiro, João Paulo Araújo Lessa, Guilherme Delmolin & Flávia H. Santos - 2021 - Frontiers in Psychology 12.
Modeling Listeners' Emotional Response to Music. Tuomas Eerola - 2012 - Topics in Cognitive Science 4 (4):607-624.
Listening Through the Noise: The Aesthetics of Experimental Electronic Music. Joanna Demers - 2010 - Oup Usa.
Rhythm and Existence. Marcia Sá Cavalcante Schuback - 2018 - Research in Phenomenology 48 (3):318-330.
Beyond Musical Metaphysics: A Philosophical Account of Listening to Music. Paskalina Bourbon - 2018 - Revista Portuguesa de Filosofia 74 (4):1377-1398.
Music Therapy for Depression Enhanced With Listening Homework and Slow Paced Breathing: A Randomised Controlled Trial. Jaakko Erkkilä, Olivier Brabant, Martin Hartmann, Anastasios Mavrolampados, Esa Ala-Ruona, Nerdinga Snape, Suvi Saarikallio & Christian Gold - 2021 - Frontiers in Psychology 12.
Moved by Sad Music: Pleasure and Emotion in Sad Music Listening Experiences. Matthew Dunaway - unknown
Autonomous perceptual feature extraction in a topology-constrained architecture. Sylvain Chartier & Gyslain Giguère - 2008 - In B. C. Love, K. McRae & V. M. Sloutsky (eds.), Proceedings of the 30th Annual Conference of the Cognitive Science Society. Cognitive Science Society. pp. 1868-1873.
A Bad Case Of The Flu?: The Comparative Phenomenology of Depression and Somatic Illness. Matthew Ratcliffe - 2013 - Journal of Consciousness Studies 20 (7-8):198-218.