Abstract
Ongoing electroencephalography (EEG) signals are recorded as a mixture of stimulus-elicited EEG, spontaneous EEG, and noise, which poses a major challenge to current data analysis techniques, especially when different groups of participants are expected to share common or highly correlated brain activities alongside individual dynamics. In this study, we propose a data-driven shared and unshared feature extraction framework based on nonnegative and coupled tensor factorization, which aims to conduct group-level analysis of EEG signals recorded from major depressive disorder (MDD) patients and healthy controls (HC) during free music listening. Constrained tensor factorization not only preserves the multilinear structure of the data but also models both the components shared across groups and the components specific to each group. The proposed framework, combined with music information retrieval, correlation analysis, and hierarchical clustering, enabled the simultaneous extraction of spatio-temporal-spectral feature patterns shared between the MDD and HC groups and patterns specific to each group. We obtained two feature patterns shared between the MDD and HC groups, and three individual feature patterns in total from the two groups. The results show that the MDD and HC groups exhibited similar brain dynamics while listening to music, but that MDD patients also showed changes in brain oscillatory network characteristics during music perception. These changes may provide a basis for the clinical diagnosis and treatment of MDD.
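To make the group-level decomposition idea concrete, the sketch below illustrates a minimal nonnegative CP factorization of two group EEG tensors using TensorLy; it is not the authors' coupled pipeline. The tensor shapes, rank, stacking along the subject mode as a stand-in for coupling, and the loading-similarity threshold are all illustrative assumptions.

```python
# Minimal illustrative sketch (not the paper's implementation): nonnegative CP
# decomposition of stacked group EEG tensors with TensorLy, followed by a simple
# heuristic to flag components whose subject loadings look similar across groups.
import numpy as np
import tensorly as tl
from tensorly.decomposition import non_negative_parafac

rng = np.random.default_rng(0)

# Hypothetical group tensors: subjects x channels x frequency bins (toy data)
mdd = np.abs(rng.standard_normal((20, 64, 30)))  # MDD group
hc = np.abs(rng.standard_normal((20, 64, 30)))   # HC group

# Stack groups along the subject mode so channel and frequency factors are
# estimated jointly (a crude stand-in for a true coupled factorization).
joint = tl.tensor(np.concatenate([mdd, hc], axis=0))

rank = 4
weights, factors = non_negative_parafac(joint, rank=rank, n_iter_max=200, tol=1e-7)
subject_f, channel_f, freq_f = factors  # nonnegative factor matrices

# Compare mean subject loadings per component between groups; components with
# comparable loadings act as "shared" patterns, the rest as group-specific ones
# (the 0.8 similarity threshold is arbitrary, for illustration only).
n_mdd = mdd.shape[0]
mdd_load = subject_f[:n_mdd].mean(axis=0)
hc_load = subject_f[n_mdd:].mean(axis=0)
for r in range(rank):
    ratio = min(mdd_load[r], hc_load[r]) / max(mdd_load[r], hc_load[r])
    label = "shared" if ratio > 0.8 else "group-specific"
    print(f"component {r}: MDD={mdd_load[r]:.3f}, HC={hc_load[r]:.3f} -> {label}")
```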