Front Psychol. 2018; 9: 52.
Published online 2018 Feb 6. https://doi.org/10.3389/fpsyg.2018.00052
PMCID: PMC5807922
PMID: 29467691

Neural Correlates of Facial Mimicry: Simultaneous Measurements of EMG and BOLD Responses during Perception of Dynamic Compared to Static Facial Expressions

Abstract

Facial mimicry (FM) is an automatic response that imitates the facial expressions of others. However, the neural correlates of the phenomenon are not yet well established. We investigated this issue using simultaneously recorded EMG and BOLD signals during the perception of dynamic and static emotional facial expressions of happiness and anger. During display presentations, BOLD signals and zygomaticus major (ZM), corrugator supercilii (CS) and orbicularis oculi (OO) EMG responses were recorded simultaneously from 46 healthy individuals. Subjects reacted spontaneously to happy facial expressions with increased EMG activity in the ZM and OO muscles and decreased CS activity, which was interpreted as FM. Facial muscle responses correlated with BOLD activity in regions associated with motor simulation of facial expressions [i.e., the inferior frontal gyrus, part of the classical Mirror Neuron System (MNS)]. Further, we also found correlations for regions associated with emotional processing (i.e., the insula, part of the extended MNS). It is concluded that FM involves both motor and emotional brain structures, especially during the perception of natural emotional expressions.

Keywords: facial mimicry, EMG, fMRI, mirror neuron system, emotional expressions, dynamic, happiness, anger

Introduction

Facial mimicry (FM) is an unconscious, unintentional and automatic response to the facial expressions of others. Numerous studies have shown that observing the emotional states of others leads to congruent facial muscle activity. For example, observing angry facial expressions can result in enhanced activity in the viewer's muscle responsible for frowning (CS), while viewing happy images leads to increased activity in the facial muscle involved in smiling (ZM) and decreased activity of the CS (Hess et al., 1998; Dimberg and Petterson, 2000). However, it has recently been suggested that FM may not be an exclusively automatic reaction but rather a multifactorial response dependent on properties such as stimulus modality (e.g., static or dynamic) or interpersonal characteristics (e.g., susceptibility to emotional contagion) (for a review, see Seibt et al., 2015).

Two main psychological approaches attempt to explain the mechanisms of FM. One is the perception-behavior link model, which assumes that the perception and execution of a specific action overlap (Chartrand and Bargh, 1999). According to this theory, the mere perception of the emotional facial expressions of others automatically evokes the same behavior in the perceiver, and the facial expression is copied spontaneously (Chartrand and Bargh, 1999; Dimberg et al., 2000). This notion is supported by evidence from the neuroimaging literature showing that both the perception and execution of emotional facial expressions engage overlapping brain structures, such as the inferior frontal gyrus (IFG) and inferior parietal lobule (IPL) (Carr et al., 2003; Rizzolatti and Craighero, 2004; Iacoboni and Dapretto, 2006), regions that constitute the classical mirror neuron system (MNS). Empirical support for this assumption can be found in a study of patients with Parkinson's disease, who demonstrated difficulties with both the execution of emotional expressions and the identification of emotions (Livingstone et al., 2016). The other approach describes FM as a consequence of contagion from the emotional states of others (Hatfield et al., 1993; Bastiaansen et al., 2009). In other words, the observation of others' emotional facial expressions triggers corresponding emotions in the observer. It is suggested that contagion occurs through direct activation of the neural substrate involved in experiencing the observed emotion (Wicker et al., 2003). These emotion-related brain structures, i.e., the insula and amygdala, among others belonging to the extended MNS, are activated during both the observation and execution of emotional facial expressions (Carr et al., 2003; van der Gaag et al., 2007; Kircher et al., 2013).

It is worth noting that most of what we know about the neural correlates of automatic FM has been derived from functional neuroimaging studies during the passive viewing or the imitation of emotional facial displays presented to subjects. Direct investigation of the neural correlates of FM, such as simultaneous measurement of BOLD responses (using functional magnetic resonance imaging, fMRI) and facial muscular reactions (using electromyography, EMG), may contribute to an improved understanding of the neural basis of FM. To date, only one study (Likowski et al., 2012) has examined the brain structures involved in the occurrence of automatic facial reactions by simultaneously measuring BOLD and facial EMG signals in an MRI scanner. These investigators found that automatic and spontaneous FM of happiness, sadness, and anger displays led to activation of a prominent part of the classic MNS (i.e., the IFG), as well as areas responsible for emotional processing (i.e., the insula). They concluded that the perception of emotional facial expressions activated a variety of structures presumed to belong to the classic and extended MNS, but only a small number were correlated with the magnitude of FM. It is currently unknown whether the perception of real, dynamic emotional facial expressions, rather than the static avatars used in that study (Likowski et al., 2012), would reveal more associations between the strength of FM reactions and regional brain activation. Importantly, recent neuroimaging studies (Trautmann et al., 2009; Arsalidou et al., 2011; Kessler et al., 2011) have found that the perception of dynamic emotional stimuli, in comparison to static stimuli, engages a widespread activation pattern that involves parts of the MNS, including the IFG (Sato et al., 2004, 2015; Kessler et al., 2011) and other emotion-related structures like the amygdala and insula (Kilts et al., 2003; Trautmann et al., 2009). Indeed, it has been demonstrated that dynamic emotional facial expressions can improve emotion recognition of subtle facial expressions (Ambadar et al., 2005; Trautmann et al., 2009), enhance emotional arousal (Sato and Yoshikawa, 2007), and elicit stronger FM than static presentations (Weyers et al., 2006; Sato et al., 2008; Rymarczyk et al., 2011). In light of these studies, determining which brain structures are involved in automatic, spontaneous FM could be addressed, at least in part, by simultaneous measurement of facial muscular activity (EMG) and BOLD responses (fMRI) during passive perception of real, dynamic emotional facial expressions.

In the present study, we simultaneously recorded EMG and BOLD signals during the perception of realistic dynamic and static emotional facial expressions. We measured facial EMG responses from three muscles, the ZM, CS, and OO, while participants passively viewed happy, angry, and neutral displays. Following earlier research, we measured facial muscle activity over the cheek region (ZM, involved in smiling) and over the brow region (CS, responsible for frowning) (e.g., Andréasson and Dimberg, 2008). Activity over the eye region (OO), typically linked with true joy and smile expressions (Hess and Blairy, 2001; Hess and Bourgeois, 2010; Korb et al., 2014), was also measured. It has been proposed that contraction of the OO transforms a non-Duchenne into a Duchenne smile (Ekman and Rosenberg, 2012). Other researchers have suggested that contractions of the OO may additionally be indicative of the negative signal value of anger configurations, or of discomfort-pain or distress-cry situations (Russell and Fernandez-Dols, 1997). Based on previous studies (van der Gaag et al., 2007; Jabbi and Keysers, 2008; Likowski et al., 2012), we anticipated that motor and emotional brain structures would be responsible for differences in automatic FM during perception of dynamic compared to static displays. We examined which of the classic and extended MNS regions showed a relationship with the strength of facial reactions. Furthermore, since dynamic facial expressions constitute a more powerful medium for emotional communication than static presentations, we anticipated that regional brain activation and muscle responses would be more pronounced for dynamic emotional facial expressions. We predicted that presentations of dynamic happy facial expressions would engage brain areas associated with the representation of pleasant feelings and reward (such as the basal ganglia, in particular the nucleus accumbens) and would correlate with increased activity of the ZM and OO muscles. For dynamic angry facial expressions, we predicted that co-activation of limbic structures (i.e., the amygdala), proposed to be involved in the automatic detection of evolutionary threats (van der Zwaag et al., 2012), would be associated with CS activity.

Methods

Subjects

Forty-six healthy individuals (21 females, 26 males; mean age = 23.7 ± 2.5 years) participated in this study. The subjects had normal or corrected-to-normal eyesight, and none of them reported neurological diseases. The study was carried out in accordance with the recommendations of the Ethics Committee at the University of Social Sciences and Humanities, which approved the protocol. All subjects gave written informed consent in accordance with the Declaration of Helsinki after the experimental procedures had been clearly explained to them. After the scanning session, subjects were informed about the aim of the study.

Facial stimuli and apparatus

We used videos and static pictures illustrating forward-facing facial expressions of happiness and anger taken from the Amsterdam Dynamic Facial Expression Set (van der Schalk et al., 2011). Additionally, we included neutral conditions (no visible emotional facial expression) presented as static and dynamic displays. Clips of three female and three male actors were used (F01, F03, F09, M03, M07, M11). Each actor presented happy, angry, and neutral facial expressions, and participants observed only one type of expression at a time (as a photo or a video). In the neutral dynamic condition, motion was visible because the actors either closed their eyes or slightly changed the position of their head. Each stimulus in the neutral static condition presented one frame from the corresponding neutral dynamic video clip. Stimuli were 576 pixels high and 720 pixels wide and were presented on a gray background. All procedures were controlled using Presentation® software running on a computer with the Microsoft Windows operating system, and stimuli were displayed on a 32-inch NNL LCD MRI-compatible monitor (1,920 × 1,080 pixel resolution; 32-bit color depth; 60 Hz refresh rate) at a viewing distance of approximately 140 cm.

EMG acquisition

Data were recorded using an MRI-compatible BrainCap (Brain Products) consisting of three bipolar electrode pairs and one reference electrode, each electrode 2 mm in diameter and filled with electrode paste. The electrodes were positioned in pairs over three muscles — the CS, ZM and OO — on the left side of the face (Cacioppo et al., 1986; Fridlund and Cacioppo, 1986). The reference electrode was attached to the forehead. Before the electrodes were attached, the skin was cleaned with alcohol and a thin coating of electrode paste was applied; this procedure was repeated until electrode impedance was reduced to 5 kΩ or less. The EMG signals were recorded using a BrainAmp MR plus ExG amplifier and BrainVision Recorder. The hardware low-pass filtered the signal at 250 Hz. Finally, the data were digitized at a sampling rate of 5 kHz and stored on a computer running MS Windows 7 for offline analysis.

Image acquisition

MRI data were acquired on a Siemens Trio 3 T MR scanner equipped with a 12-channel phased-array head coil. Functional MRI images were acquired using a T2*-weighted EPI gradient-echo pulse sequence with the following parameters: TR = 2,000 ms; TE = 25 ms; flip angle = 90°; FOV = 250 mm; matrix = 64 × 64; voxel size = 3.5 × 3.5 × 3.5 mm; interleaved even acquisition; slice thickness = 3.5 mm; 39 slices.

Procedure

Each volunteer was introduced to the experimental procedure and signed a consent form. To conceal the true purpose of the facial electromyography recordings, participants were told that sweat gland activity was being recorded while they watched the faces of actors selected for commercials by an external marketing company. Following the attachment of the electrodes of the FaceEMGCap-MR, participants were reminded to carefully observe the actors presented on the screen and were positioned in the scanner. The subjects were verbally encouraged to feel comfortable and behave naturally.

The scanning session started with a reminder of the subject's task. The session consisted of 72 trials and lasted approximately 15 min. Each trial started with a white fixation cross, 80 pixels in diameter, visible for 2 s in the center of the screen. Next, one of the stimuli with a facial expression (happy, angry or neutral, each presented as a static image or a dynamic video clip) was presented for 6 s. The expression was followed by a blank gray screen presented for 2.75–5.25 s (see Figure 1). All stimuli were presented in the center of the screen. Each stimulus was repeated once, for a total of 6 presentations within a type of expression (e.g., 6 dynamic presentations of happiness). Stimuli appeared in an event-related manner, pseudo-randomized trial by trial with the following constraints: no facial expression from the same actor, and no more than 2 actors of the same sex or the same emotion, could be presented consecutively (a code sketch of these constraints follows Figure 1). In total, 6 randomized event-related sequences with these constraints were balanced between subjects.

Figure 1. Scheme of the procedure used in the study.
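As an illustration of the randomization constraints described above, the following hypothetical Python helper (not the authors' Presentation® script) builds one such trial order; it assumes that the 36 unique stimuli (6 actors × 3 expressions × 2 modalities) each appear twice to fill the 72 trials.

```python
# Hypothetical generator for a constrained pseudo-random trial order.
import itertools
import random

ACTORS = ["F01", "F03", "F09", "M03", "M07", "M11"]   # sex = first letter
EMOTIONS = ["happy", "angry", "neutral"]
MODALITIES = ["dynamic", "static"]

def ok_to_append(seq, trial):
    """Constraints: no same actor back-to-back; no more than 2 consecutive
    trials of the same sex or of the same emotion."""
    actor, emotion, _ = trial
    if seq and seq[-1][0] == actor:
        return False
    if len(seq) >= 2:
        if all(t[0][0] == actor[0] for t in seq[-2:]):  # would be 3rd same sex
            return False
        if all(t[1] == emotion for t in seq[-2:]):      # would be 3rd same emotion
            return False
    return True

def make_order(rng):
    pool = list(itertools.product(ACTORS, EMOTIONS, MODALITIES)) * 2  # 72 trials
    while True:                                   # restart on dead ends
        remaining, seq = pool[:], []
        rng.shuffle(remaining)
        while remaining:
            candidates = [t for t in remaining if ok_to_append(seq, t)]
            if not candidates:
                break                             # dead end: start over
            pick = rng.choice(candidates)
            seq.append(pick)
            remaining.remove(pick)
        if len(seq) == len(pool):
            return seq

order = make_order(random.Random(0))
print(len(order), order[:3])
```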

Data analysis

EMG analysis

Pre-processing was carried out using BrainVision Analyzer 2 (version 2.1.0.327). First, EPI gradient-echo pulse artifacts were removed using the average artifact subtraction (AAS) method (Allen et al., 2000) implemented in the Analyzer, based on a sliding average over 11 consecutive functional volumes marked in the data logs. Successful application of the AAS method was possible thanks to synchronization hardware and markers created from the triggers received from the MR system. Next, the data were filtered with 30 Hz high-pass and 500 Hz low-pass filters. After rectification and integration over 125 ms, the signal was resampled to 10 Hz. EMG-related artifacts were detected in two ways. First, when the activity of a single muscle exceeded 8 μV at baseline (during visibility of the fixation cross) (Weyers et al., 2006; Likowski et al., 2008, 2011), the trial was classified as an artifact and excluded from further analysis (M = 3.8 trials excluded per participant). All remaining trials were blind-coded and visually checked for artifacts. Trials were then baseline-corrected, such that the EMG response was measured as the difference in averaged signal activity between the stimulus period (6 s) and the baseline period (2 s). Finally, the signal was averaged for each condition and each participant and imported into SPSS 21 for statistical analysis.
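A minimal numpy/scipy sketch of this pre-processing chain is given below, assuming gradient artifacts have already been removed; the published analysis used BrainVision Analyzer 2, so the specific filter design (a 4th-order zero-phase Butterworth band-pass) and the function names are illustrative assumptions.

```python
# Illustrative re-implementation of the EMG pre-processing chain described
# above (assumption: not the authors' BrainVision Analyzer pipeline).
import numpy as np
from scipy import signal

FS = 5000        # acquisition rate, Hz
TARGET_FS = 10   # rate after resampling, Hz

def preprocess_emg(raw):
    """raw: 1-D EMG trace in microvolts, gradient artifacts already removed (AAS)."""
    # 30-500 Hz band-pass (filter order/type are assumptions)
    sos = signal.butter(4, [30, 500], btype="bandpass", fs=FS, output="sos")
    filtered = signal.sosfiltfilt(sos, raw)
    rectified = np.abs(filtered)
    # Integration: 125-ms moving average (625 samples at 5 kHz)
    win = int(0.125 * FS)
    integrated = np.convolve(rectified, np.ones(win) / win, mode="same")
    # Resample to 10 Hz by averaging consecutive 500-sample blocks
    step = FS // TARGET_FS
    n = len(integrated) // step * step
    return integrated[:n].reshape(-1, step).mean(axis=1)

def is_artifact(trial_10hz, fix_s=2.0, thresh_uv=8.0, fs=TARGET_FS):
    """Flag trials whose baseline (fixation) activity exceeds 8 microvolts."""
    return trial_10hz[: int(fix_s * fs)].max() > thresh_uv

def emg_response(trial_10hz, fix_s=2.0, stim_s=6.0, fs=TARGET_FS):
    """Response = mean over the 6-s stimulus minus mean over the 2-s baseline."""
    base = trial_10hz[: int(fix_s * fs)].mean()
    stim = trial_10hz[int(fix_s * fs): int((fix_s + stim_s) * fs)].mean()
    return stim - base
```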

To test differences in EMG responses, a two-way repeated-measures ANOVA with two within-subjects factors (expression: happiness, anger, neutral; stimulus modality: dynamic, static) was used. Separate ANOVAs were calculated for the responses of each muscle and reported with a Bonferroni correction. To confirm that EMG activity changed from baseline and that FM occurred, the EMG data of each significant effect were tested for a difference from zero (baseline) using one-sample, two-tailed t-tests.
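A Python stand-in for these tests is sketched below; the original analyses were run in SPSS, and statsmodels' AnovaRM does not apply the Greenhouse-Geisser correction implied by the fractional degrees of freedom reported later, so this is illustrative only. The long-format data frame `df` is hypothetical.

```python
# Sketch of the EMG statistics for one muscle (assumption: SPSS was used
# originally; this reproduces the test structure, not the exact output).
from scipy import stats
from statsmodels.stats.anova import AnovaRM

def test_muscle(df):
    """df: long-format pandas DataFrame with columns 'subject',
    'expression' (happiness/anger/neutral), 'modality' (dynamic/static)
    and 'emg' (one baseline-corrected mean response per subject and cell)."""
    # 3 (expression) x 2 (modality) repeated-measures ANOVA
    print(AnovaRM(df, depvar="emg", subject="subject",
                  within=["expression", "modality"]).fit())
    # One-sample, two-tailed t-tests against zero (= baseline)
    for (expr, mod), grp in df.groupby(["expression", "modality"]):
        t, p = stats.ttest_1samp(grp["emg"], popmean=0.0)
        print(f"{expr}/{mod}: t = {t:.3f}, p = {p:.4f}")
```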

Image processing and analysis

Image processing and analysis were carried out using SPM12 (6470) run in MATLAB 2013b (The MathWorks Inc., 2013). Functional images were motion-corrected and co-registered to the mean functional image. Structural brain images were segmented into different tissue classes — gray matter, white matter, and non-brain (cerebrospinal fluid, skull) — using the segmentation module. Next, the DARTEL algorithm was used to create a study-specific template for all participants based on the segmented structural images. The template was then affine-registered to MNI space, and the functional images were warped to this template, resliced to 2 × 2 × 2 mm isotropic voxels, and smoothed with an 8 × 8 × 8 mm full-width-at-half-maximum Gaussian kernel. Single-subject design matrices included six experimental conditions (dynamic: happiness, anger, neutral; static: happiness, anger, neutral) modeled with the standard hemodynamic response function, plus covariates produced by the Artifact Detection Toolbox (ART), including head movements and other parameters used to exclude artifactual fMRI signal. The same set of contrasts of interest was then calculated for each subject and used in group-level analyses (i.e., one-sample t-tests) for statistical region-of-interest (ROI) analysis. The analysis was performed with the MarsBar toolbox (Brett et al., 2002) separately for each ROI. Anatomical ROI masks were created with the WFU PickAtlas (Wake Forest University, 2014) (primary motor cortex, premotor cortex, IPL, BA44, BA45, amygdala, ACC, insula, caudate head, putamen, nucleus accumbens, globus pallidus) and the SPM Anatomy Toolbox (Eickhoff, 2016) [MT+/V5, primary somatosensory cortex (Areas 1, 2, 3a, 3b)]. The STS and pre-SMA ROIs were based on activation peaks from the literature (Van Overwalle, 2009) and a meta-analysis (Kohn et al., 2014), and were defined as an overlapping set of peaks with a radius of 8 mm each. The data were extracted as mean values of each ROI, and statistics of brain activity were reported with a Bonferroni correction (i.e., the p-value threshold divided by the number of ROIs).
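The ROI statistics reduce to averaging each subject's contrast image within a binary mask and then running a group-level one-sample t-test per ROI; the sketch below illustrates this outside of SPM/MarsBar, using nibabel and hypothetical file names.

```python
# Illustrative ROI analysis (assumption: the paper used MarsBar in SPM12;
# this is a generic nibabel/numpy equivalent with hypothetical file names).
import numpy as np
import nibabel as nib
from scipy import stats

def roi_mean(contrast_file, mask_file):
    """Mean contrast value within a binary anatomical mask."""
    con = nib.load(contrast_file).get_fdata()
    mask = nib.load(mask_file).get_fdata() > 0
    return float(np.nanmean(con[mask]))

def group_roi_ttests(subject_con_files, roi_masks, alpha=0.05):
    """One-sample t-test per ROI; Bonferroni threshold = alpha / n(ROIs)."""
    thresh = alpha / len(roi_masks)
    for name, mask_file in roi_masks.items():
        vals = np.array([roi_mean(f, mask_file) for f in subject_con_files])
        t, p = stats.ttest_1samp(vals, popmean=0.0)
        sig = " (significant)" if p < thresh else ""
        print(f"{name}: M = {vals.mean():.3f}, t = {t:.3f}, p = {p:.4f}{sig}")

# Usage with hypothetical paths:
# group_roi_ttests([f"con_hap_dyn_gt_stat_sub{i:02d}.nii" for i in range(1, 47)],
#                  {"Insula_L": "insula_left_mask.nii"})
```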

Correlation analysis

Pearson correlation coefficients were calculated between selected contrasts of brain activity (happiness dynamic, happiness static, anger dynamic, anger static) and the corresponding mimicry-muscle activity, in order to assess the mutual relationship between brain activity and facial muscle activity. Additionally, bias-corrected and accelerated (BCa) bootstrap 95% confidence intervals (1,000 samples) were computed for the Pearson correlation coefficients.

Each brain ROI was represented by a single mean value (across all voxels of the anatomical atlas region in each hemisphere), computed per participant and per ROI. Muscle activity was defined as the average of the baseline-corrected EMG trials of the same muscle and condition type. Correlations were thus computed for pairs consisting of a muscle response (in a specific condition) and the corresponding ROI activity, e.g., happiness_static_ZM with happiness_static_insulaRight.
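A sketch of one such muscle-brain correlation with a BCa bootstrap confidence interval is given below; the original computation was done in SPSS, and the input arrays here are hypothetical per-subject values for one condition.

```python
# Illustrative muscle-brain correlation with a BCa bootstrap CI
# (assumption: stands in for the SPSS analysis described above).
import numpy as np
from scipy import stats

def muscle_brain_correlation(emg, roi, n_resamples=1000, seed=0):
    """emg, roi: per-subject values for one condition, e.g.
    happiness_static_ZM and happiness_static_insulaRight."""
    r, p = stats.pearsonr(emg, roi)
    boot = stats.bootstrap(
        (emg, roi),
        lambda x, y: stats.pearsonr(x, y)[0],
        paired=True, vectorized=False,
        n_resamples=n_resamples, method="BCa",
        random_state=np.random.default_rng(seed))
    return r, p, boot.confidence_interval

# Example with simulated data for 46 subjects:
rng = np.random.default_rng(1)
emg = rng.normal(size=46)
roi = 0.4 * emg + rng.normal(size=46)
print(muscle_brain_correlation(emg, roi))
```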

Results

EMG measures

M. corrugator supercilii

ANOVA showed a significant main effect of expression [F(1.722, 65.422) = 30.676, p < 0.001, η2 = 0.447], indicating that activity of the CS for happiness (M = −0.366, SE = 0.072) was lower than for anger [M = 0.168, SE = 0.067; t(36) = 6.271, p < 0.001] and neutral expressions [M = 0.067, SE = 0.030; t(36) = 6.186, p < 0.001]. The main effect of modality [F(1, 38) = 4.020, p = 0.052, η2 = 0.096] approached significance, with CS activity generally higher for static (M = 0.007, SE = 0.047) than dynamic (M = −0.094, SE = 0.050) facial expressions. A significant expression × modality interaction [F(1.389, 52.774) = 3.964, p = 0.039, η2 = 0.094] revealed that CS activity for dynamic and static happiness was lower than for the corresponding angry [t(33)dynamic = 5.044, p < 0.001; t(33)static = 5.219, p < 0.001] and neutral [t(33)dynamic = 4.815, p < 0.001; t(33)static = 3.959, p < 0.01] facial expressions (see Figure 2). The decrease in CS activity was greater for the dynamic than the static happiness condition [t(33) = 2.269, p = 0.029].

Figure 2. Mean (±SE) EMG activity changes and corresponding statistics for corrugator supercilii during presentation conditions. Asterisks with lines beneath indicate significant differences between conditions (simple effects) in EMG responses: *p < 0.05, ***p < 0.001. Separate asterisks indicate significant differences from baseline EMG responses: +p < 0.1, *p < 0.05, **p < 0.01, ***p < 0.001.

One-sample t-tests revealed significantly lower CS activity for dynamic [t(40) = −4.595, p < 0.001] and static [t(40) = −2.618, p = 0.012] happiness conditions, compared to baseline. CS responses for static anger [t(41) = 2.724, p = 0.009] were higher than baseline. All other conditions were marginally higher than baseline [t(39)anger_dynamic = 2.016, p = 0.051; t(39)neutral_dynamic = 1.858, p = 0.071; t(39)neutral_static = 1.827, p = 0.075].

M. orbicularis oculi

ANOVA showed a significant main effect of expression [F(2, 76) = 15.561, p < 0.001, η2 = 0.291], indicating that activity of the OO for happiness (M = 0.207, SE = 0.075) was higher than for anger [M = −0.054, SE = 0.055; t(36) = 4.279, p < 0.001] and for neutral expressions [M = −0.111, SE = 0.045; t(36) = 4.746, p < 0.001]. A significant expression × modality interaction [F(1.688, 64.132) = 5.217, p = 0.011, η2 = 0.121] revealed that OO activity was higher for dynamic than for static happiness [t(33) = 3.099, p = 0.009]. Other observed differences included higher OO activity for dynamic happiness compared to dynamic anger [t(33) = 4.303, p < 0.001] and dynamic neutral [t(33) = 4.679, p < 0.001] facial expressions (see Figure 3).

Figure 3. Mean (±SE) EMG activity changes and corresponding statistics for orbicularis oculi during presentation conditions. Asterisks with lines beneath indicate significant differences between conditions (simple effects) in EMG responses: *p < 0.05, ***p < 0.001. Separate asterisks indicate significant differences from baseline EMG responses: **p < 0.01.

One-sample t-tests revealed increased OO activity, compared to baseline, for dynamic happiness [t(40) = 3.328, p = 0.002] and reduced activity for dynamic neutral [t(40) = −2.862, p = 0.007] facial expressions. All other OO activities did not differ from baseline [t(40)happiness_static = 1.032, p = 0.308; t(39)anger_dynamic = −0.916, p = 0.365; t(41)anger_static = −0.113, p = 0.911; t(39)neutral_static = −0.857, p = 0.397].

M. zygomaticus major

ANOVA showed a significant main effect of expression [F(1.142, 43.404) = 11.060, p < 0.001, η2 = 0.225], indicating that activity of the ZM for happiness (M = 0.404, SE = 0.138) was increased compared to anger [M = −0.125, SE = 0.054; t(36) = 3.458, p = 0.004] and neutral expressions [M = −0.140, SE = 0.043; t(36) = 3.358, p = 0.005]. The main effect of modality approached significance [F(1, 38) = 3.545, p = 0.067, η2 = 0.085], with ZM activity greater for dynamic (M = 0.091, SE = 0.091) than static (M = 0.003, SE = 0.043) facial expressions. A significant expression × modality interaction [F(1.788, 67.943) = 4.385, p = 0.020, η2 = 0.103] revealed that ZM activity was higher for dynamic than for static happiness [t(33) = 2.681, p = 0.011]. Higher ZM activity was observed for dynamic happiness compared to dynamic anger [t(33) = 3.541, p = 0.003] and dynamic neutral [t(33) = 3.354, p = 0.006] facial expressions. Higher ZM activity was also observed for static happiness compared to the static anger [t(33) = 3.124, p = 0.011] and static neutral [t(33) = 3.050, p = 0.013] conditions (see Figure 4).

Figure 4. Mean (±SE) EMG activity changes and corresponding statistics for zygomaticus major during presentation conditions. Asterisks with lines beneath indicate significant differences between conditions (simple effects) in EMG responses: *p < 0.05, ***p < 0.001. Separate asterisks indicate significant differences from baseline EMG responses: *p < 0.05, **p < 0.01.

One-sample t-tests revealed increased ZM activity, compared to baseline, for dynamic [t(40) = 3.217, p = 0.003] and static [t(40) = 2.415, p = 0.020] happiness and lower activity for dynamic [t(39) = −2.307, p = 0.026] and static [t(39) = 3.612, p = 0.001] neutral facial expressions. Mean ZM activity for anger did not differ from baseline [t(39) dynamic = −0.688, p = 0.498; t(41) static = −1.589, p = 0.120].

fMRI data

Region of interest (ROI) analyses were carried out for contrasts comparing brain activation during dynamic vs. static expressions, resulting in 11 contrasts of interest: happiness dynamic > happiness static, anger dynamic > anger static, neutral dynamic > neutral static, emotion dynamic > emotion static (emotion dynamic—pooled dynamic happiness and anger conditions; emotion static—similar pooling), all dynamic > all static (all dynamic—pooled dynamic happiness, anger and neutral conditions; all static—similar pooling), happiness dynamic > neutral dynamic, happiness static > neutral static, anger dynamic > neutral dynamic, anger static > neutral static, emotion dynamic > neutral dynamic, and emotion static > neutral static. These contrasts were calculated to address two types of question. The emotion/happiness/anger dynamic/static > neutral dynamic/static contrasts address the neural correlates of FM of emotional (happiness, anger) expressions. The remaining contrasts (i.e., emotion/happiness/anger/neutral/all dynamic > static) relate to the difference in processing between dynamic and static stimuli.
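For concreteness, the sketch below spells out how these contrasts translate into weight vectors over the six modeled conditions; the condition names and their ordering are our own assumption, and the actual SPM design may order regressors differently.

```python
# Contrast weight vectors over the six conditions (ordering assumed here):
CONDS = ["hap_dyn", "ang_dyn", "neu_dyn", "hap_stat", "ang_stat", "neu_stat"]

CONTRASTS = {
    "happiness dynamic > happiness static": [1, 0, 0, -1, 0, 0],
    "anger dynamic > anger static":         [0, 1, 0, 0, -1, 0],
    "neutral dynamic > neutral static":     [0, 0, 1, 0, 0, -1],
    # pooled conditions get fractional weights so each side sums to 1
    "emotion dynamic > emotion static":     [0.5, 0.5, 0, -0.5, -0.5, 0],
    "all dynamic > all static":             [1/3, 1/3, 1/3, -1/3, -1/3, -1/3],
    "happiness dynamic > neutral dynamic":  [1, 0, -1, 0, 0, 0],
    "happiness static > neutral static":    [0, 0, 0, 1, 0, -1],
    "anger dynamic > neutral dynamic":      [0, 1, -1, 0, 0, 0],
    "anger static > neutral static":        [0, 0, 0, 0, 1, -1],
    "emotion dynamic > neutral dynamic":    [0.5, 0.5, -1, 0, 0, 0],
    "emotion static > neutral static":      [0, 0, 0, 0.5, 0.5, -1],
}
assert all(abs(sum(w)) < 1e-9 for w in CONTRASTS.values())  # balanced contrasts
```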

ROI analyses indicated that for the happiness dynamic > happiness static contrast, V5/MT+ and STS were activated bilaterally. Other structures for the contrast were activated only in the right hemisphere (i.e., pre-SMA, IPL, BA45) (see Table 1; for whole brain analysis see Supplementary Table 1).

Table 1

Summary statistics of ROIs' activations for happiness dynamic > happiness static contrast.

| Region of interest | Left M | Left t | Left p | Right M | Right t | Right p |
| --- | --- | --- | --- | --- | --- | --- |
| V5/MT+ | 0.466 | 9.876 | 0.000*** | 0.877 | 12.049 | 0.000*** |
| Primary Motor Cortex | −0.023 | −0.533 | 0.702 | −0.031 | −0.653 | 0.741 |
| Premotor Cortex | 0.007 | 0.217 | 0.415 | 0.036 | 1.145 | 0.129 |
| Pre-SMA | 0.020 | 0.532 | 0.299 | 0.160 | 3.807 | 0.000** |
| Primary Somatosensory Cortex | 0.006 | 0.148 | 0.441 | 0.045 | 0.986 | 0.165 |
| Inferior Parietal Lobule | 0.068 | 2.007 | 0.025 | 0.156 | 3.698 | 0.000** |
| Superior Temporal Sulcus | 0.298 | 10.031 | 0.000*** | 0.441 | 11.778 | 0.000*** |
| BA44 | 0.097 | 2.441 | 0.009 | 0.076 | 2.413 | 0.010 |
| BA45 | 0.093 | 2.105 | 0.020 | 0.140 | 4.678 | 0.000*** |
| Amygdala | 0.080 | 2.136 | 0.019 | 0.088 | 2.516 | 0.008 |
| Anterior Cingulate Cortex | 0.004 | 0.140 | 0.445 | 0.013 | 0.564 | 0.288 |
| Insula | 0.056 | 2.379 | 0.011 | 0.044 | 1.766 | 0.042 |
| Caudate Head | 0.071 | 1.757 | 0.043 | 0.090 | 2.166 | 0.018 |
| Putamen | 0.053 | 2.046 | 0.023 | 0.060 | 2.428 | 0.010 |
| Nucleus Accumbens | 0.033 | 0.879 | 0.192 | 0.061 | 1.745 | 0.044 |
| Globus Pallidus | 0.049 | 2.254 | 0.015 | 0.050 | 2.766 | 0.004 |

Asterisks indicate significant, Bonferroni-corrected, activations of each ROI: **p < 0.01, ***p < 0.001.

For the anger dynamic > anger static contrast, V5/MT+ and STS were also activated bilaterally. However, this contrast revealed additional bilateral activation of the amygdala. Other structures revealed by this contrast were visible only in the right hemisphere (i.e., pre-SMA and BA45) (see Table 2; for whole brain analysis see Supplementary Table 2).

Table 2

Summary statistics of ROIs' activations for anger dynamic > anger static contrast.

| Region of interest | Left M | Left t | Left p | Right M | Right t | Right p |
| --- | --- | --- | --- | --- | --- | --- |
| V5/MT+ | 0.475 | 9.499 | 0.000*** | 1.042 | 12.104 | 0.000*** |
| Primary Motor Cortex | 0.025 | 0.603 | 0.275 | 0.032 | 0.742 | 0.231 |
| Premotor Cortex | 0.013 | 0.386 | 0.351 | 0.057 | 1.733 | 0.045 |
| Pre-SMA | 0.032 | 0.846 | 0.201 | 0.163 | 4.087 | 0.000** |
| Primary Somatosensory Cortex | 0.026 | 0.610 | 0.272 | 0.071 | 1.579 | 0.061 |
| Inferior Parietal Lobule | 0.029 | 0.798 | 0.214 | 0.065 | 1.531 | 0.066 |
| Superior Temporal Sulcus | 0.290 | 9.373 | 0.000*** | 0.450 | 11.635 | 0.000*** |
| BA44 | 0.053 | 1.261 | 0.107 | 0.057 | 1.762 | 0.042 |
| BA45 | 0.108 | 2.197 | 0.017 | 0.134 | 3.834 | 0.000** |
| Amygdala | 0.130 | 3.736 | 0.000** | 0.151 | 4.606 | 0.000*** |
| Anterior Cingulate Cortex | −0.031 | −1.000 | 0.839 | −0.021 | −0.649 | 0.740 |
| Insula | 0.015 | 0.526 | 0.301 | 0.036 | 1.345 | 0.093 |
| Caudate Head | 0.016 | 0.414 | 0.340 | 0.058 | 1.423 | 0.081 |
| Putamen | 0.033 | 1.138 | 0.131 | 0.057 | 2.219 | 0.016 |
| Nucleus Accumbens | 0.014 | 0.379 | 0.353 | 0.024 | 0.612 | 0.272 |
| Globus Pallidus | 0.030 | 1.148 | 0.128 | 0.021 | 1.070 | 0.145 |

Asterisks indicate significant, Bonferroni-corrected, activations of each ROI: **p < 0.01, ***p < 0.001.

For the neutral dynamic > neutral static contrast, only V5/MT+ and STS were activated bilaterally (see Table 3; for whole brain analysis see Supplementary Tables 3–5).

Table 3

Summary statistics of ROIs' activations for neutral dynamic > neutral static contrast.

| Region of interest | Left M | Left t | Left p | Right M | Right t | Right p |
| --- | --- | --- | --- | --- | --- | --- |
| V5/MT+ | 0.220 | 5.029 | 0.000*** | 0.548 | 13.479 | 0.000*** |
| Primary Motor Cortex | 0.020 | 0.680 | 0.250 | 0.018 | 0.623 | 0.268 |
| Premotor Cortex | 0.015 | 0.560 | 0.289 | 0.042 | 1.783 | 0.041 |
| Pre-SMA | −0.003 | −0.051 | 0.520 | 0.082 | 1.698 | 0.048 |
| Primary Somatosensory Cortex | 0.018 | 0.711 | 0.240 | 0.030 | 1.048 | 0.150 |
| Inferior Parietal Lobule | −0.006 | −0.196 | 0.577 | 0.050 | 1.605 | 0.058 |
| Superior Temporal Sulcus | 0.155 | 5.567 | 0.000*** | 0.241 | 7.330 | 0.000*** |
| BA44 | 0.058 | 1.977 | 0.027 | 0.055 | 2.110 | 0.020 |
| BA45 | 0.008 | 0.192 | 0.424 | 0.075 | 2.675 | 0.005 |
| Amygdala | 0.034 | 1.138 | 0.130 | 0.049 | 1.939 | 0.029 |
| Anterior Cingulate Cortex | −0.030 | −0.995 | 0.837 | −0.016 | −0.618 | 0.730 |
| Insula | 0.021 | 1.087 | 0.141 | 0.026 | 1.459 | 0.076 |
| Caudate Head | 0.027 | 0.671 | 0.253 | 0.037 | 1.068 | 0.146 |
| Putamen | 0.020 | 0.733 | 0.234 | 0.021 | 0.937 | 0.177 |
| Nucleus Accumbens | 0.018 | 0.484 | 0.315 | 0.039 | 1.077 | 0.144 |
| Globus Pallidus | 0.006 | 0.257 | 0.399 | 0.033 | 1.553 | 0.064 |

Asterisks indicate significant, Bonferroni-corrected, activations of each ROI: ***p < 0.001.

The emotion dynamic > emotion static contrast revealed bilateral activations of V5/MT+, STS, BA45, BA44, and Amygdala. Additionally, this contrast revealed activations of Putamen, Globus Pallidus, IPL and pre-SMA in the right hemisphere (see Table 4; for whole brain analysis see Supplementary Table 4).

Table 4

Summary statistics of ROIs' activations for emotion dynamic > emotion static contrast.

| Region of interest | Left M | Left t | Left p | Right M | Right t | Right p |
| --- | --- | --- | --- | --- | --- | --- |
| V5/MT+ | 0.941 | 11.335 | 0.000*** | 1.920 | 13.702 | 0.000*** |
| Primary Motor Cortex | 0.002 | 0.028 | 0.489 | 0.001 | 0.018 | 0.493 |
| Premotor Cortex | 0.019 | 0.426 | 0.336 | 0.094 | 2.038 | 0.024 |
| pre-SMA | 0.052 | 0.978 | 0.167 | 0.323 | 5.353 | 0.000*** |
| Primary Somatosensory Cortex | 0.033 | 0.505 | 0.308 | 0.116 | 1.710 | 0.047 |
| Inferior Parietal Lobule | 0.097 | 2.186 | 0.017 | 0.221 | 3.638 | 0.000* |
| Superior Temporal Sulcus | 0.588 | 12.633 | 0.000*** | 0.892 | 16.607 | 0.000*** |
| BA44 | 0.150 | 3.026 | 0.002+ | 0.133 | 3.024 | 0.002+ |
| BA45 | 0.201 | 3.683 | 0.000** | 0.274 | 6.151 | 0.000*** |
| Amygdala | 0.211 | 4.351 | 0.000** | 0.239 | 5.485 | 0.000*** |
| Anterior Cingulate Cortex | −0.027 | −0.680 | 0.750 | −0.007 | −0.172 | 0.568 |
| Insula | 0.070 | 1.797 | 0.040 | 0.080 | 2.062 | 0.022 |
| Caudate Head | 0.088 | 1.698 | 0.048 | 0.148 | 2.475 | 0.009 |
| Putamen | 0.086 | 2.352 | 0.012 | 0.117 | 3.676 | 0.000* |
| Nucleus Accumbens | 0.048 | 0.973 | 0.168 | 0.086 | 1.714 | 0.047 |
| Globus Pallidus | 0.079 | 2.510 | 0.008 | 0.071 | 2.985 | 0.002+ |

Asterisks indicate significant, Bonferroni-corrected, activations of each ROI: +p < 0.1, *p < 0.05, **p < 0.01, ***p < 0.001.

The all dynamic > all static contrast, illustrating general processing of dynamic compared to static expressions, revealed bilateral activations of V5/MT+, STS, BA44, BA45, and the amygdala. Moreover, a few cortical areas and subcortical structures were activated only in the right hemisphere [i.e., premotor cortex (trend effect), pre-SMA, IPL, caudate head, putamen and globus pallidus] (see Table 5; for whole brain analysis see Supplementary Table 5).

Table 5

Summary statistics of ROIs' activations for all dynamic > all static expressions contrast.

| Region of interest | Left M | Left t | Left p | Right M | Right t | Right p |
| --- | --- | --- | --- | --- | --- | --- |
| V5/MT+ | 1.161 | 11.284 | 0.000*** | 2.468 | 14.916 | 0.000*** |
| Primary Motor Cortex | 0.021 | 0.332 | 0.371 | 0.020 | 0.298 | 0.384 |
| Premotor Cortex | 0.034 | 0.697 | 0.245 | 0.135 | 2.884 | 0.003+ |
| pre-SMA | 0.050 | 0.799 | 0.214 | 0.405 | 5.733 | 0.000*** |
| Primary Somatosensory Cortex | 0.051 | 0.759 | 0.226 | 0.146 | 1.949 | 0.029 |
| Inferior Parietal Lobule | 0.091 | 1.866 | 0.034 | 0.271 | 3.974 | 0.000** |
| Superior Temporal Sulcus | 0.743 | 12.457 | 0.000*** | 1.132 | 15.947 | 0.000*** |
| BA44 | 0.208 | 3.914 | 0.000** | 0.187 | 4.035 | 0.000** |
| BA45 | 0.209 | 3.851 | 0.000** | 0.349 | 7.289 | 0.000*** |
| Amygdala | 0.245 | 4.192 | 0.000** | 0.288 | 5.572 | 0.000*** |
| Anterior Cingulate Cortex | −0.057 | −1.235 | 0.888 | −0.024 | −0.505 | 0.692 |
| Insula | 0.091 | 2.113 | 0.020 | 0.105 | 2.427 | 0.010 |
| Caudate Head | 0.114 | 2.088 | 0.021 | 0.185 | 3.209 | 0.001* |
| Putamen | 0.107 | 2.752 | 0.004 | 0.137 | 4.119 | 0.000** |
| Nucleus Accumbens | 0.065 | 1.208 | 0.117 | 0.124 | 2.186 | 0.017 |
| Globus Pallidus | 0.085 | 2.687 | 0.005 | 0.104 | 4.116 | 0.000** |

Asterisks indicate significant, Bonferroni-corrected, activations of each ROI: +p < 0.1, *p < 0.05, **p < 0.01, ***p < 0.001.

The happiness dynamic > neutral dynamic contrast showed bilateral activations of V5/MT+, STS, pre-SMA, IPL, BA45, Amygdala, Anterior Cingulate Cortex, Caudate Head, Putamen, and Globus Pallidus. Activation of BA44 was visible only in the left hemisphere (see Table 6; for whole brain analysis see Supplementary Table 6).

Table 6

Summary statistics of ROIs' activations for happiness dynamic > neutral dynamic contrast.

| Region of interest | Left M | Left t | Left p | Right M | Right t | Right p |
| --- | --- | --- | --- | --- | --- | --- |
| V5/MT+ | 0.287 | 6.446 | 0.000*** | 0.362 | 7.390 | 0.000*** |
| Primary Motor Cortex | −0.015 | −0.383 | 0.648 | −0.016 | −0.366 | 0.642 |
| Premotor Cortex | 0.029 | 0.918 | 0.182 | 0.031 | 1.031 | 0.154 |
| pre-SMA | 0.172 | 3.292 | 0.001* | 0.231 | 4.697 | 0.000*** |
| Primary Somatosensory Cortex | 0.033 | 0.852 | 0.199 | 0.069 | 1.804 | 0.039 |
| Inferior Parietal Lobule | 0.151 | 4.234 | 0.000** | 0.121 | 2.913 | 0.003+ |
| Superior Temporal Sulcus | 0.184 | 6.481 | 0.000*** | 0.212 | 6.974 | 0.000*** |
| BA44 | 0.111 | 3.159 | 0.001* | 0.073 | 2.473 | 0.009 |
| BA45 | 0.177 | 3.858 | 0.000** | 0.124 | 3.562 | 0.000* |
| Amygdala | 0.140 | 4.806 | 0.000*** | 0.097 | 3.500 | 0.001* |
| Anterior Cingulate Cortex | 0.109 | 3.708 | 0.000** | 0.086 | 3.421 | 0.001* |
| Insula | 0.022 | 0.986 | 0.165 | 0.017 | 0.772 | 0.222 |
| Caudate Head | 0.120 | 2.954 | 0.002+ | 0.153 | 3.826 | 0.000** |
| Putamen | 0.074 | 2.979 | 0.002+ | 0.063 | 2.885 | 0.003+ |
| Nucleus Accumbens | 0.074 | 2.074 | 0.022 | 0.083 | 2.385 | 0.011 |
| Globus Pallidus | 0.089 | 3.983 | 0.000** | 0.073 | 3.703 | 0.000** |

Asterisks indicate significant, Bonferroni-corrected, activations of each ROI: +p < 0.1, *p < 0.05, **p < 0.01, ***p < 0.001.

For the happiness static > neutral static contrast, only pre-SMA was activated bilaterally (see Table 7; for whole brain analysis see Supplementary Table 7).

Table 7

Summary statistics of ROIs' activations for happiness static > neutral static contrast.

| Region of interest | Left M | Left t | Left p | Right M | Right t | Right p |
| --- | --- | --- | --- | --- | --- | --- |
| V5/MT+ | 0.041 | 1.098 | 0.139 | 0.032 | 0.850 | 0.200 |
| Primary Motor Cortex | 0.028 | 0.737 | 0.233 | 0.033 | 0.873 | 0.194 |
| Premotor Cortex | 0.037 | 1.340 | 0.093 | 0.036 | 1.341 | 0.093 |
| pre-SMA | 0.150 | 3.794 | 0.000** | 0.152 | 3.962 | 0.000** |
| Primary Somatosensory Cortex | 0.045 | 1.278 | 0.104 | 0.054 | 1.503 | 0.070 |
| Inferior Parietal Lobule | 0.078 | 2.657 | 0.005 | 0.014 | 0.483 | 0.316 |
| Superior Temporal Sulcus | 0.041 | 1.544 | 0.065 | 0.012 | 0.413 | 0.341 |
| BA44 | 0.072 | 1.800 | 0.039 | 0.052 | 1.863 | 0.035 |
| BA45 | 0.092 | 2.058 | 0.023 | 0.059 | 2.035 | 0.024 |
| Amygdala | 0.094 | 2.759 | 0.004 | 0.058 | 2.062 | 0.023 |
| Anterior Cingulate Cortex | 0.075 | 2.843 | 0.003 | 0.056 | 2.326 | 0.012 |
| Insula | −0.013 | −0.554 | 0.709 | −0.002 | −0.069 | 0.527 |
| Caudate Head | 0.076 | 1.842 | 0.036 | 0.100 | 2.651 | 0.006 |
| Putamen | 0.041 | 1.357 | 0.091 | 0.024 | 0.850 | 0.200 |
| Nucleus Accumbens | 0.059 | 1.541 | 0.065 | 0.061 | 1.593 | 0.059 |
| Globus Pallidus | 0.046 | 1.998 | 0.026 | 0.056 | 2.833 | 0.003 |

Asterisks indicate significant, Bonferroni-corrected, activations of each ROI: **p < 0.01.

For the anger dynamic > neutral dynamic contrast, analysis revealed bilateral activations of V5/MT+, STS, Amygdala and BA45. Pre-SMA activation was visible only in the right hemisphere (see Table 8; for whole brain analysis see Supplementary Table 8).

Table 8

Summary statistics of ROIs' activations for anger dynamic > neutral dynamic contrast.

| Region of interest | Left M | Left t | Left p | Right M | Right t | Right p |
| --- | --- | --- | --- | --- | --- | --- |
| V5/MT+ | 0.320 | 8.242 | 0.000*** | 0.567 | 9.956 | 0.000*** |
| Primary Motor Cortex | −0.023 | −0.689 | 0.753 | 0.002 | 0.067 | 0.474 |
| Premotor Cortex | 0.005 | 0.156 | 0.438 | 0.022 | 0.801 | 0.214 |
| pre-SMA | 0.142 | 2.742 | 0.004 | 0.211 | 4.613 | 0.000*** |
| Primary Somatosensory Cortex | 0.010 | 0.342 | 0.367 | 0.050 | 1.538 | 0.065 |
| Inferior Parietal Lobule | 0.080 | 2.110 | 0.020 | 0.028 | 0.735 | 0.233 |
| Superior Temporal Sulcus | 0.216 | 7.995 | 0.000*** | 0.281 | 8.845 | 0.000*** |
| BA44 | 0.074 | 1.876 | 0.034 | 0.043 | 1.492 | 0.071 |
| BA45 | 0.166 | 3.059 | 0.002+ | 0.108 | 3.228 | 0.001* |
| Amygdala | 0.119 | 3.942 | 0.000** | 0.116 | 4.043 | 0.000** |
| Anterior Cingulate Cortex | 0.016 | 0.631 | 0.266 | 0.008 | 0.308 | 0.380 |
| Insula | 0.011 | 0.519 | 0.303 | 0.013 | 0.799 | 0.214 |
| Caudate Head | 0.029 | 0.725 | 0.236 | 0.076 | 2.001 | 0.026 |
| Putamen | 0.039 | 1.469 | 0.074 | 0.044 | 1.914 | 0.031 |
| Nucleus Accumbens | 0.043 | 1.178 | 0.122 | 0.045 | 1.429 | 0.080 |
| Globus Pallidus | 0.056 | 2.221 | 0.016 | 0.038 | 1.842 | 0.036 |

Asterisks indicate significant, Bonferroni-corrected, activations of each ROI: +p < 0.1, *p < 0.05, **p < 0.01, ***p < 0.001.

The anger static > neutral static contrast revealed no significant activations of brain structures (see Table 9; for whole brain analysis see Supplementary Table 9).

Table 9

Summary statistics of ROIs' activations for anger static > neutral static contrast.

| Region of interest | Left M | Left t | Left p | Right M | Right t | Right p |
| --- | --- | --- | --- | --- | --- | --- |
| V5/MT+ | 0.064 | 1.551 | 0.064 | 0.073 | 1.576 | 0.061 |
| Primary Motor Cortex | −0.029 | −0.742 | 0.769 | −0.012 | −0.282 | 0.610 |
| Premotor Cortex | 0.007 | 0.221 | 0.413 | 0.006 | 0.179 | 0.429 |
| pre-SMA | 0.108 | 2.227 | 0.016 | 0.131 | 2.388 | 0.011 |
| Primary Somatosensory Cortex | 0.002 | 0.063 | 0.475 | 0.008 | 0.243 | 0.404 |
| Inferior Parietal Lobule | 0.045 | 1.176 | 0.123 | 0.013 | 0.346 | 0.366 |
| Superior Temporal Sulcus | 0.081 | 2.570 | 0.007 | 0.071 | 1.950 | 0.029 |
| BA44 | 0.080 | 2.130 | 0.019 | 0.041 | 1.175 | 0.123 |
| BA45 | 0.065 | 1.383 | 0.087 | 0.048 | 1.337 | 0.094 |
| Amygdala | 0.023 | 0.799 | 0.214 | 0.014 | 0.537 | 0.297 |
| Anterior Cingulate Cortex | 0.016 | 0.591 | 0.279 | 0.012 | 0.433 | 0.334 |
| Insula | 0.017 | 0.643 | 0.262 | 0.003 | 0.125 | 0.450 |
| Caudate Head | 0.039 | 0.982 | 0.166 | 0.055 | 1.353 | 0.091 |
| Putamen | 0.027 | 0.952 | 0.173 | 0.009 | 0.349 | 0.365 |
| Nucleus Accumbens | 0.046 | 1.185 | 0.121 | 0.060 | 1.578 | 0.061 |
| Globus Pallidus | 0.033 | 1.414 | 0.082 | 0.049 | 2.023 | 0.025 |

The emotion dynamic > neutral dynamic contrast showed that V5/MT+, STS, Amygdala, BA45, pre-SMA and Globus Pallidus were activated bilaterally. Additionally, this contrast revealed IPL activation in the left hemisphere and Caudate Head activation in the right hemisphere (see Table 10; for whole brain analysis see Supplementary Table 10).

Table 10

Summary statistics of ROIs' activations for emotion dynamic > neutral dynamic contrast.

| Region of interest | Left M | Left t | Left p | Right M | Right t | Right p |
| --- | --- | --- | --- | --- | --- | --- |
| V5/MT+ | 0.608 | 7.841 | 0.000*** | 0.929 | 10.022 | 0.000*** |
| Primary Motor Cortex | −0.039 | −0.642 | 0.738 | −0.014 | −0.216 | 0.585 |
| Premotor Cortex | 0.034 | 0.637 | 0.264 | 0.053 | 1.091 | 0.141 |
| pre-SMA | 0.315 | 3.189 | 0.001* | 0.442 | 5.096 | 0.000*** |
| Primary Somatosensory Cortex | 0.043 | 0.791 | 0.217 | 0.118 | 2.032 | 0.024 |
| Inferior Parietal Lobule | 0.232 | 3.626 | 0.000* | 0.149 | 2.159 | 0.018 |
| Superior Temporal Sulcus | 0.400 | 8.073 | 0.000*** | 0.493 | 9.238 | 0.000*** |
| BA44 | 0.185 | 3.149 | 0.001* | 0.117 | 2.528 | 0.008 |
| BA45 | 0.343 | 3.908 | 0.000** | 0.233 | 3.867 | 0.000** |
| Amygdala | 0.259 | 5.110 | 0.000*** | 0.213 | 4.497 | 0.000*** |
| Anterior Cingulate Cortex | 0.125 | 2.626 | 0.006 | 0.094 | 2.206 | 0.016 |
| Insula | 0.033 | 0.930 | 0.179 | 0.030 | 0.986 | 0.165 |
| Caudate Head | 0.149 | 2.041 | 0.024 | 0.230 | 3.247 | 0.001* |
| Putamen | 0.113 | 2.514 | 0.008 | 0.108 | 2.809 | 0.004 |
| Nucleus Accumbens | 0.117 | 1.869 | 0.034 | 0.129 | 2.241 | 0.015 |
| Globus Pallidus | 0.146 | 3.364 | 0.001* | 0.111 | 3.077 | 0.002+ |

Asterisks indicate significant, Bonferroni-corrected, activations of each ROI: +p < 0.1, *p < 0.05, **p < 0.01, ***p < 0.001.

For the emotion static > neutral static contrast, ROI analysis revealed only pre-SMA activations (see Table 11; for whole brain analysis see Supplementary Table 11).

Table 11

Summary statistics of ROIs' activations for emotion static > neutral static contrast.

| Region of interest | Left M | Left t | Left p | Right M | Right t | Right p |
| --- | --- | --- | --- | --- | --- | --- |
| V5/MT+ | 0.106 | 1.511 | 0.069 | 0.105 | 1.456 | 0.076 |
| Primary Motor Cortex | −0.002 | −0.024 | 0.509 | 0.022 | 0.316 | 0.377 |
| Premotor Cortex | 0.044 | 0.856 | 0.198 | 0.042 | 0.805 | 0.212 |
| pre-SMA | 0.258 | 3.331 | 0.001* | 0.283 | 3.411 | 0.001* |
| Primary Somatosensory Cortex | 0.047 | 0.720 | 0.238 | 0.062 | 1.045 | 0.151 |
| Inferior Parietal Lobule | 0.123 | 2.228 | 0.015 | 0.028 | 0.521 | 0.302 |
| Superior Temporal Sulcus | 0.122 | 2.403 | 0.010 | 0.083 | 1.495 | 0.071 |
| BA44 | 0.152 | 2.244 | 0.015 | 0.093 | 1.702 | 0.048 |
| BA45 | 0.158 | 2.020 | 0.025 | 0.108 | 1.977 | 0.027 |
| Amygdala | 0.117 | 2.304 | 0.013 | 0.072 | 1.658 | 0.052 |
| Anterior Cingulate Cortex | 0.092 | 1.905 | 0.032 | 0.068 | 1.490 | 0.072 |
| Insula | 0.004 | 0.098 | 0.461 | 0.002 | 0.041 | 0.484 |
| Caudate Head | 0.115 | 1.666 | 0.051 | 0.155 | 2.307 | 0.013 |
| Putamen | 0.068 | 1.313 | 0.098 | 0.032 | 0.731 | 0.234 |
| Nucleus Accumbens | 0.105 | 1.556 | 0.063 | 0.120 | 1.809 | 0.039 |
| Globus Pallidus | 0.080 | 1.966 | 0.028 | 0.105 | 2.722 | 0.005 |

Asterisks indicate significant, Bonferroni-corrected, activations of each ROI: *p < 0.05.

Correlation analysis

Muscle-brain correlations of dynamic and static happiness conditions

Correlation analyses computed for the happiness dynamic condition with the ZM revealed positive relations bilaterally in the pre-SMA (trend effect), putamen, nucleus accumbens and globus pallidus. Trend effects were found for the right BA44 and insular cortex. No relationships were found between brain activity in the happiness dynamic condition and OO muscle activity. For the CS, negative relations were found for V5/MT+, STS and BA45 in the left hemisphere, and for the IPL and ACC in the right hemisphere. Negative trend relationships were found bilaterally in the caudate head (see Table 12).

Table 12

Muscles-brain correlations of dynamic and static happiness conditions.

Happiness dynamic:

| Region of interest | CS (LH) | CS (RH) | ZM (LH) | ZM (RH) | OO (LH) | OO (RH) |
| --- | --- | --- | --- | --- | --- | --- |
| V5/MT+ | −0.36* | −0.111 | 0.200 | 0.077 | 0.077 | 0.086 |
| Primary Motor Cortex | 0.036 | 0.038 | −0.085 | −0.030 | −0.126 | −0.130 |
| Premotor Cortex | −0.002 | −0.046 | 0.140 | 0.246 | −0.081 | 0.071 |
| pre-SMA | −0.233 | −0.256 | 0.267+ | 0.297+b | 0.104 | 0.171 |
| Primary Somatosensory Cortex | −0.124 | −0.163 | 0.078 | 0.111 | −0.024 | −0.083 |
| Inferior Parietal Lobule | −0.236 | −0.39* | 0.130 | 0.210 | −0.106 | 0.037 |
| Superior Temporal Sulcus | −0.334* | −0.232 | 0.193 | 0.323*b | 0.095 | 0.196 |
| BA44 | −0.188 | −0.218 | 0.183 | 0.293+ | −0.013 | 0.033 |
| BA45 | −0.352* | −0.169 | 0.035 | 0.250 | −0.061 | 0.024 |
| Amygdala | −0.232 | −0.108 | 0.232 | 0.140 | 0.029 | −0.122 |
| Anterior Cingulate Cortex | −0.255 | −0.326* | 0.089 | 0.097 | 0.086 | 0.114 |
| Insula | −0.135 | −0.191 | 0.140 | 0.286+ | −0.077 | 0.018 |
| Caudate Head | −0.280+ | −0.300+ | 0.237 | 0.355* | 0.131 | 0.149 |
| Putamen | −0.124 | −0.141 | 0.352* | 0.458**b | 0.021 | 0.112 |
| Nucleus Accumbens | −0.187 | −0.221 | 0.314*b | 0.357*b | 0.031 | 0.158 |
| Globus Pallidus | −0.209 | −0.224 | 0.418**b | 0.411**b | 0.151 | 0.117 |

Happiness static:

| Region of interest | CS (LH) | CS (RH) | ZM (LH) | ZM (RH) | OO (LH) | OO (RH) |
| --- | --- | --- | --- | --- | --- | --- |
| V5/MT+ | −0.045 | 0.078 | 0.005 | 0.001 | −0.062 | −0.120 |
| Primary Motor Cortex | −0.292+ | −0.283+ | 0.096 | 0.052 | 0.301+b | 0.312*b |
| Premotor Cortex | −0.362* | −0.285+ | 0.206 | 0.106 | 0.275+ | 0.223 |
| pre-SMA | −0.081 | −0.078 | 0.144 | 0.108 | −0.008 | −0.031 |
| Primary Somatosensory Cortex | −0.317* | −0.309* | 0.013 | 0.100 | 0.257 | 0.400*b |
| Inferior Parietal Lobule | −0.149 | −0.139 | −0.025 | −0.113 | 0.083 | 0.017 |
| Superior Temporal Sulcus | −0.191 | −0.097 | 0.150 | 0.000 | 0.124 | 0.035 |
| BA44 | −0.377* | −0.250 | 0.244 | −0.041 | 0.166 | 0.059 |
| BA45 | −0.111 | −0.044 | 0.080 | −0.109 | −0.038 | −0.034 |
| Amygdala | −0.238 | −0.232 | 0.129 | 0.161 | 0.110 | 0.120 |
| Anterior Cingulate Cortex | −0.103 | −0.097 | −0.038 | 0.015 | 0.001 | −0.009 |
| Insula | −0.227 | −0.205 | 0.418**b | 0.276+b | 0.333* | 0.265+ |
| Caudate Head | −0.253 | −0.141 | 0.201 | 0.149 | 0.266+ | 0.213 |
| Putamen | −0.181 | −0.099 | 0.397*b | 0.294+b | 0.201 | 0.037 |
| Nucleus Accumbens | −0.056 | −0.098 | 0.150 | 0.013 | −0.005 | −0.138 |
| Globus Pallidus | −0.142 | −0.146 | 0.338*b | 0.202 | 0.164 | 0.090 |

Asterisks indicate significant Pearson correlations of muscle-ROI pairs: +p < 0.1, *p < 0.05, **p < 0.01. The letter "b" indicates that the Pearson coefficient is significant based on the BCa bootstrap criterion. CS, corrugator supercilii; ZM, zygomaticus major; OO, orbicularis oculi; LH, left hemisphere; RH, right hemisphere.

Correlation analyses computed for the happiness static condition with the ZM indicated positive relationships for the left insula, putamen and globus pallidus. Positive trend effects between ZM and brain activity were found in the right insula and putamen. Positive relationships between OO activity and brain activity during perception of the happiness static condition were found in the right primary motor cortex, right primary somatosensory cortex and left insula. Trend effects between OO and brain activity were observed for the left primary motor cortex, premotor cortex and caudate head. Negative relationships between CS activity and brain activity were found for the premotor cortex and BA44 in the left hemisphere. Moreover, CS activity was negatively related to activity of the primary somatosensory cortex (bilaterally), primary motor cortex (bilaterally) and premotor cortex (right) (see Table 12).

Muscle-brain correlations of dynamic and static anger conditions

Correlation analyses performed for the anger dynamic condition indicated a negative relationship between CS activity and activity in left BA44. Positive relationships were found between OO activity and brain activity during perception of dynamic angry expressions in the STS (bilaterally) and the right premotor cortex. Positive trend relationships were found in the primary motor cortex (bilaterally) and the right BA45, amygdala and insula (see Table 13).

Table 13

Muscles-brain correlations of dynamic and static anger conditions.

Anger dynamic:

| Region of interest | CS (LH) | CS (RH) | OO (LH) | OO (RH) |
| --- | --- | --- | --- | --- |
| V5/MT+ | 0.082 | −0.089 | 0.159 | 0.188 |
| Primary Motor Cortex | −0.123 | −0.022 | 0.292+b | 0.289+ |
| Premotor Cortex | −0.252 | −0.101 | 0.251 | 0.316*b |
| pre-SMA | −0.047 | −0.077 | 0.201 | 0.026 |
| Primary Somatosensory Cortex | −0.190 | −0.085 | 0.096 | 0.162 |
| Inferior Parietal Lobule | −0.116 | −0.003 | 0.109 | 0.241 |
| Superior Temporal Sulcus | 0.103 | 0.093 | 0.319*b | 0.369*b |
| BA44 | −0.323* | −0.069 | −0.067 | 0.124 |
| BA45 | −0.221 | −0.157 | 0.195 | 0.288+b |
| Amygdala | −0.114 | −0.262 | 0.189 | 0.284+ |
| Anterior Cingulate Cortex | 0.021 | −0.058 | 0.089 | 0.036 |
| Insula | −0.006 | 0.134 | 0.145 | 0.275+b |
| Caudate Head | −0.014 | 0.015 | 0.049 | 0.136 |
| Putamen | −0.081 | −0.012 | 0.135 | 0.159 |
| Nucleus Accumbens | −0.198 | −0.091 | −0.011 | 0.182 |
| Globus Pallidus | −0.174 | −0.022 | 0.159 | 0.196 |

Anger static:

| Region of interest | CS (LH) | CS (RH) | OO (LH) | OO (RH) |
| --- | --- | --- | --- | --- |
| V5/MT+ | 0.178 | 0.068 | −0.287+ | −0.117 |
| Primary Motor Cortex | 0.001 | 0.044 | 0.131 | 0.113 |
| Premotor Cortex | −0.079 | −0.017 | 0.257 | 0.238 |
| pre-SMA | −0.148 | −0.084 | 0.312* | 0.35* |
| Primary Somatosensory Cortex | 0.019 | 0.119 | 0.149 | 0.051 |
| Inferior Parietal Lobule | 0.240 | 0.276+b | 0.094 | 0.106 |
| Superior Temporal Sulcus | 0.174 | 0.305* | −0.004 | 0.095 |
| BA44 | 0.022 | 0.042 | 0.207 | 0.174 |
| BA45 | 0.050 | 0.053 | 0.273+ | 0.186 |
| Amygdala | −0.034 | 0.024 | 0.108 | −0.137 |
| Anterior Cingulate Cortex | 0.027 | 0.098 | 0.159 | 0.139 |
| Insula | 0.094 | 0.183 | 0.148 | 0.091 |
| Caudate Head | 0.026 | 0.016 | 0.233 | 0.276+ |
| Putamen | −0.004 | 0.052 | −0.053 | −0.128 |
| Nucleus Accumbens | −0.030 | 0.097 | 0.189 | 0.084 |
| Globus Pallidus | 0.046 | 0.048 | −0.007 | 0.120 |

Asterisks indicate significant Pearson correlations of muscle-ROI pairs: +p < 0.1, *p < 0.05. The letter "b" indicates that the Pearson coefficient is significant based on the BCa bootstrap criterion. CS, corrugator supercilii; OO, orbicularis oculi; LH, left hemisphere; RH, right hemisphere.

Positive relationships of brain and CS activity for static anger were observed in the right STS and right IPL (trend effect). Activity in the pre-SMA (bilaterally) was positively related to OO activity during perception of angry pictures. Trend effects of the relationship between OO and brain activity during perception of angry static conditions were observed in the right caudate (positive), left BA45 (positive) and V5/MT+ (negative; see Table 13).

Discussion

The present study examined the neural correlates of FM during the observation of dynamic compared to static facial expressions, drawing on facial EMG, fMRI, and combined EMG-fMRI analyses. First, the anticipated patterns of mimicry were observed: increased ZM and OO activity and decreased CS activity for happiness (Rymarczyk et al., 2016), as well as increased CS activity for anger (Dimberg and Petterson, 2000). Moreover, we found that dynamic presentations of happy facial expressions induced higher EMG amplitudes in the ZM, OO, and CS compared to static presentations. Angry facial expressions were not associated with differences in the CS response between static and dynamic displays. Analysis of the fMRI data revealed that dynamic (compared to static) emotional expressions activated the bilateral STS, V5/MT+, and frontal and parietal areas. On the other hand, the perception of neutral dynamic compared to neutral static facial displays activated only structures related to biological motion, i.e., the bilateral V5/MT+ and STS. Furthermore, some interaction effects of emotion and modality were found: for example, dynamic compared to static displays induced greater activity in the bilateral amygdala for anger, while this effect was found in the right IPL for happiness. The correlations between brain activity and facial muscle reactions revealed regions related to the motor simulation of facial expressions, such as the IFG, which is considered part of the classical MNS, as well as regions involved in emotional processing, such as the insula, which is part of the extended MNS.

EMG response for dynamic compared to static facial expressions

The recorded EMG data showed that subjects reacted spontaneously to happy facial expressions with increased ZM and OO activity (Rymarczyk et al., 2016) and decreased CS activity, interpretable as FM (Dimberg and Thunberg, 1998). The EMG responses observed in our study were low in amplitude, but comparable to other reports (Sato et al., 2008; Dimberg et al., 2011; Rymarczyk et al., 2011). In all muscles, the response was more pronounced when dynamic happy stimuli were presented (Weyers et al., 2006; Sato et al., 2008; Rymarczyk et al., 2011), which points to the benefits of applying dynamic stimuli (Murata et al., 2016). The pattern of ZM and OO reactions observed for dynamic happiness could be interpreted as a Duchenne smile (Ekman et al., 1990), suggesting that subjects could have experienced true and genuine positive emotion. Moreover, we observed higher CS reactions, similar for static and dynamic anger conditions, showing typical evidence of FM for this emotion (Sato et al., 2008; Dimberg et al., 2011). An increased CS response was found for neutral facial expressions as well. Some studies have reported increased CS activity as a function of mental effort (Neumann and Strack, 2000), disapproval (Cannon et al., 2011) or global negative affect (Larsen et al., 2003). In our study, we interpret the increased CS activity for neutral facial expressions as a consequence of the instruction used in the procedure, which asked subjects to pay careful attention (i.e., mental effort) to the observed actors.

Neural network for dynamic compared to static facial expressions

We found that passive viewing of emotional dynamic stimuli, compared to neutral dynamic stimuli, activated a wide network of brain regions. This network included the inferior frontal gyrus (left BA44 and bilateral BA45), left IPL, bilateral pre-SMA, STS and V5/MT+, as well as the left and right amygdala, right caudate head and bilateral globus pallidus. In contrast, emotional static displays compared to neutral static displays activated only the bilateral pre-SMA, a pattern driven by happiness. Furthermore, dynamic happiness evoked greater activity than static happiness in the right IPL, while dynamic vs. static angry faces evoked greater bilateral activity in the amygdala. As expected, we found that, regardless of the specific emotion, dynamic stimuli selectively activated the bilateral visual area V5/MT+ and superior temporal sulcus, structures associated with motion and biological-motion perception, respectively (Robins et al., 2009; Arsalidou et al., 2011; Foley et al., 2012; Furl et al., 2015). Recently, in a magnetoencephalography (MEG) study, Sato et al. (2015) explored the temporal profiles and dynamic interaction patterns of brain activity during perception of dynamic emotional facial expressions compared to dynamic mosaics. Notably, they found that, apart from V5/MT+ and STS, the right IFG exhibited higher activity for dynamic faces vs. dynamic mosaics. Furthermore, they found direct functional connectivity between the STS and IFG, closely related to FM.

Our findings concerning the IFG are in line with those of previous studies (Carr et al., 2003; Leslie et al., 2004) suggesting that perception of emotional facial displays involves the classical MNS, which is sensitive to goal-directed actions. The assumption is that during the observation of another individual's actions, the brain simulates the same action by activating the neurons localized in the IFG, which are involved in executing the same behavior (Jabbi and Keysers, 2008). For example, Carr et al. (2003) asked subjects to observe and imitate static emotional facial expressions and found that both tasks induced extensive activity in the IFG. Activation of the right IFG and parietal cortex has also been found during passive viewing of dynamic compared to static emotional facial expressions (Arsalidou et al., 2011; Foley et al., 2012) and during viewing and executing smiles (Hennenlotter et al., 2005). It has been suggested that activated mirror neurons localized in the IFG and parietal regions could convert observed emotional facial expressions into a pattern of neural activity suitable for producing similar facial expressions, providing the basis for a motor simulation of facial expressions (Gazzola et al., 2006; van der Gaag et al., 2007; Jabbi and Keysers, 2008). Our results seem to be in line with the perception-behavior link model (Chartrand and Bargh, 1999), which assumes that an observer's motor system "resonates" with, and facilitates the understanding of, the perceived action. The classical MNS is believed to be responsible for such processes (for a review, see Bastiaansen et al., 2009). Moreover, dynamic emotional facial expressions might be a stronger social signal for inducing imitation processes in the MNS, since we did not observe IFG activity when comparing the neutral dynamic and neutral static conditions.

To summarize, our findings reveal that the functional properties of the classical MNS manifest mainly during perception of dynamic, compared to static, facial displays. This may be because dynamic stimuli, which are relevant for social interaction, engage a wide network of brain regions sensitive to motion (Kilts et al., 2003; Kessler et al., 2011) and to signaled intentions (Gallagher et al., 2000; Pelphrey et al., 2003), and may thus be a strong social signal for inducing simulation processes in the MNS.

Relationships between neural activity and facial muscle responses

One of the fundamental questions regarding the neural basis of FM is whether this phenomenon involves motor and/or affective representations of observed expressions (for a review, see Bastiaansen et al., 2009). So far, only one study has examined this question with simultaneous recording of EMG and BOLD signals; however, it used only static avatar emotional expressions (Likowski et al., 2012). In our study, in which both static and dynamic natural displays were used, the associations between regional brain activity and facial muscle reactions showed that the correlated regions are related to the motor simulation of facial expressions (IFG, pre-SMA, IPL), but also to emotional processing. For happiness displays, correlations with muscle responses were additionally found in basal ganglia structures (right caudate head, bilateral globus pallidus and putamen), the nucleus accumbens and the insula, while for angry displays they were found in the right amygdala and insula, among others.

The activations in the IFG and pre-SMA observed in our study coincide with earlier studies (Hennenlotter et al., 2005; Lee et al., 2006; Jabbi and Keysers, 2008; Likowski et al., 2012; Kircher et al., 2013) proposing that these regions constitute a representation network for the observation and imitation of emotional facial expressions (for a review, see Bastiaansen et al., 2009). For example, Lee et al. (2006), who also explored the relation between brain activity and facial muscle movement (facial markers), emphasized the role of the IFG in the intentional imitation of emotional expressions.

Interestingly, our results indicated that activation of the pre-SMA correlated with the magnitude of the facial muscle response for happy dynamic displays. Similar results were observed by Iwase et al. (2002) during spontaneous smiling. It has been proposed that activation of the pre-SMA could reflect contagion of happy facial expressions (Dimberg et al., 2002), given its connections to the striatum (Lehéricy et al., 2004), a critical component of the motor and reward systems. Moreover, it is well known that smiles evoke a positive response (Sims et al., 2012), serving as socially rewarding stimuli (Heerey and Crossley, 2013) in face-to-face interactions. This interpretation fits with our finding that the basal ganglia and nucleus accumbens, structures constituting reward-related circuitry (for a review, see Kringelbach and Berridge, 2010), were involved in the processing of positive facial expressions. Basal nuclei activity correlated positively with ZM (nucleus accumbens, putamen) and negatively with CS (caudate head) activity for happiness dynamic displays, which is consistent with previous findings in the literature. It has been shown that the nucleus accumbens responds to various positive stimuli, such as money (Clithero et al., 2011), erotic pictures (Sabatinelli et al., 2007) and happy facial displays (Monk et al., 2008), and is thought to be involved in the experience of pleasure (Ernst et al., 2004). This interpretation is supported by the fact that for happiness displays we also found a significant EMG response in the OO muscle, which could mean that subjects recognized the happy displays as "real" smiles. Furthermore, our results agree with an earlier study (Likowski et al., 2012) in which stronger ZM reactions to happy faces were associated with increased activity in the right caudate, and correspond to Vrticka et al. (2013), who showed that the left putamen is more activated during imitation than during passive observation of happy displays.

Interestingly, in our study the activation in the right caudate also correlated positively with OO reactions to anger expressions. The caudate nucleus, part of the dorsal striatum, is known to be involved in motor and non-motor processes, including procedural learning, associative learning and inhibitory control of action (Soghomonian, 2016). Moreover, it has been suggested that basal ganglia activity also reflects approach motivation and could represent reward (O'Doherty et al., 2003; Lee et al., 2006). Recently, Mühlberger et al. (2011) reported that perception of both happy and angry dynamic facial expressions was related to dorsal striatum activity. Furthermore, the activity of the caudate nuclei during perception of anger may reflect a more general role in the detection of danger signals. For example, it has been shown that subjects with Parkinson's disease exhibit selective impairments in the recognition of negative facial emotions, e.g., anger (Sprengelmeyer et al., 2003; Clark et al., 2008), fear (Livingstone et al., 2016), and sadness and disgust (Sprengelmeyer et al., 2003; Dujardin et al., 2004). Accordingly, neuroimaging data from healthy subjects tend to confirm the role of the caudate nuclei in the processing of negative emotions, particularly in the recognition of angry expressions (Beyer et al., 2017).

Importantly, we observed that during perception of dynamic anger displays, the OO response correlated not only with caudate nucleus activity but also with right amygdala activity. Historically, the amygdala has been viewed as playing a substantial role in the processing and expression of fear, but it has recently been linked to other emotions, both positive and negative. For example, some studies have found amygdala activation during the observation and execution of both negative and positive facial expressions (Carr et al., 2003; van der Gaag et al., 2007), suggesting that this structure may reflect not only imitation but also the experience of a particular emotion (Kircher et al., 2013). The contraction of the OO in response to anger expressions could be interpreted as a reaction to the negative signal value of the stimulus or as a sign of arousal or interest (Witvliet and Vrana, 1995).

Further, in our study we observed correlations between insula activity and facial responses during perception of both happy and angry facial expressions. Recently, a considerable number of studies (Carr et al., 2003; van der Gaag et al., 2007; Jabbi and Keysers, 2008) have suggested that the anterior insula and the adjacent inferior frontal operculum (IFO) may represent an emotional component of the MNS. The role of these structures has been demonstrated not only for observing but also for experiencing emotions [e.g., unpleasant odors (Wicker et al., 2003) or tastes (Jabbi et al., 2007)]. Moreover, the insula is involved in the experience of positive emotions, such as during the viewing of pleasing facial expressions (Jabbi et al., 2007) or during the observation and execution of smiles (Hennenlotter et al., 2005). As far as the nature of FM is concerned, one idea is that the insula and IFO may underlie a simulation of emotional feeling states (referred to as hot simulation), whereas the IFG (which activates during observation of neutral and emotional facial expressions) may reflect a form of motor simulation (referred to as cold simulation) (for a review, see Bastiaansen et al., 2009). Support for this idea comes from a connectivity analysis of IFO and IFG activity while subjects experienced unpleasant and neutral tastes: using Granger causality, Jabbi and Keysers (2008) showed that activity in the IFO (a structure functionally related to the insula) is causally triggered by activity in the IFG. In other words, motor simulation in the IFG seems to trigger, in the IFO, an affective simulation of what the other person is feeling. Our results regarding the correlated activity of the IFG and muscle responses, as well as the separate correlations between those muscle responses and the insula, seem to be in line with this interpretation.
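
For readers unfamiliar with the method, a bivariate Granger causality test of the IFG-to-IFO direction could look like the sketch below (Python with statsmodels, on synthetic time series). This illustrates the general technique only and is not a reconstruction of Jabbi and Keysers' analysis; the simulated coefficients and lag choice are assumptions.

```python
# Minimal sketch of a bivariate Granger causality test on synthetic data,
# illustrating the IFG -> IFO direction discussed above (not the original analysis).
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(1)

# Simulate an "IFG" series and an "IFO" series that follows it with a lag,
# so IFG activity should Granger-cause IFO activity.
n = 300
ifg = rng.normal(size=n)
ifo = np.zeros(n)
for t in range(1, n):
    ifo[t] = 0.6 * ifg[t - 1] + 0.2 * ifo[t - 1] + rng.normal(scale=0.5)

# grangercausalitytests asks whether the second column Granger-causes the
# first, so the putative effect (IFO) goes first and the cause (IFG) second.
data = np.column_stack([ifo, ifg])
results = grangercausalitytests(data, maxlag=2, verbose=False)
f_stat, p_value, _, _ = results[1][0]["ssr_ftest"]
print(f"Lag-1 Granger test: F = {f_stat:.1f}, p = {p_value:.3g}")
```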

It should be noted that in our study, as in others (Lee et al., 2006; van der Gaag et al., 2007), we did not observe activity in the motor or somatosensory cortex during passive viewing of emotional expressions. Indeed, there is a theoretical assumption that FM processes activate motor as well as somatosensory neuronal structures involved in processing the facial expression (Korb et al., 2015; Paracampo et al., 2016; Wood et al., 2016). Based on the neuroimaging data, however, it seems that the magnitude of facial muscular change during emotional expression covaries with activity related to emotion processing, i.e., in the insula or amygdala (Lee et al., 2006; van der Gaag et al., 2007), rather than in the motor and somatosensory cortex. Moreover, explicit imitation, rather than passive observation, of facial expressions has been shown to engage the somatosensory and premotor cortices more strongly; accordingly, activity in the IFG was more pronounced during imitation than during passive viewing of emotional expressions (Carr et al., 2003).

In conclusion, our study supports the general agreement among researchers that dynamic facial expressions are a valuable source of information in social communication. This was evident in the stronger FM and greater neural network activations for dynamic compared with static facial expressions of happiness and anger. Moreover, the direct relationships between the FM response and brain activity revealed that the associated structures belong to both motor and emotional components of the FM phenomenon. The activity of the IFG and pre-SMA (classical MNS) appears to reflect action representation (i.e., the motor aspects of observed facial expressions), while the insula and amygdalae (extended MNS) process the emotional content of facial expressions. Furthermore, our results agree with the proposal that FM is not a pure motor copy of behavior but rather engages distinct neural networks involved in emotion processing. Based on current knowledge, FM appears to include both motor imitation and emotional contagion processes; however, their mutual relations have not yet been established conclusively. For example, motor imitation may lead to emotional contagion or vice versa, and other factors may also play an important role in social interactions.

Author contributions

Conceived and designed the experiments: KR and ŁZ. Performed the experiments: KR and ŁZ. Analyzed the data: KR and ŁZ. Contributed materials: KR and ŁZ. Wrote the paper: KR, ŁZ, KJ-S, and IS.

Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Footnotes

Funding. This study was supported by grant no. 2011/03/B/HS6/05161 from the Polish National Science Centre provided to KR.

Supplementary material

The Supplementary Material for this article can be found online at: https://www.frontiersin.org/articles/10.3389/fpsyg.2018.00052/full#supplementary-material

References

  • Allen P. J., Josephs O., Turner R. (2000). A method for removing imaging artifact from continuous EEG recorded during functional MRI. Neuroimage 12, 230–239. 10.1006/nimg.2000.0599
  • Ambadar Z., Schooler J. W., Cohn J. F. (2005). Deciphering the enigmatic face: the importance of facial dynamics in interpreting subtle facial expressions. Psychol. Sci. 16, 403–410. 10.1111/j.0956-7976.2005.01548.x
  • Andréasson P., Dimberg U. (2008). Emotional empathy and facial feedback. J. Nonverbal Behav. 32, 215–224. 10.1007/s10919-008-0052-z
  • Arsalidou M., Morris D., Taylor M. J. (2011). Converging evidence for the advantage of dynamic facial expressions. Brain Topogr. 24, 149–163. 10.1007/s10548-011-0171-4
  • Bastiaansen J. A. C. J., Thioux M., Keysers C. (2009). Evidence for mirror systems in emotions. Philos. Trans. R. Soc. B 364, 2391–2404. 10.1098/rstb.2009.0058
  • Beyer F., Krämer U. M., Beckmann C. F. (2017). Anger-sensitive networks: characterising neural systems recruited during aggressive social interactions using data-driven analysis. Soc. Cogn. Affect. Neurosci. 26, 2480–2492. 10.1093/scan/nsx117
  • Brett M., Anton J. L., Valabregue R., Poline J. B. (2002). Region of interest analysis using an SPM toolbox, in Neuroimage (Sendai: 8th International Conference on Functional Mapping of the Human Brain). Available online at: http://matthew.dynevor.org/research/articles/marsbar/marsbar_abstract.pdf
  • Cacioppo J. T., Petty R. E., Losch M. E., Kim H. S. (1986). Electromyographic activity over facial muscle regions can differentiate the valence and intensity of affective reactions. J. Pers. Soc. Psychol. 50, 260–268. 10.1037/0022-3514.50.2.260
  • Cannon P. R., Schnall S., White M. (2011). Transgressions and expressions: affective facial muscle activity predicts moral judgments. Soc. Psychol. Personal. Sci. 2, 325–331. 10.1177/1948550610390525
  • Carr L., Iacoboni M., Dubeau M. C., Mazziotta J. C., Lenzi G. L. (2003). Neural mechanisms of empathy in humans: a relay from neural systems for imitation to limbic areas. Proc. Natl. Acad. Sci. U.S.A. 100, 5497–5502. 10.1073/pnas.0935845100
  • Chartrand T. L., Bargh J. A. (1999). The chameleon effect: the perception-behavior link and social interaction. J. Pers. Soc. Psychol. 76, 893–910. 10.1037/0022-3514.76.6.893
  • Clark U. S., Neargarder S., Cronin-Golomb A. (2008). Specific impairments in the recognition of emotional facial expressions in Parkinson's disease. Neuropsychologia 46, 2300–2309. 10.1016/j.neuropsychologia.2008.03.014
  • Clithero J. A., Reeck C., Carter R. M., Smith D. V., Huettel S. A. (2011). Nucleus accumbens mediates relative motivation for rewards in the absence of choice. Front. Hum. Neurosci. 5:87. 10.3389/fnhum.2011.00087
  • Dimberg U., Petterson M. (2000). Facial reactions to happy and angry facial expressions: evidence for right hemisphere dominance. Psychophysiology 37, 693–696. 10.1111/1469-8986.3750693
  • Dimberg U., Thunberg M. (1998). Rapid facial reactions to emotional facial expressions. Scand. J. Psychol. 39, 39–45. 10.1111/1467-9450.00054
  • Dimberg U., Andréasson P., Thunberg M. (2011). Emotional empathy and facial reactions to facial expressions. J. Psychophysiol. 25, 26–31. 10.1027/0269-8803/a000029
  • Dimberg U., Thunberg M., Elmehed K. (2000). Unconscious facial reactions to emotional facial expressions. Psychol. Sci. 11, 86–89. 10.1111/1467-9280.00221
  • Dimberg U., Thunberg M., Grunedal S. (2002). Facial reactions to emotional stimuli: automatically controlled emotional responses. Cogn. Emot. 16, 449–471. 10.1080/02699930143000356
  • Dujardin K., Blairy S., Defebvre L., Duhem S., Noël Y., Hess U., et al. (2004). Deficits in decoding emotional facial expressions in Parkinson's disease. Neuropsychologia 42, 239–250. 10.1016/S0028-3932(03)00154-4
  • Eickhoff S. (2016). SPM Anatomy Toolbox. Available online at: http://www.fz-juelich.de/inm/inm-1/DE/Forschung/_docs/SPMAnatomyToolbox/SPMAnatomyToolbox_node.html
  • Ekman P., Rosenberg E. L. (2012). What the Face Reveals: Basic and Applied Studies of Spontaneous Expression Using the Facial Action Coding System (FACS). 10.1093/acprof:oso/9780195179644.001.0001
  • Ekman P., Davidson R. J., Friesen W. V. (1990). The Duchenne smile: emotional expression and brain physiology. II. J. Pers. Soc. Psychol. 58, 342–353. 10.1037/0022-3514.58.2.342
  • Ernst M., Nelson E. E., McClure E. B., Monk C. S., Munson S., Eshel N., et al. (2004). Choice selection and reward anticipation: an fMRI study. Neuropsychologia 42, 1585–1597. 10.1016/j.neuropsychologia.2004.05.011
  • Foley E., Rippon G., Thai N. J., Longe O., Senior C. (2012). Dynamic facial expressions evoke distinct activation in the face perception network: a connectivity analysis study. J. Cogn. Neurosci. 24, 507–520. 10.1162/jocn_a_00120
  • Fridlund A. J., Cacioppo J. T. (1986). Guidelines for human electromyographic research. Psychophysiology 23, 567–589. 10.1111/j.1469-8986.1986.tb00676.x
  • Furl N., Henson R. N., Friston K. J., Calder A. J. (2015). Network interactions explain sensitivity to dynamic faces in the superior temporal sulcus. Cereb. Cortex 25, 2876–2882. 10.1093/cercor/bhu083
  • Gallagher H. L., Happé F., Brunswick N., Fletcher P. C., Frith U., Frith C. D. (2000). Reading the mind in cartoons and stories: an fMRI study of “theory of mind” in verbal and nonverbal tasks. Neuropsychologia 38, 11–21. 10.1016/S0028-3932(99)00053-6
  • Gazzola V., Aziz-Zadeh L., Keysers C. (2006). Empathy and the somatotopic auditory mirror system in humans. Curr. Biol. 16, 1824–1829. 10.1016/j.cub.2006.07.072
  • Hatfield E., Cacioppo J. T., Rapson R. L. (1993). Emotional contagion. Curr. Dir. Psychol. Sci. 2, 96–99. 10.1111/1467-8721.ep10770953
  • Heerey E. A., Crossley H. M. (2013). Predictive and reactive mechanisms in smile reciprocity. Psychol. Sci. 24, 1446–1455. 10.1177/0956797612472203
  • Hennenlotter A., Schroeder U., Erhard P., Castrop F., Haslinger B., Stoecker D., et al. (2005). A common neural basis for receptive and expressive communication of pleasant facial affect. Neuroimage 26, 581–591. 10.1016/j.neuroimage.2005.01.057
  • Hess U., Blairy S. (2001). Facial mimicry and emotional contagion to dynamic emotional facial expressions and their influence on decoding accuracy. Int. J. Psychophysiol. 40, 129–141. 10.1016/S0167-8760(00)00161-6
  • Hess U., Bourgeois P. (2010). You smile–I smile: emotion expression in social interaction. Biol. Psychol. 84, 514–520. 10.1016/j.biopsycho.2009.11.001
  • Hess U., Philippot P., Blairy S. (1998). Facial reactions to emotional facial expressions: affect or cognition? Cogn. Emot. 12, 509–531. 10.1080/026999398379547
  • Iacoboni M., Dapretto M. (2006). The mirror neuron system and the consequences of its dysfunction. Nat. Rev. Neurosci. 7, 942–951. 10.1038/nrn2024
  • Iwase M., Ouchi Y., Okada H., Yokoyama C., Nobezawa S., Yoshikawa E., et al. (2002). Neural substrates of human facial expression of pleasant emotion induced by comic films: a PET study. Neuroimage 17, 758–768. 10.1006/nimg.2002.1225
  • Jabbi M., Keysers C. (2008). Inferior frontal gyrus activity triggers anterior insula response to emotional facial expressions. Emotion 8, 775–780. 10.1037/a0014194
  • Jabbi M., Swart M., Keysers C. (2007). Empathy for positive and negative emotions in the gustatory cortex. Neuroimage 34, 1744–1753. 10.1016/j.neuroimage.2006.10.032
  • Kessler H., Doyen-Waldecker C., Hofer C., Hoffmann H., Traue H. C., Abler B. (2011). Neural correlates of the perception of dynamic versus static facial expressions of emotion. Psychosoc. Med. 8, 8. 10.3205/psm000072
  • Kilts C. D., Egan G., Gideon D. A., Ely T. D., Hoffman J. M. (2003). Dissociable neural pathways are involved in the recognition of emotion in static and dynamic facial expressions. Neuroimage 18, 156–168. 10.1006/nimg.2002.1323
  • Kircher T., Pohl A., Krach S., Thimm M., Schulte-Rüther M., Anders S., et al. (2013). Affect-specific activation of shared networks for perception and execution of facial expressions. Soc. Cogn. Affect. Neurosci. 8, 370–377. 10.1093/scan/nss008
  • Kohn N., Eickhoff S. B., Scheller M., Laird A. R., Fox P. T., Habel U. (2014). Neural network of cognitive emotion regulation — an ALE meta-analysis and MACM analysis. Neuroimage 87, 345–355. 10.1016/j.neuroimage.2013.11.001
  • Korb S., Malsert J., Rochas V., Rihs T. A., Rieger S. W., Schwab S., et al. (2015). Gender differences in the neural network of facial mimicry of smiles – an rTMS study. Cortex 70, 101–114. 10.1016/j.cortex.2015.06.025
  • Korb S., With S., Niedenthal P., Kaiser S., Grandjean D. (2014). The perception and mimicry of facial movements predict judgments of smile authenticity. PLoS ONE 9:e99194. 10.1371/journal.pone.0099194
  • Kringelbach M. L., Berridge K. C. (2010). The functional neuroanatomy of pleasure and happiness. Discov. Med. 9, 579–587.
  • Larsen J. T., Norris C. J., Cacioppo J. T. (2003). Effects of positive and negative affect on electromyographic activity over zygomaticus major and corrugator supercilii. Psychophysiology 40, 776–785. 10.1111/1469-8986.00078
  • Lee T. W., Josephs O., Dolan R. J., Critchley H. D. (2006). Imitating expressions: emotion-specific neural substrates in facial mimicry. Soc. Cogn. Affect. Neurosci. 1, 122–135. 10.1093/scan/nsl012
  • Lehéricy S., Ducros M., Krainik A., Francois C., Van de Moortele P. F., Ugurbil K., et al. (2004). 3-D diffusion tensor axonal tracking shows distinct SMA and pre-SMA projections to the human striatum. Cereb. Cortex 14, 1302–1309. 10.1093/cercor/bhh091
  • Leslie A. M., Friedman O., German T. P. (2004). Core mechanisms in “theory of mind.” Trends Cogn. Sci. 8, 528–533. 10.1016/j.tics.2004.10.001
  • Likowski K. U., Mühlberger A., Gerdes A. B., Wieser M. J., Pauli P., Weyers P. (2012). Facial mimicry and the mirror neuron system: simultaneous acquisition of facial electromyography and functional magnetic resonance imaging. Front. Hum. Neurosci. 6:214. 10.3389/fnhum.2012.00214
  • Likowski K. U., Mühlberger A., Seibt B., Pauli P., Weyers P. (2008). Modulation of facial mimicry by attitudes. J. Exp. Soc. Psychol. 44, 1065–1072. 10.1016/j.jesp.2007.10.007
  • Likowski K. U., Mühlberger A., Seibt B., Pauli P., Weyers P. (2011). Processes underlying congruent and incongruent facial reactions to emotional facial expressions. Emotion 11, 457–467. 10.1037/a0023162
  • Livingstone S. R., Vezer E., McGarry L. M., Lang A. E., Russo F. A. (2016). Deficits in the mimicry of facial expressions in Parkinson's disease. Front. Psychol. 7:780. 10.3389/fpsyg.2016.00780
  • Monk C. S., Klein R. G., Telzer E. H., Schroth E. A., Mannuzza S., Moulton J. L., et al. (2008). Amygdala and nucleus accumbens activation to emotional facial expressions in children and adolescents at risk for major depression. Am. J. Psychiatry 165, 90–98. 10.1176/appi.ajp.2007.06111917
  • Mühlberger A., Wieser M. J., Gerdes A. B., Frey M. C. M., Weyers P., Pauli P. (2011). Stop looking angry and smile, please: start and stop of the very same facial expression differentially activate threat- and reward-related brain networks. Soc. Cogn. Affect. Neurosci. 6, 321–329. 10.1093/scan/nsq039
  • Murata A., Saito H., Schug J., Ogawa K., Kameda T. (2016). Spontaneous facial mimicry is enhanced by the goal of inferring emotional states: evidence for moderation of “automatic” mimicry by higher cognitive processes. PLoS ONE 11:e0153128. 10.1371/journal.pone.0153128
  • Neumann R., Strack F. (2000). Approach and avoidance: the influence of proprioceptive and exteroceptive cues on encoding of affective information. J. Pers. Soc. Psychol. 79, 39–48. 10.1037/0022-3514.79.1.39
  • O'Doherty J., Winston J., Critchley H., Perrett D., Burt D., Dolan R. (2003). Beauty in a smile: the role of medial orbitofrontal cortex in facial attractiveness. Neuropsychologia 41, 147–155. 10.1016/S0028-3932(02)00145-8
  • Paracampo R., Tidoni E., Borgomaneri S., di Pellegrino G., Avenanti A. (2016). Sensorimotor network crucial for inferring amusement from smiles. Cereb. Cortex 7, 5116–5129. 10.1093/cercor/bhw294
  • Pelphrey K. A., Singerman J. D., Allison T., McCarthy G. (2003). Brain activation evoked by perception of gaze shifts: the influence of context. Neuropsychologia 41, 156–170. 10.1016/S0028-3932(02)00146-X
  • Rizzolatti G., Craighero L. (2004). The mirror-neuron system. Annu. Rev. Neurosci. 27, 169–192. 10.1146/annurev.neuro.27.070203.144230
  • Robins D. L., Hunyadi E., Schultz R. T. (2009). Superior temporal activation in response to dynamic audio-visual emotional cues. Brain Cogn. 69, 269–278. 10.1016/j.bandc.2008.08.007
  • Russell J. A., Fernández-Dols J. M. (eds.) (1997). The Psychology of Facial Expression. Cambridge: Cambridge University Press.
  • Rymarczyk K., Biele C., Grabowska A., Majczynski H. (2011). EMG activity in response to static and dynamic facial expressions. Int. J. Psychophysiol. 79, 330–333. 10.1016/j.ijpsycho.2010.11.001
  • Rymarczyk K., Zurawski Ł., Jankowiak-Siuda K., Szatkowska I. (2016). Do dynamic compared to static facial expressions of happiness and anger reveal enhanced facial mimicry? PLoS ONE 11:e0158534. 10.1371/journal.pone.0158534
  • Sabatinelli D., Bradley M. M., Lang P. J., Costa V. D., Versace F. (2007). Pleasure rather than salience activates human nucleus accumbens and medial prefrontal cortex. J. Neurophysiol. 98, 1374–1379. 10.1152/jn.00230.2007
  • Sato W., Yoshikawa S. (2007). Enhanced experience of emotional arousal in response to dynamic facial expressions. J. Nonverbal Behav. 31, 119–135. 10.1007/s10919-007-0025-7
  • Sato W., Fujimura T., Suzuki N. (2008). Enhanced facial EMG activity in response to dynamic facial expressions. Int. J. Psychophysiol. 70, 70–74. 10.1016/j.ijpsycho.2008.06.001
  • Sato W., Kochiyama T., Uono S. (2015). Spatiotemporal neural network dynamics for the processing of dynamic facial expressions. Sci. Rep. 5:12432. 10.1038/srep12432
  • Sato W., Kochiyama T., Yoshikawa S., Naito E., Matsumura M. (2004). Enhanced neural activity in response to dynamic facial expressions of emotion: an fMRI study. Cogn. Brain Res. 20, 81–91. 10.1016/j.cogbrainres.2004.01.008
  • Seibt B., Mühlberger A., Likowski K. U., Weyers P. (2015). Facial mimicry in its social setting. Front. Psychol. 6:1122. 10.3389/fpsyg.2015.01122
  • Sims T. B., Van Reekum C. M., Johnstone T., Chakrabarti B. (2012). How reward modulates mimicry: EMG evidence of greater facial mimicry of more rewarding happy faces. Psychophysiology 49, 998–1004. 10.1111/j.1469-8986.2012.01377.x
  • Soghomonian J.-J. (ed.) (2016). The Basal Ganglia. Cham: Springer International Publishing. 10.1007/978-3-319-42743-0
  • Sprengelmeyer R., Young A. W., Mahn K., Schroeder U., Woitalla D., Büttner T., et al. (2003). Facial expression recognition in people with medicated and unmedicated Parkinson's disease. Neuropsychologia 41, 1047–1057. 10.1016/S0028-3932(02)00295-6
  • The Mathworks Inc. (2013). Matlab 2013b. Natick, MA.
  • Trautmann S. A., Fehr T., Herrmann M. (2009). Emotions in motion: dynamic compared to static facial expressions of disgust and happiness reveal more widespread emotion-specific activations. Brain Res. 1284, 100–115. 10.1016/j.brainres.2009.05.075
  • van der Gaag C., Minderaa R. B., Keysers C. (2007). Facial expressions: what the mirror neuron system can and cannot tell us. Soc. Neurosci. 2, 179–222. 10.1080/17470910701376878
  • van der Schalk J., Hawk S. T., Fischer A. H., Doosje B. (2011). Moving faces, looking places: validation of the Amsterdam dynamic facial expression set (ADFES). Emotion 11, 907–920. 10.1037/a0023853
  • van der Zwaag W., Da Costa S. E., Zürcher N. R., Adams R. B. Jr., Hadjikhani N. (2012). A 7 tesla fMRI study of amygdala responses to fearful faces. Brain Topogr. 25, 125–128. 10.1007/s10548-012-0219-0
  • Van Overwalle F. (2009). Social cognition and the brain: a meta-analysis. Hum. Brain Mapp. 30, 829–858. 10.1002/hbm.20547
  • Vrticka P., Simioni S., Fornari E., Schluep M., Vuilleumier P., Sander D. (2013). Neural substrates of social emotion regulation: a fMRI study on imitation and expressive suppression to dynamic facial signals. Front. Psychol. 4:95. 10.3389/fpsyg.2013.00095
  • Wake Forest University (2014). WFU PickAtlas 3.0.3.
  • Weyers P., Mühlberger A., Hefele C., Pauli P. (2006). Electromyographic responses to static and dynamic avatar emotional facial expressions. Psychophysiology 43, 450–453. 10.1111/j.1469-8986.2006.00451.x
  • Wicker B., Keysers C., Plailly J., Royet J. P., Gallese V., Rizzolatti G. (2003). Both of us disgusted in My insula: the common neural basis of seeing and feeling disgust. Neuron 40, 655–664. 10.1016/S0896-6273(03)00679-2
  • Witvliet C. V., Vrana S. R. (1995). Psychophysiological responses as indices of affective dimensions. Psychophysiology 32, 436–443. 10.1111/j.1469-8986.1995.tb02094.x
  • Wood A., Rychlowska M., Korb S., Niedenthal P. (2016). Fashioning the face: sensorimotor simulation contributes to facial expression recognition. Trends Cogn. Sci. 20, 227–240. 10.1016/j.tics.2015.12.010
