Previous studies have shown that complex visual stimuli, such as emotional facial expressions, can influence brain activity independently of the observers’ awareness. Little is known yet, however, about the “informational correlates” of consciousness—i.e., which low-level information correlates with brain activation during conscious vs. non-conscious perception. Here, we investigated this question in the spatial frequency (SF) domain. We examined which SFs in disgusted and fearful facial expressions modulate activation in the insula and amygdala over time and as a function of awareness, using a combination of intracranial event-related potentials (ERPs), SF Bubbles (Willenbockel et al., 2010a), and Continuous Flash Suppression (CFS; Tsuchiya and Koch, 2005). Patients implanted with electrodes for epilepsy monitoring viewed face photographs (13° × 7°) that were randomly SF filtered trial by trial. In the conscious condition, the faces were visible; in the non-conscious condition, they were rendered invisible using CFS. Data were analyzed by performing multiple linear regressions on the SF filters from each trial and the transformed ERP amplitudes across time. The resulting classification images suggest that many SFs are involved in the conscious and non-conscious perception of emotional expressions, with those between 6 and 10 cycles per face width being particularly important early on. The results also revealed qualitative differences between the awareness conditions for both regions. Non-conscious processing relied more on low SFs and was faster than conscious processing. Overall, our findings are consistent with the idea that different pathways are employed for the processing of emotional stimuli under different degrees of awareness.
The present study represents a first step to mapping with a high temporal resolution how SF information “flows” through the emotion-processing network and to shedding light on the informational correlates of consciousness in general.
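The regression step described above — trial-wise SF filters regressed onto ERP amplitudes to yield a classification image over frequency and time — can be sketched as follows. All array names, dimensions, and the simulated data are illustrative assumptions, not the authors’ actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumed, not from the study)
n_trials, n_freqs, n_timepoints = 500, 128, 200

# Random SF filter applied on each trial (trials x spatial frequencies)
sf_filters = rng.random((n_trials, n_freqs))

# Simulated ERP amplitude per trial and time point (trials x time)
erp = rng.standard_normal((n_trials, n_timepoints))

# Regress the ERP amplitude at every time point onto the trial-wise SF
# filters; the coefficient map (frequencies x time) is the classification
# image, indicating which SFs drive the response at which latencies.
X = np.column_stack([np.ones(n_trials), sf_filters])   # add intercept column
betas, *_ = np.linalg.lstsq(X, erp, rcond=None)        # (1 + n_freqs) x time
classification_image = betas[1:]                       # drop the intercept row

print(classification_image.shape)  # → (128, 200)
```

In practice the coefficient map would be thresholded against a null distribution (e.g., by permuting trial labels) before interpreting any SF band as diagnostic.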
We investigated the effects of early visual deprivation on the underlying representation of the six basic emotions. Using multi-dimensional scaling (MDS), we compared the similarity judgments of adults who had missed early visual input because of bilateral congenital cataracts to control adults with normal vision. Participants made similarity judgments of the six basic emotional expressions, plus neutral, at three different intensities. Consistent with previous studies, the similarity judgments of typical adults could be modeled with four underlying dimensions, which can be interpreted as representing pleasure, arousal, potency and intensity of expressions. As a group, cataract-reversal patients showed a systematic structure with dimensions representing pleasure, potency, and intensity. However, an arousal dimension was not obvious in the patient group’s judgments. Hierarchical clustering analysis revealed a pattern in patients seen in typical 7-year-olds but not typical 14-year-olds or adults. There was also more variability among the patients than among the controls, as evidenced by higher stress values for the MDS fit to the patients’ data and more dispersed weightings on the four dimensions. The findings suggest an important role for early visual experience in shaping the later development of the representations of emotions. Since the normal underlying structure for emotion emerges postnatally and continues to be refined until late childhood, the altered representation of emotion in adult patients suggests a sleeper effect.
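The MDS approach used here — embedding pairwise similarity judgments in a low-dimensional space whose axes can then be interpreted (e.g., pleasure, arousal) — can be sketched with classical (Torgerson) MDS. The toy dissimilarity matrix below is purely illustrative and is not the study’s data.

```python
import numpy as np

def classical_mds(D, k=2):
    """Classical (Torgerson) MDS: embed an n x n dissimilarity matrix D in k dims."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n      # double-centering matrix
    B = -0.5 * J @ (D ** 2) @ J              # centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)     # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:k]    # keep the k largest
    scale = np.sqrt(np.clip(eigvals[order], 0, None))
    return eigvecs[:, order] * scale         # coordinates, one row per item

# Toy dissimilarities for three mutually equidistant "expressions"
D = np.array([[0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 0.0]])
coords = classical_mds(D, k=2)
print(coords.shape)  # → (3, 2)
```

A real analysis of the kind reported here would use a non-metric, individual-differences variant (e.g., INDSCAL-style weighted MDS), which is what makes the per-participant dimension weightings and stress values interpretable.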
Facial expressions are used by humans to convey various types of meaning in various contexts. The range of meanings spans from basic, possibly innate socio-emotional concepts such as ‘surprise’ to complex and culture-specific concepts such as ‘carelessly’. The range of contexts in which humans use facial expressions spans from responses to events in the environment to particular linguistic constructions within sign languages. In this mini review we summarize findings on the use and acquisition of facial expressions by signers and present a unified account of the range of facial expressions used, by positing three dimensions: semantic, iconic and compositional.
Face perception is critical for social communication. Given its fundamental importance in the course of evolution, innate neural mechanisms may anticipate the computations necessary for representing faces. However, the effect of visual deprivation on the formation of the neural mechanisms that underlie face perception is largely unknown. We previously showed that sighted individuals can recognize basic facial expressions by haptics surprisingly well. Moreover, the inferior frontal gyrus and posterior superior temporal sulcus in sighted subjects are involved in haptic and visual recognition of facial expressions. Here, we conducted both psychophysical and functional magnetic resonance imaging (fMRI) experiments to determine the nature of the neural representation that subserves the recognition of basic facial expressions in early-blind individuals. In a psychophysical experiment, both early-blind and sighted subjects haptically identified basic facial expressions at levels well above chance. In the subsequent fMRI experiment, both groups haptically identified facial expressions and shoe types (control). The sighted subjects then completed the same task visually. Within brain regions activated by the visual and haptic identification of facial expressions (relative to that of shoes) in the sighted group, corresponding haptic identification in the early blind activated regions in the inferior frontal and middle temporal gyri. These results suggest that the neural system that underlies the recognition of basic facial expressions develops supramodally even in the absence of early visual experience.
Emotion regulation is crucial for successfully engaging in social interactions. Yet, little is known about the neural mechanisms controlling behavioral responses to emotional expressions perceived in the face of other people, which constitute a key element of interpersonal communication. Here, we investigated brain systems involved in social emotion perception and regulation, using functional magnetic resonance imaging (fMRI) in 20 healthy participants who saw dynamic facial expressions of either happiness or sadness, and were asked either to imitate the expression or to suppress any expression on their own face (in addition to a gender judgment control task). fMRI results revealed higher activity in regions associated with emotion (e.g., the insula), motor function (e.g., motor cortex), and theory of mind during imitation. Activity in dorsal cingulate cortex was also increased during imitation, possibly reflecting greater action monitoring or conflict with own feeling states. In addition, premotor regions were more strongly activated during both imitation and suppression, suggesting a recruitment of motor control for both the production and inhibition of emotion expressions. Expressive suppression produced increases in dorsolateral and lateral prefrontal cortex typically related to cognitive control. These results suggest that voluntary imitation and expressive suppression modulate brain responses to emotional signals perceived from faces, by up- and down-regulating activity in distributed subcortical and cortical networks that are particularly involved in emotion, action monitoring, and cognitive control.
Change blindness—our inability to detect changes in a stimulus—occurs even when the change takes place gradually, without disruption (Simons et al., 2000). Such gradual changes are more difficult to detect than changes that involve a disruption. In this experiment, we extend previous findings to the domain of facial expressions of emotions occurring in the context of a realistic scene. Even with changes occurring in central, highly relevant stimuli such as faces, gradual changes still produced high levels of change blindness: Detection rates were three times lower for gradual changes than for displays involving disruption, with only 15% of the observers perceiving the gradual change within a single trial. However, despite this high rate of change blindness, changes on faces were significantly better detected than color changes occurring on non-facial objects in the same scene.
In children experiencing pain, the study of the social context of facial expressions might help to evaluate evolutionary and conditioning hypotheses of behavioural development. Social motivations and influences may be complex, as seen in studies of children having their ears pierced, and in studies of everyday pain in children. A study of opposing predictions of the long-term effects of parental caregiving is suggested.
In this article we discuss aspects of designing facial expressions for virtual humans (VHs) with a specific culture. First, we explore the notion of culture and its relevance for applications with a VH. Then we give a general scheme for designing emotional facial expressions, and identify the stages where a human is involved, either as a real person with some specific role, or as a VH displaying facial expressions. We discuss how the display and the emotional meaning of facial expressions may be measured in objective ways, and how the culture of displayers and judges may influence the process of analyzing human facial expressions and evaluating synthesized ones. We review psychological experiments on cross-cultural perception of emotional facial expressions. By identifying the culturally critical issues of data collection and interpretation with both real humans and VHs, we aim to provide a methodological reference and inspiration for further research.
Understanding the very nature of the smile with an integrative approach and a novel model is a fertile ground for a new theoretical vision and insights. However, from this perspective, I challenge the authors to integrate culture and race in their model, because both factors would impact upon the embodying and decoding of facial expressions.
In this article, we review empirical evidence regarding the relationship between facial expression and emotion during infancy. We focus on differential emotions theory’s view of this relationship because of its theoretical and methodological prominence. We conclude that current evidence fails to support its proposal regarding a set of pre-specified facial expressions that invariably reflect a corresponding set of discrete emotions in infants. Instead, the relationship between facial expression and emotion appears to be more complex. Some facial expressions may have different meanings in infants than in children and adults. In addition, nonemotion factors may sometimes lead to the production of “emotional” facial expressions. We consider alternative perspectives on the nature of emotion and emotional expression in infancy with particular focus on differentiation and dynamical systems approaches.
With a small but increasing number of exceptions, the cognitive sciences have enthusiastically endorsed the idea that there are basic facial expressions of emotions that are created by specific configurations of facial muscles. We review evidence that suggests an inherent role for context in emotion perception. Context does not merely change emotion perception at the edges; it leads to radical categorical changes. The reviewed findings suggest that configurations of facial muscles are inherently ambiguous, and they call for a different approach towards the understanding of facial expressions of emotions. The costs of sticking with the modal view, and the advantages of an expanded view, are succinctly reviewed.
Gaze plays a fundamental role in the processing of facial expressions from birth. Gaze direction is a crucial part of the social signal encoded in and decoded from faces. The ability to discriminate gaze direction, already evident early in life, is essential for the development of more complex socially relevant tasks, such as joint and shared attention. At the same time, facial expressions play a fundamental role in the encoding of gaze direction and, when combined, expression and gaze communicate behavioural motivation to approach or avoid. However, the investigation of how gaze direction and emotional expression interact during the processing of a face has been relatively neglected, and is the key question of this review.
In the target article, we reviewed empirical evidence regarding the relationship between facial expressions and emotion in infancy. In our response to commentators, we make three main points. First, we concur with Hertenstein that the field has thus far relied too heavily on deductive reasoning, and suggest that future research strike a balance between inductive and deductive reasoning. Second, we maintain that infant recognition of discrete emotions remains an open question. Third, we state our position regarding the revised version of differential emotions theory (DET).
The aim of this review is to show the fruitfulness of using images of facial expressions as experimental stimuli in order to study how neural systems support biologically relevant learning as it relates to social interactions. Here we consider facial expressions as naturally conditioned stimuli which, when presented in experimental paradigms, evoke activation in amygdala–prefrontal neural circuits that serve to decipher the predictive meaning of the expressions. Facial expressions offer a relatively innocuous strategy with which to investigate these normal variations in affective information processing, as well as the promise of elucidating what role the aberrance of such processing might play in emotional disorders.
According to a common sense theory, facial expressions signal specific emotions to people of all ages and therefore provide children easy access to the emotions of those around them. The evidence, however, does not support that account. Instead, children’s understanding of facial expressions is poor and changes qualitatively and slowly over the course of development. Initially, children divide facial expressions into two simple categories (feels good, feels bad). These broad categories are then gradually differentiated until an adult system of discrete categories is achieved, likely in the teen years. Children’s understanding of most specific emotions begins not with facial expressions, but with their understanding of the emotion’s antecedents and behavioral consequences.
This paper points out that a major shift of paradigm is currently going on in the study of the human face and it seeks to articulate and to develop the fundamental assumptions underlying this shift. The main theses of the paper are: 1) Facial expressions can convey meanings comparable to the meanings of verbal utterances. 2) Semantic analysis (whether of verbal utterances or of facial expressions) must distinguish between the context-independent invariant and its contextual interpretations. 3) Certain components of facial behavior (“facial gestures”) do have constant context-independent meanings. 4) The meanings of facial components and configurations of components have an inherent first-person and present tense orientation. 5) The basis for the interpretation of facial gestures is, above all, experiential. 6) The meanings of some facial expressions are universally intelligible and can be interpreted without reference to any local conventions. 7) To be fruitful, the semantic analysis of facial expressions needs a methodology. This can be derived from the methodological experience of linguistic semantics. The author illustrates and supports these theses by analyzing a range of universally interpretable facial expressions such as the following ones: “brow furrowed” (i.e. eyebrows drawn together); eyebrows raised; eyes wide open; corners of the mouth raised; corners of the mouth lowered; mouth open (while not speaking); lips pressed together; upper lip and nose “raised” (and, consequently, nose wrinkled).
A key feature of facial behavior is its dynamic quality. However, most previous research has been limited to the use of static images of prototypical expressive patterns. This article explores the role of facial dynamics in the perception of emotions, reviewing relevant empirical evidence demonstrating that dynamic information improves coherence in the identification of affect (particularly for degraded and subtle stimuli), leads to higher emotion judgments (i.e., intensity and arousal), and helps to differentiate between genuine and fake expressions. The findings underline that using static expressions not only poses problems of ecological validity, but also limits our understanding of what facial activity does. Implications for future research on facial activity, particularly for social neuroscience and affective computing, are discussed.
Research on the neural mechanisms underlying human facial emotion recognition has long focussed on genetically determined neural algorithms and often neglected the question of how these algorithms might be tuned by social learning. Here we show that facial emotion decoding skills can be significantly and sustainably improved by practise without an external teaching signal. Participants saw video clips of dynamic facial expressions of five women and were asked to decide which of four possible emotions (anger, disgust, fear and sadness) was shown in each clip. Although no external information about the correctness of the participant’s response or the sender’s true affective state was provided, participants showed a significant increase of facial emotion recognition accuracy both within and across two training sessions two days to several weeks apart. We discuss several factors that might have a critical or modulating effect on such unsupervised learning of facial emotion decoding skills.
Recent application of theories of embodied or grounded cognition to the recognition and interpretation of facial expression of emotion has led to an explosion of research in psychology and the neurosciences. However, despite the accelerating number of reported findings, it remains unclear how the many component processes of emotion and their neural mechanisms actually support embodied simulation. Equally unclear is what triggers the use of embodied simulation versus perceptual or conceptual strategies in determining meaning. The present article integrates behavioral research from social psychology with recent research in neurosciences in order to provide coherence to the extant and future research on this topic. The roles of several of the brain's reward systems, and the amygdala, somatosensory cortices, and motor centers are examined. These are then linked to behavioral and brain research on facial mimicry and eye gaze. Articulation of the mediators and moderators of facial mimicry and gaze are particularly useful in guiding interpretation of relevant findings from neurosciences. Finally, a model of the processing of the smile, the most complex of the facial expressions, is presented as a means to illustrate how to advance the application of theories of embodied cognition in the study of facial expression of emotion.
I agree with Williams that evolutionary theory provides the best account of the pain expression. We may disagree as to whether pain has an emotional dimension or includes discrete basic emotions as integral components. I interpret basic emotion expressions that occur contemporaneously with pain expression as representing separate but highly interactive systems, each with distinct adaptive functions.
This article focuses on a theoretical account integrating classic and recent findings on the communication of emotions across cultures: a dialect theory of emotion. Dialect theory uses a linguistic metaphor to argue emotion is a universal language with subtly different dialects. As in verbal language, it is more challenging to understand someone speaking a different dialect—which fits with empirical support for an in-group advantage, whereby individuals are more accurate judging emotional expressions from their own cultural group versus foreign groups. Dialect theory has sparked controversy with its implications for dominant theories about cross-cultural differences in emotion. This article reviews the theory, its mounting body of evidence, evidence for alternative accounts, and practical implications for multicultural societies.
As a product of natural selection, pain behavior must serve an adaptive function for the species beyond the accurate portrayal of the pain experience. Pain behavior does not simply refer to the pain experience, but promotes survival of the species in various and complex ways. This means that there is no purely respondent or operant pain behavior found in nature.
The embodied simulation of smiles involves motor activity that often changes the perceivers' own emotional experience (e.g., smiling can make us feel happy). Although Niedenthal et al. mention this possibility, the psychological processes by which embodiment changes emotions and their consequences for processing other emotions are not discussed in the target article's review. We argue that understanding the processes initiated by embodiment is important for a complete understanding of the effects of embodiment on emotion perception.
The ability to rapidly detect facial expressions of anger and threat over other salient expressions has adaptive value across the lifespan. Although studies have demonstrated this threat superiority effect in adults, surprisingly little research has examined the development of this process over the childhood period. In this study, we examined the efficiency of children's facial processing in visual search tasks. In Experiment 1, children (N=49) aged 8 to 11 years were faster and more accurate in detecting angry target faces embedded in neutral backgrounds than vice versa, and they were slower in detecting the absence of a discrepant face among angry than among neutral faces. This search pattern was unaffected by an increase in matrix size. Faster detection of angry than neutral deviants may reflect that angry faces stand out more among neutral faces than vice versa, or that detection of neutral faces is slowed by the presence of surrounding angry distracters. When keeping the background constant in Experiment 2, children (N=35) aged 8 to 11 years were faster and more accurate in detecting angry than sad or happy target faces among neutral background faces. Moreover, children with higher levels of anxiety were quicker to find both angry and sad faces whereas low anxious children showed an advantage for angry faces only. Results suggest a threat superiority effect in processing facial expressions in young children as in adults, and that increased sensitivity for negative faces may be characteristic of children with anxiety problems.
This paper proposes that human expression of pain in the presence or absence of caregivers, and the detection of pain by observers, arises from evolved propensities. The function of pain is to demand attention and prioritise escape, recovery, and healing; where others can help achieve these goals, effective communication of pain is required. Evidence is reviewed of a distinct and specific facial expression of pain from infancy to old age, consistent across stimuli, and recognizable as pain by observers. Voluntary control over amplitude is incomplete, and observers can better detect pain that the individual attempts to suppress rather than amplify or simulate. In many clinical and experimental settings, the facial expression of pain is incorporated with verbal and nonverbal vocal activity, posture, and movement in an overall category of pain behaviour. This is assumed by clinicians to be under operant control of social contingencies such as sympathy, caregiving, and practical help; thus, strong facial expression is presumed to constitute an attempt to manipulate these contingencies by amplification of the normal expression. Operant formulations support skepticism about the presence or extent of pain, judgments of malingering, and sometimes the withholding of caregiving and help. To the extent that pain expression is influenced by environmental contingencies, however, it could equally plausibly constitute the release of suppression according to evolved contingent propensities that guide behaviour. Pain has been largely neglected in the evolutionary literature and the literature on expression of emotion, but an evolutionary account can generate improved assessment of pain and reactions to it.