Conscious emotional experience emerges as a function of multilevel, appraisal-driven response synchronization

https://doi.org/10.1016/j.concog.2008.03.019

Abstract

In this paper we discuss the issue of the processes potentially underlying the emergence of emotional consciousness in the light of theoretical considerations and empirical evidence. First, we argue that componential emotion models, and specifically the Component Process Model (CPM), may be better able to account for the emergence of feelings than basic emotion or dimensional models. Second, we advance the hypothesis that consciousness of emotional reactions emerges when lower levels of processing are not sufficient to cope with the event and regulate the emotional process, particularly when the degree of synchronization between the components reaches a critical level and duration. Third, we review recent neuroscience evidence that bolsters our claim of the central importance of the synchronization of neuronal assemblies at different levels of processing.

Introduction

Understanding the role of consciousness and the processes underlying its emergence in emotion requires agreement among researchers on what emotion is and a theoretical framework allowing specification of the underlying mechanisms and hypothesis-guided research. Currently, there are three major contenders for the role of a general model in affective science research on emotion mechanisms: (1) basic emotion, (2) dimensional feeling, and (3) componential appraisal models. We briefly describe these models and evaluate their promise, in the light of recent findings in cognitive neuroscience, as theoretical grounding of research on consciousness in emotion processes.

(1) Basic emotion models are based on Tomkins's (1962, 1963) interpretation of Darwin's (1872/1998) account of the evolutionary functions of emotions and their expression, and are represented by the theoretical proposals of Izard (1977, 1993) and Ekman (1992, 1999). In this tradition, basic emotions are defined as affect programs that are triggered by appropriate eliciting events to produce emotion-specific response patterns such as prototypical facial expressions, physiological reactions, and action tendencies. Although theorists differ on the number and nature of basic emotions, anger, joy, sadness, fear, and disgust are generally included. Other emotions are either conceptualized as blends of basic emotions (e.g., contempt is a blend of anger and disgust) or given a different status (e.g., shame is a complex social emotion). Among the drawbacks of basic emotion theory as a guiding model are (a) the lack of clear predictions on the eliciting conditions for basic emotions (in his most recent theoretical account, Ekman (2004) proposes an unspecified "autoappraiser database" for universal or learned events); (b) the absence of specific hypotheses for the expected prototypical patterning of emotion-specific responses (prototypes are often specified inductively from observation); (c) the unclear criteria for defining basic and non-basic emotions; and, most important, (d) the unspecified central mechanisms, or affect programs. In addition, there is little hard empirical evidence for the production of emotion-specific response patterns in facial/vocal expression or physiological reactions (see Griffiths, 1997; Scherer & Ellgring, 2007; Stemmler et al., 2001). These models have mostly focused on the issue of prototypical response patterning (particularly facial expression and somatovisceral symptoms) and have given little attention to consciousness of the underlying processes or of the resulting experience.

(2) Dimensional feeling models are based on Wundt’s (1905) proposal that feelings (which he distinguished from emotions) can be described by the dimensions of pleasantness–unpleasantness, excitement–inhibition, and tension–relaxation, and on Osgood’s work on the dimensions of affective meaning (arousal, valence, and potency; Osgood, May, & Miron, 1975). Most recent models have concentrated on only two dimensions, valence and arousal, with debate on the exact nature of the axes and the existence of a circumplex distribution of central feeling states (Lang, 1984, Russell, 1980, Tellegen et al., 1999). In these models, emotions are generally operationalized as verbal reports of subjective feeling along the positive–negative and active–passive dimensions. This implicit partial overlap between two theoretical concepts that must be clearly distinguished—emotion and feeling—is problematic and results in confusion when scientists study emotional processes and their links with the emergence of a conscious feeling. According to Russell (2003), “core affect,” presumably the primary emotional reaction, consists exclusively of its position in this bidimensional space, which is only later differentiated and enriched by cognitive and linguistic processing. The drawbacks of dimensional feeling theories as a guiding model for emotion research consist of (a) the definition of emotion as subjective feeling, often operationalized exclusively as verbal self-report; (b) the reduction of emotion differentiation to locations in a two-dimensional valence × arousal space (e.g., anger being very close to fear); (c) the lack of a functional perspective in terms of the adaptive functions of emotion; (d) the absence of attempts to theoretically predict the determinants of emotion differences (even in a reduced two-dimensional space); and (e) the lack of an explanatory mechanism allowing prediction of response patterning. Authors in this tradition have rarely raised the problem of consciousness in emotional processing. Russell’s (2003) proposal of core affect assumes that a primitive (preconscious?) representation in valence–arousal space undergoes cognitive postprocessing and elaboration, but the assumed mechanisms are not specified.

Apart from this underspecification of the mechanisms presumed to underlie the elicitation and differentiation of emotion, both of these widely used models neglect three central characteristics of emotion: (a) Emotions are multicomponential phenomena, consisting of highly organized patterns of appraisals of eliciting events and response profiles; (b) emotions unfold over time and may undergo rapid change; and (c) emotional processes may vary across individuals and cultures despite comparable eliciting events. The lack of clear conceptualizations of, and predictions about, the possible mechanisms underlying these interindividual and cultural differences is problematic for both basic emotion and dimensional feeling models.

(3) Componential appraisal models view emotion as a dynamic episode in the life of an organism that involves a process of continuous change in all of its subsystems (e.g., cognition, motivation, physiological reactions, motor expressions, and feeling—the components of emotion) to adapt flexibly to events of high relevance and potentially important consequences (adopting a functional approach in the Darwinian tradition; Ellsworth & Scherer, 2003; Scherer, 1984, 2001). Based on the pioneering work of Arnold (1960) and Lazarus (1966, 1991), the elicitation and differentiation of emotion is seen as mainly determined by appraisal, the continuous, recursive subjective evaluation of events with respect to their pertinence as well as to the individual's coping potential. The outcome of the appraisal on these different criteria is predicted to directly drive the response patterning of physiological reactions, motor expression, and action preparation.

In consequence, componential appraisal theories avoid many of the drawbacks of the other two contenders for a guiding model: (a) Emotions are defined and operationalized as complex, multicomponential, dynamic processes that require sophisticated measurement of changes in the different components; (b) highly specific predictions about the determinants that elicit and differentiate emotions are made; (c) a concrete mechanism underlying emotional response patterning, allowing specific hypotheses, is suggested (predicting appraisal-driven responses from functional considerations; see Scherer, 2001); and (d) therefore, the richness of emotion differentiation is accounted for, especially in humans, allowing researchers to model individual differences and emotional disorders (Scherer, 2004).

In this contribution, we will briefly describe the Component Process Model (CPM; Scherer, 1984, Scherer, 2001), which provides detailed suggestions as to the architecture of emotion and the underlying mechanisms and is thus particularly suitable to discuss the emergence and specific role of consciousness in emotional processing. In addition, we will briefly review the pertinent experimental evidence from psychology and the neurosciences. Specifically, we will propose the hypothesis that conscious emotional experience emerges as a function of multilevel, appraisal-driven response synchronization.

The CPM is graphically represented in Fig. 1 (adapted from Sander, Grandjean, & Scherer, 2005a). The major structural elements of the model are four appraisal objectives:

  1. How relevant is this event for me? Does it directly affect me or my social reference group? (relevance)

  2. What are the implications or consequences of this event, and how do they affect my well-being and my immediate or long-term goals? (implications)

  3. How well can I cope with or adjust to these consequences? (coping potential)

  4. What is the significance of this event for my self-concept and for social norms and values? (normative significance)

As Fig. 1 shows, these four classes of assessments are expected to unfold sequentially over time. Each type of assessment receives input from other cognitive and motivational mechanisms, such as attention, memory, motivation, reasoning, and the self-concept, which provide stored information and evaluation criteria that are essential for the appraisal process (this input is represented by downward-pointing arrows). For each sequential stage of assessment, there are two types of output: (1) a modification of the cognitive and motivational mechanisms that have influenced the appraisal process (represented by upward-pointing dashed arrows) and (2) efferent effects on the periphery, in particular the neuroendocrine system and the autonomic and somatic nervous systems (represented by downward-pointing bold arrows). In this model, emotion differentiation is predicted as the result of the net effect of all subsystem changes brought about by the outcome profile of the appraisal sequence. As shown in Fig. 1, each of the major assessment classes consists of constituent appraisal criteria, or stimulus evaluation checks (SECs). Scherer (1984, 2001) has proposed a componential patterning theory, which predicts specific changes in the peripheral subsystems brought about by concrete patterns of SEC results. The central assumption of the componential patterning theory is that the different organismic subsystems are highly interdependent and that changes in one subsystem will elicit related changes in other subsystems in a recursive fashion. In consequence, the result of each consecutive check will differentially and cumulatively affect the state of all other subsystems (see Scherer, 2001).
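
To make the cumulative patterning assumption concrete, the toy sketch below (our illustration only: the check names follow the CPM, but the component list and the numeric "effects" are invented for the example) applies a sequence of stimulus evaluation checks, each of which adds its effects to the subsystem states left by the previous checks.

```python
# Toy sketch of sequential, cumulative componential patterning (illustrative only;
# the check names follow the CPM, but the numeric effects are invented here).

from dataclasses import dataclass

@dataclass
class ComponentState:
    """Coarse stand-ins for the organismic subsystems affected by appraisal."""
    autonomic_arousal: float = 0.0
    motor_expression: float = 0.0
    action_tendency: float = 0.0

# Each SEC maps its (hypothetical) outcome to incremental changes in the components.
SEC_SEQUENCE = [
    ("novelty",                {"autonomic_arousal": +0.3, "motor_expression": +0.2}),
    ("intrinsic_pleasantness", {"motor_expression": -0.1}),
    ("goal_relevance",         {"autonomic_arousal": +0.4, "action_tendency": +0.2}),
    ("goal_conduciveness",     {"action_tendency": +0.5}),
    ("coping_potential",       {"autonomic_arousal": -0.2, "action_tendency": +0.3}),
    ("norm_compatibility",     {"motor_expression": +0.1}),
]

def run_appraisal_sequence(state: ComponentState) -> ComponentState:
    """Apply each check in order; every result adds to the effects of the previous ones."""
    for check_name, effects in SEC_SEQUENCE:
        for component, delta in effects.items():
            setattr(state, component, getattr(state, component) + delta)
        print(f"after {check_name:22s} -> {state}")
    return state

if __name__ == "__main__":
    run_appraisal_sequence(ComponentState())
```

Running the sketch simply prints the cumulative component state after each check, mirroring the "added value" logic of the sequential interaction described next.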

The predicted patterning of the component states is specific to the unique evaluation history of the stimulus concerned. Each SEC result and the changes produced by it set the scene for the effects of the following SEC in the sense of "added value" in a complex sequential interaction. The concrete predictions and their justification in the framework of a functional approach that views emotion as adaptation are described in Scherer (1987, 2001). A sizeable number of studies report empirical evidence that confirms these predictions to a large extent (Aue et al., 2007; Grandjean & Scherer, in press; Johnstone et al., 2005; Lanctot & Hess, 2007; Pecchinenda & Smith, 1996; van Reekum et al., 2004).

How does this conceptualization facilitate the understanding of emotions as a conscious experience? The first step is to clearly differentiate feeling from emotion and conceptualize the former as a component of emotion, comparable to cognitive appraisal, motivational urges, motor expression, and physiological responding (see Fig. 1). The feeling component has a special status in the emotion process because it integrates and regulates the component processes. Specifically, we suggest that subjective experience serves a monitoring function, integrating all information about the continuous patterns of change in all other components, as well as their coherence, and then building an integrative conscious representation. Thus, feeling is an extraordinarily complex conglomerate of information from different systems, as shown in Fig. 2.
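
One minimal way to picture the monitoring of coherence among components is a sliding-window synchronization index over component time series. The sketch below is our own illustration (not a measure used in the studies cited here), assuming four hypothetical component signals driven by a common source; it takes the mean absolute pairwise correlation within each window as a crude coherence estimate.

```python
# Toy synchronization index over emotion-component time series (illustrative only).
import numpy as np

def sync_index(components: np.ndarray, win: int = 50) -> np.ndarray:
    """components: array of shape (n_components, n_samples).
    For each window end point, return the mean absolute pairwise Pearson
    correlation of the component signals within the preceding window."""
    n_comp, n_samp = components.shape
    out = np.full(n_samp, np.nan)
    for t in range(win, n_samp):
        window = components[:, t - win:t]
        corr = np.corrcoef(window)            # (n_comp, n_comp) correlation matrix
        iu = np.triu_indices(n_comp, k=1)     # upper-triangle pairs only
        out[t] = np.abs(corr[iu]).mean()
    return out

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0, 10, 1000)
    common = np.sin(2 * np.pi * 1.0 * t)      # shared driver standing in for an appraisal outcome
    comps = np.stack([common + 0.5 * rng.standard_normal(t.size) for _ in range(4)])
    idx = sync_index(comps)
    print("median synchronization index:", np.nanmedian(idx))
```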

We can now define the role of consciousness by using a Venn diagram in which a set of overlapping circles represents the different aspects of feeling (see Fig. 3, adapted from Scherer, 2004). The first circle (A) represents the sheer reflection or representation of changes in all synchronized components in a monitoring neuronal network in the central nervous system (CNS). This neuronal network is expected to receive massive projections from both cortical and subcortical CNS structures (including proprioceptive feedback from the periphery). The second circle (B), only partially overlapping the first, represents that part of the integrated central representation that enters awareness and thereby becomes conscious, thus constituting the feeling qualities, the qualia about which philosophers and phenomenologically minded psychologists have been most concerned. Thus, this circle corresponds most directly to what is called “feelings.” The conscious part of the feeling component feeds the process of controlled regulation, much of which is determined by self-representation and socio-normative constraints. One can further assume that the conscious part of feeling goes beyond the overall unconscious reflection. In the sense of active, constructive cognitive processing, it could be partly constructed by schemata, scripts, or social representations that can add meaning in the process of rendering unconscious material conscious. Thus, as shown in Fig. 3, we can depict these parts of the feeling component as two only partially overlapping Venn circles.

What is available to interpersonal sharing of emotion through communication (and to research based on declarative reports) is only the tip of the iceberg—the individual's verbal account of a consciously experienced feeling, represented by the third circle (C) in Fig. 3. Drawing this circle as only partially overlapping the circle representing conscious experience (B) is meant to suggest that the verbal account of feelings captures only part of what is consciously experienced. This selectivity can be due in part to control intentions—the individual may not want to report certain aspects of his or her innermost feelings. It is also due to the limits of language: the use of linguistic labels or expressions to describe the conscious part of feeling will not cover all of what is conscious, partly because appropriate verbal concepts may be lacking. Verbalization will also add surplus meaning, because of the denotational and connotational meaning of the concepts used; these aspects of meaning, which are part of the pragmatics of language, may not always be fully appropriate to the conscious feeling state. Most important, verbal report relies on language and thereby on the emotion semantic categories and other pragmatic devices available to express the qualia that are consciously experienced. Apart from capacity constraints (the stream of consciousness cannot be completely described by a discrete utterance), it seems reasonable to claim that these linguistic devices are incapable of completely capturing the incredibly rich texture of conscious experience. In fact, the categorization implied by verbal labeling may impoverish the experience and mold it into socio-culturally determined schemata.

The claim that much of what is represented in circle A remains unconscious requires the assumption that a large part of the underlying processes, including appraisals, occurs on an implicit, unconscious level of processing. This is indeed what the CPM predicts. Following Leventhal and Scherer (1987), it is assumed that appraisal occurs both simultaneously and sequentially at different levels of processing. At the sensory-motor level, the checking mechanisms are mostly genetically determined and the criteria consist of appropriate templates for pattern matching and similar mechanisms (cf. the notion of "biological preparedness"; Öhman, 1987). At the schematic level, the schemata forming the criteria for the SECs are based on learning processes, particularly social learning, and much of the processing at this level occurs in an automatic fashion, outside of consciousness. At the conceptual level, the SECs are processed primarily via cortical association areas, require consciousness, and involve cultural meaning systems. The different levels are expected to interact continuously, producing top-down and bottom-up effects.

Although the notion of multilevel processing has been adopted in many areas of research on cognitive and affective functioning (see also Craik, 2002; Lockhart & Craik, 1990; Johnson et al., 1993; Lockhart, 2002; Power & Dalgleish, 1997; van Reekum & Scherer, 1997), little progress has been made in identifying the exact nature of these levels and their underlying neural structures. As mentioned by Robinson (in press), it is important to focus on levels of processing that can be operationalized, such as the distinctions between unconscious and conscious processes and between preattentive and postattentive processes. Leventhal's (1984) adoption of a Piagetian categorization of cognitive processes was conjectural, as have been most other attempts to define levels of explicitness, effort, or consciousness. The latter criterion has become particularly popular in so-called dual-systems approaches (Bechara et al., 1995; Loewenstein & Lerner, 2003; Zajonc, 1980), which propose that events are appraised and decisions made either in an unconscious affective or impulsive system or in a conscious reflective or deliberative system. However, the problems of overgeneralizing the links among associative, automatic, and unconscious processes on the one hand and among non-automatic, rule-based, and deliberative processes on the other have been aptly outlined by Moors and De Houwer (2005). We believe that there is an even more serious problem in assuming, sometimes explicitly (Zajonc, 1980) but often implicitly (Bechara & Van Der Linden, 2005), that appraisal or decision making occurs in an either–or fashion in one of these two super-modes, unwittingly suggesting that they are closed, self-contained systems with clearly separable neural substrata. Thus, Bechara (2005) assigns the impulsive system to the amygdala and the reflective system to the ventromedial prefrontal cortex (VMPFC).

The latter assumption, in contrast with many of the conceptual distinctions proposed, can be empirically examined. Several lines of research in brain-damaged patients and normal participants clearly indicate that both the prefrontal cortex (PFC)—in particular the orbitofrontal cortex (OFC) and the VMPFC—and the amygdala are critically involved in emotional processing, but the specific levels of processing and types of computations performed in these regions are not clearly understood (Davidson & Irwin, 1999; Koenigs et al., 2007; Rolls, 1999; Sander et al., 2003). In particular, the dense anatomical connectivity between these two regions of the PFC and several nuclei of the amygdala suggests a functional interaction between representations computed in the OFC, VMPFC, and amygdala rather than a functional dissociation between regions of the PFC and the amygdala. For example, Davidson, Putnam, and Larson (2000) proposed that the suppression of negative emotion operates via an inhibitory connection from regions of the PFC, probably the OFC, to the amygdala. Bilateral anatomical connections between the medial part of the orbitofrontal regions and the amygdala have recently been confirmed in monkey brains (Ghashghaei, Hilgetag, & Barbas, 2007). The functional meaning of this distributed neuronal network linking the amygdala and the medial OFC is still not clear. Based on these recent findings of strong anatomical connections between the two regions, we suggest that they might form a functional unit involved in the computation of behavioral strategies or action tendencies elicited by emotional perception. Several pieces of evidence indicate that the synchronization of electrical activity between two or more neuronal assemblies is necessary to allow communication between distant or local neural networks. In particular, Fries (2005) has proposed the Communication Through Coherence (CTC) model, which holds that phase coherence underlies neuronal communication: neuronal assemblies have to be synchronized to exchange information. Based on this model, we predict that the amygdala and the OFC have to be synchronized to be able to exchange information. Previous findings suggest that neuronal synchronization may indeed be necessary to process emotional information. For example, Luo, Holroyd, Jones, Hendler, and Blair (2007) report synchronization between the thalamus/hypothalamus and the amygdala in response to facial threat. Distant synchronization between the hippocampus and the amygdala has also been shown during various stages of fear memory (Narayanan, Seidenbecher, Sangha, Stork, & Pape, 2007). These empirical findings of neuronal synchronization in the human brain in response to emotional stimuli highlight the importance of functional coupling between different distant and local neuronal assemblies and suggest continuous cross-talk between different brain regions during the processing of emotional stimuli.
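
Phase synchronization of the kind the CTC model appeals to is commonly quantified with a phase-locking value (PLV). The sketch below is a generic illustration on simulated data, not an analysis from the studies cited above; the 40 Hz "regions" and noise levels are arbitrary assumptions.

```python
# Minimal phase-locking value (PLV) between two band-limited signals (illustrative).
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

def plv(x, y, lo, hi, fs):
    """PLV in the [lo, hi] Hz band: 1 = perfect phase coherence, ~0 = none."""
    phx = np.angle(hilbert(bandpass(x, lo, hi, fs)))
    phy = np.angle(hilbert(bandpass(y, lo, hi, fs)))
    return np.abs(np.mean(np.exp(1j * (phx - phy))))

if __name__ == "__main__":
    fs, dur = 500, 4.0
    t = np.arange(0, dur, 1 / fs)
    rng = np.random.default_rng(1)
    # Two hypothetical "regions" sharing a 40 Hz (gamma) component with a fixed phase lag.
    region_a = np.sin(2 * np.pi * 40 * t) + 0.5 * rng.standard_normal(t.size)
    region_b = np.sin(2 * np.pi * 40 * t - 0.6) + 0.5 * rng.standard_normal(t.size)
    print("gamma-band PLV:", round(plv(region_a, region_b, 35, 45, fs), 2))
```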

The results of these studies can be interpreted as evidence that different neuronal assemblies, representing different levels of processing in the brain, work in conjunction to assess input of high significance for an individual. This suggestion is reminiscent of the assumption of massive parallel processing in neural network models and is consistent with a recent proposal of a neural network model of emotional consciousness in which emotional coherence, achieved through interactions among multiple brain areas, is required for emotional consciousness to emerge (Thagard & Aubie, in press). These synchronizations could occur at different scales, including local and distant neuronal synchronies (see Fig. 6). We suggest that local synchronies, within a specific neuronal network, are necessary to achieve preliminary closure and send information to another neuronal network. For example, to process information about the state of the body during an emotional episode, the synchronization of neuronal assemblies within the insula would be necessary. When a stable representation emerges from this neuronal network, the information might be sent to another part of the brain, for example to prefrontal areas (inducing a specific body representation in working memory). In this example, local synchronization would be necessary to build a stable representation, and distant neuronal synchronization would be necessary to exchange this information with another functional unit, in this case the representation of the body state in working memory. The local synchronies should occur at high frequencies, whereas the distant synchronies would be expected at lower frequencies (Fries, 2005).

Dan Glauser and Scherer (2008) reported a first investigation into the processes involved in the emergence of a subjective feeling. The hypothesis was that the oscillatory brain activity presumed to underlie the emergence of a subjective feeling can be measured by electroencephalographic (EEG) frequency band activity, similar to what has been shown in the literature for the conscious representation of objects. Emotional reactions were induced in participants using appropriate visual stimuli. Episodes for which participants reported a subjective feeling were compared with those that did not lead to a conscious emotional experience. Discrete wavelet transforms of the EEG signal in the gamma and beta bands showed significant differences between these two types of reactions. In addition, whereas beta band effects were widely distributed, differences in gamma band activity were predominantly observed over frontal and prefrontal regions. These results are interpreted in terms of the complexity of the processes required to perform the affective monitoring task. It is suggested that future work on the coherent mental representation of multimodal reaction patterns leading to the emergence of conscious emotional experience should include modifications of the time window examined and an extension of the frequency range considered.
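
As a rough illustration of this kind of band-limited contrast (not the authors' actual wavelet pipeline, and using Welch spectra rather than discrete wavelets), the sketch below estimates beta- and gamma-band power per trial and compares simulated "feeling reported" trials with "no feeling reported" trials.

```python
# Rough sketch: compare beta/gamma band power between two sets of simulated EEG trials.
import numpy as np
from scipy.signal import welch

BANDS = {"beta": (13, 30), "gamma": (30, 80)}

def band_power(trial, fs, lo, hi):
    f, pxx = welch(trial, fs=fs, nperseg=fs)          # 1-s analysis windows
    return pxx[(f >= lo) & (f <= hi)].mean()

def compare(trials_felt, trials_not_felt, fs):
    for name, (lo, hi) in BANDS.items():
        felt = np.mean([band_power(tr, fs, lo, hi) for tr in trials_felt])
        not_felt = np.mean([band_power(tr, fs, lo, hi) for tr in trials_not_felt])
        print(f"{name:5s}: felt={felt:.3g}  not felt={not_felt:.3g}")

if __name__ == "__main__":
    fs, n = 256, 2 * 256
    rng = np.random.default_rng(2)
    t = np.arange(n) / fs
    # Hypothetical data: "felt" trials carry extra 40 Hz (gamma) activity.
    felt = [np.sin(2 * np.pi * 40 * t) + rng.standard_normal(n) for _ in range(20)]
    not_felt = [rng.standard_normal(n) for _ in range(20)]
    compare(felt, not_felt, fs)
```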

The CPM also assumes that the processing of different appraisal criteria occurs in a simultaneous, parallel fashion on several levels of automaticity, representation, schematicity, rule application—and consciousness (see the useful categorization of multimode distinctions proposed by Moors & De Houwer, 2005). This arrangement does not mean that all events are processed in parallel on all levels of processing. Rather, the assumption is that the recursive loop of processing stops when a conclusive result (establishing the behavioral meaning of the consequences with a sufficient degree of certainty and allowing adaptive decisions or actions) has been achieved. As Fig. 4 shows, lower levels are assumed to provide leaner processing and to be much faster. Thus, once a conclusive result is achieved, processing may stop without producing any results on higher levels. In other words, higher, more costly—and more conscious—levels are only recruited when lower-level processing does not achieve conclusive results.
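
The "stop as soon as some level yields a conclusive result" logic can be sketched as follows. This is a deliberate simplification of the CPM's parallel account: the three levels are simply polled in order of their assumed latency, and the latencies, costs, and toy evaluation rules are our own assumptions.

```python
# Sketch of multilevel appraisal with early stopping (simplified; latencies and rules invented).
from typing import Callable, NamedTuple, Optional

class LevelResult(NamedTuple):
    conclusive: bool
    appraisal: Optional[str]      # e.g., "threat"; None if inconclusive

class Level(NamedTuple):
    name: str
    latency_ms: int               # lower levels are assumed to be faster and cheaper
    evaluate: Callable[[dict], LevelResult]

def sensorimotor(event):          # template matching, e.g., a looming object
    return LevelResult(bool(event.get("looming")), "threat" if event.get("looming") else None)

def schematic(event):             # learned schemata, still largely automatic
    return LevelResult("schema" in event, event.get("schema"))

def conceptual(event):            # costly propositional reasoning; assumed always conclusive here
    return LevelResult(True, "deliberate appraisal of " + str(event))

LEVELS = [Level("sensorimotor", 50, sensorimotor),
          Level("schematic", 150, schematic),
          Level("conceptual", 600, conceptual)]

def appraise(event: dict) -> str:
    """Poll levels in order of latency; stop at the first conclusive result."""
    for level in LEVELS:
        result = level.evaluate(event)
        if result.conclusive:
            return f"{level.name} level (~{level.latency_ms} ms): {result.appraisal}"
    return "no conclusive result"

if __name__ == "__main__":
    print(appraise({"looming": True}))              # handled at the lowest level
    print(appraise({"schema": "exam failure"}))     # handled at the schematic level
    print(appraise({"novel": "ambiguous remark"}))  # escalates to the conceptual level
```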

This type of parallel processing does not contradict the assumption of sequential unfolding, as shown in Fig. 1: Novelty > Intrinsic pleasantness > Goal/Task relevance > Goal conduciveness > Coping potential > Compatibility with internal and external norms or standards. The reason is that the essential criterion for the sequence is the point at which a particular check achieves preliminary closure, that is, yields a reasonably definitive result, one that warrants efferent commands to response modalities. The general assumption is that—for logical and economical reasons—the results of the earlier SECs need to be processed before later SECs can operate successfully, that is, yield a conclusive result. It can also be argued that the microgenetic unfolding of the emotion-antecedent appraisal processes parallels both phylogenetic and ontogenetic development in the differentiation of emotional states (see Scherer, 1999, 2001, for further details on the sequence prediction). The implication of this assumption is that earlier appraisals are more "primitive" in level of processing, as well as more unconscious, even if it is possible that some processes reach the level of consciousness and then become accessible for an explicit representation and a possible verbal report. For example, a novel object can be processed at an unconscious level before the organism is able to form an explicit representation of it. Evidence for these hypotheses, and a first indication of the mental chronometry involved, comes from a study by Grandjean and Scherer (in press). In this series of empirical studies, the Novelty, Goal relevance, Intrinsic pleasantness, and Goal conduciveness SECs were systematically manipulated to test the sequence hypothesis. In two visual experiments (using International Affective Picture System [IAPS] pictures; Lang, Bradley, & Cuthbert, 1999) with electroencephalographic recordings, which provide high temporal resolution, we tested the sequence hypothesis through several types of signal analysis. Topographical analyses of the event-related potentials (ERPs) revealed a specific electrical map related to Novelty (∼90 ms after stimulus onset) preceding another topographical map related to Task/goal relevance (Fig. 5), the Novelty map preceding the Task/goal relevance map by about 50 ms.

To investigate effects of the manipulated appraisals not revealed by the topographical analyses, further analyses were performed on the global field power (GFP) of the ERPs, which mainly reflects the intensity of signal changes without corresponding modifications of the topography of the electrical fields. These GFP analyses revealed early effects related to Novelty and later effects related to the Intrinsic pleasantness factor. In the second experiment, in which Intrinsic pleasantness and Goal conduciveness were manipulated, the results confirmed that the neuronal processing of Intrinsic pleasantness precedes the effects related to Goal conduciveness checks (see Fig. 5; Grandjean & Scherer, in press). The frequency analyses revealed late effects in the gamma band (not present in the analyses of unfiltered ERPs), indicating an effect of Goal conduciveness on so-called induced gamma (Tallon-Baudry, Bertrand, Delpuech, & Pernier, 1996) at about 600 ms after stimulus onset, suggesting that a high level of cognitive processing is involved in this type of appraisal. The results of these two experiments (Grandjean & Scherer, in press) support the CPM predictions and suggest that Novelty and Intrinsic pleasantness may be appraised early, on an unconscious, automatic, and possibly schematic level, whereas Goal conduciveness tends to be evaluated later in the sequence, on a conscious, controlled, and possibly propositional level.
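
GFP itself has a simple operational definition: at each time point it is the standard deviation of the potential across electrodes, indexing the overall strength of the electric field independently of its topography. A minimal sketch, assuming an average-referenced ERP array of shape (electrodes × time points):

```python
# Global field power (GFP): spatial standard deviation across electrodes at each time point.
import numpy as np

def global_field_power(erp: np.ndarray) -> np.ndarray:
    """erp: array of shape (n_electrodes, n_timepoints).
    Returns a 1-D array of GFP values, one per time point."""
    erp = erp - erp.mean(axis=0, keepdims=True)   # re-reference to the common average
    return erp.std(axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    erp = rng.standard_normal((64, 500))          # hypothetical 64-channel, 500-sample ERP
    gfp = global_field_power(erp)
    print("peak GFP at sample", int(gfp.argmax()))
```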

Although we illustratively use the three-level approach originally proposed by Leventhal (1984), the nature of the different levels has not been specified in much detail, and it may be necessary to go beyond this simple classification and introduce finer differentiations. Thus, as suggested in Fig. 4, one might want to distinguish between strongly established prepotent schemata, which may provide immediate meaning analyses of specific core relational themes on the basis of strong earlier experiences (Smith & Lazarus, 1990), and more open associative processes as suggested by neoassociationists (Berkowitz, 1990). There may be no identifiable levels with fixed characteristics but rather a large set of processing-characteristic configurations that are flexibly formed to adapt to specific processing requirements. This area will require much more attention, and it seems desirable to abandon simple dual-system models in favor of more complex, neuroscientifically validated conceptualizations of processing characteristics and their relation to consciousness.

Perhaps the best example, given its explanatory power for most of the conceptual distinctions between unconscious and conscious emotional processing, is the dual-pathway proposal for emotional processing by the amygdala. On the basis of pioneering animal experiments, it is classically argued that there is a dual-route architecture to the amygdala (see LeDoux, 1996), consisting of a direct subcortical pathway and an indirect cortical pathway.

Indeed, fear conditioning experiments in rats have shown the existence of a fast, direct subcortical pathway from the auditory thalamus to the amygdala (see LeDoux, 1996). Several results obtained in humans also suggest that the emotional value of visual stimuli can be detected by a colliculo-pulvinar–amygdala pathway (e.g., de Gelder et al., 1999; Morris et al., 1998, 1999; Pegna et al., 2005; Vuilleumier et al., 2003).

On the other hand, the slower indirect cortical pathway in humans is not controversial and consists of massive connections from several levels of sensory cortical processing to the amygdala (e.g., Amaral, Behniea, & Kelly, 2003). It is important to note that these routes are not mutually exclusive and that the amygdala receives inputs from the thalamus as well as from several sites of all sensory cortices, suggesting that both uni- and multimodal sensory information can reach the amygdala at different levels of integration. The possibility of levels of integration through interactions between visual cortical and amygdala representations is consistent with the recent suggestion of a two-stage model (Vuilleumier, 2005), proposed in addition, or as an alternative, to the two-pathway model. According to this hypothesis, the magnocellular (M) pathway provides the amygdala with coarse inputs through an initial feedforward sweep. Given the rapid activation of the M pathway, such activation of the amygdala would be much faster than what is usually meant by the "indirect cortical pathway" (i.e., less than 130 ms) and would already allow for some levels of appraisal processing. Only after such a process does a more elaborate and prolonged cortical processing associated with conscious awareness take place (Vuilleumier, 2005). Such a functional architecture might allow several levels of integration in the amygdala, as the relevance of the stimulus for the organism is dynamically computed in the multilevel appraisal processes.

We can consider the visual system an analogy for our theoretical proposal of multiple levels of processing and synchronization of component activities in the emotional system. The visual system was classically thought to be fully hierarchical but is increasingly conceptualized as performing sequential and simultaneous parallel processing in three visual channels, with feedback connections and synchronization between neuronal assemblies playing a critical role. Indeed, the visual brain was classically described, mainly on anatomical grounds, as consisting of many parallel specialized subsystems (see Felleman & Van Essen, 1991) activated serially by feedforward connections. Because each visual area shows more or less specific receptive field tuning properties, it has been claimed that each subsystem mainly processes a specific feature (e.g., color, motion, texture, or location) of the scene presented in the visual field (see, e.g., Palmer, 1999). With the development of physiological recording methods, the differences in the timing of activation between lower order areas (e.g., V1) and higher order areas (e.g., V5) have been a topic of great interest in the neuroscience of vision for more than 10 years (see Bullier & Nowak, 1995). The notion that multiple levels of visual processing of the same stimulus event can take place in parallel emerged from detailed consideration of both the timing of visual processing and the functional differentiation of the M, parvocellular (P), and koniocellular (K) pathways in vision (see Cheng et al., 2004; Patel & Sathian, 2000). In addition to the functional specificities of these visual pathways, a major difference between the P and M pathways is their latency. For example, as discussed by Bullier and Nowak (1995), latencies in V1 and V2 overlap extensively, and many neurons in V2 can even be active earlier than some V1 neurons. This result is crucial because it implies that a given M neuron in V2 can drive many P neurons in V1, thereby deeply questioning the hierarchical model, since activity in areas at different "levels" can be simultaneous rather than strictly ordered. In this context, an alternative to the classical view is to consider that the activation of a conscious visual representation when one is identifying an attended object results from the interaction between feedback connections from M neurons and feedforward connections from P neurons that fire later within the ventral stream. If enough attention is allocated to the object on the retina, the result of preprocessing in the M pathway is sent via feedback connections to areas V4, V2, and V1 and, because of the asynchrony of the three pathways, meets the slower feedforward information provided by the P and K pathways (see Bullier, 2004; Ullman, 1995). Without pushing the analogy between the visual and emotional systems too far, one notices striking similarities between this current perspective on the visual brain and our proposals concerning multilevel parallel and sequential processing in the emotional brain: stimuli are processed at the three levels in parallel, each sequentially, with some levels being faster than others, thus preactivating representations for further processing by the complementary remaining levels.

Also of interest in the analogy between the visual system and our theory of the emotional system is that the classical model in vision has difficulty explaining the so-called binding problem. Indeed, to obtain the unified conscious experience that is characteristic of scene perception and to identify objects presented at the same time, it is necessary to bind together all the processed features that belong to the same object. The hypothesis of a single convergence center that would allow this binding has failed (see Singer, 1999). Results from intracortical recordings in animals and EEG in humans (see Tallon-Baudry & Bertrand, 1999) indicate that the synchronization of oscillatory responses, rather than convergence in a single area, could constitute the binding mechanism. It is important to highlight this point because the search for a single integration structure (like Descartes's pineal gland) for the conscious perception of emotion might fail, as it did for vision; this is consistent with the notion of component synchronization as an underlying mechanism for conscious feelings.

The CPM assumes that events are simultaneously appraised in parallel on several levels. Processing stops when a conclusive result is achieved on one of the levels. In consequence, when lower levels successfully deal with potential problems, higher levels will often not contribute to the feeling-generating integration process described earlier, and the process may remain unconscious. We therefore need to address the question of when the appraisal process must rely on higher levels and thus the question of the emergence of consciousness. We believe that this issue is intricately linked with the process of component synchronization, which is the major element of the emotion definition provided by the CPM. In brief, our hypothesis is that it is the degree of synchronization of the components (which might in turn be determined by the pertinence of the event as appraised by the organism) that generates conscious experience. The reason is that a high level of synchronization that lasts more than a very brief period is costly in terms of the organism's resources and can probably not be regulated automatically. In consequence, to return to equilibrium, higher-level regulation strategies, such as reappraisal, must kick in. These strategies require high-level representational operations and are therefore usually conscious. Such explicit representations can, for example, be maintained in working memory, and new computations involving combinations with other explicit representations can take place thanks to mechanisms that depend mainly on prefrontal systems, which preferentially allocate cognitive resources and subserve executive functions.
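
Stated operationally, the hypothesis is that a conscious feeling emerges when component synchronization exceeds a critical level for a critical duration. The sketch below makes that decision rule explicit; the threshold, the duration, and the input index (for instance, one like the toy index sketched earlier) are illustrative assumptions rather than quantities specified by the CPM.

```python
# Decision-rule sketch: a conscious feeling is flagged when the synchronization index
# exceeds a critical level for a critical duration (thresholds are illustrative).
from typing import Optional
import numpy as np

def feeling_emerges(sync_index: np.ndarray, fs: float,
                    critical_level: float = 0.7,
                    critical_duration_s: float = 0.5) -> Optional[int]:
    """Return the sample index at which conscious feeling is predicted to emerge,
    or None if synchronization never stays high for long enough."""
    needed = int(critical_duration_s * fs)
    above = sync_index > critical_level
    run = 0
    for i, flag in enumerate(above):
        run = run + 1 if flag else 0
        if run >= needed:
            return i - needed + 1    # onset of the sustained high-synchronization period
    return None

if __name__ == "__main__":
    fs = 100.0
    idx = np.concatenate([np.full(100, 0.4),    # low synchronization: handled automatically
                          np.full(120, 0.85),   # sustained high synchronization
                          np.full(100, 0.5)])
    onset = feeling_emerges(idx, fs)
    if onset is not None:
        print(f"conscious feeling predicted from t = {onset / fs:.2f} s; "
              "higher-level regulation (e.g., reappraisal) would be recruited")
```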

Several lines of research in affective neuroscience clearly indicate that a given stimulus input can be processed at several levels in the human brain before emotional responses are produced. In particular, evidence for the existence of unconscious levels of appraisal processing is provided by several approaches, including the behavioral testing and brain imaging investigation of normal and brain-damaged individuals.

In particular, the question of the involvement of conscious perception of the eliciting event has been extensively investigated in affective neuroscience research. The evaluation of non-consciously perceived emotional stimuli is suggested by studies of blindsight patients who were able to perform some emotional categorization without visual consciousness of, or feelings elicited by, emotional stimuli such as facial expressions (e.g., de Gelder et al., 1999; Pegna et al., 2005). In addition, neglect patients, who display a deficit in awareness of stimuli presented contralaterally to their lesion (typically in the left hemispace), show an advantage for the processing of visual (Vuilleumier et al., 2002; Vuilleumier & Schwartz, 2001a) and vocal (Grandjean, Sander, Lucas, Scherer, & Vuilleumier, 2008) emotional stimuli as compared with neutral ones. Consistent with this, a number of brain imaging studies in healthy subjects have revealed that some structures of the emotional brain, in particular the amygdala, are sensitive to non-recognized fearful faces presented outside the focus of consciousness (Morris et al., 1999; Whalen, 1998; Whalen et al., 2004; Wiens, 2006) and outside the focus of spatial attention (Vuilleumier & Schwartz, 2001b). More generally, several results indicate that implicit emotional processing can take place without explicit processes being involved, as is the case, for example, for the processing of emotional words (Isenberg et al., 1999) or faces (e.g., Gorno-Tempini et al., 2001) that are in the focus of attention but only incidentally processed.

In their role of prioritizing the processing of pertinent events, attentional processes interact with emotional processes, and the existence of emotional effects of a given stimulus in the absence of voluntary attention oriented towards this event has been established using a wide variety of paradigms (see Compton, 2003, and Vuilleumier, 2005, for reviews). Such effects of emotional attention, as opposed to voluntary attention, have been revealed in both the visual (e.g., Mack & Rock, 1998; Pourtois et al., 2004; Vuilleumier et al., 2001) and the auditory (e.g., Grandjean et al., 2005; Sander et al., 2005b) domains, as well as in a cross-modal paradigm (Brosch, Grandjean, Sander, & Scherer, 2008). It is interesting to note that the rapid emotional effects taking place outside the focus of voluntary attention are not restricted to the processing of fear-related or negative stimuli but also include the processing of positive stimuli such as baby faces (Brosch, Sander, & Scherer, 2007). Indeed, complex appraisal processes are also possible at lower levels, outside the attentional focus and outside a conscious representation, challenging the simple view of a dual system composed of an independent "impulsive system" (as in the case of fear, for example) and a "reflective system". Taken together, these results are consistent with our proposal that multiple levels of relevance detection, rather than a specific automatic fear module (Öhman, 1987) as part of an impulsive system, can implicitly drive voluntary attention towards relevant events for further processing, both in the following appraisal checks and at other appraisal levels that are more dependent upon voluntary attention and might involve consciousness.


Conclusion

The ability of organisms to build up a conscious representation of an emotion—what we call a feeling—is a complex dynamic phenomenon involving neuronal synchronizations at different levels (see Fig. 6). We suggest conceptualizing the process of an emergent conscious feeling as the result of synchronizations of different subsystems at different levels. We propose two main mechanisms that are necessary and sufficient for the emergence of a conscious feeling. (1) In particular, Scherer (2004) has

References (99)

  • D. Sander et al. (2005). A systems approach to appraisal mechanisms in emotion. Neural Networks.
  • D. Sander et al. (2005). Emotion and attention interactions in social cognition: Brain regions involved in processing anger prosody. Neuroimage.
  • C. Tallon-Baudry et al. (1999). Oscillatory gamma activity in humans and its role in object representation. Trends in Cognitive Sciences.
  • C.M. van Reekum et al. Levels of processing for emotion-antecedent appraisal.
  • P. Vuilleumier (2005). How brains beware: Neural mechanisms of emotional attention. Trends in Cognitive Sciences.
  • P. Vuilleumier et al. (2002). Neural response to emotional faces with and without awareness: Event-related fMRI in a parietal patient with visual extinction and spatial neglect. Neuropsychologia.
  • P. Vuilleumier et al. (2001). Effects of attention and emotion on face processing in the human brain: An event-related fMRI study. Neuron.
  • S. Wiens (2006). Subliminal emotion perception in brain imaging: Findings, issues, and recommendations. Progress in Brain Research.
  • M.B. Arnold (1960).
  • A. Bechara (2005). Decision making, impulse control and loss of willpower to resist drugs: A neurocognitive perspective. Nature Neuroscience.
  • A. Bechara et al. (1995). Double dissociation of conditioning and declarative knowledge relative to the amygdala and hippocampus in humans. Science.
  • A. Bechara et al. (2005). Decision-making and impulse control after frontal lobe injuries. Current Opinion in Neurology.
  • L. Berkowitz (1990). On the formation and regulation of anger and aggression: A cognitive-neoassociationistic analysis. American Psychologist.
  • T. Brosch et al. (2007). That baby caught my eye… attention capture by infant faces. Emotion.
  • J. Bullier (2004). Integrated model of visual processing. Brain Research: Brain Research Reviews.
  • A. Cheng et al. (2004). The role of the magnocellular pathway in serial deployment of visual attention. European Journal of Neuroscience.
  • R.J. Compton (2003). The interface between emotion and attention: A review of evidence from psychology and neuroscience. Behavioral and Cognitive Neuroscience Reviews.
  • F.I. Craik (2002). Levels of processing: Past, present, and future? Memory.
  • R.S. Lockhart et al. (1990). Levels of processing: A retrospective analysis of a framework for memory research. Canadian Journal of Psychology.
  • E.S. Dan Glauser et al. (2008). Neuronal processes involved in subjective feeling emergence: Oscillatory activity during an emotional monitoring task. Brain Topography.
  • Darwin, C. (1872/1998). The expression of the emotions in man and animals (3rd ed.). New York: Oxford University...
  • R.J. Davidson et al. (2000). Dysfunction in the neural circuitry of emotion regulation—a possible prelude to violence. Science.
  • B. de Gelder et al. (1999). Nonconscious recognition of affect in the absence of the striate cortex. Neuroreport.
  • P. Ekman (1992). An argument for basic emotions. Cognition and Emotion.
  • P. Ekman. Basic emotions.
  • P. Ekman. What we become emotional about.
  • P.C. Ellsworth et al. Appraisal processes in emotion.
  • D.J. Felleman et al. (1991). Distributed hierarchical processing in the primate cerebral cortex. Cerebral Cortex.
  • D. Grandjean et al. (2005). The voices of wrath: Brain responses to angry prosody in meaningless speech. Nature Neuroscience.
  • Grandjean, D., & Scherer, K. R. (in press). Unpacking the cognitive architecture of emotion processes....
  • P.E. Griffiths (1997). What emotions really are: The problem of psychological categories.
  • S. Guderian et al. (2005). Induced theta oscillations mediate large-scale synchrony with mediotemporal areas during recollection in humans. Hippocampus.
  • N. Isenberg et al. (1999). Linguistic threat activates the human amygdala. Proceedings of the National Academy of Sciences.
  • C.E. Izard (1977). Human emotions.
  • C.E. Izard (1993). Four systems of emotion activation: Cognitive and noncognitive processes. Psychological Review.
  • M.K. Johnson et al. (1993). Source monitoring. Psychological Bulletin.
  • T. Johnstone et al. (2005). Affective speech elicited with a computer game. Emotion.
  • M. Koenigs et al. (2007). Damage to the prefrontal cortex increases utilitarian moral judgements. Nature.
  • N. Lanctot et al. (2007). The timing of appraisals. Emotion.