While there is an extensive literature on the tendency to mimic emotional expressions in adults, it is unclear how this skill emerges and develops over time. Specifically, it is unclear whether infants mimic discrete emotion-related facial actions, whether their facial displays are moderated by contextual cues and whether infants’ emotional mimicry is constrained by developmental changes in the ability to discriminate emotions. We therefore investigate these questions using Baby-FACS to code infants’ facial displays and eye-movement tracking to examine infants’ looking times at facial expressions. Three-, 7-, and 12-month-old participants were exposed to dynamic facial expressions of a virtual model which either looked at the infant or had an averted gaze. Infants did not match emotion-specific facial actions shown by the model, but they produced valence-congruent facial responses to the distinct expressions. Furthermore, only the 7- and 12-month-olds displayed negative responses to the model’s negative expressions and they looked more at areas of the face recruiting facial actions involved in specific expressions. Our results suggest that valence-congruent expressions emerge in infancy during a period where the decoding of facial expressions becomes increasingly sensitive to the social signal value of emotions.
In multicellular organisms, cells are frequently programmed to die. This makes good sense: cells that fail to play important roles, or no longer do so, are eliminated. From the cell’s perspective, this also makes sense, since somatic cells in multicellular organisms require the cooperation of clonal relatives. In unicellular organisms, however, programmed cell death (PCD) poses a difficult and unresolved evolutionary problem. The empirical evidence for PCD in diverse microbial taxa has spurred debates about what precisely PCD means in the case of unicellular organisms. In this article, we survey the concepts of PCD in the literature and the selective pressures associated with its evolution. We show that definitions of PCD have been almost entirely mechanistic and fail to separate questions concerning what PCD fundamentally is from questions about the kinds of mechanisms that realize PCD. We conclude that an evolutionary definition is best able to distinguish PCD from closely related phenomena. Specifically, we define “true” PCD as an adaptation for death triggered by abiotic or biotic environmental stresses. True PCD is thus not only an evolutionary product but must also have been a target of selection. Apparent PCD resulting from pleiotropy, genetic drift, or trade-offs is not true PCD; we call this “ersatz PCD”.
The experiments reported herein probe the visual cortical mechanisms that control near–far percepts in response to two-dimensional stimuli. Figural contrast is found to be a principal factor for the emergence of percepts of near versus far in pictorial stimuli, especially when stimulus duration is brief. Pictorial factors such as interposition (Experiment 1) and partial occlusion (Experiments 2 and 3) may cooperate, as generally predicted by cue combination models, or compete with contrast factors in the manner predicted by the FACADE model. In particular, if the geometrical configuration of an image favors activation of cortical bipole grouping cells, as at the top of a T-junction, then this advantage can cooperate with the contrast of the configuration to facilitate a near–far percept at a lower contrast than at an X-junction. Varying the exposure duration of the stimuli shows that the more balanced bipole competition in the X-junction case takes longer exposure times to resolve than the bipole competition in the T-junction case (Experiment 3).
The devotion of a full article in the Paris Agreement to loss and damage was a major breakthrough for the world’s most vulnerable nations seeking to gain support for climate impacts beyond what can be adapted to. But how will loss and damage be paid for, and who will pay for it? Will ethics be part of this decision? Here we ask what the possible means are of raising predictable and adequate levels of funding to address loss and damage. Utilizing a framework developed by Marco Grasso, we argue that making the ethical connections between addressing climate impacts and finance mechanisms could significantly enhance their likelihood of being adopted. We briefly review insurance mechanisms and catastrophe bonds, and then move on to six “innovative finance” approaches to funding loss and damage. We assess them against six criteria: adequacy, predictability, technical feasibility, fairness, indirect effects, and whether each has a clear link to loss and damage. Several mechanisms for gathering funds emerged as most promising. Three of the six financial mechanisms we reviewed to raise funding involved airline transport: clearly, there is a huge opportunity to tax this sector in one form or another, in recognition of airline emissions’ role in creating losses and damages in vulnerable nations from sea level rise, droughts, floods or hurricanes. Funding loss and damage response is a contentious issue that will only become more unwieldy if Parties’ conceptions of loss and damage are at odds: a common definition of loss and damage needs to be agreed upon under the UNFCCC. Most immediately, to meet any equity criteria, wealthy countries should do more to support the premiums of those who cannot afford insurance.
In their book Spare Parts, published in 1992, Fox and Swazey criticized various aspects of organ transplantation, including the routinization of the procedure, ignorance regarding its inherent uncertainties, and the ethos of transplant professionals. Using this work as a frame of reference, we analyzed articles on organ transplantation published in internal medicine and transplantation journals between 1995 and 2008 to see whether Fox and Swazey’s critiques of organ transplantation were still relevant.
In 1952, Heinrich Scholz published a question in The Journal of Symbolic Logic asking for a characterization of spectra, i.e., sets of natural numbers that are the cardinalities of finite models of first-order sentences. Günter Asser in turn asked whether the complement of a spectrum is always a spectrum. These innocent questions turned out to be seminal for the development of finite model theory and descriptive complexity. In this paper we survey developments over the last 50-odd years pertaining to the spectrum problem. Our presentation follows conceptual developments rather than the chronological order. Originally a number-theoretic problem, it has been approached by means of recursion theory, resource-bounded complexity theory, classification by complexity of the defining sentences, and finally by means of structural graph theory. Although Scholz’s question was answered in various ways, Asser’s question remains open.
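As a concrete illustration of the definition of a spectrum (our example, not one taken from the survey): the first-order sentence asserting that a unary function f is a fixed-point-free involution, ∀x (f(x) ≠ x ∧ f(f(x)) = x), has as its spectrum exactly the set of even positive integers. A minimal brute-force sketch in Python, with hypothetical helper names:

```python
from itertools import product

def is_fpf_involution(f, n):
    """Check that the tuple f encodes a fixed-point-free involution on
    {0,...,n-1}, i.e. a model of the sentence
    forall x (f(x) != x and f(f(x)) = x)."""
    return all(f[x] != x and f[f[x]] == x for x in range(n))

def in_spectrum(n):
    """n belongs to the spectrum iff some structure of size n satisfies
    the sentence, i.e. some function table f: {0..n-1} -> {0..n-1}
    is a fixed-point-free involution."""
    return any(is_fpf_involution(f, n) for f in product(range(n), repeat=n))

# The spectrum of this sentence is the set of even positive integers.
print([n for n in range(1, 7) if in_spectrum(n)])  # -> [2, 4, 6]
```

Since any first-order sentence can be model-checked on finite structures, membership of n in a spectrum is always decidable by such exhaustive search, though of course not efficiently; the survey's complexity-theoretic perspective concerns exactly how hard spectra are to recognize.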
We study the extension D(M) of dependence logic D by a majority quantifier M over finite structures. We show that the resulting logic is equi-expressive with the extension of second-order logic by second-order majority quantifiers of all arities. Our results imply that, from the point of view of descriptive complexity theory, D(M) captures the complexity class counting hierarchy. We also obtain characterizations of the individual levels of the counting hierarchy by fragments of D(M).
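For readers unfamiliar with the quantifier in question: the standard first-order semantics of a majority quantifier (which the second-order versions lift from elements to relations) makes (M x) φ(x) true iff more than half of the domain elements satisfy φ. An illustrative sketch, not drawn from the paper:

```python
def majority(domain, phi):
    """Semantics of a first-order majority quantifier (M x) phi(x):
    true iff strictly more than half of the domain satisfies phi."""
    return sum(1 for a in domain if phi(a)) > len(domain) / 2

# In the structure ({0,...,9}, <), the formula (M x)(x < 7) holds,
# while (M x)(x < 5) does not: exactly half is not a majority.
print(majority(range(10), lambda x: x < 7))  # True
print(majority(range(10), lambda x: x < 5))  # False
```

Counting how many assignments satisfy a formula, rather than merely asking whether one exists, is what links such quantifiers to counting-based complexity classes.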
In 2010, the Eurozone became the epicentre of the world crisis. The vulnerability of Europe appears to be linked to the specific institutional arrangement which organises monetary, financial and budgetary policies within the Eurozone. This article tries to understand the evolution of the EU during a short but decisive historical sequence in a theoretical framework that puts elements of Gramsci’s reflections on the theme of crisis, and especially his notion of ‘Caesarism’, at its centre. It addresses the current debate concerning the relationships between democratic politics and neoliberalism, while focusing on how the radicalisation of the crisis put at stake the co-construction of capitalism and representative democracy in the Western world since WWII.
The article argues the need to rehabilitate the concept of alienation within the post-Fordist model of production, insofar as it is the concept which – if we leave aside the general analysis of wage-labour within capitalist social structures – is best able to explain how the newly devised tools for the management of labour and, most importantly, for the mobilisation of the subjectivities of salaried workers, lead to a reduction both of their autonomy in work and of their opportunities for escape from the stipulated behavioural norms. Finally, it is argued that the procedures for the organisation of production and labour, along with the mechanisms intrinsic to the mobilisation of labour, establish, to an unprecedented degree, the conditions for a denegation of alienation, thus consolidating it.
The notion of energy transition is today one of the flagship figures of the broader concept of sustainable development. It is marked by a double temporality: the urgency of its implementation and the long time horizon it claims for itself. The politicisation of this transition generates conflicts, notably around large infrastructure projects, which this article analyses by crossing the temporalities and the scales of action of the different actors involved. It highlights a number of tensions between climate urgency and sustainability in the politicisation of the energy transition, but also the complexity of the temporalities the actors invoke, which interlock, sometimes contradict one another, and can be a source of conflict. The argument proceeds in three stages, organised around the European, national and local scales of the energy transition. Three field studies support the proposed analyses: a survey of the European network of electricity transmission system operators responsible for the investment plan for electricity transmission grids, a study of the conflicts linked to the siting of marine renewable energies in France, and a Greek case study on conflicts over wind-farm projects in Crete.
The Type VI secretion system is a multiprotein and mosaic apparatus that delivers protein effectors into prokaryotic or eukaryotic cells. Recent data on the enteroaggregative Escherichia coli T6SS have provided evidence that the TssA protein is a key component during T6SS biogenesis. The T6SS comprises a trans-envelope complex that docks the baseplate, a cytoplasmic complex that represents the assembly platform for the tail. The T6SS tail is structurally, evolutionarily and functionally similar to the contractile tails of bacteriophages. We have shown that TssA docks to the membrane complex, recruits the baseplate complex and initiates and coordinates the polymerization of the inner tube with that of the sheath. Here, we review these recent findings, discuss the variations within TssA-like proteins, speculate on the role of EAEC TssA in T6SS biogenesis and propose future research perspectives. In the environment, bacteria have to cope with other microbial species and therefore have evolved anti-microbial mechanisms. The bacterial Type VI secretion system transports toxins into bacterial and/or eukaryotic cells using a contractile mechanism. At the architectural level, the T6SS could be viewed as a nano-speargun assembled in the bacterial cytoplasm and anchored to a trans-envelope complex. We describe and speculate on recent findings demonstrating that the TssA subunit is involved in the different stages of T6SS biogenesis.
This paper is concerned with the description of the process of measurement within the context of a quantum theory of the physical world. It is noted that quantum mechanics permits a quasi-classical description (classical in the limited sense implied by the correspondence principle of Bohr) of those macroscopic phenomena in terms of which the observer forms his perceptions. Thus, the process of measurement in quantum mechanics can be understood on the quasi-classical level by transcribing from the strictly classical observables of Newtonian physics to their quasi-classical counterparts the known rules for the measurement of the former. The remaining physical problem is the delineation of the circumstances in which the correlation of a peculiarly quantum mechanical observable A with a classically measurable observable B can result in a significant measurement of A. This is undertaken within the context of quantum theory. The resulting clarification of the process of measurement has important implications relative to the philosophic interpretation of quantum mechanics.