The Structure of Emotions argues that emotion concepts should have a much more important role in the social and behavioural sciences than they now enjoy, and shows that certain influential psychological theories of emotions overlook the explanatory power of our emotion concepts. Professor Gordon also outlines a new account of the nature of commonsense (or ‘folk’) psychology in general.
A new probabilistic-informational concept, earlier constructed by Mugur-Schächter, is further developed. Associated with Jaynes's principle, this concept permits one to define a measure for the distance between the state of a system evolving under stable constraints and the equilibrium with these constraints. An illustration is given for a gas evolving in a thermostated box. It appears that the free energy of the gas estimates the distance to equilibrium, the estimation being defined in abstract informational-probabilistic terms.
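The abstract does not give the measure's formula, but under Jaynes's principle the equilibrium compatible with stable constraints is the maximum-entropy (Boltzmann) distribution, and a standard informational distance from a state p to that equilibrium is the Kullback-Leibler divergence, which (with kT = 1) equals the free-energy excess F(p) − F_eq. The following is a minimal sketch of that standard identity, not the paper's own construction; the three-level system and all names are illustrative choices:

```python
import math

# Toy three-level system (illustrative values, not from the paper); kT = 1.
energies = [0.0, 1.0, 2.0]
beta = 1.0  # inverse temperature

# Jaynes / maximum-entropy equilibrium under the energy constraint:
# the Boltzmann distribution p_eq[i] = exp(-beta*E_i) / Z.
Z = sum(math.exp(-beta * e) for e in energies)
p_eq = [math.exp(-beta * e) / Z for e in energies]

# An arbitrary non-equilibrium state over the same levels.
p = [0.5, 0.4, 0.1]

# Informational distance to equilibrium: Kullback-Leibler divergence.
kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, p_eq))

# Thermodynamic side: free energy F = U - S/beta of the state p,
# and the equilibrium free energy F_eq = -ln(Z)/beta.
U = sum(pi * e for pi, e in zip(p, energies))   # mean energy
S = -sum(pi * math.log(pi) for pi in p)         # Gibbs/Shannon entropy
F = U - S / beta
F_eq = -math.log(Z) / beta

# The free-energy excess estimates the distance to equilibrium:
assert abs(kl - beta * (F - F_eq)) < 1e-12
```

The assertion holds for any state p over the same levels, which is the sense in which the free energy of the gas "estimates the distance to equilibrium" in purely informational-probabilistic terms.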
This essay considers the implications of President George W. Bush's proposal for human embryonic stem cell research. Through the perspectives of patent law, privacy, and informed consent, we elucidate the ongoing controversy about the moral standing of human embryonic stem cells and their derivatives and consider how the inconsistencies in the president's proposal will affect clinical practice and research.
Background -- Overview of legal sources -- Summary of recent prosecutions and investigations -- Applications of law and professional and trade association standards to physician relationships with industry -- Legal and ethical aspects of specific physician-industry financial relationships -- Approaching and adopting effective compliance plans.
Taking our lead from Solomon’s emphasis on the importance of the intentional object of emotion, we review the history of repeated attempts to make this object disappear. We adduce evidence suggesting that in the case of James and Schachter, the intentional object got lost unintentionally. By contrast, modern constructivists (in particular Barrett) seem quite determined to deny the centrality of the intentional object in accounting for the occurrence of emotions. Griffiths, however, downplays the role objects have in emotion, noting that these do not qualify as intentional. We argue that these disappearing acts, deliberate or not, generate fruitless debate and add little to the advancement of our understanding of emotion as an adaptive mechanism for coping with events that are relevant to an organism’s life.
Empirical evidence shows that non-conscious appraisal processes generate bodily responses to the environment. This finding is consistent with William James’s account of emotion, and it suggests that a general theory of emotion should follow James: a general theory should begin with the observation that physiological and behavioral responses precede our emotional experience. But I advance three arguments, both empirical and conceptual, showing that James’s further account of emotion as the experience of bodily responses is inadequate. I offer an alternative model, according to which responses (physical states) are perceived and interpreted by a separate cognitive process, one that assigns meaning to those responses. The non-conscious appraisal process and the interpretive process are distinct, hence a two-stage model of emotion. This model is related to Schachter and Singer’s two-factor theory. Their often-discussed experiment showed that interpretation can play a role in producing emotions. But it did not show that interpretation is necessary for producing emotions in general, outside of the experimental conditions that generated unexplained arousal in subjects. My two-stage model supports this stronger claim by situating the interpretive process in a comprehensive model of emotion.
A formal system of “questions” and “propositions” conceived by C. Piron, and claimed to yield by interpretation quantum mechanics as well as all other known physical theories, is examined. It is proved that the system is syntactically self-consistent in the sense of the theory of models. However, it is found that the formal system possesses certain syntactic characteristics in consequence of which its qualification as a generator of quantum mechanics by interpretation encounters semantic obstacles so grave that they annihilate any relevance of such a qualification.
The exact bearing of an important theorem proved by Wigner is established. The study brings out the fact that marginal conditions as well as mean conditions of a form currently required in joint probability attempts are in fact inadequate for the determination of a relevant concept of a joint probability. New vistas are thereby opened up.
This article explores the relation between Descartes’s appeal to God’s veracity and his connected notions of “metaphysical” and “moral” certainty. I do this by showing their roles in his proof of the external world, his position on other minds, and his position on the “beast-machine.” Descartes uses God’s veracity in the first proof, but not in the second or third. I suggest that the reason for this is that extending his appeal to God to other minds would have placed his beast-machine doctrine in jeopardy. I conclude by accounting for some Cartesian passages that might seem incompatible with my reading of moral certainty’s important role in his philosophy.
Surgical devices are often marketed before there is good evidence of their safety and effectiveness. Our paper discusses the ethical issues associated with the early marketing and use of new surgical devices from the perspectives of the six groups most concerned. Health Canada, which is responsible for licensing new surgical devices, should amend its requirements to include rigorous clinical trials that provide data on effectiveness and safety for each new product before it is marketed. Industry should comply with all Health Canada requirements to obtain licenses for new products. Until Health Canada requires effectiveness and safety data, industry should cooperate with physicians in appropriate studies before releasing new products and should make balanced presentations of all the available evidence. Surgeons should, before using a new surgical device, assess the evidence on its effectiveness and safety and ensure they are properly trained and competent in using the device. Surgeons should provide their patients with an evaluation of the available evidence and inform them about possible complications and the surgeon's level of experience with the new device. Patients, who should be given an honest evaluation of the available evidence, possible complications, and the surgeon's experience, should be encouraged to evaluate the evidence and information to their own satisfaction to ensure that fully informed consent is given. Health institutions, responsible for regulating practice within their walls, should review new devices for safety, effectiveness, and economic impact before allowing their use. They should also limit the use of new surgical devices to surgeons trained and competent in the new technology. Professional societies should provide guidance on the early adoption of new surgical devices and technologies.
We urge all those involved in the development, licensing, and use of new surgical devices to aim for higher ethical standards to protect the health and safety of patients requiring surgery. The lowest acceptable ethical standard would require device manufacturers to provide surgeons with accurate and timely information on the efficacy and safety of their products, allowing surgeons and patients to evaluate the evidence (and the significance of information not yet available) before surgery.
Bell's theorem is believed to establish that the quantum mechanical predictions do not generally admit a causal representation compatible with Einstein's principle of separability, thereby proving incompatibility between quantum mechanics and relativity. This interpretation is contested via two convergent approaches, which lead to a sharp distinction between quantum nonseparability and violation of Einstein's theory of relativity. In a first approach we explicate from the quantum mechanical formalism a concept of “reflected dependence.” Founded on this concept, we produce a causal representation of the quantum mechanical probability measure involved in Bell's proof which is clearly separable in Einstein's sense, i.e., it does not involve supraluminal velocities, and which is nevertheless “nonlocal” in Bell's sense. So Bell locality and Einstein separability are distinct qualifications, and Bell nonlocality (or Bell nonseparability) and Einstein separability are not incompatible. It is then proved explicitly that with respect to the mentioned representation Bell's derivation does not hold. So Bell's derivation does not establish that any Einstein-separable representation is incompatible with quantum mechanics. This first—negative—conclusion is a syntactic fact. The characteristics of the representation and of the reasoning involved in the mentioned counterexample to the usual interpretation of Bell's theorem suggest that the representation used—notwithstanding its ability to bring forth the specified syntactic fact—is not factually true. Factual truth and syntactic properties also have to be radically distinguished in their turn. So, in a second approach, starting from de Broglie's initial relativistic model of a microsystem, a deeper, factually acceptable representation is constructed.
The analyses leading to this second representation show that quantum mechanics does indeed involve a certain basic sort of nonseparability, called here de Broglie-Bohr quantum nonseparability. But the de Broglie-Bohr quantum nonseparability is shown to stem directly from the relativistic character of the considerations which led Louis de Broglie to the fundamental relation p = h/λ, thereby being essentially consistent with relativity. As to Einstein separability, it appears to be a still insufficiently specified concept of which a future, improved specification will probably be explicitly harmonizable with the de Broglie-Bohr quantum nonseparability. The ensemble of the conclusions obtained here brings forth a new concept of causality, a concept of folded, zigzag, reflexive causality, with respect to which the type of causality conceived of up to now appears as a particular case of outstretched, one-way causality. The reflexive causality is found compatible with the results of Aspect's experiment, and it suggests new experiments. Considered globally, the conclusions obtained in the present work might convert the conceptual situation created by Bell's proof into a process of unification of quantum mechanics and relativity.
In previous works we have established that the spacetime probabilistic organization of quantum theory is determined by the spacetime characteristics of the operations by which the observer produces the objects to be studied (“states” of microsystems) and obtains qualifications of these. Guided by this first conclusion, we have then built a “general syntax of relativized conceptualization” where any description is explicitly and systematically referred to the two basic epistemic operations by which the conceptor introduces the object to be qualified and then obtains qualifications of it. Inside this syntax there emerges a general typology of the relativized descriptions. Here we show that with respect to this typology the type of the predictive quantum mechanical descriptions acquires a precise definition. It appears that the quantum mechanical formalism has captured, and has expressed directly in a mathematical language, the most complex form in which a first descriptional phase, one that lies universally at the basis of any chain of conceptualization, can occur. The main features of the Hilbert-Dirac algorithms are decoded in terms of the general syntax of relativized conceptualization. This renders explicit the semantic contents of the quantum mechanical representations, relating each of these to its mathematical quantum mechanical expression. Basic insufficiencies are thus identified and, correlatively, false problems as well as answers to these, or guides toward answers. Globally, the results obtained provide a basis for future attempts at a general mathematical representation of the processes of conceptualization. “It could indeed be dangerous for the future of physics if it contented itself too easily with pure formalisms, blurred images, and purely verbal explanations couched in words of imprecise meaning”—Louis de Broglie, Certitudes et Incertitudes de la Science (Albin Michel, Paris, 1965).
In the first part of this work(1) we have explicated the spacetime structure of the probabilistic organization of quantum mechanics. We have shown that each quantum mechanical state, in consequence of the spacetime characteristics of the epistemic operations by which the observer produces the state to be studied and of the processes of qualification of these, brings in a tree-like spacetime structure, a “quantum mechanical probability tree,” that transgresses the theory of probabilities as it now stands. In this second part we develop the general implications of these results. Starting from the lowest level of cognitive action and creating an appropriate symbolism, we construct a “relativizing epistemic syntax,” a “general method of relativized conceptualization” where—systematically—each description is explicitly referred to the epistemic operations by which the observer produces the entity to be described and obtains qualifications of it. The method generates a typology of increasingly complex relativized descriptions where the question of realism admits of a particularly clear pronouncement. Inside this typology the epistemic processes that lie universally at the basis of any conceptualization reveal a tree-like spacetime structure. It appears in particular that the spacetime structure of the relativized representation of a probabilistic description, which transgresses the theory of probabilities as it now stands, is the general mould of which the quantum mechanical probability trees are only particular realizations. This entails a clear definition of the descriptional status of quantum mechanics.
At the same time, the recognition of the universal cognitive content of the quantum mechanical formalism opens up vistas toward mathematical developments of the relativizing epistemic syntax. The relativized representation of a probabilistic description leads with inner necessity to a “morphic” interpretation of probabilities that can be regarded as a formalized and deepened elaboration of Sir Karl Popper's “propensity” interpretation. A functional is then constructed, the “opacity functional,” that associates a mathematical expression with the Popperian “propensities.” Furthermore, the opacity functional produces a deductive definition of Shannon's “informational entropy.” Thereby there appears an explicitly unified relativized probabilistic-informational approach. This sketches out a second branch of a future mathematical epistemic syntax, to be connected with the branch stemming from quantum mechanics. The problem of the objectivity of probabilistic descriptions acquires certain precise rephrasings and—in a sense—a solution.
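The opacity functional itself is not reproduced in the abstract, but the Shannon informational entropy it is said to yield deductively has a standard form over any distribution of propensities. A minimal sketch of that standard quantity only (the distribution is an arbitrary example of mine, not the paper's):

```python
import math

def shannon_entropy(p):
    """Shannon informational entropy H(p) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 contribute nothing, by the convention 0*log(0) = 0.
    """
    return -sum(pi * math.log2(pi) for pi in p if pi > 0.0)

# An illustrative "propensity" distribution (my own example).
propensities = [0.5, 0.25, 0.25]
print(shannon_entropy(propensities))  # 1.5 bits
```

Entropy is maximal (log2 of the number of outcomes) for a uniform distribution and zero for a certain outcome, which is what makes it usable as an informational measure over propensities.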