Maxwell’s Demon is a thought experiment devised by J. C. Maxwell in 1867 to show that the Second Law of thermodynamics is not universal, since it has a counter-example. Since the Second Law is taken by many to provide an arrow of time, the threat to its universality threatens the account of temporal directionality as well. Various attempts to “exorcise” the Demon, by proving that it is impossible for one reason or another, have been made over the years, but none of them has been successful. We have shown (in a number of publications) by a general state-space argument that Maxwell’s Demon is compatible with classical mechanics, and that the most recent solutions, based on Landauer’s thesis, are not general. In this paper we demonstrate that Maxwell’s Demon is also compatible with quantum mechanics. We do so by analyzing a particular (but highly idealized) experimental setup and proving that it violates the Second Law. Our discussion is in the framework of standard quantum mechanics; we give two separate arguments, with and without the projection postulate. We address in our analysis the connection between measurement and erasure interactions, and we show how these notions apply in the microscopic quantum mechanical structure. We discuss what might be the quantum mechanical counterpart of the classical notion of “macrostates”, thus explaining why our Quantum Demon setup works not only at the micro level but also at the macro level, properly understood. One implication of our analysis is that the Second Law cannot provide a universal lawlike basis for an account of the arrow of time; this account has to be sought elsewhere.
In _Neuroscience and Philosophy_ three prominent philosophers and a leading neuroscientist clash over the conceptual presuppositions of cognitive neuroscience. The book begins with an excerpt from Maxwell Bennett and Peter Hacker's _Philosophical Foundations of Neuroscience_, which questions the conceptual commitments of cognitive neuroscientists. Their position is then criticized by Daniel Dennett and John Searle, two philosophers who have written extensively on the subject, and Bennett and Hacker in turn respond. Their impassioned debate encompasses a wide range of central themes: the nature of consciousness, the bearer and location of psychological attributes, the intelligibility of so-called brain maps and representations, the notion of qualia, the coherence of the notion of an intentional stance, and the relationships between mind, brain, and body. Clearly argued and thoroughly engaging, the book presents fundamentally different conceptions of philosophical method, cognitive-neuroscientific explanation, and human nature, and the exchange will appeal to anyone interested in the relation of mind to brain, of psychology to neuroscience, of causal to rational explanation, and of consciousness to self-consciousness. In his conclusion Daniel Robinson explains why this confrontation is so crucial to the understanding of neuroscientific research. The project of cognitive neuroscience, he asserts, depends on the incorporation of human nature into the framework of science itself. In Robinson's estimation, Dennett and Searle fail to support this undertaking; Bennett and Hacker suggest that the project itself might be based on a conceptual mistake. Exciting and challenging, _Neuroscience and Philosophy_ is an exceptional introduction to the philosophical problems raised by cognitive neuroscience.
I address a question recently raised by Simon Saunders [Phil. Sci. 80: 22-48] concerning the relationship between the spacetime structure of Newton-Cartan theory and that of what I will call "Maxwell-Huygens spacetime". This discussion will also clarify a connection between Saunders' work and a recent paper by Eleanor Knox [Brit. J. Phil. Sci. 65: 863-880].
This article gives an explicit presentation of Newtonian gravitation on the backdrop of Maxwell space-time, giving a sense in which acceleration is relative in gravitational theory. However, caution is needed: assessing whether this is a robust or interesting sense of the relativity of acceleration depends on some subtle technical issues and on substantive philosophical questions over how to identify the space-time structure of a theory.
We present a multiscale integrationist interpretation of the boundaries of cognitive systems, using the Markov blanket formalism of the variational free energy principle. This interpretation is intended as a corrective for the philosophical debate over internalist and externalist interpretations of cognitive boundaries; we stake out a compromise position. We first survey key principles of new radical views of cognition. We then describe an internalist interpretation premised on the Markov blanket formalism. Having reviewed these accounts, we develop our positive multiscale account. We argue that the statistical seclusion of internal from external states of the system—entailed by the existence of a Markov boundary—can coexist happily with the multiscale integration of the system through its dynamics. Our approach does not privilege any given boundary, nor does it argue that all boundaries are equally prescient. We argue that the relevant boundaries of cognition depend on the level being characterised and the explanatory interests that guide investigation. We approach the issue of how and where to draw the boundaries of cognitive systems through a multiscale ontology of cognitive systems, which offers a multidisciplinary research heuristic for cognitive science.
This paper proves that Maxwell's Demon is compatible with classical mechanics. In particular it shows how the cycle of operation - including measurement and erasure - can be carried out with no entropy cost, contrary to the Landauer-Bennett thesis (according to which memory erasure costs k ln 2 of entropy increase per bit). The Landauer-Bennett thesis is thus proven to be mistaken.
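For reference, the k ln 2 figure invoked by the Landauer-Bennett thesis corresponds, at temperature T, to a minimum heat dissipation of kT ln 2 per erased bit. A minimal numerical sketch of that quantity (the constants and variable names here are illustrative, not drawn from the paper):

```python
import math

# Boltzmann's constant (J/K) and an assumed room temperature (K)
k_B = 1.380649e-23
T = 300.0

# Landauer bound: the minimum entropy increase per erased bit is k ln 2,
# so the minimum heat dissipated into the environment is k T ln 2.
delta_S_per_bit = k_B * math.log(2)   # entropy cost per bit, J/K
heat_per_bit = k_B * T * math.log(2)  # heat per bit at temperature T, J

print(f"entropy cost per bit: {delta_S_per_bit:.3e} J/K")
print(f"heat per bit at 300 K: {heat_per_bit:.3e} J")
```

At room temperature this comes to roughly 2.9 × 10⁻²¹ J per bit, the quantity the erasure debate turns on.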
The Comprehensibility of the Universe puts forward a radically new conception of science. According to the orthodox conception, scientific theories are accepted and rejected impartially with respect to evidence, no permanent assumption being made about the world independently of the evidence. Nicholas Maxwell argues that this orthodox view is untenable. He urges that in its place a new orthodoxy is needed, which sees science as making a hierarchy of metaphysical assumptions about the comprehensibility and knowability of the universe, these assumptions asserting less and less as one ascends the hierarchy. This view has significant implications: that it is part of scientific knowledge that the universe is physically comprehensible; that metaphysics and philosophy are central to scientific knowledge; that science possesses a rational, if fallible, method of discovery; that a new understanding of scientific method and rationality is required. Maxwell argues that this new conception makes possible a natural resolution of long-standing philosophical problems about science, regarding simplicity, induction, and progress. His goal is the reform not just of the philosophy of science but of science itself, and the healing of the rift between the two.
The fact that the same equations or mathematical models reappear in the descriptions of what are otherwise disparate physical systems can be seen as yet another manifestation of Wigner's “unreasonable effectiveness of mathematics.” James Clerk Maxwell famously exploited such formal similarities in what he called the “method of physical analogy.” Both Maxwell and Hermann von Helmholtz appealed to the physical analogies between electromagnetism and hydrodynamics in their development of these theories. I argue that a closer historical examination of the different ways in which Maxwell and Helmholtz each deployed this analogy gives further insight into debates about the representational and explanatory power of mathematical models.
This book argues for the need to put into practice a profound and comprehensive intellectual revolution, affecting to a greater or lesser extent all branches of scientific and technological research, scholarship and education. This intellectual revolution differs, however, from the now familiar kind of scientific revolution described by Kuhn. It does not primarily involve a radical change in what we take to be knowledge about some aspect of the world, a change of paradigm. Rather it involves a radical change in the fundamental, overall intellectual aims and methods of inquiry. At present inquiry is devoted to the enhancement of knowledge. This needs to be transformed into a kind of rational inquiry having as its basic aim to enhance personal and social wisdom. This new kind of inquiry gives intellectual priority to the personal and social problems we encounter in our lives as we strive to realize what is desirable and of value – problems of knowledge and technology being intellectually subordinate and secondary. For this new kind of inquiry, it is what we do and what we are that ultimately matters: our knowledge is but an aspect of our life and being.
In this paper I expound an argument which seems to establish that probabilism and special relativity are incompatible. I examine the argument critically, and consider its implications for interpretative problems of quantum theory, and for theoretical physics as a whole.
Must a Maxwell demon fail to reverse the second law of thermodynamics? Standard attempts to show that it must fail make use of notions of information and computation. None of these attempts has succeeded. Worse, they have distracted both supporters and opponents of these attempts from a much simpler demonstration of the necessary failure of a Maxwell's demon that employs no notions of information or computation. It requires only Liouville's theorem and its quantum analog.
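The Liouville-theorem route mentioned here rests on the fact that Hamiltonian flow preserves phase-space volume, so the Jacobian determinant of the flow map equals 1. A minimal sketch of that fact, using a symplectic leapfrog step for a harmonic oscillator (our illustration, not the paper's argument):

```python
# Liouville's theorem: Hamiltonian time evolution preserves phase-space
# volume. For a linear flow map this means its Jacobian determinant is 1.
# Sketch for a harmonic oscillator H = (p^2 + q^2)/2, advanced by one
# symplectic leapfrog step.

def leapfrog(q, p, dt=0.1):
    p_half = p - 0.5 * dt * q         # half kick: dp/dt = -dH/dq = -q
    q_new = q + dt * p_half           # drift:     dq/dt =  dH/dp =  p
    p_new = p_half - 0.5 * dt * q_new # second half kick
    return q_new, p_new

# The map is linear, so its Jacobian matrix is read off from the images
# of the basis vectors (1, 0) and (0, 1) of the (q, p) plane.
a, c = leapfrog(1.0, 0.0)
b, d = leapfrog(0.0, 1.0)
det = a * d - b * c
print(f"Jacobian determinant: {det}")  # 1 up to floating-point round-off
```

The determinant comes out as exactly 1 (up to round-off), illustrating the volume preservation the argument relies on.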
We are in a state of impending crisis. And the fault lies in part with academia. For two centuries or so, academia has been devoted to the pursuit of knowledge and technological know-how. This has enormously increased our power to act which has, in turn, brought us both all the great benefits of the modern world and the crises we now face. Modern science and technology have made possible modern industry and agriculture, the explosive growth of the world’s population, global warming, modern armaments and the lethal character of modern warfare, destruction of natural habitats and rapid extinction of species, immense inequalities of wealth and power across the globe, pollution of earth, sea and air, even the AIDS epidemic (AIDS being spread by modern travel). All these global problems have arisen because some of us have acquired unprecedented powers to act, via science and technology, without also acquiring the capacity to act wisely.
Incommensurability was Kuhn’s worst mistake. If it is to be found anywhere in science, it would be in physics. But revolutions in theoretical physics all embody theoretical unification. Far from obliterating the idea that there is a persisting theoretical idea in physics, revolutions do just the opposite: they all actually exemplify the persisting idea of underlying unity. Furthermore, persistent acceptance of unifying theories in physics when empirically more successful disunified rivals can always be concocted means that physics makes a persistent implicit assumption concerning unity. To put it in Kuhnian terms, underlying unity is a paradigm for paradigms. We need a conception of science which represents problematic assumptions concerning the physical comprehensibility and knowability of the universe in the form of a hierarchy, these assumptions becoming less and less substantial and more and more such that their truth is required for science, or the pursuit of knowledge, to be possible at all, as one goes up the hierarchy. This hierarchical conception of science has important Kuhnian features, but also differs dramatically from the view Kuhn expounds in his The Structure of Scientific Revolutions. In this paper, I compare and contrast these two views in a much more detailed way than has been done hitherto. I show how the hierarchical view can be construed to emerge from Kuhn’s view as it is modified to overcome objections. I argue that the hierarchical conception of science is to be preferred to Kuhn’s view.
The term “analogy” stands for a variety of methodological practices all related in one way or another to the idea of proportionality. We claim that in his first substantial contribution to electromagnetism James Clerk Maxwell developed a methodology of analogy which was completely new at the time or, to borrow John North’s expression, Maxwell’s methodology was a “newly contrived analogue”. In his initial response to Michael Faraday’s experimental researches in electromagnetism, Maxwell did not seek an analogy with some physical system in a domain different from electromagnetism as advocated by William Thomson; rather, he constructed an entirely artificial one to suit his needs. Following North, we claim that the modification which Maxwell introduced to the methodology of analogy has not been properly appreciated. In view of our examination of the evidence, we argue that Maxwell gave a new meaning to analogy; in fact, it comes close to modeling in current usage.
Cutting God in Half argues that, in order to tackle climate change, world poverty, extinction of species and our other global problems rather better than we are doing at present we need to bring about a revolution in science, and in academia more generally. We need to put our problems of living – personal, social, global – at the heart of the academic enterprise. How our human world, imbued with meaning and value, can exist and best flourish embedded in the physical universe is, the book argues, our basic problem. It is our fundamental philosophical problem, our fundamental problem of knowledge and understanding, and our fundamental practical problem of living. It is this problem that we fail, at present, to recognize as fundamental – to our cost. It can be understood to arise as a result of cutting God in half – severing the God of Cosmic Power from the God of Value. The first is Einstein’s God, the underlying unity in the physical universe that determines how events occur. The second is what is of most value associated with human life – and sentient life more generally. Having cut God in half in this way, the problem then becomes to see how the two halves can be put together again. This book tackles outstanding aspects of this problem, and in doing so throws out original ideas about science, education, religion, evolutionary theory, free will, quantum theory, and how we should go about tackling our impending global crises. It transpires that bringing our basic problem into sharp focus has revolutionary implications. It becomes clear how and why many aspects of our social and cultural world urgently need to be transformed. Cutting God in Half is written in a lively, accessible style, and ought to be essential reading for anyone concerned about ultimate questions – the nature of the universe, the meaning of life, the future of humanity.
This paper aims to address the relevance of the natural sciences for transcendental phenomenology, that is, the issue of naturalism. The first section distinguishes three varieties of naturalism and corresponding forms of naturalization: an ontological one, a methodological one, and an epistemological one. In light of these distinctions, in the second section, I examine the main projects aiming to “naturalize phenomenology”: neurophenomenology, front-loaded phenomenology, and formalized approaches to phenomenology. The third section then considers the commitments of Husserl’s transcendental phenomenology with respect to the three varieties of naturalism previously discussed. I argue that Husserl rejected strong and weak forms of epistemological naturalism, strong methodological naturalism, and ontological naturalism. The fourth section presents the argument that Husserl endorsed a weak, conditional form of methodological naturalism. This point is illustrated with Husserl’s proposal of “somatology,” a natural science apt to study the corporeality of the lived body. The final section addresses the complementarity and respective limits of the transcendental phenomenological and the natural scientific frameworks. I argue that, on Husserl’s account, the function of transcendental phenomenology with respect to the natural sciences is to provide them with an epistemological foundation and an ontological clarification. I suggest that certain natural sciences can be understood, within the transcendental phenomenological framework, as “sciences of constitution,” that is, as sciences investigating the contribution of real structures acting as conditions of possibility for the occurrence of certain kinds of comprehensive unities in lived experience.
In this paper, I argue that the recent discussion on the time-reversal invariance of classical electrodynamics (see (Albert 2000: ch.1), (Arntzenius 2004), (Earman 2002), (Malament 2004), (Horwich 1987: ch.3)) can be best understood assuming that the disagreement among the various authors is actually a disagreement about the metaphysics of classical electrodynamics. If so, the controversy will not be resolved until we have established which alternative is the most natural. It turns out that we have a paradox, namely that the following three claims are incompatible: the electromagnetic fields are real, classical electrodynamics is time-reversal invariant, and the content of the state of affairs of the world does not depend on whether it belongs to a forward or a backward sequence of states of the world.
This paper presents a praxiological analysis of three everyday educational practices or strategies that can be considered as being directed at the moral formation of the emotions. The first consists in requests to imagine others' emotional reactions. The second comprises requests to imitate normative emotional reactions, and the third requests to re-appraise the features of a situation that are relevant to an emotional response. The interest of these categories is not just that they help to organize and recognize the significance of what might otherwise appear to be a disparate set of ordinary moral-educational interactions between children and educators. We suggest, further, that this analysis provides some new insight into what distinguishes the broad and recurrent conceptions of moral education from one another. Rather than being straightforwardly reducible to intractable differences over core normative or meta-ethical questions, they can also be seen as correlating with different suppositions about the central role of the emotions in moral life and, correspondingly, different but to a large degree compatible interpretations of what the "education of the moral emotions" primarily means.
This paper argues that philosophers of science have before them an important new task that they urgently need to take up. It is to convince the scientific community to adopt and implement a new philosophy of science that does better justice to the deeply problematic basic intellectual aims of science than that which we have at present. Problematic aims evolve with evolving knowledge, that part of philosophy of science concerned with aims and methods thus becoming an integral part of science itself. The outcome of putting this new philosophy into scientific practice would be a new kind of science, both more intellectually rigorous and one that does better justice to the best interests of humanity.
Karl Popper is famous for having proposed that science advances by a process of conjecture and refutation. He is also famous for defending the open society against what he saw as its arch enemies – Plato and Marx. Popper’s contributions to thought are of profound importance, but they are not the last word on the subject. They need to be improved. My concern in this book is to spell out what is of greatest importance in Popper’s work, what its failings are, how it needs to be improved to overcome these failings, and what implications emerge as a result. The book consists of a collection of essays which dramatically develop Karl Popper’s views about natural and social science, and how we should go about trying to solve social problems. Criticism of Popper’s falsificationist philosophy of natural science leads to a conception of science that I call aim-oriented empiricism. This makes explicit metaphysical theses concerning the comprehensibility and knowability of the universe that are an implicit part of scientific knowledge – implicit in the way science excludes all theories that are not explanatory, even those that are more successful empirically than accepted theories. Aim-oriented empiricism has major implications, not just for the academic discipline of philosophy of science, but for science itself. Popper generalized his philosophy of science of falsificationism to arrive at a new conception of rationality – critical rationalism – the key methodological idea of Popper’s profound critical exploration of political and social issues in his The Open Society and Its Enemies, and The Poverty of Historicism. This path of Popper, from scientific method to rationality and social and political issues is followed here, but the starting point is aim-oriented empiricism rather than falsificationism. Aim-oriented empiricism is generalized to form a conception of rationality I call aim-oriented rationalism.
This has far-reaching implications for political and social issues, for the nature of social inquiry and the humanities, and indeed for academic inquiry as a whole. The strategies for tackling social problems that arise from aim-oriented rationalism improve on Popper’s recommended strategies of piecemeal social engineering and critical rationalism, associated with Popper’s conception of the open society. This book thus sets out to develop Popper’s philosophy in new and fruitful directions. The theme of the book, in short, is to discover what can be learned from scientific progress about how to achieve social progress towards a better world.
The aim of this paper is twofold: (1) to assess whether the construct of neural representations plays an explanatory role under the variational free-energy principle and its corollary process theory, active inference; and (2) if so, to assess which philosophical stance - in relation to the ontological and epistemological status of representations - is most appropriate. We focus on non-realist (deflationary and fictionalist-instrumentalist) approaches. We consider a deflationary account of mental representation, according to which the explanatorily relevant contents of neural representations are mathematical, rather than cognitive, and a fictionalist or instrumentalist account, according to which representations are scientifically useful fictions that serve explanatory (and other) aims. After reviewing the free-energy principle and active inference, we argue that the model of adaptive phenotypes under the free-energy principle can be used to furnish a formal semantics, enabling us to assign semantic content to specific phenotypic states (the internal states of a Markovian system that exists far from equilibrium). We propose a modified fictionalist account: an organism-centered fictionalism or instrumentalism. We argue that, under the free-energy principle, pursuing even a deflationary account of the content of neural representations licenses the appeal to the kind of semantic content involved in the aboutness or intentionality of cognitive systems; our position is thus coherent with, but rests on distinct assumptions from, the realist position. We argue that the free-energy principle thereby explains the aboutness or intentionality in living systems and hence their capacity to parse their sensory stream using an ontology or set of semantic factors.
This article provides a narrative review of the scholarly writings on professional ethics education for future teachers. Against the background of a widespread belief among scholars working in this area that longstanding and sustained research and reflection on the ethics of teaching have had little impact on the teacher education curriculum, the article takes stock of the field by synthesizing viewpoints on key aspects of teaching ethics to teacher candidates—the role ethics plays in teacher education, the primary objectives of ethics education for teachers, recommended teaching and learning strategies, and challenges to introducing ethics curriculum—and maps out how opinions on these matters have evolved over the three decades since the initial publication of Strike and Soltis’ seminal book, The Ethics of Teaching. In light of the review’s results, the article identifies critical deficits in this literature and proposes a set of recommendations for future inquiry.
Boltzmann's gas model representing the second law of thermodynamics is based on the improbability of certain molecular distributions in space. Maxwell argued that a hypothetical ‘being’ with the faculty of seeing individual molecules could bring about such improbable distributions, thus violating the law of entropy. However, it appears that to render the molecules visible for any observer would increase the entropy more than the demon could decrease it, hence ‘Maxwell's Demon cannot operate’. In the study presented here, Maxwell's Demon is interpreted in a general way as a biological observer system within systems which can upset thermodynamic probabilities, provided that the relative magnitudes between observer system and observed system are appropriate. Maxwell's Demon within Boltzmann's Gas Model thus appears only as a special case of inappropriate relative magnitude between the two systems.
It is generally accepted, following Landauer and Bennett, that the process of measurement involves no minimum entropy cost, but the erasure of information in resetting the memory register of a computer to zero requires dissipating heat into the environment. This thesis has been challenged recently in a two-part article by Earman and Norton. I review some relevant observations in the thermodynamics of computation and argue that Earman and Norton are mistaken: there is in principle no entropy cost to the acquisition of information, but the destruction of information does involve an irreducible entropy cost.
History of Cognitive Neuroscience documents the major neuroscientific experiments and theories over the last century and a half in the domain of cognitive neuroscience, and evaluates the cogency of the conclusions that have been drawn from them. It provides a companion work to the highly acclaimed Philosophical Foundations of Neuroscience, combining scientific detail with philosophical insights; views the evolution of brain science through the lens of its principal figures and experiments; addresses philosophical criticism of Bennett and Hacker's previous book; and is accompanied by more than 100 illustrations.
Is Science Neurotic? sets out to show that science suffers from a damaging but rarely noticed methodological disease — “rationalistic neurosis.” Assumptions concerning metaphysics, human value and politics, implicit in the aims of science, are repressed, and the malaise has spread to affect the whole academic enterprise, with the potential for extraordinarily damaging long-term consequences. The book begins with a discussion of the aims and methods of natural science, and moves on to discuss social science, philosophy, education, psychoanalytic theory and academic inquiry as a whole. It makes an original and compelling contribution to the current debate between those for and those against science, arguing that science would be of greater human value if it were more rigorous — we suffer not from too much scientific rationality, but too little. The author discusses the need for a revolution in the aims of science and academic inquiry in general and, in a lively and accessible style, spells out a thesis with profound importance for the long-term future of humanity.
From Knowledge to Wisdom argues that there is an urgent need, for both intellectual and humanitarian reasons, to bring about a revolution in science and the humanities. The outcome would be a kind of academic inquiry rationally devoted to helping humanity learn how to create a better world. Instead of giving priority to solving problems of knowledge, as at present, academia would devote itself to helping us solve our immense, current global problems – climate change, war, poverty, population growth, pollution of sea, earth and air, destruction of natural habitats and rapid extinction of species, injustice, tyranny, proliferation of armaments, conventional, chemical, biological and nuclear, depletion of natural resources. The basic intellectual aim of inquiry would be to seek and promote wisdom – wisdom being the capacity to realize what is of value in life for oneself and others, thus including knowledge and technological know-how, but much else besides. This second edition has been revised throughout, has additional material, a new introduction and three new chapters.
What ought to be the aims of science? How can science best serve humanity? What would an ideal science be like, a science that is sensitively and humanely responsive to the needs, problems and aspirations of people? How ought the institutional enterprise of science to be related to the rest of society? What ought to be the relationship between science and art, thought and feeling, reason and desire, mind and heart? Should the social sciences model themselves on the natural sciences, or ought they to take a different form if they are to serve the interests of humanity objectively, sensitively and rigorously? Might it be possible to get into human life, into art, education, politics, industry, international affairs, and other domains of human activity, the same kind of progressive success that is found so strikingly, on the intellectual level, within science? These are some of the questions tackled by What’s Wrong With Science? But the book is no abstruse treatise on the philosophy of science. Most of it takes the form of a passionate debate between a Scientist and a Philosopher, a debate that is by turns humorous, ironical, bitter, dramatically explosive. Even as the argument explores the relationship between thought and feeling, reason and desire, the two main protagonists find it necessary to examine their own feelings and motivations.
Modern science began as natural philosophy. In the time of Newton, what we call science and philosophy today – the disparate endeavours – formed one mutually interacting, integrated endeavour of natural philosophy: to improve our knowledge and understanding of the universe, and to improve our understanding of ourselves as a part of it. Profound, indeed unprecedented discoveries were made. But then natural philosophy died. It split into science on the one hand, and philosophy on the other. This happened during the 18th and 19th centuries, and the split is now built into our intellectual landscape. But the two fragments, science and philosophy, are defective shadows of the glorious unified endeavour of natural philosophy. Rigour, sheer intellectual good sense and decisive argument demand that we put the two together again, and rediscover the immense merits of the integrated enterprise of natural philosophy. This requires an intellectual revolution, with dramatic implications for how we understand our world, how we understand and do science, and how we understand and do philosophy. There are dramatic implications, too, for education, and for the entire academic endeavour, and its capacity to help us discover how to tackle more successfully our immense global problems.
Consumer actions towards multinationals encompass not just expressions of dissatisfaction and ethical identity but also what are problematically termed ‘instrumental actions’ entailing perceived purposes and likely impacts. This term may seem inappropriate where insufficient information exists for instrumentally linking means to ends, yet we consider it useful for describing purposive consumer action in its subjective aspect, because it reflects the psychological reality whereby complexity-reducing social constructions give consumer actions instrumentally rational form for purposes of meaningful understanding and justification. This paper explores the complexities of cause and intention, particularly ethical intention, which are thus reduced. In particular, it considers the complex interaction between individual ethical values, demographic factors and contexts of societal practice. It seeks to highlight primary antecedents among these interactants in order to guide both consumers and multinationals in their complexity-reducing social constructions, improving their fit to true causes and intentions. Study 1 involved 606 United Kingdom nationals, while study 2 involved 2561 individuals from 15 nations. Both sets of findings link higher personal income levels to a propensity to engage in instrumental actions towards multinationals. Overwhelmingly, however, individual ethical values seem to matter most, irrespective of demographic or cultural contexts. These findings suggest that both consumers and multinationals engaged in ethical dialogue with consumers are best advised to articulate a universalising ethical intelligence, not one that is culturally or nationally bound, which speaks directly to conscience within a global ethical discourse.
We investigate Maxwell's attempt to justify the mathematical assumptions behind his 1860 Proposition IV, according to which the velocity components of colliding particles follow the normal distribution. Contrary to the commonly held view, we find that his molecular collision model plays a crucial role in reaching this conclusion, and that his model assumptions also permit inference to equalization of mean kinetic energies, which is what he intended to prove in his discredited and widely ignored Proposition VI. If we take a charitable reading of his own proof of Proposition VI, then it was Maxwell, and not Boltzmann, who gave the first proof of a tendency towards equilibrium, a sort of H-theorem. We also call attention to a potential conflation of notions of probabilistic and value independence in relevant prior works of his contemporaries and of his own, and argue that this conflation might have impacted his adoption of the suspect independence assumption of Proposition IV.
Three experiments are reported in which the relationships between task format, item type, and strategy usage were investigated for a two-dimensional relational inference task. Contrary to past findings with linear syllogisms, it was found that parallel presentation (presenting problem statements simultaneously) did not result in any increased use of deduction rule processes compared with serial presentation (presenting problem statements individually). Instead, the results suggested that mental models were used by the majority of subjects, and that multiple models were more likely to be constructed with parallel presentation. It is proposed that, in general, multiple model construction will be more frequent for deduction tasks where the cognitive load is relatively low. Hence, contrary to suggestions by Polk and Newell (1995), reasoning in this way appears to be prevalent and highly robust, where supported by task format, even where the use of this strategy is disadvantageous.
In this essay, Bruce Maxwell, David Waddington, Kevin McDonough, Andrée-Anne Cormier, and Marina Schwimmer compare two competing approaches to social integration policy, Multiculturalism and Interculturalism, from the perspective of the issue of the state funding and regulation of conservative religious schools. After identifying the key differences between Interculturalism and Multiculturalism, as well as their many similarities, the authors present an explanatory analysis of this intractable policy challenge. Conservative religious schooling, they argue, tests a conceptual tension inherent in Multiculturalism between respect for group diversity and autonomy, on the one hand, and the ideal of intercultural citizenship, on the other. Taking as a case study Québec's education system and, in particular, recent curricular innovations aimed at helping young people acquire the capabilities of intercultural citizenship, the authors illustrate how Interculturalism signals a compelling way forward in the effort to overcome the political dilemma of conservative religious schooling.
Two arguments have recently been advanced that Maxwell-Boltzmann particles are indistinguishable just like Bose–Einstein and Fermi–Dirac particles. Bringing modal metaphysics to bear on these arguments shows that ontological indistinguishability for classical (MB) particles does not follow. The first argument, resting on symmetry in the occupation representation for all three cases, fails since peculiar correlations exist in the quantum (BE and FD) context as harbingers of ontic indistinguishability, while the indistinguishability of classical particles remains purely epistemic. The second argument, deriving from the classical limits of quantum statistical partition functions, embodies a conceptual confusion. After clarifying the doctrine of haecceitism, a third argument is considered that attempts to deflate metaphysical concerns altogether by showing that the phase-space and distribution-space representations of MB statistics have contrary haecceitistic import. Careful analysis shows this argument to fail as well, leaving de re modality unproblematically grounding particle identity in the classical context while genuine puzzlement about the underlying ontology remains for quantum statistics.
A particle of molecular dimensions which can exist in two states is associated with a membrane pore through which molecules of a gas can pass. The gas molecules from two identical phases on either side of the membrane may pass only when the particle is in one particular state. If certain restrictions are imposed on the system, then the particle appears to act like a Maxwell's Demon (1) which “handles” the gas molecules during their passage through the pore.
In evolutionary biology, niche construction is sometimes described as a genuine evolutionary process whereby organisms, through their activities and regulatory mechanisms, modify their environment so as to steer their own evolutionary trajectory, and that of other species. There is ongoing debate, however, on the extent to which niche construction ought to be considered a bona fide evolutionary force, on a par with natural selection. Recent formulations of the variational free-energy principle as applied to the life sciences describe the properties of living systems, and their selection in evolution, in terms of variational inference. We argue that niche construction can be described using a variational approach. We propose new arguments to support the niche construction perspective, and to extend the variational approach to niche construction to current perspectives in various scientific fields.
This paper takes another look at a case study which has featured prominently in a variety of arguments for rival realist positions. After critically reviewing the previous commentaries on the theory shift that took place in the transition from Fresnel's ether to Maxwell's electromagnetic theory of optics, it will defend a slightly different reading of this historical case study. Central to this task is the notion of explanatory approximate truth, a concept which must be carefully analysed to begin with. With this notion properly understood, it will finally be argued, the popular Fresnel-Maxwell case study points towards a novel formulation of scientific realism.
The exercise of identifying lessons in the aftermath of a major public health emergency is of immense importance for the improvement of global public health emergency preparedness and response. Despite the persistence of the Ebola Virus Disease outbreak in West Africa, it seems that the Ebola ‘lessons learned’ exercise is now in full swing. On our assessment, a significant shortcoming plagues recent articulations of lessons learned, particularly among those emerging from organizational reflections. In this article we argue that, despite not being recognized as such, the vast majority of lessons proffered in this literature should be understood as ethical lessons stemming from moral failures. Any improvement in future global public health emergency preparedness and response depends in large part on acknowledging this fact and adjusting priorities, policies and practices accordingly, so that they align with values that better ensure these moral failures are not repeated and that new moral failures do not arise. We cannot continue to fiddle at the margins without critically reflecting on our repeated moral failings and committing ourselves to a set of values that engenders an approach to global public health emergencies that embodies a sense of solidarity and global justice.
Moral foundations theory chastises cognitive developmental theory for having foisted on moral psychology a restrictive conception of the moral domain which involves arbitrarily elevating the values of justice and caring. The account of this negative influence on moral psychology, referred to in the moral foundations theory literature as the ‘great narrowing’, involves several interrelated claims concerning the scope of the moral domain construct in cognitive moral developmentalism, the procedure by which it was initially elaborated, its empirical grounds and the influence of this conception of the moral domain on research in moral psychology. Examining these claims in light of key theoretical writings on the moral domain concept in cognitive moral developmentalism, the paper shows that the ‘great narrowing’ narrative is misinformed, superficial and historically inaccurate. On the basis of this critical analysis, we conclude that the primary heuristic value of the ‘great narrowing’ narrative is as a case lesson in the deep specificity of competing conceptions of the moral domain to the theoretical frameworks in which they are devised.