In this paper we consider whether Christopher Bartel has resolved the gamer’s dilemma. The gamer’s dilemma highlights a discrepancy in our moral judgements about the permissibility of performing certain actions in computer games. Many gamers have the intuition that virtual murder is permissible in computer games, whereas virtual paedophilia is not. Yet finding a relevant moral distinction to ground such intuitions can be difficult. Bartel suggests a relevant moral distinction may turn on the notion that virtual paedophilia harms women in a way that virtual murder does not. We argue that this distinction is only in a position to provide a partial solution to the dilemma.
Mathias Frisch provides the first sustained philosophical discussion of conceptual problems in classical particle-field theories. Part of the book focuses on the problem of a satisfactory equation of motion for charged particles interacting with electromagnetic fields. As Frisch shows, the standard equation of motion results in a mathematically inconsistent theory, yet there is no fully consistent and conceptually unproblematic alternative theory. Frisch describes in detail how the search for a fundamental equation of motion is partly driven by pragmatic considerations (like simplicity and mathematical tractability) that can override the aim for full consistency. The book also offers a comprehensive review and criticism of both the physical and philosophical literature on the temporal asymmetry exhibited by electromagnetic radiation fields, including Einstein's discussion of the asymmetry and Wheeler and Feynman's influential absorber theory of radiation. Frisch argues that attempts to derive the asymmetry from thermodynamic or cosmological considerations fail and proposes that we should understand the asymmetry as due to a fundamental causal constraint. The book's overarching philosophical thesis is that standard philosophical accounts that strictly identify scientific theories with a mathematical formalism and a mapping function specifying the theory's ontology are inadequate, since they permit neither inconsistent yet genuinely successful theories nor thick causal notions to be part of fundamental physics.
Much has been written on the role of causal notions and causal reasoning in the so-called 'special sciences' and in common sense. But does causal reasoning also play a role in physics? Mathias Frisch argues that, contrary to what influential philosophical arguments purport to show, the answer is yes. Time-asymmetric causal structures are as integral a part of the representational toolkit of physics as a theory's dynamical equations. Frisch develops his argument partly through a critique of anti-causal arguments and partly through a detailed examination of actual examples of causal notions in physics, including causal principles invoked in linear response theory and in representations of radiation phenomena. Offering a new perspective on the nature of scientific theories and causal reasoning, this book will be of interest to professional philosophers, graduate students, and anyone interested in the role of causal thinking in science.
We give topological characterizations of filters ${\cal F}$ on $\omega$ such that the Mathias forcing ${M_{\cal F}}$ adds no dominating reals or preserves ground model unbounded families. This allows us to answer some questions of Brendle, Guzmán, Hrušák, Martínez, Minami, and Tsaban.
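For orientation, the forcing at issue is standardly defined as follows (common notation, not necessarily the authors' own): a condition of the Mathias forcing ${M_{\cal F}}$ associated with a filter ${\cal F}$ on $\omega$ is a pair $(s, A)$ with $s \in [\omega]^{<\omega}$, $A \in {\cal F}$, and $\max(s) < \min(A)$; a condition $(t, B)$ extends $(s, A)$ iff $s \subseteq t$, $t \setminus s \subseteq A$, and $B \subseteq A$. A real $d \in \omega^{\omega}$ added by a forcing is dominating if every ground model $f \in \omega^{\omega}$ satisfies $f(n) \le d(n)$ for all but finitely many $n$; a family of reals is unbounded if no single real eventually dominates all of its members, and a forcing preserves a ground model unbounded family if that family remains unbounded in the extension.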
According to the Law of Non-Contradiction, no statement and its negation are jointly true. According to many critics, Christians cannot serve both the orthodox faith and the Law of Non-Contradiction: if they hold to the one they must despise the other. And according to an impressive number of these critics, Christians who cling to the traditional doctrine of the Trinity must despise the Law of Non-Contradiction. Augustine's statement of this doctrine poses the problem as poignantly as any.
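For reference, the law as standardly formalized: $\neg(\varphi \wedge \neg\varphi)$ for every statement $\varphi$; no statement is true together with its negation.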
Priority setting in health care is ubiquitous and health authorities are increasingly recognising the need for priority setting guidelines to ensure efficient, fair, and equitable resource allocation. While cost-effectiveness concerns seem to dominate many policies, the tension between utilitarian and deontological concerns is salient to many, and various severity criteria appear to fill this gap. Severity, then, must be subjected to rigorous ethical and philosophical analysis. Here we first give a brief history of the path to today’s severity criteria in Norway and Sweden. The Scandinavian perspective on severity might be conducive to the international discussion, given its long-standing use as a priority setting criterion, despite having reached rather different conclusions so far. We then argue that severity can be viewed as a multidimensional concept, drawing on accounts of need, urgency, fairness, duty to save lives, and human dignity. Such concerns will often be relative to local mores, and the weighting placed on the various dimensions cannot be expected to be fixed. Thirdly, we present what we think are the most pertinent questions to answer about severity in order to facilitate decision making in the coming years of increased scarcity, and to further the understanding of underlying assumptions and values that go into these decisions. We conclude that severity is poorly understood, and that the topic needs substantial further inquiry; thus we hope this article may set a challenging and important research agenda.
This essay reconstructs the place of utopia in realist political theory, by examining the ways in which the literary genre of critical utopias can productively unsettle ongoing discussions about “how to do political theory.” I start by analyzing two prominent accounts of the relationship between realism and utopia: “real utopia” and “dystopic liberalism”. Elaborating on Raymond Geuss’s recent reflections, the essay then claims that an engagement with literature can shift the focus of these accounts. Utopian fiction, I maintain, is useful for comprehending what is and for contemplating what might be. Ursula K. Le Guin’s novel The Dispossessed deploys this double function in an exemplary fashion: through her dynamic and open-ended portrayal of an Anarchist community, Le Guin succeeds in imagining a utopia that negates the status quo, without striving to construct a perfect society. The book’s radical, yet ambiguous, narrative hence reveals a strategy for locating utopia within realist political theory that moves beyond the positions dominating the current debate. Reading The Dispossessed ultimately demonstrates that realism without utopia is status quo–affirming, while utopia without realism is wishful thinking.
The visual brain consists of several parallel, functionally specialized processing systems, each having several stages (nodes) which terminate their tasks at different times; consequently, simultaneously presented attributes are perceived at the same time if processed at the same node and at different times if processed by different nodes. Clinical evidence shows that these processing systems can act fairly autonomously. Damage restricted to one system compromises specifically the perception of the attribute that that system is specialized for; damage to a given node of a processing system that leaves earlier nodes intact results in a degraded perceptual capacity for the relevant attribute, which is directly related to the physiological capacities of the cells left intact by the damage. By contrast, a system that is spared when all others are damaged can function more or less normally. Moreover, internally created visual percepts (illusions, afterimages, imagery, and hallucinations) activate specifically the nodes specialized for the attribute perceived. Finally, anatomical evidence shows that there is no final integrator station in the brain, one which receives input from all visual areas; instead, each node has multiple outputs and no node is recipient only. Taken together, the above evidence leads us to propose that each node of a processing-perceptual system creates its own microconsciousness. We propose that, if any binding occurs to give us our integrated image of the visual world, it must be a binding between microconsciousnesses generated at different nodes. Since any two microconsciousnesses generated at any two nodes can be bound together, perceptual integration is not hierarchical, but parallel and postconscious. By contrast, the neural machinery conferring properties on those cells whose activity has a conscious correlate is hierarchical, and we refer to it as generative binding, to distinguish it from the binding that might occur between the microconsciousnesses.
We study the Mathias–Prikry and Laver–Prikry forcings associated with filters on ω. We give a combinatorial characterization of Martin's number for these forcing notions and present a general scheme for analyzing preservation properties for them. In particular, we give a combinatorial characterization of those filters for which the Mathias–Prikry forcing does not add a dominating real.
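On the standard usage (the authors' notation may differ), the Martin number $\mathfrak{m}(P)$ of a forcing notion $P$ is the least cardinality of a family of dense subsets of $P$ such that no filter on $P$ meets every member of the family; genericity in the style of Martin's Axiom is thus guaranteed for any fewer than $\mathfrak{m}(P)$ dense sets. The Mathias–Prikry forcing is the filter-relativized Mathias forcing sketched above; the Laver–Prikry forcing is usually taken to be its tree analogue, in which, above the stem of a condition, the set of immediate successors of every node belongs to the filter.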
David Albert and Barry Loewer have argued that the temporal asymmetry of our concept of causal influence or control is grounded in the statistical mechanical assumption of a low-entropy past. In this paper I critically examine Albert's and Loewer's accounts.
The associative theory of creativity states that creativity is associated with differences in the structure of semantic memory, whereas the executive theory of creativity emphasises the role of top-down control for creative thought. For a powerful test of these accounts, individual semantic memory structure was modelled with a novel method based on semantic relatedness judgements, and different criteria for network filtering were compared. The executive account was supported by a correlation between creative ability and broad retrieval ability. The associative account was independently supported when network filtering was based on a relatedness threshold, but not when it was based on a fixed edge number or on the analysis of weighted networks. In the former case, creative ability was associated with shorter average path lengths and higher clustering of the network, suggesting that the semantic networks of creative people show higher small-worldness.
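The contrast between threshold-based and fixed-edge-number filtering can be made concrete with a short Python sketch (a hypothetical illustration, not the study's code or data): it builds an unweighted network from a symmetric relatedness matrix by keeping only edges at or above a relatedness threshold, then computes the two small-world ingredients mentioned above, average path length and clustering, with networkx. The word list, the random matrix, and the threshold of 0.75 are invented for illustration.

import itertools
import networkx as nx
import numpy as np

# Toy stand-ins for the cue words and the judged relatedness matrix.
rng = np.random.default_rng(0)
words = [f"w{i}" for i in range(30)]
relatedness = rng.random((30, 30))
relatedness = (relatedness + relatedness.T) / 2  # symmetrise the judgements

def threshold_network(matrix, labels, threshold):
    """Keep an unweighted edge wherever relatedness meets the threshold."""
    g = nx.Graph()
    g.add_nodes_from(labels)
    for i, j in itertools.combinations(range(len(labels)), 2):
        if matrix[i, j] >= threshold:
            g.add_edge(labels[i], labels[j])
    return g

g = threshold_network(relatedness, words, threshold=0.75)

# Average path length is only defined on a connected graph, so restrict to
# the largest connected component before averaging.
giant = g.subgraph(max(nx.connected_components(g), key=len))
print("average path length:", nx.average_shortest_path_length(giant))
print("average clustering:", nx.average_clustering(g))

A fixed-edge-number criterion would instead retain the k strongest edges regardless of their absolute relatedness, and a weighted analysis would keep all edges with their weights; this is why the three filtering choices can yield different path-length and clustering estimates for the same judgement data.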
The grounds of justice -- "Un pouvoir ordinaire": shared membership in a state as a ground of justice -- Internationalism versus statism and globalism: contemporary debates -- What follows from our common humanity?: the institutional stance, human rights, and nonrelationism -- Hugo Grotius revisited: collective ownership of the Earth and global public reason -- "Our sole habitation": a contemporary approach to collective ownership of the earth -- Toward a contingent derivation of human rights -- Proportionate use: immigration and original ownership of the Earth -- "But the earth abideth for ever": obligations to future generations -- Climate change and ownership of the atmosphere -- Human rights as membership rights in the global order -- Arguing for human rights: essential pharmaceuticals -- Arguing for human rights: labor rights as human rights -- Justice and trade -- The way we live now -- "Imagine there's no countries": a reply to John Lennon -- Justice and accountability: the state -- Justice and accountability: the World Trade Organization.
In the recent literature on concepts, two extreme positions concerning animal minds are predominant: one holding that animals possess neither concepts nor beliefs, the other that some animals possess concepts as well as beliefs. A characteristic feature of this controversy is the lack of consensus on the criteria for possessing a concept or having a belief. Addressing this deficit, we propose a new theory of concepts which takes recent case studies of complex animal behavior into account. The main aim of the paper is to present an epistemic theory of concepts and to defend a detailed theory of criteria for having concepts. The distinction between nonconceptual, conceptual, and propositional representations is inherent to this theory. Accordingly, it can be reasonably argued that some animals, e.g., grey parrots and apes, operate on conceptual representations.
In a recent extended essay, philosopher Daniel Hausman goes a long way towards dismissing severity as a morally relevant attribute in the context of priority setting in healthcare. In this response, we argue that although Hausman certainly points to real problems with how severity is often interpreted and operationalised within the priority setting context, the conclusion that severity does not contain plausible ethical content is too hasty. Rather than abandonment, our proposal is to take severity seriously by carefully mapping the possibly multiple underlying accounts to well-established ethical theories, in a way that is both morally defensible and aligned with the term’s colloquial uses.
Aratus has been notorious for his wordplay since the first decades of his reception. Hellenistic readers such as Callimachus, Leonidas, or ‘King Ptolemy’ seem to have picked up on the pun on the author's own name at Phaenomena 2, as well as on the famous λεπτή acrostic at Phaen. 783–6 that will be revisited here. Three carefully placed occurrences of the adjective have so far been uncovered in the passage, but for a full appreciation of its elegance we must note that Aratus has set his readers up to notice a fourth.
Philosophy emerged for the first time in ancient Greece and, according to Gilles Deleuze and Félix Guattari, it arose decisively with Plato through a triple transformation. Even today, the thought and creation of philosophy still require a triple transformation, despite the fact that the historical preconditions under which a philosopher pursues his or her task have changed since Greek antiquity. In this article, I introduce the concept of the triple transformation, which ensues from my examination of What is Philosophy?, the last jointly authored book of Deleuze and Guattari. Therein they define philosophy as the activity that consists of creating concepts in order to bring forth new events...
Many climate scientists have made claims that may suggest that evidence used in tuning or calibrating a climate model cannot be used to evaluate the model. By contrast, the philosophers Katie Steele and Charlotte Werndl have argued that, at least within the context of Bayesian confirmation theory, tuning is simply an instance of hypothesis testing. In this paper I argue for a weak predictivism and in support of a nuanced reading of climate scientists’ concerns about tuning: there are cases, model-tuning among them, in which predictive successes are more highly confirmatory of a model than accommodation of evidence.
Owing to its elliptical style, What is Philosophy? appears fragmentary and inscrutable, and its reception has been correspondingly contentious. Following an intimation by Gilles Deleuze himself, this article proposes that his final book, written in collaboration with Félix Guattari, contains a philosophy of nature. To address this proposition, the article begins by outlining the comprehensive system of nature set out in What is Philosophy?, defining it as an open system in motion that conjoins philosophy with its historical preconditions and intersects it with science and art. The article then addresses the precise method whereby the philosopher as an individual subject, emerging from nature, can succeed in becoming creative – that is, in creating concepts to bring forth new events. Finally, the brain turns out to be the pivot between the system and this method. What is Philosophy? thus presents an account of the brain based on a theory of the three specific planes of philosophy, science and art, and uses it to expand upon the idea of assemblage for a philosophy of nature.
This article focuses on the distinction between psychosocial types and conceptual personae advanced by Gilles Deleuze and Félix Guattari in What is Philosophy? The conceptual persona is the tool that a philosopher invents in order to create new concepts with which to bring forth new events. Although they present it as one of the three elements of philosophy, its nature and function, and above all its conjunctions with psychosocial types, have been overlooked by scholars. What is Philosophy? contains a list of character traits of which each conceptual persona is composed. The central argument of this article is that this list can well be regarded as a table of categories that enable the exercise and experience of philosophy’s creative thinking. Since the character traits of a conceptual persona match the characteristics of the given psychosocial types, it is necessary to keep inventing new conceptual personae, always starting from the historical presuppositions. The philosopher requires the conceptual persona to transfer his or her movements of thought to philosophy’s plane of immanence and thereby transform them in such a manner that philosophy can unfold as a creative power. It emerges as the subject of creative thinking at the same time as the concepts that subject creates, with which it coincides in the moment of creation. With the conceptual persona in What is Philosophy?, Deleuze and Guattari determine the one element of philosophy that makes transcendental empiricism a method of creation, one that operates precisely and yields convincing, transparent results.
According to a widespread view, which can be traced back to Russell’s famous attack on the notion of cause, causal notions have no legitimate role to play in how mature physical theories represent the world. In this paper I first critically examine a number of arguments for this view that center on the asymmetry of the causal relation and argue that none of them succeed. I then argue that embedding the dynamical models of a theory into richer causal structures can allow us to decide between models in cases where our observational data severely underdetermine our choice of dynamical models.
It is a widespread view that support for Fair Trade is called for, whereas agricultural subsidies are pegged as unjustifiable. Though one supports farmers in developing countries while the other does the same for those in already developed ones, there are, nonetheless, similarities between both scenarios. Both are economically 'inefficient', upholding production beyond what the market would sustain. In both cases, supportive arguments can assume two forms. First, such arguments might draw on normative claims made by producers. In the case of agricultural subsidies, farmers in developed countries assert claims against their fellow citizens, who ought to accept redistributive measures to keep them in business. In the case of Fair Trade, the claim can be made by farmers in developing nations against consumers, who ought to pay higher prices to keep them in business (under conditions deemed acceptable). Second, arguments to keep producers in business might be presented as the prerogative of both groups: even if farmers in developed countries did not have a claim to be kept in business, these countries would have the right to take measures to do so because they value their products. In the case of Fair Trade, even if farmers in developing nations had no claim against consumers, it is a consumer prerogative to pay more to keep them in business because they value their product or the process of producing it. There are, of course, differences between these scenarios as well, but in light of these parallels in the moral cases for subsidies and Fair Trade, it will be illuminating to examine the arguments for and against subsidies and Fair Trade together. Key words: trade; subsidies; fairness; markets; development.
In recent work on the foundations of statistical mechanics and the arrow of time, Barry Loewer and David Albert have developed a view that defends both a best system account of laws and a physicalist fundamentalism. I argue that there is a tension between their account of laws, which emphasizes the pragmatic element in assessing the relative strength of different deductive systems, and their reductivism or fundamentalism. If we take the pragmatic dimension in their account seriously, then the laws of the special sciences should be part of our best explanatory system of the world, as well.
We argue that the two temporal cognition systems are conceptually too confined to be helpful in understanding the evolution of temporal cognition. In fact, we doubt there are two systems. In relation to this, we point out that the authors did not describe the results of our planning study on ravens correctly, which is of consequence for their theory.
I examine Harvey Brown’s account of relativity as a dynamical and constructive theory and Michel Janssen’s recent criticism of it. By contrasting Einstein’s principle–constructive distinction with a related distinction by Lorentz, I argue that Einstein's distinction presents a false dichotomy. Appealing to Lorentz’s distinction, I argue that there is less of a disagreement between Brown and Janssen than appears initially and, hence, that Brown’s view presents less of a departure from orthodoxy than it may seem. Neither the kinematics–dynamics distinction nor Einstein’s principle- and constructive-theory distinction ultimately captures their disagreement, which may instead be a disagreement about the role of modality in science and the explanatory force of putatively nomic constraints.
This paper examines people's reasoning about identity continuity and its relation to previous research on how people value one-of-a-kind artifacts, such as artwork. We propose that judgments about the continuity of artworks are related to judgments about the continuity of individual persons because art objects are seen as physical extensions of their creators. We report a reanalysis of previous data and the results of two new empirical studies that test this hypothesis. The first study demonstrates that the mere categorization of an object as “art” versus “a tool” changes people's intuitions about the persistence of those objects over time. In a second study, we examine some conditions that may lead artworks to be thought of as different from other artifacts. These observations inform both current understanding of what makes some objects one-of-a-kind as well as broader questions regarding how people intuitively think about the persistence of human agents.
According to a view widely held among philosophers of science, the notion of cause has no legitimate role to play in mature theories of physics. In this paper I investigate the role of what physicists themselves identify as causal principles in the derivation of dispersion relations. I argue that this case study constitutes a counterexample to the popular view and that causal principles can function as genuine factual constraints. Outline: Introduction; Causality and Dispersion Relations; Norton's Skepticism; Conclusion.
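The sort of causal principle at stake here is exemplified by the Kramers–Kronig dispersion relations, which tie the real and imaginary parts of a linear response function together on the assumption that the response cannot precede the stimulus. In a standard textbook form (given here for orientation, not quoted from the paper), if $\chi(\omega)$ is the frequency-domain susceptibility of a causal linear medium, then $\mathrm{Re}\,\chi(\omega) = \frac{1}{\pi}\,\mathcal{P}\int_{-\infty}^{\infty} \frac{\mathrm{Im}\,\chi(\omega')}{\omega' - \omega}\, d\omega'$ and $\mathrm{Im}\,\chi(\omega) = -\frac{1}{\pi}\,\mathcal{P}\int_{-\infty}^{\infty} \frac{\mathrm{Re}\,\chi(\omega')}{\omega' - \omega}\, d\omega'$, where $\mathcal{P}$ denotes the Cauchy principal value; the derivation turns on the causal condition that the time-domain response vanishes for $t < 0$, which makes $\chi(\omega)$ analytic in the upper half of the complex frequency plane.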
I show that Albert Einstein’s distinction between principle and constructive theories was predated by Hendrik A. Lorentz’s equivalent distinction between mechanism- and principle-theories. I further argue that Lorentz’s views toward realism similarly prefigure what Arthur Fine identified as Einstein’s “motivational realism.”