Unger has recently argued that if you are the only thinking and experiencing subject in your chair, then you are not a material object. This leads Unger to endorse a version of Substance Dualism according to which we are immaterial souls. This paper argues that this is an overreaction. We argue that the specifically Dualist elements of Unger’s view play no role in his response to the problem; only the view’s structure is required, and that is available to Unger’s opponents. We outline one such non-Dualist view, suggest how to resolve the dispute, respond to some objections, and argue that ours is but one of many views that survive Unger’s challenge. All these views are incompatible with microphysicalism. So Unger’s discussion does contain an insight: if you are the only conscious subject in your chair, then microphysicalism is false. Unger’s mistake was to infer Substance Dualism from this; for microphysicalism is not the only alternative to Dualism.
Idealizing conditions are scapegoats for scientific hypotheses, too often blamed for falsehood better attributed to less obvious sources. But while the tendency to blame idealizations is common among both philosophers of science and scientists themselves, the blame is misplaced. Attention to the nature of idealizing conditions, the content of idealized hypotheses, and scientists’ attitudes toward those hypotheses shows that idealizing conditions are blameless when hypotheses misrepresent. These conditions help to determine the content of idealized hypotheses, and they do so in a way that prevents those hypotheses from being false by virtue of their constituent idealizations.
In 1952, Heinrich Scholz published a question in The Journal of Symbolic Logic asking for a characterization of spectra, i.e., sets of natural numbers that are the cardinalities of finite models of first-order sentences. Günter Asser in turn asked whether the complement of a spectrum is always a spectrum. These innocent questions turned out to be seminal for the development of finite model theory and descriptive complexity. In this paper we survey developments over the last 50-odd years pertaining to the spectrum problem. Our presentation follows conceptual developments rather than chronological order. Originally a number-theoretic problem, it has been approached by means of recursion theory, resource-bounded complexity theory, classification by complexity of the defining sentences, and finally by means of structural graph theory. Although Scholz's question was answered in various ways, Asser's question remains open.
Using as case studies two early diagrams that represent mechanisms of the cell division cycle, we aim to extend prior philosophical analyses of the roles of diagrams in scientific reasoning, and specifically their role in biological reasoning. The diagrams we discuss are, in practice, integral and indispensable elements of reasoning from experimental data about the cell division cycle to mathematical models of the cycle’s molecular mechanisms. In accordance with prior analyses, the diagrams provide functional explanations of the cell cycle and facilitate the construction of mathematical models of the cell cycle. But, extending beyond those analyses, we show how diagrams facilitate the construction of mathematical models, and we argue that the diagrams permit nomological explanations of the cell cycle. We further argue that what makes diagrams integral and indispensable for explanation and model construction is their nature as locality aids: they group together information that is to be used together in a way that sentential representations do not.
According to certain dispositional accounts of meaning, an agent's meaning is determined by the dispositions that an idealized version of this agent has in optimal conditions. We argue that such attempts cannot properly fix meaning. For even if there is a way to determine which features of an agent should be idealized without appealing to what the agent means, there is no non-circular way to determine how those features should be idealized. We sketch an alternative dispositional account that avoids this problem, according to which an agent's meaning is determined by the dispositions that an abstract version of this agent has in optimal conditions.
Central to discussion of supervaluationist accounts of vagueness is the extent to which they require revisions of classical logic and, if so, whether those revisions are objectionable. In an important recent Journal of Philosophy article, J.R.G. Williams presents a powerful challenge to the orthodox view that supervaluationism is objectionably revisionary. Williams argues both that supervaluationism is non-revisionary and that, even if it were revisionary, those revisions would be unobjectionable. This note shows that his arguments for both claims fail.
This paper elaborates upon various responses to the Problem of the One over the Many, in the service of two central goals. The first is to situate Huayan's mereology within the context of Buddhism's historical development, showing its continuity with a broader tradition of philosophizing about part-whole relations. The second goal is to highlight the way in which Huayan's mereology combines the virtues of the Nyāya-Vaisheshika and Indian Buddhist solutions to the Problem of the One over the Many while avoiding their vices.
This is an attempt to explain, in terms familiar from contemporary ways of thinking about mereology, why someone might accept some prima facie puzzling remarks by Fazang, such as his claims that the eye of a lion is its ear and that a rafter of a building is identical to the building itself. These claims are corollaries of the Huayan Buddhist thesis that everything is part of everything else, and the aim here is to show that there is a rational basis for this thesis that involves a nonstandard notion of parthood and, importantly, that does not violate the principle of noncontradiction.
In his _Treatise on the Golden Lion_, Fazang says that wholes are _in_ each of their parts and that each part of a whole _is_ every other part of the whole. In this paper, I offer an interpretation of these remarks according to which they are not obviously false, and I use this interpretation in order to rigorously reconstruct Fazang's arguments for his claims. On the interpretation I favor, Fazang means that the presence of a whole's part suffices for the presence of the whole and that the presence of any such part is both necessary and sufficient for the presence of any other part. I also argue that this interpretation is more plausible than its extant competitors.
General Relativity and the Standard Model are often touted as the most rigorously and extensively confirmed scientific hypotheses of all time. Nonetheless, these theories appear to have consequences that are inconsistent with evidence about phenomena for which, respectively, quantum effects and gravity matter. This paper suggests an explanation for why the theories are not disconfirmed by such evidence. The key to this explanation is an approach to scientific hypotheses that allows their actual content to differ from their apparent content. This approach does not appeal to ceteris-paribus qualifiers or counterfactuals or similarity relations. And it helps to explain why some highly idealized hypotheses are not treated in the way that a thoroughly refuted theory is treated but instead as hypotheses with limited domains of applicability.
The ontologies of scientific theories include a variety of objects: point-mass particles, rigid rods, frictionless planes, flat and curved spacetimes, perfectly spherical planets, continuous fluids, ideal gases, nonidentical but indistinguishable electrons, atoms, quarks and gluons, strong and weak nuclear forces, ideally rational agents, and so on. But the scientific community currently regards only some of these objects as real. According to Paul Teller, a group sometimes can be justified in regarding competing ontologies as real, and the ontologies we are justified in regarding as real are inexact, because the theories that give those ontologies characterize what things are like rather than what they are. In this paper, I argue that Teller's view is incomplete and suggest that one way to remove this incompleteness is to adopt a criterion for when we are justified in regarding a theory's ontology as real that is based upon a theory's comparative degree of confirmation. I argue that this criterion is prima facie plausible and that Teller's view is false if this criterion is correct.
In this reply to Gregory Peterson's essay "Maintaining Respectability," which itself is a response to my "Is Theology Respectable as Metaphysics?", I elaborate upon my claims that theology treats God's existence as an absolute certainty immune to refutation and that modern science constitutes the canons of respectable reasoning for metaphysical disciplines. I conclude with some comments on Peterson's "In Praise of Folly? Theology and the University."
Theology involves inquiry into God's nature, God's purposes, and whether certain experiences or pronouncements come from God. These inquiries are metaphysical, part of theology's concern with the veridicality of signs and realities that are independent from humans. Several research programs concerned with the relation between theology and science aim to secure theology's intellectual standing as a metaphysical discipline by showing that it satisfies criteria that make modern science reputable, on the grounds that modern science embodies contemporary canons of respectability for metaphysical disciplines. But no matter the ways in which theology qua metaphysics is shown to resemble modern science, these research programs seem destined for failure. For, given the currently dominant approaches to understanding modern scientific epistemology, theological reasoning is crucially dissimilar to modern scientific reasoning in that it treats the existence of God as a certainty immune to refutation. Barring the development of an epistemology of modern science that is amenable to theology, theology as metaphysics is intellectually disreputable.
The activities of the life sciences are essential to provide solutions for the future, for both individuals and society. Society has demanded growing accountability from the scientific community as implications of life science research rise in influence and there are concerns about the credibility, integrity and motives of science. While the scientific community has responded to concerns about its integrity in part by initiating training in research integrity and the responsible conduct of research, this approach is minimal. The scientific community justifies itself by appealing to the ethos of science, claiming academic freedom, self-direction, and self-regulation, but no comprehensive codification of this foundational ethos has been forthcoming. A review of the professional norms of science and a prototype code of ethics for the life sciences provide a framework to spur discussions within the scientific community to define scientific professionalism. A formalization of implicit principles can provide guidance for recognizing divergence from the norms, place these norms within a context that would enhance education of trainees, and provide a framework for discussing externally and internally applied pressures that are influencing the practice of science. The prototype code articulates the goal for life sciences research and the responsibilities associated with the freedom of exploration, the principles for the practice of science, and the virtues of the scientists themselves. The time is ripe for scientific communities to reinvigorate professionalism and define the basis of their social contract. Codifying the basis of the social contract between science and society will sustain public trust in the scientific enterprise.
In "Bayesian Confirmation of Theories that Incorporate Idealizations", Michael Shaffer argues that, in order to show how idealized hypotheses can be confirmed, Bayesians must develop a coherent proposal for how to assign prior probabilities to counterfactual conditionals. This paper develops a Bayesian reply to Shaffer's challenge that avoids the issue of how to assign prior probabilities to counterfactuals by treating idealized hypotheses as abstract descriptions. The reply allows Bayesians to assign non-zero degrees of confirmation to idealized hypotheses and to capture the intuition that less idealized hypotheses tend to be better confirmed than their more idealized counterparts.
This document is a synopsis of discussions at the workshop prepared by Nicholaos Jones and Kevin Coffey, with remarks added by Chuang Liu, John D. Norton, John Earman, Gordon Belot, Mark Wilson, Bob Batterman and Margie Morrison. The program is included in an appendix.
Can contradictions be meaningful? How can one assert 'P soku not-P' or 'P and yet not-P' without sacrificing intelligibility? Expanding on previous attempts, mainly by Dilworth and Heisig, to demystify the soku connective, a formal system is presented here for the logic of soku. Through a formal distinction between internal and external negation, grammatical features of the soku connective are shown to be logically irrelevant, and the principle of non-contradiction is preserved. Disparities with traditional logic are noted, with a focus on negation rather than 'soku'. The formal examination of the logic of soku is intended to present the logic in a way acceptable to more analytically minded philosophers and thereby enhance East-West and Japanese-Anglo-American interaction and criticism.