There are two central questions concerning probability. First, what are its formal features? That is a mathematical question, to which there is a standard, widely (though not universally) agreed upon answer. This answer is reviewed in the next section. Second, what sorts of things are probabilities---what, that is, is the subject matter of probability theory? This is a philosophical question, and while the mathematical theory of probability certainly bears on it, the answer must come from elsewhere. To see why, observe that there are many things in the world that have the mathematical structure of probabilities---the set of measurable regions on the surface of a table, for example---but that one would never mistake for being probabilities. So probability is distinguished by more than just its formal characteristics. The bulk of this essay will be taken up with the central question of what this “more” might be.
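The table-top example can be made concrete with a minimal sketch: normalized areas of disjoint regions on a table satisfy Kolmogorov's axioms (non-negativity, unit total, finite additivity), even though no one would call them probabilities. The region names and areas below are invented purely for illustration.

```python
from fractions import Fraction

# Hypothetical table top, normalized to total area 1, divided into
# three disjoint measurable regions (areas chosen for illustration).
areas = {
    "left": Fraction(1, 2),
    "middle": Fraction(1, 3),
    "right": Fraction(1, 6),
}

# Kolmogorov's axioms, checked for this finite measure:
assert all(a >= 0 for a in areas.values())   # non-negativity
assert sum(areas.values()) == 1              # normalization
# finite additivity: the area of a union of disjoint regions
# equals the sum of their areas
assert areas["left"] + areas["middle"] == Fraction(5, 6)

print("normalized table areas satisfy the probability axioms")
```

The check is trivial, which is exactly the essay's point: having this formal structure is cheap, so something beyond the axioms must mark out genuine probabilities.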
Arguably, Hume's greatest single contribution to contemporary philosophy of science has been the problem of induction (1739). Before attempting its statement, we need to spend a few words identifying the subject matter of this corner of epistemology. At a first pass, induction concerns ampliative inferences drawn on the basis of evidence (presumably, evidence acquired more or less directly from experience)—that is, inferences whose conclusions are not (validly) entailed by the premises. Philosophers have historically drawn further distinctions, often appropriating the term “induction” to mark them; since we will not be concerned with the philosophical issues for which these distinctions are relevant, we will use the word “inductive” in a catch-all sense synonymous with “ampliative”. But we will follow the usual practice of choosing, as our paradigm example of inductive inferences, inferences about the future based on evidence drawn from the past and present. A further refinement is more important. Opinion typically comes in degrees, and this fact makes a great deal of difference to how we understand inductive inferences. For while it is often harmless to talk about the conclusions that can be rationally believed on the basis of some…
Critics have suggested that deliberative democracy reproduces inequalities of gender, race, and class by privileging calm rational discussion over passionate speech and action. Their solution is to supplement deliberation with such forms of emotional expression. Hall argues that deliberation already inherently involves passion, a point that is especially important to recognize in order to deconstruct the dichotomy between reason and passion that plays a central role in reinforcing inequalities of gender, race, and class in the first place.
The authors comment on several articles on addiction. Research suggests that addicted individuals have substantial impairments in cognitive control of behavior. The authors maintain that a proper study of addiction must include a neurobiological model of addiction to draw the attention of bioethicists and addiction neurobiologists. They also state that more addiction neuroscientists like S. E. Hyman are needed, as they understand the limits of their research. Authors: Adrian Carter and Wayne Hall, The University of Queensland, Brisbane, Australia.
Ronald L. Hall (Department of Philosophy, Stetson University, DeLand, FL, USA), “Editorial preface,” International Journal for Philosophy of Religion 70(2), pp. 1-2. DOI 10.1007/s11153-011-9321-6. Online ISSN 1572-8684, Print ISSN 0020-7047.
This book explores the making of health care rationing decisions through the analysis of three alternative decision makers: patients paying out of pocket; officials setting limits on treatments and coverage; and physicians at the bedside. Hall develops this analysis along three dimensions: political economics, ethics, and law. The economic dimension addresses the practical feasibility of each method. The ethical dimension discusses the moral aspects of these methods, while the legal dimension traces the most recent developments in jurisprudence and health law.
Hall, Gerard V.: The term interfaith dialogue may be relatively new and, in the minds of some, not the best term to describe the positive interaction between people of various religious, spiritual and cultural traditions. However, rather than get ourselves hijacked over the best choice of words, we need to acknowledge some fundamental realities. The first is that cultures, societies and religions have evolved in relationship with - and, too often, conflict between - one another. The second is that, even in the darkest moments of religious and cultural conflict, there are outstanding examples of individuals who stood against the tide of hatred, division and intolerance. Throughout history, there are also examples of entire multi-religious societies living in relative harmony and peace, sometimes for centuries. At some level, interfaith dialogue has always been with us - even if it was sometimes looked upon with suspicion.
In 1965, Armstrong and Head explored the problem of a pile-up of screw dislocations against a grain boundary. They used numerical methods to determine the positions of the dislocations in the pile-up and they were able to fit approximate formulae for the locations of the first and last dislocations. These formulae were used to gain insights into the Hall–Petch relationship. More recently, Voskoboinikov et al. used asymptotic techniques to study the equivalent problem of a pile-up of a large number of screw dislocations against a bimetallic interface. In this paper, we extend the work of Voskoboinikov et al. to construct systematic asymptotic expressions for the formulae proposed by Armstrong and Head. The further extension of these techniques to more general pile-ups is also outlined. As a result of this work, we show that a pile-up against a grain boundary can become equivalent to a pile-up against a locked dislocation in the case where the mismatch across the boundary is small.
Composed more than 2,000 years ago during a turbulent period of Chinese history, the Dao de jing set forth an alternative vision of reality in a world torn apart by violence and betrayal. Daoism, as this subtle but enduring philosophy came to be known, offers a comprehensive view of experience grounded in a full understanding of the wonders hidden in the ordinary. Now in this luminous new translation, based on the recently discovered ancient bamboo scrolls, China scholars Roger T. Ames and David L. Hall bring the timeless wisdom of the Dao de jing into our contemporary world. Though attributed to Laozi, “the Old Master,” the Dao de jing is, in fact, of unknown authorship and may well have originated in an oral tradition four hundred years before the time of Christ. Eschewing philosophical dogma, the Dao de jing set forth a series of maxims that outlined a new perspective on reality and invited readers to embark on a regimen of self-cultivation. In the Daoist world view, each particular element in our experience sends out an endless series of ripples throughout the cosmos. The unstated goal of the Dao de jing is self-transformation–the attainment of personal excellence that flows from the world and back into it. Responding to the teachings of Confucius, the Dao de jing revitalizes moral behavior by recommending a spontaneity made possible by the cultivated “habits” of the individual. In this elegant volume, Ames and Hall feature the original Chinese texts of the Dao de jing and translate them into crisp, chiseled English that reads like poetry. Each of the eighty-one brief chapters is followed by clear, thought-provoking commentary exploring the layers of meaning in the text. The book’s extensive introduction is a model of accessible scholarship in which Ames and Hall consider the origin of the text, place the emergence of Daoist philosophy in its historical and political context, and outline its central tenets.
The Dao de jing is a work of timeless wisdom and beauty, as vital today as it was in ancient China. This new version will stand as both a compelling introduction to the complexities of Daoist thought and as the classic modern English translation.
Can the physicalist consistently hold that representational content is all there is to sensory experience and yet that two perceivers could have inverted phenomenal spectra? Yes, if he holds that the phenomenal properties the inverts experience are dummy properties, instantiated neither in the physical objects being perceived nor in the perceivers.
The standard adaptationist explanation of the presence of a sensory mechanism in an organism--that it detects properties useful to the organism--cannot be given for color vision. This is because colors do not exist. After arguing for this latter claim, I consider, but reject, nonadaptationist explanations. I conclude by proposing an explanation of how color vision could have adaptive value even though it does not detect properties in the environment.
We give an analysis of the Monty Hall problem purely in terms of confirmation, without making any lottery assumptions about priors. Along the way, we show the Monty Hall problem is structurally identical to the Doomsday Argument.
American philosopher Everett W. Hall (1901-1960) was among the first epistemologists writing in English to have promoted “representationism,” a currently popular explanation of cognition. According to this school, there are no private sense-data or qualia, because the ascription (representation) of public properties that are exemplified in the world of common sense is believed to be sufficient to explain mental content. In this timely volume, Walter Horn, perhaps the foremost living expert on Hall’s philosophy, not only provides copious excerpts from Hall’s works in epistemology, metaphysics, and the philosophy of language--as well as his own commentaries on those writings--but also includes articles by Richard Rorty, Amie Thomasson, Thomas Natsoulas, and Romane Clark that are pertinent to Hall’s unique blend of linguistic idealism and intentional, common-sense realism. Covering metaphilosophy, the intentionality of perception, naïve realism, linguistic relativism, and Hall's public disagreements with such luminaries as Moore, Carnap, Wittgenstein, Quine, and Sellars, The Roots of Representationism is essential reading for students of 20th Century analytic philosophy.
This small book packs a considerable theoretical and practical punch. Alan Ware challenges much received wisdom about the dynamics of two-party politics. In the process, he adds considerably to contemporary discussion of the intersection of structure and agency in the development and adaptation of political systems. Ware picks out two-party systems for concentrated attention because of their relative tractability; in his words, these systems are ideal for analysing the capacity of parties to pursue their interests in the face both of other actors within the political system and of elements within the party itself.
“Science and mathematics: the scope and limits of mathematical fictionalism” (book symposium), Metascience, pp. 1-26. DOI 10.1007/s11016-011-9640-3. Contributors: Christopher Pincock (University of Missouri, Columbia, MO, USA), Alan Baker (Department of Philosophy, Swarthmore College, Swarthmore, PA, USA), Alexander Paseau (Wadham College, Oxford, UK), and Mary Leng (Department of Philosophy, University of York, UK). Online ISSN 1467-9981, Print ISSN 0815-0796.
A major voice in late twentieth-century philosophy, Alan Donagan is distinguished for his theories on the history of philosophy and the nature of morality. The Philosophical Papers of Alan Donagan, volumes 1 and 2, collect 28 of Donagan's most important and best-known essays on historical understanding and ethics from 1957 to 1991. Volume 2 addresses issues in the philosophy of action and moral theory. With papers on Kant, von Wright, Sellars, and Chisholm, this volume also covers a range of questions in applied ethics--from the morality of Truman's decision to drop atomic bombs on Hiroshima and Nagasaki to ethical questions in medicine and law.
Alan C. Love (Department of Philosophy, Minnesota Center for Philosophy of Science, University of Minnesota, Minneapolis, MN, USA), “The allure of perennial questions in biology: temporary excitement or substantive advance?” Metascience, pp. 1-4. DOI 10.1007/s11016-011-9533-5. Online ISSN 1467-9981, Print ISSN 0815-0796.
In this rich and impressive new book, Henry Somers-Hall gives a nuanced analysis of the philosophical relationship between G. W. F. Hegel and Gilles Deleuze. He convincingly shows that a serious study of Hegel provides an improved insight into Deleuze’s conception of pure difference as the transcendental condition of identity. Somers-Hall develops his argument in three steps. First, both Hegel and Deleuze formulate a critique of representation. Second, Hegel’s proposed alternative is as logically consistent as Deleuze’s. Third, Deleuze can account for evolution, whereas Hegel cannot.
In Baumann (American Philosophical Quarterly 42: 71–79, 2005) I argued that reflections on a variation of the Monty Hall problem throw a very general skeptical light on the idea of single-case probabilities. Levy (Synthese, forthcoming, 2007) puts forward some interesting objections which I answer here.
In his article, 'Gratuitous evil and divine providence', Alan Rhoda claims to have produced an uncontroversial theological premise for the evidential argument from evil. I argue that his premise is by no means uncontroversial among theists, and I doubt that any premise can be found that is both uncontroversial and useful for the argument from evil.
Alan Musgrave has been one of the most important philosophers of science in the last quarter of the 20th century. He has exemplified an exceptional combination of clear-headed and profound philosophical thinking. Two commitments seem to be the pillars of his thought: an uncompromising commitment to scientific realism and an equally uncompromising commitment to deductivism. The essays reprinted in this volume (which span a period of 25 years, from 1974 to 1999) testify to these two commitments. (There are two omissions from this collection: “Realism, Truth and Objectivity” in Realism and Anti-realism in the Philosophy of Science (1996, Kluwer) and “How to Do without Inductive Logic” (Science & Education vol. 8, 1999). I will make some references to these papers in what follows.) In the present review, instead of giving an orderly summary of the 16 papers of Essays, I discuss Musgrave’s two major commitments and raise some worries about their combination.
This paper concerns Alan Turing’s ideas about machines, mathematical methods of proof, and intelligence. By the late 1930s, Kurt Gödel and other logicians, including Turing himself, had shown that no finite set of rules could be used to generate all true mathematical statements. Yet according to Turing, there was no upper bound to the number of mathematical truths provable by intelligent human beings, for they could invent new rules and methods of proof. So, the output of a human mathematician, for Turing, was not a computable sequence (i.e., one that could be generated by a Turing machine). Since computers only contained a finite number of instructions (or programs), one might argue, they could not reproduce human intelligence. Turing called this the “mathematical objection” to his view that machines can think. Logico-mathematical reasons, stemming from his own work, helped to convince Turing that it should be possible to reproduce human intelligence, and eventually compete with it, by developing the appropriate kind of digital computer. He felt it should be possible to program a computer so that it could learn or discover new rules, overcoming the limitations imposed by the incompleteness and undecidability results in the same way that human mathematicians presumably do.
August 16, 1997. David Lewis has long defended an account of scientific law acceptable even to an empiricist with significant metaphysical scruples. On this account, the laws are defined to be the consequences of the best system for axiomatizing all occurrent fact. Here "best system" means the set of sentences which yields the best combination of strength of descriptive content with simplicity of exposition. And occurrent facts, the facts to be systematized, are roughly the particular facts about a localized space-time region that are non-modal, non-dispositional, and non-causal. Scientists providing or attempting to provide laws are plausibly seen as giving general principles that unify a body of data. Thus they organize or systematize the arrangement of occurrences. For this reason, Lewis's account has the important merits of providing contact with actual scientific practice while making sense of the standard philosophical conception that laws should be general but more than mere accidental generalizations. However, Lewis has long known about a potential problem with this account, a problem involving chance and credence. In a recent series of articles he, Michael Thau, and Ned Hall have developed a new formulation of the relationship between chance and credence which solves the problem. However, I will argue that these articles leave untouched and even exacerbate a closely related and more fundamental problem with the best system account, the problem of nomic necessity. Laws are supposed to be more than true; in some sense they must be true. Yet a principle's membership in the best systematization for one world seems to say nothing about its necessity, i.e., its truth at other worlds. I close by briefly describing how an alternative empiricist account may remove both problems.
Alan Shewmon's article, “The brain and somatic integration: Insights into the standard biological rationale for equating brain death with death” (2001), strikes at the heart of the standard justification for whole brain death criteria. The standard justification, which I call the standard paradigm, holds that the permanent loss of the functions of the entire brain marks the end of the integrative unity of the body. In my response to Shewmon's article, I first offer a brief summary of the standard paradigm and cite recent work by advocates of whole brain criteria who tenaciously cling to the standard paradigm despite increasing evidence showing that it has significant weaknesses. Second, I address Shewmon's case against the standard paradigm, arguing that he is successful in showing that whole brain dead patients have integrated organic unity. Finally, I discuss some minor problems with Shewmon's article, along with suggestions for further elaboration.
The application of probabilistic arguments to rational decisions in a single case is a contentious philosophical issue which arises in various contexts. Some authors (e.g. Horgan, Philos Pap 24:209–222, 1995; Levy, Synthese 158:139–151, 2007) affirm the normative force of probabilistic arguments in single cases while others (Baumann, Am Philos Q 42:71–79, 2005; Synthese 162:265–273, 2008) deny it. I demonstrate that neither side gives convincing arguments for its case and propose a new account of the relationship between probabilistic reasoning and rational decisions. In particular, I elaborate a flaw in Baumann’s reductio of rational single-case decisions in a modified Monty Hall Problem.
Peter Baumann uses the Monty Hall game to demonstrate that probabilities cannot be meaningfully applied to individual games. Baumann draws from this first conclusion a second: in a single game, it is not necessarily rational to switch from the door that I have initially chosen to the door that Monty Hall did not open. After challenging Baumann’s particular arguments for these conclusions, I argue that there is a deeper problem with his position: it rests on the false assumption that what justifies the switching strategy is its leading me to win a greater percentage of the time. In fact, what justifies the switching strategy is not any statistical result over the long run but rather the “causal structure” intrinsic to each individual game itself. Finally, I argue that an argument by Hilary Putnam will not help to save Baumann’s second conclusion above.
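The long-run statistical fact that frames this whole exchange — that switching wins roughly two thirds of repeated Monty Hall games — is easy to check by simulation. The sketch below is illustrative only: it models the standard protocol (Monty always opens an unchosen door hiding a goat), the seed and trial count are arbitrary choices, and the simulation of course takes no side on whether such long-run frequencies bear on a single-case decision.

```python
import random

def play(switch: bool, rng: random.Random) -> bool:
    """Play one Monty Hall game under the standard protocol; True if the player wins."""
    doors = [0, 1, 2]
    car = rng.choice(doors)
    pick = rng.choice(doors)
    # Monty opens a door that is neither the player's pick nor the car.
    monty = rng.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != monty)
    return pick == car

rng = random.Random(0)  # fixed seed, chosen arbitrarily for reproducibility
n = 100_000
wins_switch = sum(play(True, rng) for _ in range(n)) / n
wins_stay = sum(play(False, rng) for _ in range(n)) / n
print(f"switch: {wins_switch:.3f}, stay: {wins_stay:.3f}")  # ≈ 2/3 vs ≈ 1/3
```

The simulation only reports frequencies over many games; whether those frequencies, or instead the causal structure of each individual game, justify switching in a single play is precisely what the papers above dispute.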
Alan Gewirth's Reason and Morality, in which he set forth the Principle of Generic Consistency, is a major work of modern ethical theory that, though much debated and highly respected, has yet to gain full acceptance. Deryck Beyleveld contends that this resistance stems from misunderstanding of the method and logical operations of Gewirth's central argument. In this book Beyleveld seeks to remedy this deficiency. His rigorous reconstruction of Gewirth's argument gives its various parts their most compelling formulation and clarifies its essential logical structure. Beyleveld then classifies all the criticisms that Gewirth's argument has received and measures them against his reconstruction of the argument. The overall result is an immensely rich picture of the argument, in which all of its complex issues and key moves are clearly displayed and its validity can finally be discerned. The comprehensiveness of Beyleveld's treatment provides ready access to the entire debate surrounding the foundational argument of Reason and Morality. It will be required reading for all who are interested in Gewirth's theory and deontological ethics and will be of central importance to moral and legal theorists.
As is well known, Alan Turing drew a line, embodied in the "Turing test," between intellectual and physical abilities, and hence between cognitive and natural sciences. Less familiarly, he proposed that one way to produce a "passer" would be to educate a "child machine," equating the experimenter's improvements in the initial structure of the child machine with genetic mutations, while supposing that the experimenter might achieve improvements more expeditiously than natural selection. On the other hand, in his foundational "On the chemical basis of morphogenesis," Turing insisted that biological explanation clearly confine itself to purely physical and chemical means, eschewing vitalist and teleological talk entirely and hewing to D'Arcy Thompson's line that "evolutionary 'explanations'" are historical and narrative in character, employing the same intentional and teleological vocabulary we use in doing human history, and hence, while perhaps on occasion of heuristic value, are not part of biology as a natural science. To apply Turing's program to recent issues, the attempt to give foundations to the social and cognitive sciences in the "real science" of evolutionary biology (as opposed to Turing's biology) is neither to give foundations, nor to achieve the unification of the social/cognitive sciences and the natural sciences.