In recent papers, Zurek [(2005). Probabilities from entanglement, Born's rule p_k = |ψ_k|² from envariance. Physical Review A, 71, 052105] has objected to the decision-theoretic approach of Deutsch [(1999). Quantum theory of probability and decisions. Proceedings of the Royal Society of London A, 455, 3129–3137] and Wallace [(2003). Everettian rationality: defending Deutsch's approach to probability in the Everett interpretation. Studies in History and Philosophy of Modern Physics, 34, 415–438] to deriving the Born rule for quantum probabilities on the grounds that it courts circularity. Deutsch and Wallace assume that the many-worlds theory is true and that decoherence gives rise to a preferred basis. However, decoherence arguments use the reduced density matrix, which relies upon the partial trace and hence upon the Born rule for its validity. Using the Heisenberg picture and quantum Darwinism (the notion, pioneered in Ollivier et al. [(2004). Objective properties from subjective quantum states: Environment as a witness. Physical Review Letters, 93, 220401 and (2005). Environment as a witness: Selective proliferation of information and emergence of objectivity in a quantum universe. Physical Review A, 72, 042113], that classical information is quantum information that can proliferate in the environment), I show that measurement interactions between two systems only create correlations between a specific set of commuting observables of system 1 and a specific set of commuting observables of system 2. This argument picks out a unique basis in which information flows in the correlations between those sets of commuting observables. I then derive the Born rule for both pure and mixed states and answer some other criticisms of the decision-theoretic approach to quantum probability.
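For orientation, the Born rule at issue can be stated in standard textbook notation (this is the standard formulation, not a step of the paper's own derivation):

```latex
% Expand a pure state in the measurement basis \{|k\rangle\}:
%   |\psi\rangle = \sum_k c_k |k\rangle, \qquad c_k = \langle k|\psi\rangle .
% The Born rule assigns to outcome k the probability
p_k = |\langle k|\psi\rangle|^2 = |c_k|^2 ,
% and, for a mixed state with density operator \rho and projector
% P_k = |k\rangle\langle k| , the equivalent trace form
p_k = \operatorname{Tr}(\rho\, P_k) .
```

The circularity worry above concerns the trace form: the partial trace used in decoherence arguments presupposes exactly this rule.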
Alan Turing is known both for his mathematical creativity and genius and his role in wartime cryptography, and for his homosexuality, for which he was persecuted. Yet there is little work that brings these two parts of his life together. This paper deconstructs and moves beyond the extant stereotypes around perceived associations between gay men and creativity, to consider how Turing's lived experience as a queer mathematician provides a rich seam of insight into the ways in which his life, relationships, and working environment shaped his work.
In this interview, the prestigious anthropologist, historian and T.V. announcer, Alan Macfarlane comments on some of the issues that have been addressed in his writings. His main theoretical concern has been to study the peculiar conditions that gave rise to the modern…
This small book packs a considerable theoretical and practical punch. Alan Ware challenges much received wisdom about the dynamics of two-party politics. In the process, he adds considerably to contemporary discussion of the intersection of structure and agency in the development and adaptation of political systems. Ware picks out two-party systems for concentrated attention because of their relative tractability: in his words, these systems are ideal for analysing the capacity of parties to pursue their interests in the face both of other actors within the political system and of elements within the party itself.
Alan Turing's pioneering work on computability, and his ideas on morphological computing, support Andrew Hodges' view of Turing as a natural philosopher. Turing's natural philosophy differs importantly from Galileo's view that the book of nature is written in the language of mathematics (The Assayer, 1623). Computing is more than a language used to describe nature, as computation produces real-time physical behaviors. This article presents the framework of Natural info-computationalism as a contemporary natural philosophy that builds on the legacy of Turing's computationalism. The use of info-computational conceptualizations, models and tools makes possible, for the first time in history, the modeling of complex self-organizing adaptive systems, including basic characteristics and functions of living systems, intelligence, and cognition.
A major voice in late twentieth-century philosophy, Alan Donagan is distinguished for his theories on the history of philosophy and the nature of morality. The Philosophical Papers of Alan Donagan, volumes 1 and 2, collect 28 of Donagan's most important and best-known essays on historical understanding and ethics from 1957 to 1991. Volume 2 addresses issues in the philosophy of action and moral theory. With papers on Kant, von Wright, Sellars, and Chisholm, this volume also covers a range of questions in applied ethics--from the morality of Truman's decision to drop atomic bombs on Hiroshima and Nagasaki to ethical questions in medicine and law.
In a recent article in this journal, Alan Thomas presents a novel defence of what I call ‘Rawlsian Institutionalism about Justice’ against G. A. Cohen’s well-known critique. In this response I aim to defend Cohen’s rejection of Institutionalism against Thomas’s arguments. In part this defence requires clarifying precisely what is at issue between Institutionalists and their opponents. My primary focus, however, is on Thomas’s critical discussion of Cohen’s endorsement of an ethical prerogative, as well as his appeal to the institutional framework of a ‘property-owning democracy’ in his elaboration of the precise institutional requirements of Rawlsian Institutionalist justice, and his related claim that Cohen’s rejection of Institutionalism involves an objectionable ‘double counting’ of the demands of justice. I argue that once we are clear about both the kind of justification that can be given for a prerogative within a plausible ethical theory, and about the key points of departure between Institutionalist views and their rivals, Cohen’s rejection of Institutionalism appears well-motivated, and Thomas’s claim that his view is guilty of double counting the demands of justice can be seen to be mistaken.
Alan Gewirth's Reason and Morality, in which he set forth the Principle of Generic Consistency, is a major work of modern ethical theory that, though much debated and highly respected, has yet to gain full acceptance. Deryck Beyleveld contends that this resistance stems from misunderstanding of the method and logical operations of Gewirth's central argument. In this book Beyleveld seeks to remedy this deficiency. His rigorous reconstruction of Gewirth's argument gives its various parts their most compelling formulation and clarifies its essential logical structure. Beyleveld then classifies all the criticisms that Gewirth's argument has received and measures them against his reconstruction of the argument. The overall result is an immensely rich picture of the argument, in which all of its complex issues and key moves are clearly displayed and its validity can finally be discerned. The comprehensiveness of Beyleveld's treatment provides ready access to the entire debate surrounding the foundational argument of Reason and Morality. It will be required reading for all who are interested in Gewirth's theory and deontological ethics and will be of central importance to moral and legal theorists.
This paper concerns Alan Turing’s ideas about machines, mathematical methods of proof, and intelligence. By the late 1930s, Kurt Gödel and other logicians, including Turing himself, had shown that no finite set of rules could be used to generate all true mathematical statements. Yet according to Turing, there was no upper bound to the number of mathematical truths provable by intelligent human beings, for they could invent new rules and methods of proof. So, the output of a human mathematician, for Turing, was not a computable sequence (i.e., one that could be generated by a Turing machine). Since computers only contained a finite number of instructions (or programs), one might argue, they could not reproduce human intelligence. Turing called this the “mathematical objection” to his view that machines can think. Logico-mathematical reasons, stemming from his own work, helped to convince Turing that it should be possible to reproduce human intelligence, and eventually compete with it, by developing the appropriate kind of digital computer. He felt it should be possible to program a computer so that it could learn or discover new rules, overcoming the limitations imposed by the incompleteness and undecidability results in the same way that human mathematicians presumably do.
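The notion of a computable sequence mentioned above can be made concrete with a minimal sketch: a tiny Turing-machine simulator running (a variant of) the first example machine from "On Computable Numbers", which prints the sequence 0 1 0 1 … The rule encoding below is illustrative, not Turing's original notation.

```python
# A computable sequence is one a Turing machine can generate. This minimal
# simulator runs a machine over a sparse tape; rules map (state, symbol)
# to (symbol-to-write-or-None, head move, next state).

def run_tm(rules, start_state, steps):
    tape = {}                      # sparse tape: position -> written symbol
    pos, state = 0, start_state
    for _ in range(steps):
        symbol = tape.get(pos)     # None stands for a blank square
        write, move, state = rules[(state, symbol)]
        if write is not None:
            tape[pos] = write
        pos += {"R": 1, "L": -1, "N": 0}[move]
    return [tape[p] for p in sorted(tape)]

# States b, c, e, f of Turing's first example machine: print 0, move right
# twice, print 1, move right twice, repeat (blanks are left between digits).
rules = {
    ("b", None): ("0", "R", "c"),
    ("c", None): (None, "R", "e"),
    ("e", None): ("1", "R", "f"),
    ("f", None): (None, "R", "b"),
}
print(run_tm(rules, "b", 8))  # → ['0', '1', '0', '1']
```

Turing's point, as the abstract notes, is that human mathematicians seem able to go beyond any single fixed rule table of this kind by inventing new rules and methods of proof.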
In his article, ‘Gratuitous evil and divine providence’, Alan Rhoda claims to have produced an uncontroversial theological premise for the evidential argument from evil. I argue that his premise is by no means uncontroversial among theists, and I doubt that any premise can be found that is both uncontroversial and useful for the argument from evil.
Discussions of the evidential argument from evil generally pay little attention to how different models of divine providence constrain the theist's options for response. After describing four models of providence and general theistic strategies for engaging the evidential argument, I articulate and defend a definition of ‘gratuitous evil’ that renders the theological premise of the argument uncontroversial for theists. This forces theists to focus their fire on the evidential premise, enabling us to compare models of providence with respect to how plausibly they can resist it. I then assess the four models, concluding that theists are better off vis-à-vis the evidential argument if they reject meticulous providence.
Review Essay: Exemplary Stories: On the Uses of Biography in Recent Sociology: Alan Sica and Stephen Turner, The Disobedient Generation: Social Theorists in the Sixties; Mathieu Deflem, Sociologists in a Global Age: Biographical Perspectives; and Anthony Elliott and Charles Lemert, The New Individualism: The Emotional Costs of Globalization.
D. Alan Shewmon has advanced a well-documented challenge to the widely accepted total brain death criterion for death of the human being. We show that Shewmon's argument against this criterion is unsound, though he does refute the standard argument for that criterion. We advance a distinct argument for the total brain death criterion and answer likely objections. Since human beings are rational animals – sentient organisms of a specific type – the loss of the radical capacity for sentience involves a substantial change, the passing away of the human organism. In human beings total brain death involves the complete loss of the radical capacity for sentience, and so in human beings total brain death is death.
In this book Alan Haworth tends to sneer at libertarians. However, there are, I believe, a few sound criticisms. I have always held similar opinions of Murray Rothbard's and Friedrich Hayek's definitions of liberty and coercion, Robert Nozick's account of natural rights, and Hayek's spontaneous-order arguments. I urge believers of these positions to read Haworth. But I don't personally know many libertarians who believe them (or who regard Hayek as a libertarian).
As is well known, Alan Turing drew a line, embodied in the "Turing test," between intellectual and physical abilities, and hence between cognitive and natural sciences. Less familiarly, he proposed that one way to produce a "passer" would be to educate a "child machine," equating the experimenter's improvements in the initial structure of the child machine with genetic mutations, while supposing that the experimenter might achieve improvements more expeditiously than natural selection. On the other hand, in his foundational "On the chemical basis of morphogenesis," Turing insisted that biological explanation confine itself strictly to purely physical and chemical means, eschewing vitalist and teleological talk entirely and hewing to D'Arcy Thompson's line that "evolutionary 'explanations'" are historical and narrative in character, employing the same intentional and teleological vocabulary we use in doing human history, and hence, while perhaps on occasion of heuristic value, are not part of biology as a natural science. To apply Turing's program to recent issues, the attempt to give foundations to the social and cognitive sciences in the "real science" of evolutionary biology (as opposed to Turing's biology) is neither to give foundations, nor to achieve the unification of the social/cognitive sciences and the natural sciences.
C. J. Mews - Logic, Theology, and Poetry in Boethius, Abelard, and Alan of Lille: Words in the Absence of Things - Journal of the History of Philosophy 45.2: 327-328. Reviewed by Constant J. Mews, Monash University. Eileen C. Sweeney. Logic, Theology, and Poetry in Boethius, Abelard, and Alan of Lille: Words in the Absence of Things. The New Middle Ages. London: Palgrave MacMillan, 2006. Pp. xii + 248. Cloth, $65.00. Boethius, Abelard, and Alan of Lille all crystallized their thoughts in poetry as much as in prose. In seeking to condense their achievement into a slim monograph, Sweeney synthesizes much thought into relatively few pages. Her ambition pays off. Her opening chapter on Boethius…
Alan Shewmon's article, "The brain and somatic integration: Insights into the standard biological rationale for equating brain death with death" (2001), strikes at the heart of the standard justification for whole brain death criteria. The standard justification, which I call the standard paradigm, holds that the permanent loss of the functions of the entire brain marks the end of the integrative unity of the body. In my response to Shewmon's article, I first offer a brief summary of the standard paradigm and cite recent work by advocates of whole brain criteria who tenaciously cling to the standard paradigm despite increasing evidence showing that it has significant weaknesses. Second, I address Shewmon's case against the standard paradigm, arguing that he is successful in showing that whole brain dead patients have integrated organic unity. Finally, I discuss some minor problems with Shewmon's article, along with suggestions for further elaboration.
The origin of my article lies in the appearance of Copeland and Proudfoot's feature article in Scientific American, April 1999. This preposterous paper, as described on another page, suggested that Turing was the prophet of 'hypercomputation'. In their references, the authors listed Copeland's entry on 'The Church-Turing thesis' in the Stanford Encyclopedia. In the summer of 1999, I circulated an open letter criticising the Scientific American article. I included criticism of this Encyclopedia entry. This was forwarded to Prof. Ed Zalta, editor of the Encyclopedia, and after some discussion he invited me to submit an entry on 'Alan Turing'.
Alan Carter's recent review in Mind of my Ethics of the Global Environment combines praise of biocentric consequentialism with criticisms that it could advocate both minimal satisfaction of human needs and the extinction of for the sake of generating extra people; Carter also maintains that as a monistic theory it is predictably inadequate to cover the full range of ethical issues, since only a pluralistic theory has this capacity. In this reply, I explain how the counter-intuitive implications of biocentric consequentialism suggested by Carter are not implications, and argue that since pluralistic theories either generate contradictions or collapse into monistic theories, the superiority of pluralistic theories is far from predictable. Thus Carter's criticisms fail to undermine biocentric consequentialism as a normative theory applicable to the generality of ethical issues.
Alan Weir’s new book is, like Darwin’s Origin of Species, ‘one long argument’. The author has devised a new kind of have-it-both-ways philosophy of mathematics, supposed to allow him to say out of one side of his mouth that the integer 1,000,000 exists and even that the cardinal ℵω exists, while saying out of the other side of his mouth that no numbers exist at all, and the whole book is devoted to an exposition and defense of this new view. The view is presented in the book in a way that can make it difficult for the reader to trace the main line of argument: with a great deal of apparatus, and with a great many digressions into subordinate issues. In what follows I will try to stick to what I take to be the essentials, even at the risk of oversimplifying some central but complicated issues, and at the cost of neglecting some interesting but peripheral ones. In chapter 1, the author introduces a distinction between what he calls ‘two aspects of meaning’ and dubs informational content and metaphysical content. Informational content is the aspect of meaning of primary interest to linguists, and the one of which speakers themselves are generally aware, at least upon reflection. Metaphysical content is supposed to be another aspect of meaning primarily of interest to philosophers. The basic idea is that if there are standards of correctness for assertions of a certain kind, then such an assertion may be called ‘true’ when those standards are met, even though the kind of correctness involved is not correctness in representing how the world is. What the world must be like in order for the utterance to be true is the metaphysical content of the assertion, but it need not be part of its…
Alan Musgrave has been one of the most important philosophers of science in the last quarter of the 20th century. He has exemplified an exceptional combination of clear-headed and profound philosophical thinking. Two commitments are the pillars of his thought: an uncompromising commitment to scientific realism and an equally uncompromising commitment to deductivism. The essays reprinted in this volume (which span a period of 25 years, from 1974 to 1999) testify to these two commitments. (There are two omissions from this collection: “Realism, Truth and Objectivity” in Realism and Anti-realism in the Philosophy of Science (1996, Kluwer) and “How to Do without Inductive Logic” (Science & Education vol. 8, 1999). I will make some references to these papers in what follows.) In the present review, instead of giving an orderly summary of the 16 papers of Essays, I discuss Musgrave’s two major commitments and raise some worries about their combination.
In this paper we present the syntax and semantics of a temporal action language named Alan, which was designed to model interactive multimedia presentations where the Markov property does not always hold. In general, Alan allows the specification of systems where the future state of the world depends not only on the current state, but also on the past states of the world. To the best of our knowledge, Alan is the first action language which incorporates causality with temporal formulas. In the process of defining the effect of actions we define the closure with respect to a path rather than to a state, and show that the non-Markovian model is an extension of the traditional Markovian model. Finally, we establish the relationship between theories of Alan and logic programs.
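The Markovian/non-Markovian contrast the abstract draws can be illustrated with a generic sketch (this is plain Python, not Alan syntax, and the state variables are invented for illustration): a Markovian transition function sees only the current state, while a non-Markovian one sees the whole path of past states.

```python
# Markovian: the successor state is a function of the current state alone.
def markov_step(state, action):
    if action == "toggle":
        return {**state, "light": not state["light"]}
    return dict(state)

# Non-Markovian: the successor state is a function of the whole path.
# Here "toggle" has an effect only if "press" occurred at SOME earlier
# state -- a condition on the history, not on the current state alone.
def non_markov_step(path, action):
    current = path[-1]
    ever_pressed = any(s.get("pressed") for s in path)
    if action == "press":
        return {**current, "pressed": True}
    if action == "toggle" and ever_pressed:
        return {**current, "light": not current["light"]}
    return dict(current)

path = [{"light": False, "pressed": False}]
for a in ["toggle", "press", "toggle"]:
    path.append(non_markov_step(path, a))
# The first "toggle" is inert (nothing was ever pressed); after "press",
# the second "toggle" succeeds and the light ends up on.
```

A Markovian model is recovered as the special case where the transition function consults only `path[-1]`, which matches the paper's claim that the non-Markovian model extends the traditional Markovian one.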
The two books reviewed here are different efforts to embrace the vast subject called "social thought." The second edition of The Blackwell Dictionary of Modern Social Thought, edited by William Outhwaite with Alain Touraine, contains numerous updates; yet it also has some disadvantages compared to the first edition. Social Thought: From the Enlightenment to the Present, edited by Alan Sica, is a bold but controversial attempt at gathering in one anthology as many social thinkers as possible. Key Words: "social" • social thought/theory • William Outhwaite • Alan Sica • explanation.
Jonathan Quong: Alan Patten presents his account of minority rights as broadly continuous with Ronald Dworkin’s theory of equality of resources. This paper challenges this claim. I argue that, contra Patten, Dworkin’s theory does not provide a basis to offer accommodations or minority rights, as a matter of justice, to some citizens who find themselves at a relative disadvantage in pursuing their plans of life after voluntarily changing their cultural or religious commitments.
Alan Gewirth's Reason and Morality directed philosophical attention to the possibility of presenting a rational and rigorous demonstration of fundamental moral principles. Now, these previously unpublished essays from some of the most distinguished philosophers of our generation subject Gewirth's program to thorough evaluation and assessment. In a tour de force of philosophical analysis, Professor Gewirth provides detailed replies to all of his critics--a major, genuinely clarifying essay of intrinsic philosophical interest.
The December 2008 White Paper (WP) on “Brain Death” published by the President’s Council on Bioethics (PCBE) reaffirmed its support for the traditional neurological criteria for human death. It spends considerable time explaining and critiquing what it takes to be the most challenging recent argument opposing the neurological criteria formulated by D. Alan Shewmon, a leading critic of the “whole brain death” standard. The purpose of this essay is to evaluate and critique the PCBE’s argument. The essay begins with a brief background on the history of the neurological criteria in the United States and on the preparation of the 2008 WP. After introducing the WP’s contents, the essay sets forth Shewmon’s challenge to the traditional neurological criteria and the PCBE’s reply to Shewmon. The essay concludes by critiquing the WP’s novel justification for reaffirming the traditional conclusion, a justification the essay finds wanting.
The Early Han enjoyed some prosperity while it struggled with centralization and political control of the kingdom. The Later Han was plagued by court intrigue, corrupt eunuchs, and massive flooding of the Yellow River that eventually culminated in popular uprisings that led to the demise of the dynasty. The period that followed was a renewed warring states period that likewise stimulated a rebirth of philosophical and religious debate, growth, and innovations. Alan K. L. Chan and Yuet-Keung Lo's Philosophy and Religion in Early Medieval China is a welcome addition to the growing body of literature on medieval China. It is a companion volume to their coauthored work, Interpretation and Literature in Early…
It is worth at least a moment to note and praise Alan Goldman's methodological stance in Philosophy and the Novel. Goldman reflects appreciatively on the achievements of specific novels in order to arrive at philosophically interesting results about interpretation and moral understanding. In his appreciative reflections, Goldman is aware of, but by no means bound by, recent work in experimental moral psychology and metaethics. The result is a powerful demonstration not only of the human, cognitive, and ethical interest of the novel but also of the ability of the novel to inform and transform our thinking…
Alan Gewirth has propounded a moral theory which commits him to the view that prescriptions can appropriately be addressed to people who have neither any moral reasons nor any prudential reasons to follow the prescriptions. We highlight the strangeness of Gewirth's position and then show that it undermines his attempt to come up with a supreme moral principle.
John Stuart Mill is—surprisingly—a difficult writer. He writes clearly, non-technically, and in a very plain prose which Bertrand Russell once described as a model for philosophers. It is never hard to see what the general drift of the argument is, and never hard to see which side he is on. He is, none the less, a difficult writer because his clarity hides complicated arguments and assumptions which often take a good deal of unpicking. And when we have done that unpicking, the task of analysing the merits and deficiencies of the arguments is still only half completed. This is true of all his work and particularly true of Liberty. It is an essay whose clarity and energy have made it the most popular of all Mill's work. Yet it conceals philosophical, sociological and historical assumptions of a very debatable kind. In his introduction, Mill says the object of this essay is to defend one very simple principle, as entitled to govern absolutely the dealings of society with the individual in the way of compulsion and control, whether the means used be legal penalties, or the moral coercion of public opinion.
I live just off of Bell Road outside of Newburgh, Indiana, a small town of 3,000 people. A mile down the street Bell Road intersects with Telephone Road not as a modern reminder of a technology belonging to bygone days, but as testimony that this technology, now more than a century and a quarter old, is still with us. In an age that prides itself on its digital devices and in which the computer now equals the telephone as a medium of communication, it is easy to forget the debt we owe to an era that industrialized the flow of information, that the light bulb, to pick a singular example, which is useful for upgrading visual information we might otherwise overlook, nonetheless remains the most prevalent of all modern day information technologies. Edison’s light bulb, of course, belongs to a different order of informational devices than the computer, but not so the telephone, not entirely anyway. Alan Turing, best known for his work on the Theory of Computation (1937), the Turing Machine (also 1937) and the Turing Test (1950), is often credited with being the father of computer science and the father of artificial intelligence. Less well-known to the casual reader but equally important is his work in computer engineering. The following lecture on the Automatic Computing Engine, or ACE, shows Turing in this different light, as a mechanist concerned with getting the greatest computational power from minimal hardware resources. Yet Turing’s work on mechanisms is often eclipsed by his thoughts on computability and his other theoretical interests. This is unfortunate for several reasons, one being that it obscures our picture of the historical trajectory of information technology, a second that it emphasizes a false dichotomy between “hardware” and “software” to which Turing himself did not subscribe but which has, nonetheless, confused researchers who study the nature of mind and intelligence for generations.
In his short life, Alan Turing (1912-1954) made foundational contributions to philosophy, mathematics, biology, artificial intelligence, and computer science. He, as much as anyone, invented and showed how to program the digital electronic computer. From September, 1939, his work on computation was war-driven and brutally practical. He developed high speed computing devices needed to decipher German Enigma Machine messages to and from U-boats, countering the most serious threat by far to Britain.
In this essay I wish to defend the intuition that God transcends time, of which he is the Creator. To do this, I will develop a new understanding of the term ‘timeless eternity’ as it applies to God. This assumes the inadequacy of the traditional notion of divine eternity, as it is found in Boethius, Anselm and Aquinas. Very briefly, the reasons for this inadequacy are as follows. God sustains the universe, which means in part that he is responsible for the fundamental ontological status of things. Because the universe is an ever-changing reality, things do change in their fundamental ontological status at different times – a change we must ascribe to God, and cannot ascribe to the objects themselves, since this has to do with their very existence. God himself, therefore, does different things at different times. This implies change in God. Whenever a change occurs, a duration occurs. Therefore, God is in time. But I do not think it is proper to say that God is in our time. God transcends time, and he is the Creator of our space-time. It is theologically more proper to say that we are in God's time, and I will adopt this language here.
It seems opportune to commemorate in ‘Augustinianum’ the centenary of the birth of Alan Turing, insofar as he is an outstanding figure whose theoretical insight gave birth to the computer revolution of the twentieth century. His theories are equally important for the methodology supporting studies in the humanities.
A substantial body of literature has been produced in the twentieth century by religious and philosophical writers on the ethics of belief. Discussion of this topic has generally focused on the processes leading up to belief within the individual, so that it would not be inaccurate to say that for most of these writers ‘the ethics of belief’ means ‘the ethics of coming–to–believe’. There has been little attention among these writers, however, to the moral questions which surround the production or inducement of beliefs in others, to the ethics of persuasion. An extension of the ethics of belief to cover moral issues which arise in connection with persuasion seems reasonable; the ethics of belief, widely construed, might be said to encompass questions about both the production of beliefs within oneself and the inducement of beliefs in others.