E-Z Reader is a highly successful model of eye-movement control, employing the notion of a serial-sequential attentional spotlight switched from word to word. Evidence of parallel processing of words in text calls this notion into question. Modifications to the model to accommodate this evidence are possible but will not address the fundamental objection that reading should not be seen as “surrogate listening.”
Machine-generated contents note: 1. Introduction Juliette Kennedy and Roman Kossak; 2. Historical remarks on Suslin's problem Akihiro Kanamori; 3. The continuum hypothesis, the generic-multiverse of sets, and the Ω conjecture W. Hugh Woodin; 4. ω-Models of finite set theory Ali Enayat, James H. Schmerl and Albert Visser; 5. Tennenbaum's theorem for models of arithmetic Richard Kaye; 6. Hierarchies of subsystems of weak arithmetic Shahram Mohsenipour; 7. Diophantine correct open induction Sidney Raffer; 8. Tennenbaum's theorem and recursive reducts James H. Schmerl; 9. History of constructivism in the 20th century A. S. Troelstra; 10. A very short history of ultrafinitism Rose M. Cherubin and Mirco A. Mannucci; 11. Sue Toledo's notes of her conversations with Gödel in 1972-1975 Sue Toledo; 12. Stanley Tennenbaum's Socrates Curtis Franks; 13. Tennenbaum's proof of the irrationality of √2.
It is now eighteen years since the Human Tissue Act 1961, yet this legislation remains unchanged in England, Scotland and Wales. Ian Kennedy, in this paper, lays before us the law as it is, the problems of its interpretation and his opinion of what government should be doing to help clarify the situation and remove some of the problems which exist daily for the doctors who face the dilemma of seeking consent for transplants at the moment of extreme grief for the surviving spouses or relatives of the patient who has been in their care only moments before. Ian Kennedy suggests that by doing nothing the Department of Health and the government are being both callous and less than honest.
I defend a view of the structure of visual property-awareness by considering the phenomenon of perceptual constancy. I argue that visual property-awareness is a three-place relation between a subject, a property, and a manner of presentation. Manners of presentation mediate our visual awareness of properties without being objects of visual awareness themselves. I provide criteria of identity for manners of presentation, and I argue that our ignorance of their intrinsic nature does not compromise the viability of a theory that employs them. In closing, I argue that the proposed manners of presentation are consistent with key direct-realist claims about the structure of visual awareness.
Adults and infants display a robust ability to perceive the unity of a center-occluded object when the visible ends of the object undergo common motion (e.g. Kellman, P.J., Spelke, E.S., 1983. Perception of partly occluded objects in infancy. Cognitive Psychology 15, 483–524). Ecologically oriented accounts of this ability focus on the primacy of motion in the perception of segregated objects, but Gestalt theory suggests a broader possibility: observers may perceive object unity by detecting patterns of synchronous change, of which common motion is a special case. We investigated this possibility with observations of adults and 4-month-old infants. Participants viewed a center-occluded object whose visible surfaces were either misaligned or aligned, stationary or moving, and unchanging or synchronously changing in color or brightness in various temporal patterns (e.g. flashing). Both alignment and common motion contributed to adults' perception of object unity, but synchronous color changes did not. For infants, motion was an important determinant of object unity, but other synchronous changes and edge alignment were not. When a stationary object with aligned edges underwent synchronous changes in color or brightness, infants showed high levels of attention to the object, but their perception of its unity appeared to be indeterminate. An inherent preference for fast over slow flash rates, and a novelty preference elicited by a change in rate, both indicated that infants detected the synchronous changes, although they failed to use them as information for object unity. These findings favor ecologically oriented accounts of object perception in which surface motion plays a privileged role.
Hipparchia's use of exile as an ethical and rhetorical space from which to critique convention is the point of departure for an examination of the ethics of using exile as a rhetorically effective position for feminist theorizing. To address the ethical problems involved in using exile as a rhetorical space, I argue for a reading of exile as both a rhetorical and embodied space that can maintain an ethical anchor for feminist rhetorical and political practice.
In this interview, the prestigious anthropologist, historian and TV announcer Alan Macfarlane comments on some of the issues that have been addressed in his writings. His main theoretical concern has been to study the peculiar conditions that gave rise to the mode…
This small book packs a considerable theoretical and practical punch. Alan Ware challenges much received wisdom about the dynamics of two-party politics. In the process, he adds considerably to contemporary discussion of the intersection of structure and agency in the development and adaptation of political systems. Ware picks out two-party systems for concentrated attention because of their relative tractability: in his words, these systems are ideal for analysing the capacity of parties to pursue their interests in the face of both other actors within the political system and also of elements within the party itself.
Alan Turing is known both for his mathematical creativity and genius and his role in wartime cryptography efforts, and for his homosexuality, for which he was persecuted. Yet there is little work that brings these two parts of his life together. This paper deconstructs and moves beyond the extant stereotypes around perceived associations between gay men and creativity, to consider how Turing’s lived experience as a queer mathematician provides a rich seam of insight into the ways in which his life, relationships, and working environment shaped his work.
A major voice in late twentieth-century philosophy, Alan Donagan is distinguished for his theories on the history of philosophy and the nature of morality. The Philosophical Papers of Alan Donagan, volumes 1 and 2, collect 28 of Donagan's most important and best-known essays on historical understanding and ethics from 1957 to 1991. Volume 2 addresses issues in the philosophy of action and moral theory. With papers on Kant, von Wright, Sellars, and Chisholm, this volume also covers a range of questions in applied ethics--from the morality of Truman's decision to drop atomic bombs on Hiroshima and Nagasaki to ethical questions in medicine and law.
The Newman programs established at secular colleges and universities provided an opportunity for intellectual, spiritual, and social growth among the Catholic student population. As a young physician and junior medical faculty member, André Hellegers took part in the early organization and ongoing work of Carroll House, the Newman Center at the Johns Hopkins Medical Institutions. Hellegers's experience at Carroll House enabled him to develop a clear blueprint of an academic center of excellence for the scientific, theological, and philosophical exploration of the many problems that he had seen and foresaw in medicine. That center would become Georgetown's Kennedy Institute of Ethics.
We have benefited from conversations with Archon Fung, Brian Jacob, Todd Pittinsky, Peter Schuck, Ani Satz, Andrew Williams, and students in a joint class on statistics and ethics at the John F. Kennedy School of Government in October 2002. We are also grateful to our audience at the conference “The Priority of Practice,” organized by Jonathan Wolff at University College London in September 2003, and to Arthur Applbaum, Miriam Avins, Frances Kamm, Simon Keller, Frederick Schauer, Alan Wertheimer, and the Editors of Philosophy & Public Affairs for insightful comments. We have benefited from prepublication reading of Schauer’s work on profiling, Profiles, Probabilities, and Stereotypes (Cambridge, Mass: Harvard University Press, 2003). We thank Avedis Koutoujian for research assistance.
In his article, 'Gratuitous evil and divine providence', Alan Rhoda claims to have produced an uncontroversial theological premise for the evidential argument from evil. I argue that his premise is by no means uncontroversial among theists, and I doubt that any premise can be found that is both uncontroversial and useful for the argument from evil.
Alan Musgrave has been one of the most important philosophers of science in the last quarter of the 20th century. He has exemplified an exceptional combination of clearheaded and profound philosophical thinking. Two seem to be the pillars of his thought: an uncompromising commitment to scientific realism and an equally uncompromising commitment to deductivism. The essays reprinted in this volume (which span a period of 25 years, from 1974 to 1999) testify to these two commitments. (There are two omissions from this collection: “Realism, Truth and Objectivity” in Realism and Anti-realism in the Philosophy of Science (1996, Kluwer) and “How to Do without Inductive Logic” (Science & Education vol. 8, 1999). I will make some references to these papers in what follows.) In the present review, instead of giving an orderly summary of the 16 papers of Essays, I discuss Musgrave’s two major commitments and raise some worries about their combination.
This paper concerns Alan Turing’s ideas about machines, mathematical methods of proof, and intelligence. By the late 1930s, Kurt Gödel and other logicians, including Turing himself, had shown that no finite set of rules could be used to generate all true mathematical statements. Yet according to Turing, there was no upper bound to the number of mathematical truths provable by intelligent human beings, for they could invent new rules and methods of proof. So, the output of a human mathematician, for Turing, was not a computable sequence (i.e., one that could be generated by a Turing machine). Since computers only contained a finite number of instructions (or programs), one might argue, they could not reproduce human intelligence. Turing called this the “mathematical objection” to his view that machines can think. Logico-mathematical reasons, stemming from his own work, helped to convince Turing that it should be possible to reproduce human intelligence, and eventually compete with it, by developing the appropriate kind of digital computer. He felt it should be possible to program a computer so that it could learn or discover new rules, overcoming the limitations imposed by the incompleteness and undecidability results in the same way that human mathematicians presumably do.
Alan Shewmon's article, "The brain and somatic integration: Insights into the standard biological rationale for equating brain death with death" (2001), strikes at the heart of the standard justification for whole brain death criteria. The standard justification, which I call the standard paradigm, holds that the permanent loss of the functions of the entire brain marks the end of the integrative unity of the body. In my response to Shewmon's article, I first offer a brief summary of the standard paradigm and cite recent work by advocates of whole brain criteria who tenaciously cling to the standard paradigm despite increasing evidence showing that it has significant weaknesses. Second, I address Shewmon's case against the standard paradigm, arguing that he is successful in showing that whole brain dead patients have integrated organic unity. Finally, I discuss some minor problems with Shewmon's article, along with suggestions for further elaboration.
Alan Gewirth's Reason and Morality, in which he set forth the Principle of Generic Consistency, is a major work of modern ethical theory that, though much debated and highly respected, has yet to gain full acceptance. Deryck Beyleveld contends that this resistance stems from misunderstanding of the method and logical operations of Gewirth's central argument. In this book Beyleveld seeks to remedy this deficiency. His rigorous reconstruction of Gewirth's argument gives its various parts their most compelling formulation and clarifies its essential logical structure. Beyleveld then classifies all the criticisms that Gewirth's argument has received and measures them against his reconstruction of the argument. The overall result is an immensely rich picture of the argument, in which all of its complex issues and key moves are clearly displayed and its validity can finally be discerned. The comprehensiveness of Beyleveld's treatment provides ready access to the entire debate surrounding the foundational argument of Reason and Morality. It will be required reading for all who are interested in Gewirth's theory and deontological ethics and will be of central importance to moral and legal theorists.
As is well known, Alan Turing drew a line, embodied in the "Turing test," between intellectual and physical abilities, and hence between cognitive and natural sciences. Less familiarly, he proposed that one way to produce a "passer" would be to educate a "child machine," equating the experimenter's improvements in the initial structure of the child machine with genetic mutations, while supposing that the experimenter might achieve improvements more expeditiously than natural selection. On the other hand, in his foundational "On the chemical basis of morphogenesis," Turing insisted that biological explanation clearly confine itself to purely physical and chemical means, eschewing vitalist and teleological talk entirely and hewing to D'Arcy Thompson's line that "evolutionary 'explanations'" are historical and narrative in character, employing the same intentional and teleological vocabulary we use in doing human history, and hence, while perhaps on occasion of heuristic value, are not part of biology as a natural science. To apply Turing's program to recent issues, the attempt to give foundations to the social and cognitive sciences in the "real science" of evolutionary biology (as opposed to Turing's biology) is neither to give foundations, nor to achieve the unification of the social/cognitive sciences and the natural sciences.
D. Alan Shewmon has advanced a well-documented challenge to the widely accepted total brain death criterion for death of the human being. We show that Shewmon's argument against this criterion is unsound, though he does refute the standard argument for that criterion. We advance a distinct argument for the total brain death criterion and answer likely objections. Since human beings are rational animals – sentient organisms of a specific type – the loss of the radical capacity for sentience (the capacity to sense or to develop the capacity to sense) involves a substantial change, the passing away of the human organism. In human beings total brain death involves the complete loss of the radical capacity for sentience, and so in human beings total brain death is death.
I live just off of Bell Road outside of Newburgh, Indiana, a small town of 3,000 people. A mile down the street Bell Road intersects with Telephone Road not as a modern reminder of a technology belonging to bygone days, but as testimony that this technology, now more than a century and a quarter old, is still with us. In an age that prides itself on its digital devices and in which the computer now equals the telephone as a medium of communication, it is easy to forget the debt we owe to an era that industrialized the flow of information, that the light bulb, to pick a singular example, which is useful for upgrading visual information we might otherwise overlook, nonetheless remains the most prevalent of all modern day information technologies. Edison’s light bulb, of course, belongs to a different order of informational devices than the computer, but not so the telephone, not entirely anyway. Alan Turing, best known for his work on the Theory of Computation (1937), the Turing Machine (also 1937) and the Turing Test (1950), is often credited with being the father of computer science and the father of artificial intelligence. Less well-known to the casual reader but equally important is his work in computer engineering. The following lecture on the Automatic Computing Engine, or ACE, shows Turing in this different light, as a mechanist concerned with getting the greatest computational power from minimal hardware resources. Yet Turing’s work on mechanisms is often eclipsed by his thoughts on computability and his other theoretical interests. This is unfortunate for several reasons, one being that it obscures our picture of the historical trajectory of information technology, a second that it emphasizes a false dichotomy between “hardware” and “software” to which Turing himself did not subscribe but which has, nonetheless, confused researchers who study the nature of mind and intelligence for generations.
J. M. Kennedy and J. Vervaeke argue that my view of the bodily and imaginative basis of meaning commits me to a mistaken reductionism and to the erroneous view that metaphors actually impose structure on the target domain. I explain the sense in which image schemas are central to the bodily grounding of meaning, although in a way that is not reductionistic. I then show how conceptual metaphors can involve pre-existing image-schematic structure and yet can also be partially constitutive of the conceptual structure of the target domain. In this way human conceptual systems can be both rooted in patterns of our bodily interactions and at the same time can be subject to various kinds of imaginative development and extension.
A veil of mystery hangs over the death of the English polymath Alan Turing. It is quite possible that no other modern thinker's death is surrounded by as many legends and speculations. The undisputed facts are, in brief, as follows: a cleaner found Turing dead in his home on 8 June 1954. Turing was determined to have died the previous evening of cyanide poisoning, and a half-eaten apple was found beside him. He was 41 years old at the time of his death. The rest is more or less a matter of conjecture.
The two books reviewed here are different efforts to embrace the vast subject called "social thought." The second edition of The Blackwell Dictionary of Modern Social Thought, edited by William Outhwaite with Alain Touraine, contains numerous updates; yet it also has some disadvantages compared to the first edition. Social Thought: From the Enlightenment to the Present, edited by Alan Sica, is a bold but controversial attempt at gathering in one anthology as many social thinkers as possible. Key Words: "social" • social thought/theory • William Outhwaite • Alan Sica • explanation.
Alan Carter's recent review in Mind of my Ethics of the Global Environment combines praise of biocentric consequentialism (as presented there and in Value, Obligation and Meta-Ethics) with criticisms that it could advocate both minimal satisfaction of human needs and the extinction of for the sake of generating extra people; Carter also maintains that as a monistic theory it is predictably inadequate to cover the full range of ethical issues, since only a pluralistic theory has this capacity. In this reply, I explain how the counter-intuitive implications of biocentric consequentialism suggested by Carter (for population, needs-satisfaction, and biodiversity preservation) are not implications, and argue that since pluralistic theories (in Carter's sense) either generate contradictions or collapse into monistic theories, the superiority of pluralistic theories is far from predictable. Thus Carter's criticisms fail to undermine biocentric consequentialism as a normative theory applicable to the generality of ethical issues.
For much of the second half of the 20th Century, the primary role logical empiricism played was that of the argumentative foil. The 'received view' on a given topic (especially in philosophy of science, logic, or language) was frequently identified with some supposedly dogmatic tenet of logical empiricism. However, during the last twenty-five years, scholars have paid serious, sustained attention to what the logical positivists, individually and collectively, actually said. Early scholarship on logical empiricism had to engage in heavy-duty PR work: why should anyone study the now-discarded mixture of blunders and implausibilities collected under the label 'logical empiricism'? However, thanks to the efforts of the pioneers, people studying the logical empiricists today need not articulate an extended apologia for their chosen subject of study -- rather, they can simply get on with their work. Many of the best fruits of these recent labors are on display in The Cambridge Companion to Logical Empiricism (CCLE), edited by Alan Richardson and Thomas Uebel.
Alan Gewirth has propounded a moral theory which commits him to the view that prescriptions can appropriately be addressed to people who have neither any moral reasons nor any prudential reasons to follow the prescriptions. We highlight the strangeness of Gewirth's position and then show that it undermines his attempt to come up with a supreme moral principle.
Miles Kennedy, Home: A Bachelardian Concrete Metaphysics. Reviewed by Dylan Trigg (Centre de Recherche en Épistémologie Appliquée, Paris, France), Continental Philosophy Review, pp. 1-4, DOI 10.1007/s11007-012-9212-2.
In his short life, Alan Turing (1912-1954) made foundational contributions to philosophy, mathematics, biology, artificial intelligence, and computer science. He, as much as anyone, invented the digital electronic computer. From September 1939, much of his work on computation was war-driven and brutally practical. He developed high speed computing devices needed to decipher German Enigma Machine messages to and from U-boats, countering the most serious threat by far to Britain's survival during World War Two. Yet few people have an image of him.
Explaining his now famous parody in Social Text's "Science Wars" issue, Alan Sokal writes in Dissent ("Afterword", Fall 1996): But why did I do it? I confess that I'm an unabashed Old Leftist who never quite understood how deconstruction was supposed to help the working class. And I'm a stodgy old scientist who believes, naively, that there exists an external world, that there exist objective truths about that world, and that my job is to discover some of them. There is much to note in this "confession." Why choose a hoax on Social Text to make these points? Did Sokal believe its editors were unabashed deconstructionists who doubted the existence of an external world or that they were anti-science? If so, he has either misread the burden of its seventeen-year history or was capricious in his choice. If not, then he has perpetuated the saddest hoax of all: on himself. For the fact is that Social Text, of which I am a founder and in whose editorial collective I served until this year, has never been in the deconstructionist camp; nor do its editors or the preponderance of its contributors doubt the existence of a material world. What is at issue is whether our knowledge of it can possibly be free of social and cultural presuppositions.
Grünbaum has argued that the Lorentz-Fitzgerald contraction hypothesis is not ad hoc since the Kennedy-Thorndike experiment can be used to provide a test that is significantly different from that provided by the Michelson-Morley experiment. In the first part of the paper, I show that the differences claimed by Grünbaum to hold between these two experiments are not sufficient for establishing independent testability. A dilemma is developed: either the Kennedy-Thorndike experiment, because of experimental realities, cannot test the uncontracted Fresnel aether theory, or if experimental difficulties are ignored, the Kennedy-Thorndike experiment degenerates into a version of the Michelson-Morley experiment. The second part of the paper is a feasibility study of the prospects for defining experimental types according to aims of measurement and determination. This approach is applied to the contraction hypothesis, where it is suggested that the usual analysis of independent testability be modified.
The origin of my article lies in the appearance of Copeland and Proudfoot's feature article in Scientific American, April 1999. This preposterous paper, as described on another page, suggested that Turing was the prophet of 'hypercomputation'. In their references, the authors listed Copeland's entry on 'The Church-Turing thesis' in the Stanford Encyclopedia. In the summer of 1999, I circulated an open letter criticising the Scientific American article. I included criticism of this Encyclopedia entry. This was forwarded (by Prof. Sol Feferman) to Prof. Ed Zalta, editor of the Encyclopedia, and after some discussion he invited me to submit an entry on 'Alan Turing'.
It seems opportune to commemorate in ‘Augustinianum’ the centenary of the birth of Alan Turing, insofar as he is an outstanding figure whose theoretical insight gave birth to the computer revolution of the twentieth century. His theories are equally important for the methodology supporting studies in the humanities.
The mathematical genius Alan Turing (1912-1954) was one of the greatest scientists and thinkers of the 20th century. Now well known for his crucial wartime role in breaking the ENIGMA code, he was the first to conceive of the fundamental principle of the modern computer: the idea of controlling a computing machine's operations by means of a program of coded instructions, stored in the machine's 'memory'. In 1945 Turing drew up his revolutionary design for an electronic computing machine, his Automatic Computing Engine ('ACE'). A pilot model of the ACE ran its first program in 1950 and the production version, the 'DEUCE', went on to become a cornerstone of the fledgling British computer industry. The first 'personal' computer was based on Turing's ACE.

Alan Turing's Automatic Computing Engine describes Turing's struggle to build the modern computer. The first detailed history of Turing's contributions to computer science, this text is essential reading for anyone interested in the history of the computer and the history of mathematics. It contains first-hand accounts by Turing and by the pioneers of computing who worked with him. As well as relating the story of the invention of the computer, the book clearly describes the hardware and software of the ACE, including the very first computer programs. The book is intended to be accessible to everyone with an interest in computing, and contains numerous diagrams and illustrations as well as original photographs.

The book contains chapters describing Turing's path-breaking research in the fields of Artificial Intelligence (AI) and Artificial Life (A-Life). The book has an extensive system of hyperlinks to The Turing Archive for the History of Computing, an on-line library of digital facsimiles of typewritten documents by Turing and the other scientists who pioneered the electronic computer.