This is the first comprehensive study in English of Voltaire's contes philosophiques--the philosophical tales for which he is best remembered and which include his masterpiece Candide. Pearson situates each story in its historical and intellectual context and offers new readings in light of modern critical thinking. He rejects the traditional view that Voltaire's contes were the private expression of his philosophical perplexity, and argues that it is narrative that is Voltaire's essential mode of thought. His book is a witty, lucid, and scholarly guide to the "fables of reason" through which Voltaire's skepticism undermined the contemporary religious and philosophical explanations of human experience.
[Sensation, Causality, and Attention: Roger Bacon and Peter Olivi] This paper investigates what conditions must be met for sensory perception to occur. It introduces two different theories of perception held by two medieval Franciscan thinkers — namely, Roger Bacon (1214/1220–1292) and Peter Olivi (ca. 1248–1298). In his doctrine of the multiplication of species, Bacon analyses especially the causal relation between the object and the sensory organ. In his view, a necessary condition of perception is the reception of the species in a fully disposed sensory organ. By contrast, Olivi stresses the active role of the sensory power. A necessary condition of sensation is the aspectus — i.e. the focus of our power's attention on the object. Furthermore, the paper investigates whether and how each of the two thinkers can deal with the arguments proposed by his opponent — namely, whether Bacon's theory is able to explain attention, and what the causal role of the object in Olivi's theory is.
There are two motivations commonly ascribed to historical actors for taking up statistics: to reduce complicated data to a mean value (e.g., Quetelet), and to take account of diversity (e.g., Galton). Different motivations will, it is assumed, lead to different methodological decisions in the practice of the statistical sciences. Karl Pearson and W. F. R. Weldon are generally seen as following directly in Galton's footsteps. I argue for two related theses in light of this standard interpretation, based on a reading of several sources in which Weldon, independently of Pearson, reflects on his own motivations. First, while Pearson does approach statistics from this "Galtonian" perspective, he is, consistent with his positivist philosophy of science, utilizing statistics to simplify the highly variable data of biology. Weldon, on the other hand, is brought to statistics by a rich empiricism and a desire to preserve the diversity of biological data. Second, we have here a counterexample to the claim that divergence in motivation will lead to a corresponding separation in methodology. Pearson and Weldon, despite embracing biometry for different reasons, settled on precisely the same set of statistical tools for the investigation of evolution.
In his latest book, Roger Penrose deals with three foundational problems of current physics from his particularly fresh perspective. He criticizes mainstream string theories, standard interpretations of quantum mechanics, and pre-Big Bang cosmologies inasmuch as they aim to solve profound questions while glossing over equally deep issues in our understanding of nature. In this review, I analyze Penrose's main arguments, emphasizing his presentation of the Second Law conundrum as "the most profound mystery of cosmology", and discuss his own proposals to overcome the impasse. I especially focus on the capabilities of conformal cyclic cosmology to illuminate the enigma of the extraordinarily low entropy at the Big Bang, and review its capacity for success in stipulating a reset for the entropy of the universe. Even though one need not follow Penrose's tentative answers, which are not immune to serious critiques, much of his view can be shared as a sound starting point in the search for "the new physics of the universe".
Whereas there are many aspects of Roger Simon's thought that can be privileged, one of the most compelling points of entry for beginning to consider his legacy in the field of education, and beyond, lies with his concern for the difficult work of receiving and transmitting, of giving countenance to, the traces of those now absent. Indeed, in the last 20 years of his scholarly work, Simon pressed us to consider the pedagogical stakes in forging an ethical living relation with the remnants of past and presently unsettled—ongoing—historical wrongs. Keenly aware of how memorial practices risk falling into facile assurances and deferrals, Simon emphasized the important work of "remembrance-learning," in which the task is to learn how to ethically receive and translate the remnants of a difficult past in our present, so that we might be able to more thoroughly think our time. In this paper I provide an overview of a certain tendency in Simon's later thinking, pointing to how his work on pedagogy, aesthetics, curation and collective study was motivated by a not so ordinary way of thinking that takes seriously the fact that the dead cannot bury the dead; that they need those in the present, those whose turn it is to do the work, to offer human significance and a human completion to what remains a remnant.
ROGER SCRUTON’s An Intelligent Person’s Guide to Philosophy takes a personal and provocative look at the subject—those abstract, but nevertheless practical, problems that concern anyone who has reflected on his or her life. Of special delight is his discussion of sex and music. I make some brief critical comments on this based on new economic approaches.
Medical futility, one of the most debated end-of-life issues in medical ethics, has been discussed among physicians and scholars for years but remains an unresolved question. Roger C. Bone (1941–1997), an outstanding pulmonologist and critical care specialist, devoted his last years to the ethical issues of terminal care while himself facing metastatic renal cancer. Criticising the abuse of technology in terminal care and the administrative and financial interference in medical decisions, he bequeathed important points on futility, also bringing patients' views into the discussion. He stressed the importance of the physician-patient relationship and prompted physicians to remain honest with their patients and stand with them until their very last moments. Roger Bone's insights on futility, terminal care and the physician-patient relationship remain an important legacy for health care professionals and for families and patients facing end-of-life issues.
From popular introductions to biographies and television programmes, philosophy is everywhere. Many people even want to be philosophers, usually in the café or the pub. But what do real philosophers do? What are the big philosophical issues of today? Why do they matter? How did some of our best philosophers get into philosophy in the first place? Read New British Philosophy and find out for the first time. Clear, engaging and designed for a general audience, sixteen fascinating interviews with some of the top philosophers from the new generation of the subject's leaders range from music to the mind and feminism to the future of philosophy. Each interview is introduced and conducted by Julian Baggini and Jeremy Stangroom of The Philosophers' Magazine. This is a unique snapshot of philosophy in Great Britain today and includes interviews with: Ray Monk - Biography; Nigel Warburton - the Public; Aaron Ridley - Music; Jonathan Wolff - Politics; Roger Crisp - Ethics; Rae Langton - Pornography; Miranda Fricker - Knowledge; M. G. F. Martin - Perception; Timothy Williamson - Vagueness; Tim Crane - Mind; Robin Le Poidevin - Metaphysics; Christina Howells - Sartre; Simon Critchley - Phenomenology; Simon Glendinning - Continental; Stephen Mulhall - the Future; Keith Ansell Pearson - the Human.
Roger Bastide was one of the few French sociologists of his generation who did not declare himself to be heir to Durkheim. On the contrary, he criticized the latter for his "sociologism". All Bastide's work can be described as an attempt to connect the "individual fact" with social and cultural facts. The attention he paid to the subjectivity of individuals accounts for his interest, early on, in the work of the Chicago School, which he discovered in large part during a long stay in Brazil. Bastide was one of the first to try, through his teaching and writings, to call the attention of French academics to the work of the Chicago School.
In this paper I challenge the claim that Bacon considered the operation of species as limited to the physical and sensory levels and demonstrate that in his view, the very same species issued by physical objects operate within the intellect as well. I argue that in Bacon the concept of illumination plays a secondary role in the acquisition of knowledge, and that he regarded innate knowledge as dispositional and confused. What was left as the main channel through which knowledge is gained were species received through the senses. I argue that according to Bacon these species, representing their agents in essence, definition and operation, arrive in the intellect without undergoing a complete abstraction from matter and while still retaining the character of agents acting naturally. In this way Bacon sets the intellect as separate from the natural world not in any essential way, but rather as it were in degree, thus supplying a theoretical justification for the ability to access and know nature.
Roger North's The Musicall Grammarian 1728 is a treatise on musical eloquence in all its branches. Of its five parts, I and II, on the orthoepy, orthography and syntax of music, constitute a grammar; III and IV, on the arts of invention and communication, form a rhetoric; and V, on etymology, consists of a history. Two substantial chapters of commentary introduce the text, which is edited here for the first time in its entirety: Jamie Kassler places North's treatise within the broader context not only of his musical and non-musical writings but also of their relation to the intellectual ferment of the seventeenth and eighteenth centuries, and Mary Chan describes physical and textual aspects of the treatise as evidence for North's processes of thinking about musical thinking.
Professor Sir Roger Penrose's work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes. The first volume covers the beginnings of a career that is ground-breaking from the outset. Inspired by courses given by Dirac and Bondi, much of the early published work involves linking general relativity with tensor systems. Among his early works are the seminal 1955 paper, 'A Generalized Inverse for Matrices', his previously unpublished PhD and St John's College Fellowship theses, and, from 1967, his Adams Prize-winning essay on the structure of space-time. Add to this his 1965 paper, 'Gravitational collapse and space-time singularities', and the 1967 paper that introduced a remarkable new theory, 'Twistor algebra', and this becomes a truly stellar procession of works on mathematics and cosmology.
Professor Sir Roger Penrose is one of the truly original thinkers of our time. He has made several remarkable contributions to science, from quantum physics and theories of human consciousness to relativity theory and observations on the structure of the universe. Unusually for a scientist, some of his ideas have crossed over into the public arena. Now his work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes.
Professor Sir Roger Penrose's work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes. Many important realizations concerning twistor theory occurred during the short period of this third volume, providing a new perspective on the way that mathematical features of the complex geometry of twistor theory relate to actual physical fields. Following on from the nonlinear graviton construction, a twistor construction was found for (anti-)self-dual electromagnetism allowing the general (anti-)self-dual Yang-Mills field to be obtained. It became clear that some features of twistor contour integrals could be understood in terms of holomorphic sheaf cohomology. During this period, the Oxford research group founded the informal publication, Twistor Newsletter. This volume also contains the influential Weyl curvature hypothesis and new forms of Penrose tiles.
Professor Sir Roger Penrose's work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes. Among the new developments that occurred during this period was the introduction of a particular notion of 'quasi-local mass-momentum and angular momentum', the topic of Penrose's Royal Society paper. Many encouraging results were initially obtained but, later, difficulties began to emerge and remain today. Also, an extensive paper (with Eastwood and Wells) gives a thorough account of the relation between twistor cohomology and massless fields. This volume witnesses Penrose's increasing conviction that the puzzling issue of quantum measurement could only be resolved by the appropriate unification of quantum mechanics with general relativity, where that union must involve an actual change in the rules of quantum mechanics as well as in space-time structure. Penrose's first incursions into a possible relation between consciousness and quantum state reduction are also covered here.
Professor Sir Roger Penrose's work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes. Publication of The Emperor's New Mind (OUP 1989) had caused considerable debate and Penrose's responses are included in this volume. Arising from this came the idea that large-scale quantum coherence might exist within the conscious brain, and actual conscious experience would be associated with a reduction of the quantum state. Within this collection, Penrose also proposes that a twistor might usefully be regarded as a source (or 'charge') for a massless field of spin 3/2, suggesting that the twistor space for a Ricci-flat space-time might actually be the space of such possible sources. Towards the end of the volume, Penrose begins to develop a quite different approach to incorporating full general relativity into twistor theory. This period also sees the origin of the Diósi-Penrose proposal.
Professor Sir Roger Penrose's work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes. This sixth volume describes an actual experiment to measure the length of time that a quantum superposition might last (developing the Diósi-Penrose proposal). It also discusses the significant progress made in relation to incorporating the 'googly' information for a gravitational field into the structure of a curved twistor space. Penrose also covers such things as the geometry of light rays in relation to twistor-space structures, the utility of complex numbers in drawing three-dimensional shapes, and the geometrical representation of different types of musical scales. The turn of the millennium was also an opportunity to reflect on progress in many areas up until that point.
Professor Sir Roger Penrose's work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes. Developing ideas sketched in the first volume, twistor theory is now applied to genuine issues of physics, and there are the beginnings of twistor diagram theory (an analogue of Feynman diagrams). This collection includes joint papers with Stephen Hawking, and uncovers certain properties of black holes. The idea of cosmic censorship is also first proposed. Along completely different lines, the first methods of aperiodic tiling for the Euclidean plane that came to be known as Penrose tiles are described. This volume also contains Penrose's three prize-winning essays for the Gravity Foundation (two second places, with Ezra Newman and Stephen Hawking respectively, and a solo first place for 'The Non-linear graviton').
In his stimulating book SHADOWS OF THE MIND, Roger Penrose presents arguments, based on Gödel's theorem, for the conclusion that human thought is uncomputable. There are actually two separate arguments in Penrose's book. The second has been widely ignored, but seems to me to be much more interesting and novel than the first. I will address both forms of the argument in some detail. Toward the end, I will also comment on Penrose's proposals for a "new science of consciousness".
In his book Shadows of the Mind: A search for the missing science of consciousness [SM below], Roger Penrose has turned in another bravura performance, the kind we have come to expect ever since The Emperor's New Mind [ENM] appeared. In the service of advancing his deep convictions and daring conjectures about the nature of human thought and consciousness, Penrose has once more drawn a wide swath through such topics as logic, computation, artificial intelligence, quantum physics and the neuro-physiology of the brain, and has produced along the way many gems of exposition of difficult mathematical and scientific ideas, without condescension, yet which should be broadly appealing. While the aims and a number of the topics in SM are the same as in ENM, the focus now is much more on the two axes that Penrose grinds in earnest. Namely, in the first part of SM he argues anew and at great length against computational models of the mind and more specifically against any account of mathematical thought in computational terms. Then in the second part, he argues that there must be a scientific account of consciousness but that it will require a (still to be found) non-computational extension or modification of present-day quantum physics.
Despite the widespread use of key concepts of the Neyman–Pearson (N–P) statistical paradigm—type I and II errors, significance levels, power, confidence levels—they have been the subject of philosophical controversy and debate for over 60 years. Both current and long-standing problems of N–P tests stem from unclarity and confusion, even among N–P adherents, as to how a test's (pre-data) error probabilities are to be used for (post-data) inductive inference as opposed to inductive behavior. We argue that the relevance of error probabilities is to ensure that only statistical hypotheses that have passed severe or probative tests are inferred from the data. The severity criterion supplies a meta-statistical principle for evaluating proposed statistical inferences, avoiding classic fallacies from tests that are overly sensitive, as well as those not sensitive enough to particular errors and discrepancies.
Contents: Introduction and overview (1.1 Behavioristic and inferential rationales for Neyman–Pearson (N–P) tests; 1.2 Severity rationale: induction as severe testing; 1.3 Severity as a meta-statistical concept: three required restrictions on the N–P paradigm); Error statistical tests from the severity perspective (2.1 N–P test T(): type I, II error probabilities and power; 2.2 Specifying test T() using p-values); Neyman's post-data use of power (3.1 Neyman: does failure to reject H warrant confirming H?); Severe testing as a basic concept for an adequate post-data inference (4.1 The severity interpretation of acceptance (SIA) for test T(); 4.2 The fallacy of acceptance (i.e., an insignificant difference): Ms Rosy; 4.3 Severity and power); Fallacy of rejection: statistical vs. substantive significance (5.1 Taking a rejection of H0 as evidence for a substantive claim or theory; 5.2 A statistically significant difference from H0 may fail to indicate a substantively important magnitude; 5.3 Principle for the severity interpretation of a rejection (SIR); 5.4 Comparing significant results with different sample sizes in T(): large n problem; 5.5 General testing rules for T(), using the severe testing concept); The severe testing concept and confidence intervals (6.1 Dualities between one- and two-sided intervals and tests; 6.2 Avoiding shortcomings of confidence intervals); Beyond the N–P paradigm: pure significance, and misspecification tests; Concluding comments: have we shown severity to be a basic concept in a N–P philosophy of induction?
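The contrast between pre-data error probabilities and post-data severity assessments can be made concrete with a small numerical sketch. The code below is my own illustration, not the authors' code: it treats a one-sided Normal test of H0: mu <= mu0 against H1: mu > mu0 with known sigma, fixed n, and significance level 0.05; the function names and parameter values are hypothetical.

```python
# Sketch (illustrative assumptions): power is computed before the data are in,
# while severity evaluates a claim in light of the actually observed mean.
from math import sqrt, erf

def phi(z):
    """Standard Normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

Z_ALPHA = 1.645  # approximate upper 5% point of N(0, 1), i.e. alpha = 0.05

def power(mu1, mu0=0.0, sigma=1.0, n=100):
    """Pre-data: probability the test rejects H0 when the true mean is mu1."""
    return 1.0 - phi(Z_ALPHA - (mu1 - mu0) * sqrt(n) / sigma)

def severity_of_acceptance(xbar, mu1, mu0=0.0, sigma=1.0, n=100):
    """Post-data: given a non-rejection with sample mean xbar, how severely
    has the claim 'mu <= mu1' been probed?  Here SEV is the probability of
    observing a sample mean larger than xbar if mu were equal to mu1."""
    return 1.0 - phi((xbar - mu1) * sqrt(n) / sigma)
```

At mu1 = mu0 the power reduces to alpha, and it grows with the discrepancy mu1 - mu0; severity, unlike power, depends on the observed outcome xbar, which is what makes it a post-data notion rather than a property of the test alone.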
The debate between the Mendelians and the (largely Darwinian) biometricians has been referred to by R. A. Fisher as 'one of the most needless controversies in the history of science' and by David Hull as 'an explicable embarrassment'. The literature on this topic consists mainly of explaining why the controversy occurred and what factors prevented it from being resolved. Regrettably, little or no mention is made of the issues that figured in its resolution. This paper deals with the latter topic and in doing so reorients the focus of the debate as one between Karl Pearson and R. A. Fisher rather than between the biometricians and the Mendelians. One reason for this reorientation is that Pearson's own work in 1904 and 1909 suggested that Mendelism and biometry could, to some extent, be made compatible, yet he remained steadfast in his rejection of Mendelism. The interesting question then is why Fisher, who was also a proponent of biometric methods, was able to synthesise the two traditions in a way that Pearson either could not or would not. My answer to this question involves an analysis of the ways in which different kinds of assumptions were used in modelling Mendelian populations. I argue that it is these assumptions, which lay behind the statistical techniques of Pearson and Fisher, that can be isolated as the source of Pearson's rejection of Mendelism and Fisher's success in the synthesis.
According to the Imprecise Credence Framework (ICF), a rational believer's doxastic state should be modelled by a set of probability functions rather than a single probability function, namely, the set of probability functions allowed by the evidence (Joyce). Roger White has recently given an arresting argument against the ICF, which has garnered a number of responses. In this article, I attempt to cast doubt on his argument. First, I point out that it's not an argument against the ICF per se, but an argument for the Principle of Indifference. Second, I present an argument that's analogous to White's. I argue that if White's premises are true, the premises of this argument are too. But the premises of my argument entail something obviously false. Therefore, White's premises must not all be true.
At the end of his book "La conciencia inexplicada", Juan Arana points out that nomology, explanation according to the laws of nature, requires a nomogony, an account of the origin of the laws. That is, the order we observe in the natural world demands something prior that posits that specific order. Since the scientific revolution, the best way to explain that nomology has been through mathematics. In recent decades, however, a number of proposals have been offered based on mathematical models that would ground many aspects of reality. Two clear examples come from Roger Penrose and Max Tegmark. This drives us to think of mathematics as not only nomological but also nomogonical. Can Nature be founded on mathematics, as some physicists and mathematicians claim? And in that case, would it be pertinent to seek a nomo-genesis of this kind in the constitution of consciousness?
This case study focuses on Roger Boisjoly's attempt to prevent the launch of the Challenger and subsequent quest to set the record straight despite negative consequences. Boisjoly's experiences before and after the Challenger disaster raise numerous ethical issues that are integral to any explanation of the disaster and applicable to other management situations. Underlying all these issues, however, is the problematic relationship between individual and organizational responsibility. In analyzing this fundamental issue, this paper has two objectives: first, to demonstrate the extent to which the ethical ambiguity that permeates the relationship between individual and organizational responsibility contributed to the Challenger disaster; second, to reclaim the meaning and importance of individual responsibility within the diluting context of large organizations.
I document some of the main evidence showing that E. S. Pearson rejected the key features of the behavioral-decision philosophy that became associated with the Neyman-Pearson Theory of statistics (NPT). I argue that NPT principles arose not out of behavioral aims, where the concern is solely with behaving correctly sufficiently often in some long run, but out of the epistemological aim of learning about causes of experimental results (e.g., distinguishing genuine from spurious effects). The view Pearson did hold gives a deeper understanding of NPT tests than their typical formulation as accept-reject routines, against which criticisms of NPT are really directed. The Pearsonian view that emerges suggests how NPT tests may avoid these criticisms while still retaining what is central to these methods: the control of error probabilities.
This is a critical review of Roger Crisp's The Cosmos of Duty. The review praises the book but, among other things, takes issue with some of Crisp's criticisms of Sidgwick's view that resolution of the free will problem is of limited significance to ethics and with Crisp's claim that in Methods III.xiii Sidgwick defends an axiom of prudence that undergirds rational egoism.
Summary: Long-standing claims have been made for nearly the entire twentieth century that the biometrician Karl Pearson and his colleague W. F. R. Weldon rejected Mendelism as a theory of inheritance. It is shown that at the end of the nineteenth century Pearson considered various theories of inheritance (including Francis Galton's law of ancestral heredity for characters underpinned by continuous variation), and by 1904 he "accepted the fundamental idea of Mendel" as a theory of inheritance for discontinuous variation. Moreover, in 1909, he suggested a synthesis of biometry and Mendelism. Many attempts were made by a number of geneticists (including R. A. Fisher in 1936) to use Pearson's chi-square (χ2, P) goodness-of-fit test on Mendel's data, which produced results that were "too good to be true"; Weldon had reached the same conclusion in 1902, but his results were never acknowledged. The geneticist and arch-rival of the biometricians, William Bateson, was instead exceptionally critical of this work and interpreted it as Weldon's rejection of Mendelism. Whilst scholarship on Mendel by historians of science in the last 18 years has led to a balanced perspective of Mendel, it is suggested that a better balanced and more rounded view of the hereditarian-statistical work of Pearson, Weldon, and the biometricians is long overdue.
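The chi-square goodness-of-fit test at issue here is easy to demonstrate. The sketch below is my own illustration (not drawn from the paper): it applies Pearson's statistic to Mendel's published round-vs-wrinkled seed counts, 5474 : 1850, against the predicted 3 : 1 ratio; the helper functions are hypothetical names.

```python
# Sketch: Pearson's chi-square goodness-of-fit test on Mendel's seed data.
# A large p-value means the observed counts fit the 3:1 prediction closely --
# the pattern that, repeated across Mendel's experiments, Fisher and Weldon
# found "too good to be true".
from math import sqrt, erf

def chi_square_stat(observed, expected):
    """Pearson's X^2 = sum (O - E)^2 / E over the categories."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def p_value_df1(x):
    """Survival function of chi-square with 1 degree of freedom:
    P(X >= x) = 2 * (1 - Phi(sqrt(x))), Phi the standard Normal CDF."""
    phi = 0.5 * (1.0 + erf(sqrt(x) / sqrt(2.0)))
    return 2.0 * (1.0 - phi)

observed = [5474, 1850]                      # round, wrinkled seeds
total = sum(observed)
expected = [total * 3 / 4, total * 1 / 4]    # 3 : 1 Mendelian prediction
x = chi_square_stat(observed, expected)
p = p_value_df1(x)
```

For this single experiment the fit is unremarkably good (X^2 of roughly 0.26); the historical argument concerns the improbability of so many experiments all fitting this well.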
This paper traces the background to R. A. Fisher's multi-factorial theory of inheritance. It is argued that the traditional account is incomplete, and that Karl Pearson's well-known pre-Fisherian objections to the theory were in fact overcome by Pearson himself. It is further argued that Pearson's stated reasons for not accepting his own achievement have to be seen as a rationalization, standing in for deeper-seated metaphysical objections to the Mendelian paradigm of a type not readily discussed in a formal scientific paper. The apparent post-Fisherian continued acceptance of Pearson's objections is presented as an interesting problem for the historian and sociologist.
Standard statistical measures of strength of association, although pioneered by Pearson deliberately to be acausal, are nowadays routinely used to measure causal efficacy. But their acausal origins have left them ill suited to this latter purpose. I distinguish between two different conceptions of causal efficacy, and argue that: (1) both conceptions can be useful; (2) the statistical measures only attempt to capture the first of them; (3) they are not fully successful even at this; (4) an alternative definition more squarely based on causal thinking not only captures the second conception, it can also capture the first one better too.
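The gap between association and causal efficacy is easy to exhibit in simulation. The sketch below is my own illustration, not taken from the paper: X and Y share a common cause Z, X has no causal influence on Y whatsoever, yet the Pearson correlation coefficient between them is strongly positive.

```python
# Sketch: a confounder Z drives both X and Y; there is no arrow X -> Y,
# so X's causal efficacy on Y is zero, but the measured association is high.
import random

random.seed(0)
n = 10_000
z = [random.gauss(0, 1) for _ in range(n)]
x = [zi + random.gauss(0, 0.5) for zi in z]   # Z -> X
y = [zi + random.gauss(0, 0.5) for zi in z]   # Z -> Y  (X plays no role)

def pearson_r(a, b):
    """Pearson product-moment correlation coefficient."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    va = sum((ai - ma) ** 2 for ai in a)
    vb = sum((bi - mb) ** 2 for bi in b)
    return cov / (va * vb) ** 0.5

r = pearson_r(x, y)   # close to the theoretical value 1 / 1.25 = 0.8
```

Intervening on X (setting it independently of Z) would drive the association to zero, which is the sense in which an acausal measure of association cannot by itself serve as a measure of causal efficacy.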
Comments on Roger Ariew's "Descartes and Leibniz as Readers of Suarez," presented at Francisco Suarez, S.J.: Last Medieval or First Early Modern?, London, Ontario, University of Western Ontario, September 2008.
Roger Crisp distinguishes a positive and a negative aspect of the buck-passing account of goodness (BPA), and argues that the positive account should be dropped in order to avoid certain problems, in particular, that it implies eliminativism about value. This eliminativism involves what I call an ontological claim, the claim that there is no real property of goodness, and an error theory, the claim that all value talk is false. I argue first that the positive aspect of the BPA is necessary to explain the negative aspect. I accept the ontological claim but argue that this does not imply any sort of error theory about value.