This is the first comprehensive study in English of Voltaire's contes philosophiques--the philosophical tales for which he is best remembered and which include his masterpiece Candide. Pearson situates each story in its historical and intellectual context and offers new readings in light of modern critical thinking. He rejects the traditional view that Voltaire's contes were the private expression of his philosophical perplexity, and argues that it is narrative that is Voltaire's essential mode of thought. His book is a witty, lucid, and scholarly guide to the "fables of reason" through which Voltaire's skepticism undermined the contemporary religious and philosophical explanations of human experience.
Recent and rapid technological developments on many fronts have created in our society some extremely difficult moral predicaments. Previous generations have not had to face the dilemmas posed by, for example, the availability of safe abortions, sperm banks and prostaglandins. They have not had to come to terms with an unchecked exploitation of natural resources heralding imminent ecological crisis, or, worst of all, with the recognition that only in this current generation do people have the capacity to destroy themselves and their environment. This book seeks to show how, and why, Seventh-day Adventism has addressed these moral issues, and that the ethical questions arising from these issues are especially relevant to the Adventist church and its development. Dr Pearson looks specifically at the moral decisions Adventists have made in the area of human sexuality, on such issues as contraception, abortion, the role and status of women, divorce and homosexuality, from the beginnings of the movement to 1985. He seeks to put such decision-making in perspective by providing the general social context in which it took place, and shows how Ellen White (whose charismatic leadership held the movement together in its first fifty years) has been a major source of moral authority in the Adventist church - her writings continuing to exercise authority in a contemporary society of turmoil and change. This important book, which conveys something of the general ethos of Adventism, is the first to investigate the ethics of the movement, and so fills a notable gap in the literature.
What follows is a dialogue, in the Platonic sense, concerning the justifications for "business ethics" as a vehicle for asking questions about the values of modern business organisations. The protagonists are the authors, Gordon Pearson – a pragmatist and sceptic where business ethics is concerned – and Martin Parker – a sociologist and idealist who wishes to be able to ask ethical questions of business. By the end of the dialogue we come to no agreement on the necessity or justification for business ethics, but on the way discuss the uses of philosophy, the meanings of integrity and trust, McDonald's, a hypothetical torture manufacturer and various other matters.
"A remarkable book that influenced the scientific thought of an entire generation."-- Dictionary of Scientific Biography A major statement of the language, method, and concepts of the physical sciences, this 1892 volume traces not only the history of experimental investigation but also the efforts of philosophic minds to state and organize their findings intelligently. A classic in the philosophy of science, its author is the founder of modern statistics. Karl Pearson was among the most influential university teachers of his (...) era, and he possessed a remarkable ability to captivate both students and casual listeners. In The Grammar of Science, his most widely read book, he introduced the concept of a general methodology underlying all science, and thus made one of the great contributions to modern thought. 1957 ed. (shrink)
Some critics of the accounting/auditing profession in the United States claim that independence-related quality control problems are the cause of an increased number of alleged audit failures. Certified public accountants (CPAs) were queried regarding independence impairment in their profession. Questionnaire results indicate a number of CPAs believe independence deficiencies exist, and some CPAs admit to personal independence impairment.
In this paper I examine the prevailing assumption that there is a right to procreate and question whether there exists a coherent notion of such a right. I argue that we should question any and all procreative activities, not just alternative procreative means and contexts. I suggest that clinging to the assumption of a right to procreate prevents serious scrutiny of reproductive behavior and that, instead of continuing to embrace this assumption, attempts should be made to provide a proper foundation for it. I argue that the focus of procreative activities and discourse on reproductive ethics should be on obligations instead of rights, as rights talk tends to obfuscate recognition of obligations toward others, particularly those who bear the most significant burdens of the procreative process. I examine some possible foundations of a right to procreate as well as John Robertson's thoughtful account of "procreative liberty" but conclude that at the present time there exists no compelling account of a right to procreate. Finally, I conclude that in the absence of a satisfactory account of a right to procreate, we should refrain from grounding practices or policies on the assumption that there is such a right.
This study discusses how perceptions of ethics are formed by certified public accountants (CPAs). Theologians are used as a point of comparison. When considering CPA ethical dilemmas, both subject groups in this research project viewed confidentiality and independence as more important than recipient of responsibility and seriousness of breach. Neither group, however, was insensitive to any of the factors presented for its consideration. CPA reactions to ethical dilemmas were governed primarily by provisions of the CPA ethics code; conformity to that code may well be evidence of higher stage moral reasoning.
This dialogue engages with the ethics and politics of capitalism, and enacts a debate between two participants who have divergent views on these matters. Beginning with a discussion concerning definitions of capitalism, it moves on to cover issues concerning our different understandings of the costs and benefits of global capitalist systems. This then leads into a debate about the nature and purposes of regulation, in terms of whether regulation is intended to make competition work better for consumers, or to prevent negative outcomes for citizens. The conclusion speculates about the usefulness or otherwise of this Socratic method of dialogue.
Financial statement users must believe that external auditors are free from management control, or users will doubt the verity of auditors' representations. Although U.S.-based auditing firms claim they are independent of their corporate clients, research has demonstrated that many individuals and groups perceive the situation otherwise. A proposal for enhancing perceptions of auditor independence is offered in this article. The proposal calls for an auditor-administered educational program, complemented by corporate audit committee involvement to lend credibility to auditors' claims.
Regeneration in arthropods and amphibians follows an analogous principle making comparisons between the two phyla possible. Larval arthropods and amphibians possess powers of epimorphic regeneration which wane for many species of these phyla with the completion of metamorphosis or the cessation of moulting. In those species which retain, post-maturationally, the ability to form a regenerative blastema, larval characteristics are carried into the adult and reproductive stages of these organisms. These include many species of urodeles, ametabolous insects, crustaceans, myriapods and arachnids. The long-standing distinction between embryonic regulation and true epimorphosis would thus appear to be a difference of degree rather than kind.
Variation or rearrangement of regulatory genes is responsible for cellular malignant change. These types of chromosomal variations also produce heterochrony or paedomorphic evolution at the organismal level. Analogously, neoplasia represents a cellular macroevolutionary event, and a tumour can be said to be an evolved population of cells. To understand this cellular evolution to malignancy, it may be necessary to go beyond a clonal selection (adaptationist) explanation of neoplastic alteration. In the pericellular environment natural selection consists of the organizational restraints of surrounding cells as well as the host's immunological surveillance and non-specific monocyte-macrophage systems. Indirect evidence suggests that success for the neoplasm depends not upon clonal selection, but solely upon a genetic methodology—the function of which is to elude selection. The author has coined the term cellular heterochrony to illustrate analogic similarities in the molecular modes of speciation between anaplastic cancer cells and the heterochronic evolution of organisms. By reverting to a juvenile (embryonic) repertoire of cellular behaviour a tumour secures its own tenure or niche by usurping the host's armamentarium of selection forces, employing many of the same or similar methods by which implanting and invading tissues of the mammalian embryo forestall maternal detection and rejection. A number of ways by which the tumour blocks, subverts or evades selection are discussed.
It is posited that the initiating event of amphibian limb regeneration is retrodifferentiation of what are to become the developing cells of the blastema. These cells reiterate a larval or premetamorphic ontogenic repertoire, induced by elevated levels of prolactin with adequate innervation. Subsequent redifferentiation of the blastema cells occurs, controlled by thyroxine and innervation. This temporal displacement of cellular morphologic characters in regeneration should be looked upon as a function of the ability to reiterate larval characters and subsequently metamorphose. If correct, this would explain why amphibians which metamorphose only once lose the ability to regenerate postmetamorphically. An exception to this, Xenopus laevis, an anuran which can epimorphically regenerate to some extent, will be discussed.
There are two motivations commonly ascribed to historical actors for taking up statistics: to reduce complicated data to a mean value (e.g., Quetelet), and to take account of diversity (e.g., Galton). Different motivations will, it is assumed, lead to different methodological decisions in the practice of the statistical sciences. Karl Pearson and W. F. R. Weldon are generally seen as following directly in Galton's footsteps. I argue for two related theses in light of this standard interpretation, based on a reading of several sources in which Weldon, independently of Pearson, reflects on his own motivations. First, while Pearson does approach statistics from this "Galtonian" perspective, he is, consistent with his positivist philosophy of science, utilizing statistics to simplify the highly variable data of biology. Weldon, on the other hand, is brought to statistics by a rich empiricism and a desire to preserve the diversity of biological data. Secondly, we have here a counterexample to the claim that divergence in motivation will lead to a corresponding separation in methodology. Pearson and Weldon, despite embracing biometry for different reasons, settled on precisely the same set of statistical tools for the investigation of evolution.
Roger North's The Musicall Grammarian 1728 is a treatise on musical eloquence in all its branches. Of its five parts, I and II, on the orthoepy, orthography and syntax of music, constitute a grammar; III and IV, on the arts of invention and communication, form a rhetoric; and V, on etymology, consists of a history. Two substantial chapters of commentary introduce the text, which is edited here for the first time in its entirety: Jamie Kassler places the treatise within the broader context not only of North's musical and non-musical writings but also of their relation to the intellectual ferment of the seventeenth and eighteenth centuries, and Mary Chan describes physical and textual aspects of the treatise as evidence for North's processes of thinking about musical thinking.
Professor Sir Roger Penrose's work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes. The first volume covers the beginnings of a career that is ground-breaking from the outset. Inspired by courses given by Dirac and Bondi, much of the early published work involves linking general relativity with tensor systems. Among his early works are the seminal 1955 paper, 'A Generalized Inverse for Matrices', his previously unpublished PhD and St John's College Fellowship theses, and, from 1967, his Adams Prize-winning essay on the structure of space-time. Add to this his 1965 paper, 'Gravitational collapse and space-time singularities', and the 1967 paper that introduced a remarkable new theory, 'Twistor algebra', and this becomes a truly stellar procession of works on mathematics and cosmology.
Professor Sir Roger Penrose is one of the truly original thinkers of our time. He has made several remarkable contributions to science, from quantum physics and theories of human consciousness to relativity theory and observations on the structure of the universe. Unusually for a scientist, some of his ideas have crossed over into the public arena. Now his work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes.
Professor Sir Roger Penrose's work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes. Many important realizations concerning twistor theory occurred during the short period of this third volume, providing a new perspective on the way that mathematical features of the complex geometry of twistor theory relate to actual physical fields. Following on from the nonlinear graviton construction, a twistor construction was found for (anti-)self-dual electromagnetism allowing the general (anti-)self-dual Yang-Mills field to be obtained. It became clear that some features of twistor contour integrals could be understood in terms of holomorphic sheaf cohomology. During this period, the Oxford research group founded the informal publication, Twistor Newsletter. This volume also contains the influential Weyl curvature hypothesis and new forms of Penrose tiles.
Professor Sir Roger Penrose's work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes. Among the new developments that occurred during this period was the introduction of a particular notion of 'quasi-local mass-momentum and angular momentum', the topic of Penrose's Royal Society paper. Many encouraging results were initially obtained but, later, difficulties began to emerge and remain today. Also, an extensive paper (with Eastwood and Wells) gives a thorough account of the relation between twistor cohomology and massless fields. This volume witnesses Penrose's increasing conviction that the puzzling issue of quantum measurement could only be resolved by the appropriate unification of quantum mechanics with general relativity, where that union must involve an actual change in the rules of quantum mechanics as well as in space-time structure. Penrose's first incursions into a possible relation between consciousness and quantum state reduction are also covered here.
Professor Sir Roger Penrose's work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes. Publication of The Emperor's New Mind (OUP 1989) had caused considerable debate and Penrose's responses are included in this volume. Arising from this came the idea that large-scale quantum coherence might exist within the conscious brain, and actual conscious experience would be associated with a reduction of the quantum state. Within this collection, Penrose also proposes that a twistor might usefully be regarded as a source (or 'charge') for a massless field of spin 3/2, suggesting that the twistor space for a Ricci-flat space-time might actually be the space of such possible sources. Towards the end of the volume, Penrose begins to develop a quite different approach to incorporating full general relativity into twistor theory. This period also sees the origin of the Diósi-Penrose proposal.
Professor Sir Roger Penrose's work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes. This sixth volume describes an actual experiment to measure the length of time that a quantum superposition might last (developing the Diósi-Penrose proposal). It also discusses the significant progress made in relation to incorporating the 'googly' information for a gravitational field into the structure of a curved twistor space. Penrose also covers such things as the geometry of light rays in relation to twistor-space structures, the utility of complex numbers in drawing three-dimensional shapes, and the geometrical representation of different types of musical scales. The turn of the millennium was also an opportunity to reflect on progress in many areas up until that point.
Professor Sir Roger Penrose's work, spanning fifty years of science, with over five thousand pages and more than three hundred papers, has been collected together for the first time and arranged chronologically over six volumes, each with an introduction from the author. Where relevant, individual papers also come with specific introductions or notes. Developing ideas sketched in the first volume, twistor theory is now applied to genuine issues of physics, and there are the beginnings of twistor diagram theory (an analogue of Feynman diagrams). This collection includes joint papers with Stephen Hawking, and uncovers certain properties of black holes. The idea of cosmic censorship is also first proposed. Along completely different lines, the first methods of aperiodic tiling for the Euclidean plane, which came to be known as Penrose tiles, are described. This volume also contains Penrose's three prize-winning essays for the Gravity Foundation (two second places with both Ezra Newman and Stephen Hawking, and a solo first place for 'The Non-linear graviton').
I document some of the main evidence showing that E. S. Pearson rejected the key features of the behavioral-decision philosophy that became associated with the Neyman-Pearson Theory of statistics (NPT). I argue that NPT principles arose not out of behavioral aims, where the concern is solely with behaving correctly sufficiently often in some long run, but out of the epistemological aim of learning about causes of experimental results (e.g., distinguishing genuine from spurious effects). The view Pearson did hold gives a deeper understanding of NPT tests than their typical formulation as accept-reject routines, against which criticisms of NPT are really directed. The Pearsonian view that emerges suggests how NPT tests may avoid these criticisms while still retaining what is central to these methods: the control of error probabilities.
This case study focuses on Roger Boisjoly's attempt to prevent the launch of the Challenger and subsequent quest to set the record straight despite negative consequences. Boisjoly's experiences before and after the Challenger disaster raise numerous ethical issues that are integral to any explanation of the disaster and applicable to other management situations. Underlying all these issues, however, is the problematic relationship between individual and organizational responsibility. In analyzing this fundamental issue, this paper has two objectives: first, to demonstrate the extent to which the ethical ambiguity that permeates the relationship between individual and organizational responsibility contributed to the Challenger disaster; second, to reclaim the meaning and importance of individual responsibility within the diluting context of large organizations.
Despite the widespread use of key concepts of the Neyman–Pearson (N–P) statistical paradigm—type I and II errors, significance levels, power, confidence levels—they have been the subject of philosophical controversy and debate for over 60 years. Both current and long-standing problems of N–P tests stem from unclarity and confusion, even among N–P adherents, as to how a test's (pre-data) error probabilities are to be used for (post-data) inductive inference as opposed to inductive behavior. We argue that the relevance of error probabilities is to ensure that only statistical hypotheses that have passed severe or probative tests are inferred from the data. The severity criterion supplies a meta-statistical principle for evaluating proposed statistical inferences, avoiding classic fallacies from tests that are overly sensitive, as well as those not sensitive enough to particular errors and discrepancies.
Contents:
1 Introduction and overview
1.1 Behavioristic and inferential rationales for Neyman–Pearson (N–P) tests
1.2 Severity rationale: induction as severe testing
1.3 Severity as a meta-statistical concept: three required restrictions on the N–P paradigm
2 Error statistical tests from the severity perspective
2.1 N–P test T(): type I, II error probabilities and power
2.2 Specifying test T() using p-values
3 Neyman's post-data use of power
3.1 Neyman: does failure to reject H warrant confirming H?
4 Severe testing as a basic concept for an adequate post-data inference
4.1 The severity interpretation of acceptance (SIA) for test T()
4.2 The fallacy of acceptance (i.e., an insignificant difference): Ms Rosy
4.3 Severity and power
5 Fallacy of rejection: statistical vs. substantive significance
5.1 Taking a rejection of H0 as evidence for a substantive claim or theory
5.2 A statistically significant difference from H0 may fail to indicate a substantively important magnitude
5.3 Principle for the severity interpretation of a rejection (SIR)
5.4 Comparing significant results with different sample sizes in T(): large n problem
5.5 General testing rules for T(), using the severe testing concept
6 The severe testing concept and confidence intervals
6.1 Dualities between one and two-sided intervals and tests
6.2 Avoiding shortcomings of confidence intervals
7 Beyond the N–P paradigm: pure significance, and misspecification tests
8 Concluding comments: have we shown severity to be a basic concept in a N–P philosophy of induction?
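The pre-data versus post-data contrast at issue here can be made concrete with a small numerical sketch. The code below is not from the paper; it is a generic one-sided test of a normal mean with known sigma (the values of mu0, sigma, n, alpha and the observed mean are illustrative assumptions), showing the pre-data quantities (rejection cutoff, power) alongside a post-data severity-style calculation in the spirit of the abstract.

```python
# A minimal sketch, assuming a one-sided test of H0: mu <= mu0 vs H1: mu > mu0
# for a normal mean with known sigma. All numeric values are made up.
from scipy.stats import norm

mu0, sigma, n, alpha = 0.0, 1.0, 100, 0.05
se = sigma / n ** 0.5
c_alpha = norm.ppf(1 - alpha)          # pre-data cutoff for the standardized statistic

def power(mu1):
    """Pre-data probability of rejecting H0 when the true mean is mu1."""
    return 1 - norm.cdf(c_alpha - (mu1 - mu0) / se)

def severity_of_rejection(xbar_obs, mu1):
    """Post-data assessment for the claim 'mu > mu1' given the observed mean:
    the probability of a less extreme result were mu no larger than mu1."""
    return norm.cdf((xbar_obs - mu1) / se)

print(round(power(0.2), 3))                         # power against mu = 0.2
print(round(severity_of_rejection(0.25, 0.1), 3))   # severity for 'mu > 0.1' given xbar = 0.25
print(round(severity_of_rejection(0.25, 0.2), 3))   # lower severity for the stronger claim 'mu > 0.2'
```

With these made-up numbers, severity for "mu > mu1" falls as mu1 is pushed closer to the observed mean, which illustrates the sense in which a statistically significant result need not indicate a substantively large discrepancy.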
Roger Crisp distinguishes a positive and a negative aspect of the buck-passing account of goodness (BPA), and argues that the positive account should be dropped in order to avoid certain problems, in particular, that it implies eliminativism about value. This eliminativism involves what I call an ontological claim, the claim that there is no real property of goodness, and an error theory, the claim that all value talk is false. I argue first that the positive aspect of the BPA is necessary to explain the negative aspect. I accept the ontological claim but argue that this does not imply any sort of error theory about value.
The main thesis of the paper is that in the case of modern statistics, the differences between the various concepts of models were the key to its formative controversies. The mathematical theory of statistical inference was mainly developed by Ronald A. Fisher, Jerzy Neyman, and Egon S. Pearson. Fisher on the one side and Neyman–Pearson on the other were often involved in polemical controversy. The common view is that Neyman and Pearson made Fisher's account more stringent mathematically. It is argued, however, that there is a profound theoretical basis for the controversy: both sides held conflicting views about the role of mathematical modelling. At the end, the influential programme of Exploratory Data Analysis is considered to be advocating another, more instrumental conception of models. Contents: Introduction; Models in statistics—'of what population is this a random sample?'; The fundamental lemma; Controversy about models; Exploratory data analysis as a model-critical approach.
According to the Imprecise Credence Framework (ICF), a rational believer's doxastic state should be modelled by a set of probability functions rather than a single probability function, namely, the set of probability functions allowed by the evidence (Joyce). Roger White has recently given an arresting argument against the ICF, which has garnered a number of responses. In this article, I attempt to cast doubt on his argument. First, I point out that it's not an argument against the ICF per se, but an argument for the Principle of Indifference. Second, I present an argument that's analogous to White's. I argue that if White's premises are true, the premises of this argument are too. But the premises of my argument entail something obviously false. Therefore, White's premises must not all be true.
Comments on Roger Ariew's "Descartes and Leibniz as Readers of Suarez," presented at Francisco Suarez, S.J.: Last Medieval or First Early Modern?, London, Ontario, University of Western Ontario, September 2008.
The debate between the Mendelians and the (largely Darwinian) biometricians has been referred to by R. A. Fisher as 'one of the most needless controversies in the history of science' and by David Hull as 'an explicable embarrassment'. The literature on this topic consists mainly of explaining why the controversy occurred and what factors prevented it from being resolved. Regrettably, little or no mention is made of the issues that figured in its resolution. This paper deals with the latter topic and in doing so reorients the focus of the debate as one between Karl Pearson and R. A. Fisher rather than between the biometricians and the Mendelians. One reason for this reorientation is that Pearson's own work in 1904 and 1909 suggested that Mendelism and biometry could, to some extent, be made compatible, yet he remained steadfast in his rejection of Mendelism. The interesting question then is why Fisher, who was also a proponent of biometric methods, was able to synthesise the two traditions in a way that Pearson either could not or would not. My answer to this question involves an analysis of the ways in which different kinds of assumptions were used in modelling Mendelian populations. I argue that it is these assumptions, which lay behind the statistical techniques of Pearson and Fisher, that can be isolated as the source of Pearson's rejection of Mendelism and Fisher's success in the synthesis.
"The Emperor's New Mind" by Roger Penrose has received a great deal of both praise and criticism. This review discusses philosophical aspects of the book that form an attack on the "strong" AI thesis. Eight different versions of this thesis are distinguished, and sources of ambiguity diagnosed, including different requirements for relationships between program and behaviour. Excessively strong versions attacked by Penrose (and Searle) are not worth defending or attacking, whereas weaker versions remain problematic. Penrose (like Searle) regards the (...) notion of an algorithm as central to AI, whereas it is argued here that for the purpose of explaining mental capabilities the architecture of an intelligent system is more important than the concept of an algorithm, using the premise that what makes something intelligent is not what it does but how it does it. What needs to be explained is also unclear: Penrose thinks we all know what consciousness is and claims that the ability to judge Go "del's formula to be true depends on it. He also suggests that quantum phenomena underly consciousness. This is rebutted by arguing that our existing concept of "consciousness" is too vague and muddled to be of use in science. This and related concepts will gradually be replaced by a more powerful theory-based taxonomy of types of mental states and processes. The central argument offered by Penrose against the strong AI thesis depends on a tempting but unjustified interpretation of Goedel's incompleteness theorem. Some critics are shown to have missed the point of his argument. A stronger criticism is mounted, and the relevance of mathematical Platonism analysed. Architectural requirements for intelligence are discussed and differences between serial and parallel implementations analysed. (shrink)
In Philosophical Problems of Statistical Inference, Seidenfeld argues that the Neyman-Pearson (NP) theory of confidence intervals is inadequate for a theory of inductive inference because, for a given situation, the 'best' NP confidence interval, CI_λ, sometimes yields intervals which are trivial (i.e., tautologous). I argue that (1) Seidenfeld's criticism of trivial intervals is based upon illegitimately interpreting confidence levels as measures of final precision; (2) for the situation which Seidenfeld considers, the 'best' NP confidence interval is not CI_λ as Seidenfeld suggests, but rather a one-sided interval CI_0; and since CI_0 never yields trivial intervals, NP theory escapes Seidenfeld's criticism entirely; (3) Seidenfeld's criterion of non-triviality is inadequate, for it leads him to judge an alternative confidence interval, CI_alt, superior to CI_λ although CI_alt results in counterintuitive inferences. I conclude that Seidenfeld has not shown that the NP theory of confidence intervals is inadequate for a theory of inductive inference.
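To make the one-sided versus two-sided contrast concrete, here is a minimal sketch of standard Neyman-Pearson interval estimators for a normal mean with known sigma. It is only a generic illustration, not Seidenfeld's example and not the CI_λ, CI_0 or CI_alt estimators discussed above; all numbers are invented.

```python
# A minimal sketch, assuming a normal mean with known sigma: a two-sided interval
# and a one-sided (lower-bound) interval at the same nominal confidence level.
from scipy.stats import norm

def two_sided_ci(xbar, sigma, n, level=0.95):
    """Symmetric interval (xbar - z*se, xbar + z*se)."""
    z = norm.ppf(1 - (1 - level) / 2)
    half = z * sigma / n ** 0.5
    return (xbar - half, xbar + half)

def one_sided_lower_ci(xbar, sigma, n, level=0.95):
    """One-sided interval of the form (lower_bound, +infinity)."""
    z = norm.ppf(level)
    return (xbar - z * sigma / n ** 0.5, float("inf"))

print(two_sided_ci(10.2, 2.0, 25))        # e.g. (9.42, 10.98)
print(one_sided_lower_ci(10.2, 2.0, 25))  # e.g. (9.54, inf)
```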
Roger Sansom and Robert N. Brandon (eds.): Integrating Evolution and Development: From Theory to Practice. Thomas A. C. Reydon (Institute of Philosophy & Center for Philosophy and Ethics of Science (ZEWW), Leibniz Universität Hannover, Im Moore 21, 30167 Hannover, Germany), Acta Biotheoretica 59(1): 81-86, DOI 10.1007/s10441-010-9121-x.
My pleasure in being here, at the Studiecentrum Soeterbeeck, to discuss the book Roger Scruton wrote on beauty, is twofold. It so happens that I am finishing a book on facial expression and facial beauty, and the chapter I sent to Roger to request his comments resurfaced unopened in my own mail box last week. Apparently something went wrong in the mail. Today I might get some of those comments. Secondly, reading Roger's book, an impression of a kindred spirit has stuck with me throughout.[1] Sometimes, though, something like an ungrounded preference surfaces, which, for Roger, clearly has intuitive force, maybe even the force of a conclusion, but for me this doesn't always ring true. I only mention two instances where my own preferences would be different. One is where, after rightly criticising the reverence allotted to Duchamp's Fountain, in a single sentence (on p. 98) both Radiohead and Brahms are mentioned, in an obvious effort to disqualify the former. The other is where he defends film as an art by comparing it to traditional art, by pointing to shots from an Ingmar Bergman movie, which "would sit on your wall like an engraving, resonant, engaging and composed" (p. 102). What the incidental surfacing of such preferences makes available to us is that doing aesthetics is not a merely technical philosophical endeavour, but involves art criticism, from time to time. If you don't love art or its core values, how could you do aesthetics? And there is a deeper thought behind this in Roger's writings: that the use of taste belongs to the good life.[2] All this, also, indicates my predicament, here and now. I feel most inclined...
In [Dutilh Novaes, 'Medieval obligationes as logical games of consistency maintenance', Synthese (2004)], I proposed a reconstruction of Walter Burley's theory of obligationes, based on the idea that Burley's theory of obligationes could be seen as a logical game of consistency maintenance. In the present paper, I intend to test the game hypothesis on another important theory of obligationes, namely Roger Swyneshed's theory. In his treatise on obligationes [edited by P.V. Spade, cf. Spade, History and Philosophy of Logic 3 (1982), 1-32], Swyneshed introduced significant modifications to the general framework of obligationes. To compare the two theories, I apply the same formal apparatus used in the previous paper. It will become patent that Swyneshed's theory is considerably different from Burley's, among other reasons because the dynamic aspects that play a major role in the latter are simply not present in the former. My conclusion is that Swyneshed's version of obligationes is not directed towards consistency maintenance, but rather towards inference recognition, and that it is, from a game-theoretical perspective, a less interesting theory than Burley's.
What do real philosophers do? What are the big philosophical issues of today? Clear and engaging, New British Philosophy contains sixteen fascinating interviews with some of the top philosophers working in Britain today, on topics that range from music to the mind and feminism to the future of philosophy. This unique snapshot of philosophy today includes interviews with: Ray Monk, Nigel Warburton, Aaron Ridley, Jonathan Wolff, Roger Crisp, Rae Langton, Miranda Fricker, M.G.F. Martin, Timothy Williamson, Tim Crane, Robin Le Poidevin, Christina Howells, Simon Critchley, Simon Glendinning, Stephen Mulhall and Keith Ansell Pearson.
Chow pays lip service (but not much more!) to Type I errors and thus opts for a hard (all-or-none) .05 level of significance (Superego of Neyman/Pearson theory; Gigerenzer 1993). Most working scientists disregard Type I errors and thus utilize a soft .05 level (Ego of Fisher; Gigerenzer 1993), which lets them report gradations of significance (e.g., p.
In the past, hypothesis testing in medicine has employed the paradigm of the repeatable experiment. In statistical hypothesis testing, an unbiased sample is drawn from a larger source population, and a calculated statistic is compared to a preassigned critical region, on the assumption that the comparison could be repeated an indefinite number of times. However, repeated experiments often cannot be performed on human beings, due to ethical or economic constraints. We describe a new paradigm for hypothesis testing which uses only rearrangements of data present within the observed data set. The token swap test, based on this new paradigm, is applied to three data sets from cardiovascular pathology, and computational experiments suggest that the token swap test satisfies the Neyman-Pearson condition.
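The "rearrangements of the observed data" paradigm is the familiar permutation-test idea. The sketch below is a generic two-sample permutation test offered only as an illustration of that paradigm; it is not the token swap test defined in the paper, and the data and function names are invented.

```python
# A minimal sketch of a permutation (rearrangement) test: the p-value is computed
# solely from reshufflings of the observed values, with no appeal to a larger
# source population. Data are made up for illustration.
import random

def permutation_test(group_a, group_b, n_permutations=10_000, seed=0):
    """Estimate a p-value for the difference in group means by rearranging labels."""
    rng = random.Random(seed)
    observed = abs(sum(group_a) / len(group_a) - sum(group_b) / len(group_b))
    pooled = list(group_a) + list(group_b)
    n_a = len(group_a)
    extreme = 0
    for _ in range(n_permutations):
        rng.shuffle(pooled)                          # rearrange the observed values
        diff = abs(sum(pooled[:n_a]) / n_a -
                   sum(pooled[n_a:]) / (len(pooled) - n_a))
        if diff >= observed:
            extreme += 1
    return (extreme + 1) / (n_permutations + 1)      # add-one to avoid a zero p-value

print(permutation_test([4.1, 3.8, 4.4, 4.0], [3.2, 3.5, 3.1, 3.6]))
```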
Despite its widespread use in science, the Neyman-Pearson Theory of Statistics (NPT) has been rejected as inadequate by most philosophers of induction and statistics. They base their rejection largely upon what the author refers to as after-trial criticisms of NPT. Such criticisms attempt to show that NPT fails to provide an adequate analysis of specific inferences after the trial is made, and the data is known. In this paper, the key types of after-trial criticisms are considered and it is argued that each fails to demonstrate the inadequacy of NPT because each is based on judging NPT on the grounds of a criterion that is fundamentally alien to NPT. As such, each may be seen to either misconstrue the aims of NPT, or to beg the question against it.
Standard statistical measures of strength of association, although pioneered by Pearson deliberately to be acausal, nowadays are routinely used to measure causal efficacy. But their acausal origins have left them ill suited to this latter purpose. I distinguish between two different conceptions of causal efficacy, and argue that: (1) both conceptions can be useful; (2) the statistical measures only attempt to capture the first of them; (3) they are not fully successful even at this; (4) an alternative definition more squarely based on causal thinking not only captures the second conception, it can also capture the first one better too.
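One quick way to see why a Pearson-style association measure cannot, by itself, report causal efficacy is its symmetry: the same number comes out whichever variable is treated as the cause. The snippet below is only an illustration of that point, with a hand-rolled correlation function and invented data.

```python
# A minimal sketch: the Pearson correlation coefficient is symmetric in its
# arguments, so corr(x, y) == corr(y, x) regardless of any causal direction.
import statistics

def pearson_r(xs, ys):
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    return cov / (statistics.pstdev(xs) * statistics.pstdev(ys) * len(xs))

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.1, 9.8]              # roughly y = 2x, with noise
print(pearson_r(x, y), pearson_r(y, x))    # identical values either way
```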
If I have understood Pearson's use of "a practice" correctly, my main objection to his project is that it gives the current practices of teaching far too much normative force over the educational beliefs of teachers. While the principles of practical reasoning advocated by Pearson may serve to test the coherence of the various beliefs which are part of current practice, they do not suffice to test the reasonableness of such beliefs. To do this we need, at least for some of these beliefs, to draw upon the resources made available to us by such theoretical practices as psychology, philosophy, history, etc. None of these, of course, nullifies the significance of regarding teaching as a practice. Indeed, such a conception is a forceful reminder that the theoretical practices which are concerned with education need to focus on current teaching practices if they are to guard against the sort of empty rationalism despised by Oakeshott, while saying something to the teaching profession. This will not give us an educational theory, but rather theoretical perspectives on teaching as a practice.
This paper argues that in attempting to protect the religious life from the sullying influence of worldly affairs, Roger Williams participated, albeit unintentionally, in creating the economic conditions that led to the birth of American capitalism. Although Williams argued for a separation of church and state, he did so not in [...].
Roger Boscovich, an eighteenth-century figure standing halfway between Newton and Faraday, is traditionally regarded as a Newtonian philosopher. Nevertheless, following Berkson's suggestion, he may be seen as a forerunner of field theory. In this work we pursue that suggestion in order to bring out this possible contribution of Boscovich's.
David Lindberg presents the first critical edition of the text of Roger Bacon's classic work Perspectiva, prepared from Latin manuscripts, accompanied by a facing-page English translation, critical notes, and a full study of the text. Also included is an analysis of Bacon's sources, influence, and role in the emergence of the discipline of perspectiva. About Roger Bacon: Roger Bacon (c.1220-c.1292) is one of the most renowned thinkers of the Middle Ages, a philosopher-scientist praised and mythologized for his attack on authority and his promotion of what he called experimental science. He was a leading figure in the intellectual life of the thirteenth century, a campaigner for educational reform, and a major disseminator of Greek and Arabic natural philosophy and mathematical science. About Perspectiva: The science that Roger Bacon most fully mastered was perspectiva, the study of light and vision (what would later become the science of optics). His great treatment of the subject, the Perspectiva, written in about 1260, was the first book by a European to display a full mastery of Greek and Arabic treatises on the subject, and through it Bacon was instrumental in defining this scientific discipline for the next 350 years.
If one looks at the controversial premises of analytical approaches to fascism according to Roger Griffin, it is not surprising that a yawning distance has opened up between Marxist and non-Marxist schools of interpretation. In this situation whereby two camps are mutually ignorant of one another, it is certainly suggestive that the liberal British theoretician of fascism should put himself forward to play the role of a 'mediator', even if he faces the danger of significant criticism from both schools of interpretation. But Griffin's attempt takes place on a predominantly theoretical level. The author of this essay instead places the notion of revolution in historical-empirical perspective, in order to distinguish it from the account associated with (liberal) representatives of the 'new consensus'. He then examines, in particular, whether National Socialism represented a utopia which satisfied revolutionary aspirations. The author further asks whether fascism could separate itself from its (early) conservative support to an extent that would permit commentators to meaningfully identify a revolutionary breakthrough. And finally he clarifies what the modernizing achievements of fascism during its time in power actually were. Against this background, there does seem to be at least the possibility of a dialogue between the two approaches that would advance each of them.
According to Jim Pryor's dogmatism, when you have an experience with content p, you have prima facie justification to believe p that does not rest on your independent justification or evidence to believe any proposition. Although dogmatism is intuitive and seems to have an antisceptical punch, it has been targeted by different objections. In this paper I aim to answer the objections by Roger White according to which dogmatism is inconsistent with the Bayesian account of how evidence affects rational credences. If this were true, the rational acceptability of dogmatism would be seriously questionable. I respond that these objections don't get off the ground because they assume that experiences and reports of experience have the same evidential force, whereas the dogmatist is uncommitted to this assumption. I also elucidate what gives dogmatism its antisceptical punch by drawing from recent papers by Brian Weatherson, Peter Kung and Pryor himself in which alternative responses to White's challenge are delineated. I argue that my rejoinder is more complete and simpler than these responses, for the latter permit White's objections to go through in many cases, whereas my response doesn't. Furthermore, according to these responses, dogmatism is tenable only if Bayesianism is replaced with alternative formal frameworks, which is not a requirement of my rejoinder.
According to Roger Scruton, it is not possible for photographs to be representational art. Most responses to Scruton's scepticism are versions of the claim that Scruton disregards the extent to which intentionality features in photography; but these cannot force him to give up his notion of the ideal photograph. My approach is to argue that Scruton has misconstrued the role of causation in his discussion of photography. I claim that although Scruton insists that the ideal photograph is defined by its 'merely causal' provenance, in fact he fails to take the causal provenance of photographs seriously enough. To replace Scruton's notion of the ideal photograph, I offer a substantive account of the causal provenance of photographs, centred on the distinctive role of 'the photographic event'. I conclude that, with a proper understanding of the photographic process, we have good reason to re-open the question of photography as a representational art.
Summarizing a surrounding 200 pages, pages 179 to 190 of Shadows of the Mind contain a future dialog between a human identified as "Albert Imperator" and an advanced robot, the "Mathematically Justified Cybersystem", allegedly Albert's creation. The two have been discussing a Gödel sentence for an algorithm by which a robot society named SMIRC certifies mathematical proofs. The sentence, referred to in mathematical notation as Omega(Q*), is to be precisely constructed from a definition of SMIRC's algorithm. It can be interpreted as stating "SMIRC's algorithm cannot certify this statement." The robot has asserted that SMIRC never makes mistakes. If so, SMIRC's algorithm cannot certify the Gödel sentence, for that would make the statement false. But, if they can't certify it, what it says is true! Humans can understand that it is true, but mighty SMIRC cannot certify it. The dialog ends melodramatically as the robot, apparently unhinged by this revelation, claims to be a messenger of god, and the human shuts it down with a secret control.