Filling a gap in scholarship on 19th- and 20th-century religious thought, this book discusses the philosophy and theology of the influential Marburg School in Germany before 1914, focusing on the writings of Hermann Cohen, its leader, and on the Ritschlian theologian Wilhelm Herrmann, Karl Barth's teacher. In addition, Fisher examines Barth's earliest writings and clarifies the little-known liberal phase of Barth's theology.
Some of Quine’s critics charge that he arrives at a behavioristic account of linguistic meaning by starting from inappropriately behavioristic assumptions (Kripke 1982, 14; Searle 1987, 123). Quine has even written that this account of linguistic meaning is a consequence of his behaviorism (Quine 1992, 37). I take it that the above charges amount to the assertion that Quine assumes the denial of one or more of the following claims: (1) Language-users associate mental ideas with their linguistic expressions. (2) A language-user can have a private theory of linguistic meaning which guides his or her use of language. (3) Language learning relies on innate mechanisms. Call an antecedent denial of one or more of these claims illicit behaviorism. In this paper I show that Quine is prepared to grant, if only for the sake of argument, all three of the above claims. I argue that his claim that there is nothing in linguistic meaning beyond what is to be gleaned from overt behavior in observable circumstances is unscathed by these allowances (Quine 1992, 38). And I show that the behaviorism which Quine does assume should be viewed as a largely uncontroversial aspect of his evidential empiricism. I conclude that if one sets out to dismiss Quine’s arguments for internal-meaning skepticism, this dismissal should not be motivated by the charge that his conclusions rely on the illicitly behavioristic assumptions that some have suggested that they do.
Abstract: For William Blattner, Heidegger's phenomenology fails to demonstrate how a nonsuccessive temporal manifold can ‘generate’ the appropriate sequence of world-time nows. Without this he cannot explain the ‘derivative’ status of ordinary time. In this article I show that it is only Blattner's reconstruction that makes failure inevitable. Specifically, Blattner is wrong in the way he sets out the explanatory burden, arguing that the structure of world-time must meet the traditional requirements of ordinary time logic if the derivation is to succeed. He takes this to mean: mundane ‘tasks’, the contents of world-time nows, must form a transitive series, importing back into world-time the very structure that Heidegger says is derived by its levelling-off. I argue, instead, that world-time nows, seen at the level of lived content, can be quite ‘irrational’, but that this is perfectly consistent with the generative thesis. Adapting Blattner's useful suggestion that temporality is sequence building or ‘iterative’, I show that iteration does not manifest itself at the level of tasks but at the ‘existential’ level of my involvement in a task. Depriving that involvement of its expressive content is what accounts for the levelling-off of the world-time now and thus the derivation of the ordinary concept of time.
In this paper, we review recent neuroimaging investigations of disorders of consciousness and different disciplines' understanding of consciousness itself. We consider potential tests of consciousness, their legal significance, and how they map onto broader themes in U.S. statutory law pertaining to advance directives and surrogate decision-making. In the process, we outline a taxonomy of themes to illustrate and clarify the variance in state-law definitions of consciousness. Finally, we discuss broader scientific, ethical, and legal issues associated with the advent of neuroimaging for disorders of consciousness and conclude with policy recommendations that could help to mitigate confusion in this realm.
During the 1960s, Howard M. Temin (1934-1994) dared to advocate a "heretical" hypothesis that appeared to be at variance with the central dogma of molecular biology, understood by many to imply that information transfer in nature occurred only from DNA to RNA. Temin's provirus hypothesis offered a simple explanation of both virus replication and viral-induced cancer, stating that Rous sarcoma virus, an RNA virus, is replicated via a DNA intermediate. Popular accounts of this scientific episode, written after the discovery of an RNA-directed DNA polymerase in 1970, tend to describe the reaction to his proposition as ardent opposition. Typically these accounts adopt a 'molecular biology' standpoint, emphasizing the central dogma's part in its rejection. In this article, however, this episode will be examined from a joint perspective of virology and experimental cancer research. From this perspective it is clear that Temin's work was well within the epistemological and methodological boundaries of virology and cancer research. Still, scientists did have reasons to doubt the provirus hypothesis, but these do not seem good enough to justify an account that portrays either Temin as a renegade or his ideas as heretical.
A generation before Beardsley, legal scholar John Henry Wigmore invented a scheme for representing arguments in a tree diagram, aimed to help advocates analyze the proof of facts at trial. In this essay, I describe Wigmore's "Chart Method" and trace its origin and influence. Wigmore, I argue, contributes to contemporary theory in two ways. His rhetorical approach to diagramming provides a novel perspective on problems about the theory of reasoning, premise adequacy, and dialectical obligations. Further, he advances a novel solution to the problem of assessing argument quality by representing the strength of argument in meeting objections.
Numerous articles in the popular press, together with an examination of websites associated with the medical, legal, engineering, financial, and other professions, leave no doubt that the role of professions has been impacted by the Internet. While offering the promise of the democratization of expertise – expertise made available to the public at convenient times and locations and at an affordable cost – the Internet is also driving a reexamination of the concept of professional identity and related claims of expertise and standards of integrity. This paper begins with a presentation of case studies illustrating the ease with which impostors infiltrate the ranks of professionals. Reports of individuals masquerading as professionals via the Internet often reveal that these impostors cause harm to the unwary victims who rely on assertions of professional expertise. Such reports motivated the authors to examine the origins and evolution of the traditional roles of professions and professionals in today’s society, as well as to question how, or whether, the standards for professional practice have been adapted to the challenges posed by technology, i.e., do statements of professional ethics provide a ‘guiding light’ for practitioners and their clients in the cyber age? The authors challenge the professions to consider the notion that technology forces a confrontation with the guild-like aspects of a profession that have served, on the one hand, to protect a profession from encroachment and, on the other hand, have purportedly protected the public.
This article deploys sadomasochism as a framework for understanding medical practice on an institutional level. By examining the case of the factitious illness Munchausen syndrome, this article analyzes the operations of power in the doctor-patient relationship through the trope of role-playing. Because Munchausen syndrome causes a disruption to the dyadic relationship between physicians and patients, a lens of sadomasochism highlights dynamics of power in medical practice that are often obscured in everyday practice. Specifically, this article illustrates how classification and diagnosis are concrete manifestations of the mobilization of medical power.
A study of the structural perfection of icosahedral quasicrystalline grains of various alloys (Al-Pd-Mn, Zn-Mg-RE (RE = rare earth) and Al-Cu-Fe), grown by different slow solidification techniques (Czochralski, Bridgman, flux and annealing), was performed using high-resolution diffraction, including recording rocking curves combined with X-ray topography and phase contrast radiography, at a third-generation synchrotron radiation source (European Synchrotron Radiation Facility, Grenoble, France). For Al-Pd-Mn, additional coherent diffraction and diffuse scattering measurements were also carried out. After evaluating the potentialities of the techniques used, in the light of the criteria defined for crystals, it is shown that the structural perfection of icosahedral quasicrystals is quite comparable with that of metallic crystals but is considerably influenced either by uniform phason strains, which can destroy the quasiperiodic long-range order, or by long-wavelength phason fluctuations leading to diffuse scattering. The structural perfection was also found to be extremely variable across the as-grown quasicrystalline grains and to be dependent on the presence and characteristics of inhomogeneities (pores and precipitates) often included in the quasicrystalline matrix. Regarding the grains that we used, it has been impossible to distinguish a clear influence of either the type of alloy or the growth method. It has, however, been noticed that Al-Pd-Mn and Al-Cu-Fe grains appeared less defective than Zn-Mg-RE grains and that the microstructure of these latter grains looks like that of crystals grown by the same technique. Annealing and mechanical polishing effects have also been analysed in the case of Al-Pd-Mn grains. It appeared that annealing improves the quasicrystalline lattice perfection by lowering phason strains insofar as no precipitates are nucleated. Mechanical polishing can introduce defects, located at the external surfaces, having the shape of bands.
Aims: The aims of the study were to explore expert opinion on the distinction between “research” and “audit”, and to determine the need for review by a National Health Service (NHS) Research Ethics Committee (REC). Background: Under current guidelines only “research” projects within the NHS require REC approval. Concerns have been expressed over difficulties in distinguishing between research and other types of project, and no existing guidelines appear to have been validated. The implications of this confusion include unnecessary REC applications, and crucially, the potential for ethically unsound projects to escape review. Methods: A three-stage Delphi method was chosen to explore expert opinion and develop consensus. Stage 1 comprised ten semi-structured interviews gathering opinion on distinguishing between types of project and how to determine need for ethical review. Stages 2 and 3 were questionnaires, asking 24 “experts” to rate levels of ethical concern and types of project for a series of questions. Anonymised responses from stage 2 were fed back in stage 3. The final responses were analysed for consensus. Results: Of 46 questions, consensus was achieved for 14 (30.4%) for level of ethical concern and for 15 (32.6%) for type of project. Conclusions: Several ideas proved discriminatory for classifying the type of project and assessing level of ethical concern, and they can be used to develop an algorithm to determine need for ethical review. There was little relationship between assessment of the level of ethical concern and classification of the project. There was inconsistency in defining and classifying studies as something other than “research” or “audit”.
Expanding the temperature range of previous specific-heat measurements on the Th7(Fe, Ru, Os, Co, Rh, Ir)3 system, we measure the effect of transition-metal substitution on the total entropy, the electronic specific-heat coefficient (γ), and the Debye temperature (ΘD). In addition, we measure the pressure dependence, up to 10 kbar, of the superconducting transition.
We show evidence that a structural martensitic transition is related to significant changes in the electronic structure, as revealed in thermodynamic measurements made in high magnetic fields. The effect of the magnetic field is considered unusual, as many influential investigations of martensitic transitions have emphasized that the structural transitions are primarily lattice dynamical and are driven by the entropy due to the phonons. We provide a theoretical framework which can be used to describe the effect of the magnetic field on the lattice dynamics, in which the field dependence originates from the dielectric constant.
In Finite and Infinite Goods, Robert Adams defends his metaphysical account that good is resemblance to God via an ‘open-question’ intuition. It is, however, unclear what this intuition amounts to. I give two possible readings: one based on the semantic framework Adams employs, and another based on Adams's account of humankind's epistemological limitations. I argue that neither of these readings achieves Adams's advertised aim.
This paper contributes to an ongoing debate regarding the cognitive processes involved when one person predicts a target person's behavior and/or attributes a mental state to that target person. According to simulation theory, a person typically performs these tasks by employing some part of her brain as a simulation of what is going on in a corresponding part of the brain of the target person. I propose a general intuitive analysis of what 'simulation' means. Simulation is a particular way of using one process to acquire knowledge about another process. What distinguishes simulation from other ways of acquiring knowledge is that simulation requires, for its non-accidental success, that the simulating process reflect significant aspects of the simulated process. This conceptual work is of independent philosophical interest, but it also enables me to argue for two conclusions that are of great significance to the debate about mental simulation theory. First, I argue that, in order to stake a non-trivial claim, simulation theory must hold that mental simulation involves what I call concretely similar processes. Second, I argue for the surprising conclusion that a significant class of cases that simulation theorists have claimed as intuitive cases of simulation do not actually involve simulation, after all. I close by sketching an alternative account that might handle these problematic cases.
Wittgenstein at Work: Method in the Philosophical Investigations explores the least well-understood aspect of Wittgenstein's later work: his aims and methods. Specially commissioned papers by twelve of the world's leading Wittgenstein scholars analyze the way he approached key topics such as rule-following and private language, and examine his remarks on clarification, nonsense and other central notions of his methodology. Many contributors touch on the therapeutic aspects of Wittgenstein's approach, the focus of much current debate. Wittgenstein at Work provides both students and specialists with a much-needed methodological companion to one of the greatest philosophical works of the twentieth century.
A number of authors have suggested that a conditional analysis of dispositions must take roughly the following form: Thing X is disposed to produce response R to stimulus S just in case, if X were exposed to S and surrounding circumstances were auspicious, then X would produce R. The great challenge is cashing out the relevant notion of ‘auspicious circumstances’. I give a general argument which entails that all existing conditional analyses fail, and that there is no satisfactory way to define ‘auspicious circumstances’ just in terms of S, R, and X. Instead, I argue that the auspicious circumstances C for the manifestation of a disposition constitute a third irreducible element of that disposition, and that to pick out (or to ‘individuate’) that disposition one must specify C along with S and R. This enables a new conditional analysis of dispositions that gives intuitively satisfying answers in cases that pose problems for other approaches.
One unresolved dispute within Heidegger scholarship concerns the question of whether Dasein should be conceived in terms of narrative self-constitution. A survey of the current literature suggests two standard responses. The first correlates Heidegger’s talk of authentic historicality with that of self-authorship. On the alternative perspective, however, Heidegger’s talk of Dasein’s existentiality, with its emphasis on nullity and unattainability, is taken as evidence that Dasein is structurally and ontologically incapable of being completed via any life-project. Narrativity imports into Being and Time commitments concerning temporality, selfhood, and ethics, which Heidegger rejects. Although both positions find good exegetic support for their conclusions, they can’t both be right. In this article, I navigate a path between these two irreconcilable positions by applying insights derived from recent debates within Anglo-American literature on personal identity. I develop an alternative thesis to Narrativism, without rejecting it outright, by arguing that Dasein can be analysed in terms of what I call narratability conditions. These allow us to make sense of the prima facie paradoxical notion of historicality without narrativity. Indeed, rather than reconciling the two standard positions, I hold that the tension between them says something important about Dasein’s kind of existence. Thus I conclude that while the narrativist question ‘Who ought I to be?’ is perfectly legitimate within limits, what the existential analysis of the limits on narratability reveals is that no answer to this question can ever be definitive.
Wittgenstein's private language argument is interpreted as an example of a kind of transcendental argument which, if valid, explains why a certain concept must possess certain features. Cognition and affect are shown to require each other by an application of Bennett's account of what beings capable of true cognition must be capable of, and the necessity of certain emotions to the existence of any rules in a community is argued in similar fashion. Hume's account of love and admiration being rejected, an account of love, intended to explain some of love's familiar features, is defended, and various proposed additions to the analysis are rejected. The idea of love is linked to those of value, agency, and the transcendental self by argument showing that each of these ideas requires all of the others. Finally, the idea of love is linked by a direct argument to that of the transcendental self.
I develop and defend a version of what I call Disposition-Based Decision Theory (or DBDT). I point out important problems in David Gauthier’s (1985, 1986) formulation of DBDT, and carefully develop a more defensible formulation. I then compare my version of DBDT to the currently most widely accepted decision theory, Causal Decision Theory (CDT). Traditional intuition-based arguments fail to give us any strong reason to prefer either theory over the other, but I propose an alternative strategy for resolving this debate. I argue that we should embrace DBDT because it does better than CDT at the work that we, as a matter of empirical fact, commonly call upon a notion of rationality to do.