According to the Fine-Tuning Argument (FTA), the existence of life in our universe confirms the Multiverse Hypothesis (HM). A standard objection to FTA is that it violates the Requirement of Total Evidence (RTE). I argue that RTE should be rejected in favor of the Predesignation Requirement, according to which, in assessing the outcome of a probabilistic process, we should only use evidence characterizable in a manner available before observing the outcome. This produces the right verdicts in some simple cases in which RTE leads us astray, and, when applied to FTA, it shows that our evidence does confirm HM.
Peter Kivy is the author of many books on the history of art and, in particular, the aesthetics of music. This collection of essays spans a period of some thirty years and focuses on a richly diverse set of issues: the biological origins of music, the role of music in liberal education, the nature of the musical work and its performance, the aesthetics of opera, the emotions of music, and the very nature of music itself. Some of these subjects are viewed as part of the history of ideas, others as current problems in the philosophy of art. A particular feature of the volume is that Kivy avoids the use of musical notation, so that no technical knowledge at all is required to appreciate his work. The essays will prove enjoyable and insightful not just to professionals in the philosophy of art, musicologists, and musicians, but also to any motivated general reader with a deep interest in music.
Two of the most potent challenges faced by scientific realism are the underdetermination of theories by data, and the pessimistic induction based on theories previously held to be true, but subsequently acknowledged as false. Recently, Stanford (2006, Exceeding our grasp: Science, history, and the problem of unconceived alternatives. Oxford: Oxford University Press) has formulated what he calls the problem of unconceived alternatives: a version of the underdetermination thesis combined with a historical argument of the same form as the pessimistic induction. In this paper, I contend that while Stanford does present a novel antirealist argument, a successful response to the pessimistic induction would likewise defuse the problem of unconceived alternatives, and that a more selective and sophisticated realism than that which he allows is arguably immune to both concerns.
The debate between 3- and 4-dimensionalists is one of the most lively and pervasive in current metaphysics. At stake is a glittering prize: the correct metaphysical analysis of material things and other objects commonly thought to persist in time by enduring. Since we count ourselves among such objects, the outcome of the debate is of more than merely academic interest to us. Obviously the ramifications of the debate, even of the points raised by Kit Fine, go far beyond what I can discuss here, so I shall simply select some salient issues and comment on them from my own somewhat heterodox point of view.
ports the thesis that there exist very many universes. The view has found favor with a number of philosophers such as Derek Parfit (1998), J. J. C. Smart (1989) and Peter van Inwagen (1993).1 My purpose is to argue that this is a mistake. First let me set out the issue in more detail.
The term “Gettier Case” is a technical term frequently applied to a wide array of thought experiments in contemporary epistemology. What do these cases have in common? It is said that they all involve a justified true belief which, intuitively, is not knowledge, due to a form of luck called “Gettiering.” While this very broad characterization suffices for some purposes, it masks radical diversity. We argue that the extent of this diversity merits abandoning the notion of a “Gettier case” in favour of more finely grained terminology. We propose such terminology, and use it to effectively sort the myriad Gettier cases from the theoretical literature in a way that charts deep fault lines in ordinary judgments about knowledge.
Two successes of old quantum theory are particularly notable: Bohr’s prediction of the spectral lines of ionised helium, and Sommerfeld’s prediction of the fine-structure of the hydrogen spectral lines. Many scientific realists would like to be able to explain these successes in terms of the truth or approximate truth of the assumptions which fuelled the relevant derivations. In this paper I argue that this will be difficult for the ionised helium success, and is almost certainly impossible for the fine-structure success. Thus I submit that the case against the realist’s thesis that success is indicative of truth is marginally strengthened.
Does tenure serve its original purpose of promoting freedom of inquiry for academics in teaching and research? It seems not. Of concern is the finding that achieving tenure does not translate into a significant increase in exercise of freedom of inquiry either in teaching or research. Why? Promotion evaluation for associate professors by their senior colleagues has a continued inhibiting effect. (Published Online February 8 2007).
This paper is a study of higher-order contingentism – the view, roughly, that it is contingent what properties and propositions there are. We explore the motivations for this view and various ways in which it might be developed, synthesizing and expanding on work by Kit Fine, Robert Stalnaker, and Timothy Williamson. Special attention is paid to the question of whether the view makes sense by its own lights, or whether articulating the view requires drawing distinctions among possibilities that, according to the view itself, do not exist to be drawn. The paper begins with a non-technical exposition of the main ideas and technical results, which can be read on its own. This exposition is followed by a formal investigation of higher-order contingentism, in which the tools of variable-domain intensional model theory are used to articulate various versions of the view, understood as theories formulated in a higher-order modal language. Our overall assessment is mixed: higher-order contingentism can be fleshed out into an elegant systematic theory, but perhaps only at the cost of abandoning some of its original motivations.
Throughout the history of the Western world, science has possessed an extraordinary amount of authority and prestige. And while its pedestal has been jostled by numerous evolutions and revolutions, science has always managed to maintain its stronghold as the knowing enterprise that explains how the natural world works: we treat such legendary scientists as Galileo, Newton, Darwin, and Einstein with admiration and reverence because they offer profound and sustaining insight into the meaning of the universe. In The Intelligibility of Nature, Peter Dear considers how science as such has evolved and how it has marshaled itself to make sense of the world. His intellectual journey begins with a crucial observation: that the enterprise of science is, and has been, directed toward two distinct but frequently conflated ends—doing and knowing. The ancient Greeks developed this distinction of value between craft on the one hand and understanding on the other, and according to Dear, that distinction has survived to shape attitudes toward science ever since. Teasing out this tension between doing and knowing during key episodes in the history of science—mechanical philosophy and Newtonian gravitation, elective affinities and the chemical revolution, enlightened natural history and taxonomy, evolutionary biology, the dynamical theory of electromagnetism, and quantum theory—Dear reveals how the two principles became formalized into a single enterprise, science, that would be carried out by a new kind of person, the scientist. Finely nuanced and elegantly conceived, The Intelligibility of Nature will be essential reading for aficionados and historians of science alike.
Our topic is the theory of topics. My goal is to clarify and evaluate three competing traditions: what I call the way-based approach, the atom-based approach, and the subject-predicate approach. I develop criteria for adequacy using robust linguistic intuitions that feature prominently in the literature. Then I evaluate the extent to which various existing theories satisfy these constraints. I conclude that recent theories due to Parry, Perry, Lewis, and Yablo do not meet all of the constraints. I then introduce the issue-based theory—a novel and natural entry in the atom-based tradition that meets our constraints. In a coda, I categorize a recent theory from Fine as atom-based, and contrast it with the issue-based theory, concluding that they are evenly matched, relative to our main criteria of adequacy. I offer tentative reasons to nevertheless favour the issue-based theory.
This paper describes the application of eight statistical and machine-learning methods to derive computer models for predicting mortality of hospital patients with pneumonia from their findings at initial presentation. The eight models were each constructed based on 9847 patient cases and they were each evaluated on 4352 additional cases. The primary evaluation metric was the error in predicted survival as a function of the fraction of patients predicted to survive. This metric is useful in assessing a model’s potential to assist a clinician in deciding whether to treat a given patient in the hospital or at home. We examined the error rates of the models when predicting that a given fraction of patients will survive. We examined survival fractions between 0.1 and 0.6. Over this range, each model’s predictive error rate was within 1% of the error rate of every other model. When predicting that approximately 30% of the patients will survive, all the models have an error rate of less than 1.5%. The models are distinguished more by the number of variables and parameters that they contain than by their error rates; these differences suggest which models may be the most amenable to future implementation as paper-based guidelines.
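The evaluation metric described in this abstract can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' code: the function and variable names are ours, and the toy data is invented. For a chosen fraction, patients are ranked by predicted survival probability, the top fraction is selected, and the error rate is the share of those selected patients who actually died.

```python
import numpy as np

def error_at_survival_fraction(pred_survival_prob, actually_died, fraction):
    """Error rate among the top `fraction` of patients ranked by predicted
    probability of survival: the share of those patients who in fact died."""
    # Rank patients from most to least likely to survive.
    order = np.argsort(-np.asarray(pred_survival_prob))
    n_selected = max(1, int(round(fraction * len(order))))
    selected = order[:n_selected]
    # Mean of the death indicator over the selected patients.
    return float(np.mean(np.asarray(actually_died)[selected]))

# Toy example: 10 patients with predicted survival probabilities and outcomes.
probs = [0.95, 0.9, 0.85, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2]
died  = [0,    0,   0,    1,   0,   1,   1,   1,   1,   1]
err = error_at_survival_fraction(probs, died, 0.3)  # top 3 patients by rank
```

Sweeping `fraction` over the 0.1–0.6 range described in the abstract would trace out the error curve on which the eight models were compared.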
I am grateful to Nathan Salmon [in Salmon (2012)] for being willing to spill so much ink over my monograph on semantic relationism (2007), even if what he has to say is not altogether complimentary. There is a great deal in his criticisms to which I take exception but I wish to focus on one point, what he calls my ‘formal disproof’ of standard Millianism. He believes that ‘the alleged hard result is nearly demonstrably false’ (p. 420) and that the disproof contains a ‘serious error’ (p. 407). Neither claim is correct; and it is the aim of this short note to explain why. First some background. In some cases, we are justified (from an internalist standpoint) in inferring the singular proposition F&G(x) from F(x) and G(x) (as when I learn more and more about Obama, for example); and in other cases, we are not so justified (as when Peter, in Kripke’s puzzle case, knows that Paderewski is a pianist and that Paderewski is a politician but does not know that anyone is both a pianist and a politician).
How can we know what really happened in the distant past in places like ancient Egypt, Mesopotamia, Palestine, Greece, and Rome, especially since the evidence is fragmentary and ancient cultures are so different from our own frame of reference? Scholars may examine historical documents and archaeological artifacts, and then make reasonable inferences. But in the final analysis there can be no absolute certainty about events far removed from present reality, and the past must be reconstructed by means of hypotheses that coherently organize all available data. Knowledge claims about the past, and about many areas of science as well, rest on a network of interdependence between theory and evidence, and between interpretation and data. In this fascinating study of epistemology, philosopher Peter Kosso argues for a coherence model of epistemic justification. In the first part, the conceptual argument, he proposes a model of knowledge of the past. In the second part he presents three detailed case studies drawn from the work of historians and archaeologists. These studies are used to support and fine-tune the model outlined in the first part. Kosso presents many insights into the limits of knowledge and our ability to know the mental as well as the physical past. Historians, archaeologists, philosophers, and students interested in epistemology will find this accessible work to be of great value.
The reconciliation of theories of concepts based on prototypes, exemplars, and theory-like structures is a longstanding problem in cognitive science. In response to this problem, researchers have recently tended to adopt either hybrid theories that combine various kinds of representational structure, or eliminative theories that replace concepts with a more finely grained taxonomy of mental representations. In this paper, we describe an alternative approach involving a single class of mental representations called “semantic pointers.” Semantic pointers are symbol-like representations that result from the compression and recursive binding of perceptual, lexical, and motor representations, effectively integrating traditional connectionist and symbolic approaches. We present a computational model using semantic pointers that replicates experimental data from categorization studies involving each prior paradigm. We argue that a framework involving semantic pointers can provide a unified account of conceptual phenomena, and we compare our framework to existing alternatives in accounting for the scope, content, recursive combination, and neural implementation of concepts.
The standard argument against ordered tuples as propositions is that it is arbitrary what truth-conditions they should have. In this paper we generalize that argument. Firstly, we require that propositions have truth-conditions intrinsically. Secondly, we require strongly equivalent truth-conditions to be identical. Thirdly, we provide a formal framework, taken from graph theory, to characterize structure and structured objects in general. The argument in a nutshell is this: structured objects are too fine-grained to be identical to truth-conditions. Without identity, there is no privileged mapping from structured objects to truth-conditions, and hence structured objects do not have truth-conditions intrinsically. Therefore, propositions are not structured objects.
Scientist, mathematician, thinker, the father of pragmatism, the inspiration for William James and John Dewey, Charles Peirce has remained until recently a philosopher's philosopher. Peirce trod a fine line between the extremes of nominalism and realism, tough-minded pragmatism and metaphysical speculation. As Peter Skagestad makes clear, Peirce's system of thought was fragmented, incomplete, and sometimes inconsistent. But one overriding concern gives unity to the whole: the road of inquiry must never be blocked.
Existential claims are widely held to be grounded in their true instances. However, this principle is shown to be problematic by arguments due to Kit Fine. Stephan Krämer has given an especially simple form of such an argument using propositional quantifiers. This note shows that even if a schematic principle of existential grounds for propositional quantifiers has to be restricted, this does not immediately apply to a corresponding non-schematic principle in higher-order logic.
Probably the most dramatic historical challenge to scientific realism concerns Arnold Sommerfeld’s derivation of the fine structure energy levels of hydrogen. Not only were his predictions good, he derived exactly the same formula that would later drop out of Dirac’s 1928 treatment. And yet the most central elements of Sommerfeld’s theory were not even approximately true: his derivation leans heavily on a classical approach to elliptical orbits, including the necessary adjustments to these orbits demanded by relativity. Even physicists call Sommerfeld’s success a ‘miracle’, which rather makes a joke of the so-called ‘no miracles argument’. However, this can all be turned around. Here I argue that the realist has a story to tell vis-à-vis the discontinuities between the old and the new theory, leading to a realist defence based on sufficient continuity of relevant structure. 1 Introduction 2 No Realist Commitment Required? 3 Enter the Physicists 4 A New Approach to the Non-relativistic Success 5 Relativity and Spin 6 Structure and Realist Commitment 7 Conclusion.
Since the beginning of the eighteenth century the philosophy of art has been engaged on the project of trying to find out what the fine arts have in common and, thus, how they might be defined. Peter Kivy's purpose in this accessible and lucid book is to trace the history of that enterprise and argue that the definitional project has been unsuccessful. He offers a fruitful change of strategy: instead of engaging in an obsessive quest for sameness, let us explore the differences between the arts. He presents five case studies, three from literature, two from music. With its combination of historical and analytic approaches this is a book for a wide range of readers in philosophy, literary studies, music, and non-academic readers with interests in the arts.
Wood and Spekkens argue that any causal model explaining the EPRB correlations and satisfying the no-signalling constraint must also violate the assumption that the model faithfully reproduces the statistical dependences and independences—a so-called ‘fine-tuning’ of the causal parameters. This includes, in particular, retrocausal explanations of the EPRB correlations. I consider this analysis with a view to enumerating the possible responses an advocate of retrocausal explanations might propose. I focus on the response of Näger, who argues that the central ideas of causal explanations can be saved if one accepts the possibility of a stable fine-tuning of the causal parameters. I argue that in light of this view, a violation of faithfulness does not necessarily rule out retrocausal explanations of the EPRB correlations. However, when we consider a plausible retrocausal picture in some detail, it becomes clear that the causal modelling framework is not a natural arena for representing such an account of retrocausality. 1 Causal Models, Quantum Mechanics, and Faithfulness 2 Fine-Tuning 2.1 Fine-tuning in a retrocausal model 3 Possible Responses 4 Quantum Causal Models and Retrocausality 4.1 A more detailed retrocausal account 4.2 A model of the EPRB probabilities 4.3 Mapping to a causal model 5 Conclusion.
Can argumentation schemes play a part in the critical processing of argumentation by lay people? In a qualitative study, participants were invited to come up with strong and weak arguments for a given claim and were subsequently interviewed for why they thought the strong argument was stronger than the weak one. Next, they were presented with a list of arguments and asked to rank these arguments from strongest to weakest, upon which they were asked to motivate their judgments in an interview. In order to assess whether lay people apply argument scheme specific criteria when performing these tasks, five different argumentation schemes were included in this study: argumentation from authority, from example, from analogy, from cause to effect, and from consequences. Laypeople’s use of criteria for argument quality was inferred from interview protocols. The results revealed that participants combined general criteria from informal logic and scheme-specific criteria. The results supported the conventional validity of the pragma-dialectical argument scheme rule in a strong sense and provided a more fine-grained view of central processing in the Elaboration Likelihood Model.
In his paper on transcendental intersubjectivity in Husserl, which refers mainly to the Fifth Cartesian Meditation, Schutz (1966a) marks out four stages in Husserl's argument and finds what are for him insurmountable problems in each stage. These stages are: (1) isolation of the primordial world of one's peculiar ownness by means of a further epoche; (2) apperception of the other via pairing; (3) constitution of objective, intersubjective Nature; (4) constitution of higher forms of community. Because of the problems Schutz encounters in each of these stages, he concludes that Husserl's theory is unacceptable (Schutz, 1966a, p.82). Having already proved that it is unacceptable, he now explains why these problems arise in Husserl's theory. Intersubjectivity, says Schutz, is "a datum of the life-world," (1966a, p.82) not a transcendental problem. In other words, intersubjectivity must be dealt with as a problem of the life-world of the natural attitude, not a "problem of constitution which can be solved within the transcendental sphere." (Schutz, 1966a, p.82). There is no such thing as transcendental intersubjectivity, if by that is meant intersubjectivity of a plurality of transcendental egos. The role of transcendental phenomenology in the problem of intersubjectivity is to explicate within the transcendental reduction the sense: "intersubjectivity in the life-world." Husserl was diverted from this proper role of phenomenology--in his words, to "explicate the sense which this world has for us prior to all philosophy" (trans. and quoted by Schutz from "Cartesianische Meditationen, para. 62, in fine," in Schutz, 1966a, p.82)--because of the unobtrusive transformation of sense of his concept of constitution from that of explication and clarification to "creation," in the sense of providing an ontology of the life-world.
The fact that phenomenology is in principle incapable of doing this lies behind the failure of Husserl's theory of intersubjectivity (Schutz, 1966a, pp.83-84). Unlike Schutz, I will deal with this general issue explicitly in the context of the stages in Husserl's argument and Schutz's objections. It seems to me that Husserl does remain within the sphere of clarification of sense, but to do explication and clarification of certain "senses" results inevitably in doing a kind of ontology.
The paper argues that there is a proper place for literature within aesthetics but that care must be taken in identifying just what the relation is. In characterising aesthetic pleasure associated with literature it is all too easy to fall into reductive accounts, for example, of literature as merely “fine writing”. Belles-lettrist or formalistic accounts of literature are rejected, as are two other kinds of reduction, to pure meaning properties and to a kind of narrative realism. The idea is developed that literature—both poetry and prose fiction—invites its own distinctive kind of aesthetic appreciation which, far from being at odds with critical practice, in fact chimes well with it.
In this comparative analysis of twelve focus groups conducted in Austria, France, and the Netherlands, we investigate how lay people come to terms with two biomedical technologies. Using the term “technopolitical culture,” we aim to show that the ways in which technosciences are interwoven with a specific society frame how citizens build their individual and collective positions toward them. We investigate how the focus group participants conceptualized organ transplantation and genetic testing, their perceptions of individual agency in relation to the two technologies and to more collective forms of acting and governing, and also their understanding of the two technologies’ relationship to broader societal value systems. Against the background of the sustained political effort to build common European values, we suggest that more fine-grained attention toward the culturally embedded differences in coming to terms with biomedical technologies is needed.
As adults age, their performance on many psychometric tests changes systematically, a finding that is widely taken to reveal that cognitive information-processing capacities decline across adulthood. Contrary to this, we suggest that older adults' changing performance reflects memory search demands, which escalate as experience grows. A series of simulations show how the performance patterns observed across adulthood emerge naturally in learning models as they acquire knowledge. The simulations correctly identify greater variation in the cognitive performance of older adults, and successfully predict that older adults will show greater sensitivity to fine-grained differences in the properties of test stimuli than younger adults. Our results indicate that older adults' performance on cognitive tests reflects the predictable consequences of learning on information-processing, and not cognitive decline. We consider the implications of this for our scientific and cultural understanding of aging.
Does contemporary science tend to favour pantheism over its rivals or vice versa? Here I take the rivals to be the other members of a five-point spectrum: atheism, polytheism, pantheism, panentheism, and transcendent theism. And the features of contemporary science that I shall consider are: that the Universe has only existed for a finite time; that the Universe is expanding; that there are ubiquitous and pervasive laws of nature; and the ‘fine tuning’ required for life.
Comparatively easy questions we might ask about creativity are distinguished from the hard question of explaining transformative creativity. Many have focused on the easy questions, offering no reason to think that the imagining relied upon in creative cognition cannot be reduced to more basic folk psychological states. The relevance of associative thought processes to songwriting is then explored as a means for understanding the nature of transformative creativity. Productive artificial neural networks—known as generative adversarial networks (GANs)—are a recent example of how a system’s ability to generate novel products can both be finely tuned by prior experience and grounded in strategies that cannot be articulated by the system itself. Further, the kinds of processes exploited by GANs need not be seen as incorporating something akin to sui generis imaginative states. The chapter concludes with reflection on the added relevance of personal character to explanations of creativity. [This is Chapter 12 of the book Explaining Imagination.]