This article discusses the relation between features of empirical data and structures in the world. I defend the following claims. Any empirical data set exhibits all possible patterns, each with a certain noise term. The magnitude and other properties of this noise term are irrelevant to the evidential status of a pattern: all patterns exhibited in empirical data constitute evidence of structures in the world. Furthermore, distinct patterns constitute evidence of distinct structures in the world. It follows that the world must be regarded as containing all possible structures. The remainder of the article is devoted to elucidating the meaning and implications of the latter claim.
Several quantitative techniques for choosing among data models are available. Among these are techniques based on algorithmic information theory, minimum description length theory, and the Akaike information criterion. All these techniques are designed to identify a single model of a data set as being the closest to the truth. I argue, using examples, that many data sets in science show multiple patterns, providing evidence for multiple phenomena. For any such data set, there is more than one data model that must be considered close to the truth. I conclude that, since the established techniques for choosing among data models are unequipped to handle these cases, they cannot be regarded as adequate. ‡I presented a previous version of this paper at the 20th Biennial Meeting of the Philosophy of Science Association, Vancouver, November 2006. I am grateful to the audience for constructive discussion. I thank Leiden University students Marjolein Eysink Smeets and Lenneke Schrier for suggesting the cortisol example, and Remko van der Geest for comments on a draft. †To contact the author, please write to: Faculty of Philosophy, University of Leiden, P.O. Box 9515, 2300 RA Leiden, The Netherlands; e-mail: firstname.lastname@example.org.
I argue that, contrary to the standard view, the Newtonian universe contains no contingency. I do this by arguing (i) that no contingency is introduced into the Newtonian universe by the initial conditions of physical systems in the universe, and (ii) that the claim that the Newtonian universe as a whole has contingent properties leads to incoherence. This result suggests that Newtonian physics is either inconsistent or incomplete, since the laws of Newtonian physics are too weak to determine all the properties of the Newtonian universe uniquely.
Thought experiment acquires evidential significance only on particular metaphysical assumptions. These include the thesis that science aims at uncovering "phenomena" (universal and stable modes in which the world is articulated) and the thesis that phenomena are revealed imperfectly in actual occurrences. Only on these Platonically inspired assumptions does it make sense to bypass experience of actual occurrences and perform thought experiments. These assumptions are taken to hold in classical physics and other disciplines, but not in sciences that emphasize variety and contingency, such as Aristotelian natural philosophy and some forms of historiography. This explains why thought experiments carry weight in the former but not the latter disciplines.
Murray Gell-Mann has proposed the concept of effective complexity as a measure of information content. The effective complexity of a string of digits is defined as the algorithmic complexity of the regular component of the string. This paper argues that the effective complexity of a given string is not uniquely determined. The effective complexity of a string admitting a physical interpretation, such as an empirical data set, depends on the cognitive and practical interests of investigators. The effective complexity of a string as a purely formal construct, lacking a physical interpretation, is either close to zero, or equal to the string's algorithmic complexity, or arbitrary, depending on the auxiliary criterion chosen to pick out the regular component of the string. Because of this flaw, the concept of effective complexity is unsuitable as a measure of information content.
This introduction to the special issue on "Aesthetics of Science" reviews recent philosophical research on aesthetic aspects of science. Topics represented in this research include the aesthetic properties of scientific images, theories, and experiments; the relation of science and art; the role of aesthetic criteria in scientific practice and their effect on the development of science; aesthetic aspects of mathematics; the contrast between a classic and a Romantic aesthetic; and the relation between emotion, cognition, and rationality.
The Newtonian universe is usually understood to contain two classes of causal factors: universal regularities and initial conditions. I demonstrate that, in fact, the Newtonian universe contains no causal factors other than universal regularities: the initial conditions of any physical system are merely the consequence of universal regularities acting on previous systems. It follows that a Newtonian universe lacks the degree of contingency that is usually attributed to it. This is a necessary precondition for maintaining that the Newtonian universe is a block universe that exhibits no temporal development. It follows also that Newtonian physics is inconsistent, since a Newtonian universe as a whole exhibits some properties – such as the total mass of the universe – that are not determined by the laws of Newtonian physics, and that must therefore be considered contingent.
The modern sciences are divided into two groups: law-formulating and natural historical sciences. Sciences of both groups aim at describing the world, but they do so differently. Whereas the natural historical sciences produce “transcriptions” intended to be literally true of actual occurrences, laws of nature are expressive symbols of aspects of the world. The relationship between laws and the world thus resembles that between the symbols of classical iconography and the objects for which they stand. The natural historical approach was founded by Aristotle and is retained in such present-day sciences as botany. Modern physics differentiated itself from the natural historical sciences and developed a symbolizing approach at the hands of Galileo and Descartes. Our knowledge of the physical domain is provided by two disciplines: the law-formulating science of physics and a natural historical science on which we depend in the everyday manipulation of our surroundings.
Bogen and Woodward claim that the function of scientific theories is to account for 'phenomena', which they describe both as investigator-independent constituents of the world and as corresponding to patterns in data sets. I argue that, if phenomena are considered to correspond to patterns in data, it is inadmissible to regard them as investigator-independent entities. Bogen and Woodward's account of phenomena is thus incoherent. I offer an alternative account, according to which phenomena are investigator-relative entities. All the infinitely many patterns that data sets exhibit have equal intrinsic claim to the status of phenomenon: each investigator may stipulate which patterns correspond to phenomena for him or her. My notion of phenomena accords better both with experimental practice and with the historical development of science.
Conditions for philosophy of science in the Netherlands are not optimal. The climate of opinion in Dutch philosophy is unsympathetic to the sciences, partly because of the influence of theology. Dutch universities offer no taught graduate programmes in philosophy of science, which would provide an entry route for science graduates. A great deal of Dutch research in philosophy of science is affected by an exegetical attitude, which fosters the interpretation and evaluation of other writers rather than the development of original theories. Doctoral candidates in particular should be trained to greater originality and assertiveness. Nonetheless, much good research in philosophy of science is conducted in the Netherlands, both in philosophy faculties and in institutes dedicated to the foundations of the special sciences. Distinguished work is done also in the neighbouring disciplines of logic, history of science, and social studies of science.
The eighteenth and nineteenth centuries witnessed a change in the perception of the arts and of philosophy. In the arts this transition occurred around 1800, with, for instance, the breakdown of Vitruvianism in architecture, while in philosophy the foundationalism of which Descartes and Spinoza were paradigmatic representatives, which presumed that philosophy and the sciences possessed a method of ensuring the demonstration of truths, was undermined by the idea, asserted by Nietzsche and Wittgenstein, that there exist alternative styles of enquiry among which a choice is open. The essays in this book examine the circumstances, features, and consequences of this historical transition, exploring in particular new aspects and instances of the inter-relatedness of content and its formal representation in both the arts and philosophy.
The central terms of certain theories which were valued highly in the past, such as the phlogiston theory, are now believed by realists not to refer. Laudan and others have claimed that, in the light of the existence of such theories, scientific realism is untenable. This paper argues in response that realism is consistent with — and indeed is able to explain — such theories' having been highly valued and yet not being close to the truth. It follows that the set of highly-valued past theories cited by Laudan, presumed to militate against realism, is in fact innocuous to the doctrine. The argument hinges largely on identifying the grounds on which theory-adoption is actually performed.
Almost all commentators acknowledge that among the grounds on which scientists perform theory-choices are criteria of simplicity. In general, simplicity is regarded either as only a logico-empirical quality of a theory, diagnostic of the theory's future predictive success, or as a purely aesthetic or otherwise extra-empirical property of it. This paper attempts to demonstrate that the simplicity-criteria applied in scientific practice include both a logico-empirical and a quasi-aesthetic criterion: to conflate these in an account of scientists' theory-choice is to court confusion.
A rationalist and realist model of scientific revolutions will be constructed by reference to two categories of criteria of theory-evaluation, denominated indicators of truth and of beauty. Whereas indicators of truth are formulated a priori and thus unite science in the pursuit of verisimilitude, aesthetic criteria are inductive constructs which lag behind the progression of theories in truthlikeness. Revolutions occur when the evaluative divergence between the two categories of criteria proves too wide to be recomposed or overlooked. This model of revolutions depends upon a substantial new treatment of aesthetic criteria in science with which much of the paper will therefore be occupied.
This paper argues that evaluation of the truth and rationality of past scientific theories is both possible and profitable. The motivation for this enterprise is traced to recent discussions by I. Lakatos, L. Laudan and others on the import of history for the philosophy of science; several objections to it are considered and T. S. Kuhn is found to advance the most substantive. An argument for establishing judgements of rationality and truth in the face of scientific revolutions is presented; finally, evidence is offered for the value of such assessments to historiography and to debates on scientific progress.