Computational methods have become dominant techniques in many areas of science. This book contains the first systematic philosophical account of these new methods and their consequences for scientific method. It will be of interest to philosophers of science and to anyone interested in the role played by computers in modern science.
Emergence develops a novel account of diachronic ontological emergence called transformational emergence and locates it in an established historical framework. The author shows how many problems affecting ontological emergence result from a dominant but inappropriate metaphysical tradition and provides a comprehensive assessment of current theories of emergence.
A framework for representing a specific kind of emergent property instance is given. A solution to a generalized version of the exclusion argument is then provided, and it is shown that upwards and downwards causation are unproblematic for that kind of emergence. One real example of this kind of emergence is briefly described, and it is suggested that emergence may be more common than current opinion allows.
Reasons are given to justify the claim that computer simulations and computational science constitute a distinctively new set of scientific methods and that these methods introduce new issues in the philosophy of science. These issues are both epistemological and methodological in kind.
This book provides a post-positivist theory of deterministic and probabilistic causality that supports both quantitative and qualitative explanations. Features of particular interest include the ability to provide true explanations in contexts where our knowledge is incomplete, a systematic interpretation of causal modeling techniques in the social sciences, and a direct realist view of causal relations that is compatible with a liberal empiricism. The book should be of wide interest to both philosophers and scientists. Originally published in 1989. The Princeton Legacy Library uses the latest print-on-demand technology to again make available previously out-of-print books from the distinguished backlist of Princeton University Press. These editions preserve the original texts of these important books while presenting them in durable paperback and hardcover editions. The goal of the Princeton Legacy Library is to vastly increase access to the rich scholarly heritage found in the thousands of books published by Princeton University Press since its founding in 1905.
I argue that supervenience is an inadequate device for representing relations between different levels of phenomena. I then provide six criteria that emergent phenomena seem to satisfy. Using examples drawn from macroscopic physics, I suggest that such emergent features may well be quite common in the physical realm.
Computer Simulations. Paul Humphreys - 1990 - PSA: Proceedings of the Biennial Meeting of the Philosophy of Science Association 1990: 497-506.
This article provides a survey of some of the reasons why computational approaches have become a permanent addition to the set of scientific methods. The reasons for this require us to represent the relation between theories and their applications in a different way than do the traditional logical accounts extant in the philosophical literature. A working definition of computer simulations is provided and some properties of simulations are explored by considering an example from quantum chemistry.
I discuss here a number of different kinds of diachronic emergence, noting that they differ in important ways from synchronic conceptions. I argue that Bedau’s weak emergence has an essentially historical aspect, in that there can be two indistinguishable states, one of which is weakly emergent, the other of which is not. As a consequence, weak emergence is about tokens, not types, of states. I conclude by examining the question of whether the concept of weak emergence is too weak and note that there is at present no unifying account of diachronic and synchronic concepts of emergence.
A different way of thinking about how the sciences are organized is suggested by the use of cross-disciplinary computational methods as the organizing unit of science, here called computational templates. The structure of computational models is articulated using the concepts of construction assumptions and correction sets. The existence of these features indicates that certain conventionalist views are incorrect; in particular, it suggests that computational models come with an interpretation that cannot be removed, as well as with a prior justification. A form of selective realism is described which denies that one can simply read the ontological commitments from the theory itself.
In this paper, we show that it is not a conceptual truth about laws of nature that they are immutable. In order to do so, we survey three popular accounts of lawhood (necessitarianism, dispositionalism, and the best system analysis) and expose the extent, as well as the philosophical cost, of the amendments that would have to be made in order to leave room for the possibility of changing laws.
Four interpretations of single-case conditional propensities are described, and it is shown that for each a version of what has been called ‘Humphreys' Paradox’ remains, despite the clarifying work of Gillies, McCurdy and Miller. This entails that propensities cannot be a satisfactory interpretation of standard probability theory. Contents: Introduction; The basic issue; The formal paradox; Values of conditional propensities; Interpretations of propensities; McCurdy's response; Miller's response; Other possibilities (temporal evolution, renormalization, causal influence); Propensities to generate frequencies; Conclusion.
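A minimal sketch of the difficulty, using the familiar half-silvered mirror example rather than the paper's own notation (the events, symbols, and values below are illustrative assumptions, not Humphreys' formulation): let I be the event that a particle impinges on the mirror at time t1 and T the event that it is transmitted at the later time t2, with single-case propensities Pr(T | I) = p and Pr(I) = q, where 0 < p < 1 and 0 < q < 1, and Pr(T | not-I) = 0. The probability calculus then fixes the inverse conditional probability by Bayes' theorem,

\[
\Pr(I \mid T) \;=\; \frac{\Pr(T \mid I)\,\Pr(I)}{\Pr(T \mid I)\,\Pr(I) + \Pr(T \mid \neg I)\,\Pr(\neg I)} \;=\; \frac{pq}{pq + 0} \;=\; 1,
\]

whereas on a causal reading the particle's propensity to impinge at t1 should be unaffected by whether it is later transmitted, and so should remain q < 1. It is this clash between inversion in the probability calculus and the temporal directedness of propensities that each of the interpretations surveyed in the paper must confront.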
A twofold taxonomy for emergence is presented into which a variety of contemporary accounts of emergence fit. The first taxonomy consists of inferential, conceptual, and ontological emergence; the second of diachronic and synchronic emergence. The adequacy of weak emergence, a computational form of inferential emergence, is then examined and its relationship to conceptual emergence and ontological emergence is detailed.
The process of constructing mathematical models is examined and a case made that the construction process is an integral part of the justification for the model. The role of heuristics in testing and modifying models is described and some consequences for scientific methodology are drawn out. Three different ways of constructing the same model are detailed to demonstrate the claims made here.
It is shown that three common conditions for scientific explanations are violated by a widely used class of domain-independent explanations. These explanations can accommodate both complex and noncomplex systems and do not require the use of detailed models of system-specific processes for their effectiveness, although they are compatible with such model-based explanations. The approach also shows how a clean separation can be maintained between mathematical representations and empirical content.
Although scientific models and simulations differ in numerous ways, they are similar insofar as they pose essentially philosophical problems about the nature of representation. This collection is designed to bring together some of the best work on the nature of representation being done by both established senior philosophers of science and younger researchers. Most of the pieces, while appealing to existing traditions of scientific representation, explore new types of questions, such as: how understanding can be developed within computational science; how the format of representations matters for their use, be it for the purpose of research or education; how the concepts of emergence and supervenience can be further analyzed by taking into account computational science; or how the emphasis upon tractability, a particularly important issue in computational science, sheds new light on the philosophical analysis of scientific reasoning.
There have been many efforts to infer causation from association by using statistical models. Algorithms for automating this process are a more recent innovation. In Humphreys and Freedman [(1996) British Journal for the Philosophy of Science 47, 113–123] we showed that one such approach, by Spirtes et al., was fatally flawed. Here we put our arguments in a broader context and reply to Korb and Wallace [(1997) British Journal for the Philosophy of Science 48, 543–553] and to Spirtes et al. [(1997) British Journal for the Philosophy of Science 48, 555–568]. Their arguments leave our position unchanged: claims to have developed a rigorous engine for inferring causation from association are premature at best, the theorems have no implications for samples of any realistic size, and the examples used to illustrate the algorithms are indicative of failure rather than success. The gap between association and causation has yet to be bridged.
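A toy illustration of the gap the last sentence refers to, not drawn from the exchange itself (the variables and the causal structure are hypothetical): suppose a binary variable Z is a common cause of two binary variables X and Y, with no causal connection between X and Y, and suppose Pr(X=1 | Z=1) > Pr(X=1 | Z=0) and Pr(Y=1 | Z=1) > Pr(Y=1 | Z=0). Then X and Y are probabilistically associated, so that Pr(Y=1 | X=1) > Pr(Y=1), even though intervening on X leaves the distribution of Y unchanged. Any algorithm that moves from observed associations to causal conclusions must rule out such confounding structures, and doing so reliably is where assumptions about the data-generating process and realistic sample sizes become critical.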
This handbook provides both an overview of state-of-the-art scholarship in philosophy of science and a guide to new directions in the discipline. Section I contains broad overviews of the main lines of research and the state of established knowledge in six principal areas of the discipline, including computational, physical, biological, psychological and social sciences, as well as general philosophy of science. Section II covers what are considered to be the traditional topics in the philosophy of science, such as causation, probability, models, ethics and values, and explanation. Section III identifies new areas of investigation that show promise of becoming important areas of research, including the philosophy of astronomy and astrophysics, data, complexity theory, neuroscience, simulations, post-Kuhnian philosophy, post-empiricist epistemology, and emergence. Most chapters are accessible to scientifically educated non-philosophers as well as to professional philosophers, and the contributors, all leading researchers in their fields, bring diverse perspectives from the North American, European, and Australasian research communities. This volume is an essential resource for scholars and students.
In this commentary on Napoletani et al. (Found Sci 16:1–20, 2011), we argue that the approach the authors adopt suggests that neural nets are mathematical techniques rather than models of cognitive processing, that the general approach dates back as far as Ptolemy, and that applied mathematics is more than simply applying results from pure mathematics.
Existing definitions of relevance relations are essentially ambiguous outside the binary case. Hence definitions of probabilistic causality based on relevance relations, as well as probability values based on maximal specificity conditions and homogeneous reference classes, are also not uniquely specified. A ‘neutral state’ account of explanations, based on an earlier account of aleatory explanations by the author, is provided to avoid the problem. Further reasons in support of this model are given, focusing on the dynamics of explanation. It is shown that truth in explanation need not entail maximal specificity and that probabilistic explanations should not contain a specification of probability values.
This volume contains fifteen papers by Paul Humphreys, who has made important contributions to the philosophy of computer simulations, emergence, the philosophy of probability, probabilistic causality, and scientific explanation. It includes detailed postscripts to each section and a philosophical introduction. One of the papers is previously unpublished.
A comparison is made between some epistemological issues arising in computer networks and standard features of social epistemology. A definition of knowledge for computational devices is provided and the topics of nonconceptual content and testimony are discussed.
I argue here for a number of ways that modern computational science requires a change in the way we represent the relationship between theory and applications. It requires a switch away from logical reconstruction of theories in order to take surface mathematical syntax seriously. In addition, syntactically different versions of the same theory have important differences for applications, and this shows that the semantic account of theories is inappropriate for some purposes. I also argue against formalist approaches in the philosophy of science and for a greater role for perceptual knowledge rather than propositional knowledge in scientific empiricism.
Wesley Salmon provided three classic criteria of adequacy for satisfactory interpretations of probability. A fourth criterion is suggested here. A distinction is drawn between frequency‐driven probability models and theory‐driven probability models and it is argued that single case accounts of chance are superior to frequency accounts at least for the latter. Finally it is suggested that theories of chance should be required only to be contingently true, a position which is a natural extension of Salmon's ontic account of probabilistic causality and his own later views on propensities.
Starting with the view that methodological constraints depend upon the nature of the system investigated, a tripartite division between theoretical, semitheoretical, and empirical discoveries is made. Many nanosystems can only be investigated semitheoretically or empirically, and this aspect leads to some nanophenomena being weakly emergent. Self-assembling systems are used as an example, their existence suggesting that the class of systems that is not Kim-reducible may be quite large.