Structural realism is considered by many realists and antirealists alike to be the most defensible form of scientific realism. There are now many forms of structural realism and an extensive literature about them. There are interesting connections with debates in metaphysics, philosophy of physics and philosophy of mathematics. This entry is intended to be a comprehensive survey of the field.
Whether we think of the routine conviction or acquittal of suspects on the basis of scientific evidence in the law courts, the trust placed in scientific medicine and the extraordinary interventions it makes possible, or the importance that policy makers attach to the opinions of scientists, it is clear that those making up our scientific institutions are among the most authoritative and respected people that there are. Among intellectual endeavours science has an unrivalled dominance in terms of funding, status and influence on practical affairs. However, the days when natural science was widely considered to be a model for the study of the arts and the humanities already seem distant. Indeed the influence of science even within subjects which were conceived of as scientific from their very inception, such as political science and sociology, has waned considerably. Perhaps in economics scientism is still dominant but elsewhere in academia a widespread disillusionment with science has taken hold. Perhaps this is understandable given what were with hindsight the obviously foolish attempts to study everything with the same methodology as is employed in physics. Yet the backlash against a misconceived scientism and reductionism in the study of social life and culture has amounted to more than just a defence of disciplinary boundaries, for critics of science now assail it in its own castles (which they allege are built in the air).
The second edition of Peter Lipton’s classic text contains new and important material on the causal model of explanation, the relation of inference to the best explanation to the Bayesian account of scientific reasoning, how exactly explanation guides inference, and why we ought to think that explanatory virtues are truth-tropic. Lipton is a wonderfully clear writer and a thorough and subtle philosopher, and his book is both a student-friendly introduction to the issues addressed, and essential reading for expert epistemologists and philosophers of science. Appeal to the notion of inference to the best explanation is ubiquitous in defences of scientific realism, but also elsewhere in philosophy where the explanatory virtues of theories are often the only purported grounds for accepting or rejecting them. Despite this, most authors are far from explicit about the details of inference to the best explanation, and Lipton’s book is the most sustained investigation of the relationship between explanation and inference currently available. Furthermore, Lipton is exemplary in his engagement with the problems his arguments face, and judiciously modest in his claims, though not so modest as to court triviality. Hence, the book is replete with interesting and careful arguments. Everyone interested in epistemology or philosophy of science ought to read this book. That said, in my discussion below I will concentrate on what I regard as problems with some of Lipton’s arguments. The model of explanation which he develops is contrastive and causal. Lipton is clear that he does not think all explanations are causal, but he does think that many are, especially in science.
The aim of this paper is to revisit the phlogiston theory to see what can be learned from it about the relationship between scientific realism, approximate truth and successful reference. It is argued that phlogiston theory did to some extent correctly describe the causal or nomological structure of the world, and that some of its central terms can be regarded as referring. However, it is concluded that the issue of whether or not theoretical terms successfully refer is not the key to formulating the appropriate form of scientific realism in response to arguments from theory change, and that the case of phlogiston theory is shown to be readily accommodated by ontic structural realism.
van Fraassen (The empirical stance, 2002) contrasts the empirical stance with the materialist stance. The way he describes them makes both of them attractive, and while opposed they have something in common, for both stances are scientific approaches to philosophy. The difference between them reflects their differing conceptions of science itself. Empiricists emphasise fallibilism, verifiability and falsifiability, and also to some extent scepticism and tolerance of novel hypotheses. Materialists regard the theoretical picture of the world as matter in motion as a true and explanatory account and insist on not taking ‘spooky’ entities or processes seriously as potential explanations of phenomena that so far lie outside the scope of successful science. The history of science shows us that both stances have been instrumental in the achievement of progress at various times. It is therefore plausible for a naturalist to suggest that science depends for its success on the dialectic between empiricism and materialism. A truly naturalist approach to philosophy ought then to synthesise them. Call the synthesised empiricist and materialist stances ‘the scientistic stance’. This paper elaborates and defends it.
Complex systems research is becoming ever more important in both the natural and social sciences. It is commonly implied that there is such a thing as a complex system, different examples of which are studied across many disciplines. However, there is no concise definition of a complex system, let alone a definition on which all scientists agree. We review various attempts to characterize a complex system, and consider a core set of features that are widely associated with complex systems in the literature and by those in the field. We argue that some of these features are neither necessary nor sufficient for complexity, and that some of them are too vague or confused to be of any analytical use. In order to bring mathematical rigour to the issue we then review some standard measures of complexity from the scientific literature, and offer a taxonomy for them, before arguing that the one that best captures the qualitative notion of the order produced by complex systems is that of the Statistical Complexity. Finally, we offer our own list of necessary conditions as a characterization of complexity. These conditions are qualitative and may not be jointly sufficient for complexity. We close with some suggestions for future work.
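As a toy illustration of why the choice of measure matters (this sketches ordinary block entropy, a standard textbook measure, not the Statistical Complexity the paper favours; all function names are my own), note that a random and a perfectly periodic binary sequence can have identical symbol frequencies, so a naive single-symbol randomness measure cannot tell order from noise, while entropy over longer blocks can:

```python
from collections import Counter
from math import log2
import random

def block_entropy(seq, block_len):
    """Empirical Shannon entropy (bits) of overlapping length-L blocks."""
    blocks = [tuple(seq[i:i + block_len])
              for i in range(len(seq) - block_len + 1)]
    n = len(blocks)
    counts = Counter(blocks)
    return -sum((c / n) * log2(c / n) for c in counts.values())

random.seed(0)
random_seq = [random.choice("01") for _ in range(10_000)]
periodic_seq = list("01" * 5_000)

# Symbol frequencies are (nearly) identical, so length-1 entropy is
# ~1 bit for both sequences and cannot distinguish them...
print(block_entropy(random_seq, 1), block_entropy(periodic_seq, 1))

# ...but at block length 8 the periodic sequence contains only two
# distinct blocks (~1 bit), while the random one stays near 8 bits.
print(block_entropy(random_seq, 8), block_entropy(periodic_seq, 8))
```

This is also why, as the abstract suggests, entropy-style measures alone are poor measures of complexity: they are maximised by pure randomness, whereas the intuitive notion of complex order lies between perfect regularity and noise.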
There has been much debate in philosophy about the relation between identity and distinctness on the one hand, and various forms of discernibility on the other. For instance, philosophers have debated the truth of the Principle of the Identity of Indiscernibles (PII), which is naturally formulated using a second-order quantifier ranging over some class of properties of particular philosophical significance.
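The second-order formulation alluded to here is standardly written as follows (a routine rendering for illustration, not a quotation from any particular paper), with the quantifier over F restricted to the philosophically relevant class of properties:

```latex
% Principle of the Identity of Indiscernibles (PII):
% if x and y share every property F in the chosen class, then x = y.
\forall x \, \forall y \, \bigl( \forall F \, (Fx \leftrightarrow Fy) \rightarrow x = y \bigr)
```

Varying the class of properties over which the second-order quantifier ranges (pure qualitative properties, relational properties, and so on) yields versions of PII of different strengths, which is why the choice of that class carries the philosophical weight.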
I do not see why all philosophers should be interested in communicating their thoughts to the world. Philosophy is no different in this regard from pure mathematics or microbiology. The idea that every scientist should be a part-time public speaker is absurd.
Scientific representation: A long journey from pragmatics to pragmatics. Journal article, Metascience (Online ISSN 1467-9981, Print ISSN 0815-0796), DOI 10.1007/s11016-010-9465-5. Authors: James Ladyman (Department of Philosophy, University of Bristol), Otávio Bueno (Department of Philosophy, University of Miami), Mauricio Suárez (Department of Logic and Philosophy of Science, Complutense University of Madrid), Bas C. van Fraassen (Philosophy Department, San Francisco State University).
It is argued that recent discussion of the principle of the identity of indiscernibles (PII) and quantum mechanics has lost sight of the broader philosophical motivation and significance of PII, and that the 'received view' of the status of PII in the light of quantum mechanics survives recent criticisms of it by Muller, Saunders, and Seevinck.
We provide a formulation of physicalism, and show that this is to be favoured over alternative formulations. Much of the literature on physicalism assumes without argument that there is a fundamental level to reality, and we show that a consideration of the levels problem and its implications for physicalism tells in favour of the form of physicalism proposed here. Its key elements are, first, that the empirical and substantive part of physicalism amounts to a prediction that physics will not posit new entities solely for the purpose of accounting for mental phenomena, nor new entities with essentially mental characteristics such as propositional attitudes or intentions; second, that physicalism can safely make do with no more than a weak global formulation of supervenience.
The primacy of physics generates a philosophical problem that the naturalist must solve in order to be entitled to an egalitarian acceptance of the ontological commitments he or she inherits from the special sciences and fundamental physics. The problem is the generalized causal exclusion argument. If there is no genuine causation in the domains of the special sciences but only in fundamental physics then there are grounds for doubting the existence of macroscopic objects and properties, or at least their concreteness. The aim of this paper is to show that the causal exclusion problem derives its force from a false dichotomy between Humeanism about causation and a notion of productive or generative causation based on a defunct model of the physical world.
When considering controversial thermodynamic scenarios such as Maxwell's demon, it is often necessary to consider probabilistic mixtures of states. This raises the question of how, if at all, to assign entropy to them. The information-theoretic entropy is often used in such cases; however, no general proof of the soundness of doing so has been given, and indeed some arguments against doing so have been presented. We offer a general proof of the applicability of the information-theoretic entropy to probabilistic mixtures of macrostates, making clear the assumptions on which it depends, in particular a probabilistic version of the Kelvin statement of the Second Law. We briefly discuss the interpretation of our result.
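A minimal sketch of the assignment at issue (my own formulation for illustration, not the paper's proof): on the standard information-theoretic proposal, a mixture in which macrostate i, with thermodynamic entropy S_i, occurs with probability p_i is assigned the expected macrostate entropy plus k_B times the Shannon entropy of the mixing distribution.

```python
from math import log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def mixture_entropy(probs, entropies):
    """Information-theoretic entropy (J/K) assigned to a probabilistic
    mixture of macrostates: the probability-weighted mean of the
    component entropies, plus k_B times the Shannon entropy (in nats)
    of the mixing distribution itself."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    mean_entropy = sum(p * s for p, s in zip(probs, entropies))
    mixing_term = -K_B * sum(p * log(p) for p in probs if p > 0)
    return mean_entropy + mixing_term

# A fair mixture of two zero-entropy macrostates is assigned k_B ln 2,
# the familiar 'one bit' of entropy.
print(mixture_entropy([0.5, 0.5], [0.0, 0.0]))
```

The mixing term vanishes when one macrostate occurs with certainty, so the assignment reduces to the ordinary thermodynamic entropy in the unmixed case.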
In discussions about whether the Principle of the Identity of Indiscernibles is compatible with structuralist ontologies of mathematics, it is usually assumed that individual objects are subject to criteria of identity which somehow account for the identity of the individuals. Much of this debate concerns structures that admit of non-trivial automorphisms. We consider cases from graph theory that violate even weak formulations of PII. We argue that (i) the identity or difference of places in a structure is not to be accounted for by anything other than the structure itself and that (ii) mathematical practice provides evidence for this view. We want to thank Leon Horsten, Jeff Ketland, Øystein Linnebo, John Mayberry, Richard Pettigrew, and Philip Welch for valuable comments on drafts of this paper. We are especially grateful to Fraser MacBride for correcting our interpretation of two of his papers and for other helpful comments.
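The automorphism case mentioned above is easy to exhibit concretely (a toy sketch of my own, illustrating non-trivial automorphisms rather than the stronger weak-PII violations the paper discusses): if a graph admits a non-trivial automorphism, the vertices it swaps share every property expressible in purely graph-theoretic vocabulary.

```python
def is_automorphism(adj, perm):
    """Check that a vertex permutation preserves the edge relation
    of a graph given as a dict mapping each vertex to its neighbour set."""
    return all(
        (perm[u] in adj[perm[v]]) == (u in adj[v])
        for u in adj
        for v in adj
    )

# A 4-cycle a-b-c-d-a: swapping the opposite vertices a and c while
# fixing b and d preserves all edges, so it is a non-trivial
# automorphism, and no structural property distinguishes a from c.
adj = {"a": {"b", "d"}, "b": {"a", "c"}, "c": {"b", "d"}, "d": {"a", "c"}}
swap_ac = {"a": "c", "b": "b", "c": "a", "d": "d"}
print(is_automorphism(adj, swap_ac))  # True

# By contrast, swapping adjacent vertices a and b breaks the edge
# relation (it would require b-d to be an edge), so it is ruled out.
swap_ab = {"a": "b", "b": "a", "c": "c", "d": "d"}
print(is_automorphism(adj, swap_ab))  # False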
There has recently been a good deal of controversy about Landauer's Principle, which is often stated as follows: The erasure of one bit of information in a computational device is necessarily accompanied by a generation of kT ln 2 heat. This is often generalised to the claim that any logically irreversible operation cannot be implemented in a thermodynamically reversible way. John Norton (2005) and Owen Maroney (2005) both argue that Landauer's Principle has not been shown to hold in general, and Maroney offers a method that he claims instantiates the operation Reset in a thermodynamically reversible way. In this paper we defend the qualitative form of Landauer's Principle, and clarify its quantitative consequences (assuming the second law of thermodynamics). We analyse in detail what it means for a physical system to implement a logical transformation L, and we make this precise by defining the notion of an L-machine. Then we show that logical irreversibility of L implies thermodynamic irreversibility of every corresponding L-machine. We do this in two ways. First, by assuming the phenomenological validity of the Kelvin statement of the second law, and second, by using information-theoretic reasoning. We illustrate our results with the example of the logical transformation 'Reset', and thereby recover the quantitative form of Landauer's Principle.
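The quantitative bound in the statement above is a one-line computation; a small sketch (the helper name is my own) makes its scale vivid:

```python
from math import log

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_bound(temperature_kelvin, bits=1):
    """Minimum heat (in joules) generated by erasing `bits` bits at the
    given temperature, per the quantitative form of Landauer's Principle:
    Q >= N * k * T * ln 2."""
    return bits * K_B * temperature_kelvin * log(2)

# At room temperature (300 K) erasing one bit costs of the order of
# 10^-21 J, which is why the bound is invisible in everyday computing
# but decisive in the Maxwell's demon literature.
print(landauer_bound(300.0))
```

Note that this is the quantitative consequence of the principle, not the paper's argument for it; the paper's defence proceeds via the L-machine construction and the Kelvin statement of the second law.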
In an earlier paper, I argued that Bas van Fraassen's constructive empiricism was undermined in various ways by his antirealism about modality. Here I offer some comments and responses to the reply to my arguments by Bradley Monton and van Fraassen. In particular, after making some minor points, I argue that Monton and van Fraassen have not done enough to show that the context dependence of counterfactuals renders their truth conditions non-objective, and I also argue that adopting modal realism does after all undermine the motivation for constructive empiricism.
This commentary argues that Ross & Spurrett (R&S) have not shown that supervenience is two-way, but they have shown that all the sciences, including physics, make use of functional and supervenient properties. The entrenched defender of Kim's position could insist that only fundamental physics describes causal relations directly, but Kim's microphysical reductionism becomes completely implausible when we consider contemporary physics.
We outline Ladyman's 'metaphysical' or 'ontic' form of structural realism and defend it against various objections. Cao, in particular, has questioned the view of ontology presupposed by this approach and we argue that by reconceptualising objects in structural terms it offers the best hope for the realist in the context of modern physics.
Without scientific theory, the technological developments of recent years would not have been possible. In this exceptionally clear and engaging introduction to philosophy of science, James Ladyman explores the scope of natural science and its implications for human life. With the focus firmly upon realism, he discusses how fundamental philosophical questions can be answered by science and how scientific theory can confirm and inform our basic and intrinsic knowledge.
Constructive empiricism is supposed to offer a positive alternative to scientific realism that dispenses with the need for metaphysics. I first review the terms of the debate before arguing that the standard objections to constructive empiricism are not decisive. I then explain van Fraassen's views on modality and counterfactuals, and argue that, because constructive empiricism recommends on epistemological grounds belief in the empirical adequacy rather than the truth of theories, it requires that there be an objective modal distinction between the observable and the unobservable. This conclusion is incompatible with van Fraassen's empiricism. Finally I explain some further problems for constructive empiricism that arise when we consider modal matters.
The semantic, or model-theoretic, approach to theories has recently come under criticism on two fronts: (i) it is claimed that it cannot account for the wide diversity of models employed in scientific practice—a claim which has led some to propose a “deflationary” account of models; (ii) it is further contended that the sense of “model” used by the approach differs from that given in model theory. Our aim in the present work is to articulate a possible response to these claims, drawing on recent developments within the semantic approach itself. Thus, the first is answered by utilizing the notion of a “partial structure”, first introduced in this context by da Costa and French in 1990. The second claim is undermined by consideration of van Fraassen's understanding of “model” which corresponds well with that evinced by modern mathematicians. This latter discussion, in particular, has an impact on the continuing debate regarding the relative merits of the semantic and syntactic views and the developments presented here can be taken to provide further support to the former.
Cartwright and her collaborators have elaborated a provocative view of science which emphasises the independence from theory, in methods and aims, of phenomenological model building. This thesis has been supported in a recent paper by an analysis of the London and London model of superconductivity. In the present work we begin with a critique of Cartwright's account of the relationship between theoretical and phenomenological models before elaborating an alternative picture within the framework of the partial structures version of the semantic approach to theories. Drawing on the recent histories of superconductivity by Dahl and Gavroglu, together with the original works by London and London and by F. London separately, and taking due consideration of the heuristic aspects, we argue that the historical details fail to support Cartwright et al.'s claims but that they fit comfortably within the partial structures framework.