This book argues that the only kind of metaphysics that can contribute to objective knowledge is one based specifically on contemporary science as it really is, and not on philosophers' a priori intuitions, common sense, or simplifications of science. In addition to showing how recent metaphysics has drifted away from connection with all other serious scholarly inquiry as a result of not heeding this restriction, this book demonstrates how to build a metaphysics compatible with current fundamental physics, which, when combined with metaphysics of the special sciences, can be used to unify physics with the other sciences without reducing these sciences to physics itself. Taking science metaphysically seriously, this book argues, means that metaphysicians must abandon the picture of the world as composed of self-subsistent individual objects, and the paradigm of causation as the collision of such objects. The text assesses the role of information theory and complex systems theory in attempts to explain the relationship between the special sciences and physics, treading a middle road between the grand synthesis of thermodynamics and information, and eliminativism about information. The consequences of the book's metaphysical theory for central issues in the philosophy of science are explored, including the implications for the realism versus empiricism debate, the role of causation in scientific explanations, the nature of causation and laws, the status of abstract and virtual objects, and the objective reality of natural kinds.
Structural realism is considered by many realists and antirealists alike as the most defensible form of scientific realism. There are now many forms of structural realism and an extensive literature about them. There are interesting connections with debates in metaphysics, philosophy of physics and philosophy of mathematics. This entry is intended to be a comprehensive survey of the field.
Social machines are systems formed by technical and human elements interacting in a structured manner. The use of digital platforms as mediators allows large numbers of human participants to join such mechanisms, creating systems where interconnected digital and human components operate as a single machine capable of highly sophisticated behaviour. Under certain conditions, such systems can be described as autonomous and goal-driven agents. Many examples of modern Artificial Intelligence (AI) can be regarded as instances of this class of mechanisms. We argue that this type of autonomous social machine has provided a new paradigm for the design of intelligent systems, marking a new phase in the field of AI. The consequences of this observation range from the methodological and philosophical to the ethical. On the one hand, it emphasises the role of Human-Computer Interaction in the design of intelligent systems; on the other, it draws attention to the risks both for individual human beings and for a society that relies on mechanisms that are not necessarily controllable. The difficulty companies face in regulating the spread of misinformation, and the difficulty authorities face in protecting task-workers managed by a software infrastructure, may be just some of the effects of this technological paradigm.
We outline Ladyman's 'metaphysical' or 'ontic' form of structural realism and defend it against various objections. Cao, in particular, has questioned the view of ontology presupposed by this approach, and we argue that by reconceptualising objects in structural terms it offers the best hope for the realist in the context of modern physics.
The semantic, or model-theoretic, approach to theories has recently come under criticism on two fronts: (i) it is claimed that it cannot account for the wide diversity of models employed in scientific practice—a claim which has led some to propose a “deflationary” account of models; (ii) it is further contended that the sense of “model” used by the approach differs from that given in model theory. Our aim in the present work is to articulate a possible response to these claims, drawing on recent developments within the semantic approach itself. Thus, the first is answered by utilizing the notion of a “partial structure”, first introduced in this context by da Costa and French in 1990. The second claim is undermined by consideration of van Fraassen's understanding of “model”, which corresponds well with that evinced by modern mathematicians. This latter discussion, in particular, has an impact on the continuing debate regarding the relative merits of the semantic and syntactic views, and the developments presented here can be taken to provide further support to the former.
There is good reason to believe that scientific realism requires a commitment to the objective modal structure of the physical world. Causality, equilibrium, laws of nature, and probability all feature prominently in scientific theory and explanation, and each one is a modal notion. If we are committed to the content of our best scientific theories, we must accept the modal nature of the physical world. But what does the scientific realist’s commitment to physical modality require? We consider whether scientific realism is compatible with Humeanism about the laws of nature, and we conclude that it is not. We specifically identify three major problems for the best-systems account of lawhood: its central concept of strength cannot be formulated non-circularly, it cannot offer a satisfactory account of the laws of the special sciences, and it can offer no explanation of the success of inductive inference. In addition, Humeanism fails to be naturalistically motivated. For these reasons, we conclude that the scientific realist must embrace natural necessity.
Few can imagine a world without telephones or televisions; many depend on computers and the Internet as part of daily life. Without scientific theory, these developments would not have been possible. In this exceptionally clear and engaging introduction to philosophy of science, James Ladyman explores the philosophical questions that arise when we reflect on the nature of the scientific method and the knowledge it produces. He discusses whether fundamental philosophical questions about knowledge and reality might be answered by science, and considers in detail the debate between realists and antirealists about the extent of scientific knowledge. Along the way, central topics in philosophy of science, such as the demarcation of science from non-science, induction, confirmation and falsification, the relationship between theory and observation, and relativism, are all addressed. Important and complex current debates over underdetermination, inference to the best explanation and the implications of radical theory change are clarified and clearly explained for those new to the subject.
The aim of this paper is to revisit the phlogiston theory to see what can be learned from it about the relationship between scientific realism, approximate truth and successful reference. It is argued that phlogiston theory did to some extent correctly describe the causal or nomological structure of the world, and that some of its central terms can be regarded as referring. However, it is concluded that the issue of whether or not theoretical terms successfully refer is not the key to formulating the appropriate form of scientific realism in response to arguments from theory change, and that the case of phlogiston theory is shown to be readily accommodated by ontic structural realism.
In discussions about whether the Principle of the Identity of Indiscernibles (PII) is compatible with structuralist ontologies of mathematics, it is usually assumed that individual objects are subject to criteria of identity which somehow account for the identity of the individuals. Much of this debate concerns structures that admit of non-trivial automorphisms. We consider cases from graph theory that violate even weak formulations of PII. We argue that (i) the identity or difference of places in a structure is not to be accounted for by anything other than the structure itself, and that (ii) mathematical practice provides evidence for this view. We want to thank Leon Horsten, Jeff Ketland, Øystein Linnebo, John Mayberry, Richard Pettigrew, and Philip Welch for valuable comments on drafts of this paper. We are especially grateful to Fraser MacBride for correcting our interpretation of two of his papers and for other helpful comments.
It is argued that recent discussion of the principle of the identity of indiscernibles (PII) and quantum mechanics has lost sight of the broader philosophical motivation and significance of PII and that the `received view' of the status of PII in the light of quantum mechanics survives recent criticisms of it by Muller, Saunders, and Seevinck.
We examine, from the partial structures perspective, two forms of applicability of mathematics: at the “bottom” level, the applicability of theoretical structures to the “appearances”, and at the “top” level, the applicability of mathematical to physical theories. We argue that, to accommodate these two forms of applicability, the partial structures approach needs to be extended to include a notion of “partial homomorphism”. As a case study, we present London's analysis of the superfluid behavior of liquid helium in terms of Bose‐Einstein statistics. This involved both the introduction of group theory at the top level, and some modeling at the “phenomenological” level, and thus provides a nice example of the relationships we are interested in. We conclude with a discussion of the “autonomy” of London's model.
Interactions between an intelligent software agent (ISA) and a human user are ubiquitous in everyday situations such as access to information, entertainment, and purchases. In such interactions, the ISA mediates the user’s access to the content, or controls some other aspect of the user experience, and is not designed to be neutral about outcomes of user choices. Like human users, ISAs are driven by goals, make autonomous decisions, and can learn from experience. Using ideas from bounded rationality, we frame these interactions as instances of an ISA whose reward depends on actions performed by the user. Such agents benefit by steering the user’s behaviour towards outcomes that maximise the ISA’s utility, which may or may not be aligned with that of the user. Video games, news recommendation aggregation engines, and fitness trackers can all be instances of this general case. Our analysis facilitates distinguishing various subcases of interaction, as well as second-order effects that might include the possibility for adaptive interfaces to induce behavioural addiction, and/or change in user belief. We present these types of interaction within a conceptual framework, and review current examples of persuasive technologies and the issues that arise from their use. We argue that the nature of the feedback commonly used by learning agents to update their models and subsequent decisions could steer the behaviour of human users away from what benefits them, and in a direction that can undermine autonomy and cause further disparity between actions and goals as exemplified by addictive and compulsive behaviour. We discuss some of the ethical, social and legal implications of this technology and argue that it can sometimes exploit and reinforce weaknesses in human beings.
Complex systems research is becoming ever more important in both the natural and social sciences. It is commonly implied that there is such a thing as a complex system, different examples of which are studied across many disciplines. However, there is no concise definition of a complex system, let alone a definition on which all scientists agree. We review various attempts to characterize a complex system, and consider a core set of features that are widely associated with complex systems in the literature and by those in the field. We argue that some of these features are neither necessary nor sufficient for complexity, and that some of them are too vague or confused to be of any analytical use. In order to bring mathematical rigour to the issue we then review some standard measures of complexity from the scientific literature, and offer a taxonomy for them, before arguing that the one that best captures the qualitative notion of the order produced by complex systems is that of the Statistical Complexity. Finally, we offer our own list of necessary conditions as a characterization of complexity. These conditions are qualitative and may not be jointly sufficient for complexity. We close with some suggestions for future work.
Homotopy Type Theory (HoTT) is a proposed new language and foundation for mathematics, combining algebraic topology with logic. An important rule for the treatment of identity in HoTT is path induction, which is commonly explained by appeal to the homotopy interpretation of the theory's types, tokens, and identities as spaces, points, and paths. However, if HoTT is to be an autonomous foundation then such an interpretation cannot play a fundamental role. In this paper we give a derivation of path induction, motivated by pre-mathematical considerations, without recourse to homotopy theory.
Constructive empiricism is supposed to offer a positive alternative to scientific realism that dispenses with the need for metaphysics. I first review the terms of the debate before arguing that the standard objections to constructive empiricism are not decisive. I then explain van Fraassen's views on modality and counterfactuals, and argue that, because constructive empiricism recommends on epistemological grounds belief in the empirical adequacy rather than the truth of theories, it requires that there be an objective modal distinction between the observable and the unobservable. This conclusion is incompatible with van Fraassen's empiricism. Finally I explain some further problems for constructive empiricism that arise when we consider modal matters.
Questions about the relation between identity and discernibility are important both in philosophy and in model theory. We show how a philosophical question about identity and discernibility can be ‘factorized’ into a philosophical question about the adequacy of a formal language to the description of the world, and a mathematical question about discernibility in this language. We provide formal definitions of various notions of discernibility and offer a complete classification of their logical relations. Some new and surprising facts are proved; for instance, that weak discernibility corresponds to discernibility in a language with constants for every object, and that weak discernibility is the most discerning nontrivial discernibility relation.
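Two of the grades of discernibility at issue in this abstract can be sketched in standard first-order terms (a simplified gloss for orientation, not the paper's official definitions):

```latex
% Two grades of discernibility for objects a and b (simplified sketch):
% Absolute discernibility: some one-place formula separates them.
\exists \varphi \,\bigl(\varphi(a) \wedge \neg\varphi(b)\bigr)
% Weak discernibility: some two-place formula relates them
% irreflexively, without naming either object.
\exists \psi \,\bigl(\psi(a,b) \wedge \neg\psi(a,a)\bigr)
```

On this gloss, the result that weak discernibility corresponds to discernibility in a language with constants for every object can be read as saying that supplying a name for each object collapses the gap between the two grades.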
This chapter discusses the plausibility of the criticism against the thesis that external factors causally influence cognition and that they are, consequently, partly constitutive of cognition. The discussion should not be taken as implicitly proposing that the opposite theory is true, although the works of Adams and Aizawa suggest that they are defending internalism. This can be attributed to the fact that systems are, by definition, bounded; one must make assumptions about systems in developing cognitive models. This chapter defends the position that metaphysical considerations should play no role in deciding how to model cognition. It further explains how there is no basis for a general fact of the matter about determining what is and what is not a cognitive system.
While there are many examples of metaphysical theorising being heuristically and intellectually important in the progress of scientific knowledge, many people wonder how metaphysics not closely informed and inspired by empirical science could lead to rival or even supplementary knowledge about the world. This paper assesses the merits of a popular defence of the a priori methodology of metaphysics that goes as follows. The first task of the metaphysician, like the scientist, is to construct a hypothesis that accounts for the phenomena in question. It is then argued that among the possible metaphysical theories, the empirical evidence underdetermines the right one, just as the empirical evidence underdetermines the right scientific theory. In the latter case it is widely agreed that we must break the underdetermination by appeal to theoretical virtues, and this is just what should be and largely is done in metaphysics. This is part of a more general line of argument that defends metaphysics on the basis of its alleged continuity with highly theoretical science. In what follows metaphysics and theoretical science are compared in order to see whether the above style of defence of a priori metaphysics is successful.
van Fraassen (The Empirical Stance, 2002) contrasts the empirical stance with the materialist stance. The way he describes them makes both of them attractive, and while opposed they have something in common, for both stances are scientific approaches to philosophy. The difference between them reflects their differing conceptions of science itself. Empiricists emphasise fallibilism, verifiability and falsifiability, and also to some extent scepticism and tolerance of novel hypotheses. Materialists regard the theoretical picture of the world as matter in motion as a true and explanatory account, and insist on not taking 'spooky' entities or processes seriously as potential explanations of phenomena that so far lie outside the scope of successful science. The history of science shows us that both stances have been instrumental in the achievement of progress at various times. It is therefore plausible for a naturalist to suggest that science depends for its success on the dialectic between empiricism and materialism. A truly naturalist approach to philosophy ought then to synthesise them. Call the synthesised empiricist and materialist stances 'the scientistic stance'. This paper elaborates and defends it.
The Hole Argument is primarily about the meaning of general covariance in general relativity. As such it raises many deep issues about identity in mathematics and physics, the ontology of space–time, and how scientific representation works. This paper is about the application of a new foundational programme in mathematics, namely homotopy type theory, to the Hole Argument. It is argued that the framework of HoTT provides a natural resolution of the Hole Argument. The role of the Univalence Axiom in the treatment of the Hole Argument in HoTT is clarified.
Homotopy Type Theory (HoTT) is a putative new foundation for mathematics grounded in constructive intensional type theory that offers an alternative to the foundations provided by ZFC set theory and category theory. This article explains and motivates an account of how to define, justify, and think about HoTT in a way that is self-contained, and argues that, so construed, it is a candidate for being an autonomous foundation for mathematics. We first consider various questions that a foundation for mathematics might be expected to answer, and find that many of them are not answered by the standard formulation of HoTT as presented in the ‘HoTT Book’. More importantly, the presentation of HoTT given in the HoTT Book is not autonomous since it explicitly depends upon other fields of mathematics, in particular homotopy theory. We give an alternative presentation of HoTT that does not depend upon ideas from other parts of mathematics, and in particular makes no reference to homotopy theory, and argue that it is a candidate autonomous foundation for mathematics. Our elaboration of HoTT is based on a new interpretation of types as mathematical concepts, which accords with the intensional nature of the type theory.
1 Introduction
2 What Is a Foundation for Mathematics?
2.1 A characterization of a foundation for mathematics
2.2 Autonomy
3 The Basic Features of Homotopy Type Theory
3.1 The rules
3.2 The basic ways to construct types
3.3 Types as propositions and propositions as types
3.4 Identity
3.5 The homotopy interpretation
4 Autonomy of the Standard Presentation?
5 The Interpretation of Tokens and Types
5.1 Tokens as mathematical objects?
5.2 Tokens and types as concepts
6 Justifying the Elimination Rule for Identity
7 The Foundations of Homotopy Type Theory without Homotopy
7.1 Framework
7.2 Semantics
7.3 Metaphysics
7.4 Epistemology
7.5 Methodology
8 Possible Objections to this Account
8.1 A constructive foundation for mathematics?
8.2 What are concepts?
8.3 Isn’t this just Brouwerian intuitionism?
8.4 Duplicated objects
8.5 Intensionality and substitution salva veritate
9 Conclusion
9.1 Advantages of this foundation
There has recently been a good deal of controversy about Landauer's Principle, which is often stated as follows: The erasure of one bit of information in a computational device is necessarily accompanied by a generation of kT ln 2 heat. This is often generalised to the claim that any logically irreversible operation cannot be implemented in a thermodynamically reversible way. John Norton (2005) and Owen Maroney (2005) both argue that Landauer's Principle has not been shown to hold in general, and Maroney offers a method that he claims instantiates the operation Reset in a thermodynamically reversible way. In this paper we defend the qualitative form of Landauer's Principle, and clarify its quantitative consequences (assuming the second law of thermodynamics). We analyse in detail what it means for a physical system to implement a logical transformation L, and we make this precise by defining the notion of an L-machine. Then we show that logical irreversibility of L implies thermodynamic irreversibility of every corresponding L-machine. We do this in two ways. First, by assuming the phenomenological validity of the Kelvin statement of the second law, and second, by using information-theoretic reasoning. We illustrate our results with the example of the logical transformation 'Reset', and thereby recover the quantitative form of Landauer's Principle.
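For reference, the quantitative statement quoted at the start of this abstract is the standard Landauer bound, with k Boltzmann's constant and T the temperature of the environment (the generalisation shown second is one common formulation, not necessarily the paper's exact statement):

```latex
% Erasing one bit of information dissipates at least kT ln 2 of heat:
\Delta Q \;\geq\; k T \ln 2
% One common generalisation: if a logical operation decreases the
% Shannon entropy of the logical state by \Delta H bits, then
\Delta Q \;\geq\; k T \ln 2 \cdot \Delta H
```

Resetting an n-state register to a single state is the special case in which the decrease is log2 n bits.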
Psillos has recently argued that van Fraassen’s arguments against abduction fail. Moreover, he claimed that, if successful, these arguments would equally undermine van Fraassen’s own constructive empiricism, for, Psillos thinks, it is only by appeal to abduction that constructive empiricism can be saved from issuing in a bald scepticism. We show that Psillos’ criticisms are misguided, and that they are mostly based on misinterpretations of van Fraassen’s arguments. Furthermore, we argue that Psillos’ arguments for his claim that constructive empiricism itself needs abduction point up his failure to recognize the importance of van Fraassen’s broader epistemology for constructive empiricism. Towards the end of our paper we discuss the suspected relationship between constructive empiricism and scepticism in the light of this broader epistemology, and from a somewhat more general perspective.
The Univalence axiom, due to Vladimir Voevodsky, is often taken to be one of the most important discoveries arising from the Homotopy Type Theory research programme. It is said by Steve Awodey that Univalence embodies mathematical structuralism, and that Univalence may be regarded as ‘expanding the notion of identity to that of equivalence’. This article explores the conceptual, foundational and philosophical status of Univalence in Homotopy Type Theory. It extends our Types-as-Concepts interpretation of HoTT to Universes, and offers an account of the Univalence axiom in such terms. We consider Awodey’s informal argument that Univalence is motivated by the principle that reasoning should be invariant under isomorphism, and we examine whether an autonomous and rigorous justification along these lines can be given. We consider two problems facing such a justification. First, there is a difference between equivalence and isomorphism and Univalence must be formulated in terms of the former. Second, the argument as presented cannot establish Univalence itself but only a weaker version of it, and must be supplemented by an additional principle. The article argues that the prospects for an autonomous justification of Univalence are promising.
Cartwright and her collaborators have elaborated a provocative view of science which emphasises the independence from theory, in methods and aims, of phenomenological model building. This thesis has been supported in a recent paper by an analysis of the London and London model of superconductivity. In the present work we begin with a critique of Cartwright's account of the relationship between theoretical and phenomenological models before elaborating an alternative picture within the framework of the partial structures version of the semantic approach to theories. Drawing on the recent histories of superconductivity by Dahl and Gavroglu, together with the original works by London and London and by F. London separately, and taking due consideration of the heuristic aspects, we argue that the historical details fail to support Cartwright et al.'s claims but that they fit comfortably within the partial structures framework.
Among the most interesting features of Homotopy Type Theory is the way it treats identity, which has various unusual characteristics. We examine the formal features of “identity types” in HoTT, and how they relate to its other features including intensionality, constructive logic, the interpretation of types as concepts, and the Univalence Axiom. The unusual behaviour of identity types might suggest that they be reinterpreted as representing indiscernibility. We explore this by defining indiscernibility in HoTT and examine its relationship with identity. We argue that identity types are a primitive component of HoTT and cannot be reduced to indiscernibility.
Scientific representation: A long journey from pragmatics to pragmatics. Journal article in Metascience (DOI 10.1007/s11016-010-9465-5; Online ISSN 1467-9981, Print ISSN 0815-0796). Authors: James Ladyman, Department of Philosophy, University of Bristol, 9 Woodland Rd, Bristol, BS8 1TB, UK; Otávio Bueno, Department of Philosophy, University of Miami, Coral Gables, FL 33124, USA; Mauricio Suárez, Department of Logic and Philosophy of Science, Complutense University of Madrid, 28040 Madrid, Spain; Bas C. van Fraassen, Philosophy Department, San Francisco State University, 1600 Holloway Avenue, San Francisco, CA 94132, USA.
The primacy of physics generates a philosophical problem that the naturalist must solve in order to be entitled to an egalitarian acceptance of the ontological commitments he or she inherits from the special sciences and fundamental physics. The problem is the generalized causal exclusion argument. If there is no genuine causation in the domains of the special sciences but only in fundamental physics, then there are grounds for doubting the existence of macroscopic objects and properties, or at least the concreteness of them. The aim of this paper is to show that the causal exclusion problem derives its force from a false dichotomy between Humeanism about causation and a notion of productive or generative causation based on a defunct model of the physical world.
Quantum mechanics tells us that states involving indistinguishable fermions must be antisymmetrized. This is often taken to mean that indistinguishable fermions are always entangled. We consider several notions of entanglement and argue that on the best of them, indistinguishable fermions are not always entangled. We also present a simple but unconventional way of representing fermionic states that allows us to maintain a link between entanglement and non-factorizability.
We provide a formulation of physicalism, and show that this is to be favoured over alternative formulations. Much of the literature on physicalism assumes without argument that there is a fundamental level to reality, and we show that a consideration of the levels problem and its implications for physicalism tells in favour of the form of physicalism proposed here. Its key elements are, first, that the empirical and substantive part of physicalism amounts to a prediction that physics will not posit new entities solely for the purpose of accounting for mental phenomena, nor new entities with essentially mental characteristics such as propositional attitudes or intentions; secondly, that physicalism can safely make do with no more than a weak global formulation of supervenience.
When considering controversial thermodynamic scenarios such as Maxwell's demon, it is often necessary to consider probabilistic mixtures of states. This raises the question of how, if at all, to assign entropy to them. The information-theoretic entropy is often used in such cases; however, no general proof of the soundness of doing so has been given, and indeed some arguments against doing so have been presented. We offer a general proof of the applicability of the information-theoretic entropy to probabilistic mixtures of macrostates, making clear the assumptions on which it depends, in particular a probabilistic version of the Kelvin statement of the Second Law. We briefly discuss the interpretation of our result.
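The kind of entropy assignment at issue can be illustrated as follows. For a system that is in macrostate i (with thermodynamic entropy S_i) with probability p_i, the natural candidate assignment, given here as an illustrative sketch rather than as the paper's theorem, is the Gibbs–Shannon form:

```latex
% Candidate entropy of a probabilistic mixture of macrostates:
S_{\mathrm{mix}} \;=\; \sum_i p_i S_i \;-\; k \sum_i p_i \ln p_i
% The second term equals k ln 2 times the Shannon entropy (in bits)
% of the probability distribution over macrostates.
```

The question the paper addresses is whether, and under what assumptions, this mixing term has genuine thermodynamic significance.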
According to logical positivism, so the story goes, metaphysical questions are meaningless, since they do not admit of empirical confirmation or refutation. However, the logical positivists did not in fact reject as meaningless all questions about, for example, the structure of space and time. Rather, key figures such as Reichenbach and Schlick believed that scientific theories often presupposed a conceptual framework that was not itself empirically testable, but which was required for the theory as a whole to be empirically testable. For example, the theory of Special Relativity relies upon the simultaneity convention introduced by Einstein that assumes that the one-way speed of light is the same in all directions of space. Hence, the logical positivists accepted an a priori component to physical theories. However, they denied that this a priori component is necessarily true. Whereas for Kant, metaphysics is the a priori science of the necessary structure of rational thought about reality, the logical positivists were forced by the history of science to accept that the a priori structure of theories could change. Hence, they defended a notion of what Michael Friedman calls the ‘relativised’ or the ‘constitutive’ a priori. Carnap and Reichenbach held that such an a priori framework was conventional, whereas Schlick seems to have been more of a realist and held that the overall relative simplicity of different theories could count as evidence for their truth, notwithstanding the fact that some parts of them are not directly testable. All this is part of the story of how the verification principle came to be abandoned, and how logical positivism transmuted into logical empiricism.