Thomason's (1979/2010) argument against competence psychologism in semantics envisages a representation of a subject's competence as follows: he understands his own language in the sense that he can identify the semantic content of each of its sentences, which requires that the relation between expression and content be recursive. Then if the scientist constructs a theory that is meant to represent the body of the subject's beliefs, construed as assent to the content of the pertinent sentences, and that theory satisfies certain 'natural assumptions', it implies that the subject is inconsistent if the beliefs include arithmetic. I challenge the result by insisting that the motivation for Thomason's principle (ii), via Moore's Paradox, leads to a more complex representation, in which stating the facts and expressing one's beliefs are treated differently. Certain logical connections among expressions of assent, and between expression and statement, are a matter of consequence on pain of pragmatic incoherence, not consequence on pain of classical logical inconsistency. But while this salvages the possibility that a modification of the above sort of representation could be adequate, Thomason's devastating conclusion returns if the scientist identifies himself as the subject of that representation, even when paying heed to the requirement of pragmatic coherence of the sort highlighted by Moore's Paradox.
The world science describes tends to have a very strange look. We can’t see atoms or force fields, nor are they imaginable within visualizable categories, so we cannot even imagine what the world must be like according to recent physical theories. That tension, between what science depicts as reality and how things appear to us, though it is more striking now, has been with us since modern science began. It can be addressed, and perhaps alleviated, by inquiring into how science represents nature. In general, representation is selective, the selection is of what is relevant to the purpose at hand, and success may even require distortion. From this point of view, the constraint on science, that it must ‘save the phenomena’, takes on a new form. The question to be faced is how the perspectival character of the appearances (that is, contents of measurement outcomes) can be related to the hidden structure that the sciences postulate. In the competing interpretations of quantum mechanics we can see how certain traditional ideals and constraints are left behind. Specifically, Carlo Rovelli’s Relational Quantum Mechanics offers a probative example of the freedom of scientific representation.
The story of how Perrin's experimental work established the reality of atoms and molecules has been a staple in (realist) philosophy of science writings (Wesley Salmon, Clark Glymour, Peter Achinstein, Penelope Maddy,...). I'll argue that how this story is told distorts both what the work was and its significance, and draw morals for the understanding of how theories can be or fail to be empirically grounded.
In the philosophy of science, identity over time emerges as a central concern both as an ontological category in the interpretation of physical theories, and as an epistemological problem concerning the conditions of possibility of knowledge. In Reichenbach and subsequent writers on the problem of indistinguishable quantum particles we see the return of a contrast between Leibniz and Aquinas on the subject of individuation. The possibility of rejecting the principle of the identity of indiscernibles has certain logical difficulties, leading us inexorably from ontology into epistemology. For the epistemological problem we attend to the differences that emerged between the (neo-)Kantian and logical empiricist traditions, also saliently displayed in Reichenbach's writings. After examining the contrast between Kant's and Leibniz's conceptions of empirical knowledge, specifically with respect to the irreducibility of spatiotemporal determinations, we explore an application of a neo-Kantian view to the same problem of indistinguishable quantum particles.
Structural realism as developed by John Worrall and others can claim philosophical roots as far back as the late 19th century, though the discussion at that time does not unambiguously favor the contemporary form, or even its realism. After a critical examination of some aspects of the historical background, some severe critical challenges to both Worrall's and Ladyman's versions are highlighted, and an alternative empiricist structuralism is proposed. Support for this empiricist version is provided in part by the different way in which we can do justice to Worrall's original demands and in part by the viewpoint it provides (in contrast to e.g. Michael Friedman's) on the stability maintained through scientific theory change.
Vague subjective probability may be modeled by means of a set of probability functions, so that the represented opinion has only a lower and an upper bound. The standard rule of conditionalization can be straightforwardly adapted to this. But this combination has difficulties which, though well known in the technical literature, have not been given sufficient attention in probabilist or Bayesian epistemology. Specifically, updating on apparently irrelevant bits of news can be destructive of one's explicit prior expectations. Stability of vague subjective opinion appears to need a more complex model.
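The destructive updating this abstract describes is the phenomenon known in the technical literature as dilation. A minimal sketch, with illustrative variables and numbers that are assumptions of mine rather than the paper's: let A be a fair coin toss, C a second binary event whose chance is vague (represented by a finite set of priors), and B the proposition that A and C agree. Every prior in the credal set gives B probability exactly 1/2, yet conditioning each prior on the seemingly irrelevant news that A came up heads spreads the opinion about B over the whole unit interval.

```python
from fractions import Fraction

# Worlds are pairs (a, c) with a, c in {0, 1}.  A is a fair coin, C is an
# independent event whose chance c of being 1 is vague.

def prior(c):
    """Joint distribution over (A, C) for a given chance c of C=1."""
    return {(a, cv): Fraction(1, 2) * (c if cv == 1 else 1 - c)
            for a in (0, 1) for cv in (0, 1)}

def prob(p, event):
    return sum(pr for w, pr in p.items() if event(w))

def condition(p, event):
    """Standard conditionalization of a single prior on an event."""
    z = prob(p, event)
    return {w: (pr / z if event(w) else Fraction(0)) for w, pr in p.items()}

B = lambda w: w[0] == w[1]                              # B: A and C agree
credal_set = [prior(Fraction(k, 4)) for k in range(5)]  # c in {0, 1/4, ..., 1}

before = [prob(p, B) for p in credal_set]
after = [prob(condition(p, lambda w: w[0] == 1), B) for p in credal_set]

print(min(before), max(before))   # 1/2 1/2 -- sharp prior opinion about B
print(min(after), max(after))     # 0 1 -- dilated by the 'irrelevant' news A=1
```

The prior interval for B is the single point 1/2; the posterior interval is [0, 1], so updating the whole set on news that seemed irrelevant to B has destroyed a perfectly sharp expectation.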
James Ladyman has argued that constructive empiricism entails modal realism, and that this renders constructive empiricism untenable. We maintain that constructive empiricism is compatible with modal nominalism. Although the central term 'observable' has been analyzed in terms of counterfactuals, and in general counterfactuals do not have objective truth conditions, the property of being observable is not a modal property, and hence there are objective, non-modal facts about what is observable. Both modal nominalism and constructive empiricism require clarification in the face of Ladyman's argument. But we also argue that, even if Ladyman were right that constructive empiricism entails modal realism, this would not be a problem for constructive empiricism.
A recent article (Leeds and Healey 1996) argues that the modal interpretation (Copenhagen variant) of quantum mechanics does not do justice to immediately repeated non-disturbing measurements. This objection has been raised before, but the article presents it in a new, detailed, precise form. I show that the objection is mistaken.
In response to parts I-III of G. Rosen's "What is Constructive Empiricism?", Philosophical Studies 74 (1994): 143-178, this paper examines several construals of the position of constructive empiricism. At issue, in part, is the equation of intentional aspects of science with the intentions and opinions of scientists. In addition it is necessary to distinguish the constructive empiricist -- a philosopher holding that acceptance of theories in science need not involve belief that they are true -- from the 'scientific agnostic', who accepts but does not believe current science. What is at stake for empiricism is a view of science that will admit it as a paradigm for rational inquiry.
In a traditional view of science we come to fully believe the main accepted theories (the 'body of scientific knowledge'). Some of the hypotheses "possible for all that science tells us" seem more likely than others: enter probability as grading the possibilities left open. Probabilism contends with this tradition. Richard Jeffrey told us never to resolve doubt but only to quantify it, and to give maximal probability only to tautologies. Despite severe difficulties, I shall argue that the traditional view is reconcilable with (moderate) probabilism. I will propose a single unified account with conditional personal probability as basic, allowing for full belief in empirical theories, with our probabilities grading the possibilities left open.
The attempt to formulate a viable empiricist and non-foundationalist epistemology of science faces four problems, confronted here. The first is an apparent loss of objectivity in science, in the conditions of use of models in applied science. The second derives from the theory-infection of scientific language, with an apparent loss of objective conditions of truth and reference. The third, often cited as an objection to The Scientific Image, is the apparent theory-dependence of the distinction between what is and is not observable. The fourth and last is the loss of the possibility of objective evaluation of rationality in scientific methodology. It is argued that each of these problems is illusory.
Richard Jeffrey and Michael Goldstein have both introduced systematic approaches to the structure of opinion changes. For both approaches there are theorems which indicate great generality and width of scope. The main questions addressed here will be to what extent the basic forms of representation are intertranslatable, and how we can conceive of such programs in general.
Probability kinematics is the theory of how subjective probabilities change with time, in response to certain constraints (accepted by the subject). Rules are classified by the imposed constraints for which the rules prescribe a procedure for updating one's opinion. The first is simple conditionalization (constraint: give probability 1 to proposition A), and the second Jeffrey conditionalization (constraint: give probability x_i to each cell E_i of a partition, where the x_i are non-negative and sum to 1). It is demonstrated by a symmetry argument that these rules are the unique admissible rules for those constraints, and moreover, that any probability kinematic rule must be equivalent to a (simple or Jeffrey) conditionalization preceded by a determination of the values x_i to be given to the members of such a partition. Next two rival rules which can go beyond such conditionalization are described: INFOMIN (minimize relative information) and MTP (maximize transition probability). Their properties are investigated and compared.
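Jeffrey conditionalization as described above can be sketched directly: each world's probability is rescaled so that the cells E_i of the partition receive the stipulated probabilities x_i, while probabilities conditional on each cell stay fixed; simple conditionalization is then the special case of giving a single cell probability 1. The worlds and numbers below are illustrative assumptions of mine, not drawn from the paper.

```python
from fractions import Fraction

def jeffrey_update(p, partition, new_probs):
    """Jeffrey conditionalization.

    p: dict mapping worlds to prior probabilities.
    partition: list of disjoint, jointly exhaustive events (predicates on worlds).
    new_probs: the stipulated posterior probabilities x_i of the cells.
    Each world in cell E_i is rescaled by x_i / P(E_i), which leaves all
    probabilities conditional on each cell unchanged.
    """
    new = {}
    for event, x in zip(partition, new_probs):
        cell_mass = sum(pr for w, pr in p.items() if event(w))
        for w, pr in p.items():
            if event(w):
                new[w] = x * pr / cell_mass
    return new

# Worlds are pairs (E-status, A-status); the prior gives P(E) = 1/2, P(A|E) = 3/5.
F = Fraction
prior = {('E', 'A'): F(3, 10), ('E', '~A'): F(1, 5),
         ('~E', 'A'): F(1, 10), ('~E', '~A'): F(2, 5)}

# Experience shifts the probability of E from 1/2 to 4/5, without certainty.
post = jeffrey_update(prior,
                      [lambda w: w[0] == 'E', lambda w: w[0] == '~E'],
                      [F(4, 5), F(1, 5)])

p_A = sum(pr for w, pr in post.items() if w[1] == 'A')
print(p_A)   # 13/25, i.e. x_1*P(A|E) + x_2*P(A|~E) = 4/5 * 3/5 + 1/5 * 1/5
```

The posterior probability of A comes out as the x_i-weighted average of the old conditional probabilities P(A|E_i), which is the defining property of the rule.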
A general form is proposed for epistemological theories, the relevant factors being: the family of epistemic judgments, the epistemic state, the epistemic commitment (governing change of state), and the family of possible epistemic inputs (deliverances of experience). First a simple theory is examined in which the states are probability functions, and the subject of probability kinematics introduced by Richard Jeffrey is explored. Then a second theory is examined in which the state has as constituents a body of information (rational corpus) and a recipe that determines the accepted epistemic judgments on the basis of this corpus. Through an examination of several approaches to the statistical syllogism, a relation is again established with Jeffrey's generalized conditionalization.
This paper focuses on the empiricism/realism debate. The initial portion of the paper is a short sketch of the nature of the enterprise of philosophy of science. What are taken as empiricist views on theory construction and experiment are described. The paper concludes with a simple recasting of the main points at issue in the empiricism/realism debate.
The modal interpretation of quantum mechanics has two variants: the Copenhagen variant (CV) and the anti-Copenhagen variant (ACV). Healey uses the Bell-Wigner locality condition to criticize the latter, which I do not advocate. The conclusions of Healey's admirably written article are therefore welcome to me. But if I had wished to advocate the ACV, I do not think that his arguments would have dissuaded me. Specifically, as I shall explain, we should distinguish between Physical Locality and Metaphysical Locality. The first principle is unexceptionable, but Healey needs the second for his reductio. At the end I shall briefly indicate why I reject the ACV and advocate instead the Copenhagen variant.
Sentences attributing beliefs, doubts, wants, and the like (propositional attitudes, in Russell's terminology) have posed a major problem for semantics. Recently the pragmatic description of language has become more systematic. I shall discuss the formalization of pragmatics, and propose an analysis of belief attribution that avoids some main problems apparently inherent in the semantic approach.