Coherent, well-organized text familiarizes readers with the complete theory of logical inference and its applications to mathematics and the empirical sciences. Part I deals with the formal principles of inference and definition; Part II explores elementary intuitive set theory, with separate chapters on sets, relations, and functions. The last section introduces numerous examples of axiomatically formulated theories in both discussion and exercises. Ideal for undergraduates; no background in math or philosophy required.
Bayesian prior probabilities have an important place in probabilistic and statistical methods. In spite of this fact, the analysis of where these priors come from and how they are formed has received little attention. It is reasonable to excuse the absence, in the foundational literature, of a detailed psychological theory of the mechanisms by which prior probabilities are formed. But it is less excusable that there is an almost total absence of a detailed discussion of the highly differentiating nature of past experience in forming a prior. The focus here is on what kind of account, even if necessarily schematic, can be given of the psychological mechanisms behind the formation of Bayesian priors. The last section examines a detailed experiment relevant to how priors are learned.
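One deliberately schematic way to picture how differentiated past experience shapes a prior is conjugate Bayesian updating. The sketch below is illustrative only, not the paper's own model: two agents with different histories of Bernoulli outcomes end up with different Beta priors for the next task (all function names are hypothetical).

```python
# Schematic sketch (not the paper's model): a Beta prior over a
# Bernoulli success probability is "learned" from past experience by
# conjugate updating, so differently structured histories yield
# different priors for a new task.

def update_beta(alpha, beta, successes, failures):
    """Posterior Beta parameters after observing Bernoulli outcomes."""
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Start from a flat Beta(1, 1) prior, i.e. no relevant past experience.
a, b = 1.0, 1.0

# Two agents with different histories form different priors.
a1, b1 = update_beta(a, b, successes=8, failures=2)   # mostly successes
a2, b2 = update_beta(a, b, successes=2, failures=8)   # mostly failures

print(beta_mean(a1, b1))  # 0.75
print(beta_mean(a2, b2))  # 0.25
```

The same observed histories, fed through the same updating rule, produce sharply different priors, which is the differentiating role of past experience in miniature.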
This volume broadens our concept of reasoning and rationality to allow for a more pluralistic and situational view of human thinking as a practical activity. Drawing on contributors across disciplines including philosophy, economics, psychology, statistics, computer science, engineering, and physics, _Reasoning, Rationality, and Probability_ argues that the search for strong theories should leave room for the construction of context-sensitive conceptual tools. Both science and everyday life, the authors argue, are too complex and multifaceted to be forced into ready-made schemata.
PREVIOUS WORK. Theoretical discussion of the interval measurement of utility based upon theories of decision making under conditions of risk has been voluminous and will not be reviewed here. Those interested will find extensive ...
We prove the existence of hidden variables, or, what we call generalized common causes, for finite sequences of pairwise correlated random variables that do not have a joint probability distribution. The hidden variables constructed have upper probability distributions that are nonmonotonic. The theorem applies directly to quantum mechanical correlations that do not satisfy the Bell inequalities.
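The nonexistence of a joint distribution mentioned here can be checked concretely in the simplest case. The sketch below assumes the Suppes–Zanotti (1981) criterion for three ±1 random variables with zero expectations: a compatible joint distribution exists iff −1 ≤ ρ₁₂ + ρ₁₃ + ρ₂₃ ≤ 1 + 2·min(ρ₁₂, ρ₁₃, ρ₂₃). The function name is illustrative.

```python
# Sketch of the Suppes-Zanotti criterion: three +-1 random variables
# with zero expectations and pairwise correlations r12, r13, r23 admit
# a joint probability distribution iff
#   -1 <= r12 + r13 + r23 <= 1 + 2 * min(r12, r13, r23).

def has_joint_distribution(r12, r13, r23):
    s = r12 + r13 + r23
    return -1.0 <= s <= 1.0 + 2.0 * min(r12, r13, r23)

# Classically realizable correlations:
print(has_joint_distribution(0.5, 0.5, 0.5))     # True
# Strong pairwise anticorrelations of the quantum-mechanical kind,
# which admit no joint distribution:
print(has_joint_distribution(-0.5, -0.5, -0.5))  # False
```

In the second case the construction in the paper supplies a generalized common cause with a nonmonotonic upper probability distribution instead of an ordinary hidden variable.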
The aim of this paper is to state the single most powerful argument for use of a non-classical logic in quantum mechanics. In outline the argument is the following. The working logic of a science is the logic of the events and propositions to which probabilities are assigned. A probability should be assigned to every element of the algebra of events. In the case of quantum mechanics probabilities may be assigned to events but not, without restriction, to the conjunction of two events. The conclusion is that the working logic of quantum mechanics is not classical. The nature of the logic that is appropriate for quantum mechanics is examined.
The thesis of this article is that the nature of probability is centered on its formal properties, not on any of its standard interpretations. Section 2 is a survey of Bayesian applications. Section 3 focuses on two examples from physics that seem as completely objective as other physical concepts. Section 4 compares the conflict between subjective Bayesians and objectivists about probability to the earlier strident conflict in physics about the nature of force. Section 5 outlines a pragmatic approach to the various interpretations of probability. Finally, Sect. 6 argues that the essential formal nature of probability is expressed in the standard axioms, but more explicit attention should be given to the concept of randomness.
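The "standard axioms" invoked in Sect. 6 are presumably those of Kolmogorov, which can be stated compactly:

```latex
% Kolmogorov's axioms for a probability measure P on a sigma-algebra
% \mathcal{F} of subsets of a sample space \Omega:
\begin{align}
  & P(A) \ge 0 \quad \text{for all } A \in \mathcal{F}, \\
  & P(\Omega) = 1, \\
  & P\!\left(\bigcup_{i=1}^{\infty} A_i\right)
      = \sum_{i=1}^{\infty} P(A_i)
    \quad \text{for pairwise disjoint } A_i \in \mathcal{F}.
\end{align}
```

Nothing in these axioms fixes an interpretation, which is the formal point the article presses; randomness, by contrast, is not mentioned in them at all.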
Quantum mechanical entangled configurations of particles that do not satisfy Bell’s inequalities, or equivalently, do not have a joint probability distribution, are familiar in the foundational literature of quantum mechanics. Nonexistence of a joint probability measure for the correlations predicted by quantum mechanics is itself equivalent to the nonexistence of local hidden variables that account for the correlations (for a proof of this equivalence, see Suppes and Zanotti, 1981). From a philosophical standpoint it is natural to ask what sort of concept can be used to provide a “joint” analysis of such quantum correlations. In other areas of application of probability, similar but different problems arise. A typical example is the introduction of upper and lower probabilities in the theory of belief. A person may feel uncomfortable assigning a precise probability to the occurrence of rain tomorrow, but feel comfortable saying the probability should be greater than ½ and less than ⅞. Rather extensive statistical developments have occurred for this framework. A thorough treatment can be found in Walley (1991) and an earlier measurement-oriented development in Suppes (1974). It is important to note that this focus on beliefs, or related Bayesian ideas, is not concerned, as we are here, with the nonexistence of joint probability distributions. Yet earlier work with no relation to quantum mechanics, but focused on conditions for existence, has been published by many people. For some of our own work on this topic, see Suppes and Zanotti (1989). Still, this earlier work naturally suggested the question of whether or not upper and lower measures could be used in quantum mechanics, as a generalization of ...
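The rain example can be given a small computational reading. One common construal, sketched here under that assumption, takes the lower and upper probabilities of an event to be the infimum and supremum over a set of admissible distributions (a "credal set"); the names and the sampled family are hypothetical.

```python
# Sketch: upper and lower probabilities of an event as the min and max
# over a set of admissible distributions, matching the rain example
# (probability judged greater than 1/2 and less than 7/8).

def lower_upper(credal_set, event):
    """credal_set: list of dicts mapping outcomes to probabilities."""
    values = [sum(p[w] for w in event) for p in credal_set]
    return min(values), max(values)

# Hypothetical family of distributions over {rain, dry}, with P(rain)
# ranging over the judged interval [1/2, 7/8].
credal = [{"rain": p, "dry": 1 - p} for p in (0.5, 0.625, 0.75, 0.875)]

lo, hi = lower_upper(credal, {"rain"})
print(lo, hi)  # 0.5 0.875
```

For the quantum case discussed in the abstract, the point is different: no single distribution in any such set can reproduce the correlations, which is why the upper measures constructed there must be nonmonotonic rather than envelopes of ordinary probabilities.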
The role of the concept of invariance in physics and geometry is analyzed, with attention to the closely connected concepts of symmetry and objective meaning. The question of why the fundamental equations of physical theories are not invariant, but only covariant, is examined in some detail. The last part of the paper focuses on the surprising example of entropy as a complete invariant in ergodic theory for any two ergodic processes that are isomorphic in the measure-theoretic sense.
Ordinary measurement using a standard scale, such as a ruler or a standard set of weights, has two fundamental properties. First, the results are approximate, for example, within 0.1 g. Second, the resulting indistinguishability is transitive, rather than nontransitive, as in the standard psychological comparative judgments without a scale. Qualitative axioms are given for structures having the two properties mentioned. A representation theorem is then proved in terms of upper and lower measures.
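The contrast between the two kinds of indistinguishability can be shown in a few lines. In this illustrative sketch (helper names are ours), reading a standard scale rounds each weight to a grid cell, which induces a transitive equivalence, whereas direct comparative judgment within a fixed threshold is nontransitive.

```python
# Scale reading rounds to a grid (here 0.1 g): two weights are
# indistinguishable iff they land in the same cell, an equivalence
# relation, hence transitive.
def scale_equal(a, b, step=0.1):
    return round(a / step) == round(b / step)

# Comparative judgment without a scale: indistinguishable iff the
# difference is below a discrimination threshold; not transitive.
def threshold_equal(a, b, eps=0.1):
    return abs(a - b) < eps

x, y, z = 1.00, 1.06, 1.12

# Nontransitive: x ~ y and y ~ z, yet x and z are distinguishable.
print(threshold_equal(x, y), threshold_equal(y, z), threshold_equal(x, z))
# -> True True False
```

The qualitative axioms in the paper characterize exactly the transitive, grid-like case, and the representation theorem recovers the 0.1 g granularity as upper and lower measures.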
This article is concerned to formulate some open problems in the philosophy of space and time that require methods characteristic of mathematical traditions in the foundations of geometry for their solution. In formulating the problems an effort has been made to fuse the separate traditions of the foundations of physics on the one hand and the foundations of geometry on the other. The first part of the paper deals with two classical problems in the geometry of space, that of giving operationalism an exact foundation in the case of the measurement of spatial relations, and that of providing an adequate theory of approximation and error in a geometrical setting. The second part is concerned with physical space and space-time and deals mainly with topics concerning the axiomatic theory of bodies, the operational foundations of special relativity and the conceptual foundations of elementary physics.
The fundamental problem considered is that of the existence of a joint probability distribution for momentum and position at a given instant. The philosophical interest of this problem is that for the potential energy functions (or Hamiltonians) corresponding to many simple experimental situations, the joint "distribution" derived by the methods of Wigner and Moyal is not a genuine probability distribution at all. The implications of these results for the interpretation of the Heisenberg uncertainty principle are analyzed. The final section consists of some observations concerning the axiomatic foundations of quantum mechanics.
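The failure can be exhibited numerically in the simplest nontrivial case. The sketch below assumes the standard closed form for the Wigner function of harmonic-oscillator number states in the ħ = 1 convention, W_n(x, p) = ((−1)^n/π) e^{−(x²+p²)} L_n(2(x²+p²)); the function names are illustrative.

```python
# Sketch: the Wigner "distribution" of the first excited oscillator
# state is negative at the origin, so it cannot be a genuine joint
# probability distribution for position and momentum.

import math

def laguerre(n, z):
    """Laguerre polynomials; L_0 = 1 and L_1 = 1 - z suffice here."""
    if n == 0:
        return 1.0
    if n == 1:
        return 1.0 - z
    raise NotImplementedError("only n = 0, 1 needed for this sketch")

def wigner_fock(n, x, p):
    """W_n(x, p) in the hbar = 1 convention."""
    r2 = x * x + p * p
    return ((-1) ** n / math.pi) * math.exp(-r2) * laguerre(n, 2 * r2)

print(wigner_fock(0, 0.0, 0.0))  # 1/pi: ground state is nonnegative here
print(wigner_fock(1, 0.0, 0.0))  # -1/pi: negative, hence not a probability
```

That a perfectly ordinary excited state already produces a negative "density" is what gives the problem its philosophical bite.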
In his published work and even more in conversations, Tarski emphasized what he thought were important philosophical aspects of his work. The English translation of his more philosophical papers [56m] was dedicated to his teacher Tadeusz Kotarbinski, and in informal discussions of philosophy he often referred to the influence of Kotarbinski. Also, the influence of Leśniewski, his dissertation adviser, is evident in his early papers. Moreover, some of his important papers of the 1930s were initially given to philosophical audiences. For example, the famous monograph on the concept of truth ([33m], [35b]) was first given as two lectures to the Logic Section of the Philosophical Society in Warsaw in 1930. Second, his paper, which introduced the concepts of ω-consistency and ω-completeness as well as the rule of infinite induction, was first given at the Second Conference of the Polish Philosophical Society in Warsaw in 1927. Also [35c] was based upon an address given in 1934 to the conference for the Unity of Science in Prague; [36] and [36a] summarize an address given at the International Congress of Scientific Philosophy in Paris in 1935. The article [44a] was published in a philosophical journal and widely reprinted in philosophical texts. This list is of course not exhaustive but only representative of Tarski’s philosophical interactions as reflected in lectures given to philosophical audiences, which were later embodied in substantial papers. After 1945 almost all of Tarski’s publications and presentations are mathematical in character with one or two minor exceptions. This division, occurring about 1945, does not, however, indicate a loss of interest in philosophical questions but is a result of Tarski’s moving to the Department of Mathematics at Berkeley. There he assumed an important role in the development of logic within mathematics in the United States.
This article focuses on the role of statistical concepts in both experiment and theory in various scientific disciplines, especially physics, including astronomy, and psychology. In Sect. 1 the concept of uncertainty in astronomy is analyzed from Ptolemy to Laplace and Gauss. In Sect. 2 theoretical uses of probability and statistics in science are surveyed. Attention is focused on the historically important example of radioactive decay. In Sect. 3 the use of statistics in biology and the social sciences is examined, with detailed consideration of various Chi-square statistical tests. Such tests are essential for proper evaluation of many different kinds of scientific hypotheses.
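A chi-square test of the kind considered in Sect. 3 can be sketched with hypothetical die-roll counts; the data and function name are ours, and the 5% critical value 11.07 for 5 degrees of freedom is taken from standard tables.

```python
# Sketch: Pearson's chi-square goodness-of-fit statistic applied to
# hypothetical counts from 60 rolls of a die, testing fairness.

def chi_square_statistic(observed, expected):
    """Sum of (O - E)^2 / E over the categories."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

observed = [8, 9, 19, 5, 8, 11]   # hypothetical counts, 60 rolls total
expected = [10.0] * 6             # fair-die expectation

stat = chi_square_statistic(observed, expected)
print(stat)          # 11.6

# With 6 - 1 = 5 degrees of freedom, the 5% critical value is 11.07,
# so fairness is rejected at that level iff stat exceeds it.
print(stat > 11.07)  # True
```

The example shows the structure common to the tests the article surveys: an expected distribution from the hypothesis, observed frequencies, and a single statistic referred to a chi-square distribution.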