Gödel's Theorem is often used in arguments against machine intelligence, suggesting humans are not bound by the rules of any formal system. However, Gödelian arguments can be used to support AI, provided we extend our notion of computation to include devices incorporating random number generators. A complete description scheme can be given for integer functions, by which nonalgorithmic functions are shown to be partly random. Not being restricted to algorithms can be accounted for by the availability of an arbitrary random function. Humans, then, might not be rule-bound, but Gödelian arguments also suggest how the relevant sort of nonalgorithmicity may be trivially made available to machines.
"The Emperor's New Mind" by Roger Penrose has received a great deal of both praise and criticism. This review discusses philosophical aspects of the book that form an attack on the "strong" AI thesis. Eight different versions of this thesis are distinguished, and sources of ambiguity diagnosed, including different requirements for relationships between program and behaviour. Excessively strong versions attacked by Penrose (and Searle) are not worth defending or attacking, whereas weaker versions remain problematic. Penrose (like Searle) regards the notion (...) of an algorithm as central to AI, whereas it is argued here that for the purpose of explaining mental capabilities the architecture of an intelligent system is more important than the concept of an algorithm, using the premise that what makes something intelligent is not what it does but how it does it. What needs to be explained is also unclear: Penrose thinks we all know what consciousness is and claims that the ability to judge Go "del's formula to be true depends on it. He also suggests that quantum phenomena underly consciousness. This is rebutted by arguing that our existing concept of "consciousness" is too vague and muddled to be of use in science. This and related concepts will gradually be replaced by a more powerful theory-based taxonomy of types of mental states and processes. The central argument offered by Penrose against the strong AI thesis depends on a tempting but unjustified interpretation of Goedel's incompleteness theorem. Some critics are shown to have missed the point of his argument. A stronger criticism is mounted, and the relevance of mathematical Platonism analysed. Architectural requirements for intelligence are discussed and differences between serial and parallel implementations analysed. (shrink)
My purpose in this brief paper is to consider the implications of a radically different computer architecture for some fundamental problems in the foundations of Cognitive Science. More exactly, I wish to consider the ramifications of the 'Gödel-Minds-Machines' controversy of the late 1960s on a dynamically changing computer architecture which, I venture to suggest, is going to revolutionize which 'functions' of the human mind can and cannot be modelled by (non-human) computational automata. I will proceed on the presupposition that the reader is familiar with some of the fundamentals of computational theory and mathematical logic.
The Gödelian symphony -- Foundations and paradoxes -- This sentence is false -- The liar and Gödel -- Language and metalanguage -- The axiomatic method or how to get the non-obvious out of the obvious -- Peano's axioms -- And the unsatisfied logicists, Frege and Russell -- Bits of set theory -- The abstraction principle -- Bytes of set theory -- Properties, relations, functions, that is, sets again -- Calculating, computing, enumerating, that is, the notion of algorithm -- Taking numbers as sets of sets -- It's raining paradoxes -- Cantor's diagonal argument -- Self-reference and paradoxes -- Hilbert -- Strings of symbols -- In mathematics there is no ignorabimus -- Gödel on stage -- Our first encounter with the incompleteness theorem -- And some provisos -- Gödelization, or say it with numbers! -- TNT -- The arithmetical axioms of TNT and the standard model N -- The fundamental property of formal systems -- The Gödel numbering -- And the arithmetization of syntax -- Bits of recursive arithmetic -- Making algorithms precise -- Bits of recursion theory -- Church's thesis -- The recursiveness of predicates, sets, properties, and relations -- And how it is represented in typographical number theory -- Introspection and representation -- The representability of properties, relations, and functions -- And the Gödelian loop -- I am not provable -- Proof pairs -- The property of being a theorem of TNT (is not recursive!) -- Arithmetizing substitution -- How can a TNT sentence refer to itself? -- Fixed point -- Consistency and omega-consistency -- Proving G1 -- Rosser's proof -- The unprovability of consistency and the immediate consequences of G1 and G2 -- Technical interlude -- Immediate consequences of G1 and G2 -- Undecidable 1 and undecidable 2 -- Essential incompleteness, or the syndicate of mathematicians -- Robinson arithmetic -- How general are Gödel's results? -- Bits of Turing machine -- G1 and G2 in general -- Unexpected fish in the formal net -- Supernatural numbers -- The culpability of the induction scheme -- Bits of truth (not too much of it, though) -- The world after Gödel -- Bourgeois mathematicians!: the postmodern interpretations -- What is postmodernism? -- From Gödel to Lenin -- Is biblical proof decidable? -- Speaking of the totality -- Bourgeois teachers! -- (Un)interesting bifurcations -- A footnote to Plato -- Explorers in the realm of numbers -- The essence of a life -- The philosophical prejudices of our times -- From Gödel to Tarski -- Human, too human -- Mathematical faith -- I'm not crazy! -- Qualified doubts -- From Gentzen to the Dialectica interpretation -- Mathematicians are people of faith -- Mind versus computer: Gödel and artificial intelligence -- Is mind (just) a program? -- Seeing the truth and going outside the system -- The basic mistake -- In the haze of the transfinite -- Know thyself: Socrates and the inexhaustibility of mathematics -- Gödel versus Wittgenstein and the paraconsistent interpretation -- When geniuses meet -- The implausible Wittgenstein -- There is no metamathematics -- Proof and prose -- The single argument -- But how can arithmetic be inconsistent? -- The costs and benefits of making Wittgenstein plausible.
Goedel's theorem states that in any consistent system which is strong enough to produce simple arithmetic there are formulae which cannot be proved-in-the-system, but which we can see to be true. Essentially, we consider the formula which says, in effect, "This formula is unprovable-in-the-system". If this formula were provable-in-the-system, we should have a contradiction: for if it were provable-in-the-system, then it would not be unprovable-in-the-system, so that "This formula is unprovable-in-the-system" would be false: equally, if it were provable-in-the-system, then it would not be false, but would be true, since in any consistent system nothing false can be proved-in-the-system, but only truths. So the formula "This formula is unprovable-in-the-system" is not provable-in-the-system, but unprovable-in-the-system. Further, if the formula "This formula is unprovable-in-the-system" is unprovable-in-the-system, then it is true that that formula is unprovable-in-the-system, that is, "This formula is unprovable-in-the-system" is true. Goedel's theorem must apply to cybernetical machines, because it is of the essence of being a machine, that it should be a concrete instantiation of a formal system. It follows that given any machine which is consistent and capable of doing simple arithmetic, there is a formula which it is incapable of producing as being true---i.e., the formula is unprovable-in-the-system---but which we can see to be true. It follows that no machine can be a complete or adequate model of the mind, that minds are essentially different from machines.
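For readers who want the argument's skeleton in symbols, here is a standard schematic rendering of the construction the abstract paraphrases (notation supplied here, not quoted from the source): for a consistent, sufficiently strong, effectively axiomatized theory T, the diagonal lemma yields a sentence G with
\[
T \vdash G \leftrightarrow \neg\mathrm{Prov}_T(\ulcorner G \urcorner),
\]
so that if T is consistent then T does not prove G; and since G asserts precisely its own unprovability in T, it is then true.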
The following essay reconsiders the ontological and logical issues around Frege's Basic Law (V). It focuses less on Russell's Paradox, as most treatments of Frege's Grundgesetze der Arithmetik (GGA) do, and more on the relation between Frege's Basic Law (V) and Cantor's Theorem (CT). So for the most part the inconsistency of Naïve Comprehension (in the context of standard Second Order Logic) will not concern us, but rather the ontological issues central to the conflict between (BLV) and (CT). These ontological issues are interesting in their own right. And only if ontological considerations make a strong case for something like (BLV) do we have to trouble ourselves with inconsistency and paraconsistency. These ontological issues also lead to a renewed methodological reflection on what to assume or recognize as an axiom.
Many political theorists and philosophers use Condorcet's Jury Theorem to defend democracy. This paper illustrates an uncomfortable implication of Condorcet's Jury Theorem. Realistically, when the conditions of Condorcet's Jury Theorem hold, even in very high stakes elections, having more than 100,000 citizens vote does no significant good in securing good political outcomes. On the Condorcet model, unless voters enjoy voting, or unless they produce some other value by voting, then the cost to most voters of voting exceeds the expected epistemic benefits to the common good of their casting a vote. Anyone who is committed to democracy on the basis of the Jury Theorem ought also to hold that widespread voting is wasteful, at least unless she can provide some further justification of mass democratic participation.
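A rough numerical illustration of the diminishing returns claimed above (a sketch under assumed numbers; the competence level 0.51 and the vote counts are illustrative and not taken from the paper):

    # Probability that a simple majority of n independent voters, each correct with
    # probability p, delivers the correct verdict (Condorcet's jury setting).
    from math import lgamma, log, exp

    def log_binom(n: int, k: int) -> float:
        return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

    def majority_correct(n: int, p: float) -> float:
        return sum(exp(log_binom(n, k) + k * log(p) + (n - k) * log(1 - p))
                   for k in range(n // 2 + 1, n + 1))

    for n in (1001, 10001, 100001):
        print(n, round(majority_correct(n, 0.51), 6))

With p = 0.51, by the time the electorate reaches one hundred thousand the majority verdict is correct with probability practically indistinguishable from 1, so additional voters raise it only negligibly.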
Fisher's 'fundamental theorem of natural selection' is notoriously abstract, and, no less notoriously, many take it to be false. In this paper, I explicate the theorem, examine the role that it played in Fisher's general project for biology, and analyze why it was so very fundamental for Fisher. I defend Ewens (1989) and Lessard (1997) in the view that the theorem is in fact a true theorem if, as Fisher claimed, 'the terms employed' are 'used strictly as defined' (1930, p. 38). Finally, I explain the role that projects such as Fisher's play in the progress of scientific inquiry.
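For orientation, one common textbook rendering of the theorem (supplied here, not quoted from the paper, and details differ across formalizations) equates the rate of increase of mean fitness attributable to natural selection with the additive genetic variance in fitness:
\[
\frac{d\bar{m}}{dt} \;=\; \sigma_{A}^{2}(m),
\]
where m is Malthusian fitness, \(\bar{m}\) its population mean, and \(\sigma_{A}^{2}(m)\) its additive genetic variance.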
Bell's theorem admits several interpretations or 'solutions', the standard interpretation being 'indeterminism', a next one 'nonlocality'. In this article two further solutions are investigated, termed here 'superdeterminism' and 'supercorrelation'. The former is especially interesting for philosophical reasons, if only because it is always rejected on the basis of extra-physical arguments. The latter, supercorrelation, will be studied here by investigating model systems that can mimic it, namely spin lattices. It is shown that in these systems the Bell inequality can be violated, even if they are local according to usual definitions. Violation of the Bell inequality is retraced to violation of 'measurement independence'. These results emphasize the importance of studying the premises of the Bell inequality in realistic systems.
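For reference, the inequality at issue is usually cited in its CHSH form (standard notation, supplied here): for two settings a, a' on one wing and b, b' on the other, locally explicable correlations satisfy
\[
\lvert E(a,b) + E(a,b') + E(a',b) - E(a',b') \rvert \;\le\; 2,
\]
whereas quantum mechanics predicts values up to \(2\sqrt{2}\) for suitably chosen settings on entangled spins.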
This book contains an introduction to symbolic logic and a thorough discussion of mechanical theorem proving and its applications. The book consists of three major parts. Chapters 2 and 3 constitute an introduction to symbolic logic. Chapters 4–9 introduce several techniques in mechanical theorem proving, and Chapters 10 and 11 show how theorem proving can be applied to various areas such as question answering, problem solving, program analysis, and program synthesis.
There are several known Lindström-style characterization results for basic modal logic. This paper proves a generic Lindström theorem that covers any normal modal logic corresponding to a class of Kripke frames definable by a set of formulas called strict universal Horn formulas. The result is a generalization of a recent characterization of modal logic with the global modality. A negative result is also proved in an appendix showing that the result cannot be strengthened to cover every first-order elementary class of frames. This is shown by constructing an explicit counterexample.
Takahashi translation * is the translation that reduces all of the redexes in a λ-term simultaneously. In  and , Takahashi gave a simple proof of the Church–Rosser confluence theorem by using the notion of parallel reduction and Takahashi translation. Our aim in this paper is to give a simpler proof of the Church–Rosser theorem using only the notion of Takahashi translation.
An intricate, long, and occasionally heated debate surrounds Boltzmann's H-theorem (1872) and his combinatorial interpretation of the second law (1877). After almost a century of devoted and knowledgeable scholarship, there is still no agreement as to whether Boltzmann changed his view of the second law after Loschmidt's 1876 reversibility argument or whether he had already been holding a probabilistic conception for some years at that point. In this paper, I argue that there was no abrupt statistical turn. In the first part, I discuss the development of Boltzmann's research from 1868 to the formulation of the H-theorem. This reconstruction shows that Boltzmann adopted a pluralistic strategy based on the interplay between a kinetic and a combinatorial approach. Moreover, it shows that the extensive use of asymptotic conditions allowed Boltzmann to bracket the problem of exceptions. In the second part I suggest that both Loschmidt's challenge and Boltzmann's response to it did not concern the H-theorem. The close relation between the theorem and the reversibility argument is a consequence of later investigations on the subject.
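For orientation, the quantity under dispute is Boltzmann's H-function; in its standard form (notation supplied here, not quoted from the paper),
\[
H(t) = \int f(\mathbf{v},t)\,\ln f(\mathbf{v},t)\, d^{3}v, \qquad \frac{dH}{dt} \le 0,
\]
with the monotone decrease derived under the Stosszahlansatz (molecular chaos) assumption.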
We give a short, explicit proof of Hindman's Theorem that in every finite coloring of the integers, there is an infinite set all of whose finite sums have the same color. We give several examples of colorings of the integers which do not have computable witnesses to Hindman's Theorem.
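In symbols, the statement proved is (standard formulation, supplied here for reference): for every coloring c of the positive integers with finitely many colors there exist an infinite set S and a color i such that
\[
c\Big(\sum_{n \in F} n\Big) = i \quad\text{for every finite nonempty } F \subseteq S.
\]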
Errors in the history of economic analysis often remain uncorrected for long periods due to positive epistemic costs (PEC) involved in allocating time to going back over what older generations wrote. In order to demonstrate this in a case study, the economists’ practice of the “Coase Theorem” is reconsidered from a PEC point of view.
It has been known for a few years that no more than Pi-1-1 comprehension is needed for the proof of "Frege's Theorem". One can at least imagine a view that would regard Pi-1-1 comprehension axioms as logical truths but deny that status to any that are more complex—a view that would, in particular, deny that full second-order logic deserves the name. Such a view would serve the purposes of neo-logicists. It is, in fact, no part of my view that, say, Delta-3-1 comprehension axioms are not logical truths. What I am going to suggest, however, is that there is a special case to be made on behalf of Pi-1-1 comprehension. Making the case involves investigating extensions of first-order logic that do not rely upon the presence of second-order quantifiers. A formal system for so-called "ancestral logic" is developed, and it is then extended to yield what I call "Arché logic".
This is an introduction to a collected volume. It distinguishes between evidential, statistical, and physical probability, and between objective and subjective understandings of evidential probability, in the use of Bayes's theorem. If Bayes's theorem is to be used to assess an objective evidential probability, a priori criteria--mainly the criterion of simplicity--are required to determine prior probability. The five main contributors to the volume discuss the use of Bayes's theorem to assess the evidential probability of scientific theories, statistical hypotheses, criminal guilt, and miracles; and also its value for assessing physical probability.
This paper explores the set theoretic assumptions used in the current published proof of Fermat's Last Theorem, how these assumptions figure in the methods Wiles uses, and the currently known prospects for a proof using weaker assumptions.
In Remarks on the Foundations of Mathematics, Wittgenstein, despite his official 'mathematical nonrevisionism', slips into attempting to refute Gödel's theorem. Actually, Wittgenstein could have used Gödel's theorem to good effect, to support his view that proof, and even truth, are 'family resemblance' concepts. The reason that Wittgenstein did not see all this is that Gödel's theorem had become an icon of mathematical realism, and he was blinded by his own ideology. The essay is a reply to Juliet Floyd's work on Gödel: what she says Wittgenstein said, I say he should have said, but didn't (couldn't).
The two theories that revolutionized physics in the twentieth century, relativity and quantum mechanics, are full of predictions that defy common sense. Recently, we used three such paradoxical ideas to prove "The Free Will Theorem" (strengthened here), which is the culmination of a series of theorems about quantum mechanics that began in the 1960s. It asserts, roughly, that if indeed we humans have free will, then elementary particles already have their own small share of this valuable commodity. More precisely, if the experimenter can freely choose the directions in which to orient his apparatus in a certain measurement, then the particle's response (to be pedantic—the universe's response near the particle) is not determined by the entire previous history of the universe. Our argument combines the well-known consequence of relativity theory, that the time order of space-like separated events is not absolute, with the EPR paradox discovered by Einstein, Podolsky, and Rosen in 1935, and the Kochen-Specker Paradox of 1967 (See .) We follow Bohm in using a spin version of EPR and Peres in using his set of 33 directions, rather than the original configuration used by Kochen and Specker. More contentiously, the argument also involves the notion of free will, but we postpone further discussion of this to the last section of the article. Note that our proof does not mention "probabilities" or the "states" that determine them, which is..
Kurt Gödel, the greatest logician of our time, startled the world of mathematics in 1931 with his Theorem of Undecidability, which showed that some statements in mathematics are inherently "undecidable." His work on the completeness of logic, the incompleteness of number theory, and the consistency of the axiom of choice and the continuum theory brought him further worldwide fame. In this introductory volume, Raymond Smullyan, himself a well-known logician, guides the reader through the fascinating world of Gödel's incompleteness theorems. The level of presentation is suitable for anyone with a basic acquaintance with mathematical logic. As a clear, concise introduction to a difficult but essential subject, the book will appeal to mathematicians, philosophers, and computer scientists.
This article shows that in two respects, Gödel's incompleteness theorem strongly supports the arguments of Edgar Morin's complexity paradigm. First, from the viewpoint of the content of Gödel's theorem, the latter justifies the basic view of the complexity paradigm according to which knowledge is a dynamic, unfinished process, and develops by way of self-criticism and self-transcendence. Second, from the viewpoint of the proof procedure of Gödel's theorem, the latter confirms the complexity paradigm's circular line of inference through which is formed the all-round knowledge of a concrete object.
Although the philosophical literature on the foundations of quantum field theory recognizes the importance of Haag's theorem, it does not provide a clear discussion of the meaning of this theorem. The goal of this paper is to make up for this deficit. In particular, it aims to set out the implications of Haag's theorem for scattering theory, the interaction picture, the use of non-Fock representations in describing interacting fields, and the choice among the plethora of unitarily inequivalent representations of the canonical commutation relations for free and interacting fields.
Conway and Kochen have presented a "free will theorem" [4, 6] which they claim shows that "if indeed we humans have free will, then [so do] elementary particles." In a more precise fashion, they claim it shows that for certain quantum experiments in which the experimenters can choose between several options, no deterministic or stochastic model can account for the observed outcomes without violating a condition "MIN" motivated by relativistic symmetry. We point out that for stochastic models this conclusion is not correct, while for deterministic models it is not new. In the way the free will theorem is formulated and proved, it only concerns deterministic models. But Conway and Kochen have argued [4, 5, 6, 7] that "randomness can't help," meaning that stochastic models are excluded as well if we insist on the conditions "SPIN", "TWIN", and "MIN". We point out a mistake in their argument. Namely, the theorem is of the form (1) deterministic model with SPIN & TWIN & MIN ⇒ contradiction, and in order to derive the further claim, which is of the form (2) stochastic model with SPIN & TWIN & MIN ⇒ contradiction, Conway and Kochen propose a method for converting any stochastic model into a deterministic one.
In these articles, I describe Cantor's power-class theorem, as well as a number of logical and philosophical paradoxes that stem from it, many of which were discovered or considered (implicitly or explicitly) in Bertrand Russell's work. These include Russell's paradox of the class of all classes not members of themselves, as well as others involving properties, propositions, descriptive senses, class-intensions, and equivalence classes of coextensional properties. Part I focuses on Cantor's theorem, its proof, how it can be used to manufacture paradoxes, Frege's diagnosis of the core difficulty, and several broad categories of strategies for offering solutions to these paradoxes.
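For reference, Cantor's theorem and its diagonal proof can be stated compactly (standard formulation, supplied here): for any set X and any function f from X to its power set,
\[
D = \{\, x \in X : x \notin f(x) \,\}
\]
is a subset of X not in the range of f, so no f: X → P(X) is onto and |X| < |P(X)|.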
It has been known for quite a while now that the on-going project of constructing an acceptable population axiology has gloomy prospects. Already in Derek Parfit's seminal contribution to the topic, an informal paradox was presented and later contributions have proved similar results. All of these contributions invoke, however, some version of a principle – the Mere Addition Principle – which is controversial. In Arrhenius (1998), I presented a theorem which didn't invoke this controversial principle but replaced it with logically and intuitively weaker conditions. Still, however, one of the conditions in my theorem shares with these earlier results the presupposition that welfare can be measured on at least an interval scale. One can deny this and, as a matter of..
In this study, several proofs of the compactness theorem for propositional logic with countably many atomic sentences are compared. Thereby some steps are taken towards a systematic philosophical study of the compactness theorem. In addition, some related data and morals for the theory of mathematical explanation are presented.
We are going to prove a key theorem that tells us just a bit more about the structure of the non-standard countable models of first-order Peano Arithmetic; and then we will very briefly consider whether any broadly philosophical morals can be drawn from the technical result.
Informal statements of Gödel's Second Incompleteness Theorem, referred to here as Informal Second Incompleteness, are simple and dramatic. However, current versions of Formal Second Incompleteness are complicated and awkward. We present new versions of Formal Second Incompleteness that are simple, and informally imply Informal Second Incompleteness. These results rest on the isolation of simple formal properties shared by consistency statements. Here we do not address any issues concerning proofs of Second Incompleteness.
In 1999, Jeffrey Ketland published a paper which posed a series of technical problems for deflationary theories of truth. Ketland argued that deflationism is incompatible with standard mathematical formalizations of truth, and he claimed that alternate deflationary formalizations are unable to explain some central uses of the truth predicate in mathematics. He also used Beth's definability theorem to argue that, contrary to deflationists' claims, the T-schema cannot provide an 'implicit definition' of truth. In this article, I want to challenge this final argument. Whatever other faults deflationism may have, the T-schema does provide an implicit definition of the truth predicate. Or so, at any rate, I shall argue.
In response to recent work on the aggregation of individual judgments on logically connected propositions into collective judgments, it is often asked whether judgment aggregation is a special case of Arrowian preference aggregation. We argue for the converse claim. After proving two impossibility theorems on judgment aggregation (using "systematicity" and "independence" conditions, respectively), we construct an embedding of preference aggregation into judgment aggregation and prove Arrow's theorem (stated for strict preferences) as a corollary of our second result. Although we thereby provide a new proof of Arrow's theorem, our main aim is to identify the analogue of Arrow's theorem in judgment aggregation, to clarify the relation between judgment and preference aggregation, and to illustrate the generality of the judgment aggregation model. JEL Classification: D70, D71.
I define T-schema deflationism as the thesis that a theory of truth for our language can simply take the form of certain instances of Tarski's schema (T). I show that any effective enumeration of these instances will yield as a dividend an effective enumeration of all truths of our language. But that contradicts Gödel's First Incompleteness Theorem. So the instances of (T) constituting the T-schema deflationist's theory of truth are not effectively enumerable, which casts doubt on the idea that the T-schema deflationist in any sense has a theory of truth. (The argument in section 2 of "Semantics for Deflationists" supersedes this paper.)
Sequel to Part I. In these articles, I describe Cantor's power-class theorem, as well as a number of logical and philosophical paradoxes that stem from it, many of which were discovered or considered (implicitly or explicitly) in Bertrand Russell's work. These include Russell's paradox of the class of all classes not members of themselves, as well as others involving properties, propositions, descriptive senses, class-intensions and equivalence classes of coextensional properties. Part II addresses Russell's own various attempts to solve these paradoxes, including strategies that he considered and rejected (limitation of size, the zigzag theory, etc.), as well as his own final views whereupon many purported entities that, if reified, lead to these contradictions, must not be genuine entities, but 'logical fictions' or 'logical constructions' instead.
A search is under way for a theory that can accommodate our intuitions in population axiology. The object of this search has proved elusive. This is not surprising since, as we shall see, any welfarist axiology that satisfies three reasonable conditions implies at least one of three counter-intuitive conclusions. I shall start by pointing out the failures in three recent attempts to construct an acceptable population axiology. I shall then present an impossibility theorem and conclude with a short discussion of how it might be extended to pluralist axiologies, that is, axiologies that take more values than welfare into account.
Einstein's "spookiness" is now called nonlocality, the mysterious ability of Nature to enforce correlations between separated but entangled parts of a quantum system that are out of speed-of-light contact, to reach faster-than-light across vast spatial distances or even across time itself to ensure that the parts of a quantum system are made to match. This column is about nonlocality, and how, through Bell's theorem, the nonlocality implicit in nature has been demonstrated in the laboratory.
In mathematical logic, Craig’s Theorem (not to be confused with Craig’s Interpolation Theorem) states that any recursively enumerable theory is recursively axiomatizable. Its epistemological interest concerns its possible use as a method of eliminating “theoretical content” from scientific theories.
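The idea behind the proof, roughly sketched (a standard argument, supplied here rather than quoted from the entry): enumerate the theorems of the given theory as φ_1, φ_2, … and take as new axioms the n-fold conjunctions
\[
A \;=\; \{\;\underbrace{\varphi_n \wedge \varphi_n \wedge \cdots \wedge \varphi_n}_{n \text{ conjuncts}} \;:\; n \ge 1\;\}.
\]
Whether a sentence belongs to A can be decided from its syntactic shape together with finitely many steps of the enumeration, yet A proves exactly the original theorems, so the theory is recursively axiomatized.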
We consider a seemingly popular justification (we call it the Reflexivity Defense) for the third derivability condition of the Hilbert-Bernays-Löb generalization of Gödel's Second Incompleteness Theorem (G2). We argue that (i) in certain settings (roughly, those where the representing theory of an arithmetization is allowed to be a proper subtheory of the represented theory), use of the Reflexivity Defense to justify the third condition induces a fourth condition, and that (ii) the justification of this fourth condition faces serious obstacles. We conclude that, in the types of settings mentioned, the Reflexivity Defense does not justify the usual 'reading' of G2—namely, that the consistency of the represented theory is not provable in the representing theory.
A brief, non-technical introduction to technical and philosophical aspects of Frege's philosophy of arithmetic. The exposition focuses on Frege's Theorem, which states that the axioms of arithmetic are provable, in second-order logic, from a single non-logical axiom, "Hume's Principle", which itself is: The number of Fs is the same as the number of Gs if, and only if, the Fs and Gs are in one-one correspondence.
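In symbols (one common rendering, supplied here), Hume's Principle says:
\[
\#x\,Fx = \#x\,Gx \;\leftrightarrow\; \exists R\,\big(R \text{ is a one-one correspondence between the } F\text{s and the } G\text{s}\big),
\]
where the right-hand side can be spelled out purely in second-order logical vocabulary.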
Amalgamating evidence of different kinds for the same hypothesis into an overall confirmation is analogous, I argue, to amalgamating individuals’ preferences into a group preference. The latter faces well-known impossibility theorems, most famously Arrow’s Theorem. Once the analogy between amalgamating evidence and amalgamating preferences is tight, it is obvious that amalgamating evidence might face a theorem similar to Arrow’s. I prove that this is so, and end by discussing the plausibility of the axioms required for the theorem.
Is the restricted, consistent, version of the T-scheme sufficient for an 'implicit definition' of truth? In a sense, the answer is yes (Haack 1978, Quine 1953). Section 4 of Ketland 1999 mentions this but gives a result saying that the T-scheme does not implicitly define truth in the stronger sense relevant for Beth's Definability Theorem. This insinuates that the T-scheme fares worse than the compositional truth theory as an implicit definition. However, the insinuation is mistaken. For, as Bays rightly points out, the result given extends to the compositional truth theory also. So, as regards implicit definability, both kinds of truth theory are equivalent. Some further discussion of this topic is mentioned (Gupta 2008, Ketland 2003, McGee 1991), all in agreement with Bays's analysis.
According to a long-standing philosophical tradition, impartiality is a distinctive and determining feature of moral judgments, especially in matters of distributive justice. This broad ethical tradition was revived in welfare economics by Vickrey and, above all, Harsanyi, under the form of the so-called Impartial Observer Theorem. The paper offers an analytical reconstruction of this argument and a step-wise philosophical critique of its premisses. It eventually provides a new formal version of the theorem based on subjective probability.
In recent years a number of criticisms have been raised against the formal systems of mathematical logic. The latter, qualified as closed systems, have been contrasted with systems of a new kind, called open systems, whose main feature is that they are always subject to unanticipated outcomes in their operation and can receive new information from outside at any time [cf. Hewitt 1991]. While Gödel's incompleteness theorem has been widely used to refute the main contentions of Hilbert's program, it does not seem to have been generally used to point out the inadequacy of a basic ingredient of that program - the concept of formal system as a closed system - and to stress the need to replace it by the concept of formal system as an open system.
A variation of Bell's theorem that deals with the indeterministic case is formulated and proved within the logical framework of Lewis's theory of counterfactuals. The no-faster-than-light-influence condition is expressed in terms of Lewis's 'would' counterfactual conditionals. Objections to this procedure raised by certain philosophers of science are examined and answered. The theorem shows that the incompatibility between the predictions of quantum theory and the idea of no faster-than-light influence cannot be ascribed to any auxiliary or tacit assumption of either determinism or the related idea that outcomes of unperformed measurements are determinate within nature. In addition, the theorem provides an example of an application of Lewis's theory of counterfactuals in a rigorous scientific context.
The CPT theorem of quantum field theory states that any relativistic (Lorentz-invariant) quantum field theory must also be invariant under CPT, the composition of charge conjugation, parity reversal and time reversal. This paper sketches a puzzle that seems to arise when one puts the existence of this sort of theorem alongside a standard way of thinking about symmetries, according to which spacetime symmetries (at any rate) are associated with features of the spacetime structure. The puzzle is, roughly, that the existence of a CPT theorem seems to show that it is not possible for a well-formulated theory that does not make use of a preferred frame or foliation to make use of a temporal orientation. Since a manifold with only a Lorentzian metric can be temporally orientable—capable of admitting a temporal orientation—this seems to be an odd sort of necessary connection between distinct existences. The paper then suggests a solution to the puzzle: it is suggested that the CPT theorem arises because temporal orientation is unlike other pieces of spacetime structure, in that one cannot represent it by a tensor field. To avoid irrelevant technical details, the discussion is carried out in the setting of classical field theory, using a little-known classical analog of the CPT theorem.
A variety of inaccurate claims about Gold's Theorem have appeared in the cognitive science literature. I begin by characterizing the logic of this theorem and its proof. I then examine several claims about Gold's Theorem, and I show why they are false. Finally, I assess the significance of Gold's Theorem for cognitive science.
I apply some of the lessons from quantum theory, in particular from Bell's theorem, to a debate on the foundations of decision theory and causation. By tracing a formal analogy between the basic assumptions of causal decision theory (CDT)—which was developed partly in response to Newcomb's problem—and those of a local hidden variable theory in the context of quantum mechanics, I show that an agent who acts according to CDT and gives any nonzero credence to some possible causal interpretations underlying quantum phenomena should bet against quantum mechanics in some feasible game scenarios involving entangled systems, no matter what evidence they acquire. As a consequence, either the most accepted version of decision theory is wrong, or it provides a practical distinction, in terms of the prescribed behaviour of rational agents, between some metaphysical hypotheses regarding the causal structure underlying quantum mechanics.
Tennenbaum's Theorem yields an elegant characterisation of the standard model of arithmetic. Several authors have recently claimed that this result has important philosophical consequences: in particular, it offers us a way of responding to model-theoretic worries about how we manage to grasp the standard model. We disagree. If there ever was such a problem about how we come to grasp the standard model, then Tennenbaum's Theorem does not help. We show this by examining a parallel argument, from a simpler model-theoretic result.
In 1955, John Harsanyi proved a remarkable theorem: Suppose n agents satisfy the assumptions of von Neumann/Morgenstern (1947) expected utility theory, and so does the group as a whole (or an observer). Suppose that, if each member of the group prefers option a to b, then so does the group, or the observer (Pareto condition). Then the group's utility function is a weighted sum of the individual utility functions. Despite Harsanyi's insistence that what he calls the Utilitarian Theorem embeds utilitarianism into a theory of rationality, the theorem has fallen short of having the kind of impact on the discussion of utilitarianism for which Harsanyi hoped. Yet how could the theorem influence this discussion? Utilitarianism is as attractive to some as it is appalling to others. The prospects for this dispute to be affected by a theorem seem dim. Yet a closer look shows how the theorem could make a contribution. To fix ideas, I understand by utilitarianism the following claims: (1) Consequentialism: Actions are evaluated in terms of their consequences only. (2) Bayesianism: An agent's beliefs about possible outcomes are captured probabilistically. (3) Welfarism: The judgement of the relative goodness of states of affairs is based..
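In symbols, the conclusion summarized above says that the group's (or observer's) utility function is a weighted sum of the individual utility functions (notation supplied here):
\[
U_{\text{group}}(x) \;=\; \sum_{i=1}^{n} w_i \, u_i(x)
\]
for some weights w_i, over all prospects x.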
In this paper, we show that Arrow's well-known impossibility theorem is instrumental in bringing the ongoing discussion about verisimilitude to a more general level of abstraction. After some preparatory technical steps, we show that Arrow's requirements for voting procedures in social choice are also natural desiderata for a general verisimilitude definition that places content and likeness considerations on the same footing. Our main result states that no qualitative unifying procedure of a functional form can simultaneously satisfy the requirements of Unanimity, Independence of irrelevant alternatives and Non-dictatorship at the level of sentence variables. By giving a formal account of the incompatibility of the considerations of content and likeness, our impossibility result makes it possible to systematize the discussion about verisimilitude, and to understand it in more general terms.
Any attempt to construct a realist interpretation of quantum theory founders on the Kochen-Specker theorem, which asserts the impossibility of assigning values to quantum quantities in a way that preserves functional relations between them. We construct a new type of valuation which is defined on all operators, and which respects an appropriate version of the functional composition principle. The truth-values assigned to propositions are (i) contextual; and (ii) multi-valued, where the space of contexts and the multi-valued logic for each context come naturally from the topos theory of presheaves. The first step in our theory is to demonstrate that the Kochen-Specker theorem is equivalent to the statement that a certain presheaf defined on the category of self-adjoint operators has no global elements. We then show how the use of ideas drawn from the theory of presheaves leads to the definition of a generalised valuation in quantum theory whose values are sieves of operators. In particular, we show how each quantum state leads to such a generalised valuation. A key ingredient throughout is the idea that, in a situation where no normal truth-value can be given to a proposition asserting that the value of a physical quantity A lies in a set D of real numbers, it is nevertheless possible to ascribe a partial truth-value which is determined by the set of all coarse-grained propositions that assert that some function f(A) lies in f(D), and that are true in a normal sense. The set of all such coarse-grainings forms a sieve on the category of self-adjoint operators, and is hence fundamentally related to the theory of presheaves.
On September 6, 2004, using the Isabelle proof assistant, I verified the following statement: (%x. pi x * ln (real x) / (real x)) ----> 1 The system thereby confirmed that the prime number theorem is a consequence of the axioms of higher-order logic together with an axiom asserting the existence of an infinite set. All told, our number theory session, including the proof of the prime number theorem and supporting libraries, constitutes 673 pages of proof scripts, or roughly 30,000 lines. This count includes about 65 pages of elementary number theory that we had at the outset, developed by Larry Paulson and others; also about 50 pages devoted to a proof of the law of quadratic reciprocity and properties of Euler's ϕ function, neither of which are used in the proof of the prime number theorem. The page count does not include the basic HOL library, or properties of the real numbers that we obtained from the HOL-Complex library.
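The verified statement is the prime number theorem in its classical form (standard notation, supplied here): with π(x) the number of primes at most x,
\[
\lim_{x \to \infty} \frac{\pi(x)\,\ln x}{x} \;=\; 1;
\]
the Isabelle line quoted above is a direct transcription of this limit.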
It is argued that awareness of the distinction between dynamical and variational symmetries is crucial to understanding the significance of Noether's 1918 work. Specific attention is paid, by way of a number of striking examples, to Noether's first theorem, which establishes a correlation between dynamical symmetries and conservation principles.
In a previous paper, we have proposed assigning as the value of a physical quantity in quantum theory, a certain kind of set (a sieve) of quantities that are functions of the given quantity. The motivation was in part physical---such a valuation illuminates the Kochen-Specker theorem; and in part mathematical---the valuation arises naturally in the topos theory of presheaves. This paper discusses the conceptual aspects of this proposal. We also undertake two other tasks. First, we explain how the proposed valuations could arise much more generally than just in quantum physics; in particular, they arise as naturally in classical physics. Second, we give another motivation for such valuations (that applies equally to classical and quantum physics). This arises from applying to propositions about the values of physical quantities some general axioms governing partial truth for any kind of proposition.
We argue that Löb's Theorem implies a limitation on mechanism. Specifically, we argue, via an application of a generalized version of Löb's Theorem, that any particular device known by an observer to be mechanical cannot be used as an epistemic authority (of a particular type) by that observer: either the belief-set of such an authority is not mechanizable or, if it is, there is no identifiable formal system of which the observer can know (or truly believe) it to be the theorem-set. This gives, we believe, an important and hitherto unnoticed connection between mechanism and the use of authorities by human-like epistemic agents.
Zwart and Franssen's impossibility theorem reveals a conflict between the possible-world-based content-definition and the possible-world-based likeness-definition of verisimilitude. In Sect. 2 we show that the possible-world-based content-definition violates four basic intuitions of Popper's consequence-based content-account to verisimilitude, and therefore cannot be said to be in the spirit of Popper's account, although this is the opinion of some prominent authors. In Sect. 3 we argue that in consequence-accounts, content-aspects and likeness-aspects of verisimilitude are not in conflict with each other, but in agreement. We explain this fact by pointing towards the deep difference between possible-world- and the consequence-accounts, which does not lie in the difference between syntactic (object-language) versus semantic (meta-language) formulations, but in the difference between 'disjunction-of-possible-worlds' versus 'conjunction-of-parts' representations of theories. Drawing on earlier work, we explain in Sect. 4 how the shortcomings of Popper's original definition can be repaired by what we call the relevant element approach. We propose a quantitative likeness-definition of verisimilitude based on relevant elements which provably agrees with the qualitative relevant content-definition of verisimilitude on all pairs of comparable theories. We conclude the paper with a plea for consequence-accounts and a brief analysis of the problem of language-dependence (Sect. 6).
Carter and Leslie (1996) have argued, using Bayes's theorem, that our being alive now supports the hypothesis of an early 'Doomsday'. Unlike some critics (Eckhardt 1997), we accept their argument in part: given that we exist, our existence now indeed favors 'Doom sooner' over 'Doom later'. The very fact of our existence, however, favors 'Doom later'. In simple cases, a hypothetical approach to the problem of 'old evidence' shows that these two effects cancel out: our existence now yields no information about the coming of Doom. More complex cases suggest a move from countably additive to non-standard probability measures.
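A toy numerical illustration of the first of the two effects mentioned above, the Bayesian shift toward 'Doom sooner' from one's birth rank (all figures are invented for illustration and are not taken from the paper; the countervailing effect of the fact of existence is not modeled here):

    # Two hypotheses about how many humans will ever live, updated on one's own
    # birth rank, assuming that rank is uniformly distributed among all who ever live.
    def p_doom_soon(prior_soon: float, n_soon: float, n_later: float, my_rank: float) -> float:
        like_soon = 1 / n_soon if my_rank <= n_soon else 0.0
        like_later = 1 / n_later if my_rank <= n_later else 0.0
        joint_soon = prior_soon * like_soon
        return joint_soon / (joint_soon + (1 - prior_soon) * like_later)

    # Even split prior between 100 billion and 10 trillion total humans; rank ~60 billion.
    print(p_doom_soon(0.5, 100e9, 10e12, 60e9))   # ~0.99: the update favors 'Doom sooner'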
Condorcet's famous jury theorem reaches an optimistic conclusion on the correctness of majority decisions, based on two controversial premises about voters: they are competent and vote independently, in a technical sense. I carefully analyse these premises and show that: (i) whether a premise is justified depends on the notion of probability considered and (ii) none of the notions renders both premises simultaneously justified. Under the perhaps most interesting notions, the independence assumption should be weakened.
Here's one version of Gödel's 1931 First Incompleteness Theorem: If T is a nice, sound theory of arithmetic, then it is incomplete, i.e. there are arithmetical sentences ϕ such that T proves neither ϕ nor ¬ϕ. There are three things here to explain straight away.
Werning applies a theorem by Hodges in order to put forward an argument against Quine’s thesis of the indeterminacy of translation (understood as a thesis on meaning, not on reference) and in favour of what Werning calls ‘semantic realism’. We show that the argument rests on two critical premises both of which are false. The reasons for these failures are explained and the actual place of this application of Hodges’ theorem within Quine’s philosophy of language is outlined.
This essay touches on a number of topics in philosophy of quantum field theory from the point of view of the LSZ asymptotic approach to scattering theory. First, particle/field duality is seen to be a property of free field theory and not of interacting QFT. Second, it is demonstrated how LSZ side-steps the implications of Haag's theorem. Finally, a recent argument due to Redhead (1995), Malament (1996) and Arageorgis (1995) against the concept of localized particle states is addressed. Briefly, the argument observes that the Reeh–Schlieder theorem entails that correlations between spacelike separated vacuum expectation values of local field operators are always present, and this, according to the above authors, dictates against the notion of a localized particle state. I claim that this moral is excessive and that a coherent notion of localized particles is given by the LSZ approach. The underlying moral to be drawn from this analysis is that questions concerning the ontology of interacting QFT cannot be appropriately addressed if one restricts oneself to the free theory.
A comparison is made of the traditional Loschmidt (reversibility) and Zermelo (recurrence) objections to Boltzmann's H-theorem, and its simplified variant in the Ehrenfests' 1912 wind-tree model. The little-cited 1896 (pre-recurrence) objection of Zermelo (similar to an 1889 argument due to Poincaré) is also analysed. Significant differences between the objections are highlighted, and several old and modern misconceptions concerning both them and the H-theorem are clarified. We give particular emphasis to the radical nature of Poincaré's and Zermelo's attack, and the importance of the shift in Boltzmann's thinking in response to the objections as a whole.
This paper combines personal reminiscences of the philosopher John Corcoran with a discussion of certain conflicts between historians of logic and philosophers of logic. Some mistaken claims about the history of the Bolzano-Weierstrass Theorem are analyzed in detail and corrected.
According to a wrong interpretation of the Bell theorem, it has been repeatedly claimed in recent times that we are forced by experiments to drop any possible form of realism in the foundations of quantum mechanics. In this paper I defend the simple thesis according to which the above claim cannot be consistently supported: the Bell theorem does not concern realism, and realism per se cannot be refuted in itself by any quantum experiment. As a consequence, realism in quantum mechanics is not something that can be simply explained away once and for all on the basis of experiments, but rather something that must be conceptually characterized and discussed in terms of its foundational virtues and vices. To assess it, we cannot rely on experimentation but rather on philosophical discussion: realism is not a phlogiston-like notion, despite the efforts of the contemporary quantum orthodoxy to conceive it in Russellian terms as the relics of a bygone age.
The hidden-variables model constructed by Karl Hess and Walter Philipp is claimed by its authors to exploit a "loophole" in Bell's theorem; according to Hess and Philipp, the parameters employed in their model extend beyond those considered by Bell. Furthermore, they claim that their model satisfies Einstein locality and is free of any "suspicion of spooky action at a distance." Both of these claims are false; the Hess-Philipp model achieves agreement with the quantum-mechanical predictions, not by circumventing Bell's theorem, but via Parameter Dependence.
This paper provides a philosophical analysis of the ongoing controversy surrounding R.A. Fisher's famous fundamental theorem of natural selection. The difference between the traditional and modern interpretations of the theorem is explained. I argue that proponents of the modern interpretation have captured Fisher's intended meaning correctly and shown that the theorem is mathematically correct, pace the traditional consensus. However, whether the theorem has any real biological significance remains an unresolved issue. I argue that the answer depends on whether we accept Fisher's non-standard notion of environmental change, on which the theorem rests; arguments for and against this notion are explored. I suggest that there is a close link between Fisher's fundamental theorem and the modern gene's eye view of evolution. Contents: Introduction -- What Does the Fundamental Theorem Say? -- Key Concepts Explained -- Alleged Significance of the FTNS -- Traditional versus Modern Interpretations of the FTNS -- The Modern Interpretation Illustrated -- Fisher's Concept of Environmental Change -- Causality and the Modern Interpretation -- The Significance of the FTNS Re-considered -- Appendix
The aim of this paper is to comprehensively question the validity of the standard way of interpreting Chaitin's famous incompleteness theorem, which says that for every formalized theory of arithmetic there is a finite constant c such that the theory in question cannot prove any particular number to have Kolmogorov complexity larger than c. The received interpretation of the theorem claims that the limiting constant is determined by the complexity of the theory itself, which is assumed to be a good measure of the strength of the theory. I exhibit certain strong counterexamples and establish conclusively that the received view is false. Moreover, I show that the limiting constants provided by the theorem do not in any way reflect the power of formalized theories, but that the values of these constants are actually determined by the chosen coding of Turing machines, and are thus quite accidental.
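In symbols, the result under discussion says (standard formulation, supplied here): for a suitable formalized theory T there is a constant c, depending on T and on the chosen coding of Turing machines, such that
\[
T \nvdash K(n) > c \quad \text{for every particular } n,
\]
where K denotes Kolmogorov complexity; the paper's point is that c tracks the coding rather than the strength of T.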
I address a number of questions concerning the interpretation of local time and the corresponding states theorem (CST) of the Versuch, questions which have been addressed either incompletely or inadequately in the secondary literature. In particular: (1) What is the relation between local time and the behavior of moving clocks? (2) What is the relation between the primed field variables and the electric and magnetic fields in a moving system? (3) What is the relation of the CST to the principle of relativity and requirements of covariance? (4) Does the introduction of local time and the primed field variables constitute a hypothesis, i.e., an addition to or a modification of the basic theory?
According to the received view, formalism – interpreted as the thesis that mathematical truth does not outrun the consequences of our maximal mathematical theory – has been refuted by Goedel's theorem. In support of this claim, proponents of the received view usually invoke an informal argument for the truth of the Goedel sentence, an argument which is supposed to reconstruct our reasoning in seeing its truth. Against this, Field has argued in a series of papers that the principles involved in this argument – when applied to our maximal mathematical theory – are unsound. This paper defends the received view by showing that there is a way of seeing the truth of the Goedel sentence which is immune to Field's strategy.
This paper examines Boltzmann's responses to the Loschmidt reversibility objection to the H-theorem, as presented in his Lectures on Gas Theory. I describe and evaluate two distinct conceptions of the assumption of molecular disorder found in this work, and contrast these notions with the Stosszahlansatz, as well as with the predominant contemporary conception of molecular disorder. Both these conceptions are assessed with respect to the reversibility objection. Finally, I interpret Boltzmann as claiming that a state of molecular disorder serves as a necessary condition for the application of probabilistic arguments. This in turn offers a way to bridge the conceptual gap between the H-theorem and his combinatorial argument.
Starting from a brief recapitulation of the contemporary debate on scientific realism, this paper argues for the following thesis: Assume a theory T has been empirically successful in a domain of application A, but was superseded later on by a superior theory T*, which was likewise successful in A but has an arbitrarily different theoretical superstructure. Then under natural conditions T contains certain theoretical expressions, which yielded T's empirical success, such that these T-expressions correspond (in A) to certain theoretical expressions of T*, and given T* is true, they refer indirectly to the entities denoted by these expressions of T*. The thesis is first motivated by a study of the phlogiston–oxygen example. Then the thesis is proved in the form of a logical theorem, and illustrated by further examples. The final sections explain how the correspondence theorem justifies scientific realism and work out the advantages of the suggested account. Contents: Introduction: Pessimistic Meta-induction vs. Structural Correspondence -- The Case of the Phlogiston Theory -- Steps Towards a Systematic Correspondence Theorem -- The Correspondence Theorem and Its Ontological Interpretation -- Further Historical Applications -- Discussion of the Correspondence Theorem: Objections and Replies -- Consequences for Scientific Realism and Comparison with Other Positions -- 7.1 Comparison with constructive empiricism -- 7.2 Major difference from standard scientific realism -- 7.3 From minimal realism and correspondence to scientific realism -- 7.4 Comparison with particular realistic positions
Amartya Sen has recently urged that political philosophers pay attention to social choice theory in their deliberations about justice. However, despite its merits, social choice theory is not standardly part of undergraduate political philosophy. One difficulty is that it involves symbolic logic and difficult concepts. We can reduce this challenge by making the material no harder than it needs to be. I consider the standard proof of Arrow's Theorem, a seminal result. Kenneth Arrow does not explicate the role of the independence of irrelevant alternatives. Sen and Wulf Gaertner have offered clarifications, but I shall elucidate the full role.
The renewed interest in the foundations of quantum statistical mechanics in recent years has led us to study John von Neumann’s 1929 article on the quantum ergodic theorem. We have found this almost forgotten article, which until now has been available only in German, to be a treasure chest, and to be much misunderstood. In it, von Neumann studied the long-time behavior of macroscopic quantum systems. While one of the two theorems announced in his title, the one he calls the “quantum H-theorem,” is actually a much weaker statement than Boltzmann’s classical H-theorem, the other theorem, which he calls the “quantum ergodic theorem,” is a beautiful and very non-trivial result. It expresses a fact we call “normal typicality” and can be summarized as follows: For a “typical” finite family of commuting macroscopic observables, every initial wave function ψ0 from a micro-canonical energy shell so evolves that for most times t in the long run, the joint probability distribution of these observables obtained from ψt is close to their micro-canonical distribution.
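As a rough gloss on the statement summarized above (the notation follows standard presentations of normal typicality rather than the article itself): if the commuting macroscopic observables decompose the energy shell \(\mathcal{H}\) into joint eigenspaces, or "macro-spaces", \(\mathcal{H}_\nu\) with projections \(P_\nu\), then for typical choices of this decomposition and every initial \(\psi_0\) in \(\mathcal{H}\),

\[ \langle \psi_t \mid P_\nu \mid \psi_t \rangle \;\approx\; \frac{\dim \mathcal{H}_\nu}{\dim \mathcal{H}} \quad \text{for most times } t, \]

that is, the probability assigned to each macro-state tracks its micro-canonical weight.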
Bayes' Theorem is a simple mathematical formula used for calculating conditional probabilities. It figures prominently in subjectivist or Bayesian approaches to epistemology, statistics, and inductive logic. Subjectivists, who maintain that rational belief is governed by the laws of probability, lean heavily on conditional probabilities in their theories of evidence and their models of empirical learning. Bayes' Theorem is central to these enterprises both because it simplifies the calculation of conditional probabilities and because it clarifies significant features of the subjectivist position. Indeed, the Theorem's central insight — that a hypothesis is confirmed by any body of data that its truth renders probable — is the cornerstone of all subjectivist methodology.
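The formula itself, for completeness (standard notation, not specific to this entry):

\[ P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E)}, \qquad P(E) \;=\; P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H). \]

Since \(P(H \mid E)/P(H) = P(E \mid H)/P(E)\), a hypothesis H is confirmed by evidence E, in the sense that \(P(H \mid E) > P(H)\), exactly when \(P(E \mid H) > P(E)\); this is the "central insight" the abstract refers to.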
A semantical proof of Craig's interpolation theorem for the intuitionistic predicate logic and some intermediate propositional logics will be given. Our proof is an extension of Henkin's method developed in . It will clarify the relation between the interpolation theorem and Robinson's consistency theorem for these logics and will enable us to give a uniform way of proving the interpolation theorem for them.
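For orientation, the theorem proved semantically is the standard one (stated here in its usual form rather than quoted from the paper): if \( \vdash \varphi \rightarrow \psi \), then there is an interpolant \( \theta \), whose non-logical vocabulary occurs in both \( \varphi \) and \( \psi \), such that \( \vdash \varphi \rightarrow \theta \) and \( \vdash \theta \rightarrow \psi \). Robinson's consistency theorem is its model-theoretic counterpart: if two theories are each consistent and no sentence of their shared vocabulary is provable from one while refutable from the other, then their union is consistent.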
Bell’s theorem in its standard version demonstrates that the joint assumptions of the hidden-variable hypothesis and the principle of local causation lead to a conflict with quantum-mechanical predictions. In his latest counterfactual strengthening of Bell’s theorem, Stapp attempts to prove that the locality assumption itself contradicts the quantum-mechanical predictions in the Hardy case. His method relies on constructing a complex, non-truth functional formula which consists of statements about measurements and outcomes in some region R, and whose truth value depends on the selection of a measurement setting in a space-like separated location L. Stapp argues that this fact shows that the information about the measurement selection made in L has to be present in R. I give detailed reasons why this conclusion can and should be resisted. Next I correct and formalize an informal argument by Shimony and Stein showing that the locality condition coupled with Einstein’s criterion of reality is inconsistent with quantum-mechanical predictions. I discuss the possibility of avoiding the inconsistency by rejecting Einstein’s criterion rather than the locality assumption.
In this work I propose an analogy between Pythagoras's theorem and the logical-formal structure of Werner Heisenberg's "relations of uncertainty." The reasons that pushed me to propose this analogy stem from the following observation: often, when a problem of measurement precision arises in the exact sciences, it has been resolved by resorting to squaring. It seems to me that the aporia deriving from the uncertainty principle may likewise find a solution through this stratagem. Indeed, if the first classic example of the argument is the solution of the incommensurability between the legs (catheti) and the hypotenuse of a right triangle, one of the most recent cases is that represented by Heisenberg's uncertainty principle.
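For concreteness, the two formulas the analogy trades on are, in standard notation (not drawn from the paper itself),

\[ a^{2} + b^{2} = c^{2} \qquad \text{and} \qquad \Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}, \]

the first relating the squared legs of a right triangle to the squared hypotenuse, the second being Heisenberg's uncertainty relation for position and momentum; the author's suggestion is that squaring, which dissolved the incommensurability problem in the first case, might play an analogous role in the second.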
This essay explores the detailed argument of the Coase Theorem, as found in Ronald Coase’s “The Problem of Social Cost” and subsequently defended by Coase in The Firm, the Market, and the Law. Fascination with the Coase Theorem arises over its apparently unassailable counterintuitive conclusion that the imposition of legal liability has no effect on which of two competing uses of land prevails, and also over the general difficulty in tying down an unqualified statement of the theorem. Instead of entering the debate over what exactly the theorem holds, this article suggests that the core of Coase’s reasoning is flawed and to the extent that any version of the theorem relies upon this reasoning it can be disproved. It is further suggested that a version of the theorem which avoided the need for Coase’s core argument by focusing on “the efficiency thesis” at the expense of “the invariance thesis” would be insufficiently significant to merit the status of a theorem; and that, in any event, Coase’s reasoning does not sustain an efficient outcome. The heart of the essay comprises the allegation of an error made by Coase when he transferred his core argument to the context of economic rents. The essay commences by considering and modelling the nature of the counter-intuitive thrust to the Coase Theorem, which is used to trace the development of Coase’s reasoning, and ultimately to expose the flaw it contains. Ancillary observations are made on the relationship between the Coasean analysis of characteristically legal problems and the conditions of general market equilibrium, and the theoretical status of law-and-economics.
Does the proof of Fermat’s Last Theorem (FLT) go beyond Zermelo-Fraenkel set theory (ZFC)? Or does it merely use Peano Arithmetic (PA) or some weaker fragment of that? The answers depend on what is meant by “proof” and “use,” and are not entirely known. This paper surveys the current state of these questions and briefly sketches the methods of cohomological number theory used in the existing proof. The existing proof of FLT is Wiles plus improvements that do not yet change its character. Far from self-contained, it has vast prerequisites merely introduced in the 500 pages of [Cornell et al., 1997]. We will say that the assumptions explicitly used in proofs that Wiles cites as steps in his own are “used in fact in the published proof.” It is currently unknown what assumptions are “used in principle” in the sense of being proof-theoretically indispensable to FLT. Certainly much less than ZFC is used in principle, probably nothing beyond PA, and perhaps much less than that. The oddly contentious issue is universes, often called Grothendieck universes. On ZFC foundations a universe is an uncountable transitive set U.
The dream of a community of philosophers engaged in inquiry with shared standards of evidence and justification has long been with us. It has led some thinkers puzzled by our mathematical experience to look to mathematics for adjudication between competing views. I am skeptical of this approach and consider Skolem's philosophical uses of the Löwenheim-Skolem Theorem to exemplify it. I argue that these uses invariably beg the questions at issue. I say 'uses', because I claim further that Skolem shifted his position on the philosophical significance of the theorem as a result of a shift in his background beliefs. The nature of this shift and possible explanations for it are investigated. Ironically, Skolem's own case provides a historical example of the philosophical flexibility of his theorem. Our suspicion ought always to be aroused when a proof proves more than its means allow it. Something of this sort might be called 'a puffed-up proof'. Ludwig Wittgenstein, Remarks on the foundations of mathematics (revised edition), vol. 2, 21.
This paper reconstructs and evaluates the representation theorem presented by Ramsey in his essay 'Truth and Probability', showing how its proof depends on a novel application of Hölder's theory of measurement. I argue that it must be understood as a solution to the problem of measuring partial belief, a solution that in many ways remains unsurpassed. Finally I show that the method it employs may be interpreted in such a way as to avoid a well known objection to it due to Richard Jeffrey.
We prove a uniqueness theorem showing that, subject to certain natural constraints, all 'no collapse' interpretations of quantum mechanics can be uniquely characterized and reduced to the choice of a particular preferred observable as determinate (definite, sharp). We show how certain versions of the modal interpretation, Bohm's 'causal' interpretation, Bohr's complementarity interpretation, and the orthodox (Dirac-von Neumann) interpretation without the projection postulate can be recovered from the theorem. Bohr's complementarity and Einstein's realism appear as two quite different proposals for selecting the preferred determinate observable: either settled pragmatically by what we choose to observe, or fixed once and for all, as the Einsteinian realist would require, in which case the preferred observable is a 'beable' in Bell's sense, as in Bohm's interpretation (where the preferred observable is position in configuration space).
We introduce a notion of semantical closure for theories by formalizing Nepeivoda's notion of truth. Tarski's theorem on truth definitions is discussed in the light of Kleene's three-valued logic (here treated with a formal reinterpretation of logical constants). Connections with Definability Theory are also established.
We discuss the content and significance of John von Neumann’s quantum ergodic theorem (QET) of 1929, a strong result arising from the mere mathematical structure of quantum mechanics. The QET is a precise formulation of what we call normal typicality, i.e., the statement that, for typical large systems, every initial wave function ψ0 from an energy shell is “normal”: it evolves in such a way that |ψt⟩⟨ψt| is, for most t, macroscopically equivalent to the micro-canonical density matrix. The QET was mostly forgotten after it was criticized as a dynamically vacuous statement in several papers in the 1950s. However, we point out that this criticism does not apply to the actual QET, a correct statement of which does not appear in these papers, but to a different (indeed weaker) statement. Furthermore, we formulate a stronger statement of normal typicality, based on the observation that the bound on the deviations from the average specified by von Neumann is unnecessarily coarse and a much tighter (and more relevant) bound actually follows from his proof.
This paper concerns voting with logical consequences, which means that anybody voting for an alternative x should vote for the logical consequences of x as well. Similarly, the social choice set is also supposed to be closed under logical consequences. The central result of the paper is that, given a set of fairly natural conditions, the only social choice functions that satisfy social logical closure are oligarchic (where a subset of the voters are decisive for the social choice). The set of conditions needed for the proof include a version of Independence of Irrelevant Alternatives that also plays a central role in Arrow's impossibility theorem.
We extend the topos-theoretic treatment given in previous papers of assigning values to quantities in quantum theory, and of related issues such as the Kochen-Specker theorem. This extension has two main parts: the use of von Neumann algebras as a base category (Section 2); and the relation of our generalized valuations to (i) the assignment to quantities of intervals of real numbers, and (ii) the idea of a subobject of the coarse-graining presheaf (Section 3).
Kochen and Specker's theorem can be seen as a consequence of Gleason's theorem and logical compactness. Similar compactness arguments lead to stronger results about finite sets of rays in Hilbert space, which we also prove by a direct construction. Finally, we demonstrate that Gleason's theorem itself has a constructive proof, based on a generic, finite, effectively generated set of rays, on which every quantum state can be approximated.
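For reference, Gleason's theorem in its usual formulation (not quoted from this entry) states that on a separable Hilbert space \(\mathcal{H}\) with \(\dim \mathcal{H} \ge 3\), every countably additive probability measure \(\mu\) on the lattice of closed subspaces arises from a density operator: \( \mu(P) = \mathrm{tr}(\rho P) \) for some positive operator \(\rho\) with \(\mathrm{tr}\,\rho = 1\). The Kochen–Specker theorem follows because no measure of this form can take only the values 0 and 1, and a compactness argument transfers the impossibility to a finite set of rays.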
Since the validity of Bell's inequalities implies the existence of joint probabilities for non-commuting observables, there is no universal consensus as to what the violation of these inequalities signifies. While the majority view is that the violation teaches us an important lesson about the possibility of explanations, if not about metaphysical issues, there is also a minimalist position claiming that the violation is to be expected from simple facts about probability theory. This minimalist position is backed by theorems due to A. Fine and I. Pitowsky. Our paper shows that the minimalist position cannot be sustained. To this end, we give a formally rigorous interpretation of joint probabilities in the combined modal and spatiotemporal framework of 'stochastic outcomes in branching space-time' (SOBST) (Kowalski and Placek, 1999; Placek, 2000). We show in this framework that the claim that there can be no joint probabilities for non-commuting observables is incorrect. The lesson from Fine's theorem is not that Bell's inequalities will be violated anyhow, but that an adequate model for the Bell/Aspect experiment must not define global joint probabilities. Thus we investigate the class of stochastic hidden variable models, which prima facie do not define such joint probabilities. The reason why these models fail supports the majority view: Bell's inequalities are not just a mathematical artifact.
There is a familiar derivation of Gödel’s Theorem from the proof by diagonalization of the unsolvability of the Halting Problem. That proof, though, still involves a kind of self-referential trick, as we in effect construct a sentence that says ‘the algorithm searching for a proof of me doesn’t halt’. It is worth showing, then, that some core results in the theory of partial recursive functions directly entail Gödel’s First Incompleteness Theorem without any further self-referential trick.
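As a complement to this abstract, here is a minimal sketch, in Python and purely for illustration, of the diagonal construction behind the unsolvability of the Halting Problem; the function halts is hypothetical, a stand-in for the decider whose impossibility is being demonstrated, and nothing here is drawn from the paper itself.

import inspect

def halts(program_source: str, input_value: str) -> bool:
    """Hypothetical total decider: True iff the program given by
    program_source halts on input_value. Assumed only for the sake
    of the contradiction below; no such function can exist."""
    raise NotImplementedError("no such decider exists")

def diagonal(program_source: str) -> None:
    # If the given program would halt when run on its own source,
    # loop forever; otherwise halt immediately.
    if halts(program_source, program_source):
        while True:
            pass

# Feeding diagonal its own source yields the contradiction:
# diagonal(source) halts if and only if halts(source, source) is False,
# i.e. if and only if diagonal(source) does not halt.
source = inspect.getsource(diagonal)
# diagonal(source)  # with a real decider in place, this call could neither halt nor run forever

Gödel's theorem then follows, as the abstract notes, by constructing a sentence that in effect asserts that the proof search for it never halts.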
A geometrical interpretation of independence and exchangeability leads to understanding the failure of de Finetti's theorem for a finite exchangeable sequence. In particular an exchangeable sequence of length r which can be extended to an exchangeable sequence of length k is almost a mixture of independent experiments, the error going to zero like 1/k.
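To make the advertised rate concrete, a standard quantitative form of this statement bounds the total-variation distance between the law of the first r coordinates of the length-k exchangeable sequence and the nearest mixture of i.i.d. laws by a quantity of order \(r^{2}/k\) (the exact constant varies with the setting and is not taken from this abstract):

\[ \Bigl\| \mathrm{Law}(X_1,\dots,X_r) \;-\; \int \mu^{\otimes r}\, \pi(d\mu) \Bigr\|_{TV} \;=\; O\!\left(\frac{r^{2}}{k}\right), \]

so for fixed r the error vanishes like 1/k as the exchangeable extension grows, which is the sense in which the sequence is "almost a mixture of independent experiments".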
This junior/senior level text is devoted to a study of first-order logic and its role in the foundations of mathematics: What is a proof? How can a proof be justified? To what extent can a proof be made a purely mechanical procedure? How much faith can we have in a proof that is so complex that no one can follow it through in a lifetime? The first substantial answers to these questions have only been obtained in this century. The most striking results are contained in Goedel's work: First, it is possible to give a simple set of rules that suffice to carry out all mathematical proofs; but, second, these rules are necessarily incomplete - it is impossible, for example, to prove all true statements of arithmetic. The book begins with an introduction to first-order logic, Goedel's theorem, and model theory. A second part covers extensions of first-order logic and limitations of the formal methods. The book covers several advanced topics, not commonly treated in introductory texts, such as Trachtenbrot's undecidability theorem, Fraïssé's elementary equivalence, and Lindström's theorem on the maximality of first-order logic.