Critical pedagogy has often been linked in the literature to faith traditions such as liberation theology, usually with the intent of improving or redirecting it. While recognizing and drawing from those previous linkages, Jacob Neumann goes further in this essay and develops the thesis that critical pedagogy can not just benefit from a connection with faith traditions, but is actually, in and of itself, a practice of faith. In this analysis, he juxtaposes critical pedagogy against three conceptualizations of faith: John Caputo's blurring of the modernist division between faith and reason, Paul Tillich's argument that faith is “ultimate concern,” and Paulo Freire's theology and early Christian influences. Using this three-pronged approach, Neumann argues that regardless of how it is seen, critical pedagogy manifests as a practice of faith “all the way down.”
The renewed interest in the foundations of quantum statistical mechanics in recent years has led us to study John von Neumann’s 1929 article on the quantum ergodic theorem. We have found this almost forgotten article, which until now has been available only in German, to be a treasure chest, and to be much misunderstood. In it, von Neumann studied the long-time behavior of macroscopic quantum systems. While one of the two theorems announced in his title, the one he calls the “quantum H-theorem,” is actually a much weaker statement than Boltzmann’s classical H-theorem, the other theorem, which he calls the “quantum ergodic theorem,” is a beautiful and very non-trivial result. It expresses a fact we call “normal typicality” and can be summarized as follows: For a “typical” finite family of commuting macroscopic observables, every initial wave function ψ0 from a micro-canonical energy shell so evolves that for most times t in the long run, the joint probability distribution of these observables obtained from ψt is close to their micro-canonical distribution.
Applied mathematics often operates by way of shakily rationalized expedients that can neither be understood in a deductive-nomological nor in an anti-realist setting. Rather, so a recent paper of Mark Wilson argues, these complexities indicate some element in our mathematical descriptions that is alien to the physical world. In this vein the mathematical opportunist openly seeks or engineers appropriate conditions for mathematics to get hold on a given problem. Honest mathematical optimists, instead, try to liberalize mathematical ontology so as to include all physical solutions. Following John von Neumann, the present paper argues that the axiomatization of a scientific theory can be performed in a rather opportunistic fashion, such that optimism and opportunism appear as two modes of a single strategy whose relative weight is determined by the status of the field to be investigated. Wilson's promising approach may thus be reformulated so as to avoid precarious talk about a physical world that is void of mathematical structure. This also makes the appraisal of the axiomatic method in applied mathematics less dependent upon foundationalist issues.
The evolution of John von Neumann's scientific interests and a study of his writings show that von Neumann increasingly supported an empirical, computational method. This is in stark contrast with the extant view of von Neumann as a pure theorist.
Both von Neumann and Wiener were outsiders to biology. Both were inspired by biology and both proposed models and generalizations that proved inspirational for biologists. Around the same time in the 1940s von Neumann developed the notion of self-reproducing automata and Wiener suggested an explication of teleology using the notion of negative feedback. These efforts were similar in spirit. Both von Neumann and Wiener used mathematical ideas to attack foundational issues in biology, and the concepts they articulated had lasting effect. But there were significant differences as well. Von Neumann presented a how-possibly model, which sparked interest by mathematicians and computer scientists, while Wiener collaborated more directly with biologists, and his proposal influenced the philosophy of biology. The two cases illustrate different strategies by which mathematicians, the “professional outsiders” of science, can choose to guide their engagement with biological questions and with the biological community, and illustrate different kinds of generalizations that mathematization can contribute to biology. The different strategies employed by von Neumann and Wiener and the types of models they constructed may have affected the fate of von Neumann’s and Wiener’s ideas – as well as the reputation, in biology, of von Neumann and Wiener themselves.
René Descartes proposed an interactive dualism that posits an interaction between the mind of a human being and some of the matter located in his or her brain. Isaac Newton subsequently formulated a physical theory based exclusively on the material/physical part of Descartes’ ontology. Newton’s theory enforced the principle of the causal closure of the physical, and the classical physics that grew out of it enforces this same principle. This classical theory purports to give, in principle, a complete deterministic account of the physically described properties of nature, expressed exclusively in terms of these physically described properties themselves. Orthodox contemporary physical theory violates this principle in two separate ways. First, it injects random elements into the dynamics. Second, it allows, and also requires, abrupt probing actions that disrupt the mechanistically described evolution of the physically described systems. These probing actions are called Process 1 interventions by von Neumann. They are psycho-physical events. Neither the content nor the timing of these events is determined either by any known law, or by the afore-mentioned random elements. Orthodox quantum mechanics considers these events to be instigated by choices made by conscious agents. In von Neumann’s formulation of quantum theory each such intervention acts upon the state of the brain of some conscious agent. Thus orthodox von Neumann contemporary physics posits an interactive dualism similar to that of Descartes. But in this quantum version the effects of the conscious choices upon our brains are controlled, in part, by the known basic rules of quantum physics. This theoretically specified mind-brain connection allows many basic psychological and neuropsychological findings associated with the apparent physical effectiveness of our conscious volitional efforts to be explained in a causal and practically useful way.
We discuss the content and significance of John von Neumann’s quantum ergodic theorem (QET) of 1929, a strong result arising from the mere mathematical structure of quantum mechanics. The QET is a precise formulation of what we call normal typicality, i.e., the statement that, for typical large systems, every initial wave function ψ0 from an energy shell is “normal”: it evolves in such a way that |ψt⟩⟨ψt| is, for most t, macroscopically equivalent to the micro-canonical density matrix. The QET has been mostly forgotten after it was criticized as a dynamically vacuous statement in several papers in the 1950s. However, we point out that this criticism does not apply to the actual QET, a correct statement of which does not appear in these papers, but to a different (indeed weaker) statement. Furthermore, we formulate a stronger statement of normal typicality, based on the observation that the bound on the deviations from the average specified by von Neumann is unnecessarily coarse and a much tighter (and more relevant) bound actually follows from his proof.
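The normal-typicality statement summarized above lends itself to a compact rendering. The following is our own hedged sketch, not a formula taken from the paper; P_ν denotes the projection onto the macro-space H_ν in a decomposition of the energy shell, and ρ_mc the micro-canonical density matrix.

```latex
% Hedged sketch of "normal typicality" as described in the abstract (notation ours).
% The energy shell H_mc is decomposed into macro-spaces H_nu with projections P_nu.
\[
  \text{For typical decompositions } \mathcal{H}_{\mathrm{mc}} = \bigoplus_{\nu} \mathcal{H}_{\nu}
  \ \text{and every } \psi_0 \in \mathcal{H}_{\mathrm{mc}}:\qquad
  \langle \psi_t \,|\, P_{\nu} \,|\, \psi_t \rangle \;\approx\; \operatorname{tr}\!\left(\rho_{\mathrm{mc}}\, P_{\nu}\right)
  \quad \text{for most times } t .
\]
```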
Describing the methodology of a prominent mathematician can be an over-ambitious task, especially if the mathematician in question has made crucial contributions to almost the whole of mathematical science. John von Neumann’s case study falls within this category. Nonetheless, we can still provide a clear picture of von Neumann’s methodology of science. Recent literature has clarified its key feature—the opportunistic approach to axiomatics—and has laid out its main principles. To be honest, this work can hardly be superseded. What I would like to do is to complete the picture by adding one more step and emphasizing a point so far neglected, namely the role of Hilbert’s ideal in von Neumann’s epistemology. Von ..
Around 1989, a striking letter written in March 1956 from Kurt Gödel to John von Neumann came to light. It poses some problems about the complexity of algorithms; in particular, it asks a question that can be seen as the first formulation of the P=?NP question. This paper discusses some of the background to this letter, including von Neumann's own ideas on complexity theory. Von Neumann had already raised explicit questions about the complexity of Tarski's decision procedure for elementary algebra and geometry in a letter of 1949 to J. C. C. McKinsey. The paper concludes with a discussion of why theoretical computer science did not emerge as a separate discipline until the 1960s.
Many works intended to introduce interpretive issues in quantum mechanics present John von Neumann as having a view in which measurement produces a physical collapse in the system being measured. In this paper I argue that such a reading of von Neumann is inconsistent with what von Neumann actually says. I show that much of what he says makes no sense on the physical collapse reading, but falls into place if we assume he does not have such a view. I show that the physical collapse view is based on an understanding of ‘state’ which von Neumann does not share. Outline: Introduction; The standard reading of von Neumann; The standard reading of von Neumann and Chapter VI; The Chapter VI argument; The Chapter V argument; The Chapters III and IV argument; Conclusion.
In this paper, I shall discuss the heuristic role of symmetry in the mathematical formulation of quantum mechanics. I shall first set out the scene in terms of Bas van Fraassen’s elegant presentation of how symmetry principles can be used as problem-solving devices (see van Fraassen  and ). I will then examine in what ways Hermann Weyl and John von Neumann have used symmetry principles in their work as a crucial problem-solving tool. Finally, I shall explore one consequence of this situation to recent debates about structural realism (SR) and empiricism in physics (Worrall , Ladyman , and French ).
We extend the topos-theoretic treatment given in previous papers of assigning values to quantities in quantum theory, and of related issues such as the Kochen-Specker theorem. This extension has two main parts: the use of von Neumann algebras as a base category (Section 2); and the relation of our generalized valuations to (i) the assignment to quantities of intervals of real numbers, and (ii) the idea of a subobject of the coarse-graining presheaf (Section 3).
In the paper it is shown that every physically sound Birkhoff–von Neumann quantum logic, i.e., an orthomodular partially ordered set with an ordering set of probability measures, can be treated as a partial infinite-valued Łukasiewicz logic, which unifies two competing approaches: the many-valued, and the two-valued but non-distributive, which have co-existed in the quantum logic theory since its very beginning.
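For readers unfamiliar with the terminology, the orthomodular law satisfied by such a partially ordered set can be written as follows; this is a standard textbook formulation, not a formula quoted from the paper.

```latex
% Orthomodular law for a bounded orthoposet: a, b are elements, a^\perp the orthocomplement of a.
\[
  a \le b \;\Longrightarrow\; b = a \vee \left(a^{\perp} \wedge b\right).
\]
```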
Von Neumann (1932, Ch. 5) argued by means of a thought experiment involving measurements of spin observables that the quantum mechanical quantity −k tr(ρ log ρ) is conceptually equivalent to thermodynamic entropy. We analyze Von Neumann's thought experiment and show that his argument fails. Over the past few years there has been a dispute in the literature regarding the Von Neumann entropy. It turns out that each contribution to this dispute (Shenker 1999, Henderson 2001, Hemmo 2003) addressed a different special case. In this paper we generalize the discussion and examine the full matrix of possibilities that are relevant for the evaluation and understanding of Von Neumann’s argument.
Much of the recent discussion of problematic aspects of quantum-mechanical measurement centers around that feature of quantum theory which is called "the projection postulate." This is roughly the claim that a change of a certain sort occurs in the state of a physical system when a measurement is made on the system. In this paper an argument for the projection postulate due to von Neumann is considered. Attention is focused on trying to provide an understanding of the notion of "the state of a physical system" which is compatible with the argument von Neumann offers. An attempt is made to formulate the argument in terms of an objectivistic interpretation of probability concepts. It is seen that such an interpretation does not provide a suitable way of understanding the argument. An attempt is made to illustrate the source of this failure in terms of a non-quantum-mechanical example.
Shenker has claimed that Von Neumann's argument for identifying the quantum mechanical entropy with the Von Neumann entropy, S(ρ) = −k tr(ρ log ρ), is invalid. Her claim rests on a misunderstanding of the idea of a quantum mechanical pure state. I demonstrate this, and provide a further explanation of Von Neumann's argument.
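As a purely numerical illustration of the quantity at issue (with the constant k set to 1), the following sketch computes S(ρ) = −k tr(ρ log ρ) for a pure and for a maximally mixed qubit state; it illustrates the formula only, not either side of the dispute.

```python
import numpy as np

def von_neumann_entropy(rho, k=1.0):
    """S(rho) = -k tr(rho log rho), computed from the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]          # the convention 0 log 0 = 0
    return -k * float(np.sum(eigvals * np.log(eigvals)))

pure = np.array([[1.0, 0.0], [0.0, 0.0]])       # |0><0|, a pure state
mixed = np.array([[0.5, 0.0], [0.0, 0.5]])      # maximally mixed qubit state

print(von_neumann_entropy(pure))    # 0.0 for a pure state
print(von_neumann_entropy(mixed))   # log 2, about 0.693, for the mixture
```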
We present an axiomatic framework for nonstandard analysis, the Nonstandard Class Theory (NCT), which extends von Neumann-Gödel-Bernays Set Theory (NBG) by adding a unary predicate symbol St to the language of NBG (St(X) means that the class X is standard) and axioms related to it: analogs of Nelson's idealization, standardization and transfer principles. Those principles are formulated as axioms, rather than axiom schemes, so that NCT is finitely axiomatizable. NCT can be considered as a theory of definable classes of Bounded Set Theory by V. Kanovei and M. Reeken. In many aspects NCT resembles the Alternative Set Theory by P. Vopenka. For example there exist semisets (proper subclasses of sets) in NCT and it can be proved that a set has a standard finite cardinality iff it does not contain any proper subsemiset. Semisets can be considered as external classes in NCT. Thus the saturation principle can be formalized in NCT.
This paper offers a modified version of the certainty equivalence (CE) theory of utility for uncertain prospects and a new set of axioms as its basis. It shows that the CE and the von Neumann-Morgenstern (NM) approaches to uncertainty are opposite in spirit: The CE approach represents a flight from the world of uncertainty to the rules of certainty while the NM approach represents a flight from the world of certainty to one of uncertainty. The two approaches differ even in their treatment of compound prospects and their actuarially identical simple counterparts.
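The contrast over compound prospects can be made concrete with a toy example of our own (the payoffs and the square-root utility are illustrative assumptions, not drawn from the paper): under the NM reduction-of-compound-lotteries axiom, a compound prospect and its actuarially identical simple counterpart receive the same expected utility.

```python
import math

def expected_utility(lottery, u):
    """lottery: list of (probability, outcome) pairs; u: utility function."""
    return sum(p * u(x) for p, x in lottery)

u = math.sqrt  # an arbitrary concave (risk-averse) utility function

# Compound prospect: 50% chance of the sub-lottery (50% of 100, else 0), else 0.
# Its actuarially identical simple counterpart: 25% of 100, 75% of 0.
sub_lottery = [(0.5, 100.0), (0.5, 0.0)]
reduced     = [(0.25, 100.0), (0.75, 0.0)]

eu_sub      = expected_utility(sub_lottery, u)
eu_compound = 0.5 * eu_sub + 0.5 * u(0.0)      # NM value of the compound prospect
eu_reduced  = expected_utility(reduced, u)

print(eu_compound, eu_reduced)  # identical under the NM reduction axiom: 2.5, 2.5
```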
An information completion of an extensive game is obtained by extending the information partition of every player from the set of her decision nodes to the set of all nodes. The extended partition satisfies Memory of Past Knowledge (MPK) if at any node a player remembers what she knew at earlier nodes. It is shown that MPK can be satisfied in a game if and only if the game is von Neumann (vN) and satisfies memory at decision nodes (the restriction of MPK to a player's own decision nodes). A game is vN if any two decision nodes that belong to the same information set of a player have the same number of predecessors. By providing an axiom for MPK we also obtain a syntactic characterization of the said class of vN games.
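The quoted definition of a von Neumann game suggests a simple mechanical check. The sketch below is our own construction, using a hypothetical parent-map representation of the game tree; it tests whether every information set contains only nodes with the same number of predecessors.

```python
def depth(node, parent):
    """Number of predecessors of a node, given a parent map (the root maps to None)."""
    d = 0
    while parent[node] is not None:
        node = parent[node]
        d += 1
    return d

def is_von_neumann(information_sets, parent):
    """True iff every information set contains only nodes with equal depth."""
    return all(
        len({depth(n, parent) for n in info_set}) == 1
        for info_set in information_sets
    )

# Tiny example tree: root 'r' with children 'a' and 'b'; 'a' has child 'c'.
parent = {'r': None, 'a': 'r', 'b': 'r', 'c': 'a'}
print(is_von_neumann([{'a', 'b'}], parent))  # True: both nodes have 1 predecessor
print(is_von_neumann([{'b', 'c'}], parent))  # False: depths 1 and 2 differ
```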
We announce some new results regarding the classification problem for separable von Neumann algebras. Our results are obtained by applying the notion of Borel reducibility and Hjorth's theory of turbulence to the isomorphism relation for separable von Neumann algebras.
Sequential von Neumann–Morgenstern (VM) games are a very general formalism for representing multi-agent interactions and planning problems in a variety of types of environments. We show that sequential VM games with countably many actions and continuous utility functions have a sound and complete axiomatization in the situation calculus. This axiomatization allows us to represent game-theoretic reasoning and solution concepts such as Nash equilibrium. We discuss the application of various concepts from VM game theory to the theory of planning and multi-agent interactions, such as representing concurrent actions and using the Baire topology to define continuous payoff functions.
This article compares Alexander von Humboldt's and John Ruskin's writings on landscape art and natural landscape. In particular, Humboldt's conception of a habitat's essence as predominantly composed of vegetation, as well as his judgment of tropical American nature as the realm of nature of the highest aesthetic enjoyment, is examined in the context of Ruskin's aesthetic theory. The magnitude of Humboldt's contribution to the natural sciences seems to have clouded our appreciation of his prominent status in the field of art history. In addition to his position as scientist, Humboldt's role as aesthetician is demonstrated in this paper. Unlike Ruskin, who comfortably resides in the canon of art history relative to his minor significance in the field of geology, Humboldt has not been recognized for his impact on the world of art; his tremendous scientific importance seems to have overshadowed an appreciation of it.
In the centenary year of Turing’s birth, a lot of good things are sure to be written about him. But it is hard to find something new to write about Turing. This is the biggest merit of this article: it shows how von Neumann’s architecture of the modern computer is a serendipitous consequence of the universal Turing machine, built to solve a logical problem.
In assessing the veridicality of utterances, we normally seem to assess the satisfaction of conditions that the speaker had been concerned to get right in making the utterance. However, the debate about assessor-relativism about epistemic modals, predicates of taste, gradable adjectives and conditionals has been largely driven by cases in which seemingly felicitous assessments of utterances are insensitive to aspects of the context of utterance that were highly relevant to the speaker’s choice of words. In this paper, we offer an explanation of why certain locutions invite insensitive assessments, focusing primarily on ’tasty’ and ’might’. We spell out some reasons why felicitous insensitive assessments are puzzling and argue briefly that recent attempts to accommodate such assessments (including attempts by John MacFarlane, Kai von Fintel and Anthony Gillies) all fail to provide more than hints at a solution to the puzzle. In the main part of the paper, we develop an account of felicitous insensitive assessments by identifying a number of pragmatic factors that influence the felicity of assessments. Before closing, we argue that the role of these factors extends beyond cases considered in the debate about assessor-relativism and fits comfortably with standard contextualist analyses of the relevant locutions.
A criterion for the existence of human free will is specified: a human action is asserted to be a manifestation of human free will if this action is a specific physical action that is experienced as being consciously chosen and willed to occur by a human agent, and is not determined within physical theory either in terms of the physically described aspects of nature or by any non-human agency. This criterion is tied to the structure of a physical theory. It is noted that the orthodox quantum mechanics that flows from John von Neumann’s analysis of the process of measurement in quantum theory is described in terms of three processes that are effectively based on a three-level conception of reality. Von Neumann’s “Process 2” is the deterministic evolution, via the Schroedinger equation, of a physically described aspect of reality, the quantum state. His “Process 1” is the physically described aspect of a psychophysical probing action whose psychologically described aspect is an increment in the knowledge of a probing agent/observer. Process 3, in Dirac’s words, is “a choice on the part of nature” of the response to such a probing action. It is argued here that all three levels of this quantum structure, the physically described quantum state, the probing knowledge-acquiring agents, and the response-choosing nature, are all best conceived as idea-like in character. Quantum mechanics, though puzzling when viewed from the inappropriate perspective of the mechanistic classical physics, becomes rationally coherent when the underlying reality is conceived to be not a physically described classical monism, but rather an idea-based quantum triality. This idea-based conception of reality evades the pitfalls of non-physics-based idealism by being erected directly upon the basic concepts of pragmatic empirically validated quantum mechanics. However, the dynamical structure of quantum theory contains certain causal gaps.
The volumes of Gödel’s collected papers under review consist almost entirely of a rich selection of his philosophical/scientific correspondence, including English translations face-to-face with the originals when the latter are in German. The residue consists of correspondence with editors (more amusing than of any scientific value) and five letters from Gödel to his mother, in which he explains to her his religious views. The term “selection” is strongly operative here: The editors state the total number of items of personal and scientific correspondence in Gödel’s Nachlass to be around thirty-five hundred. The correspondence selected involves fifty correspondents, and the editors list the most prominent of these: Paul Bernays, William Boone, Rudolf Carnap, Paul Cohen, Burton Dreben, Jacques Herbrand, Arend Heyting, Karl Menger, Ernest Nagel, Emil Post, Abraham Robinson, Alfred Tarski, Stanislaw Ulam, John von Neumann, Hao Wang, and Ernst Zermelo. The correspondence is arranged alphabetically, with A-G in Volume IV. The imbalance results from the disproportionate size of the Bernays correspondence: 85 letters are included (almost all of them), spanning 234 pages (including the face-to-face originals and translations). Each volume contains a calendar of all the items included in the volume together with separate calendars listing all known correspondence (whether included or not) with the major correspondents (seven in Volume IV and ten in Volume V). Let me recommend to the reader the review of these same volumes by Paolo Mancosu in the Notre Dame Journal of Formal Logic 45 (2004): 109-125. This essay very nicely describes much of the correspondence in terms of broad themes relating, especially, to the incompleteness theorems—their origins in Gödel’s thought, their reception, their impact on Hilbert’s program.
In the early 1920s, the German mathematician David Hilbert (1862-1943) put forward a new proposal for the foundation of classical mathematics which has come to be known as Hilbert's Program. It calls for a formalization of all of mathematics in axiomatic form, together with a proof that this axiomatization of mathematics is consistent. The consistency proof itself was to be carried out using only what Hilbert called "finitary" methods. The special epistemological character of finitary reasoning then yields the required justification of classical mathematics. Although Hilbert proposed his program in this form only in 1921, various facets of it are rooted in foundational work of his going back until around 1900, when he first pointed out the necessity of giving a direct consistency proof of analysis. Work on the program progressed significantly in the 1920s with contributions from logicians such as Paul Bernays, Wilhelm Ackermann, John von Neumann, and Jacques Herbrand. It was also a great influence on Kurt Gödel, whose work on the incompleteness theorems was motivated by Hilbert's Program. Gödel's work is generally taken to show that Hilbert's Program cannot be carried out. It has nevertheless continued to be an influential position in the philosophy of mathematics, and, starting with the work of Gerhard Gentzen in the 1930s, work on so-called Relativized Hilbert Programs has been central to the development of proof theory.
Orthodox Copenhagen quantum theory renounces the quest to understand the reality in which we are imbedded, and settles for practical rules that describe connections between our observations. Many physicists have believed that this renunciation of the attempt to describe nature herself was premature, and John von Neumann, in a major work, reformulated quantum theory as a theory of the evolving objective universe. In the course of his work he converted to a benefit what had appeared to be a severe deficiency of the Copenhagen interpretation, namely its introduction into physical theory of the human observers. He used this subjective element of quantum theory to achieve a significant advance on the main problem in philosophy, which is to understand the relationship between mind and matter. That problem had been tied closely to physical theory by the works of Newton and Descartes. The present work examines the major problems that have appeared to block the development of von Neumann’s theory into a fully satisfactory theory of Nature, and proposes solutions to these problems.
One account of the history of computation might begin in the 1930’s with some of the work of Alonzo Church, Alan Turing, and Emil Post. One might say that this is where something like the core concept of computation was first formally articulated. Here were the first attempts to formalize an informal notion of an algorithm or effective procedure by which a mathematician might decide one or another logico-mathematical question. As each of these formalisms was shown to compute the same set of functions—the partial recursive functions—each of them might be described as a form of Turing-equivalent computation. This work set the cornerstone for what we might call computation theory. This history might then proceed to give pride of place to this form of computation in subsequent developments in cognitive science and in related disciplines and subdisciplines. Such a history might note that, in the 1940’s, the results of this work would have been transferred into the emerging field of computer science with the design and construction of the first electronic digital computers. Here one would mention Turing again, as well as perhaps Norbert Wiener, Julian Bigelow, John von Neumann, and many others. At about the same time, this theory of computation would have been inserted into the theory of neural networks by way of Warren McCulloch and Walter Pitts’s seminal work, “A Logical Calculus of the Ideas Immanent in Nervous Activity.” Somewhat later, during the 1960’s, Hilary Putnam introduced Turing machine tables into the philosophy of mind as a tool for illuminating various features of the mind-body problem, eventually transforming the intellectual landscape in the metaphysics of mind. Also during the 1960’s, Turing-equivalent computation would have infiltrated psychology through the influence of Chomskyan linguistics and under the rubric of information processing psychology. Further, such computation would have been integrated into the fields of cognitive science and neuroscience as they emerged during the 1970’s and 1980’s.
Two hundred years ago, Friedrich Schleiermacher took critical issue with Immanuel Kant's intellectual notion of intuition as applied to human nature (Wellmon 2006). He found it necessary to modify—"hermeneutically," as he said—Kant's notion of anthropology by enabling it to include as human the new and strange human tribes Captain Cook found in the Pacific South Seas. A similar hermeneutic move is necessary if physics is to include the local contextual empirical syntheses of relativity and quantum physics. In this hermeneutical revision the synthesis is formed around the notion of a Hilbert Vector Space as the universal grammar of physics, adding to it the dynamic of the Schrödinger equation, and representing empirical "observables" by projection operators that map the subspaces of definite measurable values. Among the set of observable projection operators, some pairs share the same subspace, commute with one another, and share a common laboratory setting. Other pairs do not share this property and are described as being mutually complementary. Complementary symmetries introduce into the discursive language of physics the commonsense notion of contextuality. The new synthesis, proposed by Eugene Wigner, John von Neumann, and (in his own way) Paul Dirac, brought physics into the community of common language and established it as a work of general human achievement.
“Anyone who considers arithmetical methods of producing random digits is, of course, in a state of sin.”1 John von Neumann’s famous dictum points an accusing finger at all who set their ordered minds to engender disorder. Much as in times past thieves, pimps, and actors carried on their profession with an uneasy conscience, so in this day scientists who devise random number generators suffer pangs of guilt. George Marsaglia, perhaps the preeminent worker in the field, quips when he asks his colleagues, “Who among us has not sinned?” Marsaglia’s work at the Supercomputer Computations Research Institute at Florida State University is well-known. Inasmuch as Marsaglia’s design and testing of random number generators depends on computation, and inasmuch as computation is fundamentally arithmetical, Marsaglia is by von Neumann’s own account a sinner. Working as he does on a supercomputer, Marsaglia is in fact a gross sinner. This he freely admits. Writing of the best random number generators he is aware of, Marsaglia states, “they are the result of arithmetic methods and those using them must, as all sinners must, face Redemption [sic] Day. But perhaps with better understanding we can postpone it.”
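A concrete instance of the "arithmetical methods" the dictum targets is von Neumann's own middle-square generator. The following is a minimal sketch of that method (parameter names and the seed are ours); being fully deterministic, it illustrates exactly the kind of "sin" at issue.

```python
def middle_square(seed, n_digits=4):
    """Von Neumann's middle-square method: square the current value and take
    its middle digits as the next 'random' number (and as the next state)."""
    x = seed
    while True:
        squared = str(x * x).zfill(2 * n_digits)     # pad to a fixed width
        start = (len(squared) - n_digits) // 2
        x = int(squared[start:start + n_digits])     # keep the middle digits
        yield x

gen = middle_square(1234)
print([next(gen) for _ in range(5)])  # a fixed seed always gives the same sequence
```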
By quoting extensively from unpublished letters written by John von Neumann to Garrett Birkhoff during the preparatory phase (in 1935) of their ground-breaking 1936 paper that established quantum logic, the main steps in the thought process leading to the 1936 Birkhoff–von Neumann paper are reconstructed. The reconstruction makes it clear why Birkhoff and von Neumann rejected the notion of quantum logic as the projection lattice of an infinite dimensional complex Hilbert space and why they postulated in their 1936 paper that the quantum propositional system should be isomorphic to an abstract projective geometry. Looking at the paper now I see, that I forgot to say this, which should be said somewhere in the first §: That while common logics did apply to quantum mechanics, if the notion of simultaneous measurability is introduced as an auxiliary notion, we wished to construct a logical system, which applies directly to quantum mechanics – without any extraneous secondary notions like simultaneous measurability. And in order to have such a consequent, one-piece system of logics, we must change the classical class calculus of logics. (J. von Neumann to G. Birkhoff, November 21, 1935).
The unprecedented opportunities for experiments in complexity presented by the first modern computers in the late 1940's raised hopes in early computer scientists (e.g. John von Neumann and Alan Turing) that the ability to think, our greatest asset in our dealings with the world, might soon be understood well enough to be duplicated. Success in such an endeavor would extend mankind's mind in the same way that the development of energy machinery extended his muscles.
Almost anyone seriously interested in decision theory will name John von Neumann's (1928) Minimax Theorem as its foundation, whereas Utility and Rationality are imagined to be the twin towers on which the theory rests. Yet, experimental results and real-life observations seldom support that expectation. Over two centuries ago, Hume (1739–40/1978) put his finger on the discrepancy. “Reason,” he wrote, “is, and ought to be the slave of passions, and can never pretend to any other office than to serve and obey them.” In other words, effective means to reach specific goals can be prescribed, but not the goals. A wide range of experimental results and daily life behavior support this dictum.
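For reference, the 1928 theorem can be stated in its standard mixed-strategy form; this is our formulation, not a quotation, with Δ_m and Δ_n the players' mixed-strategy simplices and A the payoff matrix of a finite two-player zero-sum game.

```latex
% Von Neumann's Minimax Theorem (standard mixed-strategy statement, notation ours):
% for any real m x n payoff matrix A,
\[
  \max_{x \in \Delta_m} \; \min_{y \in \Delta_n} \; x^{\mathsf{T}} A\, y
  \;=\;
  \min_{y \in \Delta_n} \; \max_{x \in \Delta_m} \; x^{\mathsf{T}} A\, y .
\]
```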
When John von Neumann turned his interest to computers, he was one of the leading mathematicians of his time. In the 1940s, he helped design two of the first stored-program digital electronic computers. He authored reports explaining the functional organization of modern computers for the first time, thereby influencing their construction worldwide (von Neumann, 1945; Burks et al., 1946). In the first of these reports, von Neumann described the computer as analogous to a brain, with an input “organ” (analogous to sensory neurons), a memory, an arithmetical and a logical “organ” (analogous to associative neurons), and an output “organ” (analogous to motor neurons). His experience with computers convinced him that brains and computers, both having to do with the processing of information, should be studied by a new discipline–automata theory. In fact, according to von Neumann, automata theory would cover not only computers and brains, but also any biological or artificial systems that dealt with information and control, including robots and genes. Von Neumann never formulated a full-blown mathematical theory of automata, but he wrote several important exploratory papers (von Neumann, 1951, 1956, 1966). Meanwhile, besides designing hardware, he developed some of the first programs, programming languages, programming techniques, and numerical methods for solving mathematical problems using computers. (Much of his work on computing is reprinted in Aspray and Burks, 1987.) Shortly before his death in 1957, he wrote an informal synthesis of his views about brains. Though von Neumann left his manuscript sketchy and unfinished, Yale University Press published it as The Computer and the Brain in 1958. The 2000 reprint of this small but informative book is an opportunity to learn, or be reminded of, von Neumann’s thoughts on the computational organization of the mind-brain. Von Neumann began by explaining computers, which for him were essentially number crunchers: to compute was “to operate on ..
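The organization described in those reports, with a single memory holding both instructions and data, an arithmetical organ, and input and output, can be suggested by a toy fetch-decode-execute loop. This is an illustrative sketch of a stored-program machine only, not von Neumann's own formalism or instruction set.

```python
def run(memory):
    """Toy stored-program machine: memory holds both instructions and data.
    Instructions: ('LOAD', addr), ('ADD', addr), ('STORE', addr), ('HALT',)."""
    acc, pc = 0, 0                      # accumulator ("arithmetical organ") and program counter
    while True:
        op, *args = memory[pc]          # fetch and decode the next instruction
        pc += 1
        if op == 'LOAD':
            acc = memory[args[0]]
        elif op == 'ADD':
            acc += memory[args[0]]
        elif op == 'STORE':
            memory[args[0]] = acc
        elif op == 'HALT':
            return memory

# Program at addresses 0-3 adds the data at addresses 5 and 6 into address 7.
memory = {0: ('LOAD', 5), 1: ('ADD', 6), 2: ('STORE', 7), 3: ('HALT',),
          5: 2, 6: 3, 7: 0}
print(run(memory)[7])  # 5
```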
I must start with an apologia. My original paper, ``Minds, Machines and Gödel'', was written in the wake of Turing's 1950 paper in Mind, and was intended to show that minds were not Turing machines. Why, then, didn't I couch the argument in terms of Turing's theorem, which is easyish to prove and applies directly to Turing machines, instead of Gödel's theorem, which is horrendously difficult to prove, and doesn't so naturally or obviously apply to machines? The reason was that Gödel's theorem gave me something more: it raises questions of truth which evidently bear on the nature of mind, whereas Turing's theorem does not; it shows not only that the Gödelian well-formed formula is unprovable-in-the-system, but that it is true. It shows something about reasoning, that it is not completely rule-bound, so that we, who are rational, can transcend the rules of any particular logistic system, and construe the Gödelian well-formed formula not just as a string of symbols but as a proposition which is true. Turing's theorem might well be applied to a computer which someone claimed to represent a human mind, but it is not so obvious that what the computer could not do, the mind could. But it is very obvious that we have a concept of truth. Even if, as was claimed in a previous paper, it is not the summum bonum, it is a bonum, and one it is characteristic of minds to value. A representation of the human mind which could take no account of truth would be inherently implausible. Turing's theorem, though making the same negative point as Gödel's theorem, that some things cannot be done by even idealised computers, does not make the further positive point that we, in as much as we are rational agents, can do that very thing that the computer cannot. I have however, sometimes wondered whether I could not construct a parallel argument based on Turing's theorem, and have toyed with the idea of a von Neumann machine. A von Neumann machine was a black box, inside which was housed John von Neumann..
This article is my latest attempt to come up with a minimal version of my evolutionary theory of fairness (Binmore [11, 10, 8, 9]). The naturalism that I espouse is currently unpopular, but Figure 1 shows that the scientific tradition in moral philosophy nevertheless has a long and distinguished history. John Mackie  has been its most eloquent spokesman in modern times. His demolition of the claims made for a priori reasoning in moral philosophy seems unanswerable to me. In Mackie’s view, human morality is an artefact of our evolutionary history. To study it, he tells us to look at the anthropological facts presented in such pioneering studies as Westermarck’s  Origin and Development of the Moral Ideas. And for a framework within which to make sense of such anthropological data, he directs our attention to Von Neumann’s theory of games.
In 1955, John Harsanyi proved a remarkable theorem:1 Suppose n agents satisfy the assumptions of von Neumann/Morgenstern (1947) expected utility theory, and so does the group as a whole (or an observer). Suppose that, if each member of the group prefers option a to b, then so does the group, or the observer (Pareto condition). Then the group’s utility function is a weighted sum of the individual utility functions. Despite Harsanyi’s insistence that what he calls the Utilitarian Theorem embeds utilitarianism into a theory of rationality, the theorem has fallen short of having the kind of impact on the discussion of utilitarianism for which Harsanyi hoped. Yet how could the theorem influence this discussion? Utilitarianism is as attractive to some as it is appalling to others. The prospects for this dispute to be affected by a theorem seem dim. Yet a closer look shows how the theorem could make a contribution. To fix ideas, I understand by utilitarianism the following claims: (1) Consequentialism: Actions are evaluated in terms of their consequences only. (2) Bayesianism: An agent's beliefs about possible outcomes are captured probabilistically. (3) Welfarism: The judgement of the relative goodness of states of affairs is based..
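The conclusion Harsanyi derives can be written compactly; the following is one common rendering under the stated assumptions (notation ours), with U_g the utility function of the group or observer and U_i the individual vNM utility functions.

```latex
% Harsanyi's aggregation theorem (conclusion, as summarized above; notation ours):
% under vNM rationality of the n individuals and of the group/observer, plus the
% Pareto condition, there are weights w_i >= 0 and a constant c such that
\[
  U_g \;=\; \sum_{i=1}^{n} w_i\, U_i \;+\; c ,
\]
% i.e. the group utility is (up to an additive constant) a weighted sum of the
% individual utilities.
```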
During the past two decades non-cooperative game theory has become a central topic in economic theory. Many scholars have contributed to this revolution, none more than John Nash. Following the publication of von Neumann and Morgenstern's book, it was Nash's papers in the early fifties which pointed the way for future research in game theory. The notion of Nash equilibrium is indispensable. Nash's formulation of the bargaining problem and the Nash bargaining solution constitute the cornerstone of modern bargaining theory. His insights into the non-cooperative foundations of cooperative game theory initiated an area of research known as the Nash program. Nash's analysis of the demand game in which he uses a perturbation of a game to select an equilibrium inspired the construction of several refinements of the notion of Nash equilibrium. A scholar's influence does not necessarily qualify him for a Nobel prize. One may argue that such awards are a social institution established to serve social goals. It is legitimate to ask what message the Swedish Academy sends to the scientific community and the rest of the world.
Constructor theory seeks to express all fundamental scientific theories in terms of a dichotomy between possible and impossible physical transformations–those that can be caused to happen and those that cannot. This is a departure from the prevailing conception of fundamental physics which is to predict what will happen from initial conditions and laws of motion. Several converging motivations for expecting constructor theory to be a fundamental branch of physics are discussed. Some principles of the theory are suggested and its potential for solving various problems and achieving various unifications is explored. These include providing a theory of information underlying classical and quantum information; generalising the theory of computation to include all physical transformations; unifying formal statements of conservation laws with the stronger operational ones (such as the ruling-out of perpetual motion machines); expressing the principles of testability and of the computability of nature (currently deemed methodological and metaphysical respectively) as laws of physics; allowing exact statements of emergent laws (such as the second law of thermodynamics); and expressing certain apparently anthropocentric attributes such as knowledge in physical terms.
This is the standard edition of John Locke's classic work of the early 1660s, Essays on the Law of Nature. Also included are selected shorter philosophical writings from the same decade. In his 1664 valedictory speech as Censor of Moral Philosophy at Christ Church, Oxford, Locke discusses the question: Can anyone by nature be happy in this life? The volume is completed by selections from Locke's manuscript journals, unpublished elsewhere: on translating Nicole's Essais de Morale; on spelling; on extension; on idolatry; on pleasure and pain; and on faith and reason. The great Locke scholar W. von Leyden introduces each of these works, setting them in their historical context. This volume is an invaluable source for Locke's early thought, of interest to philosophers, political theorists, jurists, theologians, and historians.
In 1911, Drs John Freeman and Leonard Noon published an account of a novel treatment for hay fever. Their method of desensitisation consisted of injecting increasing doses of an extract of pollen subcutaneously until the hypersensitivity reaction was diminished or abolished. Over subsequent decades, desensitisation established itself as the cornerstone of clinical allergy in both England and the United States, at least until the advent of novel pharmaceutical agents in the 1950s and 1960s. Although British allergists such as Noon and Freeman were aware of conceptual developments within European immunology and pathology (such as the identification of anaphylaxis by Richet and Portier or von Pirquet's coining of the term allergy), their approach to hay fever was driven by more immediate pragmatic, and indeed financial, considerations. Freeman's immersion in the problems of hay fever and asthma and his pioneering use of allergen desensitisation or immunotherapy were shaped by his adherence to the convictions and bacteriological practices of his principal at St Mary's Hospital, Almroth Wright, and by the drive to produce commercial vaccines which would help to subsidise the experimental and therapeutic work at St Mary's. The aim of this paper is to explore early twentieth-century approaches to hay fever and other allergic diseases by tracing the intellectual and institutional origins of clinical allergy in Britain.