Critical pedagogy has often been linked in the literature to faith traditions such as liberation theology, usually with the intent of improving or redirecting it. While recognizing and drawing from those previous linkages, Jacob Neumann goes further in this essay and develops the thesis that critical pedagogy not only can benefit from a connection with faith traditions but is actually, in and of itself, a practice of faith. In this analysis, he juxtaposes critical pedagogy against three conceptualizations of faith: John Caputo's blurring of the modernist division between faith and reason, Paul Tillich's argument that faith is “ultimate concern,” and Paulo Freire's theology and early Christian influences. Using this three-pronged approach, Neumann argues that regardless of how it is seen, critical pedagogy manifests as a practice of faith “all the way down.”
The renewed interest in the foundations of quantum statistical mechanics in recent years has led us to study John von Neumann’s 1929 article on the quantum ergodic theorem. We have found this almost forgotten article, which until now has been available only in German, to be a treasure chest, and to be much misunderstood. In it, von Neumann studied the long-time behavior of macroscopic quantum systems. While one of the two theorems announced in his title, the one he calls the “quantum H-theorem,” is actually a much weaker statement than Boltzmann’s classical H-theorem, the other theorem, which he calls the “quantum ergodic theorem,” is a beautiful and very non-trivial result. It expresses a fact we call “normal typicality” and can be summarized as follows: For a “typical” finite family of commuting macroscopic observables, every initial wave function ψ0 from a micro-canonical energy shell evolves in such a way that, for most times t in the long run, the joint probability distribution of these observables obtained from ψt is close to their micro-canonical distribution.
Applied mathematics often operates by way of shakily rationalized expedients that can be understood neither in a deductive-nomological nor in an anti-realist setting. Rather, these complexities, so a recent paper of Mark Wilson argues, indicate some element in our mathematical descriptions that is alien to the physical world. In this vein the mathematical opportunist openly seeks or engineers appropriate conditions for mathematics to get a hold on a given problem. Honest mathematical optimists, instead, try to liberalize mathematical ontology so as to include all physical solutions. Following John von Neumann, the present paper argues that the axiomatization of a scientific theory can be performed in a rather opportunistic fashion, such that optimism and opportunism appear as two modes of a single strategy whose relative weight is determined by the status of the field to be investigated. Wilson's promising approach may thus be reformulated so as to avoid precarious talk about a physical world that is void of mathematical structure. This also makes the appraisal of the axiomatic method in applied mathematics less dependent upon foundationalist issues.
The evolution of John von Neumann's scientific interests and a study of his writings show that von Neumann increasingly supported an empirical, computational method. This is in stark contrast with the extant view of von Neumann as a pure theorist.
As interpreted by Pattee, von Neumann’s Theory of Self-Reproducing Automata has proved to be a useful tool for understanding some of the difficulties and paradoxes of molecular biosemiotics. But is its utility limited to molecular systems or is it more generally applicable within biosemiotics? One way of answering that question is to look at the Theory as a model for one particular high-level biosemiotic activity, human language. If the model is not useful for language, then it certainly cannot be generally useful to biosemiotics. Beginning with the Universal Turing Machine and continuing with von Neumann’s Theory and Pattee’s interpretation, the properties of universality, programmability, underspecification, complementarity of description/construction, and open-ended evolutionary potential are shown to be usefully applicable to language, thus opening a new line of inquiry in biosemiotics.
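As a loose illustration (not from the paper) of the description/construction complementarity the abstract mentions, a quine plays the role of a system that contains both a description of itself and the machinery to construct a copy from that description; the snippet below is a standard Python example, included only as an analogy.

```python
# A minimal Python quine: the two executable lines below reproduce themselves
# exactly when run. The string s is the *description*; the print statement is
# the *constructor* that builds the copy from the description, loosely
# mirroring the tape-plus-constructor architecture of von Neumann's automata.
s = 's = %r\nprint(s %% s)'
print(s % s)
```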
Both von Neumann and Wiener were outsiders to biology. Both were inspired by biology and both proposed models and generalizations that proved inspirational for biologists. Around the same time in the 1940s, von Neumann developed the notion of self-reproducing automata and Wiener suggested an explication of teleology using the notion of negative feedback. These efforts were similar in spirit. Both von Neumann and Wiener used mathematical ideas to attack foundational issues in biology, and the concepts they articulated had a lasting effect. But there were significant differences as well. Von Neumann presented a how-possibly model, which sparked interest among mathematicians and computer scientists, while Wiener collaborated more directly with biologists, and his proposal influenced the philosophy of biology. The two cases illustrate different strategies by which mathematicians, the “professional outsiders” of science, can choose to guide their engagement with biological questions and with the biological community, and they illustrate different kinds of generalizations that mathematization can contribute to biology. The different strategies employed by von Neumann and Wiener and the types of models they constructed may have affected the fate of their ideas, as well as the reputation, in biology, of von Neumann and Wiener themselves.
Since the analysis by John Bell in 1965, the consensus in the literature is that von Neumann’s ‘no hidden variables’ proof fails to exclude any significant class of hidden variables. Bell raised the question whether it could be shown that any hidden variable theory would have to be nonlocal, and in this sense ‘like Bohm’s theory.’ His seminal result provides a positive answer to the question. I argue that Bell’s analysis misconstrues von Neumann’s argument. What von Neumann proved was the impossibility of recovering the quantum probabilities from a hidden variable theory of dispersion-free (deterministic) states in which the quantum observables are represented as the ‘beables’ of the theory, to use Bell’s term. That is, the quantum probabilities could not reflect the distribution of pre-measurement values of beables, but would have to be derived in some other way, e.g., as in Bohm’s theory, where the probabilities are an artefact of a dynamical process that is not in fact a measurement of any beable of the system.
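For orientation, the assumption that Bell singled out in von Neumann's proof is usually stated as an additivity requirement on expectation values (a textbook formulation, not a quotation from the paper):

```latex
% Additivity of expectation values, imposed also on the hypothetical
% dispersion-free states \lambda and also for non-commuting observables:
\langle A + B \rangle_{\lambda} \;=\; \langle A \rangle_{\lambda} + \langle B \rangle_{\lambda}
\qquad \text{even when } [A,B] \neq 0 .
% In a dispersion-free state each expectation value is an eigenvalue, and
% eigenvalues of non-commuting observables need not add in this way; this is
% the step Bell judged physically unmotivated.
```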
For a finite von Neumann algebra factor M, the projections form a modular ortholattice L(M). We show that the equational theory of L(M) coincides with that of some, respectively all, L(ℂ^{n×n}) and is decidable. In contrast, the uniform word problem for the variety generated by all L(ℂ^{n×n}) is shown to be undecidable.
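For readers unfamiliar with the notation, the ortholattice structure on projections is the standard one (included here only to fix terminology, not as part of the result):

```latex
% Lattice operations on the projections of a von Neumann algebra M:
p \wedge q = \text{projection onto } \operatorname{ran}(p) \cap \operatorname{ran}(q), \quad
p \vee q = \text{projection onto } \overline{\operatorname{ran}(p) + \operatorname{ran}(q)}, \quad
p^{\perp} = 1 - p .
% When M is a finite factor these operations satisfy the modular law
% p \le r \;\Longrightarrow\; p \vee (q \wedge r) = (p \vee q) \wedge r,
% which is why L(M) is a modular ortholattice.
```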
René Descartes proposed an interactive dualism that posits an interaction between the mind of a human being and some of the matter located in his or her brain. Isaac Newton subsequently formulated a physical theory based exclusively on the material/physical part of Descartes’ ontology. Newton’s theory enforced the principle of the causal closure of the physical, and the classical physics that grew out of it enforces this same principle. This classical theory purports to give, in principle, a complete deterministic account of the physically described properties of nature, expressed exclusively in terms of these physically described properties themselves. Orthodox contemporary physical theory violates this principle in two separate ways. First, it injects random elements into the dynamics. Second, it allows, and also requires, abrupt probing actions that disrupt the mechanistically described evolution of the physically described systems. These probing actions are called Process 1 interventions by von Neumann. They are psycho-physical events. Neither the content nor the timing of these events is determined either by any known law, or by the afore-mentioned random elements. Orthodox quantum mechanics considers these events to be instigated by choices made by conscious agents. In von Neumann’s formulation of quantum theory each such intervention acts upon the state of the brain of some conscious agent. Thus orthodox von Neumann contemporary physics posits an interactive dualism similar to that of Descartes. But in this quantum version the effects of the conscious choices upon our brains are controlled, in part, by the known basic rules of quantum physics. This theoretically specified mind-brain connection allows many basic psychological and neuropsychological findings associated with the apparent physical effectiveness of our conscious volitional efforts to be explained in a causal and practically useful way.
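Von Neumann's two dynamical processes, to which the abstract alludes, are usually written as follows (a standard textbook formulation, given only for reference):

```latex
% Process 1: the abrupt "probing" intervention associated with measurement,
% for an observable with spectral projections P_n:
\rho \;\longmapsto\; \sum_{n} P_n \, \rho \, P_n .
% Process 2: the continuous, deterministic Schroedinger evolution:
\rho \;\longmapsto\; U_t \, \rho \, U_t^{\dagger}, \qquad U_t = e^{-iHt/\hbar} .
```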
We discuss the content and significance of John von Neumann’s quantum ergodic theorem (QET) of 1929, a strong result arising from the mere mathematical structure of quantum mechanics. The QET is a precise formulation of what we call normal typicality, i.e., the statement that, for typical large systems, every initial wave function ψ0 from an energy shell is “normal”: it evolves in such a way that |ψt⟩⟨ψt| is, for most t, macroscopically equivalent to the micro-canonical density matrix. The QET was mostly forgotten after it was criticized as a dynamically vacuous statement in several papers in the 1950s. However, we point out that this criticism does not apply to the actual QET, a correct statement of which does not appear in these papers, but to a different (indeed weaker) statement. Furthermore, we formulate a stronger statement of normal typicality, based on the observation that the bound on the deviations from the average specified by von Neumann is unnecessarily coarse and that a much tighter (and more relevant) bound actually follows from his proof.
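A schematic statement of normal typicality, paraphrased from the description above (our gloss, not a quotation): decompose the energy shell H into macro-spaces H_ν with projections P_ν and dimensions d_ν, and let D = dim H; then

```latex
% For a "typical" decomposition and every normalized \psi_0 in the shell,
\langle \psi_t | P_\nu | \psi_t \rangle \;\approx\; \frac{d_\nu}{D}
\;=\; \operatorname{tr}\!\bigl(\rho_{\mathrm{mc}} P_\nu\bigr)
\qquad \text{for all } \nu \text{ and most } t ,
% i.e. |\psi_t\rangle\langle\psi_t| is macroscopically equivalent to the
% micro-canonical density matrix \rho_{\mathrm{mc}} = \mathbb{1}/D on the shell.
```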
Describing the methodology of a prominent mathematician can be an over-ambitious task, especially if the mathematician in question has made crucial contributions to almost the whole of mathematical science. John von Neumann’s case falls within this category. Nonetheless, we can still provide a clear picture of von Neumann’s methodology of science. Recent literature has clarified its key feature—the opportunistic approach to axiomatics—and has laid out its main principles. To be honest, this work can hardly be superseded. What I would like to do is to complete the picture by adding one more step and emphasizing a point so far neglected, namely the role of Hilbert’s ideal in von Neumann’s epistemology.
Around 1989, a striking letter written in March 1956 from Kurt Gödel to John von Neumann came to light. It poses some problems about the complexity of algorithms; in particular, it asks a question that can be seen as the first formulation of the P=?NP question. This paper discusses some of the background to this letter, including von Neumann's own ideas on complexity theory. Von Neumann had already raised explicit questions about the complexity of Tarski's decision procedure for elementary algebra and geometry in a letter of 1949 to J. C. C. McKinsey. The paper concludes with a discussion of why theoretical computer science did not emerge as a separate discipline until the 1960s.
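Gödel's question is commonly paraphrased in roughly the following terms (a paraphrase, not the letter's wording):

```latex
% Let \varphi(n) denote the number of steps a machine needs, in the worst case,
% to decide whether a given first-order formula F has a proof of length n.
% Goedel asks whether a growth rate as slow as
\varphi(n) \sim K n \quad \text{or} \quad \varphi(n) \sim K n^{2}
% is attainable; in modern terms, an affirmative answer would amount to an
% efficient algorithm for an NP-complete bounded proof-search problem.
```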
Many works intended to introduce interpretive issues in quantum mechanics present John von Neumann as holding a view on which measurement produces a physical collapse in the system being measured. In this paper I argue that such a reading of von Neumann is inconsistent with what von Neumann actually says. I show that much of what he says makes no sense on the physical-collapse reading, but falls into place if we assume he does not hold such a view. I show that the physical-collapse view is based on an understanding of ‘state’ which von Neumann does not share. Contents: Introduction; The standard reading of von Neumann; The standard reading of von Neumann and Chapter VI; The Chapter VI argument; The Chapter V argument; The Chapters III and IV argument; Conclusion.
In this paper, I shall discuss the heuristic role of symmetry in the mathematical formulation of quantum mechanics. I shall first set out the scene in terms of Bas van Fraassen’s elegant presentation of how symmetry principles can be used as problem-solving devices (see van Fraassen  and ). I will then examine in what ways Hermann Weyl and John von Neumann used symmetry principles in their work as a crucial problem-solving tool. Finally, I shall explore one consequence of this situation for recent debates about structural realism (SR) and empiricism in physics (Worrall , Ladyman , and French ).
We extend the topos-theoretic treatment given in previous papers of assigning values to quantities in quantum theory, and of related issues such as the Kochen-Specker theorem. This extension has two main parts: the use of von Neumann algebras as a base category (Section 2); and the relation of our generalized valuations to (i) the assignment to quantities of intervals of real numbers, and (ii) the idea of a subobject of the coarse-graining presheaf (Section 3).
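The Kochen-Specker obstruction that motivates such generalized valuations is standardly stated as follows (background only, not part of the paper's new results):

```latex
% For a Hilbert space of dimension at least 3, there is no map v assigning to
% every self-adjoint operator A a value v(A) in its spectrum such that
v\bigl(f(A)\bigr) \;=\; f\bigl(v(A)\bigr) \qquad \text{for all Borel functions } f .
% Generalized (e.g. topos-theoretic, interval-valued) valuations weaken this
% functional composition condition rather than abandon value assignments altogether.
```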
In the paper it is shown that every physically sound Birkhoff–von Neumann quantum logic, i.e., an orthomodular partially ordered set with an ordering set of probability measures, can be treated as a partial infinite-valued Łukasiewicz logic, which unifies two competing approaches: the many-valued, and the two-valued but non-distributive, which have co-existed in quantum logic theory since its very beginning.
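For reference, the infinite-valued Łukasiewicz connectives on truth values in [0,1] are the standard ones (notation only, not the paper's construction):

```latex
\neg a = 1 - a, \qquad
a \rightarrow b = \min(1,\; 1 - a + b), \qquad
a \oplus b = \min(1,\; a + b).
% The "partial" in the title signals that such operations need not be
% everywhere defined in the quantum-logical setting.
```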
Von Neumann (1932, Ch. 5) argued by means of a thought experiment involving measurements of spin observables that the quantum mechanical quantity −k tr(ρ log ρ) is conceptually equivalent to thermodynamic entropy. We analyze von Neumann's thought experiment and show that his argument fails. Over the past few years there has been a dispute in the literature regarding the von Neumann entropy. It turns out that each contribution to this dispute (Shenker 1999, Henderson 2001, Hemmo 2003) addressed a different special case. In this paper we generalize the discussion and examine the full matrix of possibilities that are relevant for the evaluation and understanding of von Neumann’s argument.
Shenker has claimed that von Neumann's argument for identifying the quantum mechanical entropy with the von Neumann entropy, S(ρ) = −k tr(ρ log ρ), is invalid. Her claim rests on a misunderstanding of the idea of a quantum mechanical pure state. I demonstrate this, and provide a further explanation of von Neumann's argument.
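The role of pure states in the argument can be seen from a standard computation (not part of the exchange itself):

```latex
% For any pure state the von Neumann entropy vanishes, since \rho has
% eigenvalues (1, 0, \dots, 0); the maximally mixed state in d dimensions
% attains the maximum value:
S\bigl(|\psi\rangle\langle\psi|\bigr) = 0,
\qquad
S\!\left(\tfrac{1}{d}\,\mathbb{1}\right) = k \log d .
```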
Much of the recent discussion of problematic aspects of quantum-mechanical measurement centers around that feature of quantum theory which is called "the projection postulate." This is roughly the claim that a change of a certain sort occurs in the state of a physical system when a measurement is made on the system. In this paper an argument for the projection postulate due to von Neumann is considered. Attention is focused on trying to provide an understanding of the notion of "the state of a physical system" which is compatible with the argument von Neumann offers. An attempt is made to formulate the argument in terms of an objectivistic interpretation of probability concepts. It is seen that such an interpretation does not provide a suitable way of understanding the argument. An attempt is made to illustrate the source of this failure in terms of a non-quantum-mechanical example.
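The projection postulate at issue is usually written in the following form (the standard Lüders version; von Neumann's own rule differs for degenerate spectra):

```latex
% If a measurement of an observable with spectral projections P_i yields
% outcome i, the state is updated as
\rho \;\longmapsto\; \frac{P_i \, \rho \, P_i}{\operatorname{tr}(P_i \rho)} .
```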
We present an axiomatic framework for nonstandard analysis, the Nonstandard Class Theory (NCT), which extends von Neumann–Gödel–Bernays Set Theory (NBG) by adding a unary predicate symbol St to the language of NBG (St(X) means that the class X is standard) and axioms related to it: analogs of Nelson's idealization, standardization and transfer principles. Those principles are formulated as axioms, rather than axiom schemes, so that NCT is finitely axiomatizable. NCT can be considered as a theory of definable classes of Bounded Set Theory by V. Kanovei and M. Reeken. In many respects NCT resembles the Alternative Set Theory of P. Vopěnka. For example, there exist semisets (proper subclasses of sets) in NCT, and it can be proved that a set has a standard finite cardinality iff it does not contain any proper subsemiset. Semisets can be considered as external classes in NCT. Thus the saturation principle can be formalized in NCT.
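As an indication of the kind of principle involved, Nelson's transfer axiom is often stated as follows (our paraphrase of the IST formulation, not NCT's exact axiom):

```latex
% For every internal formula \varphi(x, t_1, \dots, t_k) whose parameters
% t_1, \dots, t_k are standard,
\forall^{\mathrm{st}} x \; \varphi(x, t_1, \dots, t_k)
\;\longrightarrow\;
\forall x \; \varphi(x, t_1, \dots, t_k).
```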
This paper offers a modified version of the certainty equivalence (CE) theory of utility for uncertain prospects and a new set of axioms as its basis. It shows that the CE and the von Neumann-Morgenstern (NM) approaches to uncertainty are opposite in spirit: The CE approach represents a flight from the world of uncertainty to the rules of certainty, while the NM approach represents a flight from the world of certainty to one of uncertainty. The two approaches differ even in their treatment of compound prospects and their actuarially identical simple counterparts.
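For orientation, the two representations being contrasted are standardly defined as follows (textbook definitions, not the paper's new axioms):

```latex
% Certainty equivalent: the sure amount the agent regards as exactly as good
% as the uncertain prospect X, i.e.
u\bigl(\mathrm{CE}(X)\bigr) \;=\; \mathbb{E}\bigl[u(X)\bigr];
% von Neumann-Morgenstern representation: preferences over prospects go by
% expected utility,
X \succsim Y \;\Longleftrightarrow\; \mathbb{E}\bigl[u(X)\bigr] \ge \mathbb{E}\bigl[u(Y)\bigr].
```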
An information completion of an extensive game is obtained by extending the information partition of every player from the set of her decision nodes to the set of all nodes. The extended partition satisfies Memory of Past Knowledge (MPK) if at any node a player remembers what she knew at earlier nodes. It is shown that MPK can be satisfied in a game if and only if the game is von Neumann (vN) and satisfies memory at decision nodes (the restriction of MPK to a player's own decision nodes). A game is vN if any two decision nodes that belong to the same information set of a player have the same number of predecessors. By providing an axiom for MPK we also obtain a syntactic characterization of the said class of vN games.
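The vN condition lends itself to a mechanical check; the sketch below uses hypothetical data structures (a parent map and explicit information sets) that are not the paper's formalism, only an illustration.

```python
# Minimal sketch: a game tree as a parent map; information sets as lists of nodes.
def depth(node, parent):
    """Number of predecessors of a node (the root has 0)."""
    d = 0
    while parent.get(node) is not None:
        node = parent[node]
        d += 1
    return d

def is_von_neumann(information_sets, parent):
    """True iff every information set contains only nodes with the same number of predecessors."""
    return all(
        len({depth(n, parent) for n in info_set}) <= 1
        for info_set in information_sets
    )

# Example: root r with children a and b; {a, b} forms one information set.
parent = {"r": None, "a": "r", "b": "r"}
print(is_von_neumann([["a", "b"]], parent))  # True
```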
An analysis is presented of the significance of, and the consequent limitations on the applicability of, the von Neumann measurement postulate in quantum mechanics. Directly observable quantities, such as the expectation value of the velocity operator, are distinguished from mathematical constructs, such as the expectation value of the canonical momentum, which are not directly observable. A simple criterion to distinguish between the two types of operators is derived. The non-observability of the electromagnetic four-potentials is shown to imply the non-measurability of the canonical momentum. The concept of a mechanical gauge is introduced and discussed. Classically, the Lagrangian is unique only up to a total time derivative. This may be interpreted as the freedom of choosing a “mechanical” (M) gauge function. In quantum mechanics it is often implicitly assumed that the M-gauge vanishes. However, the requirement that directly observable quantities be independent of the arbitrary mechanical gauge is shown to lead to results analogous to those derived from the requirement of electromagnetic gauge independence of observables. The significance of the above for the observability of transition amplitudes between field-free energy eigenstates in the presence (and absence) of electromagnetic fields is discussed. E- and M-gauge independent transition amplitudes between field-free energy eigenstates in the absence of electromagnetic fields are defined. It is shown that, in general, such measurable amplitudes cannot be defined in the presence of externally applied time-dependent fields. Transition amplitudes in the presence of time-independent fields are discussed. The path dependence of previous derivations of E-gauge independent Hamiltonians and/or transition amplitudes in the presence of electromagnetic fields is related to the inherent M-gauge dependence of these quantities in the presence of such fields.
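The underlying distinction between canonical and mechanical momentum is the standard one (textbook formulas in Gaussian units, given only for orientation):

```latex
% Under an electromagnetic gauge transformation
%   \mathbf{A} \to \mathbf{A} + \nabla\chi, \qquad \psi \to e^{iq\chi/\hbar c}\,\psi,
% the canonical momentum \mathbf{p} = -i\hbar\nabla has gauge-dependent
% expectation values, while the mechanical (kinetic) momentum and the velocity
\boldsymbol{\pi} \;=\; \mathbf{p} - \frac{q}{c}\,\mathbf{A},
\qquad
\mathbf{v} \;=\; \frac{\boldsymbol{\pi}}{m},
% have gauge-independent expectation values; this is the sense in which
% \langle \mathbf{v} \rangle is directly observable and \langle \mathbf{p} \rangle is not.
```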
We announce some new results regarding the classification problem for separable von Neumann algebras. Our results are obtained by applying the notion of Borel reducibility and Hjorth's theory of turbulence to the isomorphism relation for separable von Neumann algebras.
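Borel reducibility, the yardstick used in such classification results, is standardly defined as follows (background definition only):

```latex
% For equivalence relations E on a standard Borel space X and F on Y,
E \;\leq_B\; F
\quad\Longleftrightarrow\quad
\text{there is a Borel } f : X \to Y \text{ with } \; x \, E \, y \iff f(x) \, F \, f(y).
```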
The issues raised in Żukowski (arXiv:0809.0115v1, 2008) concerning the relevance of the von Neumann theorem for the single-system quantumness test proposed in Alicki and Van Ryn (J. Phys. A: Math. Theor. 41:062001, 2008) and performed for the case of single-photon polarization in Brida et al. (Opt. Express 16:11750, 2008; arXiv:0811.3376, 2008), and the usefulness of Bell's inequality for testing the idea of macroscopic quantum systems, are discussed in some detail. Finally, the proper quantum mechanical description of the experiment with polarized photon beams is presented.
Von Neumann's theory of measurement in quantum mechanics is reinterpreted so that the experimental arrangement specifies the location of the “cut” by calling for the separate observation of the object and the measuring apparatus after the initial measurement interaction. The measurement ascertains which element of the mixture describing the final state of the apparatus is actually present. The relevance and feasibility of observing the final coherent state of the object plus apparatus is criticized and the paradoxes of “Schrödinger's cat” and “Wigner's friend” are discussed.
At the point of choice, let N be the delay in learning the outcome. Then von Neumann and Morgenstern's postulates contradictorily imply that N = 0 and N > 0. As a consequence, Savage's ‘sure-thing’ proof, which has bestowed on expected utility theory most of its normative appeal, depends on inconsistent assumptions. Further, the validity of Savage's proof cannot be retrieved by minimizing N > 0, by making the delay a mere moment or so. The historical origins of these contradictions are traced to (i) von Neumann and Morgenstern inadvertently limiting their risk model to the certain period, that is, the period after gamblers learn the outcome(s), and (ii) Savage's use of the sure-thing principle for analysing “atemporally but also quite formally” compound gambles [Savage, 1954, p. 23].
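Savage's sure-thing principle, the target of the critique, is usually stated as follows (a standard formulation, not the paper's wording):

```latex
% For acts f, g and an event E:
\text{if } f \succsim g \text{ conditional on } E
\text{ and } f \succsim g \text{ conditional on } E^{c},
\text{ then } f \succsim g .
```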
Sequential von Neumann–Morgenstern (VM) games are a very general formalism for representing multi-agent interactions and planning problems in a variety of types of environments. We show that sequential VM games with countably many actions and continuous utility functions have a sound and complete axiomatization in the situation calculus. This axiomatization allows us to represent game-theoretic reasoning and solution concepts such as Nash equilibrium. We discuss the application of various concepts from VM game theory to the theory of planning and multi-agent interactions, such as representing concurrent actions and using the Baire topology to define continuous payoff functions.
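The central solution concept the axiomatization is meant to capture is standardly defined as follows (background only):

```latex
% A strategy profile s^* is a Nash equilibrium iff no player gains by a
% unilateral deviation:
u_i\bigl(s_i^{*}, s_{-i}^{*}\bigr) \;\ge\; u_i\bigl(s_i, s_{-i}^{*}\bigr)
\qquad \text{for every player } i \text{ and every strategy } s_i .
```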
This article compares Alexander von Humboldt's and John Ruskin's writings on landscape art and natural landscape. In particular, Humboldt's conception of a habitat's essence as predominantly composed of vegetation, as well as his judgment of tropical American nature as the realm of the highest aesthetic enjoyment, is examined in the context of Ruskin's aesthetic theory. The magnitude of Humboldt's contribution to the natural sciences seems to have clouded our appreciation of his prominent status in the field of art history. In addition to his position as scientist, Humboldt's role as aesthetician is demonstrated in this paper. Unlike Ruskin, who comfortably resides in the canon of art history relative to his minor significance in the field of geology, Humboldt has not been recognized for his impact on the world of art; his tremendous scientific importance seems to have overshadowed an appreciation of it.
A synaptic algebra is a generalization of the self-adjoint part of a von Neumann algebra. In this article we extend to synaptic algebras the type-I/II/III decomposition of von Neumann algebras, AW∗-algebras, and JW-algebras.