The first of Erich Neumann's works to be translated into English, this eloquent book draws on a full range of world mythology to show that individual consciousness undergoes the same archetypal stages of development as has human consciousness as a whole. Neumann, one of Jung's most creative students and a renowned practitioner of analytical psychology in his own right, shows how the stages begin and end with the symbol of the Uroboros, or tail-eating serpent. The intermediate stages are projected in the universal myths of the World Creation, Great Mother, Separation of the World Parents, Birth of the Hero, Slaying of the Dragon, Rescue of the Captive, and Transformation and Deification of the Hero. Throughout the sequence the Hero is the evolving ego consciousness.
Four essays on the psychological aspects of art. A study of Leonardo treats the work of art, and art itself, not as ends in themselves, but rather as instruments of the artist's inner situation. Two other essays discuss the relation of art to its epoch and specifically the relation of modern art to our own time. An essay on Chagall views this artist in the context of the problems explored in the other studies.
Critical pedagogy has often been linked in the literature to faith traditions such as liberation theology, usually with the intent of improving or redirecting it. While recognizing and drawing from those previous linkages, Jacob Neumann goes further in this essay and develops the thesis that critical pedagogy can not just benefit from a connection with faith traditions, but is actually, in and of itself, a practice of faith. In this analysis, he juxtaposes critical pedagogy against three conceptualizations of faith: John Caputo's blurring of the modernist division between faith and reason, Paul Tillich's argument that faith is “ultimate concern,” and Paulo Freire's theology and early Christian influences. Using this three-pronged approach, Neumann argues that regardless of how it is seen, critical pedagogy manifests as a practice of faith “all the way down.”
This book presents a comprehensive view of an important new field in human geography and interdisciplinary studies of nature-society relations. Tracing the development of political ecology from its origins in geography and ecological anthropology in the 1970s, to its current status as an established field, the book investigates how late twentieth-century developments in social and ecological theories are brought together to create a powerful framework for comprehending environmental problems. Making Political Ecology argues for an inclusionary conceptualization of the field that absorbs empirical studies from urban, rural, First World and Third World contexts and the theoretical insights of feminism, poststructuralism, neo-Marxism, and non-equilibrium ecology. Extracts from the writings of key figures in political ecology provide an empirical grounding for these abstract concepts. Neumann's book will convince readers of political ecology's particular suitability for grappling with the most difficult questions concerning social justice, environmental change, and human relationships with nature.
Almost all admit that there is beauty in the natural world. Many suspect that such beauty is more than an adornment of nature. Few in our contemporary world suggest that this beauty is an empirical principle of the natural world itself and instead relegate beauty to the eye and mind of the beholder. Guided by theological and scientific insight, the authors propose that such exclusion is no longer tenable, at least in the data of modern biology and in our view of the natural world in general. More important, we believe an empirical aesthetics exists that can help guide experimental design and development of computational models in biology. Moreover, because theology and science can both contribute toward and equally profit from such an aesthetics, we propose that this empirical aesthetics provides the foundation for a living synergy between theology and science.
The illusion that Kant respects persons comes from ascribing contemporary meanings to purely technical terms within his second formulation of the categorical imperative, “[A]ct so that you treat humanity, whether in your own person or in that of another, always as an end and never as a means only”. When we realize that “humanity” means rational nature and “person” means the supersensible self (homo noumenon), we find that we are to respect, not human selves in all their diversity (homo phaenomenon), but rational selves in all their sameness, in their unvarying conformity to the universal principles of pure practical reason. Contemporary individualism gets no support from Kant.
In this essay we develop and argue for the adoption of a more comprehensive model of research ethics than is included within current conceptions of responsible conduct of research (RCR). We argue that our model, which we label the ethical dimensions of scientific research (EDSR), is a more comprehensive approach to encouraging ethically responsible scientific research compared to the approach currently typically adopted in RCR training. This essay focuses on developing a pedagogical approach that enables scientists to better understand and appreciate one important component of this model, what we call intrinsic ethics. Intrinsic ethical issues arise when values and ethical assumptions are embedded within scientific findings and analytical methods. Through a close examination of a case study and its application in teaching, namely, evaluation of climate change integrated assessment models, this paper develops a method and case for including intrinsic ethics within research ethics training to provide scientists with a comprehensive understanding and appreciation of the critical role of values and ethical choices in the production of research outcomes.
This commentary examines Glenberg's characterization of “suppression” in light of negative priming and related phenomena. After offering a radically different slant on suppression, an attempt is made to weave this alternative version into Glenberg's provocative discussion of embodied memories.
From the perspective of biological cybernetics, “real world” robots have no fundamental advantage over computer simulations when used as models for biological behavior. They can even weaken biological relevance. From an engineering point of view, however, robots can benefit from solutions found in biological systems. We emphasize the importance of this distinction and give examples for artificial systems based on insect biology.
Implicit and explicit filling-in phenomena should be distinguished. Blind spot phenomena and mechanisms of boundary completion can be accounted for by implicit filling-in. Surface regions are “painted” with perceptual quantities, such as brightness, by explicit filling-in. “Filling-in” and “finding-out” relate to different computational tasks. Mechanisms of purposive computation (e.g., for navigation) evaluate local measurements, thus “finding out”; whereas mechanisms for grasping might require passive reconstruction, thus “filling in.”
In this paper it is shown that every physically sound Birkhoff–von Neumann quantum logic, i.e., an orthomodular partially ordered set with an ordering set of probability measures, can be treated as a partial infinite-valued Łukasiewicz logic. This unifies two competing approaches that have co-existed in quantum logic theory since its very beginning: the many-valued approach, and the two-valued but non-distributive one.
In recent philosophy of mathematics a variety of writers have presented "structuralist" views and arguments. There are, however, a number of substantive differences in what their proponents take "structuralism" to be. In this paper we make explicit these differences, as well as some underlying similarities and common roots. We thus identify, systematically and in detail, several main variants of structuralism, including some not often recognized as such. As a result the relations between these variants, and between the respective problems they face, become manifest. Throughout, our focus is on semantic and metaphysical issues, including what is or could be meant by "structure" in this connection.
The renewed interest in the foundations of quantum statistical mechanics in recent years has led us to study John von Neumann’s 1929 article on the quantum ergodic theorem. We have found this almost forgotten article, which until now has been available only in German, to be a treasure chest, and to be much misunderstood. In it, von Neumann studied the long-time behavior of macroscopic quantum systems. While one of the two theorems announced in his title, the one he calls the “quantum H-theorem,” is actually a much weaker statement than Boltzmann’s classical H-theorem, the other theorem, which he calls the “quantum ergodic theorem,” is a beautiful and very non-trivial result. It expresses a fact we call “normal typicality” and can be summarized as follows: For a “typical” finite family of commuting macroscopic observables, every initial wave function ψ0 from a micro-canonical energy shell evolves in such a way that for most times t in the long run, the joint probability distribution of these observables obtained from ψt is close to their micro-canonical distribution.
We discuss the content and significance of John von Neumann’s quantum ergodic theorem (QET) of 1929, a strong result arising from the mere mathematical structure of quantum mechanics. The QET is a precise formulation of what we call normal typicality, i.e., the statement that, for typical large systems, every initial wave function ψ0 from an energy shell is “normal”: it evolves in such a way that |ψt⟩⟨ψt| is, for most t, macroscopically equivalent to the micro-canonical density matrix. The QET was mostly forgotten after it was criticized as a dynamically vacuous statement in several papers in the 1950s. However, we point out that this criticism does not apply to the actual QET, a correct statement of which does not appear in these papers, but to a different (indeed weaker) statement. Furthermore, we formulate a stronger statement of normal typicality, based on the observation that the bound on the deviations from the average specified by von Neumann is unnecessarily coarse and a much tighter (and more relevant) bound actually follows from his proof.
We extend the topos-theoretic treatment given in previous papers of assigning values to quantities in quantum theory, and of related issues such as the Kochen-Specker theorem. This extension has two main parts: the use of von Neumann algebras as a base category (Section 2); and the relation of our generalized valuations to (i) the assignment to quantities of intervals of real numbers, and (ii) the idea of a subobject of the coarse-graining presheaf (Section 3).
Describing the methodology of a prominent mathematician can be an over-ambitious task, especially if the mathematician in question has made crucial contributions to almost the whole of mathematical science. The case of John von Neumann falls within this category. Nonetheless, we can still provide a clear picture of von Neumann’s methodology of science. Recent literature has clarified its key feature—the opportunistic approach to axiomatics—and has laid out its main principles. To be honest, this work can hardly be superseded. What I would like to do is to complete the picture by adding one more step and emphasizing a point so far neglected, namely the role of Hilbert’s ideal in von Neumann’s epistemology.
Von Neumann (1932, Ch. 5) argued by means of a thought experiment involving measurements of spin observables that the quantum mechanical quantity −k tr(ρ log ρ) is conceptually equivalent to thermodynamic entropy. We analyze Von Neumann's thought experiment and show that his argument fails. Over the past few years there has been a dispute in the literature regarding the Von Neumann entropy. It turns out that each contribution to this dispute (Shenker 1999, Henderson 2001, Hemmo 2003) addressed a different special case. In this paper we generalize the discussion and examine the full matrix of possibilities that are relevant for the evaluation and understanding of Von Neumann’s argument.
This article addresses, from a Frankfurt School perspective on law identified with Franz Neumann and more recently Habermas, the attack upon the principles of war criminality formulated at the Nuremberg trials by the increasingly influential legal and political theory of Carl Schmitt. It also considers the contradictions within certain of the defence arguments that Schmitt himself resorted to when interrogated as a possible war crimes defendant at Nuremberg. The overall argument is that a distinctly internal, or “immanent”, form of critique is required of Schmitt's position, in which it is found wanting even on its own terms. In principle, the application of this dialectical mode of critique can allow a genuine debate to emerge between those seeking to continue both the Schmittian and critical theory traditions, whilst safeguarding the latter from the dangers of formulating polemical interventions that are, in effect, counterproductive to their own intentions.
Around 1989, a striking letter written in March 1956 from Kurt Gödel to John von Neumann came to light. It poses some problems about the complexity of algorithms; in particular, it asks a question that can be seen as the first formulation of the P=?NP question. This paper discusses some of the background to this letter, including von Neumann's own ideas on complexity theory. Von Neumann had already raised explicit questions about the complexity of Tarski's decision procedure for elementary algebra and geometry in a letter of 1949 to J. C. C. McKinsey. The paper concludes with a discussion of why theoretical computer science did not emerge as a separate discipline until the 1960s.
Many works intended to introduce interpretive issues in quantum mechanics present John von Neumann as having a view in which measurement produces a physical collapse in the system being measured. In this paper I argue that such a reading of von Neumann is inconsistent with what von Neumann actually says. I show that much of what he says makes no sense on the physical collapse reading, but falls into place if we assume he does not have such a view. I show that the physical collapse view is based on an understanding of ‘state’ which von Neumann does not share.
This paper begins by examining Erich Fromm’s “Manifesto and Program” written for the Socialist Party in 1959 or 1960, and addresses a simple question: Why would Fromm speak of something so apparently arcane as “prophetic messianism” in his socialist program? When he insists that we have forgotten that socialism is “rooted in the spiritual tradition which came to us from prophetic messianism, the gospels, humanism, and from the enlightenment philosophers,” is this simply a literary flourish, a concession to liberalism, or religious sentimentality? Part I, written by Nick Braune, answers the question by examining Fromm’s socialist organizing commitments in the context of the late 1950s. Part II, written by Joan Braune, offers further defense of the term “prophetic messianism,” distinguishes two types of messianism, and suggests that Fromm may be attempting to address a problem in the Frankfurt School.
Much of the recent discussion of problematic aspects of quantum-mechanical measurement centers around that feature of quantum theory which is called "the projection postulate." This is roughly the claim that a change of a certain sort occurs in the state of a physical system when a measurement is made on the system. In this paper an argument for the projection postulate due to von Neumann is considered. Attention is focused on trying to provide an understanding of the notion of "the state of a physical system" which is compatible with the argument von Neumann offers. An attempt is made to formulate the argument in terms of an objectivistic interpretation of probability concepts. It is seen that such an interpretation does not provide a suitable way of understanding the argument. An attempt is made to illustrate the source of this failure in terms of a non-quantum-mechanical example.
Shenker has claimed that Von Neumann's argument for identifying the quantum mechanical entropy with the Von Neumann entropy, S(ρ) = −k tr(ρ log ρ), is invalid. Her claim rests on a misunderstanding of the idea of a quantum mechanical pure state. I demonstrate this, and provide a further explanation of Von Neumann's argument.
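The quantity at issue can be made concrete numerically. The sketch below is my illustration, not part of the Shenker–Henderson exchange; Boltzmann's constant k is set to 1. It computes S(ρ) = −k tr(ρ log ρ) from the eigenvalues of a density matrix, and shows the property on which the pure-state point turns: a pure state has zero entropy, while the maximally mixed qubit state has entropy log 2.

```python
import numpy as np

def von_neumann_entropy(rho, k=1.0):
    """S(rho) = -k tr(rho log rho), computed via the eigenvalues of rho."""
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]  # by convention, 0 log 0 = 0
    return -k * np.sum(eigvals * np.log(eigvals))

# A pure state |0><0| has zero entropy.
pure = np.array([[1.0, 0.0],
                 [0.0, 0.0]])
print(von_neumann_entropy(pure))   # ≈ 0.0

# The maximally mixed qubit state I/2 has entropy log 2.
mixed = np.eye(2) / 2
print(von_neumann_entropy(mixed))  # ≈ 0.693 (= log 2)
```

Since S(ρ) depends only on the spectrum of ρ, diagonalizing and summing over nonzero eigenvalues is equivalent to evaluating the trace directly.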
We present an axiomatic framework for nonstandard analysis, the Nonstandard Class Theory (NCT), which extends von Neumann–Gödel–Bernays Set Theory (NBG) by adding a unary predicate symbol St to the language of NBG (St(X) means that the class X is standard) and axioms related to it: analogs of Nelson's idealization, standardization and transfer principles. Those principles are formulated as axioms, rather than axiom schemes, so that NCT is finitely axiomatizable. NCT can be considered as a theory of definable classes of Bounded Set Theory by V. Kanovei and M. Reeken. In many aspects NCT resembles the Alternative Set Theory by P. Vopenka. For example, there exist semisets (proper subclasses of sets) in NCT, and it can be proved that a set has a standard finite cardinality iff it does not contain any proper subsemiset. Semisets can be considered as external classes in NCT. Thus the saturation principle can be formalized in NCT.
This paper offers a modified version of the certainty equivalence (CE) theory of utility for uncertain prospects and a new set of axioms as its basis. It shows that the CE and the von Neumann-Morgenstern (NM) approaches to uncertainty are opposite in spirit: The CE approach represents a flight from the world of uncertainty to the rules of certainty while the NM approach represents a flight from the world of certainty to one of uncertainty. The two approaches differ even in their treatment of compound prospects and their actuarially identical simple counterparts.
Both von Neumann and Wiener were outsiders to biology. Both were inspired by biology and both proposed models and generalizations that proved inspirational for biologists. Around the same time in the 1940s von Neumann developed the notion of self-reproducing automata and Wiener suggested an explication of teleology using the notion of negative feedback. These efforts were similar in spirit. Both von Neumann and Wiener used mathematical ideas to attack foundational issues in biology, and the concepts they articulated had lasting effect. But there were significant differences as well. Von Neumann presented a how-possibly model, which sparked interest by mathematicians and computer scientists, while Wiener collaborated more directly with biologists, and his proposal influenced the philosophy of biology. The two cases illustrate different strategies by which mathematicians, the “professional outsiders” of science, can choose to guide their engagement with biological questions and with the biological community, and illustrate different kinds of generalizations that mathematization can contribute to biology. The different strategies employed by von Neumann and Wiener and the types of models they constructed may have affected the fate of von Neumann’s and Wiener’s ideas – as well as the reputation, in biology, of von Neumann and Wiener themselves.
An information completion of an extensive game is obtained by extending the information partition of every player from the set of her decision nodes to the set of all nodes. The extended partition satisfies Memory of Past Knowledge (MPK) if at any node a player remembers what she knew at earlier nodes. It is shown that MPK can be satisfied in a game if and only if the game is von Neumann (vN) and satisfies memory at decision nodes (the restriction of MPK to a player's own decision nodes). A game is vN if any two decision nodes that belong to the same information set of a player have the same number of predecessors. By providing an axiom for MPK we also obtain a syntactic characterization of the said class of vN games.
In this paper, I shall discuss the heuristic role of symmetry in the mathematical formulation of quantum mechanics. I shall first set the scene in terms of Bas van Fraassen’s elegant presentation of how symmetry principles can be used as problem-solving devices (see van Fraassen  and ). I will then examine in what ways Hermann Weyl and John von Neumann have used symmetry principles in their work as a crucial problem-solving tool. Finally, I shall explore one consequence of this situation for recent debates about structural realism (SR) and empiricism in physics (Worrall , Ladyman , and French ).
Applied mathematics often operates by way of shakily rationalized expedients that can be understood neither in a deductive-nomological nor in an anti-realist setting. Rather, so a recent paper of Mark Wilson argues, these complexities indicate some element in our mathematical descriptions that is alien to the physical world. In this vein the mathematical opportunist openly seeks or engineers appropriate conditions for mathematics to get hold on a given problem. Honest mathematical optimists, instead, try to liberalize mathematical ontology so as to include all physical solutions. Following John von Neumann, the present paper argues that the axiomatization of a scientific theory can be performed in a rather opportunistic fashion, such that optimism and opportunism appear as two modes of a single strategy whose relative weight is determined by the status of the field to be investigated. Wilson's promising approach may thus be reformulated so as to avoid precarious talk about a physical world that is void of mathematical structure. This also makes the appraisal of the axiomatic method in applied mathematics less dependent upon foundationalist issues.
We announce some new results regarding the classification problem for separable von Neumann algebras. Our results are obtained by applying the notion of Borel reducibility and Hjorth's theory of turbulence to the isomorphism relation for separable von Neumann algebras.
The evolution of John von Neumann's scientific interests and a study of his writings show that von Neumann increasingly supported an empirical, computational method. This is in stark contrast with the extant view of von Neumann as a pure theorist.
The State of the Political offers a broad-ranging re-interpretation of the understanding of politics and the state in the writings of three major German thinkers, Max Weber, Carl Schmitt, and Franz Neumann. It rejects the typical separation of these writers on the basis of their allegedly incompatible ideological positions, and suggests instead that once properly located in their historical context, the tendentious character of these interpretative boundaries becomes clear. The book interprets the conceptions of politics and the state in the writings of these three thinkers by means of an investigation of their adaptation and modification of particular German traditions of thinking about the state, or Staatsrechtslehre. Indeed, when the theoretical considerations of this state-legal theory are combined with their contemporary political criticism, a richer and more deeply textured account of the issues that engaged the attention of Weber, Schmitt and Neumann is possible. Thus, the broad range of subjects discussed in this book include parliamentarism and democracy in Germany, academic freedom and political economy, political representation, cultural criticism and patriotism, and the relationship between rationality, law, sovereignty and the constitution. The State of the Political is based on extensive consideration of primary and secondary materials, and is held together by a general focus on the importance to these authors of distilling an adequate account of the state and the political - largely because this could bolster their subsequent criticisms of contemporary politics. The study attempts to restore a sense of proportion to discussion of their writings, focusing on the extensive ideas that they shared rather than insisting on their necessary ideological separation.
It is a detailed re-appraisal of a crucial moment in modern intellectual history, and highlights the profound importance of Max Weber, Carl Schmitt and Franz Neumann for the history of European ideas.
This volume examines the ways in which the authors of the early Frankfurt School criticized, adopted and modified traditional forms of religious thought and practice. Focusing on the works of Theodor W. Adorno, Walter Benjamin, Erich Fromm, Max Horkheimer, Otto Kirchheimer and Franz Neumann, it analyzes the relevance of religious traditions and of the Enlightenment critique of religion for modern conceptions of emancipatory thought, art, law, and politics.
Sequential von Neumann–Morgenstern (VM) games are a very general formalism for representing multi-agent interactions and planning problems in a variety of types of environments. We show that sequential VM games with countably many actions and continuous utility functions have a sound and complete axiomatization in the situation calculus. This axiomatization allows us to represent game-theoretic reasoning and solution concepts such as Nash equilibrium. We discuss the application of various concepts from VM game theory to the theory of planning and multi-agent interactions, such as representing concurrent actions and using the Baire topology to define continuous payoff functions.
René Descartes proposed an interactive dualism that posits an interaction between the mind of a human being and some of the matter located in his or her brain. Isaac Newton subsequently formulated a physical theory based exclusively on the material/physical part of Descartes’ ontology. Newton’s theory enforced the principle of the causal closure of the physical, and the classical physics that grew out of it enforces this same principle. This classical theory purports to give, in principle, a complete deterministic account of the physically described properties of nature, expressed exclusively in terms of these physically described properties themselves. Orthodox contemporary physical theory violates this principle in two separate ways. First, it injects random elements into the dynamics. Second, it allows, and also requires, abrupt probing actions that disrupt the mechanistically described evolution of the physically described systems. These probing actions are called Process 1 interventions by von Neumann. They are psycho-physical events. Neither the content nor the timing of these events is determined either by any known law, or by the afore-mentioned random elements. Orthodox quantum mechanics considers these events to be instigated by choices made by conscious agents. In von Neumann’s formulation of quantum theory each such intervention acts upon the state of the brain of some conscious agent. Thus orthodox von Neumann contemporary physics posits an interactive dualism similar to that of Descartes. But in this quantum version the effects of the conscious choices upon our brains are controlled, in part, by the known basic rules of quantum physics. This theoretically specified mind-brain connection allows many basic psychological and neuropsychological findings associated with the apparent physical effectiveness of our conscious volitional efforts to be explained in a causal and practically useful way.