In a seminal 1977 article, Rumelhart argued that perception required the simultaneous use of multiple sources of information, allowing perceivers to optimally interpret sensory information at many levels of representation in real time as information arrives. Building on Rumelhart's arguments, we present the Interactive Activation hypothesis—the idea that the mechanism used in perception and comprehension to achieve these feats exploits an interactive activation process implemented through the bidirectional propagation of activation among simple processing units. We then examine the interactive activation model of letter and word perception and the TRACE model of speech perception as early attempts to explore this hypothesis, and review the experimental evidence relevant to their assumptions and predictions. We consider how well these models address the computational challenge posed by the problem of perception, and how consistent they are with evidence from behavioral experiments. We examine empirical and theoretical controversies surrounding the idea of interactive processing, including a controversy that swirls around the relationship between interactive computation and optimal Bayesian inference. Some of the implementation details of early versions of interactive activation models caused deviations from optimality and from aspects of human performance data. More recent versions of these models, however, overcome these deficiencies. Among these is the multinomial interactive activation model, which explicitly links interactive activation and Bayesian computations. We also review evidence from neurophysiological and neuroimaging studies supporting the view that interactive processing is a characteristic of the perceptual processing machinery in the brain. In sum, we argue that computational analysis, together with behavioral and neuroscience evidence, supports the Interactive Activation hypothesis. The evidence suggests that contemporary versions of models based on the idea of interactive activation continue to provide a basis for efforts to achieve a fuller understanding of the process of perception.
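To make the mechanism described in the abstract concrete, here is a minimal Python sketch of bidirectional activation flow between a letter layer and a word layer. The lexicon, weights, decay, and gain below are toy assumptions made for this sketch only; they are not the parameters of the published interactive activation or TRACE models.

```python
# A minimal, illustrative sketch of interactive (bidirectional) activation:
# letter units and word units excite one another when they are consistent,
# so the word layer can feed support back down to an ambiguous letter.
# The toy lexicon, weights, decay, and gain are assumptions for this sketch,
# not the parameters of the published interactive activation or TRACE models.

LEXICON = ["work", "fork", "wear", "cart"]

def clamp(x, lo=0.0, hi=1.0):
    return max(lo, min(hi, x))

def consistent(word, pos, letter):
    return word[pos] == letter

def run(evidence, steps=30, excite=0.15, decay=0.1, gain=0.05):
    """evidence maps (position, letter) -> sensory support in [0, 1]."""
    letters = {unit: 0.0 for unit in evidence}
    words = {w: 0.0 for w in LEXICON}
    for _ in range(steps):
        new_letters = {}
        for (pos, ch), act in letters.items():
            # Bottom-up sensory input plus top-down excitation from consistent words.
            top_down = sum(excite * words[w] for w in LEXICON if consistent(w, pos, ch))
            new_letters[(pos, ch)] = clamp(
                (1 - decay) * act + gain * evidence[(pos, ch)] + top_down)
        new_words = {}
        for w in LEXICON:
            # Bottom-up excitation from consistent letter units.
            support = sum(excite * act for (pos, ch), act in letters.items()
                          if consistent(w, pos, ch))
            new_words[w] = clamp((1 - decay) * words[w] + support)
        letters, words = new_letters, new_words
    return letters, words

# The final letter is ambiguous between 'k' and 'd'; top-down feedback from
# word units consistent with "wor_" (here "work") pushes 'k' above 'd'.
evidence = {(0, "w"): 1.0, (1, "o"): 1.0, (2, "r"): 1.0, (3, "k"): 0.5, (3, "d"): 0.5}
letters, words = run(evidence)
print(round(letters[(3, "k")], 3), round(letters[(3, "d")], 3))
```

Run on the ambiguous input, the feedback loop settles with the lexically supported letter above its competitor, which is the qualitative behavior the Interactive Activation hypothesis attributes to perception.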
This is an engaging book on a subject which most people in our culture assume went out of fashion long ago. The book had its genesis in one of a series of symposia convened by the Church Society for College Work of Cambridge to explore certain themes and ideas which have great import for our time. The various authors of the essays eschew the habit of viewing Transcendence as the traditional content of metaphysical arguments or revelatory statements about the nature of God outside or independent of the world, and seek instead for signs of the possibility of achieving a renewed sense of Transcendence in the domains of inner experience, history, culture, language, science, technology, and the arts. Huston Smith suggests two options for human fulfillment, psychological and ontological, both providing a kind of transcendence of self and society. He refuses to accept one and reject the other, seeking instead to convince us that both are legitimate. M. Murphy claims that the experience of psychological and social transcendence can be fostered by educational projects such as encounter groups, gestalt therapy workshops, and sensitivity training programs. In his "Manifesto for a Dionysian Theology," Sam Keen analyzes the Apollonian way of reasoning and planning which has come to dominate Western culture and makes a plea for a reassertion of the Dionysian principle in religion, which might be expressive of life as dance, the centrality of feelings and sensations, a pantheistic conception of God, and a theology of play. Harvey Cox asserts that our society has lost its capacity for utopian fantasy, resulting in the inability to conceive of any world which is not a mere modification or extension of our own. He suggests that we consciously become "fools for Christ" again and give full rein to our powers of creative fantasizing, even at the risk of contracting "religious madness." Donald Schon urges post-modern man to give up hope of achieving a stable political order and to develop an "ethic for change" appropriate to the demands of a society where change is all-pervasive. Essays by R. Bellah and H. Richardson explore myths of transcendence as ordering structures in society and plead for increased attention to correspondences among the disciplines of theology, sociology, and psychology. E. Fackenheim and W. Kaufman seek to clarify the notion of transcendence in Judaism and Christianity respectively and to "clear the way for a positive revelation." In keeping with the shared notion that transcendent reality is present and active within the human process, two eminent process philosophers, Wieman and Hartshorne, discuss the "implications for a concept of transcendence that follows from affirming the creative freedom of man." The essays, singly and together, reject the notion that the existence and nature of transcendent reality can be arrived at by pursuing a single line of argumentation to its bitter end; instead they work from various points of view to reinforce our sense of man's renewed thirst for the Divine and the subsequent rediscovery of God's uninterrupted presence within the world of "Immanent Possibility."--J. B. L.
This collection, with an agreeable proportion of new material and a sensible selection of old, is worth the money and ought to be on the shelf of anyone interested in recent work on language by philosophers, psychologists, and linguists. The section by linguists proper is the longer and more up to date, but this seems quite in order: today neither work in philosophy nor in psychology can provide a plausible center of attention that will take in the other and linguistics as flanking material. For better or worse, linguistics is the centerpiece: and the debate between "interpretive" and "generative" semanticists, here respectively represented by Chomsky and George Lakoff, is the center, most likely, of the centerpiece. The generative semanticists suggest that the base and semantic components ultimately come to the same thing: the distinction between syntactic rules and semantic rules is presumed, as in the Chomskian position, but it is thought that the algorithm of well-formedness will turn out to provide all the rules needed for semantic interpretation. The interpretive semantic alternative, here argued by Chomsky in a paper otherwise difficult to obtain except in mimeo, distinguishes the semantic from the base component by insisting, particularly in matters respecting reference and quantification, that transformations are not meaning-invariant, and that, hence, the semantic component is fed by both the base and surface structures independently. To put the interpretive view in terms of Tarski-cum-Davidsonian biconditionals, we would no longer have on the left side of the biconditional one ‘structural-descriptive’ string but rather two separate strings, one surface and the other deep, that would jointly and independently determine meaning. The generative semanticists, following James McCawley, stress that their argument against autonomous deep syntax follows in form Morris Halle’s well-known argument against a phonemic level of description supposed intermediate between superficial surface syntax and systematic phonetics. The basic question one raises against this argument is whether logico-semantic form constitutes itself for linguistic science as one level of description, and as an essentially linguistic level of description. One can see an obvious place for philosophers in these arguments, though one finds in this volume very little suggestion of the philosophical-semantic work, in the Frege-Carnap tradition, that Donald Davidson, Richard Montague, John Wallace, etc., have been carrying on lately. There is a previously unpublished paper by David Wiggins in this vein, but though Wiggins is his usual brilliant and playfully convoluted self, this is too idiosyncratic and occasional a piece to represent what is by way of a movement. Indeed, aside from the Wiggins-Alston material, the philosophers’ section is solid but familiar material: H. P. Grice’s famous paper on meaning and Paul Ziff’s criticism of Grice’s theory; Gilbert Harman’s "Three Levels of Meaning"; late-1960s papers by Donnellan, Linsky, Quine, Strawson, Vendler, and Searle on reference. But this aside, this volume vividly makes the point that philosophy and linguistics have never been more entangled with each other in a genuine working relationship. Chomsky’s arguments come in part from recent philosophers’ work. There is evident concern among linguists with presuppositions and performatives.
"Fact," an important and not easily available paper by Paul and Carol Kilparski, sparks the philosophic imagination—as do new pieces on lexical entries, semantic features, and categories by Charles Fillmore, Manfried Bierwisch, and others. Almost enough to justify J. L. Austin’s hopes for a joint endeavor of linguists, philosophers, and psychologists: one sees in the footnotes and bibliographies, in the issue and vocabulary, that disciplines are joining and reflecting upon each other in day-to-day work. The psychology section also contains one large new piece: a splendidly energetic defense of linguistic behaviorism by Charles Osgood. One finds balance for this in Jerry Fodor’s "Could meaning be an rm?" And some good, current, and often not easily available material by George Miller, Eric Lennberg, and others. The "overviews" for the various sections are quite distinguished themselves: but this is only in keeping with general character of this reader.—J. L. (shrink)
A physical and mathematical framework for the analysis of probabilities in quantum theory is proposed and developed. One purpose is to surmount the problem, crucial to any reconciliation between quantum theory and space-time physics, of requiring instantaneous “wave-packet collapse” across the entire universe. The physical starting point is the idea of an observer as an entity, localized in space-time, for whom any physical system can be described at any moment by a set of (not necessarily pure) quantum states compatible with his observations of the system at that moment. The mathematical starting point is the theory of local algebras from axiomatic relativistic quantum field theory. A function defining the a priori probability of mistaking one local state for another is analysed. This function is shown to possess a broad range of appropriate properties and to be uniquely defined by a selection of them. Through a general model for observations, it is argued that the probabilities defined here are as compatible with experiment as the probabilities of conventional interpretations of quantum mechanics but are more likely to be compatible, not only with modern developments in mathematical physics, but also with a complete and consistent theory of measurement.
It is proposed that the physical structure of an observer in quantum mechanics is constituted by a pattern of elementary localized switching events. A key preliminary step in giving mathematical expression to this proposal is the introduction of an equivalence relation on sequences of spacetime sets which relates a sequence to any other sequence to which it can be deformed without change of causal arrangement. This allows an individual observer to be associated with a finite structure. The identification of suitable switching events in the human brain is discussed. A definition is given for the sets of sequences of quantum states which such an observer could occupy. Finally, by providing an a priori probability for such sets, the definitions are incorporated into a complete mathematical framework for a many-worlds interpretation. At a less ambitious level, the paper can be read as an exploration of some of the technical and conceptual difficulties involved in constructing such a framework.
This paper is a response to some recent discussions of many-minds interpretations in the philosophical literature. After an introduction to the many-minds idea, the complexity of quantum states for macroscopic objects is stressed. Then it is proposed that a characterization of the physical structure of observers is a proper goal for physical theory. It is argued that an observer cannot be defined merely by the instantaneous structure of a brain, but that the history of the brain's functioning must also be taken into account. Next the nature of probability in many-minds interpretations is discussed and it is suggested that only discrete probability models are needed. The paper concludes with brief comments on issues of actuality and identity over time.
A civic science curriculum is advocated. We discuss practical mechanisms for (and highlight the possible benefits of) addressing the relationship between scientific knowledge and civic responsibility coextensively with rigorous scientific content. As a strategy, we suggest an in-course treatment of well-known (and relevant) historical and contemporary controversies among scientists over science policy or the uses of science. The scientific content of the course is used to understand the controversy and to inform the debate, while allowing students to see the role of scientists in shaping public perceptions of science and the value of scientific inquiry, discoveries, and technology in society. The examples of the activism of Linus Pauling, Alfred Nobel, and Joseph Rotblat as scientists and engaged citizens are cited. We discuss the role of science professors in informing the social conscience of students and consider ways in which a treatment of the function of science in society may find, coherently, a meaningful space in a science curriculum at the college level. Strategies for helping students to recognize early the crucial contributions that science can make in informing public policy and global governance are discussed.
It has been suggested, on the one hand, that quantum states are just states of knowledge; and, on the other, that quantum theory is merely a theory of correlations. These suggestions are confronted with problems about the nature of psycho-physical parallelism and about how we could define probabilities for our individual future observations given our individual present and previous observations. The complexity of the problems is underlined by arguments that unpredictability in ordinary everyday neural functioning, ultimately stemming from small-scale uncertainties in molecular motions, may overwhelm, by many orders of magnitude, many conventionally recognized sources of observed "quantum" uncertainty. Some possible ways of avoiding the problems are considered but found wanting. It is proposed that a complete understanding of the relationship between subjective experience and its physical correlates requires the introduction of mathematical definitions and indeed of new physical laws.
In his long 1957 paper, “The Theory of the Universal Wave Function”, Hugh Everett III made some significant preliminary steps towards the application and generalization of Shannon’s information theory to quantum mechanics. In the course of doing so, he conjectured that, for a given wavefunction on a compound space, the Schmidt decomposition maximises the correlation between subsystem bases. This is proved here.
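For orientation, the standard form of the Schmidt decomposition and a conventional measure of correlation between subsystem measurement outcomes are sketched below; the notation is illustrative, and the paper's own definitions and proof are not reproduced here.

```latex
% Schmidt decomposition of a pure state on H_A \otimes H_B (standard form).
\[
  |\psi\rangle \;=\; \sum_{i} \sqrt{p_i}\; |a_i\rangle_A \otimes |b_i\rangle_B,
  \qquad p_i \ge 0, \qquad \sum_i p_i = 1,
\]
% with \{|a_i\rangle_A\} and \{|b_i\rangle_B\} orthonormal. Measuring the
% subsystems in arbitrary local bases \{|u_j\rangle_A\}, \{|v_k\rangle_B\}
% gives the joint outcome distribution
\[
  P(j,k) \;=\; \bigl|\,(\langle u_j|_A \otimes \langle v_k|_B)\,|\psi\rangle\,\bigr|^2,
\]
% whose correlation (mutual) information is
\[
  I(A\!:\!B) \;=\; \sum_{j,k} P(j,k)\,\log\frac{P(j,k)}{P_A(j)\,P_B(k)}.
\]
% The conjecture attributed to Everett is, roughly, that this quantity is
% maximized when the local bases are the Schmidt bases above.
```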
The original development of the formalism of quantum mechanics involved the study of isolated quantum systems in pure states. Such systems fail to capture important aspects of the warm, wet, and noisy physical world which can better be modelled by quantum statistical mechanics and local quantum field theory using mixed states of continuous systems. In this context, we need to be able to compute quantum probabilities given only partial information. Specifically, suppose that B is a set of operators. This set need not be a von Neumann algebra. Simple axioms are proposed which allow us to identify a function which can be interpreted as the probability, per unit trial of the information specified by B, of observing the (mixed) state of the world restricted to B to be σ when we are given ρ – the restriction to B of a prior state. This probability generalizes the idea of a mixed state (ρ) as being a sum of terms (σ) weighted by probabilities. The unique function satisfying the axioms can be defined in terms of the relative entropy. The analogous inference problem in classical probability would be a situation where we have some information about the prior distribution, but not enough to determine it uniquely. In such a situation in quantum theory, because only what we observe should be taken to be specified, it is not appropriate to assume the existence of a fixed, definite, unknown prior state, beyond the set B about which we have information. The theory was developed for the purposes of a fairly radical attack on the interpretation of quantum theory, involving many-worlds ideas and the abstract characterization of observers as finite information-processing structures, but deals with quantum inference problems of broad generality.
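The abstract states only that the unique function satisfying the axioms can be defined in terms of the relative entropy; for reference, the standard (Umegaki) quantum relative entropy and its classical analogue are recalled below. The paper's exact construction of the probability function from this quantity is not reproduced here.

```latex
% Standard (Umegaki) quantum relative entropy of a state sigma with respect
% to a prior rho; the paper's probability function is defined in terms of a
% quantity of this kind, but its exact construction is not given here.
\[
  S(\sigma \,\|\, \rho) \;=\; \operatorname{Tr}\bigl[\sigma(\log\sigma - \log\rho)\bigr] \;\ge\; 0,
\]
% with equality if and only if \sigma = \rho. The classical analogue, for
% probability distributions q and p over the same outcomes, is
\[
  D(q \,\|\, p) \;=\; \sum_i q_i \log\frac{q_i}{p_i}.
\]
```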
Donald J. Munro's essay, "When Science Is in Defense of Value-Linked Facts," takes a stand against the fact-value dichotomy which has been heavily pronounced within the Greco-European philosophical canon. As Munro also points out, the persistence of the fact-value dichotomy is traceable to Moore's discussion of the "naturalistic fallacy" and Hume's discussion of the is-ought problem. In opposition to these two views, classical Confucian thinkers present us with descriptive statements about human commonalities, including their inborn affects...
The election of Donald J. Trump as the 45th President of the United States reminded us that climate deniers are anything but an endangered species. In this short paper, we discuss President Trump’s position on climate change in the wider context of climate controversies and denial. In particular, we put it into perspective alongside other notorious contrarian leaders and their influence on national and international climate politics. Finally, we provide a brief analysis of President Trump’s discourses on climate change and discuss them in light of reflections about post-truth politics.
Over the past two years, several political commentators have drawn on Plato’s Republic to shed light on our last presidential election. Many of these authors emphasize the features of democracy that make it especially susceptible to demagoguery, which heralds the arrival of tyranny, and then go on to relate this to Donald Trump’s political ascension. The problem with these analyses is that they tend to adopt Plato’s pessimistic view of democracy unquestioningly. While Plato’s criticisms do have the virtue of making us aware of democracy’s weaknesses, we would argue that our present political circumstances did not issue from these flaws, which makes the criticisms beside the point. Other commentators come closer to the mark when they discuss Plato’s treatment of the person of the tyrant in Book IX, but what is lacking in these accounts is a context that more fully explains why the tyrant is what he is, in Platonic terms. In this essay we argue that other parts of the Republic, particularly Book IV, can tell us much more about Trump and his presidency. This part of the dialogue deals with Plato’s conception of human nature, which he presents in his discussion of the soul or psyche [ψυχή]. An examination of these passages will grant us insight into Trump’s actions and utterances and show that the president is not only intellectually but also temperamentally unqualified for his office, which should give citizens good cause for worry.
Rationality and reasonableness are often sharply distinguished from one another and are even held to be in conflict. On this construal, rationality consists in the means-end calculation of the most efficient means to one's ends, while reasonableness consists in equitableness, whereby one respects the rights of other persons as well as oneself. To deal with this conflict, it is noted that both rationality and reasonableness are based on reason, which is analyzed as the power of attaining truth, and especially necessary truth. It is then shown that, by the rationality involved in reason, the moral principle of reasonableness, the Principle of Generic Consistency, has a stringently rational justification in that to deny or violate it is to incur self-contradiction. Objections bearing on relevance and motivation are considered. It is concluded that, where reasonableness and egoistic rationality conflict, the former is rationally superior.
Presumably, great men, including John Dewey, have great flaws. For decades, Dewey scholars assumed that the Hegelian cast of his early philosophy proved, prima facie, that it was merely derivative and hopelessly metaphysical in the worst possible sense of that term, as though nothing original or practically applicable to real life could possibly come from studying Hegel. I believe it is fair to say that, among Dewey scholars, the term “Hegelian” became an ossified pejorative that required little, if any, explanation. “Hegelian,” and related terms such as “idealism” and “the dialectic,” were exempt from further inquiry. In recent years a growing number of scholars have taken closer looks at Dewey’s early writings...