When Herbert Marcuse's essay “Repressive Tolerance” was published in the mid-1960s it was trenchantly criticised for being anti-democratic and for defying the academic canon of value neutrality. Yet his argument is attracting renewed interest in the 21st century, particularly when, post-9/11, the thresholds or limits of tolerance are being contested. This article argues that Marcuse's original essay sought to problematise the dominant social understandings of tolerance at the time, which were more about insisting that individual citizens tolerate government policy than about governments encouraging debate and dissent. The article shows how Marcuse attempted to demonstrate the social production of knowledge about tolerance, and how he diagnosed the social function performed by “impartiality” and “relativism”, and by “neutrality” and “objectivity”, which contributed to tolerance being repressive. In the sense that he was concerned with what counted socially as tolerance, and how it was socially defended and justified, his article can helpfully be conceived as an exercise in social epistemology.
Artificial intelligence research has foundered on the issue of representation. When intelligence is approached in an incremental manner, with strict reliance on interfacing to the real world through perception and action, reliance on representation disappears. In this paper we outline our approach to incrementally building complete intelligent Creatures. The fundamental decomposition of the intelligent system is not into independent information-processing units which must interface with each other via representations. Instead, the intelligent system is decomposed into independent and parallel activity producers which all interface directly to the world through perception and action, rather than interfacing much with each other. The notions of central and peripheral systems evaporate: everything is both central and peripheral. Based on these principles we have built a very successful series of mobile robots which operate without supervision as Creatures in standard office environments.
This is an exciting time for brain science. Recent progress has been such that it now seems realistic to look toward an explanation of mind in terms of the brain's anatomy and physiology. Models based on artificially symmetrical arrays of idealized neurons are now being superseded by ones which properly take into account the brain's actual circuitry. This book presents a comprehensive overview of the current state of brain modeling, containing contributions from many leading researchers in this field. It will be of interest not only to researchers in the fields of brain science and neurobiology, but also to psychologists and those involved in the study of artificial intelligence.
The emergence of private authority has become a feature of the post-Cold War world. The contributors to this volume examine the implications of this erosion of the power of the state for global governance. They analyse actors as diverse as financial institutions, multinational corporations, religious terrorists and organised criminals. The themes of the book relate directly to debates concerning globalization and the role of international law, and will be of interest to scholars and students of international relations, politics, sociology and law.
The universe appears fine-tuned for life. Bayesian confirmation theory is utilized to examine two competing explanations for this fine-tuning, namely design (theism) and the existence of many universes, in comparison with the 'null' hypothesis that just one universe exists as a brute fact. Some authors have invoked the so-called 'inverse gambler's fallacy' to argue that the many-universes hypothesis does not explain the fine-tuning of 'this' universe, but flaws in this argument are exposed. Nevertheless, the hypothesis of design, being simpler, is arguably of higher prior probability and, therefore, to be preferred. The hypothesis of the single brute-fact universe is disconfirmed.
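The Bayesian comparison sketched in this abstract can be illustrated with a toy calculation. All numbers below are illustrative assumptions chosen only to show the mechanics of priors, likelihoods and posteriors; they are not values defended in the article:

```python
# Toy Bayesian comparison of three hypotheses given evidence E that
# the universe is fine-tuned for life. All numbers are assumptions.

priors = {
    "design": 0.4,             # assumed simpler, hence higher prior
    "many_universes": 0.3,
    "single_brute_fact": 0.3,
}

# Assumed likelihood P(E | H) of observing fine-tuning under each hypothesis.
likelihoods = {
    "design": 0.9,             # design makes fine-tuning likely
    "many_universes": 0.9,     # with many universes, observing a tuned one is likely
    "single_brute_fact": 1e-3, # fine-tuning would be a lucky accident
}

# Bayes' theorem: P(H | E) = P(H) * P(E | H) / P(E).
evidence = sum(priors[h] * likelihoods[h] for h in priors)
posteriors = {h: priors[h] * likelihoods[h] / evidence for h in priors}

for h, p in sorted(posteriors.items(), key=lambda kv: -kv[1]):
    print(f"{h}: {p:.3f}")
```

On these assumed numbers the single brute-fact universe is strongly disconfirmed, while design edges out many universes only because of its assumed higher prior, mirroring the structure of the argument.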
Both direct, and evolved, behavior-based approaches to mobile robots have yielded a number of interesting demonstrations of robots that navigate, map, plan and operate in the real world. The work can best be described as attempts to emulate insect-level locomotion and navigation, with very little work on behavior-based non-trivial manipulation of the world. There have been some behavior-based attempts at exploring social interactions, but these too have been modeled after the sorts of social interactions we see in insects. But thinking about how to scale from all this insect-level work to full human-level intelligence and social interactions leads to a synthesis that is very different from that imagined in traditional Artificial Intelligence and Cognitive Science. We report on work towards that goal.
In order to build autonomous robots that can carry out useful work in unstructured environments, new approaches have been developed to building intelligent systems. The relationship to traditional academic robotics and traditional artificial intelligence is examined. In the new approaches a tight coupling of sensing to action produces architectures for intelligence that are networks of simple computational elements which are quite broad, but not very deep. Recent work within this approach has demonstrated the use of representations, expectations, plans, goals, and learning, but without resorting to the traditional use of central, abstractly manipulable or symbolic representations. Perception within these systems is often an active process, and the dynamics of the interactions with the world are extremely important. The question of how to evaluate and compare the new to traditional work still provokes vigorous discussion.
Most animals have significant behavioral expertise built in without having to explicitly learn it all from scratch. This expertise is a product of the evolution of the organism; it can be viewed as a very long-term form of learning which provides a structured system within which individuals might learn more specialized skills or abilities. This paper suggests one possible mechanism for analogous robot evolution by describing a carefully designed series of networks, each one being a strict augmentation of the previous one, which control a six-legged walking machine capable of walking over rough terrain and following a person passively sensed in the infrared spectrum. As the completely decentralized networks are augmented, the robot's performance and behavior repertoire demonstrably improve. The rationale for such demonstrations is that they may provide a hint as to the requirements for automatically building massive networks to carry out complex sensory-motor tasks. The experiments with an actual robot ensure that an essence of reality is maintained and that no critical disabling problems have been ignored.
In this book R. G. Peffer tackles the challenges of finding in Marx's work an implicit moral theory, of answering claims that Marxism is incompatible with morality, and of developing the outlines of an adequate Marxist moral and social ...
The challenge that confronts corporate decision-makers in connection with global labor conditions is often in identifying the standards by which they should govern themselves. In an effort to provide greater direction in the face of possible global cultural conflicts, ethicists Thomas Donaldson and Thomas Dunfee draw on social contract theory to develop a method for identifying basic human rights: Integrative Social Contracts Theory (ISCT). In this paper, we apply ISCT to the challenge of global labor standards, attempting to identify labor rights that could serve as guides for corporations producing or outsourcing outside of their home country. In addition to identifying areas of universal agreement, we also examine whether ISCT is, in fact, a sufficient basis for determining worker rights; we seek to define the parameters of the “sweatshop” problem; we include the application and results of our ISCT analysis as applied to labor standards: the global labor rights hypernorms; and we conclude that ISCT is sufficient only for rights that are universal. We also discuss whether market-driven decisions can identify the boundaries of labor rights, or at least assure that market outcomes are compatible with maintaining labor rights, in order to respond to the shortcomings of ISCT. We conclude with some comments on directions of analysis for labor rights determination.
"Written in a racy, persuasive style, the book impresses the reader as a work of significant scholarship... I encourage students of comparative religions—and especially those of Islamic economics—to read it with great care."—Islamic Studies. The worlds of economics and theology rarely intersect. The former appears occupied exclusively with the concrete equations of supply and demand, while the latter revolves largely around the less tangible concerns of the soul and spirit. Intended as an interfaith clarification of the relationship between the material and the spiritual worlds, this volume first inspects secular beliefs about the relationship between economics and ethics. Exploring the differences and similarities between the treatment of economic issues in each of the great monotheistic religions, Rodney Wilson reveals how each tradition considers such subjects as individual wealth, lending, economic regulation, usury, insurance, capitalism, socialism, and banking. He concludes with an intriguing epilogue on the rapidly expanding field of business ethics.
In a major revision of my earlier theoretical work on religion, I attempt to identify and connect the basic micro elements and processes underlying religious expression. I show that all primary aspects of religion (belief, emotion, ritual, prayer, sacrifice, mysticism, and miracle) can be understood on the basis of exchange relations between humans and supernatural beings. Although I utilize a cognitive definition of religion, this new version of the theory is especially concerned with the emotional and expressive aspects of religion. Along the way I also clarify the difference between religion and magic, and this sets the stage for explaining the conditions under which religion (but not magic) can require extended and exclusive exchange relations between humans and the gods, thus enabling some religions to sustain stable organizations based on a lay membership.
Hume's argument concerning miracles is interpreted by making approximations to terms in Bayes's theorem. This formulation is then used to analyse the impact of multiple testimony. Individual testimonies which are ‘non-miraculous’ in Hume's sense can in principle be accumulated to yield a high probability both for the occurrence of a single miracle and for the occurrence of at least one of a set of miracles. Conditions are given under which testimony for miracles may provide support for the existence of God.
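The accumulation effect described in this abstract can be sketched with a toy odds calculation. The prior probability and the per-witness Bayes factor below are assumed numbers, not values from the article; the point is only that modest, individually 'non-miraculous' testimonies multiply:

```python
# Toy illustration of how independent testimonies accumulate.
# Each witness contributes only a modest Bayes factor in favour of the
# miracle M, yet jointly they can drive the posterior probability high.
# All numbers are assumptions for illustration.

prior_M = 1e-6                    # very low prior probability of the miracle
bayes_factor_per_witness = 10.0   # each testimony favours M ten-to-one

def posterior(n_witnesses: int) -> float:
    """Posterior P(M | n independent testimonies), via the odds form of Bayes' theorem."""
    prior_odds = prior_M / (1 - prior_M)
    post_odds = prior_odds * bayes_factor_per_witness ** n_witnesses
    return post_odds / (1 + post_odds)

for n in (1, 3, 7):
    print(f"{n} witnesses -> P(M | testimony) = {posterior(n):.4f}")
```

With these assumed numbers a single witness leaves the miracle highly improbable, while seven independent witnesses push its posterior above 0.9, which is the structure of the multiple-testimony result summarised above.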
In this paper I shall outline the approach to consciousness adopted by ethnomethodology and its `associate', conversation analysis. I shall attempt to do this by taking a minimalist stance, namely a basic formulation of the elements of these approaches, trying to strip away the ornate superstructures which have been erected upon that basis. I shall proceed in two ways. First, I shall seek to define ethnomethodology and conversation analysis by contrasting them to varying degrees with a variety of other approaches: symbolic interactionism and, derivatively, the work of Goffman, the social psychology of Rom Harré and his associates, and the work of Norbert Wiley. Secondly, I shall give some examples of the use of the notion of `self' held by ethnomethodologists and conversation analysts that take a definitive turn towards a non-ironic, non-mentalist, non-essentialist and non-cognitivist approach to knowledge, consciousness and self.
It is suggested that consciousness is primarily associated not with stimuli and perception, as commonly supposed, but with movement and responses. Consciousness of stimuli arises in situations in which possible movements are planned, or in which information must be actively acquired rather than passively registered, and may or may not require overt movements to be performed. By emphasizing response, this formulation provides a simple explanation for the perceived unity of consciousness: though stimuli can be diverse, with independent components, movements must necessarily be coordinated. Therefore, if we are to look for a `site' for consciousness, it is likely to be in a region such as the anterior cingulate that is neurally close to the higher motor hierarchical levels, and also accessible both to real sensory feedback and also to virtual feedback derived through mechanisms of efference copy from actual or proposed motor commands. It is suggested that synchrony of arrival of such information may be an important prerequisite of this unity, and that on this basis such a `master node' might be expected to be temporally `equidistant' from each of these sources; this may well be true of the anterior cingulate, but no doubt also of other structures.
My aim in this essay is to remove some of the rubbish that lies in the way of an appropriate understanding of rectificatory compensation, by arguing for the rejection of the counterfactual conception of compensation (CCC). Although contemporary theorists have relied upon this idea to a significant extent, the CCC is merely a popular assumption, having no positive argument in its support. Moreover, it can make rendering compensation impossible, and absurd notions of compensation can result from its use, results that may themselves constitute injustices. This latter difficulty is most troubling when the CCC is employed in large compensatory cases like the case of rectificatory compensation for the descendants of American slaves. I want to suggest that, taken together, the difficulties with the CCC yield sufficient reason for rejecting it as an acceptable rectificatory notion.
We study connections between classical asymptotic density, computability and computable enumerability. In an earlier paper, the latter two authors proved that there is a computably enumerable set A of density 1 with no computable subset of density 1. In the current paper, we extend this result in three different ways: the degrees of such sets A are precisely the nonlow c.e. degrees; there is a c.e. set A of density 1 with no computable subset of nonzero density; and there is a c.e. set A of density 1 such that every subset of A of density 1 is of high degree. We also study the extent to which c.e. sets A can be approximated by their computable subsets B in the sense that A \ B has small density. There is a very close connection between the computational complexity of a set and the arithmetical complexity of its density, and we characterize the lower densities, upper densities and densities of both computable and computably enumerable sets. We also study the notion of "computable at density r" where r is a real in the unit interval. Finally, we study connections between density and classical smallness notions such as immunity, hyperimmunity, and cohesiveness.
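The density notion this abstract builds on can be illustrated concretely. A minimal sketch (not from the paper): the density of a set A up to n is |A ∩ {0, ..., n-1}| / n, and the asymptotic density is the limit of this quantity when the limit exists.

```python
import math

def partial_density(A, n: int) -> float:
    """Fraction of {0, ..., n-1} belonging to the set A, given as a predicate."""
    return sum(1 for k in range(n) if A(k)) / n

def evens(k: int) -> bool:
    return k % 2 == 0            # asymptotic density 1/2

def squares(k: int) -> bool:
    return math.isqrt(k) ** 2 == k  # asymptotic density 0

print(partial_density(evens, 10_000))    # 0.5
print(partial_density(squares, 10_000))  # 0.01, tending to 0 as n grows
```

For computable sets this partial density is itself computable at each n, but, as the results above show, the limiting behaviour of c.e. sets and their computable subsets can diverge sharply.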
Artificial Intelligence as a discipline has gotten bogged down in subproblems of intelligence. These subproblems are the result of applying reductionist methods to the goal of creating a complete artificial thinking mind. In Brooks (1987) I have argued that these methods will lead us to solve irrelevant problems: interesting as intellectual puzzles, but useless in the long run for creating an artificial being.
It is claimed that the universe appears to be fine-tuned so as to admit the development of life. In this article Rodney Holder examines the evidence for fine-tuning and the chief rival explanation to design, namely the existence of a.
"If we are to understand not only the direct impact of Marx on the development of German thought but also his sometimes extremely indirect influence, an exact knowledge of Hegel, of both his greatness and his limitation, is absolutely indispensable."—from the preface. It is well known that Hegel exerted a major influence on the development of Marx's thought. This circumstance led Lukács, one of the chief Marxist theoreticians of this century, to embark on his exploration of Hegelian antecedents in the German intellectual tradition, their concrete expression in the work of Hegel himself, and later syntheses of seemingly contradictory modes of thought. Four phases of Hegel's intellectual development are examined: "Hegel's early republican phase," "the crisis in Hegel's views on society and the earliest beginnings of his dialectical method," "rationale and defense of objective idealism," and "the breach with Schelling and _The Phenomenology of Mind_." Lukács completed this study in 1938, but because of the imminent outbreak of war, it was not published until the late 1940s. A revised German edition appeared in 1954, and it is this text that is the basis of this first English translation of the work.
"Structure and Sentiment is an important book. Reading it may make an anthropologist more keenly aware of certain issues that are crucial in social anthropology, and this awareness may make one's field work as well as one's reading of published ethnographies more perceptive."—F. G. Lounsbury, American Anthropologist "A theoretical and methodological essay of first importance. As such, the book should be of interest to all social scientists interested in the development of specific and general theory in social anthropology."—Southwestern Social Science Quarterly.
This book aims to help answer two questions that Western philosophy has paid relatively little attention to: what is injustice, and what does justice require when injustice occurs? Injustice and Rectification offers a taxonomy of justice, which sets forth an initial framework for a moral theory of justice and focuses on framing a conception of rectificatory justice. The taxonomy is the ground for the book's eleven other essays, in which a diverse group of authors brings philosophical analysis to bear on the idea of injustice itself and on some important conceptual and normative issues concerning the rectification of injustice.
It is shown that, for certain classes of cosmological model which either postulate or give rise to infinitely many universes, only a measure zero subset of the set of possible universes above a given size can in fact be physically realized. It follows that claims to explain the fine tuning of our universe on the basis of such models by appeal to the existence of all possible universes fail.
The German theologian and martyr Dietrich Bonhoeffer is not widely known for engaging with scientific thought, having been heavily influenced by Karl Barth's celebrated stance against natural theology. However, during the period of his maturing theology in prison Bonhoeffer read a significant scientific work, Carl Friedrich von Weizsäcker's The World View of Physics. From this he gained two major insights for his theological outlook. First, he realized that the notion of a "God of the gaps" is futile, not just in science but in other areas of human inquiry. Second, he felt that an infinite universe, as considered by science, would be self-subsistent and could exist as if there were no God. Bonhoeffer replaced Barth's radical critique of religion with the even more extreme view that it is a mere passing phase in history that grown-up humanity can dispense with. At the same time Bonhoeffer began an important critique of Barth's reaction, namely, the latter's retreat to a "positivism of revelation." While Bonhoeffer did not go quite as far as one might like, his approach opened up hopeful avenues for an answer to "the liberal question" and even a revived place for some kind of natural theology.
This is the first brief description of a project aimed at searching for the neural correlates of consciousness through computer simulation. The underlying model is based on the known circuitry of the mammalian nervous system, the neuronal groups of which are approximated as binary composite units. The simulated nervous system includes just two senses, hearing and touch, and it drives a set of muscles that serve vocalisation, feeding and bladder control. These functions were chosen because of their relevance to the earliest stages of human life, and the simulation has been given the name CyberChild. The system's pain receptors respond to a sufficiently low milk level in the stomach, if there is simultaneously a low level of blood sugar, and also to a full bladder and an unchanged diaper. It is believed that it may be possible to infer the presence of consciousness in the simulation through observations of CyberChild's behaviour, and from the monitoring of its ability to ontogenetically acquire novel reflexes. The author has suggested that this ability is the crucial evolutionary advantage of possessing consciousness. The project is still in its very early stages, and although no suggestion of consciousness has yet emerged, there appears to be no fundamental reason why consciousness could not ultimately develop and be observed.
One of the central themes of inquiry for Karl Barth, the twentieth-century Protestant theologian, was the notion of revelation. Although he was suspicious of natural theology, recent scientific advances and the flourishing modern dialogue between science and religion offer compelling reasons to revisit Barth's thinking on the concept. We must again ask whether and how it might be possible to hold on to the notion of revelation whilst employing reason and scientific evidence in the justification of belief. In The Heavens Declare, author Rodney Holder re-examines Barth's argument concerning natural theology and then explores how it has been critiqued and responded to by others, starting with Dietrich Bonhoeffer and Wolfhart Pannenberg. Holder then considers the contributions of two notable British participants in the science-religion dialogue, Thomas Torrance and Alister McGrath, who, despite their repudiation of natural theology in the traditional sense, also provide many positive lessons. The book concludes by defending an overall position which takes into account the ideas of the aforementioned theologians as well as others who are currently engaged positively in natural theology, such as John Polkinghorne and Richard Swinburne. Holder's new study is sure to be of interest to theologians, philosophers of religion, and all scholars interested in the science-religion dialogue, especially those interested in natural theology as an enterprise in itself.
Schnorr randomness is a notion of algorithmic randomness for real numbers closely related to Martin-Löf randomness. After its initial development in the 1970s the notion received considerably less attention than Martin-Löf randomness, but recently interest has increased in a range of randomness concepts. In this article, we explore the properties of Schnorr random reals, and in particular the c.e. Schnorr random reals. We show that there are c.e. reals that are Schnorr random but not Martin-Löf random, and provide a new characterization of Schnorr random real numbers in terms of prefix-free machines. We prove that unlike Martin-Löf random c.e. reals, not all Schnorr random c.e. reals are Turing complete, though all are in high Turing degrees. We use the machine characterization to define a notion of "Schnorr reducibility" which allows us to calibrate the Schnorr complexity of reals. We define the class of "Schnorr trivial" reals, which are ones whose initial segment complexity is identical with the computable reals, and demonstrate that this class has non-computable members.
The accounting profession's image and reputation are built upon the members of the profession acting with the "highest sense of integrity" in "the public interest" (AICPA, 2003, www.aicpa.org/about). The Enron debacle initiated the latest crisis facing the profession regarding its image and reputation. The American Institute of Certified Public Accountants (AICPA) is the largest professional body representing the accounting profession and the one to which regulators have looked in establishing and upholding professional standards relating to the public practice of accounting and auditing. One of the AICPA's responsibilities is to "promote public awareness and confidence in the integrity, objectivity, competence and professionalism of CPAs ...." (AICPA, 2003, www.aicpa.org/about). We analyze the public statements issued by the AICPA (i.e., press releases, speeches of officers, testimony, published articles) during this ethical and identity crisis, beginning with the AICPA's first public statement on the Enron debacle (AICPA, 2001) and concluding with the AICPA recognizing the need for a "new accounting culture" (Melancon, 2002). In order to better understand the public discourse, we use image restoration theory (Benoit, 1995), because it provides a typology of strategies for dealing with the public face of crises. We identify the three most common strategies the AICPA employs during this period. Proposals for taking corrective action represent the most commonly employed strategy, but the analysis also indicates an attempt to evade responsibility by claiming defeasibility and to reduce the offensiveness of the situation by employing a bolstering strategy. A second analysis using DICTION, a software package useful in revealing latent dimensions in a text, indicates that early statements tend to use language related to accomplishing specific action, while the later statements tend more toward general language that relates to people's everyday lives.
The findings raise questions as to the substantive nature of the changes proposed by the AICPA and, thus, the extent to which the public interest is being served by them.