The purpose of this paper and its sister paper I (Farrell and Hooker, a) is to present, evaluate and elaborate a proposed new model for the process of scientific development: self-directed anticipative learning (SDAL). The vehicle for its evaluation is a new analysis of a well-known historical episode: the development of ape-language research. Paper I examined the basic features of SDAL in relation to the early history of ape-language research. In this second paper we examine the reconceptualization of ape-language research following what many conceived to be Terrace's refutation of ape language. We show that the apparent 'revolution' in our understanding of ape linguistic capacities was not based upon 'revolutionary' research different in kind from 'normal' research. The same process of self-directed interactive exploration of possibility space, which enables a homing-in upon both error and success, is present in all phases of productive science. Moreover, conceiving science as an SDAL process explains how scientists learn how to learn about their research domain.
What are the appropriate criteria for assessing a theory of morality? In this enlightening work, Brad Hooker begins by answering this question. He then argues for a rule-consequentialist theory which, in part, asserts that acts should be assessed morally in terms of impartially justified rules. In the end, he considers the implications of rule-consequentialism for several current controversies in practical ethics, making this clearly written, engaging book the best overall statement of this approach to ethics.
The purpose of this paper and its sister paper (Farrell and Hooker, b) is to present, evaluate and elaborate a proposed new model for the process of scientific development: self-directed anticipative learning (SDAL). The vehicle for its evaluation is a new analysis of a well-known historical episode: the development of ape-language research. In this first paper we outline five prominent features of SDAL that will need to be realized in applying SDAL to science: 1) interactive exploration of possibility space; 2) self-directedness; 3) localization of success and error; 4) synergistic increase in learning capacity; and 5) continuity of SDAL process across scientific change. In this paper we examine the first three features of SDAL in relation to the early history of ape-language research. We show that this history is readily explicated as a self-directed, ever-finer delineation of possibility space that enables the localization of both success and error. Paper II examines the last two features against this history.
This book focuses on showing how the ideas central to the new wave of dynamic systems studies may also form the basis for a new and distinctive theory of human development, one in which both global order and local variability in behaviour emerge together from the same organising dynamical interactions. This also sharpens our understanding of the weaknesses of the traditional formal, structuralist theories. Conversely, dynamical models have their own matching set of problems, many of which are consciously explored here. Less readily acknowledged, the youthfulness of this field means that many of the studies presented here struggle to pass beyond speculative metaphor. Nonetheless, the field is shown to be one of vigour, intelligence and great promise.
Fixed-rate versions of rule-consequentialism and rule-utilitarianism evaluate rules in terms of the expected net value of one particular level of social acceptance, but one far enough below 100% social acceptance to make salient the complexities created by partial compliance. Variable-rate versions of rule-consequentialism and rule-utilitarianism instead evaluate rules in terms of their expected net value at all different levels of social acceptance. Brad Hooker has advocated a fixed-rate version. Michael Ridge has argued that the variable-rate version is better. The debate continues here. Of particular interest is the difference between the implications of Hooker's and Ridge's rules about doing good for others.
The term ‘moral particularism’ has been used to refer to different doctrines. The main body of this paper begins by identifying the most important doctrines associated with the term, at least as the term is used by Jonathan Dancy, on whose work I will focus. I then discuss whether holism in the theory of reasons supports moral particularism, and I call into question the thesis that particular judgements have epistemological priority over general principles. Dancy’s recent book Ethics without Principles (Dancy 2004) makes much of a distinction between reasons, enablers, disablers, intensifiers, and attenuators. I will suggest that the distinction is unnecessary, and I will argue that, even if there is such a distinction, it does not entail moral particularism. In the final two sections, I try to give improved versions of arguments against particularism that I put forward in my paper ‘Moral Particularism: Wrong and Bad’ (Hooker 2000b: 1–22, esp. pp. 7–11, 15–22).
Rule-consequentialism has been accused of either collapsing into act-consequentialism or being internally inconsistent. I have tried to develop a form of rule-consequentialism without these flaws. In this June's issue of Utilitas, Robert Card argued that I have failed. Here I assess his arguments.
All the major inter-theoretic relations of fundamental science are asymptotic ones, e.g. quantum theory as Planck's constant h → 0, yielding (roughly) Newtonian mechanics. Thus asymptotics ultimately grounds claims about inter-theoretic explanation, reduction and emergence. This paper examines four recent, central claims by Batterman concerning asymptotics and reduction. While these claims are criticised, the discussion is used to develop an enriched, dynamically-based account of reduction and emergence, to show its capacity to illuminate the complex variety of inter-theory relationships in physics, and to provide a principled resolution to such persistent philosophical problems as multiple realisability and the nature of the special sciences. Contents: Introduction; Exposition; Examination I: claims (1) and (2), asymptotic explanation and reference; Examination II: claim (3), reduction and singular asymptotics; Examination III: claim (4), emergence and multiple realisability; Conclusion.
This paper replies to Carson's attacks on an earlier paper of Hooker's. Carson argued that rule-consequentialism--the theory that an act is morally right if and only if it is allowed by the set of rules and corresponding virtues the having of which by everyone would bring about the best consequences considered impartially--can and does require the comfortably off to make enormous sacrifices in order to help the needy. Hooker defends rule-consequentialism against Carson's arguments.
Donald Campbell has long advocated a naturalist epistemology based on a general selection theory, with the scope of knowledge restricted to vicarious adaptive processes. But being a vicariant is problematic because it involves an unexplained epistemic relation. We argue that this relation is to be explicated organizationally in terms of the regulation of behavior and internal state by the vicariant, but that Campbell's selectionist approach can give no satisfactory account of it because it is opaque to organization. We show how organizational constraints and capacities are crucial to understanding both evolution and cognition and conclude with a proposal for an enriched, generalized model of evolutionary epistemology that places high-order regulatory organization at the center.
It is argued that fundamental to Piaget's life works is a biologically based naturalism in which the living world is a nested complex of self-regulating, self-organising (constructing) adaptive systems. A structuralist-rationalist overlay on this core position is distinguished and it is shown how it may be excised without significant loss of content or insight. A new and richer conception of the nature of Piaget's genetic epistemology emerges, one which enjoys rich interrelationships with evolutionary epistemology. These are explored and it is shown how a regulatory systems evolutionary epistemology may be embedded within genetic epistemology.
In his book Minimal Rationality (1986), Christopher Cherniak draws deep and widespread conclusions from our finitude, not only for philosophy but also for a wide range of science as well. Cherniak's basic idea is that traditional philosophical theories of rationality represent idealisations that are inaccessible to finite rational agents. It is the purpose of this paper to apply a theory of idealisation in science to Cherniak's arguments. The heart of the theory is a distinction between idealisations that represent reversible, solely quantitative simplifications and those that represent irreversible, degenerate idealisations which collapse out essential theoretical structure. I argue that Cherniak's position is best understood as assigning the latter status to traditional rationality theories and that, so understood, his arguments may be illuminated, expanded, and certain common criticisms of them rebutted. The result, however, is a departure from traditional, formalist theories of rationality of a more radical kind than Cherniak contemplates, with widespread ramifications for philosophical theory, especially philosophy of science itself.
Error is protean, ubiquitous and crucial in scientific process. In this paper it is argued that understanding scientific process requires what is currently absent: an adaptable, context-sensitive functional role for error in science that naturally harnesses error identification and avoidance to positive, success-driven, science. This paper develops a new account of scientific process of this sort, error and success driving Self-Directed Anticipative Learning (SDAL) cycling, using a recent re-analysis of ape-language research as test example. The example shows the limitations of other accounts of error, in particular Mayo’s (Error and the growth of experimental knowledge, 1996) error-statistical approach, and SDAL cycling shows how they can be fruitfully contextualised.
The Aristotle-Kant tradition requires that autonomous activity must originate within the self and points toward a new type of causation (different from natural efficient causation) associated with teleology. Notoriously, it has so far proven impossible to uncover a workable model of causation satisfying these requirements without an increasingly unsatisfying appeal to extra-physical elements tailor-made for the purpose. In this paper we first provide the essential reason why the standard linear model of efficient causation cannot support the required model of agency: its causal thread model of efficient causation cannot support the core requirement that an action is determined by, and thus an expression of, the agent’s nature. We then provide a model that corrects these deficiencies, constructed naturalistically from within contemporary biology, and argue that it provides an appropriate foundation for all the features of genuine agency. Further, we provide general characterisations of freedom and reason suitable to this bio-context (but that also capture the core classical conceptions) and show how this model reconciles them.
An explicit philosophy and meta-philosophy of positivism, empiricism and Popperianism is provided. Early Popperianism is argued to be essentially a form of empiricism, and its deviations from empiricism are traced. In contrast, the meta-philosophy and philosophy of an evolutionary naturalistic realism is developed, and it is shown how the maximal conflict of this doctrine with all forms of empiricism at the meta-philosophical level both accounts for the form of its development at the philosophical level and its defense against attack from nonrealist quarters. Following an earlier article on realism of similar theme (Synthese 26 (1974), 409), this paper then further explores the ramifications of a thoroughgoing realist position.
The role of interaction in learning is essential and profound: it must provide the means to solve open problems (those only vaguely specified in advance), but cannot be captured using our familiar formal cognitive tools. This presents an impasse to those confined to present formalisms; but interaction is fundamentally dynamical, not formal, and with its importance thus underlined it invites the development of a distinctively interactivist account of life and mind. This account is provided, from its roots in the interactivist biological constitution of life, through the evolution of the dual internal regulatory capacities expressed as intentionality and intelligence, to its expression in self-directed anticipative learning in persons and in science.
This paper outlines an original interactivist-constructivist (I-C) approach to modelling intelligence and learning as a dynamical embodied form of adaptiveness and explores some applications of I-C to understanding the way cognitive learning is realized in the brain. Two key ideas for conceptualizing intelligence within this framework are developed. These are: (1) intelligence is centrally concerned with the capacity for coherent, context-sensitive, self-directed management of interaction; and (2) the primary model for cognitive learning is anticipative skill construction. Self-directedness is a capacity for integrative process modulation which allows a system to "steer" itself through its world by anticipatively matching its own viability requirements to interaction with its environment. Because the adaptive interaction processes required of intelligent systems are too complex for effective action to be prespecified (e.g. genetically), learning is an important component of intelligence. A model of self-directed anticipative learning (SDAL) is formulated based on interactive skill construction, and argued to constitute a central constructivist process involved in cognitive development. SDAL illuminates the capacity of intelligent learners to start with the vague, poorly defined problems typically posed in realistic learning situations and progressively refine them, transforming them into problems with sufficient structure to guide the construction of a solution. Finally, some of the implications of I-C for modelling of the neuronal basis of intelligence and learning are explored; in particular, Quartz and Sejnowski's recent neural constructivism paradigm, enriched by Montague and Sejnowski's dopaminergic model of anticipative-predictive neural learning, is assessed as a promising, but incomplete, contribution to this approach.
The paper concludes with a fourfold reflection on the divergence in cognitive modelling philosophy between the I-C and the traditional computational information processing approaches.
Some things aren't what their names suggest. This is true of rubber ducks, stool pigeons, clay pigeons, hot dogs, and clothes horses. Is it true of rule-consequentialism? Frances Howard-Snyder's "Rule Consequentialism is a Rubber Duck" (APQ 30 (1993): 271–78) argues that the answer is Yes. Howard-Snyder thinks rule-consequentialism is a form of deontology, not a form of consequentialism. This thought is understandable: many recent definitions of consequentialism are such as to invite it. Thinking rule-consequentialism inferior to act-consequentialism, many philosophers, when discussing consequentialism, have had act-consequentialism in mind. Having just one kind of consequentialism in mind has led them to offer definitions of consequentialism that are really definitions of just act-consequentialism. My paper discusses three different possible definitions of consequentialism and defends one that does justice to rule-consequentialism's family membership.
A slow revolution in cognitive science is banishing this century's technological conception of mind as disembodied pure thought, namely material symbol manipulation, and replacing it with next century's conception: mind as the organisation of bodily interaction, intelligent robotics. Here is Clark: Intelligence and understanding are rooted not in the presence and manipulation of explicit, language-like data structures, but in something more earthy: the tuning of basic responses to a real world that enables an embodied organism to sense, act and survive . . . it is now increasingly clear that the alternative to the "disembodied explicit data manipulation" vision of AI is not to retreat from hard science; it is to pursue some even harder science. It is to put intelligence where it belongs: in the coupling of organisms and the world that is at the root of daily, fluent action. (p. 4) AAHPSSS, 1998.
Contrary to the Empiricist model of science, successful sufficiently fundamental theories not only fit and unify their data fields but also prescribe the general terms in which observation is relevantly to be described; specify what is and is not observable; specify the conditions under which what is observable is observable; specify the instrumental means and reliability by which what is measurable is measured; and specify what is causally, statistically, and merely accidentally connected. Moreover, such theories typically require all or most of the entire remainder of science to be properly applied in any given situation, and theoreticians' models play a crucial role in such applications. (I call these respectively the internal and external globalnesses of theories.) A discussion of the consequences of these global features of theories for philosophy of science is offered in the context of specific examples and a structural model for science.
It is shown how the development of physics has involved making explicit what were homocentric projections which had heretofore been implicit, indeed inexpressible in theory. This is shown to support a particular notion of the invariant as the real. On this basis the divergence in ideals of physical intelligibility between Bohr and Einstein is set out. This in turn leads to divergent, but explicit, conceptions of objectivity and completeness for physical theory. *I am indebted to Dr. G. McLelland, Professor F. Rohrlich and an anonymous referee of this journal for several improvements in the formulation of the paper.
D. H. Sharp has recently argued that Einstein, Podolsky, and Rosen failed to make good their claim that elementary quantum theory provides only an incomplete description of physical reality. Sharp expounds in detail three criticisms (a fourth is mentioned) which focus largely on formal features of the quantum theory. I argue, on grounds centered largely in our search for an adequate physical understanding of the micro domain, that each of these criticisms must be rejected. The original criticism of quantum theory reemerges as a still-important baseline in our search for an adequate understanding of quantum theory.
Solidarity-the reciprocal relations of trust and obligation between citizens that are essential for a thriving polity-is a basic goal of all political communities. Yet it is extremely difficult to achieve, especially in multiracial societies. In an era of increasing global migration and democratization, that issue is more pressing than perhaps ever before. In the past few decades, racial diversity and the problems of justice that often accompany it have risen dramatically throughout the world. It features prominently nearly everywhere: from the United States, where it has been a perennial social and political problem, to Europe, which has experienced an unprecedented influx of Muslim and African immigrants, to Latin America, where the rise of vocal black and indigenous movements has brought the question to the fore. Political theorists have long wrestled with the topic of political solidarity, but they have not had much to say about the impact of race on such solidarity, except to claim that what is necessary is to move beyond race. The prevailing approach has been: How can a multicultural and multiracial polity, with all of the different allegiances inherent in it, be transformed into a unified, liberal one? Juliet Hooker flips this question around. In multiracial and multicultural societies, she argues, the practice of political solidarity has been indelibly shaped by the social fact of race. The starting point should thus be the existence of racialized solidarity itself: How can we create political solidarity when racial and cultural diversity are more or less permanent? Unlike the tendency to claim that the best way to deal with the problem of racism is to abandon the concept of race altogether, Hooker stresses the importance of coming to terms with racial injustice, and explores the role that it plays in both the United States and Latin America.
Coming to terms with the lasting power of racial identity, she contends, is the starting point for any political project attempting to achieve solidarity.
The relationship of the identity of indiscernibles principle to other major metaphysical principles (e.g., the relational doctrine of space and time, the elimination of singular terms) is discussed, the aim being to outline the necessary requirements of a systematic metaphysics incorporating the former principle. The conclusion is that no adequate systematic metaphysics of this sort is defensible. Throughout, special attention is paid to the modern logical formulation of the principle and to its role in Quine's philosophy.
The thesis of this paper is that scientific method is to be thought of as a complex many-leveled regulatory hierarchy of principles, interacting with theory also viewed as a complex many-leveled hierarchy. This conception of method is illustrated in particular through one episode in the contemporary development of plasma physics, and related to others. It provides for method-theory interaction and for the development of method itself as science develops.
It is now commonly accepted that N. Goodman's predicate "grue" presents the theory of confirmation of C. G. Hempel (and other such theories) with grave difficulties. The precise nature and status of these "difficulties" has, however, never been made clear. In this paper it is argued that it is very unlikely that "grue" raises any formal difficulties for Hempel; appearances to the contrary are examined and rejected, and an explanation of their intuitive appeal is offered. However, "grue" is shown to raise an informal, "over-arching" difficulty of great magnitude for all theories of confirmation, including Hempel's theory.