There are important ethical issues that must be carefully thought through when undertaking research with children. This paper explores how the context of such issues changes with the individual circumstances of the children involved, particularly when they are marginalised or excluded by wider society. By reflecting on experiences of research with Kampala street children, this paper highlights how participation throughout the research process can both raise and resolve ethical dilemmas. This is illustrated by reflecting on two examples, namely discussing sensitive topics and the dissemination of socio-spatial research findings. In conclusion, the paper demonstrates the importance of ethical sensitivity to the changing situations that arise when conducting research with street children and the importance of incorporating and involving them in both the research process and ethical dilemmas.
When a chimpanzee stockpiles rocks as weapons or when a frog sends out mating calls, we might easily assume these animals know their own motivations--that they use the same psychological mechanisms that we do. But as Beyond the Brain indicates, this is a dangerous assumption because animals have different evolutionary trajectories, ecological niches, and physical attributes. How do these differences influence animal thinking and behavior? Removing our human-centered spectacles, Louise Barrett investigates the mind and brain and offers an alternative approach for understanding animal and human cognition. Drawing on examples from animal behavior, comparative psychology, robotics, artificial life, developmental psychology, and cognitive science, Barrett provides remarkable new insights into how animals and humans depend on their bodies and environment--not just their brains--to behave intelligently. Barrett begins with an overview of human cognitive adaptations and how these color our views of other species, brains, and minds. Considering when it is worth having a big brain--or indeed having a brain at all--she investigates exactly what brains are good at. Showing that the brain's evolutionary function guides action in the world, she looks at how physical structure contributes to cognitive processes, and she demonstrates how these processes employ materials and resources in specific environments. Arguing that thinking and behavior constitute a property of the whole organism, not just the brain, Beyond the Brain illustrates how the body, brain, and cognition are tied to the wider world.
Jeffrey Barrett presents the most comprehensive study yet of a problem that has puzzled physicists and philosophers since the 1930s. Quantum mechanics is in one sense the most successful physical theory ever, accurately predicting the behaviour of the basic constituents of matter. But it has an apparent ambiguity or inconsistency at its heart; Barrett gives a careful, clear, and challenging evaluation of attempts to deal with this problem.
My purpose in this essay is to describe and define the ways in which Afro-American women intellectuals, in the last decade of the nineteenth century, theorized about the possibilities and limits of patriarchal power through its manipulation of racialized and gendered social categories and practices. The essay is especially directed toward two academic constituencies: the practitioners of Afro-American cultural analysis and of feminist historiography and theory. The dialogue with each has its own peculiar form, characterized by its own specific history; yet both groups are addressed in an assertion of difference, of alterity, and in a voice characterized by an anger dangerously self-restrained. For it is not in the nature of Caliban to curse; rather, like Caliban, the black woman has learned from the behaviour of her master and mistress that if accommodation results in a patronizing loosening of her bonds, liberation will be more painful. Hazel V. Carby is assistant professor of English at Wesleyan University. She is the coauthor of The Empire Strikes Back: Race and Racism in Seventies Britain and the author of Uplifting as They Write: The Emergence of the Afro-American Woman Novelist.
Jeffrey Barrett presents the most comprehensive study yet of a problem that has puzzled physicists and philosophers since the 1930s. The standard theory of quantum mechanics is in one sense the most successful physical theory ever, predicting the behaviour of the basic constituents of all physical things; no other theory has ever made such accurate empirical predictions. However, if one tries to understand the theory as providing a complete and accurate framework for the description of the behaviour of all physical interactions, it becomes evident that the theory is ambiguous, or even logically inconsistent. The most notable attempt to formulate the theory so as to deal with this problem, the quantum measurement problem, was initiated by Hugh Everett III in the 1950s. Barrett gives a careful and challenging examination and evaluation of the work of Everett and those who have followed him. His informal approach, minimizing technicality, will make the book accessible and illuminating for philosophers and physicists alike. Anyone interested in the interpretation of quantum mechanics should read it.
We relate Popper functions to regular, perfectly additive non-Archimedean probability functions by means of a representation theorem: every such non-Archimedean probability function is infinitesimally close to some Popper function, and vice versa. We also show that regular and perfectly additive non-Archimedean probability functions can be given a lexicographic representation. Thus Popper functions, a specific kind of non-Archimedean probability function, and lexicographic probability functions triangulate to the same place: they are in a good sense interchangeable.
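One standard way of seeing the correspondence at issue (a background sketch, not the paper's own theorem statement) runs through the standard-part map st: given a regular, perfectly additive non-Archimedean probability function P, one obtains a Popper-style conditional probability by taking standard parts of ratios,
\[
C(A \mid B) \;=\; \mathrm{st}\!\left(\frac{P(A \wedge B)}{P(B)}\right),
\]
which is well defined for every consistent B because regularity guarantees P(B) > 0 (possibly infinitesimal). The representation theorem reported above says, conversely, that every Popper function is infinitesimally close to one arising in this way from some such P.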
Glymour and Quine propose two different formal criteria for theoretical equivalence. In this paper we examine the relationships between these criteria.
Modularity has been the subject of intense debate in the cognitive sciences for more than 2 decades. In some cases, misunderstandings have impeded conceptual progress. Here the authors identify arguments about modularity that either have been abandoned or were never held by proponents of modular views of the mind. The authors review arguments that purport to undermine modularity, with particular attention on cognitive architecture, development, genetics, and evolution. The authors propose that modularity, cleanly defined, provides a useful framework for directing research and resolving debates about individual cognitive systems and the nature of human evolved cognition. Modularity is a fundamental property of living things at every level of organization; it might prove indispensable for understanding the structure of the mind as well.
The book explores and explains the relationship between law and ethics in the context of medically related research in order to provide a practical guide to understanding for members of research ethics committees (RECs), professionals involved with medical research and those with an academic interest in the subject.
Logicians and philosophers of science have proposed various formal criteria for theoretical equivalence. In this paper, we examine two such proposals: definitional equivalence and categorical equivalence. In order to show precisely how these two well-known criteria are related to one another, we investigate an intermediate criterion called Morita equivalence.
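A textbook-style illustration of definitional equivalence (not an example from the paper itself): the theory of non-strict partial orders and the theory of strict partial orders are definitionally equivalent, since each primitive is explicitly definable from the other by mutually inverse translations,
\[
x < y \;\leftrightarrow\; (x \leq y \,\wedge\, x \neq y), \qquad x \leq y \;\leftrightarrow\; (x < y \,\vee\, x = y).
\]
Morita equivalence, roughly, liberalizes this notion by also permitting the definition of new sorts (for example product sorts, subsorts, and quotient sorts), which is what allows it to sit between definitional and categorical equivalence.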
Recent work in cognitive science of religion (CSR) is beginning to converge on a very interesting thesis—that, given the ordinary features of human minds operating in typical human environments, we are naturally disposed to believe in the existence of gods, among other religious ideas (e.g., see Atran [2002], Barrett [2004; 2012], Bering [2011], Boyer [2001], Guthrie [1993], McCauley [2011], Pyysiäinen [2004; 2009]). In this paper, we explore whether such a discovery ultimately helps or hurts the atheist position—whether, for example, it lends credence to atheism by explaining away religious belief or whether it actually strengthens some already powerful arguments against atheism in the relevant philosophical literature. We argue that the recent discoveries of CSR hurt, not help, the atheist position—that CSR, if anything, should not give atheists epistemic assurance.
What would it mean to apply quantum theory, without restriction and without involving any notion of measurement and state reduction, to the whole universe? What would realism about the quantum state then imply? This book brings together an illustrious team of philosophers and physicists to debate these questions. The contributors broadly agree on the need, or aspiration, for a realist theory that unites micro- and macro-worlds. But they disagree on what this implies. Some argue that if unitary quantum evolution has unrestricted application, and if the quantum state is taken to be something physically real, then this universe emerges from the quantum state as one of countless others, constantly branching in time, all of which are real. The result, they argue, is many worlds quantum theory, also known as the Everett interpretation of quantum mechanics. No other realist interpretation of unitary quantum theory has ever been found. Others argue in reply that this picture of many worlds is in no sense inherent to quantum theory, or fails to make physical sense, or is scientifically inadequate. The stuff of these worlds, what they are made of, is never adequately explained, nor are the worlds precisely defined; ordinary ideas about time and identity over time are compromised; no satisfactory role or substitute for probability can be found in many worlds theories; they can't explain experimental data; anyway, there are attractive realist alternatives to many worlds. Twenty original essays, accompanied by commentaries and discussions, examine these claims and counterclaims in depth. They consider questions of ontology - the existence of worlds; probability - whether and how probability can be related to the branching structure of the quantum state; alternatives to many worlds - whether there are one-world realist interpretations of quantum theory that leave quantum dynamics unchanged; and open questions even given many worlds, including the multiverse concept as it has arisen elsewhere in modern cosmology. A comprehensive introduction lays out the main arguments of the book, which provides a state-of-the-art guide to many worlds quantum theory and its problems.
In this article, I examine whether or not the Hamiltonian and Lagrangian formulations of classical mechanics are equivalent theories. I do so by applying a standard for equivalence that was recently introduced into philosophy of science by Halvorson and Weatherall. This case study yields three general philosophical payoffs. The first concerns what a theory is, while the second and third concern how we should interpret what our physical theories say about the world. 1 Introduction 2 When Are Two Theories Equivalent? 3 Preliminaries on Classical Mechanics 3.1 Hamiltonian mechanics 3.2 Lagrangian mechanics 4 Are Hamiltonian and Lagrangian Mechanics Equivalent Theories? 4.1 Tangent bundle versus cotangent bundle 4.2 Tangent bundle versus symplectic manifold 4.3 Lagrangian vector field versus Hamiltonian vector field 5 Conclusion Appendix.
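For readers who want the two formulations side by side, the standard bridge between them is the Legendre transform (textbook material, included only as background to the abstract): a Lagrangian L(q, \dot q) on the tangent bundle determines a Hamiltonian H(q, p) on the cotangent bundle via
\[
p_i = \frac{\partial L}{\partial \dot q^i}, \qquad H(q, p) = \sum_i p_i\, \dot q^i - L(q, \dot q),
\]
and the Euler–Lagrange equations \(\tfrac{d}{dt}\,\partial L/\partial \dot q^i = \partial L/\partial q^i\) then correspond to Hamilton's equations \(\dot q^i = \partial H/\partial p_i\), \(\dot p_i = -\partial H/\partial q^i\). Whether this familiar translation amounts to theoretical equivalence in Halvorson and Weatherall's sense is precisely the question the article takes up.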
Some years ago I came across the following question thrown out almost casually in the course of discussion: How many of us, it was asked, want to call a ‘bad work of art’ a ‘work of art’? The question was clearly rhetorical; the author quite obviously did not consider that anyone in his right mind would suggest that a bad work of art was a work of art. This struck me as rather odd. Surely there can be good and bad works of art, just as there can be good and bad apples or good and bad men. An apple does not cease to be an apple just because it is bad, unless perhaps it has become thoroughly rotten; but the gardener who says ‘The Coxes are bad this year’ does not mean that they have grown rotten on the trees, much less that they are not apples at all. Moreover, if so-called bad works of art are not works of art, what are they? You may not think highly of the works in the Royal Academy Summer Exhibition but they are not totally dissimilar to some works in Bond Street next door which are highly regarded.
It is over forty years since Merleau-Ponty published his first major work, La Structure du comportement, and a quarter of a century since he died. He belongs, therefore, with Sartre and Marcel, to the first post-War generation of French philosophers. Like his friend Sartre's, his philosophy may be regarded as dated, passé, of no interest or relevance to truly contemporary thought. In philosophical terms forty years are nothing; in terms of trends, fashions and novelties they are an eternity. But perhaps the work of Merleau-Ponty has not dated because it was never in vogue. He did not write plays and novels, or take part in political demonstrations, though he was involved in politics, or win a Nobel prize and refuse to receive it. He was very much a philosopher's philosopher, eminent in his field, well known in academic circles in France but hardly a household name. In this country he is hardly known even in philosophical circles, except by name. More's the pity, since his philosophical approach and manner of philosophizing have much in common with certain modes of British philosophizing, as I hope to show.
To talk of a logic of mysticism may sound distinctly odd. If anything, mysticism is alogical; it would be uncharitable if not false, on mature consideration, to call it illogical—though many, without due deliberation, might be tempted to use that term. Wittgenstein comes close to calling it illogical. In his lecture on ethics he draws attention to the logical oddity of statements of absolute value. But he does not accuse the mystics or prophets or religious teachers of contradicting themselves or of invalid reasoning. What he accuses them of may be something worse, namely, talking nonsense, of not giving sense to the words they use or the expressions they utter. Russell and Ayer come to much the same conclusion but by a different route.
The standard view is that the Lagrangian and Hamiltonian formulations of classical mechanics are theoretically equivalent. Jill North, however, argues that they are not. In particular, she argues that the state-space of Hamiltonian mechanics has less structure than the state-space of Lagrangian mechanics. I will isolate two arguments that North puts forward for this conclusion and argue that neither yet succeeds. 1 Introduction 2 Hamiltonian State-space Has Less Structure than Lagrangian State-space 2.1 Lagrangian state-space is metrical 2.2 Hamiltonian state-space is symplectic 2.3 Metric > symplectic 3 Hamiltonian State-space Does Not Have Less Structure than Lagrangian State-space 3.1 Lagrangian state-space has less than metric structure 3.1.1 A potential worry 3.1.2 General Lagrangians 3.2 Hamiltonian state-space has more than symplectic structure 3.2.1 A dual structure on the Hamiltonian state-space 3.2.2 Simple Hamiltonians 3.3 Comparing Lagrangian and Hamiltonian structures 3.3.1 Counting mathematical structure 3.3.2 Symplectic and metric structure are incomparable 4 An Alternative Argument for LS 4.1 The argument for P3* 4.2 Symplectic manifold vs. tangent bundle structure 4.3 Trying to patch up the argument for P3* 5 Interpreting Mathematical Structure 5.1 State-space realism 5.2 Model isomorphism and theoretical equivalence 6 Conclusion.
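To fix ideas about the structures being compared (a gloss on standard geometric mechanics, not on North's or the author's specific arguments): the Hamiltonian state-space is the cotangent bundle T^*Q carrying the canonical symplectic form, while for kinetic-energy Lagrangians the tangent bundle TQ carries a metric supplied by the mass matrix,
\[
\omega = \sum_i dq^i \wedge dp_i \ \ \text{on } T^*Q, \qquad g(\dot q, \dot q) = \sum_{i,j} m_{ij}\, \dot q^i \dot q^j \ \ \text{on } TQ,
\]
and the debate summarized in the table of contents above is over whether, and how, these two kinds of structure can be compared in amount.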
This paper presents novel data regarding the logophoric pronoun in Ewe. We show that, contrary to what had been assumed in the absence of the necessary fieldwork, Ewe logophors are not obligatorily interpreted de se. We discuss the prima facie rather surprising nature of this discovery given the assumptions that de se construals arise via binding of the pronoun by an abstraction operator in the left periphery of the clausal complement of an attitude predicate, and that logophors are elements that are obligatorily bound by such abstractors. We show that this approach can be reconciled with these facts given the additional assumption that elements that are ‘de se’ bound can interact with the concept generator variables posited by Percus and Sauerland to derive de re interpretations of embedded nominals. The proposed set-up has consequences for our understanding of puzzles raised by Heim and Sharvit concerning binding-theoretic effects with de re elements, and for the derivation of the obligatorily de se interpretation of controlled PRO.
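Schematically, the binding-based view of de se that the paper presupposes can be sketched as follows (a simplified version of the standard property analysis, given only to fix ideas; the Ewe data and the concept-generator reconciliation are the paper's own). The complement of an attitude verb denotes a property obtained by abstracting over the attitude holder, and an obligatorily de se element is the variable bound by that abstractor:
\[
\text{Kofi believes } [\,\lambda x\, \lambda w .\ \mathrm{LOG}_x \text{ left in } w\,], \qquad
\llbracket \text{believe} \rrbracket(P)(\text{Kofi})(w_0) = 1 \ \text{ iff } \ \forall \langle x', w' \rangle \in \mathrm{Dox}_{\text{Kofi},\, w_0}:\ P(x')(w') = 1 .
\]
The paper's finding is that Ewe logophors nonetheless allow non-de se construals, which the authors accommodate by letting the bound element feed Percus and Sauerland's concept-generator mechanism for de re readings.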
According to the conceptual act theory, emotions emerge when physical sensations in the self and physical actions in others are meaningfully linked to situations during a process that can be called both cognitive and perceptual. There are four key hypotheses: an emotion is a conceptual category, populated with instances that are tailored to the environment; each instance of emotion is constructed within the brain’s functional architecture of domain-general core systems; the workings of each system must be holistically understood within the momentary state of the brain, the body, and the surrounding context; being emergent states, emotional episodes have functional features that physical states, alone, do not have. Similarities and differences to other theoretical approaches to emotion are discussed.
Experiences of emotion are content-rich events that emerge at the level of psychological description, but must be causally constituted by neurobiological processes. This chapter outlines an emerging scientific agenda for understanding what these experiences feel like and how they arise. We review the available answers to what is felt (i.e., the content that makes up an experience of emotion) and how neurobiological processes instantiate these properties of experience. These answers are then integrated into a broad framework that describes, in psychological terms, how the experience of emotion emerges from more basic processes. We then discuss the role of such experiences in the economy of the mind and behavior.
A dearth of clinical research involving children has resulted in off-licence and sometimes inappropriate medications being prescribed to the paediatric population. In this environment, recent years have seen the introduction of a raft of regulation aimed at increasing the involvement of children in clinical trials research and generating evidence-based medicinal preparations for their use. However, this regulation pays scant attention to the autonomy of competent minors. In particular, it makes no provision for the ability of competent minors to consent to participate in medical research and is therefore at odds with best ethical practice. This article explores the tensions between law and ethics in relation to clinical research involving minors and concludes that greater respect should be given to the autonomy of those minors who are competent to decide for themselves.
Mathematicians, physicists, and philosophers of physics often look to the symmetries of an object for insight into the structure and constitution of the object. My aim in this paper is to explain why this practice is successful. In order to do so, I present a collection of results that are closely related to (and in a sense, generalizations of) Beth’s and Svenonius’ theorems.
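For orientation, here is one standard statement of Beth's definability theorem, the sort of result the paper's collection generalizes: if a theory implicitly defines a predicate, it explicitly defines it. More precisely, let T be a first-order theory in a language L ∪ {R}; if any two models of T with the same L-reduct agree on the interpretation of R, then there is an L-formula \(\varphi(\bar x)\) such that
\[
T \models \forall \bar x\, \big( R(\bar x) \leftrightarrow \varphi(\bar x) \big).
\]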
Widely recognized as the finest definition of existentialist philosophy, this book introduced existentialism to America in 1958. Barrett discusses the views of 19th and 20th century existentialists Kierkegaard, Nietzsche, Heidegger, and Sartre and interprets the impact of their thinking on literature, art, and philosophy.
Intent and mitigating circumstances play a central role in moral and legal assessments in large-scale industrialized societies. Although these features of moral assessment are widely assumed to be universal, to date, they have only been studied in a narrow range of societies. We show that there is substantial cross-cultural variation among eight traditional small-scale societies (ranging from hunter-gatherer to pastoralist to horticulturalist) and two Western societies (one urban, one rural) in the extent to which intent and mitigating circumstances influence moral judgments. Although participants in all societies took such factors into account to some degree, they did so to very different extents, varying in both the types of considerations taken into account and the types of violations to which such considerations were applied. The particular patterns of assessment characteristic of large-scale industrialized societies may thus reflect relatively recently culturally evolved norms rather than inherent features of human moral judgment.
We consider how an epistemic network might self-assemble from the ritualization of the individual decisions of simple heterogeneous agents. In such evolved social networks, inquirers may be significantly more successful than they could be investigating nature on their own. The evolved network may also dramatically lower the epistemic risk faced by even the most talented inquirers. We consider networks that self-assemble in the context of both perfect and imperfect communication and compare the behaviour of inquirers in each. This provides a step in bringing together two new and developing research programs, the theory of self-assembling games and the theory of network epistemology.
The purported fact that geometric theories formulated in terms of points and geometric theories formulated in terms of lines are “equally correct” is often invoked in arguments for conceptual relativity, in particular by Putnam and Goodman. We discuss a few notions of equivalence between first-order theories, and we then demonstrate a precise sense in which this purported fact is true. We argue, however, that this fact does not undermine metaphysical realism.
Calls for the inclusion of women in clinical trials raise the obvious question: why have sponsors excluded them? The answer most often given is one tragically evocative word: Thalidomide. The tragedies of the children born with seal limbs because their mothers took this over-the-counter sleeping pill and cure for morning sickness showed that, contrary to previous perceptions, the placenta could not be depended upon to filter out toxins before they reached the fetus. The specter of birth defects spawned sponsors’ fears of a variety of catastrophes which contributed to closing the doors of clinical trials for women. This paper will not argue that the possibility of birth defects arising from the ingestion of an experimental drug does not exist. Sadly, scientists do not yet have the ability to predict which drugs will cause birth defects. Rather, it will argue that case law does not provide a basis for sponsor liability when a woman gives informed consent and the regulations governing clinical trials are followed.
The cases of Diane Pretty and Ms B. raise crucial issues about decision-making and autonomy at the end of life. Ms B. was permitted her wish to die rather than live permanently dependent upon a ventilator because her case was constructed as one about withholding consent to medical treatment, which every adult with capacity has a right to do. Mrs Pretty, however, sought active intervention to end her life. Requiring assistance to die, and claiming that this was her human right, she sought an assurance that her husband would not be prosecuted if he helped her. Her claim was rejected and the assurance refused. The cases prompt questions about the nature of autonomy, the influence of others and the different ways in which medical and legal decisions are made.
Currently, there is widespread skepticism that higher cognitive processes, given their apparent flexibility and globality, could be carried out by specialized computational devices, or modules. This skepticism is largely due to Fodor’s influential definition of modularity. From the rather flexible catalogue of possible modular features that Fodor originally proposed has emerged a widely held notion of modules as rigid, informationally encapsulated devices that accept highly local inputs and whose operations are insensitive to context. It is a mistake, however, to equate such features with computational devices in general and therefore to assume, as Fodor does, that higher cognitive processes must be non-computational. Of the many possible non-Fodorean architectures, one is explored here that offers possible solutions to computational problems faced by conventional modular systems: an ‘enzymatic’ architecture. Enzymes are computational devices that use lock-and-key template matching to identify relevant information (substrates), which is then operated upon and returned to a common pool for possible processing by other devices. Highly specialized enzymes can operate together in a common pool of information that is not pre-sorted by information type. Moreover, enzymes can use molecular ‘tags’ to regulate the operations of other devices and to change how particular substrates are construed and operated upon, allowing for highly interactive, context-specific processing. This model shows how specialized, modular processing can occur in an open system, and suggests that skepticism about modularity may largely be due to failure to consider alternatives to the standard model.
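The abstract's 'enzymatic' picture lends itself to a toy implementation. The sketch below is an illustrative reconstruction of the general idea, not code from the paper, and all names (Item, Enzyme, tag_faces, and so on) are invented: specialized devices pull matching items from a shared, unsorted pool via lock-and-key tag matching, transform them, and return the products with new tags so that other devices can pick them up.

# Toy sketch of an "enzymatic" architecture: specialized devices operate on a
# shared, unsorted pool of tagged items via lock-and-key matching.
# Illustrative only; names and structure are invented for this example.

from dataclasses import dataclass, field


@dataclass
class Item:
    tags: set                                   # "molecular tags" that enzymes match on
    content: dict = field(default_factory=dict)


class Enzyme:
    def __init__(self, name, required_tags, operation):
        self.name = name
        self.required_tags = set(required_tags)  # the "lock"
        self.operation = operation               # transforms a matching item

    def matches(self, item):
        return self.required_tags <= item.tags   # lock-and-key template match

    def process(self, item):
        return self.operation(item)


def tag_faces(item):
    # A face-detector-like device: re-tags visual input as containing a face.
    item.tags |= {"face"}
    item.content["face_detected"] = True
    return item


def appraise_threat(item):
    # A downstream device that only fires on items already tagged 'face'.
    item.tags |= {"appraised"}
    item.content["threat"] = item.content.get("angry", False)
    return item


def run_pool(pool, enzymes, cycles=3):
    # Each cycle, every enzyme scans the common pool and operates on new matches.
    for _ in range(cycles):
        for enzyme in enzymes:
            for item in pool:
                if enzyme.matches(item) and enzyme.name not in item.content.get("history", []):
                    enzyme.process(item)
                    item.content.setdefault("history", []).append(enzyme.name)
    return pool


if __name__ == "__main__":
    pool = [Item(tags={"visual"}, content={"angry": True}),
            Item(tags={"auditory"})]
    enzymes = [Enzyme("face_detector", {"visual"}, tag_faces),
               Enzyme("threat_appraiser", {"face"}, appraise_threat)]
    for item in run_pool(pool, enzymes):
        print(item.tags, item.content)

The point of the sketch is only that highly specialized devices can operate in an open, unsorted pool, with re-tagging playing the regulatory role the abstract assigns to molecular tags.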
Quine often argued for a simple, untyped system of logic rather than the typed systems that were championed by Russell and Carnap, among others. He claimed that nothing important would be lost by eliminating sorts, and the result would be additional simplicity and elegance. In support of this claim, Quine conjectured that every many-sorted theory is equivalent to a single-sorted theory. We make this conjecture precise, and prove that it is true, at least according to one reasonable notion of theoretical equivalence. Our clarification of Quine’s conjecture, however, exposes the shortcomings of his argument against many-sorted logic.
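The construction behind Quine's conjecture is easy to illustrate (a standard relativization sketch, not the paper's own precise formulation): given a two-sorted theory with sorts σ₁ and σ₂, introduce unary predicates S₁ and S₂ into a single-sorted language, relativize quantifiers sort by sort, and add axioms saying that the predicates are nonempty and jointly exhaustive,
\[
\forall x{:}\sigma_1\, \varphi \ \rightsquigarrow\ \forall x\, (S_1(x) \to \varphi^*), \qquad
\exists x{:}\sigma_1\, \varphi \ \rightsquigarrow\ \exists x\, (S_1(x) \wedge \varphi^*),
\]
\[
\exists x\, S_1(x), \qquad \exists x\, S_2(x), \qquad \forall x\, (S_1(x) \vee S_2(x)).
\]
Whether the single-sorted theory so obtained counts as equivalent to the original, and under which notion of equivalence, is exactly what the paper pins down.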
This paper concerns how rule-following behavior might evolve in the context of a variety of Skyrms–Lewis signaling game, how such rules might subsequently evolve to be used in new contexts, and how such appropriation allows for the composition of evolved rules. We will also consider how the composition of simpler rules to form more complex rules may be significantly more efficient than evolving the complex rules directly. And we will review an example of rule following by pinyon and scrub jays as an illustration of the appropriation of a rule to a new context (…:142–150, 2013a; Barrett, Philos Sci, 2014). The proposal here is that the composition of rules might occur in a way that is precisely analogous to such simple appropriation. Finally, we will briefly consider how any finite truth-functional operation might evolve by the sequential appropriation of simpler rules.
We consider how cue-reading, sensory-manipulation, and signaling games may initially evolve from ritualized decisions and how more complex games may evolve from simpler games by polymerization, template transfer, and modular composition. Modular composition is a process that combines simpler games into more complex games. Template transfer, a process by which a game is appropriated to a context other than the one in which it initially evolved, is one mechanism for modular composition. And polymerization is a particularly salient example of modular composition where simpler games evolve to form more complex chains. We also consider how the evolution of new capacities by modular composition may be more efficient than evolving those capacities from basic decisions.
Everett's relative-state formulation of quantum mechanics is an attempt to solve the measurement problem by dropping the collapse dynamics from the standard von Neumann-Dirac theory of quantum mechanics. The main problem with Everett's theory is that it is not at all clear how it is supposed to work. In particular, while it is clear that he wanted to explain why we get determinate measurement results in the context of his theory, it is unclear how he intended to do this. There have been many attempts to reconstruct Everett's no-collapse theory in order to account for the apparent determinateness of measurement outcomes. These attempts have led to such formulations of quantum mechanics as the many-worlds, many-minds, many-histories, and relative-fact theories. Each of these captures part of what Everett claimed for his theory, but each also encounters problems.
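For concreteness, the two dynamical rules of the standard von Neumann-Dirac theory can be written as follows (textbook formulations, included only as background to the abstract): between measurements the state evolves unitarily under the Schrödinger equation, and on measurement of an observable with eigenstates \(|o_k\rangle\) it collapses with Born-rule probabilities,
\[
i\hbar\, \frac{\partial}{\partial t}\, |\psi(t)\rangle = \hat H\, |\psi(t)\rangle,
\qquad
|\psi\rangle \ \longrightarrow\ |o_k\rangle \ \text{ with probability } \ |\langle o_k \mid \psi \rangle|^2 .
\]
Everett's proposal was to keep only the first rule; the reconstructions mentioned above differ over how to recover determinate-seeming outcomes without the second.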
Lewis sender‐receiver games illustrate how a meaningful term language might evolve from initially meaningless random signals (Lewis 1969; Skyrms 2006). Here we consider how a meaningful language with a primitive grammar might evolve in a somewhat more subtle sort of game. The evolution of such a language involves the co‐evolution of partitions of the physical world into what may seem, at least from the perspective of someone using the language, to correspond to canonical natural kinds. While the evolved language may allow for the sort of precise representation that is required for successful coordinated action and prediction, the apparent natural kinds reflected in its structure may be purely conventional. This has both positive and negative implications for the limits of naturalized metaphysics.
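As a concrete illustration of the baseline phenomenon the abstract starts from (meaningful signals emerging from initially meaningless random ones in a Lewis sender-receiver game), here is a minimal simulation sketch using simple reinforcement (urn) learning in the Skyrms style. It is a toy reconstruction for illustration only, not the more subtle grammar game the paper itself studies, and all identifiers are invented.

# Minimal Lewis sender-receiver signaling game with Roth-Erev style
# reinforcement learning. Two states, two signals, two acts; payoff 1 when
# the receiver's act matches the state. Illustrative sketch only.

import random

N_STATES = N_SIGNALS = N_ACTS = 2
ROUNDS = 20_000

# Urn weights: sender maps states to signals, receiver maps signals to acts.
sender_urns = [[1.0] * N_SIGNALS for _ in range(N_STATES)]
receiver_urns = [[1.0] * N_ACTS for _ in range(N_SIGNALS)]


def draw(weights):
    # Choose an option with probability proportional to its accumulated weight.
    return random.choices(range(len(weights)), weights=weights)[0]


for _ in range(ROUNDS):
    state = random.randrange(N_STATES)    # nature picks a state
    signal = draw(sender_urns[state])     # sender emits a signal
    act = draw(receiver_urns[signal])     # receiver picks an act
    if act == state:                      # success: reinforce both choices
        sender_urns[state][signal] += 1.0
        receiver_urns[signal][act] += 1.0

# After learning, the signals typically carry conventional information: each
# state is mapped almost always to one signal, and that signal to the matching
# act, though which signal 'means' which state varies from run to run.
for s in range(N_STATES):
    total = sum(sender_urns[s])
    probs = [round(w / total, 2) for w in sender_urns[s]]
    print(f"state {s} -> signal probabilities {probs}")
for sig in range(N_SIGNALS):
    total = sum(receiver_urns[sig])
    probs = [round(w / total, 2) for w in receiver_urns[sig]]
    print(f"signal {sig} -> act probabilities {probs}")

The run-to-run arbitrariness of which signal ends up attached to which state is the conventionality that the abstract extends from terms to apparent natural kinds.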
Reformed epistemology and cognitive science have remarkably converged on belief in God. Reformed epistemology holds that belief in God is basic—that is, belief in God is a natural, non-inferential belief that is immediately produced by a cognitive faculty. Cognitive science of religion also holds that belief in gods is (often) non-reflectively and instinctively produced—that is, non-inferentially and automatically produced by a cognitive faculty or system. But there are differences. In this paper, we will show some remarkable points of convergence, and a few points of divergence, between Reformed epistemology and the cognitive science of religion.