Germline gene editing (GGE) has enormous potential both as a research tool and a therapeutic intervention. While other types of gene editing are relatively uncontroversial, GGE has been strongly resisted. In this article, we analyse the ethical arguments for and against pursuing GGE by allowing and funding its development. We argue there is a strong case for pursuing GGE for the prevention of disease. We then examine objections that have been raised against pursuing GGE and argue that these fail. We conclude that the moral case in favour of pursuing GGE is stronger than the case against. This suggests that pursuing GGE is morally permissible and indeed morally desirable.
For early modern metaphysician Anne Conway, the world comprises creatures. In some sense, Conway is a monist about creatures: all creatures are one. Yet, as Jessica Gordon-Roth has astutely pointed out, that monism can be understood in very different ways. One might read Conway as an ‘existence pluralist’: creatures are all composed of the same type of substance, but many substances exist. Alternatively, one might read Conway as an ‘existence monist’: there is only one created substance. Gordon-Roth has done the scholarship a great favor by illuminating these issues in Conway. However, this article takes issue with Gordon-Roth's further view that Conway ‘oscillates’ between the extremes of existence pluralism and monism. In its place, I argue we should read Conway as a priority monist: the whole of creation is ontologically prior to its parts.
Is the overall value of a world just the sum of values contributed by each value-bearing entity in that world? Additively separable axiologies (like total utilitarianism, prioritarianism, and critical level views) say 'yes', but non-additive axiologies (like average utilitarianism, rank-discounted utilitarianism, and variable value views) say 'no'. This distinction is practically important: additive axiologies support 'arguments from astronomical scale' which suggest (among other things) that it is overwhelmingly important for humanity to avoid premature extinction and ensure the existence of a large future population, while non-additive axiologies need not. We show, however, that when there is a large enough 'background population' unaffected by our choices, a wide range of non-additive axiologies converge in their implications with some additive axiology -- for instance, average utilitarianism converges to critical-level utilitarianism and various egalitarian theories converge to prioritarianism. We further argue that real-world background populations may be large enough to make these limit results practically significant. This means that arguments from astronomical scale, and other arguments in practical ethics that seem to presuppose additive separability, may be truth-preserving in practice whether or not we accept additive separability as a basic axiological principle.
What are the processes, from conception to adulthood, that enable a single cell to grow into a sentient adult? Neuroconstructivism is a pioneering two-volume work that sets out a whole new framework for considering the complex topic of development, integrating data from cognitive studies, computational work, and neuroimaging.
We give two social aggregation theorems under conditions of risk, one for constant population cases, the other an extension to variable populations. Intra- and interpersonal welfare comparisons are encoded in a single ‘individual preorder’. The theorems give axioms that uniquely determine a social preorder in terms of this individual preorder. The social preorders described by these theorems have features that may be considered characteristic of Harsanyi-style utilitarianism, such as indifference to ex ante and ex post equality. However, the theorems are also consistent with the rejection of all of the expected utility axioms, completeness, continuity, and independence, at both the individual and social levels. In that sense, expected utility is inessential to Harsanyi-style utilitarianism. In fact, the variable population theorem imposes only a mild constraint on the individual preorder, while the constant population theorem imposes no constraint at all. We then derive further results under the assumption of our basic axioms. First, the individual preorder satisfies the main expected utility axiom of strong independence if and only if the social preorder has a vector-valued expected total utility representation, covering Harsanyi’s utilitarian theorem as a special case. Second, stronger utilitarian-friendly assumptions, like Pareto or strong separability, are essentially equivalent to strong independence. Third, if the individual preorder satisfies a ‘local expected utility’ condition popular in non-expected utility theory, then the social preorder has a ‘local expected total utility’ representation. Fourth, a wide range of non-expected utility theories nevertheless lead to social preorders of outcomes that have been seen as canonically egalitarian, such as rank-dependent social preorders. Although our aggregation theorems are stated under conditions of risk, they are valid in more general frameworks for representing uncertainty or ambiguity.
Political realism criticises the putative abstraction, foundationalism and neglect of the agonistic dimension of political practice in the work of John Rawls. This paper argues that had Rawls not fully specified the implementation of his theory of justice in one particular form of political economy, he would indeed be vulnerable to a realist critique. But he did present such an implementation: a property-owning democracy. An appreciation of Rawls's specificationist method undercuts the realist critique of his conception of justice as fairness.
We set out an account of how self-domestication plays a crucial role in the evolution of language. In doing so, we focus on the growing body of work that treats language structure as emerging from the process of cultural transmission. We argue that a full recognition of the importance of cultural transmission fundamentally changes the kind of questions we should be asking regarding the biological basis of language structure. If we think of language structure as reflecting an accumulated set of changes in our genome, then we might ask something like, “What are the genetic bases of language structure and why were they selected?” However, if cultural evolution can account for language structure, then this question no longer applies. Instead, we face the task of accounting for the origin of the traits that enabled that process of structure-creating cultural evolution to get started in the first place. In light of work on cultural evolution, then, the new question for biological evolution becomes, “How did those precursor traits evolve?” We identify two key precursor traits: the transmission of the communication system through learning; and the ability to infer the communicative intent associated with a signal or action. We then describe two comparative case studies—the Bengalese finch and the domestic dog—in which parallel traits can be seen emerging following domestication. Finally, we turn to the role of domestication in human evolution. We argue that the cultural evolution of language structure has its origin in an earlier process of self-domestication.
It is notoriously difficult to find an intuitively satisfactory rule for evaluating populations based on the welfare of the people in them. Standard examples, like total utilitarianism, either entail the Repugnant Conclusion or in some other way contradict common intuitions about the relative value of populations. Several philosophers have presented formal arguments that seem to show that this happens of necessity: our core intuitions stand in contradiction. This paper assesses the state of play, focusing on the most powerful of these ‘impossibility theorems’, as developed by Gustaf Arrhenius. I highlight two ways in which these theorems fall short of their goal: some appeal to a supposedly egalitarian condition which, however, does not properly reflect egalitarian intuitions; the others rely on a background assumption about the structure of welfare which cannot be taken for granted. Nonetheless, the theorems remain important: they give insight into the difficulty, if not perhaps the impossibility, of constructing a satisfactory population axiology. We should aim for reflective equilibrium between intuitions and more theoretical considerations. I conclude by highlighting one possible ingredient in this equilibrium, which, I argue, leaves open a still wider range of acceptable theories: the possibility of vague or otherwise indeterminate value relations.
We have synthesized a 582,970-base pair Mycoplasma genitalium genome. This synthetic genome, named M. genitalium JCVI-1.0, contains all the genes of wild-type M. genitalium G37 except MG408, which was disrupted by an antibiotic marker to block pathogenicity and to allow for selection. To identify the genome as synthetic, we inserted "watermarks" at intergenic sites known to tolerate transposon insertions. Overlapping "cassettes" of 5 to 7 kilobases (kb), assembled from chemically synthesized oligonucleotides, were joined by in vitro recombination to produce intermediate assemblies of approximately 24 kb, 72 kb ("1/8 genome"), and 144 kb ("1/4 genome"), which were all cloned as bacterial artificial chromosomes in Escherichia coli. Most of these intermediate clones were sequenced, and clones of all four 1/4 genomes with the correct sequence were identified. The complete synthetic genome was assembled by transformation-associated recombination cloning in the yeast Saccharomyces cerevisiae, then isolated and sequenced. A clone with the correct sequence was identified. The methods described here will be generally useful for constructing large DNA molecules from chemically synthesized pieces and also from combinations of natural and synthetic DNA segments. doi:10.1126/science.1151721.
Many believe that the ethical problems of donation after cardiocirculatory death (DCD) have been "worked out" and that it is unclear why DCD should be resisted. In this paper we will argue that DCD donors may not yet be dead, and therefore that organ donation during DCD may violate the dead donor rule. We first present a description of the process of DCD and the standard ethical rationale for the practice. We then present our concerns with DCD, including the following: irreversibility of absent circulation has not occurred and the many attempts to claim it has have all failed; conflicts of interest at all steps in the DCD process, including the decision to withdraw life support before DCD, are simply unavoidable; potentially harmful premortem interventions to preserve organ utility are not justifiable, even with the help of the principle of double effect; claims that DCD conforms with the intent of the law and current accepted medical standards are misleading and inaccurate; and consensus statements by respected medical groups do not change these arguments due to their low quality, including being plagued by conflict of interest. Moreover, some arguments in favor of DCD, while likely true, are "straw-man arguments," such as the great benefit of organ donation. The truth is that honesty and trustworthiness require that we face these problems instead of avoiding them. We believe that DCD is not ethically allowable because it abandons the dead donor rule, has unavoidable conflicts of interest, and implements premortem interventions which can hasten death. These important points have not been, but need to be, fully disclosed to the public and incorporated into fully informed consent. These are tall orders, and require open public debate. Until this debate occurs, we call for a moratorium on the practice of DCD.
The first book-length study of property-owning democracy, Republic of Equals argues that a society in which capital is universally accessible to all citizens is uniquely placed to meet the demands of justice. Arguing from a basis in liberal-republican principles, this expanded conception of the economic structure of society contextualizes the market to make its transactions fair. The author shows that a property-owning democracy structures economic incentives such that the domination of one agent by another in the market is structurally impossible. The result is a renovated form of capitalism in which the free market is no longer a threat to social democratic values, but is potentially convergent with them. It is argued that a property-owning democracy has advantages that give it priority over rival forms of social organization such as welfare state capitalism and market socialist institutions. The book also addresses the currently high levels of inequality in the societies of the developed West to suggest a range of policies that target the "New Inequality" of our times. For this reason, the work engages not only with political philosophers such as John Rawls, Philip Pettit and John Tomasi, but also with the work of economists and historians such as Anthony B. Atkinson, François Bourguignon, Jacob S. Hacker, Lane Kenworthy, and Thomas Piketty.
This study examined the effect of various antecedent variables on marketers’ perceptions of the role of ethics and social responsibility in the overall success of the firm. Variables examined included Hofstede’s cultural dimensions, as well as corporate ethical values and enforcement of an ethics code. Additionally, individual variables such as ethical idealism and relativism were included. Results indicated that most of these variables impacted marketers’ perceptions of the importance of ethics and social responsibility, although to varying degrees.
In this article I caution that María Lugones's critiques of Kimberlé Crenshaw's intersectional theory posit a dangerous form of epistemic erasure, which underlies Lugones's decolonial methodology. This essay serves as a critical engagement with Lugones's essay “Radical Multiculturalism and Women of Color Feminisms” in order to uncover the decolonial lens within Crenshaw's theory of intersectionality. In her assertion that intersectionality is a “white bourgeois feminism colluding with the oppression of Women of Color,” Lugones precludes any possibility of intersectionality operating as a decolonial method. Although Lugones states that her “decolonial feminism” is for all women of color, it ultimately excludes Black women, particularly with her misconstruing of Crenshaw's articulation of intersectionality that is rooted within the Black American feminist tradition. I explore Lugones's claims by juxtaposing her rendering of intersectionality with Crenshaw's and conclude that Lugones's decolonial theory risks erasing Black women from her framework.
In Value and Context Alan Thomas articulates and defends the view that human beings do possess moral and political knowledge, but that it is historically and culturally contextual knowledge in ways that, say, mathematical or chemical knowledge is not. In his exposition of "cognitive contextualism" in ethics and politics he makes wide-ranging use of contemporary work in epistemology, moral philosophy, and political theory.
Many scholars have drawn attention to the way that elements of Anne Conway’s system anticipate ideas found in Leibniz. This paper explores the relationship between Conway’s and Leibniz’s work with regard to time, space, and process. It argues – against existing scholarship – that Conway is not a proto-Leibnizian relationist about time or space, and in fact her views lie much closer to those of Henry More; yet Conway and Leibniz agree on the primacy of process. This exploration advances our understanding of Conway’s system, and the intellectual relationships between Conway, More, and Leibniz.
Virtue ethics has long provided fruitful resources for the study of issues in medical ethics. In particular, study of the moral virtues of the good doctor—like kindness, fairness and good judgement—has provided insights into the nature of medical professionalism and the ethical demands on the medical practitioner as a moral person. Today, a substantial literature exists exploring the virtues in medical practice and many commentators advocate an emphasis on the inculcation of the virtues of good medical practice in medical education and throughout the medical career. However, until very recently, no empirical studies have attempted to investigate which virtues, in particular, medical doctors and medical students tend to have or not to have, nor how these virtues influence how they think about or practise medicine. The question of what virtuous medical practice is, is vast and, as we have written elsewhere, the question of how to study doctors’ moral character is fraught with difficulty. In this paper, we report the results of a first-of-a-kind study that attempted to explore these issues at three medical schools in the United Kingdom. We identify which character traits are important in the good doctor in the opinion of medical students and doctors and identify which virtues they say of themselves they possess and do not possess. Moreover, we identify how thinking about the virtues contributes to doctors’ and medical students’ thinking about common moral dilemmas in medicine. In ending, we remark on the implications for medical education.
The Jubilee Centre’s new report, Virtuous Medical Practice, examines the place of character and values in the medical profession in Britain today. Its findings are drawn from a UK-focused multi-methods study of 549 doctors and aspiring doctors at three career stages: first-year students, final-year students, and experienced doctors.
Using Hofstede's culture theory (1980, 2001, Culture's Consequences: Comparing Values, Behaviors, Institutions, and Organizations Across Nations. Sage, New York), the current study incorporates the moral development (e.g. Thorne, 2000; Thorne and Magnan, 2000; Thorne et al., 2003) and multidimensional ethics scale (e.g. Cohen et al., 1993; Cohen et al., 1996b; Cohen et al., 2001; Flory et al., 1992) approaches to compare the ethical reasoning and decisions of Canadian and Mainland Chinese final-year undergraduate accounting students. The results indicate that Canadian accounting students' formulation of an intention to act on a particular ethical dilemma (deliberative reasoning), as measured by the moral development approach (Thorne, 2000), was higher than that of Mainland Chinese accounting students. The current study proposes that the five factors identified by the multidimensional ethics scale (MES) as being relevant to ethical decision making can be placed into the three levels of ethical reasoning identified by Kohlberg's (1958, The Development of Modes of Moral Thinking and Choice in the Years Ten to Sixteen. University of Chicago, doctoral dissertation) theory of cognitive moral development. Canadian accounting students used post-conventional MES factors (moral equity, contractualism, and utilitarianism) more frequently and made more ethical audit decisions than Chinese accounting students.
Mental imagery (varieties of which are sometimes colloquially referred to as “visualizing,” “seeing in the mind's eye,” “hearing in the head,” “imagining the feel of,” etc.) is quasi-perceptual experience; it resembles perceptual experience, but occurs in the absence of the appropriate external stimuli. It is also generally understood to bear intentionality (i.e., mental images are always images of something or other), and thereby to function as a form of mental representation. Traditionally, visual mental imagery, the most discussed variety, was thought to be caused by the presence of picturelike representations (mental images) in the mind, soul, or brain, but this is no longer universally accepted.
Super-substantivalism is the thesis that space is identical to matter; it is currently under discussion in contemporary philosophy of physics and metaphysics (see Sklar 1977, 221–4; Earman 1989, 115–6; Schaffer 2009). Given this current interest, it is worth investigating the thesis in the history of philosophy. This paper examines the super-substantivalism of Samuel Alexander, an early twentieth-century metaphysician primarily associated with (the movement now known as) British Emergentism. Alexander argues that spacetime is ontologically fundamental and that it gives rise to an ontological hierarchy of emergence, involving novel properties such as matter, life and mind. Alexander's super-substantivalism is interesting not just because of its historical importance but also because Alexander unusually attempts to explain why spacetime is identical to matter. This paper carefully unpacks that explanation and shows how Alexander is best read as conceiving of spacetime as a Spinozistic substance, worked upon by evolution.
In a paper recently published in this journal, Navin and Largent argue in favour of a type of policy to regulate non-medical exemptions from childhood vaccination which they call ‘Inconvenience’. This policy makes it burdensome for parents to obtain an exemption to child vaccination, for example, by requiring parents to attend immunization education sessions and to complete an application form to receive a waiver. Navin and Largent argue that this policy is preferable to ‘Eliminationism’, i.e. to policies that do not allow non-medical exemptions, because Inconvenience has been shown to keep exemption rates low while not harming parents by forcing them to do something that goes against their beliefs. We argue that it is at least doubtful that Inconvenience is ethically preferable to Eliminationism: while the latter disregards the value of liberty, Inconvenience disregards the value of fairness in the distribution of the burdens entailed by the preservation of a public good like herd immunity. We propose a variant of Inconvenience, which we call ‘Contribution’, which we think is preferable to the versions of Inconvenience discussed by Navin and Largent in that it successfully strikes a balance between the values of parents’ liberty, fairness and expected utility.
A general meta-logical theory is developed by considering ontological disputes between systems of metaphysics. The usefulness of this general meta-logical theory is demonstrated through the case of the ontological dispute between Lewis's Modal Realism and Terence Parsons's Meinongianism. Using Quine's criterion of ontological commitment and his views on ontological disagreement, three principles of metalogic are formulated. Based on these three principles, the notions of independent variable and dependent variable are introduced. The ontological dispute between Lewis's Modal Realism and Parsons's Meinongianism is then restated in the light of the principles of metalogic. After the restatement, independent and dependent variables are fixed in both systems to resolve the dispute. Subsequently, a new variety of quantifiers, known as functionally isomorphic quantifiers, is introduced to provide a formal representation of the resolution of the dispute. The specific functionally isomorphic quantifier developed in this work is the st-quantifier, and it is shown how the st-quantifier can function like the existential quantifier. It is also shown that a certain kind of inconsistency is unavoidable in stating ontological disagreement, and therefore that paraconsistent logic is required for stating ontological disputes.
We generalize Harsanyi's social aggregation theorem. We allow the population to be infinite, and merely assume that individual and social preferences are given by strongly independent preorders on a convex set of arbitrary dimension. Thus we assume neither completeness nor any form of continuity. Under Pareto indifference, the conclusion of Harsanyi's theorem nevertheless holds almost entirely unchanged when utility values are taken to be vectors in a product of lexicographic function spaces. The addition of weak or strong Pareto has essentially the same implications in the general case as it does in Harsanyi's original setting.
Picking up the question of what FLaK might be, this editorial considers the relationship between openness and closure in feminist legal studies. How do we draw on feminist struggles for openness in common resources, from security to knowledge, as we inhabit a compromised space in commercial publishing? We think about this first in relation to the content of this issue: on image-based abuse continuums, asylum struggles, trials of protestors, customary justice, and not-so-timely reparations. Our thoughts take us through the different ways that openness and closure work in struggles against violence, cruel welcomes, and re-arrangements of code and custom. Secondly, we share some reflections on methodological openness and closure as the roundtable conversation on asylum, and the interview with Riles, remind us of #FLaK2016 and its method of scattering sources as we think about how best to mix knowledges. Thirdly, prompted by the FLaK kitchen table conversations on openness, publishing and ‘getting the word out’, we respond to Kember’s call to ‘open up open access’. We explain the different current arrangements for opening up FLS content and how green open access, the sharedit initiative, author request and publisher discretion present alternatives to gold open access. Finally, drawing on Franklin and Spade, we show how there are a range of ‘wench tactics’—adapting gifts, stalling and resting—which we deploy as academic editors who are trying to have an impact on the access, use and circulation of our journal, even though we do not own the journal we edit. These wench tactics are alternatives to the more obvious or reported tactic of resignation, or withdrawing academic labour from editing and reviewing altogether. They help us think about brewing editorial time, what ambivalence over our 25th birthday might mean, and how to inhabit painful places. In this, we respond in our own impure, compromised way to da Silva’s call not to forget the native and slave as we do FLaK, and repurpose shrapnel, in our common commitments.
Social media applications such as Twitter, YouTube and Facebook have attained huge popularity, with more than three billion people and organizations predicted to have a social networking account by 2015. Social media offers a rapid avenue of communication with the public and has potential benefits for communicable disease control and surveillance. However, its application in everyday public health practice raises a number of important issues around confidentiality and autonomy. We report here a case from local level health protection where the friend of an individual with meningococcal septicaemia used a social networking site to notify potential contacts.
We provide conditions under which an incomplete strongly independent preorder on a convex set X can be represented by a set of mixture preserving real-valued functions. We allow X to be infinite dimensional. The main continuity condition we focus on is mixture continuity. This is sufficient for such a representation provided X has countable dimension or satisfies a condition that we call Polarization.
What makes us conscious? Many theories that attempt to answer this question have appeared recently in the context of widespread interest about consciousness in the cognitive neurosciences. Most of these proposals are formulated in terms of the information processing conducted by the brain. In this overview, we survey and contrast these models. We first delineate several notions of consciousness, addressing what it is that the various models are attempting to explain. Next, we describe a conceptual landscape that addresses how the theories attempt to explain consciousness. We then situate each of several representative models in this landscape and indicate which aspect of consciousness they try to explain. We conclude that the search for the neural correlates of consciousness should be usefully complemented by a search for the computational correlates of consciousness.
What is time? This is one of the most fundamental questions we can ask. Emily Thomas explores how a new theory of time emerged in the seventeenth century. The 'absolute' theory of time held that it is independent of material bodies or human minds, so even if nothing else existed there would be time.
The literature regarding social and environmental sustainability of business focuses primarily on rationales for adopting sustainability strategies and operational practices in support of that goal. In contrast, we examine sustainability from a perspective that has received far less research attention—attitudes that inform managerial decision-making. We develop a conceptual model that identifies six elemental categories of attitudes that can be held independently or aggregated to yield a meta-attitude representing the legitimacy of sustainability. Our model distinguishes among three types of internally held attitudes and externally perceived subjective norms: pragmatic, moral, and cognitive. We propose a refinement of Ajzen's (In: Kuhl J, Beckmann J (eds) Action control: from cognition to behavior, 1985; Organ Behav Hum Decis Process 50:179-211, 1991) Theory of planned behavior (TPB) that incorporates these sub-categories of personal attitudes and subjective norms. Practical implications are discussed including how organizations considering adopting sustainability programs might use the model as a conceptual tool to help achieve and assess program success.
A theory of the structure and cognitive function of the human imagination that attempts to do justice to traditional intuitions about its psychological centrality is developed, largely through a detailed critique of the theory propounded by Colin McGinn. Like McGinn, I eschew the highly deflationary views of imagination, common amongst analytical philosophers, that treat it either as a conceptually incoherent notion, or as psychologically trivial. However, McGinn fails to develop his alternative account satisfactorily because (following Reid, Wittgenstein and Sartre) he draws an excessively sharp, qualitative distinction between imagination and perception, and because of his flawed, empirically ungrounded conception of hallucination. His arguments in defense of these views are rebutted in detail, and the traditional, passive, Cartesian view of visual perception, upon which several of them implicitly rely, is criticized in the light of findings from recent cognitive science and neuroscience. It is also argued that the apparent intuitiveness of the passive view of visual perception is a result of mere historical contingency. An understanding of perception (informed by modern visual science) as an inherently active process enables us to unify our accounts of perception, mental imagery, dreaming, hallucination, creativity, and other aspects of imagination within a single coherent theoretical framework.
While the work of Benedict de Spinoza has been a source of inspiration and curiosity for a variety of literary and artistic figures,1 his grounding philosophical principles are often cited as a hindrance for a productive engagement with art and art theory. Certain commentators cite Spinoza's "naturalism" and "rationalism" as reasons for his philosophy's "hostility" to art and culture.2 But these criticisms only prevail if one holds that works of art and literature ought to have an ontological ground other than the natural, and that art and literature stand in opposition to reason and rationality. In contrast to such studies that focus on the...
It is only in the last few decades that analytic philosophers in particular have begun to pay any serious attention to the topic of life’s meaning. Such philosophers, however, do not usually attempt to answer or analyse the traditional question ‘What is the meaning of life?’, but rather the subtly different question ‘What makes a life meaningful?’, and it is generally assumed that the latter can be discussed independently of the former. Nevertheless, this paper will argue that the two questions are indeed connected, and that identifying and expanding upon the most plausible analysis of the former will provide the resources necessary to determine the most plausible answer to the latter. Specifically, this paper will argue that the traditional question is simply a request for the information which constitutes a coherent answer to one or more of a certain set of questions regarding human existence that were salient to the asker. In simpler language, the meaning of life itself is the information a person needs to make sense of it. This analysis can then also be applied to individual lives, such that asking for the meaning of X’s life is an analogous request for the information necessary to make sense of that life in particular. Running with this concept of the ‘meaning’ of something as its ‘sense’, the paper then outlines an accompanying theory of ‘meaningfulness’ as ‘sensefulness’: a measure of the richness of certain aspects of the life, multiplied by their intelligibility.
It is often assumed that similar domain-specific behavioural impairments found in cases of adult brain damage and developmental disorders correspond to similar underlying causes, and can serve as convergent evidence for the modular structure of the normal adult cognitive system. We argue that this correspondence is contingent on an unsupported assumption that atypical development can produce selective deficits while the rest of the system develops normally (Residual Normality), and that this assumption tends to bias data collection in the field. Based on a review of connectionist models of acquired and developmental disorders in the domains of reading and past tense, as well as on new simulations, we explore the computational viability of Residual Normality and the potential role of development in producing behavioural deficits. Simulations demonstrate that damage to a developmental model can produce very different effects depending on whether it occurs prior to or following the training process. Because developmental disorders typically involve damage prior to learning, we conclude that the developmental process is a key component of the explanation of endstate impairments in such disorders. Further simulations demonstrate that in simple connectionist learning systems, the assumption of Residual Normality is undermined by processes of compensation or alteration elsewhere in the system. We outline the precise computational conditions required for Residual Normality to hold in development, and suggest that in many cases it is an unlikely hypothesis. We conclude that in developmental disorders, inferences from behavioural deficits to underlying structure crucially depend on developmental conditions, and that the process of ontogenetic development cannot be ignored in constructing models of developmental disorders. Key Words: Acquired and developmental disorders; connectionist models; modularity; past tense; reading.
The Cartesian view that animals are automata sparked a major controversy in early modern European philosophy. This paper studies an early contribution to this controversy. I provide an interpretation of an influential objection to Cartesian animal automatism raised by Ignace-Gaston Pardies (1636–1673). Pardies objects that the Cartesian arguments show only that animals lack ‘intellectual perception’ but do not show that animals lack ‘sensible perception.’ According to Pardies, the difference between these two types of perception is that the former is reflexive, such that we both perceive an object and the perception itself, whereas sensible perception lacks this reflexivity. This notion of sensible perception was criticized by the Cartesian Antoine Dilly for violating the doctrine that all thought is conscious. However, I argue that sensible perceptions are not unconscious for Pardies. Rather, they are conscious perceptions that are unaccompanied by a kind of reflexive perception that is constitutive of attention. Moreover, I argue that when understood in this way Pardies raises a compelling objection to Cartesian animal automatists.