Emerging technologies are increasingly used in an attempt to “enhance the human body and/or mind” beyond the contemporary standards that characterize human beings. Yet such standards are deeply controversial, and it is no easy task to determine whether the application of a given technology to an individual, and its outcome, can be defined as a human enhancement. Despite much debate about its potential or actual ethical and social impacts, human enhancement has no consensus definition. This paper proposes a timely and much-needed examination of the various definitions found in the literature. We classify these definitions into four main categories: the implicit approach, the therapy-enhancement distinction, the improvement of general human capacities, and the increase of well-being. After commenting on these different approaches and their limitations, we propose a definition of human enhancement that focuses on individual perceptions. While acknowledging that a definition which depends mainly on personal and subjective individual perceptions raises many challenges, we suggest that such a comprehensive approach to defining human enhancement could constitute a useful premise for appropriately addressing the complexity of the ethical and social issues it generates.
This essay explains and criticizes Gentile's attempts to connect his metaphysical theories with his ideas about education, and especially the relationship between education and nationalism. It begins with a critical examination of the distinguishing features of the view Gentile specifies in Theory of Mind as Pure Act. Vincent then considers Gentile's account of how this theory, on which mind is an act of perpetual self-creation, leads to a conception of education with an explicitly nationalist bent. These attempts to connect the two are ultimately unsuccessful, argues Vincent; actual idealism does not give rise to any specific political order, and certainly not to the kind of state-led nationalism that Gentile ultimately supported.
In his controversial new book, Andrew Vincent offers a comprehensive, synoptic, and comparative analysis of the major conceptions of political theory throughout the twentieth century. The book challenges established views of contemporary political theory and provides critical perspectives on the future of the subject. It will be an indispensable resource for all scholars and students of the discipline.
Luck egalitarians think that considerations of responsibility can excuse departures from strict equality. However, critics argue that allowing responsibility to play this role has objectionably harsh consequences. Luck egalitarians usually respond either by explaining why that harshness is not excessive, or by identifying allegedly legitimate exclusions from the default responsibility-tracking rule to tone the harshness down. Critics, in turn, respectively insist that the harshness is indeed excessive, or argue that those exclusions would be ineffective or lacking in justification. Rather than taking sides, after criticizing both positions I argue that this way of carrying on the debate – i.e. as a debate about whether the harsh demands of responsibility outweigh other considerations, and about whether exclusions to responsibility-tracking would be effective and/or justified – is deeply problematic. On my account, the demands of responsibility do not – indeed, cannot – conflict with the demands of other normative considerations, because responsibility only provides a formal structure within which those other considerations determine how people may be treated; it does not generate practical demands of its own.
Garrath Williams claims that truly responsible people must possess a “capacity … to respond [appropriately] to normative demands” (2008:462). However, there are people whom we would normally praise for their responsibility despite the fact that they do not yet possess such a capacity (e.g. consistently well-behaved young children), and others who have such a capacity but who are still patently irresponsible (e.g. some badly-behaved adults). Thus, I argue that to qualify for the accolade “a responsible person” one need not possess such a capacity, but need only be earnestly willing to do the right thing and have a history that testifies to this willingness. Although we may have good reasons to prefer to have such a capacity ourselves, and to associate with others who have it, at a conceptual level I do not think that such considerations support the claim that having this capacity is a necessary condition of being a responsible person in the virtue sense.
Could neuroimaging evidence help us to assess the degree of a person's responsibility for a crime that we know they committed? This essay defends an affirmative answer to this question. A range of standard objections to this high-tech approach to assessing people's responsibility is considered and then set aside; I also bring to light, and then reject, a novel objection, one encountered only when functional (rather than structural) neuroimaging is used to assess people's responsibility.
The way in which we characterize the structural and functional differences between psychopathic and normal brains – either as biological disorders or as mere biological differences – can influence our judgments about psychopaths' responsibility for criminal misconduct. However, Marga Reimer (Neuroethics 1(2):14, 2008) points out that whether our characterization of these differences should be allowed to affect our judgments in this manner “is a difficult and important question that really needs to be addressed before policies regarding responsibility... can be implemented with any confidence”. This paper attempts to address Reimer's difficult and important question; I argue that whichever of these two characterizations is chosen, our judgments about psychopaths' responsibility should not be affected, because responsibility hinges not on whether a particular difference is (referred to as) a disorder, but on how that difference affects the mental capacities required for moral agency.
Egalitarians must address two questions: (i) what should there be an equality of, which concerns the currency of the ‘equalisandum’; and (ii) how should this thing be allocated to achieve the so-called equal distribution? A plausible initial composite answer is that resources should be allocated in accordance with choice, because the resulting distribution of the equalisandum will then ‘track responsibility’ — responsibility will be tracked in the sense that only we will be responsible for the resources that are available to us, since our allocation of resources will be a consequence of our own choices. But the effects of actual choices should not be preserved until the prior effects of luck in constitution and circumstance are first eliminated. For instance, people can choose badly because their choice-making capacity was compromised by a lack of intelligence (constitutional bad luck), or because only bad options were open to them (circumstantial bad luck), and under such conditions we are not responsible for our choices. So perhaps a better composite answer (from the perspective of tracking responsibility) is that resources should be allocated so as to reflect people’s choices, but only once those choices have been corrected for the distorting effects of constitutional and circumstantial luck; on this account, choice preservation and luck elimination are two complementary aims of the egalitarian ideal. Nevertheless, it is one thing to say that luck’s effects should be eliminated, but quite another to figure out just how much resource redistribution would be required to achieve this outcome, and it was precisely for this purpose that in 1981 Ronald Dworkin developed the ingenious hypothetical insurance market argumentative device (HIMAD), which he used in conjunction with the talent slavery (TS) argument to estimate the amount of redistribution required to reduce the extent of luck’s effects. Recently, however, Daniel Markovits has cast doubt on Dworkin’s estimates of the amount of redistribution required, by pointing out flaws in his understanding of how the hypothetical insurance market would function. Markovits patched up Dworkin’s HIMAD and used this repaired version, together with his own version of the TS argument, to reach his own conservative estimate of how much redistribution there ought to be in an egalitarian society. Notably, though, on Markovits’ account, once the HIMAD is patched up and properly understood, the TS argument allegedly also shows that the two aims of egalitarianism are not necessarily complementary, but can actually compete with one another. According to his own ‘equal-agent’ egalitarian theory, the aim of choice preservation is more important than the aim of luck elimination, and so he alleges that when the latter comes into conflict with the former, the latter must be sacrificed to ensure that people are not subordinated to one another as agents. I believe that Markovits’ critique of Dworkin is spot on, but I also think that his own positive thesis — and hence his conclusion about how much redistribution there ought to be in an egalitarian society — is flawed.
Hence, this paper will begin in Section I by explaining how Dworkin uses the HIMAD and his TS argument to estimate the amount of redistribution that there ought to be in an egalitarian society — this section is largely expository in content. Markovits’ critique of Dworkin, and his own positive thesis, will then be outlined in Section II. My critique of Markovits, and my own positive thesis, make a fleeting appearance in Section III. Finally, I conclude by rejecting both Dworkin’s and Markovits’ estimates of the amount of redistribution that there ought to be in an egalitarian society, and by reaffirming the responsibility-tracking egalitarian claim that choice preservation and luck elimination are complementary, not competing, egalitarian aims.
This thesis considers two allegations which conservatives often level at no-fault systems — namely, that responsibility is abnegated under no-fault systems, and that no-fault systems under- and over-compensate. I argue that each of these allegations can be satisfactorily met: the responsibility allegation rests on the mistaken assumption that to properly take responsibility for our actions we must accept liability for those losses for which we are causally responsible; and the compensation allegation rests on the mistaken assumption that tort law's compensatory decisions provide a legitimate norm against which no-fault's decisions can be compared and criticized. However, meeting them leads in a direction at odds with accident law reform advocates' typical recommendations. On my account, accident law should not merely be reformed in line with no-fault's principles; it should be abandoned altogether, since the principles that protect no-fault systems from the conservatives' two allegations are incompatible with retaining the category of accident law. They entail that no-fault systems are a form of social welfare rather than accident law systems, and that under these systems serious deprivation — and to a lesser extent causal responsibility — should be conditions of eligibility to claim benefits.
In "Torts, Egalitarianism and Distributive Justice" , Tsachi Keren-Paz presents impressingly detailed analysis that bolsters the case in favour of incremental tort law reform. However, although this book's greatest strength is the depth of analysis offered, at the same time supporters of radical law reform proposals may interpret the complexity of the solution that is offered as conclusive proof that tort law can only take adequate account of egalitarian aims at an unacceptably high cost.
It could be argued that tort law is failing, and arguably an example of this failure is the recent public liability and insurance (‘PL&I’) crisis. A number of solutions have been proposed, but ultimately the chosen solution should address whatever we take to be the cause of this failure. On one account, the PL&I crisis is the result of an unwarranted expansion of the scope of tort law. Proponents of this position sometimes argue that the duty of care owed by defendants to plaintiffs has expanded beyond reasonable levels, such that parties who were not really responsible for another’s misfortune are successfully sued, while those who really were to blame get away without taking any responsibility. However, people should take responsibility for their actions, and the only likely consequence of allowing them to shirk it is that they and others will be less likely to exercise due care in the future, since the deterrents of liability and of no compensation for accidentally self-imposed losses will be absent. Others argue that this expansion is unwarranted because it is inappropriately fueled by ‘deep pocket’ considerations rather than by considerations of fault. They argue that the presence of liability insurance sways the judiciary to award damages against defendants, since judges know that insurers, and not the defendant personally, will pay in the end. But although it may seem that no real person bears these burdens when they are imposed on insurers, in reality all of society bears them collectively when insurers are forced to hike their premiums to cover the increasing damages payments. In any case, it seems unfair to force insurers to cover these costs simply because they can afford to do so. If such an expansion is indeed the cause of the PL&I crisis, then a contraction of the scope of tort liability, and a pious return to the fault principle, might remedy the situation. However, it could also be argued that inadequate deterrence is the cause of this crisis. On this account the problem lies not with the tort system’s continued unwarranted expansion, but in the fact that defendants really have been too careless. If prospective injurers were appropriately deterred from engaging in unnecessarily risky activities, then fewer accidents would occur in the first place, and this would reduce the need for litigation at its very source. If we take this to be the cause of tort law’s failure, then our solution should aim to improve deterrence. Glen Robinson has argued that improved deterrence could be achieved if plaintiffs were allowed to sue defendants for wrongful exposure to ongoing risks of future harm, even in the absence of currently materialized losses. He argues that, at least in toxic injury type cases, the tortious creation of risk [should be seen as] an appropriate basis of liability, with damages being assessed according to the value of the risk, as an alternative to forcing risk victims to abide the outcome of the event and seek damages only if and when harm materializes. In a sense, Robinson wishes to treat newly-acquired wrongful risks as de facto wrongful losses, and these are what would be compensated in liability for risk creation (‘LFRC’) cases. Robinson argues that if the extent of damages were fixed to the extent of risk exposure, all detected unreasonable risk creators would be forced to bear the costs of their activities, rather than only those who could be found responsible for another’s injuries ‘on the balance of probabilities’.
The incidence of accidents should decrease as a result of improved deterrence, which should in turn reduce the ‘suing fest’ and so resolve the PL&I crisis. So whilst the first solution involves contracting the scope of tort liability, Robinson’s solution involves expanding it. However, Robinson acknowledges that LFRC seems prima facie incompatible with current tort principles, which at the least require the presence of plaintiff losses, defendant fault, and causation to be established before defendants are made liable for plaintiffs’ compensation. Since losses would by definition be absent in LFRC cases, the first evidentiary requirement would always be frustrated, and in its absence proof of defendant fault and causation would also seem scant. If such an expansion of tort liability were not supported by current tort principles, then it would be no better than proposals to switch accident law across to no-fault, since both solutions would require comprehensive legal reform. However, Robinson argues that the above three evidentiary requirements could be met in LFRC cases to the same extent that they are met in other currently accepted cases, and hence that his solution would be preferable to no-fault solutions as it would require only incremental, not comprehensive, legal reform. Although I believe that actual losses should be present before plaintiffs are allowed to seek compensation, I will not present a positive argument for that conclusion here. My aim in this paper is not to debate the relative merits of Robinson’s solution as compared to no-fault solutions, nor to determine which account of the cause of the PL&I crisis is closer to the truth, but rather to find out whether Robinson’s solution would indeed require less radical legal reform than, for example, proposed no-fault solutions. I will argue that Robinson fails to show that current tort principles would support his proposed solution, and hence that his solution is at best on an even footing with no-fault solutions, since both would require comprehensive legal reform.
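To make the mechanics of LFRC concrete, here is a minimal worked example of one natural reading of "damages assessed according to the value of the risk", namely expected harm. The expected-value reading and all figures are illustrative assumptions, not numbers from Robinson or from this paper.

```python
# A minimal sketch of one natural (assumed) reading of "damages assessed
# according to the value of the risk": each exposed person is compensated
# now for their expected harm, instead of full harm later if it materializes.
# All figures below are illustrative assumptions, not data from the paper.

exposed_people = 1000        # people wrongfully exposed to the risk (assumed)
p_harm = 0.02                # probability that the harm eventuates (assumed)
loss_if_harmed = 500_000     # loss per injured person, in dollars (assumed)

# LFRC: every exposed person recovers the value of the risk imposed on them.
lfrc_damages_each = p_harm * loss_if_harmed
total_borne_by_injurer = exposed_people * lfrc_damages_each

print(f"LFRC damages per exposed person: ${lfrc_damages_each:,.0f}")
print(f"Total internalised by the risk creator: ${total_borne_by_injurer:,.0f}")

# Contrast: on the 'balance of probabilities' standard, a 2% causal
# probability means no individual plaintiff could establish liability at all.
```

On this reading the injurer internalises the full expected cost of the risky activity even though, at a 2% injury probability, no individual victim could prove causation on the balance of probabilities, which is the deterrence gap Robinson's proposal targets.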
This is a report on the 3-day workshop “The Neuroscience of Responsibility” that was held in the Philosophy Department at Delft University of Technology in The Netherlands during February 11th–13th, 2010. The workshop had 25 participants from The Netherlands, Germany, Italy, UK, USA, Canada and Australia, with expertise in philosophy, neuroscience, psychology, psychiatry and law. Its aim was to identify current trends in neurolaw research related specifically to the topic of responsibility, and to foster international collaborative research on this topic. The workshop agenda was constructed by the participants at the start of each day by surveying the topics of greatest interest and relevance to participants. In what follows, we summarize (1) the questions which participants identified as most important for future research in this field, (2) the most prominent themes that emerged from the discussions, and (3) the two main international collaborative research project plans that came out of this meeting.
Third-party property insurance (TPPI) protects insured drivers who accidentally damage an expensive car from the threat of financial ruin. Perhaps more importantly, though, TPPI also protects the victims whose losses might otherwise go uncompensated. Ought responsible drivers therefore take out TPPI? This paper begins by enumerating some reasons why a rational person might believe that they have a moral obligation to take out TPPI. It will be argued that if what is at stake in taking responsibility is the ability to compensate our possible future victims for their losses, then it might initially seem that most people should be thankful for the availability of relatively inexpensive TPPI, because without it they may not have sufficient funds to do the right thing and compensate their victims in the event of an accident. But is the ability to compensate one's victims really what is at stake in taking responsibility? The second part of this paper critically examines the arguments for the above position, and it argues that these arguments do not support the conclusion that injurers should compensate their victims for their losses, and hence that drivers need not take out TPPI in order to be responsible. Further still, even if these arguments did support that conclusion, then (perhaps surprisingly) nobody should be allowed to take out TPPI, because doing so would frustrate justice.
In the field of ‘neurolaw’, reformists claim that recent scientific discoveries from the mind sciences have serious ramifications for how legal responsibility should be adjudicated, but conservatives deny that this is so. In contrast, I criticise both of these polar opposite positions by arguing that although scientific findings can have often-weighty normative significance, they lack the normative authority with which reformists often imbue them. After explaining why conservatives and reformists are both wrong, I then offer my own moderate suggestions about what views we have reason to endorse. My moderate position reflects the familiar capacitarian idea which underlies much lay, legal, and philosophical thinking about responsibility: namely, that responsibility tracks mental capacity.
New concepts may prove necessary to profit from the avalanche of sequence data on the genome, transcriptome, proteome and interactome, and to relate this information to cell physiology. Here, we focus on the concept of large activity-based structures, or hyperstructures, in which a variety of types of molecules are brought together to perform a function. We review the evidence for the existence of hyperstructures responsible for the initiation of DNA replication, the sequestration of newly replicated origins of replication, cell division, and metabolism. The processes responsible for hyperstructure formation include changes in enzyme affinities due to metabolite induction, lipid-protein affinities, elevated local concentrations of proteins and of their binding sites on DNA and RNA, and transertion. Experimental techniques exist that can be used to study hyperstructures, and we review some of those less familiar to biologists. Finally, we speculate on how a variety of in silico approaches involving cellular automata and multi-agent systems could be combined to develop new concepts in the form of an Integrated cell (I-cell), which would undergo selection for growth and survival in a world of artificial microbiology.
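As a purely illustrative aside (my own sketch, not the authors' model), the kind of cellular-automaton approach the abstract gestures at might look like the following: lattice sites hold a local molecule concentration, and a biased-diffusion rule lets dense regions recruit more material, so hyperstructure-like clusters emerge. Every parameter, rule, and threshold here is an assumption.

```python
import random

# Toy cellular automaton (illustrative only): each lattice site carries a
# local molecule concentration; at each step a site keeps half its content
# and shares the rest with neighbours, biased toward denser neighbours.
# This crude "affinity" rule makes dense clusters - stand-ins for
# hyperstructures - grow at the expense of dilute regions.

SIZE = 20          # lattice is SIZE x SIZE, periodic boundaries (assumed)
STEPS = 50         # number of update steps (assumed)
THRESHOLD = 3.0    # neighbourhood density marking a "hyperstructure" (assumed)

def neighbours(i, j):
    """Von Neumann neighbourhood with periodic (toroidal) boundaries."""
    return [((i - 1) % SIZE, j), ((i + 1) % SIZE, j),
            (i, (j - 1) % SIZE), (i, (j + 1) % SIZE)]

def step(conc):
    """One synchronous update of the biased-diffusion rule (mass-conserving)."""
    new = [[0.0] * SIZE for _ in range(SIZE)]
    for i in range(SIZE):
        for j in range(SIZE):
            new[i][j] += conc[i][j] / 2              # keep half in place
            nbrs = neighbours(i, j)
            weights = [conc[x][y] + 0.1 for x, y in nbrs]
            total = sum(weights)
            for (x, y), w in zip(nbrs, weights):     # denser neighbours get more
                new[x][y] += conc[i][j] / 2 * w / total
    return new

random.seed(0)
conc = [[random.random() for _ in range(SIZE)] for _ in range(SIZE)]
for _ in range(STEPS):
    conc = step(conc)

clustered = sum(1 for i in range(SIZE) for j in range(SIZE)
                if sum(conc[x][y] for x, y in neighbours(i, j)) > THRESHOLD)
print(f"sites inside putative hyperstructure clusters: {clustered}")
```

A fuller I-cell model along the authors' lines would presumably couple many such layers (one per molecular species), add agents for enzymes and genes, and impose the selection-for-growth step the abstract mentions.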
I ARGUE IN THIS PAPER that there are profound and legitimate worries concerning the application of organic and personal criteria to groups. I try to specify the reasons why we object to such ideas, while contending that some of these objections are misguided. Primarily, to refer to a group as a person is not necessarily the same as referring to it as either organic or individual. Further, each term--organic, individual, and person--must be carefully unpacked and analyzed. One conclusion of this analysis is that there are important senses in which these concepts both overlap and diverge; one should not therefore automatically conflate them. Another conclusion is that there are senses of these terms which are perfectly innocuous. Finally, there are clearly ways in which we can sensibly use the notion of personality to apply to groups. The groups in question are usually diverse, and do not have to include the state. In fact, despite the absence of such "group person" usage from mainstream political theory, such language is commonplace in international politics and in legal discourse and practice. I attempt, in the final part of the paper, to schematize the various uses of personality, and I conclude by suggesting the fruitfulness of one particular sense of juristic personality.
In the last four decades there has been a great deal of work done on German idealism in all fields of humanistic study, including theology and the philosophy of religion, devoted particularly to the philosophies of Kant and Hegel. A great deal less has been done on the British idealist school, often because its members are regarded as slavish imitators of Kant or Hegel. Such a judgment is, though, misplaced. There is a rich and independent vein of idealist philosophical and theological speculation in Britain in the late nineteenth and early twentieth centuries, one that is not often studied in any great depth or mentioned except in passing.
Nationalism has had a complex relation with the discipline of political theory during the 20th century. Political theory has often been deeply uneasy with nationalism, given its role in the events leading up to and during the Second World War. Many theorists saw nationalism as an overly narrow and potentially irrationalist doctrine; in essence, it embodied a closed vision of the world. This article focuses on one key contributor to the immediate post-war debate—Karl Popper—who retained deep misgivings about nationalism until the end of his life, and indeed saw the events of the early 1990s (shortly before his death) as a confirmation of this distrust. Popper was one of a number of immediate post-war writers, such as Friedrich Hayek and Ludwig von Mises, who shared this unease with nationalism. They all had a powerful effect on social and political thought in the English-speaking world. Popper in particular articulated a deeply influential perspective that fortuitously encapsulated a cold war mentality in the 1950s. In 2005 Popper's critical views are doubly interesting, since the last decade has seen a renaissance of nationalist interests. The collapse of the Berlin wall in 1989 and the changing landscape of international and domestic politics have seen, once again, a massive growth of interest in nationalism, particularly from liberal political theorists, and a growing, at times immensely enthusiastic, academic literature trying to provide a distinctively benign benediction to nationalism.
This paper is concerned with the conception of the individual in Hegelian thought. The discussion will focus on some of the textual uses that Hegel and some Hegelians make of the term individual. The ultimate aim of the paper, however, is to focus on the concrete individual and to argue that there are two fundamentally important yet distinct uses to which Hegel and some Hegelians put the term. These two uses are not compatible, dialectically or otherwise. The plan of this paper is to state the nature of the problem of the individual and then to examine it in more detail through the writings of representative British Hegelians.
What did God mean to F.H. Bradley? Bradley’s style and subtle philosophical approach make it difficult to ascertain precisely what his settled thoughts were on this issue. He does say quite a lot, for example, about what God is not. This essay will initially follow out this negative reading, an enterprise that entails comparisons, first, with philosophy, or more appropriately the ‘metaphysical impulse’, second, with morality, and third, with history. Having followed out the more negative arguments, the essay turns to what religion is for Bradley. This is intrinsically more difficult. The discussion here elicits some help from the other idealist thinkers who were close readers of Bradley. Almost surprisingly, Bradley does comment substantively, if briefly, on issues of faith, miracle, atonement, prayer, church-going, and clerics. The most significant theological point to arise in Bradley is the importance of St. Paul’s writings: the Pauline ‘justification by faith’ becomes a repeated motif. However, there remains an ambiguous relation of God and religion to ‘social life’ or ‘sociality’, which is explored more carefully at the conclusion of the essay. The essay finally argues that Bradley’s highly nuanced response to religion and God embodies paradoxical and unresolved elements which encapsulate, unwittingly, a dilemma at the heart of the Idealist conception of religion.
For living beings, information is as fundamental as matter or energy. In this paper we show: (a) the inadequacies of quantitative theories of information, and (b) how a qualitative analysis leads to a classification of information systems and to a modelling of intercellular communication. From a quantitative point of view, the application in biology of information theories borrowed from communication engineering has proved disappointing. These theories deliberately ignore the significance of messages, and do not give any definition of information. They refer to quantities based upon arbitrarily defined probabilistic events. Probability is subjective: the receiver of the message needs to have meta-knowledge of the events. The quantity of information depends on the language, the coding, and an arbitrary definition of disorder. The suggested objectivity is fallacious.
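To see the claim that the measured "quantity of information" depends on how the probabilistic events are defined, consider a minimal worked example (my own, not from the paper): the same message yields different Shannon entropies depending purely on the chosen tokenisation.

```python
from collections import Counter
from math import log2

def entropy_bits_per_token(tokens):
    """Shannon entropy, in bits per token, of the empirical distribution."""
    counts = Counter(tokens)
    n = len(tokens)
    # max(...) clamps the -0.0 that arises for a single-symbol distribution
    return max(0.0, -sum(c / n * log2(c / n) for c in counts.values()))

msg = "ABABABAB"

# Events defined as single characters: two equiprobable symbols, A and B.
chars = list(msg)                                      # 1 bit per token
# Events defined as character pairs: one certain symbol, "AB".
pairs = [msg[i:i + 2] for i in range(0, len(msg), 2)]  # 0 bits per token

print(entropy_bits_per_token(chars) * len(chars))  # 8.0 bits in total
print(entropy_bits_per_token(pairs) * len(pairs))  # 0.0 bits in total
```

The message itself is unchanged; only the arbitrary choice of events changes, yet the measured "information" drops from 8 bits to 0, which is precisely the complaint about the purported objectivity of such measures.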
If code is law, then standards bodies are governments. This flawed but powerful metaphor suggests the need to examine more closely those standards bodies that are defining standards for the Internet. In this paper we examine the International Telecommunications Union, the Institute of Electrical and Electronics Engineers Standards Association, the Internet Engineering Task Force, and the World Wide Web Consortium. We compare the organizations on the basis of participation, transparency, authority, openness, security and interoperability. We conclude that the IETF and the W3C are becoming increasingly similar. We also conclude that the classical distinction between standards and implementations is decreasingly useful as standards are embodied in code – itself a form of speech or documentation. Recent Internet standards bodies have flourished in part by discarding or modifying the implementation/standards distinction. We illustrate that no single model is superior on all dimensions. The IETF is not scaling effectively, struggling with its explosive growth and the creation of thousands of working groups. The IETF's coordinating body, the Internet Society, addressed growth through a reorganization that removed democratic oversight. The W3C, initially the most closed, is becoming responsive to criticism and now includes open code participants. The IEEE SA and ITU have institutional controls appropriate for hardware but too constraining for code. Each organization has much to learn from the others.
A new approach to information is proposed, with the intention of providing a conceptual tool adapted to biology, one that includes a semantic value. Information involves a material support as well as a significance adapted to the cognitive domain of the receiver and/or the transmitter. A message does not carry any information, only data. The receiver makes an identification through a procedure of recognition of forms, which activates a previously learned significance. This treatment leads to a new significance (or new knowledge).
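A toy illustration of this receiver-centred picture (my own sketch, not the authors' formalism): the same string is mere data to one receiver and information to another, depending on whether it matches a previously learned form. The forms and significances below are invented placeholders.

```python
from typing import Optional

# Illustrative sketch: a receiver whose "cognitive domain" is a mapping from
# learned forms to significances. A message only becomes information when
# a recognised form activates one of those significances.

class Receiver:
    def __init__(self, learned: dict):
        self.learned = learned  # previously learned form -> significance

    def receive(self, message: str) -> Optional[str]:
        """Return the activated significance, or None if the form is unknown."""
        return self.learned.get(message)

cell = Receiver({"110": "nutrient gradient ahead", "001": "toxin detected"})

print(cell.receive("110"))  # recognised form: information for this receiver
print(cell.receive("111"))  # unrecognised form: the message remains mere data
```

On this reading, the "new significance (or new knowledge)" would correspond to the receiver updating its mapping after a successful identification, though the paper itself does not specify any such mechanism.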