As artificial intelligence becomes ubiquitous, it will be increasingly involved in novel, morally significant situations. Thus, understanding what it means for a machine to be morally responsible is important for machine ethics. Any method for ascribing moral responsibility to AI must be intelligible and intuitive to the humans who interact with it. We argue that the appropriate approach is to determine how AIs might fare on a standard account of human moral responsibility: a Strawsonian account. We make no claim that our Strawsonian approach is either the only one worthy of consideration or the obviously correct approach, but we think it is preferable to trying to marry fundamentally different ideas of moral responsibility into a single cohesive account. Under a Strawsonian framework, people are morally responsible when they are appropriately subject to a particular set of attitudes—reactive attitudes. We therefore determine under what conditions it might be appropriate to subject machines to this same set of attitudes. Although the Strawsonian account traditionally applies to individual humans, it is plausible that entities that are not individual humans but possess these attitudes are candidates for moral responsibility under a Strawsonian framework. We conclude that weak AI is never morally responsible, while a strong AI with the right emotional capacities may be morally responsible.
What is the source of rights? Rights have been grounded in divine agency, human nature, and morally justified claims, and have been used to assess the moral status of legal and customary social practices. The orthodoxy is that some of our rights are a species of unrecognized or natural rights. For example, black slaves in antebellum America were said to have such rights, and this was taken to provide a basis for establishing the immorality of slavery. Derrick Darby exposes the main shortcomings of the orthodox conception of the source of rights and proposes a radical alternative. He draws on the legacy of race and racism in the USA to argue that all rights are products of social recognition. This bold, lucid and meticulously argued book will inspire readers to rethink the central role assigned to rights in moral, political, and legal theory as well as in everyday evaluative discourse.
This paper constructs a model of metaphysical indeterminacy that can accommodate a kind of ‘deep’ worldly indeterminacy that arguably arises in quantum mechanics via the Kochen-Specker theorem, and that is incompatible with prominent theories of metaphysical indeterminacy such as that in Barnes and Williams (2011). We construct a variant of Barnes and Williams's theory that avoids this problem. Our version builds on situation semantics and uses incomplete, local situations rather than possible worlds to build a model. We evaluate the resulting theory and contrast it with similar alternatives, concluding that our model successfully captures deep indeterminacy.
Peter Vickers examines 'inconsistent theories' in the history of science--theories which, though contradictory, are held to be extremely useful. He argues that these 'theories' are actually significantly different entities, and warns that the traditional goal of philosophy to make substantial, general claims about how science works is misguided.
The paper studies how the localic notion of sublocale transfers to formal topology. For any formal topology (not necessarily with positivity predicate) we define a sublocale to be a cover relation that includes that of the formal topology. The family of sublocales has set-indexed joins. For each set of base elements there are corresponding open and closed sublocales, boolean complements of each other. They generate a boolean algebra amongst the sublocales. In the case of an inductively generated formal topology, the collection of inductively generated sublocales has coframe structure. Overt sublocales and weakly closed sublocales are described, and related via a new notion of "rest closed" sublocale to the binary positivity predicate. Overt, weakly closed sublocales of an inductively generated formal topology are in bijection with "lower powerpoints", arising from the impredicative theory of the lower powerlocale. Compact sublocales and fitted sublocales are described. Compact fitted sublocales of an inductively generated formal topology are in bijection with "upper powerpoints", arising from the impredicative theory of the upper powerlocale.
Pluralist and eliminativist positions have proliferated within both science and philosophy of science in recent decades. This paper asks why this shift of thinking has occurred, and where it is leading us. We provide an explanation which, if correct, entails that we should expect pluralism and eliminativism to transform other debates currently unaffected, and for good reasons. We then consider under what circumstances eliminativism will be appropriate, arguing that it depends not only on the term in question, but also on the context of discussion and details of the debate at hand. The resultant selective eliminativism is an appealing compromise for various ‘pluralists’ and ‘eliminativists’ who are currently locking horns.
Kirchhoff’s diffraction theory is introduced as a new case study in the realism debate. The theory is extremely successful despite being both inconsistent and not even approximately true. Some habitual realist proclamations simply cannot be maintained in the face of Kirchhoff’s theory, as the realist is forced to acknowledge that theoretical success can in some circumstances be explained in terms other than truth. The idiosyncrasy (or otherwise) of Kirchhoff’s case is considered.
Male circumcision—partial or total removal of the penile prepuce—has been proposed as a public health measure in Sub-Saharan Africa, based on the results of three randomized controlled trials (RCTs) showing a relative risk reduction of approximately 60 per cent for voluntary, adult male circumcision against female-to-male human immunodeficiency virus transmission in that context. More recently, long-time advocates of infant male circumcision have argued that these findings justify involuntary circumcision of babies and children in dissimilar public health environments, such as the USA, Australasia and Europe. In this article, we take a close look at the necessary ethical and empirical steps that would be needed to bridge the gap between the African RCTs and responsible public health policy in developed countries. In the course of doing so, we discuss some of the main disagreements about the moral permissibility of performing a nontherapeutic surgery on a child to benefit potential future sexual partners of his. In this context, we raise concerns not only about weaknesses in the available evidence concerning such claims of benefit, but also about a child’s moral interest in future autonomy and the preservation of his bodily integrity. We conclude that circumcision of minors in developed countries on public health grounds is much harder to justify than proponents of the surgery suggest.
It has been widely noted that Humean supervenience, according to which everything supervenes on intrinsic properties of point-sized things and the spatiotemporal relations between them, is at odds with the nonlocal character of quantum mechanics, according to which not everything supervenes on intrinsic properties of point-sized things and the spatiotemporal relations between them. In particular, a standard view is that the parts of a composite quantum system instantiate further relations which are not accounted for in Lewis's Humean mosaic. But that suggests a simple solution: why couldn't Lewis simply add these new relations to the supervenience basis? The aim of this article is to use Humean supervenience as a foil to spell out a feature of entanglement of general metaphysical interest: the way in which the metaphysical lessons drawn for two-party systems ramify when systems of many parties are considered. The main conclusion is that the proposed simple fix in fact results in a supervenience thesis different in kind from Lewis's, by making the relations in the supervenience basis global in a certain sense.
According to Lewis's modal realism, all ways the world could be are represented by possible worlds, and all possible worlds represent some way the world could be. That there are just the right possible worlds to represent all and only the ways the world could be is to be guaranteed by the principle of recombination. Lewis sketches the principle, but does not spell out a precise version that generates just the right possibilities. David Efird and Tom Stoneham have offered a principle that aims to do just that. In this paper, we argue that Efird and Stoneham's principle of recombination is not successful – it fails to generate the right possibilities – but we also suggest ways that their account might be improved to solve the problem we raise. We also argue against Efird and Stoneham's claim that the correct principle of recombination demonstrates the possibility of nothing concrete – it is true that their principle of recombination has models consistent with the existence of an empty world, but we only get the possibility of nothing if mereologically null individuals are possible. The Lewisian should only think mereologically null individuals are possible if he or she has some independent reason for believing in the possibility of an empty world, so the principle of recombination provides no new evidence for that possibility. We draw some morals from this for the correct way to formulate the principle of recombination.
In this article I discuss Geoffrey Vickers’ ideas from the perspective of moral and political philosophy. His thought is presented through three key terms, which I suggest can encapsulate his philosophy: (i) our human capacity to respond aptly to our situation; (ii) the analysis of modern society in terms of institutions; and (iii) the moral importance of responsibility to the maintenance of human culture and cooperation.
Quantum entanglement has long been thought to have deep metaphysical consequences. For example, it has been claimed to show that Humean supervenience is false or to involve a novel form of ontological holism. One way to avoid confronting the metaphysical consequences is to adopt some form of antirealism. In this paper we discuss two prominent strands in recent literature—wavefunction realism and “Super-Humeanism”—that appear quite different, but, as we see it, are instances of a more general strategy. In effect, what these attempt to do is to defuse the puzzle of entanglement by eliminating it. These interpretative movements are advertised as equally realist, but, we claim, fail to take an appropriately realist attitude towards entanglement. What we advocate instead is a genuine metaphysics of entanglement: instead of eliminating entanglement, develop a metaphysics that accounts for and explains it.
There has been recent interest in formulating theories of non-representational indeterminacy. The aim of this paper is to clarify the relevance of quantum mechanics to this project. Quantum-mechanical examples of vague objects have been offered by various authors, displaying indeterminate identity, in the face of the famous Evans argument that such an idea is incoherent. It has also been suggested that the quantum-mechanical treatment of state-dependent properties exhibits metaphysical indeterminacy. In both cases it is important to consider the details of the metaphysical account and the way in which the quantum phenomenon is captured within it. Indeed, if we adopt a familiar way of thinking about indeterminacy and apply it in a natural way to quantum mechanics, we run into illuminating difficulties and see that the case is far less straightforward than might be hoped.
The ontological status of theories themselves has recently re-emerged as a live topic in the philosophy of science. We consider whether a recent approach within the philosophy of art can shed some light on this issue. For many years philosophers of aesthetics have debated a paradox in the (meta)ontology of musical works (e.g. Levinson). Taken individually, there are good reasons to accept each of the following three propositions: (i) musical works are created; (ii) musical works are abstract objects; (iii) abstract objects cannot be created. However, it seems clear that, if one wants to avoid inconsistency, one cannot commit to all three. Following up recent developments courtesy of Cameron (2008a), we consider how one might respond to the corresponding set of propositions in the (meta)ontology of scientific theories.
The principle of the child's right to an open future was first proposed by the legal philosopher Joel Feinberg and developed further by bioethicist Dena Davis. The principle holds that children possess a unique class of rights called rights in trust—rights that they cannot yet exercise, but which they will be able to exercise when they reach maturity. Parents should not, therefore, take actions that permanently foreclose on or pre-empt the future options of their children, but leave them the greatest possible scope for exercising personal life choices in adulthood. Davis particularly applies the principle to genetic counselling, arguing that parents should not take deliberate steps to create physically abnormal children, and to religion, arguing that while parents are entitled to bring their children up in accordance with their own values, they are not entitled to inflict physical or mental harm, whether by omission or by commission. In this paper, I aim to elucidate the open future principle, and consider whether it is applicable to non-therapeutic circumcision of boys, whether performed for cultural/religious or for prophylactic/health reasons. I argue that the principle is highly applicable to non-therapeutic circumcision, and conclude that non-therapeutic circumcision would be a violation of the child's right to an open future, and thus objectionable from both an ethical and a human rights perspective.
Pluralist and eliminativist positions in philosophy – and other disciplines – have proliferated in recent decades. This paper emphasises the sheer scale of this movement: we start by summarising twenty debates which have been affected, thus illustrating how often debates have been transformed by the introduction of pluralist and/or eliminativist thinking. We then provide an explanation of why this shift of philosophical terrain has occurred, an explanation which in turn predicts that its reach will extend to other debates currently unaffected, and for good reasons. We go on to detail the landscape of various different pluralist and eliminativist positions one may favour. We ultimately argue for pluralism at the meta-level: whether one should implement pluralism or eliminativism depends on the context of discussion and the details of the debate at hand. We use this analysis to dissolve debates between ‘pluralists’ and ‘eliminativists’ in various domains.
One of the popular realist responses to the pessimistic meta-induction (PMI) is the ‘selective’ move, where a realist only commits to the ‘working posits’ of a successful theory, and withholds commitment to ‘idle posits’. Antirealists often criticise selective realists for not being able to articulate exactly what is meant by ‘working’ and/or not being able to identify the working posits except in hindsight. This paper aims to establish two results: first, sometimes a proposition is, in an important sense, ‘doing work’, and yet does not warrant realist commitment; and second, the realist will be able to respond to PMI-style historical challenges if she can merely show that certain selected posits do not require realist commitment. These two results act to significantly adjust the dialectic vis-à-vis PMI-style challenges to selective realism.