Germline gene editing (GGE) has enormous potential both as a research tool and as a therapeutic intervention. While other types of gene editing are relatively uncontroversial, GGE has been strongly resisted. In this article, we analyse the ethical arguments for and against pursuing GGE by allowing and funding its development. We argue there is a strong case for pursuing GGE for the prevention of disease. We then examine objections that have been raised against pursuing GGE and argue that these fail. We conclude that the moral case in favour of pursuing GGE is stronger than the case against. This suggests that pursuing GGE is morally permissible and indeed morally desirable.
It is notoriously difficult to find an intuitively satisfactory rule for evaluating populations based on the welfare of the people in them. Standard examples, like total utilitarianism, either entail the Repugnant Conclusion or in some other way contradict common intuitions about the relative value of populations. Several philosophers have presented formal arguments that seem to show that this happens of necessity: our core intuitions stand in contradiction. This paper assesses the state of play, focusing on the most powerful of these ‘impossibility theorems’, as developed by Gustaf Arrhenius. I highlight two ways in which these theorems fall short of their goal: some appeal to a supposedly egalitarian condition which, however, does not properly reflect egalitarian intuitions; the others rely on a background assumption about the structure of welfare which cannot be taken for granted. Nonetheless, the theorems remain important: they give insight into the difficulty, if not perhaps the impossibility, of constructing a satisfactory population axiology. We should aim for reflective equilibrium between intuitions and more theoretical considerations. I conclude by highlighting one possible ingredient in this equilibrium, which, I argue, leaves open a still wider range of acceptable theories: the possibility of vague or otherwise indeterminate value relations.
The first book-length study of property-owning democracy, Republic of Equals, argues that a society in which capital is universally accessible to all citizens is uniquely placed to meet the demands of justice. Arguing from a basis in liberal-republican principles, this expanded conception of the economic structure of society contextualizes the market to make its transactions fair. The author shows that a property-owning democracy structures economic incentives such that the domination of one agent by another in the market is structurally impossible. The result is a renovated form of capitalism in which the free market is no longer a threat to social democratic values, but is potentially convergent with them. It is argued that a property-owning democracy has advantages that give it priority over rival forms of social organization such as welfare state capitalism and market socialist institutions. The book also addresses the currently high levels of inequality in the societies of the developed West to suggest a range of policies that target the "New Inequality" of our times. For this reason, the work engages not only with political philosophers such as John Rawls, Philip Pettit and John Tomasi, but also with the work of economists and historians such as Anthony B. Atkinson, François Bourguignon, Jacob S. Hacker, Lane Kenworthy, and Thomas Piketty.
We give two social aggregation theorems under conditions of risk, one for constant population cases, the other an extension to variable populations. Intra- and interpersonal welfare comparisons are encoded in a single ‘individual preorder’. The theorems give axioms that uniquely determine a social preorder in terms of this individual preorder. The social preorders described by these theorems have features that may be considered characteristic of Harsanyi-style utilitarianism, such as indifference to ex ante and ex post equality. However, the theorems are also consistent with the rejection of all of the expected utility axioms, completeness, continuity, and independence, at both the individual and social levels. In that sense, expected utility is inessential to Harsanyi-style utilitarianism. In fact, the variable population theorem imposes only a mild constraint on the individual preorder, while the constant population theorem imposes no constraint at all. We then derive further results under the assumption of our basic axioms. First, the individual preorder satisfies the main expected utility axiom of strong independence if and only if the social preorder has a vector-valued expected total utility representation, covering Harsanyi’s utilitarian theorem as a special case. Second, stronger utilitarian-friendly assumptions, like Pareto or strong separability, are essentially equivalent to strong independence. Third, if the individual preorder satisfies a ‘local expected utility’ condition popular in non-expected utility theory, then the social preorder has a ‘local expected total utility’ representation. Fourth, a wide range of non-expected utility theories nevertheless lead to social preorders of outcomes that have been seen as canonically egalitarian, such as rank-dependent social preorders. Although our aggregation theorems are stated under conditions of risk, they are valid in more general frameworks for representing uncertainty or ambiguity.
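For readers unfamiliar with the decision-theoretic background, the two notions doing the most work here, strong independence and the (vector-valued) expected total utility representation, can be sketched as follows. These are standard textbook formulations in our own notation, not quotations from the paper's theorems:

```latex
% Strong independence for a preorder $\succsim$ on a convex set of lotteries:
\[
  p \succsim q
  \;\Longleftrightarrow\;
  \alpha p + (1-\alpha) r \,\succsim\, \alpha q + (1-\alpha) r
  \qquad \text{for all lotteries } r \text{ and all } \alpha \in (0,1].
\]
% Expected total utility: with $N$ the set of people who exist in an outcome
% and $w_i$ the welfare of person $i$, a social preorder has an expected
% total utility representation when
\[
  p \succsim q
  \;\Longleftrightarrow\;
  \mathbb{E}_p\Big[\sum_{i \in N} u(w_i)\Big]
  \;\ge\;
  \mathbb{E}_q\Big[\sum_{i \in N} u(w_i)\Big],
\]
% understood coordinatewise when $u$ takes values in a vector space.
```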
What are the processes, from conception to adulthood, that enable a single cell to grow into a sentient adult? Neuroconstructivism is a pioneering two-volume work that sets out a whole new framework for considering the complex topic of development, integrating data from cognitive studies, computational work, and neuroimaging.
We set out an account of how self-domestication plays a crucial role in the evolution of language. In doing so, we focus on the growing body of work that treats language structure as emerging from the process of cultural transmission. We argue that a full recognition of the importance of cultural transmission fundamentally changes the kind of questions we should be asking regarding the biological basis of language structure. If we think of language structure as reflecting an accumulated set of changes in our genome, then we might ask something like, “What are the genetic bases of language structure and why were they selected?” However, if cultural evolution can account for language structure, then this question no longer applies. Instead, we face the task of accounting for the origin of the traits that enabled that process of structure-creating cultural evolution to get started in the first place. In light of work on cultural evolution, then, the new question for biological evolution becomes, “How did those precursor traits evolve?” We identify two key precursor traits: the transmission of the communication system through learning; and the ability to infer the communicative intent associated with a signal or action. We then describe two comparative case studies—the Bengalese finch and the domestic dog—in which parallel traits can be seen emerging following domestication. Finally, we turn to the role of domestication in human evolution. We argue that the cultural evolution of language structure has its origin in an earlier process of self-domestication.
It is only in the last few decades that analytic philosophers in particular have begun to pay any serious attention to the topic of life’s meaning. Such philosophers, however, do not usually attempt to answer or analyse the traditional question ‘What is the meaning of life?’, but rather the subtly different question ‘What makes a life meaningful?’ and it is generally assumed that the latter can be discussed independently of the former. Nevertheless, this paper will argue that the two questions are indeed connected, and that identifying and expanding upon the most plausible analysis of the former will provide the resources necessary to determine the most plausible answer to the latter. Specifically, this paper will argue that the traditional question is simply a request for the information which constitutes a coherent answer to one or more of a certain set of questions regarding human existence that were salient to the asker. In simpler language, the meaning of life itself is the information a person needs to make sense of it. This analysis can then also be applied to individual lives, such that asking for the meaning of X’s life is an analogous request for the information necessary to make sense of that life in particular. Running with this concept of the ‘meaning’ of something as its ‘sense’, the paper then outlines an accompanying theory of ‘meaningfulness’ as ‘sensefulness’: a measure of the richness of certain aspects of the life, multiplied by their intelligibility.
Political realism criticises the putative abstraction, foundationalism and neglect of the agonistic dimension of political practice in the work of John Rawls. This paper argues that had Rawls not fully specified the implementation of his theory of justice in one particular form of political economy, then he would be vulnerable to a realist critique. But he did present such an implementation: a property-owning democracy. An appreciation of Rawls's specificationist method undercuts the realist critique of his conception of justice as fairness.
We have synthesized a 582,970-base pair Mycoplasma genitalium genome. This synthetic genome, named M. genitalium JCVI-1.0, contains all the genes of wild-type M. genitalium G37 except MG408, which was disrupted by an antibiotic marker to block pathogenicity and to allow for selection. To identify the genome as synthetic, we inserted "watermarks" at intergenic sites known to tolerate transposon insertions. Overlapping "cassettes" of 5 to 7 kilobases (kb), assembled from chemically synthesized oligonucleotides, were joined by in vitro recombination to produce intermediate assemblies of approximately 24 kb, 72 kb ("1/8 genome"), and 144 kb ("1/4 genome"), which were all cloned as bacterial artificial chromosomes in Escherichia coli. Most of these intermediate clones were sequenced, and clones of all four 1/4 genomes with the correct sequence were identified. The complete synthetic genome was assembled by transformation-associated recombination cloning in the yeast Saccharomyces cerevisiae, then isolated and sequenced. A clone with the correct sequence was identified. The methods described here will be generally useful for constructing large DNA molecules from chemically synthesized pieces and also from combinations of natural and synthetic DNA segments. doi:10.1126/science.1151721
Many believe that the ethical problems of donation after cardiocirculatory death (DCD) have been "worked out" and that it is unclear why DCD should be resisted. In this paper we will argue that DCD donors may not yet be dead, and therefore that organ donation during DCD may violate the dead donor rule. We first present a description of the process of DCD and the standard ethical rationale for the practice. We then present our concerns with DCD, including the following: irreversibility of absent circulation has not occurred, and the many attempts to claim it has have all failed; conflicts of interest at all steps in the DCD process, including the decision to withdraw life support before DCD, are simply unavoidable; potentially harmful premortem interventions to preserve organ utility are not justifiable, even with the help of the principle of double effect; claims that DCD conforms with the intent of the law and current accepted medical standards are misleading and inaccurate; and consensus statements by respected medical groups do not change these arguments due to their low quality, including being plagued by conflict of interest. Moreover, some arguments in favor of DCD, while likely true, are "straw-man arguments," such as the great benefit of organ donation. The truth is that honesty and trustworthiness require that we face these problems instead of avoiding them. We believe that DCD is not ethically allowable because it abandons the dead donor rule, has unavoidable conflicts of interest, and implements premortem interventions which can hasten death. These important points have not been, but need to be, fully disclosed to the public and incorporated into fully informed consent. These are tall orders, and require open public debate. Until this debate occurs, we call for a moratorium on the practice of DCD.
In Value and Context, Alan Thomas articulates and defends the view that human beings do possess moral and political knowledge, but it is historically and culturally contextual knowledge in ways that, say, mathematical or chemical knowledge is not. In his exposition of "cognitive contextualism" in ethics and politics, he makes wide-ranging use of contemporary work in epistemology, moral philosophy, and political theory.
This study examined the effect of various antecedent variables on marketers’ perceptions of the role of ethics and social responsibility in the overall success of the firm. Variables examined included Hofstede’s cultural dimensions, as well as corporate ethical values and enforcement of an ethics code. Additionally, individual variables such as ethical idealism and relativism were included. Results indicated that most of these variables impacted marketers’ perceptions of the importance of ethics and social responsibility, although to varying degrees.
Duncan Purves and Nicolas Delon have argued that one’s life will be meaningful to the extent that one contributes to valuable states of affairs and this contribution is a result of one’s intentional actions. They then argue, contrary to some theorists’ intuitions, that non-human animals are capable of fulfilling these requirements, and that this finding might entail important things for the animal ethics movement. In this paper, I also argue that things besides human beings can have meaningful existences, but I disagree with Purves and Delon’s theory of meaning, and some of the practical implications they suggest arise from their conclusion. Specifically, I argue that Purves and Delon are wrong to suggest that intentional agency is necessary for one’s life to be meaningful; contributing to valuable states of affairs can be sufficient by itself. Purves and Delon’s objection to such a claim is that it would allow even inanimate objects’ existences to count as meaningful. However, while I accept this...
Mental imagery (varieties of which are sometimes colloquially referred to as “visualizing,” “seeing in the mind's eye,” “hearing in the head,” “imagining the feel of,” etc.) is quasi-perceptual experience; it resembles perceptual experience, but occurs in the absence of the appropriate external stimuli. It is also generally understood to bear intentionality (i.e., mental images are always images of something or other), and thereby to function as a form of mental representation. Traditionally, visual mental imagery, the most discussed variety, was thought to be caused by the presence of picturelike representations (mental images) in the mind, soul, or brain, but this is no longer universally accepted.
What is time? This is one of the most fundamental questions we can ask. Emily Thomas explores how a new theory of time emerged in the seventeenth century. The 'absolute' theory of time held that it is independent of material bodies or human minds, so even if nothing else existed there would be time.
We provide conditions under which an incomplete strongly independent preorder on a convex set X can be represented by a set of mixture-preserving real-valued functions. We allow X to be infinite dimensional. The main continuity condition we focus on is mixture continuity. This is sufficient for such a representation provided X has countable dimension or satisfies a condition that we call Polarization.
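The central definitions can be stated briefly. These are the standard formulations, in our own notation rather than the paper's:

```latex
% Mixture preservation: $u : X \to \mathbb{R}$ is mixture preserving if
\[
  u\big(\alpha x + (1-\alpha)y\big) = \alpha\, u(x) + (1-\alpha)\, u(y)
  \qquad \text{for all } x, y \in X,\ \alpha \in [0,1].
\]
% Multi-representation of a (possibly incomplete) preorder $\succsim$ by a
% set $U$ of such functions:
\[
  x \succsim y \;\Longleftrightarrow\; u(x) \ge u(y) \ \text{for every } u \in U.
\]
% Mixture continuity: for all $x, y, z \in X$, the sets
\[
  \{\alpha \in [0,1] : \alpha x + (1-\alpha)y \succsim z\}
  \quad\text{and}\quad
  \{\alpha \in [0,1] : z \succsim \alpha x + (1-\alpha)y\}
\]
% are closed in $[0,1]$.
```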
What makes us conscious? Many theories that attempt to answer this question have appeared recently in the context of widespread interest about consciousness in the cognitive neurosciences. Most of these proposals are formulated in terms of the information processing conducted by the brain. In this overview, we survey and contrast these models. We first delineate several notions of consciousness, addressing what it is that the various models are attempting to explain. Next, we describe a conceptual landscape that addresses how the theories attempt to explain consciousness. We then situate each of several representative models in this landscape and indicate which aspect of consciousness they try to explain. We conclude that the search for the neural correlates of consciousness should be usefully complemented by a search for the computational correlates of consciousness.
Many scholars have drawn attention to the way that elements of Anne Conway’s system anticipate ideas found in Leibniz. This paper explores the relationship between Conway’s and Leibniz’s work with regard to time, space, and process. It argues – against existing scholarship – that Conway is not a proto-Leibnizian relationist about time or space, and in fact her views lie much closer to those of Henry More; yet Conway and Leibniz agree on the primacy of process. This exploration advances our understanding of Conway’s system, and the intellectual relationships between Conway, More, and Leibniz.
This thesis consists of several independent papers in population ethics. I begin in Chapter 1 by critiquing some well-known 'impossibility theorems', which purport to show there can be no intuitively satisfactory population axiology. I identify axiological vagueness as a promising way to escape or at least mitigate the effects of these theorems. In particular, in Chapter 2, I argue that certain of the impossibility theorems have little more dialectical force than sorites arguments do. From these negative arguments I move to positive ones. In Chapter 3, I justify the use of a 'veil of ignorance', starting from three more basic normative principles. This leads to positive arguments for various kinds of utilitarianism - the best such arguments I know. But in general the implications of the veil depend on how one answers what I call 'the risky existential question': what is the value to an individual of a chance of non-existence? I chart out the main options, and raise some puzzles for non-comparativism, the view that life is incomparable to non-existence. Finally, in Chapter 4, I consider the consequences for population ethics of the idea that what is normatively relevant is not personal identity, but a degreed relation of psychological connectedness. In particular, I pursue a strategy based in population ethics for understanding the controversial 'time-relative interests' account of the badness of death.
Using Hofstede's culture theory (1980, 2001, Culture's Consequences: Comparing Values, Behaviours, Institutions, and Organizations Across Nations. Sage, New York), the current study incorporates the moral development (e.g. Thorne, 2000; Thorne and Magnan, 2000; Thorne et al., 2003) and multidimensional ethics scale (e.g. Cohen et al., 1993; Cohen et al., 1996b; Cohen et al., 2001; Flory et al., 1992) approaches to compare the ethical reasoning and decisions of Canadian and Mainland Chinese final-year undergraduate accounting students. The results indicate that Canadian accounting students' formulation of an intention to act on a particular ethical dilemma (deliberative reasoning) as measured by the moral development approach (Thorne, 2000) was higher than that of Mainland Chinese accounting students. The current study proposes that the five factors identified by the multidimensional ethics scale (MES) as being relevant to ethical decision making can be placed into the three levels of ethical reasoning identified by Kohlberg's (1958, The Development of Modes of Moral Thinking and Choice in the Years Ten to Sixteen. University of Chicago, doctoral dissertation) theory of cognitive moral development. Canadian accounting students used post-conventional MES factors (moral equity, contractualism, and utilitarianism) more frequently and made more ethical audit decisions than Chinese accounting students.
Super-substantivalism is the thesis that space is identical to matter; it is currently under discussion – see Sklar (1977, 221–4), Earman (1989, 115–6) and Schaffer (2009) – in contemporary philosophy of physics and metaphysics. Given this current interest, it is worth investigating the thesis in the history of philosophy. This paper examines the super-substantivalism of Samuel Alexander, an early twentieth-century metaphysician primarily associated with (the movement now known as) British Emergentism. Alexander argues that spacetime is ontologically fundamental and it gives rise to an ontological hierarchy of emergence, involving novel properties such as matter, life and mind. Alexander's super-substantivalism is interesting not just because of its historical importance but also because Alexander unusually attempts to explain why spacetime is identical to matter. This paper carefully unpacks that explanation and shows how Alexander is best read as conceiving of spacetime as a Spinozistic substance, worked upon by evolution.
The growing block view of time holds that the past and present are real whilst the future is unreal; as future events become present and real, they are added on to the growing block of reality. Surprisingly, given the recent interest in this view, there is very little literature on its origins. This paper explores those origins, and advances two theses. First, I show that although C. D. Broad’s Scientific Thought provides the first defence of the growing block theory, the theory receives its first articulation in Samuel Alexander’s Space, Time, and Deity. Further, Alexander’s account of deity inclines towards the growing block view. Second, I argue that Broad shifted towards the growing block theory as a result of his newfound conviction that time has a direction. By way of tying these theses together, I argue that Broad’s views on the direction of time – and possibly even his growing block theory – are sourced in Alexander.
Virtue ethics has long provided fruitful resources for the study of issues in medical ethics. In particular, study of the moral virtues of the good doctor—like kindness, fairness and good judgement—has provided insights into the nature of medical professionalism and the ethical demands on the medical practitioner as a moral person. Today, a substantial literature exists exploring the virtues in medical practice and many commentators advocate an emphasis on the inculcation of the virtues of good medical practice in medical education and throughout the medical career. However, until very recently, no empirical studies have attempted to investigate which virtues, in particular, medical doctors and medical students tend to have or not to have, nor how these virtues influence how they think about or practise medicine. The question of what virtuous medical practice is, is vast and, as we have written elsewhere, the question of how to study doctors’ moral character is fraught with difficulty. In this paper, we report the results of a first-of-its-kind study that attempted to explore these issues at three medical schools in the United Kingdom. We identify which character traits are important in the good doctor in the opinion of medical students and doctors and identify which virtues they say of themselves they possess and do not possess. Moreover, we identify how thinking about the virtues contributes to doctors’ and medical students’ thinking about common moral dilemmas in medicine. In ending, we remark on the implications for medical education.
The Jubilee Centre’s new report, Virtuous Medical Practice, examines the place of character and values in the medical profession in Britain today. Its findings are drawn from a UK-focused multi-methods study of 549 doctors and aspiring doctors at three career stages: first-year students, final-year students, and experienced doctors.
Moral distress is one of the core topics of clinical ethics. Although there is a large and growing empirical literature on the psychological aspects of moral distress, scholars and empirical investigators of moral distress have recently called for greater conceptual clarity. To meet this recognized need, we provide a philosophical taxonomy of the categories of what we call ethically significant moral distress: the judgment that one is not able, to differing degrees, to act on one’s moral knowledge about what one ought to do. We begin by unpacking the philosophical components of Andrew Jameton’s original formulation from his landmark 1984 work and identify two key respects in which that formulation remains unclear: the origins of moral knowledge and impediments to acting on that moral knowledge. We then selectively review subsequent literature that shows that there is more than one concept of moral distress and that explores the origin of the values implicated in moral distress and impediments to acting on those values. This review sets the stage for identifying the elements of a philosophical taxonomy of ethically significant moral distress. The taxonomy uses these elements to create six categories of ethically significant moral distress: challenges to, threats to, and violations of professional integrity; and challenges to, threats to, and violations of individual integrity. We close with suggestions about how the proposed philosophical taxonomy of ethically significant moral distress sheds light on the concepts of moral residue and the crescendo effect of moral distress and how the proposed taxonomy might usefully guide prevention of, and future qualitative and quantitative empirical research on, ethically significant moral distress.
Social media applications such as Twitter, YouTube and Facebook have attained huge popularity, with more than three billion people and organizations predicted to have a social networking account by 2015. Social media offers a rapid avenue of communication with the public and has potential benefits for communicable disease control and surveillance. However, its application in everyday public health practice raises a number of important issues around confidentiality and autonomy. We report here a case from local level health protection where the friend of an individual with meningococcal septicaemia used a social networking site to notify potential contacts.
In a paper recently published in this journal, Navin and Largent argue in favour of a type of policy to regulate non-medical exemptions from childhood vaccination which they call ‘Inconvenience’. This policy makes it burdensome for parents to obtain an exemption to child vaccination, for example, by requiring parents to attend immunization education sessions and to complete an application form to receive a waiver. Navin and Largent argue that this policy is preferable to ‘Eliminationism’, i.e. to policies that do not allow non-medical exemptions, because Inconvenience has been shown to keep exemption rates low while not harming parents by forcing them to do something that goes against their beliefs. We argue that it is at least doubtful that Inconvenience is ethically preferable to Eliminationism: while the latter disregards the value of liberty, Inconvenience disregards the value of fairness in the distribution of the burdens entailed by the preservation of a public good like herd immunity. We propose a variant of Inconvenience, which we call ‘Contribution’, which we think is preferable to the versions of Inconvenience discussed by Navin and Largent in that it successfully strikes a balance between the values of parents’ liberty, fairness and expected utility.
It is often assumed that similar domain-specific behavioural impairments found in cases of adult brain damage and developmental disorders correspond to similar underlying causes, and can serve as convergent evidence for the modular structure of the normal adult cognitive system. We argue that this correspondence is contingent on an unsupported assumption that atypical development can produce selective deficits while the rest of the system develops normally (Residual Normality), and that this assumption tends to bias data collection in the field. Based on a review of connectionist models of acquired and developmental disorders in the domains of reading and past tense, as well as on new simulations, we explore the computational viability of Residual Normality and the potential role of development in producing behavioural deficits. Simulations demonstrate that damage to a developmental model can produce very different effects depending on whether it occurs prior to or following the training process. Because developmental disorders typically involve damage prior to learning, we conclude that the developmental process is a key component of the explanation of end-state impairments in such disorders. Further simulations demonstrate that in simple connectionist learning systems, the assumption of Residual Normality is undermined by processes of compensation or alteration elsewhere in the system. We outline the precise computational conditions required for Residual Normality to hold in development, and suggest that in many cases it is an unlikely hypothesis. We conclude that in developmental disorders, inferences from behavioural deficits to underlying structure crucially depend on developmental conditions, and that the process of ontogenetic development cannot be ignored in constructing models of developmental disorders. Keywords: acquired and developmental disorders; connectionist models; modularity; past tense; reading.
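The before/after-training contrast can be sketched in a few lines. This is a minimal illustration of the general idea, not the authors' actual simulations: "acquired" damage lesions an already-trained network, while "developmental" damage is modelled crudely as the same resources being absent throughout learning (a smaller hidden layer), so the remaining units can compensate during training. The task, architecture, and lesion size are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy quasi-regular task: XOR of the first two bits of a 4-bit input
# (a stand-in for mappings like past-tense formation or reading aloud).
X = np.array([[int(b) for b in f"{i:04b}"] for i in range(16)], dtype=float)
y = (X[:, 0] != X[:, 1]).astype(float).reshape(-1, 1)

def init_net(n_hidden):
    """A 4 -> n_hidden -> 1 network with tanh hidden units, sigmoid output."""
    return [rng.normal(0.0, 0.5, (4, n_hidden)),
            rng.normal(0.0, 0.5, (n_hidden, 1))]

def forward(W, X):
    h = np.tanh(X @ W[0])
    return 1.0 / (1.0 + np.exp(-(h @ W[1]))), h

def mse(W):
    out, _ = forward(W, X)
    return float(np.mean((out - y) ** 2))

def train(W, epochs=3000, lr=1.0):
    """Plain batch gradient descent on mean squared error."""
    for _ in range(epochs):
        out, h = forward(W, X)
        d_out = (out - y) * out * (1.0 - out)     # gradient through sigmoid
        d_h = (d_out @ W[1].T) * (1.0 - h ** 2)   # backprop through tanh
        W[1] -= lr * (h.T @ d_out) / len(X)
        W[0] -= lr * (X.T @ d_h) / len(X)
    return W

def lesion(W, units):
    """Silence some hidden units by zeroing their outgoing weights."""
    W = [w.copy() for w in W]
    W[1][list(units), :] = 0.0
    return W

net = init_net(8)
e0 = mse(net)                        # error before any training
intact = train(net)                  # normally developed "adult" network
acquired = lesion(intact, range(4))  # same units lost after learning
developmental = train(init_net(4))   # units absent throughout learning

print(f"pre-training {e0:.3f}  intact {mse(intact):.3f}  "
      f"acquired {mse(acquired):.3f}  developmental {mse(developmental):.3f}")
```

On runs like this, lesioning after training and restricting resources before training typically yield different end-state error profiles, which is the qualitative point the abstract's simulations make against Residual Normality.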
For early modern metaphysician Anne Conway, the world comprises creatures. In some sense, Conway is a monist about creatures: all creatures are one. Yet, as Jessica Gordon-Roth has astutely pointed out, that monism can be understood in very different ways. One might read Conway as an ‘existence pluralist’: creatures are all composed of the same type of substance, but many substances exist. Alternatively, one might read Conway as an ‘existence monist’: there is only one created substance. Gordon-Roth has done the scholarship a great favor by illuminating these issues in Conway. However, this article takes issue with Gordon-Roth's further view that Conway ‘oscillates’ between the extremes of existence pluralism and monism. In its place, I argue we should read Conway as a priority monist: the whole of creation is ontologically prior to its parts.
The work of women philosophers in the early modern period has traditionally been overlooked, yet their writing on topics such as reality, time, mind and matter holds valuable lessons for our understanding of metaphysics and its history. This volume of new essays explores the work of nine key female figures: Bathsua Makin, Anna Maria van Schurman, Elisabeth of Bohemia, Margaret Cavendish, Anne Conway, Damaris Cudworth Masham, Mary Astell, Catharine Trotter Cockburn, and Émilie Du Châtelet. Investigating issues from eternity to free will and from body to natural laws, the essays uncover long-neglected perspectives and demonstrate their importance for philosophical debates, both then and now. Combining careful philosophical analysis with discussion of the intellectual and historical context of each thinker, they will set the agenda for future enquiry and will appeal to scholars and students of the history of metaphysics, science, religion and feminism.
Algorithms are powerful because we invest in them the power to do things. With such promise, they can transform the ordinary, say snapshots along a robotic vacuum cleaner’s route, into something much more, such as a clean home. Echoing David Graeber’s revision of fetishism, we argue that this easy slip from technical capabilities to broader claims betrays not the “magic” of algorithms but rather the dynamics of their exchange. Fetishes are not indicators of false thinking, but social contracts in material form. They mediate emerging distributions of power often too nascent, too slippery or too disconcerting to directly acknowledge. Drawing primarily on 2016 ethnographic research with computer vision professionals, we show how faith in what algorithms can do shapes the social encounters and exchanges of their production. By analyzing algorithms through the lens of fetishism, we can see the social and economic investment in some people’s labor over others. We also see everyday opportunities for social creativity and change. We conclude that what is problematic about algorithms is not their fetishization but instead their stabilization into full-fledged gods and demons – the more deserving objects of critique.