The 2008 financial crisis exposed the dark side of the financial sector in the UK. It brought attention to the contaminated culture of the business, which accommodated the systemic malpractices that largely contributed to the financial turmoil of 2008. In the wake of the crisis, there seems to be a broad consensus that this contaminated culture can no longer be accepted and needs to change. This article examines the ills of the UK financial market, more specifically the cultural contamination problem uncovered by the 2008 financial crisis, in order to explore its genesis and suitable solutions for it. In this regard, the article analyses the ethical finance sector from theoretical and practical perspectives in order to assess its role in addressing the cultural contamination problem of the UK financial market.
This comprehensive text on African Mathematics addresses some of the problematic issues in the field, such as attitudes, curriculum development, educational change, academic achievement, standardized and other tests, performance factors, student characteristics, cross-cultural differences and studies, literacy, native speakers, social class and differences, equal education, teaching methods, and more.
This article extends the combinatorial approach to support the determination of contextuality amidst causal influences. Contextuality is an active field of study in Quantum Cognition, in systems relating to mental phenomena, such as concepts in human memory. In the cognitive field of study, a contemporary challenge facing the determination of whether a phenomenon is contextual has been the identification and management of disturbances. Whether said disturbances are identified through the modeling approach, constitute causal influences, or are disregardable as noise is important, as contextuality cannot be adequately determined in the presence of causal influences. To address this challenge, we first provide a formalization of the necessary elements of the combinatorial approach within the language of canonical causal models. Through this formalization, we extend the combinatorial approach to support a measurement and treatment of disturbance, and offer techniques to separately distinguish noise and causal influences. Thereafter, we develop a protocol through which these elements may be represented within a cognitive experiment. As human cognition seems rife with causal influences, cognitive modelers may apply the extended combinatorial approach to practically determine the contextuality of cognitive phenomena.
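For orientation, "determining contextuality" in the combinatorial tradition amounts to checking whether the context-by-context statistics admit a single global joint distribution. The sketch below brute-forces that check for a toy two-party, two-measurement scenario via a linear-programming feasibility test; the PR-box statistics and all names are illustrative assumptions, not the paper's protocol.

```python
# Sketch: test whether a toy empirical model (two parties, two binary
# measurements each) admits a global section, i.e. a joint distribution
# over (a0, a1, b0, b1) reproducing all four context distributions.
# Feasible -> noncontextual; infeasible -> contextual.
import itertools
import numpy as np
from scipy.optimize import linprog

# Illustrative PR-box statistics: in context (x, y), outcomes satisfy
# a XOR b = x AND y, each consistent pair with probability 1/2.
def pr_box(x, y, a, b):
    return 0.5 if (a ^ b) == (x & y) else 0.0

globals_ = list(itertools.product([0, 1], repeat=4))   # (a0, a1, b0, b1)
rows, rhs = [], []
for x, y in itertools.product([0, 1], repeat=2):       # four contexts
    for a, b in itertools.product([0, 1], repeat=2):   # four joint outcomes
        rows.append([1.0 if (g[x] == a and g[2 + y] == b) else 0.0
                     for g in globals_])
        rhs.append(pr_box(x, y, a, b))

res = linprog(c=np.zeros(len(globals_)), A_eq=np.array(rows),
              b_eq=np.array(rhs), bounds=(0, 1), method="highs")
print("noncontextual (global section exists)?", res.status == 0)
```

The PR box famously admits no global section, so the feasibility test fails here; swapping pr_box for statistics generated by a classical joint distribution makes it succeed.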
Parkinson's Disease (PD) is a long-term degenerative disorder of the central nervous system that mainly affects the motor system. The symptoms generally come on slowly over time. Early in the disease, the most obvious are shaking, rigidity, slowness of movement, and difficulty with walking. Doctors do not know what causes it and find it difficult to diagnose Parkinson's disease early. An artificial neural network system with a backpropagation algorithm is presented in this paper to help doctors identify PD. Previous research on predicting the presence of PD has shown accuracy rates of up to 93% [1]; however, prediction accuracy for small classes is reduced. The proposed design of the neural network system yields a significant increase in robustness. It is also shown that the network's recognition rate reached 100%.
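As a rough sketch of the kind of classifier described (not the authors' implementation), the following trains a small backpropagation network on tabular voice-measurement features; the file name, column names, and network sizes are assumptions.

```python
# Illustrative sketch only: a small backpropagation network for binary
# PD classification. Dataset path, label column and layer sizes are assumed;
# all remaining columns are assumed to be numeric features.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

df = pd.read_csv("parkinsons.csv")            # assumed file with a 'status' label
X = df.drop(columns=["status"]).values        # 1 = PD, 0 = healthy (assumed coding)
y = df["status"].values

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0)

scaler = StandardScaler().fit(X_train)        # feature scaling helps backprop converge
clf = MLPClassifier(hidden_layer_sizes=(16, 8), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X_train), y_train)

print("test accuracy:",
      accuracy_score(y_test, clf.predict(scaler.transform(X_test))))
```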
This collection of specially commissioned essays is the first of its kind in English on the work of Antonio Negri, the Italian philosopher and political theorist. The spectacular success of Empire, Negri's collaboration with Michael Hardt, has brought Negri's writing to a new, wider audience. A substantial body of his writing is now available to an English-speaking readership. Outstanding contributors—including Michael Hardt, Sergio Bologna, Kathi Weeks and Nick Dyer-Witheford—reveal the variety and complexity of Negri's thought and explore its unique relevance to modern politics. Negri is one of the most sophisticated analysts of modern political philosophy. Philosophers and critics alike find his work both difficult and exhilarating, engaging as it does with Marx, Spinoza, Deleuze, Guattari, Tronti and others. This book is ideal for readers who want to get to grips with Negri's key themes, in particular his theories on labour, capital, power, the state and revolution. It makes a great introduction to his work for students of political philosophy, as well as providing a comprehensive critical approach for Negri enthusiasts.
The spectacular success of Empire and Multitude has brought Antonio Negri's writing to a new and wider audience. Negri's work is singular in its depth and expression. It can be difficult to grasp the complexity of his ideas, as they are rooted in the history of philosophy. This book offers an introduction to his thinking and is ideal for readers who want to come to grips with his key themes. Contributors include Pierre Macherey, Daniel Bensaïd, Charles Wolfe, Alex Callinicos, Miguel Vatter, Jason Read, Alberto Toscano, Mahmut Mutman, Ted Stolze, and Judith Revel. Written with dynamism and originality, the book will appeal to anyone interested in the evolution of Negri's thought, and especially to students of political philosophy, international studies, and literary theory. This book is the sequel to The Philosophy of Antonio Negri, Volume One: Resistance in Practice, but can be read entirely independently.
Forthcoming in Cognitive Architecture: from bio-politics to noo-politics, eds. Deborah Hauptmann, Warren Neidich and Abdul-Karim Mustapha. INTRODUCTION. The cognitive and affective sciences have benefitted in the last twenty years from a rethinking of the long-dominant computer model of the mind espoused by the standard approaches of computationalism and connectionism. The development of this alternative, often named the “embodied mind” approach or the “4EA” approach (embodied, embedded, enactive, extended, affective), has relied on a trio of classical 20th-century phenomenologists for its philosophical framework: Husserl, Heidegger, and Merleau-Ponty.[1] In this essay I propose that the French thinker Gilles Deleuze can provide the conceptual framework that will enable us to thematize some unstated presuppositions of the 4EA school, as well as to sharpen, extend and/or radicalize some of their explicit presuppositions. I highlight three areas here: 1) an ontology of distributed and differential systems, using Deleuze’s notion of the virtual; 2) a thought of multiple subjectification practices rather than a thought of “the” subject, even if it be seen as embodied and embedded; and 3) a rethinking of the notion of affect in order to thematize a notion of “political affect.”[2] I will develop this proposal with reference to Bruce Wexler’s Brain and Culture,[3] a work which resonates superbly with the Deleuzean approach.
Econophysics is a new and exciting cross-disciplinary research field that applies models and modelling techniques from statistical physics to economic systems. It is not, however, without its critics: prominent figures in more mainstream economic theory have criticized some elements of the methodology of econophysics. One of the main lines of criticism concerns the nature of the modelling assumptions and idealizations involved, and a particular target is the ‘kinetic exchange’ approach used to model the emergence of inequality within the distribution of individual monetary income. This article will consider such models in detail, and assess the warrant of the criticisms drawing upon the philosophical literature on modelling and idealization. Our aim is to provide the first steps towards informed mediation of this important and interesting interdisciplinary debate, and our hope is to offer guidance with regard to both the practice of modelling inequality, and the inequality of modelling practice. Contents: 1 Introduction; 1.1 Econophysics and its discontents; 1.2 Against burglar economics; 2 Modelling Inequality; 2.1 Mainstream economic models for income distribution; 2.2 Econophysics models for income distribution; 3 Idealizations in Kinetic Exchange Models; 3.1 Binary interactions; 3.2 Conservation principles; 3.3 Exchange dynamics; 4 Fat Tails and Savings; 5 Evaluation.
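To make the criticized model class concrete, here is a minimal kinetic exchange simulation in the spirit of Dragulescu and Yakovenko: total money is conserved, and random pairs of agents repeatedly pool and re-split their holdings, driving the distribution toward a Boltzmann-Gibbs (exponential) form. All parameter values are illustrative.

```python
# Minimal kinetic exchange simulation (Dragulescu-Yakovenko type).
# Money is conserved globally; each step, a random pair pools its money
# and splits it at a uniformly random fraction.
import numpy as np

rng = np.random.default_rng(42)
n_agents, n_steps = 1000, 200_000
money = np.full(n_agents, 100.0)             # everyone starts equal

for _ in range(n_steps):
    i, j = rng.choice(n_agents, size=2, replace=False)
    pool = money[i] + money[j]               # binary interaction, conservation
    eps = rng.random()                       # random split fraction
    money[i], money[j] = eps * pool, (1 - eps) * pool

print("total money (conserved):", money.sum())
# Gini coefficient via mean absolute difference: G = MAD / (2 * mean).
gini = np.abs(money[:, None] - money[None, :]).mean() / (2 * money.mean())
print("mean:", money.mean(), "approx. Gini:", gini)
```

For an exponential stationary distribution the Gini coefficient approaches 0.5, illustrating how inequality emerges from perfectly symmetric exchange rules.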
Embodied and extended cognition is a relatively new paradigm within cognitive science that challenges the basic tenet of classical cognitive science, viz. that cognition consists in building and manipulating internal representations. Some of the pioneers of embodied cognitive science have claimed that this new way of conceptualizing cognition puts pressure on epistemological and ontological realism. In this paper I will argue that such anti-realist conclusions do not follow from the basic assumptions of radical embodied cognitive science. Furthermore, I will show that one can develop a form of realism that reflects, rather than merely accommodates, the core principles of non-representationalist embodied cognitive science.
Conceptual metaphors have received much attention in research on discourse about infectious diseases in recent years. Most studies have found that conceptual metaphors of war dominate media discourse about disease. Similarly, a great deal of research has been undertaken on the new coronavirus, i.e., COVID-19, especially in English news discourse as opposed to other languages. The present study, in contrast, analyses the conceptual metaphors used in COVID-19 discourse in French-language newspapers. The study explored the linguistic metaphors used in COVID-19 discourse in these newspapers and the conceptual metaphors that underlie and motivate them, using the conceptual metaphor theory (CMT) framework. To this end, two North African French-language newspapers, namely Libération, published in Morocco, and La Presse de Tunisie, published in Tunisia, formed the corpus of the current study. The results showed that the most frequent framing of COVID-19 was in terms of WAR, followed by DISASTER and KILLER, respectively.
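As a crude illustration of the frequency side of such a CMT study (genuine metaphor identification requires manual procedures such as MIP/MIPVU), one might count hits from hand-made French source-domain lexicons in a corpus; the lexicons and corpus file below are assumptions, not the study's materials.

```python
# Crude sketch: count hits from hand-made French source-domain lexicons.
# Lexicons and the corpus file are illustrative assumptions only.
import re
from collections import Counter

LEXICONS = {
    "WAR": ["guerre", "combat", "ennemi", "bataille", "front", "mobilisation"],
    "DISASTER": ["catastrophe", "désastre", "vague", "tsunami"],
    "KILLER": ["tueur", "meurtrier", "tuer", "victime"],
}

with open("corpus_covid_fr.txt", encoding="utf-8") as f:  # assumed corpus file
    token_counts = Counter(re.findall(r"\w+", f.read().lower()))

frame_counts = {frame: sum(token_counts[w] for w in words)
                for frame, words in LEXICONS.items()}
print(sorted(frame_counts.items(), key=lambda kv: -kv[1]))
```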
In recent decades, non-representational approaches to mental phenomena and cognition have been gaining traction in cognitive science and philosophy of mind. In these alternative approaches, mental representations either lose their central status or, in the most radical form, are banned completely. While there is growing agreement that non-representational accounts may succeed in explaining some cognitive capacities, there is widespread skepticism about the possibility of giving non-representational accounts of cognitive capacities such as memory, imagination or abstract thought. In this paper, I will critically examine the view that there are fundamental limitations to non-representational explanations of cognition. Rather than challenging these arguments on general grounds, I will examine a set of human cognitive capacities that is generally thought to fall outside the scope of non-representational accounts, i.e. numerical cognition. After criticizing standard representational accounts of numerical cognition for their lack of explanatory power, I will argue that a non-representational approach inspired by radical enactivism offers the best hope for developing a genuinely naturalistic explanatory account of these cognitive capacities.
Symplectic reduction is a formal process through which degeneracy within the mathematical representations of physical systems displaying gauge symmetry can be controlled via the construction of a reduced phase space. Typically such reduced spaces provide us with a formalism for representing both instantaneous states and evolution uniquely, and for this reason can justifiably be afforded the status of fundamental dynamical arena - the otiose structure having been eliminated from the original phase space. Essential to the application of symplectic reduction is the precept that the first-class constraints are the relevant gauge generators. This prescription becomes highly problematic for reparameterization-invariant theories within which the Hamiltonian itself is a constraint; not least because it would seem to render prima facie distinct stages of a history physically identical and observable functions changeless. Here we will consider this problem of time within non-relativistic mechanical theory, with a view to both more fully understanding the temporal structure of these timeless theories and better appreciating the corresponding issues in relativistic mechanics. For the case of non-relativistic reparameterization-invariant theory, the application of symplectic reduction will be demonstrated to be both unnecessary, since the degeneracy involved is benign, and inappropriate, since it leads to a trivial theory. With this anti-reductive position established, we will then examine two rival methodologies for consistently representing change and observable functions within the original phase space, before evaluating the relevant philosophical implications. We will conclude with a preview of the case against symplectic reduction being applied to canonical general relativity.
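A standard toy example, not taken verbatim from the paper, makes the setting concrete: parametrizing a non-relativistic system promotes its Hamiltonian to a constraint.

```latex
% Toy example: the parametrized non-relativistic particle. Extend the
% phase space (q, p) by the time variable and its conjugate momentum (t, p_t).
% Reparameterization invariance forces the Hamiltonian to appear only as
% a constraint,
\mathcal{H} \;=\; p_t + H_0(q,p) \;\approx\; 0 ,
% so that the total Hamiltonian is pure constraint,
H_T \;=\; N(\tau)\,\bigl(p_t + H_0(q,p)\bigr).
% Treating the flow of \mathcal{H} as gauge and quotienting it away via
% symplectic reduction would identify dynamically distinct instants,
% which is precisely the difficulty described above.
```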
Starting from a generalized Hamilton-Jacobi formalism, we develop a new framework for constructing observables and their evolution in theories invariant under global time reparametrizations. Our proposal relaxes the usual Dirac prescription for the observables of a totally constrained system and allows one to recover the influential partial and complete observables approach in a particular limit. Difficulties such as the non-unitary evolution of the complete observables in terms of certain partial observables are explained as a breakdown of this limit. Identification of our observables relies upon a physical distinction between gauge symmetries that exist at the level of histories and states, and those that exist at the level of histories and not states. This distinction resolves a tension in the literature concerning the physical interpretation of the partial observables and allows for a richer class of observables in the quantum theory. There is the potential for the application of our proposal to the quantization of gravity when understood in terms of the Shape Dynamics formalism.
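For orientation, the partial and complete observables approach recovered in the limit can be stated schematically in the standard Rovelli-Dittrich notation (this is background, not the paper's relaxed prescription).

```latex
% Schematic complete observable built from two partial observables f and T:
% "the value of f when the clock T reads tau", evaluated along the gauge
% orbit generated by the Hamiltonian constraint H.
F_{f,T}(\tau) \;=\; f\big|_{T=\tau} , \qquad
\{\,F_{f,T}(\tau),\, H\,\} \;\approx\; 0 .
% F_{f,T}(\tau) is gauge invariant even though f and T separately are not;
% its tau-dependence is what supplies "evolution" in a totally constrained system.
```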
The goal of responsible engineers is the creation of useful and safe technological products, alongside a commitment to public health, while respecting the autonomy of clients and the public. Because engineers often face moral dilemmas in resolving such issues, different engineers have chosen different courses of action depending on their respective moral value orientations. Islam provides a value-based mechanism rooted in the Maqasid al-Shari‘ah (the objectives of Islamic law). This mechanism prioritizes some values over others and could help resolve the moral dilemmas faced in engineering. This paper introduces the Islamic interpretive-evaluative maxims to two core issues in engineering ethics: genetically modified foods and whistleblowing. The study aims primarily to provide problem-solving maxims within the Maqasid al-Shari‘ah matrix through which such moral dilemmas in science and engineering could be studied and resolved.
Background: Obtaining informed consent for participation in genomic research in low-income settings presents specific ethical issues requiring attention. These include the challenges that arise when providing information about unfamiliar and technical research methods, the implications of complicated infrastructure and data sharing requirements, and the potential consequences of future research with samples and data. This study investigated researchers’ and participants’ parents’ experiences of a consent process and understandings of a genome-wide association study of malaria involving children aged five and under in Mali. It aimed to inform best practices in recruiting participants into genomic research. Methods: A qualitative rapid ethical assessment was undertaken. Fifty-five semi-structured interviews were conducted with the parents of research participants. An additional nine semi-structured interviews were conducted with senior research scientists, research assistants and with a member of an ethics committee. A focus group with five parents of research participants and direct observations of four consent processes were also conducted. French and translated English transcripts were descriptively and thematically coded using OpenCode software. Results: Participants’ parents in the MalariaGEN study had differing understandings of the causes of malaria, the rationale for collecting blood samples, the purposes of the study and the kinds of information the study would generate. Genomic aspects of the research, including the gene/environment interaction underlying susceptibility or resistance to severe malaria, proved particularly challenging to explain and understand. Conclusions: This study identifies a number of areas to be addressed in the design of consent processes for genomic research, some of which require careful ethical analysis. These include determining how much information should be provided about differing aspects of the research and how best to promote understandings of genomic research. We conclude that it is important to build capacity in the design and conduct of effective and appropriate consent processes for genomic research in low- and middle-income settings. Additionally, consideration should be given to the role of review committees and community consultation activities in protecting the interests of participants in genomic research.
Maqasid al-Shariah based Islamic bioethics is an Islamic bioethics concept which uses the objectives of the Shariah as its approach in analysing and assessing bioethical issues. Analysis based on maqasid al-Shariah based Islamic bioethics examines any bioethical issue from three main aspects, namely the intention, method, and output or final goal of the issue under study. The evaluation is then analysed in terms of the hierarchy of human interests, inclusivity, and degree of certainty. This Islamic bioethics concept is a manifestation of dynamic Islamic jurisprudence which can address new, complex and complicated bioethical issues such as tri-parent baby technology. Therefore, this article will introduce and explain the concept of maqasid al-Shariah based Islamic bioethics and outline general guidance for maqasid al-Shariah based Islamic bioethics to determine a maqṣad based on standards of human good or well-being and harm.
The `problem of time' is a cluster of interpretational and formal issues in the foundations of general relativity relating to both the representation of time in the classical canonical formalism, and to the quantization of the theory. The purpose of this short chapter is to provide an accessible introduction to the problem.
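The formal core of the problem can be previewed in one line: canonical quantization of the Hamiltonian constraint of general relativity yields, schematically, the Wheeler-DeWitt equation, in which no time parameter appears.

```latex
% Schematic quantum Hamiltonian constraint (the Wheeler-DeWitt equation):
\hat{\mathcal{H}}\,\Psi[h_{ab}] \;=\; 0 .
% Compare the time-dependent Schroedinger equation,
i\hbar\,\partial_t \psi \;=\; \hat{H}\,\psi .
% The absence of any external time parameter t in the first equation is
% one face of the problem of time.
```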
Despite all its merits, the vast majority of critical attention devoted to colonialist literature restricts itself by severely bracketing the political context of culture and history. This typical facet of humanistic closure requires the critic systematically to avoid an analysis of the domination, manipulation, exploitation, and disfranchisement that are inevitably involved in the construction of any cultural artifact or relationship. I can best illustrate such closures in the field of colonialist discourse with two brief examples. In her book The Colonial Encounter, which contrasts the colonial representations of three European and three non-European writers, M. M. Mahood skirts the political issue quite explicitly by arguing that she chose those authors precisely because they are “innocent of emotional exploitation of the colonial scene” and are “distanced” from the politics of domination.1 We find a more interesting example of this closure in Homi Bhabha’s criticism. While otherwise provocative and illuminating, his work rests on two assumptions—the unity of the “colonial subject” and the “ambivalence” of colonial discourse—that are inadequately problematized and, I feel, finally unwarranted and unacceptable. In rejecting Edward Said’s “suggestion that colonial power and discourse is possessed entirely by the colonizer,” Bhabha asserts, without providing any explanation, the unity of the “colonial subject.”2 I do not wish to rule out, a priori, the possibility that at some rarefied theoretical level the varied material and discursive antagonisms between conquerors and natives can be reduced to the workings of a single “subject”; but such a unity, let alone its value, must be demonstrated, not assumed. Though he cites Frantz Fanon, Bhabha completely ignores Fanon’s definition of the conqueror/native relation as a “Manichean” struggle—a definition that is not a fanciful metaphoric caricature but an accurate representation of a profound conflict.3
Notes: 1. M. M. Mahood, The Colonial Encounter: A Reading of Six Novels, pp. 170, 171; and see p. 3. As many other studies demonstrate, the emotional innocence and the distance of the six writers whom Mahood has chosen—Joseph Conrad, E. M. Forster, Graham Greene, Chinua Achebe, R. K. Narayan, and V. S. Naipaul—are, at best, highly debatable. 2. Homi K. Bhabha, “The Other Question—The Stereotype and Colonial Discourse,” Screen 24: 25, 19. 3. Frantz Fanon, The Wretched of the Earth, trans. Constance Farrington, p. 41.
Abdul R. JanMohamed, assistant professor of English at the University of California, Berkeley, is the author of Manichean Aesthetics: The Politics of Literature in Colonial Africa. He is a founding member and associate editor of Cultural Critique and is currently working on a study of Richard Wright.
In 1981 Unruh proposed that fluid mechanical experiments could be used to probe key aspects of the quantum phenomenology of black holes. In particular, he claimed that an analogue to Hawking radiation could be created within a fluid mechanical `dumb hole', with the event horizon replaced by a sonic horizon. Since then an entire sub-field of `analogue gravity' has been created. In 2016 Steinhauer reported the experimental observation of quantum Hawking radiation and its entanglement in a Bose-Einstein condensate analogue black hole. What can we learn from such analogue experiments? In particular, in what sense can they provide evidence of novel phenomena such as black hole Hawking radiation?
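Schematically, Unruh's analogy rests on the fact that sound in a suitable fluid propagates as if on a curved spacetime (the standard textbook form; details suppressed):

```latex
% Unruh's acoustic analogy (schematic, standard form): linearized sound in an
% irrotational, barotropic, inviscid fluid with density \rho, flow velocity v^i
% and sound speed c_s propagates as a massless scalar on the acoustic metric
ds^2 \;=\; \frac{\rho}{c_s}\Bigl[ -\bigl(c_s^2 - \delta_{ij} v^i v^j\bigr)\,dt^2
      \;-\; 2\,\delta_{ij} v^i\,dx^j\,dt \;+\; \delta_{ij}\,dx^i dx^j \Bigr].
% A sonic horizon forms where the inflow speed reaches |v| = c_s; it plays
% the role of the event horizon in the analogue experiments discussed above.
```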
The analysis of the temporal structure of canonical general relativity, and the connected interpretational questions with regard to the role of time within the theory, both rest upon the need to respect the fundamentally dual role of the Hamiltonian constraints found within the formalism. Any consistent philosophical approach towards the theory must do justice to the role of these constraints both in generating dynamics, in the context of phase space, and in generating unphysical symmetry transformations, in the context of a hypersurface embedded within a solution. A first denial of time, in terms of a position of reductive temporal relationalism, can be shown to be troubled by failure on the first count, and a second denial, in terms of Machian temporal relationalism, can be found to be hampered by failure on the second. A third denial of time, consistent with both of the Hamiltonian constraints' roles, is constituted by the implementation of a scheme for constructing observables in terms of correlations, and leads to a radical Parmenidean timelessness. The motivation for and implications of each of these three denials are investigated.
We claim that, as it stands, the Deutsch–Wallace–Everett approach to quantum theory is conceptually incoherent. This charge is based upon the approach’s reliance upon decoherence arguments that conflict with its own fundamental precepts regarding probabilistic reasoning in two respects. This conceptual conflict obtains even if the decoherence arguments deployed are aimed merely towards the establishment of certain ‘emergent’ or ‘robust’ structures within the wave function: to be relevant to physical science, notions such as robustness must be empirically grounded, and, on our analysis, this grounding can only plausibly be done in precisely the probabilistic terms that lead to conceptual conflict. Thus, the incoherence problems presented necessitate either the provision of a new, non-probabilistic empirical grounding for the notions of robustness and emergence in the context of decoherence, or the abandonment of the Deutsch–Wallace–Everett programme for quantum theory.
On one popular view, the general covariance of gravity implies that change is relational in a strong sense, such that all it is for a physical degree of freedom to change is for it to vary with regard to a second physical degree of freedom. At a quantum level, this view of change as relative variation leads to a fundamentally timeless formalism for quantum gravity. Here, we will show how one may avoid this acute ‘problem of time’. Under our view, duration is still regarded as relative, but temporal succession is taken to be absolute. Following our approach, which is presented in more formal terms in, it is possible to conceive of a genuinely dynamical theory of quantum gravity within which time, in a substantive sense, remains. Contents: 1 Introduction; 1.1 The problem of time; 1.2 Our solution; 2 Understanding Symmetry; 2.1 Mechanics and representation; 2.2 Freedom by degrees; 2.3 Voluntary redundancy; 3 Understanding Time; 3.1 Change and order; 3.2 Quantization and succession; 4 Time and Gravitation; 4.1 The two faces of classical gravity; 4.2 Retaining succession in quantum gravity; 5 Discussion; 5.1 Related arguments; 5.2 Concluding remarks.
A fundamental tenet of Paul Feyerabend’s pluralistic view of science has it that theory proliferation, that is, the availability of theoretical alternatives, is of crucial importance for the detection of anomalies in established theories. Paul Hoyningen-Huene calls this the Anomaly Importation Thesis, according to which anomalies are imported, as it were, into well-established theories from competing alternatives. This article pursues two major objectives: (a) to work out the systematic details of Feyerabend’s ideas on theory proliferation and anomaly import as they are presented in his early publications and his Against Method, and (b) to compare Feyerabend’s ideas on theory proliferation and anomaly import with corresponding features in Popper’s critical rationalist philosophy of science. As it turns out, neither the Principle of Proliferation nor the Anomaly Importation Thesis is necessarily incompatible with critical rationalism. In spite of Feyerabend’s general anti-Popperian attitude, I argue that theoretical pluralism can be seen as an advancement of critical rationalist philosophy and that critical rationalism provides good arguments for pluralism.
Extensive work has been done on corporate social responsibility practices (CSRPs), mainly emphasizing larger firms within developed nations. Nonetheless, work is still needed on the importance of CSRPs and ethical cultural practices for sustainable competitive performance, which has garnered far less attention in the existing literature. This study explores the impact of CSRPs on sustainable competitive performance (SACP), with ethical culture (ECL) as a mediator, in SMEs of two emerging nations, China and Pakistan, based on stakeholder theory and practices. The results, obtained using structural equation modelling (SEM), affirmed the positive linkages of CSRPs—environmental responsibility, community responsibility, customer responsibility, supplier responsibility, employee responsibility, and responsibility toward government rules and regulations—with SACP. CSRPs were found to have positive relationships with ECL, and ECL in turn correlated positively with SACP, in the context of both countries. The findings revealed a positive mediating influence of ECL between CSRPs and SACP. This study furnishes insightful information for management on how firms may achieve sustainable performance by incorporating ethical cultural practices and corporate social responsibility practices as strategic tools. The study reports numerous implications for management, together with lines for future research.
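As a schematic of the mediation logic being tested (CSRPs to ECL to SACP), here is a minimal Baron-Kenny-style regression sketch on synthetic data; the study itself used SEM on survey constructs, and every number below is synthetic and illustrative only.

```python
# Schematic mediation check on synthetic data: CSRP -> ECL -> SACP.
# The paper uses SEM on survey constructs; this only illustrates the
# direct/indirect-effect decomposition.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
csrp = rng.normal(size=n)                          # composite CSR practices score
ecl = 0.6 * csrp + rng.normal(scale=0.8, size=n)   # ethical culture (mediator)
sacp = 0.3 * csrp + 0.5 * ecl + rng.normal(scale=0.8, size=n)

a = sm.OLS(ecl, sm.add_constant(csrp)).fit().params[1]          # CSRP -> ECL
model_b = sm.OLS(sacp, sm.add_constant(np.column_stack([csrp, ecl]))).fit()
c_prime, b = model_b.params[1], model_b.params[2]               # direct, ECL -> SACP

print(f"indirect effect a*b = {a * b:.3f}, direct effect c' = {c_prime:.3f}")
```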
A small but growing number of studies have aimed to understand, assess and reduce existential risks, or risks that threaten the continued existence of mankind. However, most attention has been focused on known and tangible risks. This paper proposes a heuristic for reducing the risk of black swan extinction events. These events are, as the name suggests, stochastic and unforeseen when they happen. Decision theory based on a fixed model of possible outcomes cannot properly deal with this kind of event. Neither can probabilistic risk analysis. This paper will argue that the approach referred to as engineering safety could be applied to reducing the risk from black swan extinction events. It will also propose a conceptual sketch of how such a strategy may be implemented: isolated, self-sufficient, and continuously manned underground refuges. Some characteristics of such refuges are also described, in particular the psychosocial aspects. Furthermore, it is argued that this implementation of the engineering safety strategy of safety barriers would be effective and plausible and could reduce the risk of an extinction event in a wide range of possible scenarios. Considering the staggering opportunity cost of an existential catastrophe, such strategies ought to be explored more vigorously.
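The quantitative intuition behind the safety-barrier strategy can be made explicit with an illustrative decomposition (not a calculation from the paper):

```latex
% Illustrative safety-barrier decomposition: with a refuge in place,
% extinction requires both the catastrophe and the failure of the refuge,
P(\text{extinction}) \;=\; P(\text{catastrophe}) \times
                           P(\text{refuge fails} \mid \text{catastrophe}).
% The point of the engineering-safety strategy is that the second factor can
% be reduced by design (isolation, self-sufficiency, continuous manning)
% without knowing in advance which specific catastrophe will occur.
```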
Disease prioritarianism is a principle that is often implicitly or explicitly employed in the realm of healthcare prioritization. This principle states that the healthcare system ought to prioritize the treatment of disease before any other problem. This article argues that disease prioritarianism ought to be rejected. Instead, we should adopt ‘the problem-oriented heuristic’ when making prioritizations in the healthcare system. According to this idea, we ought to focus on specific problems and whether or not it is possible and efficient to address them with medical means. This has radical implications for the extension of the healthcare system. First, getting rid of the binary disease/no-disease dichotomy implicit in disease prioritarianism would improve the ability of the healthcare system to address chronic conditions and disabilities that often defy easy classification. Second, the problem-oriented heuristic could empower medical practitioners to address social problems without the need to pathologize these conditions. Third, the problem-oriented heuristic clearly states that what we choose to treat is a normative consideration. Under this assumption, we can engage in a discussion on de-medicalization without distorting preconceptions. Fourth, this pragmatic and de-compartmentalizing approach should allow us to reconsider the term ‘efficiency’.
The professions have focused considerable attention on developing codes of conduct. Despite their efforts, there is considerable controversy regarding the propriety of professional codes of ethics. Many provisions of professional codes seem to exacerbate disputes between the profession and the public rather than providing a framework that satisfies the public's desire for moral behavior. After examining three professional codes, we divide the provisions of professional codes into those which urge professionals to avoid moral hazard, maintain professional courtesy, and serve the public interest. We note that whereas provisions urging the avoidance of moral hazard are uncontroversial, the public is suspicious of provisions protecting professional courtesy. Public interest provisions are controversial when the public and the profession disagree as to what is in the public interest. Based on these observations, we conclude with recommendations regarding the content of professional codes.
Two sample populations, one refugee and one resident, were studied. The frequencies of consanguineous marriages came out to be 49·8% and 55·4%, respectively, for the refugees and the residents. Caste endogamy was dominant in both the residents and the refugees. The mean coefficient of inbreeding was calculated to be 0·0303 for the refugee population sample and 0·0332 for the resident population sample. First cousin marriage was the dominant type of marriage in both samples; father's brother's daughter (FBD) marriage was more frequent among the refugees, while mother's brother's daughter (MBD) marriage was more frequent among the residents. Education had no decreasing effect on the incidence of consanguineous marriages. A significant difference in the pattern of marriages in the refugees is observed after the Saur Revolution of 1979.
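For readers outside genetics, the mean coefficient of inbreeding reported here is a frequency-weighted average of standard per-marriage-type coefficients; the worked mix below is illustrative, not the study's tabulation.

```latex
% Mean coefficient of inbreeding as a frequency-weighted average over marriage
% types, with standard values F = 1/16 (first cousins), 1/32 (first cousins
% once removed), 1/64 (second cousins), F = 0 (non-consanguineous):
\bar{F} \;=\; \sum_i p_i\, F_i .
% Illustrative mix: 45% first-cousin, 5% second-cousin, 50% unrelated marriages:
\bar{F} \;=\; 0.45 \times \tfrac{1}{16} \;+\; 0.05 \times \tfrac{1}{64}
        \;\approx\; 0.0289 ,
% the same order as the reported means of 0.0303 (refugees) and 0.0332 (residents).
```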
This paper expands Rami Ali’s dissolution of the gamer’s dilemma (Ethics Inf Technol 17:267-274, 2015). Morgan Luck’s gamer’s dilemma (Ethics Inf Technol 11(1):31-36, 2009) rests on our having diverging intuitions when considering virtual murder and virtual child molestation in video games. Virtual murder is seemingly permissible, while virtual child molestation is not, and there is no obvious morally relevant difference between the two. Ali argues that virtual murder and virtual child molestation are equally permissible/impermissible when considered under different modes of engagement. To this end, Ali distinguishes between story-telling gameplay and simulation games, discussing both in depth. I build on the dissolution by looking into competitive gameplay in order to consider what the morally relevant difference between virtual murder and virtual child molestation might be when competing in a video game. I argue that when competitors consent to participate in a competition, the rules of the competition supersede everyday moral intuitions. As such, virtual competitions ought to represent such consent from virtual characters. Virtual children cannot be represented as giving consent to be molested because (1) children cannot be represented as giving sexual consent, and (2) consent to be possibly molested cannot be given. This creates a morally relevant difference between murder and molestation. By fully addressing competitive gameplay, I answer Luck’s worry that Ali’s dissolution is incomplete (Ethics Inf Technol 20:157-162, 2018).
An intelligent machine surpassing human intelligence across a wide set of skills has been proposed as a possible existential catastrophe. Among those concerned about existential risk related to artificial intelligence, it is common to assume that AI will not only be very intelligent, but also be a general agent. This article explores the characteristics of machine agency, and what it would mean for a machine to become a general agent. In particular, it does so by articulating some important differences between belief and desire in the context of machine agency. One such difference is that while an agent can by itself acquire new beliefs through learning, desires need to be derived from preexisting desires or acquired with the help of an external influence. Such influence could be a human programmer or natural selection. We argue that to become a general agent, a machine needs productive desires, or desires that can direct behavior across multiple contexts. However, productive desires cannot sui generis be derived from non-productive desires. Thus, even though general agency in AI could in principle be created by human agents, general agency cannot be spontaneously produced by a non-general AI agent through an endogenous process. In conclusion, we argue that a common AI scenario, where general agency suddenly emerges in a non-general agent AI, such as DeepMind’s superintelligent board game AI AlphaZero, is not plausible.