Doug Walton, who died in January 2020, was a prolific author whose work in informal logic and argumentation had a profound influence on Artificial Intelligence, including Artificial Intelligence and Law. He was also very interested in interdisciplinary work, and a frequent and generous collaborator. In this paper seven leading researchers in AI and Law, all past programme chairs of the International Conference on AI and Law who have worked with him, describe his influence on their work.
We provide a retrospective of 25 years of the International Conference on AI and Law, which was first held in 1987. Fifty papers have been selected from the thirteen conferences and each of them is described in a short subsection individually written by one of the 24 authors. These subsections attempt to place the paper discussed in the context of the development of AI and Law, while often offering some personal reactions and reflections. As a whole, the subsections build into a history of the last quarter century of the field, and provide some insights into where it has come from, where it is now, and where it might go.
Continuous sedation until death (CSD), the act of reducing or removing the consciousness of an incurably ill patient until death, often provokes medical–ethical discussions in the opinion sections of medical and nursing journals. Some argue that CSD is morally equivalent to physician-assisted death (PAD), that it is a form of “slow euthanasia.” A qualitative thematic content analysis of opinion pieces was conducted to describe and classify arguments that support or reject a moral difference between CSD and PAD. Arguments pro and contra a moral difference refer basically to the same ambiguous themes, namely intention, proportionality, withholding artificial nutrition and hydration, and removing consciousness. This demonstrates that the debate is first and foremost a semantic rather than a factual dispute, focusing on the normative framework of CSD. Given the prevalent ambiguity, the debate on CSD appears to be a classical symbolic struggle for moral authority.
This paper explores the interface between users and producers of translational science through three case studies. It argues that effective TS requires a breakdown between user and producer roles: users become producers and producers become users. In making this claim, we challenge conventional understandings of TS as well as linear models of innovation. Policy-makers and funders increasingly expect TS and its associated socioeconomic benefits to occur when funding scientific research. We argue that a better understanding of the hybridity between users and producers in TS is essential to encouraging effective TS activities. In arguing for broader understandings of the hybrid roles of user/producers in TS we rely on empirical observations made during our four-year study of three translational pathways here labeled clinical, commercial, and civic. These pathways were identified in a large-scale network of scientists investigating the pathogenomics of innate immunity (i.e. “the PI.2 network”).
Richerson et al. argue that relatively large cultural FST values provide evidence for group structure and therefore scope for group selection. However, recent research on spatial patterns of cultural variation demonstrates that, as in the genetic case, apparent group structure can be a consequence of geographic clines, not group barriers. Such a pattern limits the scope for cultural group selection.
Before COVID-19, dementia singing groups and choirs flourished, providing activity, cognitive stimulation, and social support for thousands of people with dementia in the UK. Interactive music provides one of the most effective psychosocial interventions for people with dementia; it can allay agitation and promote wellbeing. Since COVID-19 has halted the delivery of in-person musical activities, it is important for the welfare of people with dementia and their carers to investigate what alternatives to live music making exist, how these alternatives are delivered and how their accessibility can be expanded. This community case study examines recent practice in online music-making in response to COVID-19 restrictions for people with dementia and their supporters, focusing on a UK context. It documents current opportunities for digital music making, and assesses the barriers and facilitators to their delivery and accessibility. Online searches of video streaming sites and social media documented what music activities were available. Expert practitioners and providers collaborated on this study and supplied input about the sessions they had been delivering, the technological challenges and solutions they had found, and the responses of the participants. Recommendations for best practice were developed and refined in consultation with these collaborators. Over 50 examples of online music activities were identified. In addition to the challenges of digital inclusion and accessibility for some older people, delivering live music online has unique challenges due to audio latency and sound quality. It is necessary to adapt the session to the technology's limitations rather than expect to overcome these challenges. The recommendations highlight the importance of accessibility, digital safety and wellbeing of participants. They also suggest ways to optimize the quality of their musical experience.
The pandemic has prompted innovative approaches to deliver activities and interventions in a digital format, and people with dementia and their carers have adapted rapidly. While online music is meeting a clear current need for social connection and cognitive stimulation, it also offers some advantages which remain relevant after COVID-19 restrictions are relaxed. The recommendations of this study are intended to be useful to musicians, dementia care practitioners, and researchers during the pandemic and beyond.
In this wide-ranging interview Professor Douglas V. Porpora discusses a number of issues. First, how he became a Critical Realist through his early work on the concept of structure. Second, drawing on his Reconstructing Sociology, his take on the current state of American sociology. This leads to discussion of the broader range of his work as part of Margaret Archer’s various Centre for Social Ontology projects, and on moral-macro reasoning and the concept of truth in political discourse.
If “perfectionism” in ethics refers to those normative theories that treat the fulfillment or realization of human nature as central to an account of both goodness and moral obligation, in what sense is “human flourishing” a perfectionist notion? How much of what we take “human flourishing” to signify is the result of our understanding of human nature? Is the content of this concept simply read off an examination of our nature? Is there no place for diversity and individuality? Is the belief that the content of such a normative concept can be determined by an appeal to human nature merely the result of epistemological naiveté? What is the exact character of the connection between human flourishing and human nature? These questions are the ultimate concern of this essay, but to appreciate the answers that will be offered it is necessary to understand what is meant by “human flourishing.” “Human flourishing” is a relatively recent term in ethics. It seems to have developed in the last two decades because the traditional translation of the Greek term eudaimonia as “happiness” failed to communicate clearly that eudaimonia was an objective good, not merely a subjective good.
In this article, I argue that Brad Hooker's rule-consequentialism implausibly implies that what earthlings are morally required to sacrifice for the sake of helping their less fortunate brethren depends on whether or not other people exist on some distant planet even when these others would be too far away for earthlings to affect.
Discussions of Karl Popper's falsificationist philosophy of science appear regularly in the recent literature on economic methodology. In this literature, there seem to be two fundamental points of agreement about Popper. First, most economists take Popper's falsificationist method of bold conjecture and severe test to be the correct characterization of scientific conduct in the physical sciences. Second, most economists admit that economic theory fails miserably when judged by these same falsificationist standards. As Latsis states, “the development of economic analysis would look a dismal affair through falsificationist spectacles.”
Douglas proposes a new ideal in which values serve an essential function throughout scientific inquiry, but where the role values play is constrained at key points, protecting the integrity and objectivity of science.
A leading expert in informal logic, Douglas Walton turns his attention in this new book to how reasoning operates in trials and other legal contexts, with special emphasis on the law of evidence. The new model he develops, drawing on methods of argumentation theory that are gaining wide acceptance in computing fields like artificial intelligence, can be used to identify, analyze, and evaluate specific types of legal argument. In contrast with approaches that rely on deductive and inductive logic and rule out many common types of argument as fallacious, Walton’s aim is to provide a more expansive view of what can be considered "reasonable" in legal argument when it is construed as a dynamic, rule-governed, and goal-directed conversation. This dialogical model gives new meaning to the key notions of relevance and probative weight, with the latter analyzed in terms of pragmatic criteria for what constitutes plausible evidence rather than truth.
A new pragmatic approach, based on the latest developments in argumentation theory, analyzing appeal to expert opinion as a form of argument. Reliance on authority has always been a common recourse in argumentation, perhaps never more so than today in our highly technological society when knowledge has become so specialized—as manifested, for instance, in the frequent appearance of "expert witnesses" in courtrooms. When is an appeal to the opinion of an expert a reasonable type of argument to make, and when does it become a fallacy? This book provides a method for the evaluation of these appeals in everyday argumentation. Specialized domains of knowledge such as science, medicine, law, and government policy have gradually taken over as the basis on which many of our rational decisions are made daily. Consequently, appeal to expert opinion in these areas has become a powerful type of argument. Challenging an argument based on expert scientific opinion, for example, has become as difficult as it once was to question religious authority. Walton stresses that even in cases where expert opinion is divided, the effect of it can still be so powerful that it overwhelms an individual's ability to make a decision based on personal deliberation of what is right or wrong in a given situation. The book identifies the requirements that make an appeal to expert opinion a reasonable or unreasonable argument. Walton's new pragmatic approach analyzes that appeal as a distinctive form of argument, with an accompanying set of appropriate critical questions matching the form. Throughout the book, a historical survey of the key developments in the evolution of the argument from authority, dating from the time of the ancients, is given, and new light is shed on current problems of "junk science" and battles between experts in legal argumentation.
A rational defense of the criminal law must provide a comprehensive theory of culpability. A comprehensive theory of culpability must resolve several difficult issues; in this article I will focus on only one. The general problem arises from the lack of a systematic account of relative culpability. An account of relative culpability would identify and defend a set of considerations to assess whether, why, under what circumstances, and to what extent persons who perform a criminal act with a given culpable state are more or less blameworthy than persons who perform that act with a different culpable state.
The problem of standard of care in clinical research concerns the level of treatment that investigators must provide to subjects in clinical trials. Commentators often formulate answers to this problem by appealing to two distinct types of obligations: professional obligations and natural duties. In this article, I investigate whether investigators also possess institutional obligations that are directly relevant to the problem of standard of care, that is, those obligations a person has because she occupies a particular institutional role. I examine two types of institutional contexts: (1) public research agencies – agencies or departments of states that fund or conduct clinical research in the public interest; and (2) private for-profit corporations. I argue that investigators who are employed or have their research sponsored by the former have a distinctive institutional obligation to conduct their research in a way that is consistent with the state's duty of distributive justice to provide its citizens with access to basic health care, and its duty to aid citizens of lower income countries. By contrast, I argue that investigators who are employed or have their research sponsored by private for-profit corporations possess neither this obligation nor any other institutional obligation that is directly relevant to the ethics of RCTs. My account of the institutional obligations of investigators aims to contribute to the development of a reasonable, distributive justice-based account of standard of care.
In this engaging book, Douglas Anderson begins with the assumption that philosophy—the Greek love of wisdom—is alive and well in American culture. At the same time, professional philosophy remains relatively invisible. Anderson traverses American life to find places in the wider culture where professional philosophy in the distinctively American tradition can strike up a conversation. How might American philosophers talk to us about our religious experience, or political engagement, or literature—or even, popular music? Anderson’s second aim is to find places where philosophy happens in nonprofessional guises—cultural places such as country music, rock 'n' roll, and Beat literature. He not only enlarges the tradition of American philosophers such as John Dewey and William James by examining lesser-known figures such as Henry Bugbee and Thomas Davidson, but finds the themes and ideas of American philosophy in some unexpected places, such as the music of Hank Williams, Tammy Wynette, and Bruce Springsteen, and the writings of Jack Kerouac. The idea of “philosophy Americana” trades on the emergent genre of “music Americana,” rooted in traditional themes and styles yet engaging our present experiences. The music is “popular” but not thoroughly driven by economic considerations, and Anderson seeks out an analogous role for philosophical practice, where philosophy and popular culture are co-adventurers in the life of ideas. Philosophy Americana takes seriously Emerson’s quest for the extraordinary in the ordinary and James’s belief that popular philosophy can still be philosophy.
Bridging the gap between applied ethics and ethical theory, Ethical Argumentation draws on recent research in argumentation theory to develop a more realistic model of how ethical justification actually works.
Commonsense Consequentialism is a book about morality, rationality, and the interconnections between the two. In it, Douglas W. Portmore defends a version of consequentialism that both comports with our commonsense moral intuitions and shares with other consequentialist theories the same compelling teleological conception of practical reasons. Broadly construed, consequentialism is the view that an act's deontic status is determined by how its outcome ranks relative to those of the available alternatives on some evaluative ranking. Portmore argues that outcomes should be ranked, not according to their impersonal value, but according to how much reason the relevant agent has to desire that each outcome obtains and that, when outcomes are ranked in this way, we arrive at a version of consequentialism that can better account for our commonsense moral intuitions than even many forms of deontology can. What's more, Portmore argues that we should accept this version of consequentialism, because we should accept both that an agent can be morally required to do only what she has most reason to do and that what she has most reason to do is to perform the act that would produce the outcome that she has most reason to want to obtain. Although the primary aim of the book is to defend a particular moral theory, Portmore defends this theory as part of a coherent whole concerning our commonsense views about the nature and substance of both morality and rationality. Thus, it will be of interest not only to those working on consequentialism and other areas of normative ethics, but also to those working in metaethics. Beyond offering an account of morality, Portmore offers accounts of practical reasons, practical rationality, and the objective/subjective obligation distinction.
I. Beyond Utilitarianism In the summer of 1982, I published an article called “Missiles and Morals,” in which I argued on utilitarian grounds that nuclear deterrence in its present form is not morally justifiable. The argument of “Missiles and Morals” compared the most likely sort of nuclear war to develop under nuclear deterrence with the most likely sort of nuclear war to develop under American unilateral nuclear disarmament. For a variety of reasons, I claimed that the number of casualties in a two-sided nuclear war developing under DET would be at least fifteen times greater than the number of casualties in a one-sided nuclear attack developing under UND. If one assumes that human lives lost or saved is the principal criterion by which nuclear weapons policies should be measured, it follows that DET is morally superior to UND on utilitarian grounds only if the chance of a two-sided nuclear war under DET is more than fifteen times less than the chance of a one-sided nuclear attack under UND. Since I did not believe that the chance of nuclear war under deterrence is fifteen times less than the chance of nuclear war under unilateral nuclear disarmament, I inferred that utilitarianism failed to justify DET. Indeed, on utilitarian grounds, DET stood condemned.
Old philosophical problems never die, but they can be reinterpreted. In this paper, I offer a reinterpretation of the problem of reconciling divine omniscience and human free will. Classical discussions of this problem concentrate on the nature of God and the concept of free will. The present discussion will focus attention on the concept of knowledge, drawing on developments in epistemology that resulted from the posing of a certain problem by Edmund Gettier in 1963.
In this paper, I defend teleological theories of belief against the exclusivity objection. I argue that despite the exclusive influence of truth in doxastic deliberation, multiple epistemic aims interact when we consider what to believe. This is apparent when we focus on the processes involved in specific instances (or concrete cases) of doxastic deliberation, such that the propositions under consideration are specified. First, I outline a general schema for weighing aims. Second, I discuss recent attempts to defend the teleological position in relation to this schema. And third, I develop and defend my proposal that multiple epistemic aims interact in doxastic deliberation—a possibility which, as of yet, has received no serious attention in the literature.
In this paper we consider persuasion in the context of practical reasoning, and discuss the problems associated with construing reasoning about actions in a manner similar to reasoning about beliefs. We propose a perspective on practical reasoning as presumptive justification of a course of action, along with critical questions of this justification, building on the account of Walton. From this perspective, we articulate an interaction protocol, which we call PARMA, for dialogues over proposed actions based on this theory. We outline an axiomatic semantics for the PARMA Protocol, and discuss two implementations which use this protocol to mediate a discussion between humans. We then show how our proposal can be made computational within the framework of agents based on the Belief-Desire-Intention model, and illustrate this proposal with an example debate within a multi-agent system.
This book provides a systematic analysis of many common argumentation schemes and a compendium of 96 schemes. The study of these schemes, or forms of argument that capture stereotypical patterns of human reasoning, is at the core of argumentation research. Surveying all aspects of argumentation schemes from the ground up, the book takes the reader from the elementary exposition in the first chapter to the latest state of the art in the research efforts to formalize and classify the schemes, outlined in the last chapter. It provides a systematic and comprehensive account, with notation suitable for computational applications that increasingly make use of argumentation schemes.
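An argumentation scheme of the kind this compendium catalogues pairs a stereotypical premise–conclusion pattern with critical questions. As a rough illustration of the computational notation the abstract mentions — not the book's own formalism — a scheme can be represented as a template with placeholders, here instantiated for the argument from expert opinion; all field names and the example bindings are ours:

```python
from dataclasses import dataclass, field

@dataclass
class ArgumentationScheme:
    """A stereotypical argument pattern with matching critical questions."""
    name: str
    premises: list
    conclusion: str
    critical_questions: list = field(default_factory=list)

    def instantiate(self, **bindings):
        """Fill the placeholders to produce a concrete argument plus the
        critical questions left open for an opponent to ask."""
        fill = lambda s: s.format(**bindings)
        return {
            "premises": [fill(p) for p in self.premises],
            "conclusion": fill(self.conclusion),
            "open_questions": [fill(q) for q in self.critical_questions],
        }

# Walton's argument from expert opinion, rendered as a template (wording ours).
expert_opinion = ArgumentationScheme(
    name="Argument from Expert Opinion",
    premises=["{E} is an expert in domain {D}",
              "{E} asserts that {A}"],
    conclusion="{A}, plausibly",
    critical_questions=["Is {E} a genuine expert in {D}?",
                        "Is {A} consistent with what other experts assert?"],
)

arg = expert_opinion.instantiate(
    E="Dr. Lee", D="toxicology", A="the sample contains arsenic")
print(arg["conclusion"])  # → the sample contains arsenic, plausibly
```

Because the critical questions are carried along with each instantiation, an evaluator can treat any unanswered question as a standing challenge to the argument — which is how schemes support the defeasible evaluation described in the surrounding abstracts.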
In this paper, I discuss whether different interpretations of the ‘aim’ of belief—both the teleological and normative interpretations—have the resources to explain certain descriptive and normative features of suspended belief (suspension). I argue that, despite the recent efforts of theorists to extend these theories to account for suspension, they ultimately fail. The implication is that we must either develop alternative theories of belief that can account for suspension, or we must abandon the assumption that these theories ought to be able to account for suspension. To close, I briefly consider some of the reasons we have in favour of pursuing each of these options, and I suggest that it is worth exploring the possibility that suspension is best understood as its own attitude, independently of theories of belief’s ‘aim’.
Fundamentals of Critical Argumentation presents the basic tools for the identification, analysis, and evaluation of common arguments for beginners. The book teaches by using examples of arguments in dialogues, both in the text itself and in the exercises. Examples of controversial legal, political, and ethical arguments are analyzed. Illustrating the most common kinds of arguments, the book also explains how to evaluate each kind by critical questioning. Douglas Walton shows how arguments can be reasonable under the right dialogue conditions by using critical questions to evaluate them. The book teaches by example, both in the text itself and in exercises, but it is based on methods that have been developed through the author's thirty years of research in argumentation studies.
Properties and objects are everywhere, but remain a philosophical mystery. Douglas Ehring argues that the idea of tropes--properties and relations understood as particulars--provides the best foundation for a metaphysical account of properties and objects. He develops and defends a new theory of trope nominalism.
During the Gulf war, CNN correspondent Peter Arnett distinguished himself with his courageous reporting in Iraq while under fire by the U.S.-led coalition which dropped more bombs on Iraq than were unleashed in World War II. Reporting live from Baghdad throughout the war, Arnett provided vivid daily accounts of life in Iraq during one of the most sustained air attacks in history. From his live telephone reporting of the early hours of the U.S. attack on Iraq in January 1991 through his live satellite reports of the effects of the daily bombing of Iraq, Arnett distinguished himself through his attempts to cut through the lies and disinformation of both sides and to provide accurate reporting on the effects of the U.S.-led coalition assault against Iraq.
In this paper we apply a general account of practical reasoning to arguing about legal cases. In particular, we provide a reconstruction of the reasoning of the majority and dissenting opinions for a particular well-known case from property law. This is done through the use of Belief-Desire-Intention (BDI) agents to replicate the contrasting views involved in the actual decision. This reconstruction suggests that the reasoning involved can be separated into three distinct levels: factual and normative levels and a level connecting the two, with conclusions at one level forming premises at the next. We begin by summarising our general approach, which uses instantiations of an argumentation scheme to provide presumptive justifications for actions, and critical questions to identify arguments which attack these justifications. These arguments and attacks are organised into argumentation frameworks to identify the status of individual arguments. We then discuss the levels of reasoning that occur in this reconstruction and the properties and significance of each of these levels. We illustrate the different levels with short examples and also include a discussion of the role of precedents within these levels of reasoning. We begin by summarising our general approach.
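Organising arguments and attacks into a framework to determine each argument's status, as this abstract describes, is standardly done with abstract argumentation semantics in the tradition of Dung. The sketch below computes the grounded (most sceptical) extension by the usual fixpoint idea — accept arguments whose attackers are all defeated, defeat arguments attacked by an accepted one — and is an illustration of the general technique, not the paper's own implementation:

```python
def grounded_extension(arguments, attacks):
    """Return the grounded extension of an abstract argumentation framework.

    arguments: iterable of argument labels
    attacks: set of (attacker, target) pairs
    """
    accepted, defeated = set(), set()
    changed = True
    while changed:
        changed = False
        for a in arguments:
            if a in accepted or a in defeated:
                continue
            attackers = {x for (x, y) in attacks if y == a}
            # Accept a once every one of its attackers is defeated
            # (vacuously true for unattacked arguments).
            if attackers <= defeated:
                accepted.add(a)
                changed = True
        for a in arguments:
            if a in accepted or a in defeated:
                continue
            attackers = {x for (x, y) in attacks if y == a}
            # Defeat a as soon as some accepted argument attacks it.
            if attackers & accepted:
                defeated.add(a)
                changed = True
    return accepted

# A attacks B, B attacks C: A is unattacked so accepted, B is defeated,
# and C is reinstated because its only attacker is defeated.
print(grounded_extension({"A", "B", "C"}, {("A", "B"), ("B", "C")}))
```

The reinstatement of C in the example is the key behaviour: an argument attacked only by defeated arguments regains accepted status, which is what lets a framework adjudicate between a majority and a dissenting line of argument.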
In this paper it is shown how tools developed in argumentation theory and artificial intelligence can be applied to the development of a new dialectical analysis of the speech act of making a proposal in a deliberation dialogue. These tools are developed, modified and used to formulate dialogue pre-conditions, defining conditions and post-conditions for the speech act of making a proposal in a deliberation dialogue. The defining conditions set out what is required for a move in a dialogue to count as the making of a proposal by one of the parties. What is required are the conditions that (1) the move fit the requirements of the argumentation scheme for practical reasoning, and (2) the premises are propositions describing common goals of both parties or propositions that they reasonably consider means to achieve these goals. The analysis goes beyond the standard speech act approach by specifying not only the normative requirements for making a well-formed proposal, but also the requirements for responding to it by questioning or criticizing it, and the requirements for defending it.
Dual-ranking act-consequentialism (DRAC) is a rather peculiar version of act-consequentialism. Unlike more traditional forms of act-consequentialism, DRAC doesn’t take the deontic status of an action to be a function of some evaluative ranking of outcomes. Rather, it takes the deontic status of an action to be a function of some non-evaluative ranking that is in turn a function of two auxiliary rankings that are evaluative. I argue that DRAC is promising in that it can accommodate certain features of commonsense morality that no single-ranking version of act-consequentialism can: supererogation, agent-centered options, and the self-other asymmetry. I also defend DRAC against three objections: (1) that its dual-ranking structure is ad hoc, (2) that it denies (putatively implausibly) that it is always permissible to make self-sacrifices that don’t make things worse for others, and (3) that it violates certain axioms of expected utility theory, viz., transitivity and independence.
The notions of burden of proof and presumption are central to law, but as noted in McCormick on Evidence, they are also the slipperiest of any of the family of legal terms employed in legal reasoning. However, recent studies of burden of proof and presumption (Prakken et al. 2005; Prakken and Sartor 2006; Gordon et al. 2007) offer formal models that can render them into precise tools useful for legal reasoning. In this paper, the various theories and formal models are comparatively evaluated with the aim of working out a more comprehensive theory that can integrate the components of the argumentation structure on which they are based. It is shown that the notion of presumption has both a logical component and a dialectical component, and the new theory of presumption developed in the paper, called the dialogical theory, combines these two components.
Argument invention is a method that can be used to help an arguer find arguments that could be used to prove a claim he needs to defend. The aim of this paper is to show how argumentation systems recently developed in artificial intelligence can be applied to the task of argument invention. One such system called Carneades is featured. Carneades can be used to analyze arguments, evaluate them, make argument diagrams, and construct arguments from a database. Using some simple examples, the paper explains how Carneades works as a system of argument invention.
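A distinctive feature of Carneades-style evaluation is that a claim's acceptability is decided by weighing pro and con arguments against a legal proof standard. The toy function below mimics only one such standard, preponderance of the evidence; it is a hedged sketch of the idea, not the Carneades system's actual API, and the numeric weights are illustrative assumptions:

```python
def preponderance(pro_weights, con_weights):
    """A claim meets the preponderance-of-the-evidence standard when its
    strongest pro argument outweighs the strongest con argument.

    pro_weights, con_weights: lists of argument weights in [0, 1];
    an empty list means no applicable arguments on that side.
    """
    best_pro = max(pro_weights, default=0.0)
    best_con = max(con_weights, default=0.0)
    return best_pro > best_con

# One strong pro argument (0.6) beats the best con argument (0.5).
print(preponderance([0.6, 0.3], [0.5]))  # → True
# With no pro arguments at all, the claim fails the standard.
print(preponderance([], [0.2]))  # → False
```

Stricter standards (e.g. beyond reasonable doubt) can be modelled the same way by demanding a larger margin between the two sides, which is why a proof-standard parameter is useful in argument invention: it tells the arguer how strong a constructed argument must be to carry its claim.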
This is an introductory guide to the basic principles of constructing good arguments and criticizing bad ones. It is nontechnical in its approach, and is based on 150 key examples, each discussed and evaluated in clear, illustrative detail. The author explains how errors, fallacies, and other key failures of argument occur. He shows how correct uses of argument are based on sound argument strategies for reasoned persuasion and critical questions for responding. Among the many subjects covered are: techniques of posing, replying to, and criticizing questions, forms of valid argument, relevance, appeals to emotion, personal attack, uses and abuses of expert opinion, problems in deploying statistics, loaded terms, equivocation, and arguments from analogy.