When Manchester United Football Club publicly announced the signing of Alexis Sanchez in 2018, it did so through a short video that purported to demonstrate the rich traditions and...
Paul Grice has given an account of conversational implicatures that hinges on the hypothesis that communication is a cooperative activity performed by rational agents who pursue a common goal. The attempt to derive Grice's principles from game theory is a natural step, since the aim of game theory is to predict the behaviour of rational agents in situations where the outcome of one agent's choice depends also on the choices of others. Generalised conversational implicatures, and in particular scalar ones, offer an ideal test bed for this working hypothesis, since with this kind of implicature the alternative choices available to the agents are less dependent on context and can be derived from the meanings of the sentences employed. Some rival game-theoretic accounts of the same phenomena will be criticised. The present paper shows that scalar implicatures can be explained using iterated admissibility, although some of them need an additional assumption in order to be accounted for.
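To make the solution concept concrete, here is a minimal Python sketch of iterated admissibility, i.e. iterated deletion of weakly dominated strategies, run on a toy "some/all" signalling game. The strategy labels and payoff numbers are illustrative assumptions of mine, not taken from the paper, and the sketch deletes all dominated strategies simultaneously at each round.

```python
# Iterated admissibility: repeatedly delete weakly dominated strategies.
# Payoffs: pay[(row_strategy, col_strategy)] = (row_payoff, col_payoff).

def payoff(pay, player, own, opp):
    key = (own, opp) if player == 0 else (opp, own)
    return pay[key][player]

def weakly_dominated(s, own_set, opp_set, pay, player):
    """True if some alternative does at least as well against every surviving
    opponent strategy, and strictly better against at least one."""
    for alt in own_set:
        if alt == s:
            continue
        diffs = [payoff(pay, player, alt, o) - payoff(pay, player, s, o)
                 for o in opp_set]
        if all(d >= 0 for d in diffs) and any(d > 0 for d in diffs):
            return True
    return False

def iterated_admissibility(rows, cols, pay):
    rows, cols = set(rows), set(cols)
    while True:
        dead_r = {r for r in rows
                  if len(rows) > 1 and weakly_dominated(r, rows, cols, pay, 0)}
        dead_c = {c for c in cols
                  if len(cols) > 1 and weakly_dominated(c, cols, rows, pay, 1)}
        if not (dead_r or dead_c):
            return rows, cols
        rows -= dead_r
        cols -= dead_c

# Toy scalar game: the speaker says "some" or "all"; the hearer reads "some"
# either as "some but not all" or as compatible with "all".
pay = {('say_some', 'read_some_not_all'): (1, 1),
       ('say_some', 'read_all'):          (1, 0),
       ('say_all',  'read_some_not_all'): (0, 1),
       ('say_all',  'read_all'):          (2, 1)}
print(iterated_admissibility(['say_some', 'say_all'],
                             ['read_some_not_all', 'read_all'], pay))
# -> ({'say_some'}, {'read_some_not_all'}): the scalar reading survives.
```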
Epistemologists and philosophers of science have often attempted to express formally the impact of a piece of evidence on the credibility of a hypothesis. In this paper we will focus on the Bayesian approach to evidential support. We will propose a new formal treatment of the notion of degree of confirmation and we will argue that it overcomes some limitations of the currently available approaches on two grounds: (i) a theoretical analysis of the confirmation relation, seen as an extension of logical deduction, and (ii) an empirical comparison of competing measures in an experimental inquiry concerning inductive reasoning in a probabilistic setting.
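For orientation, here is a small Python sketch of several candidate measures of degree of confirmation discussed in this literature: the difference measure, the log probability ratio, the log likelihood ratio, and a relative-distance measure. Which of these (if any) corresponds to the paper's own proposal is not settled here; the definitions below are the ones standardly given.

```python
# Four standard Bayesian confirmation measures for hypothesis H and
# evidence E, computed from a prior and the two likelihoods.
from math import log

def confirmation(p_h, p_e_given_h, p_e_given_not_h):
    p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)   # total probability
    p_h_given_e = p_e_given_h * p_h / p_e                    # Bayes' theorem
    d = p_h_given_e - p_h                                    # difference
    r = log(p_h_given_e / p_h)                               # log prob. ratio
    l = log(p_e_given_h / p_e_given_not_h)                   # log likelihood ratio
    # Relative distance: how far the posterior travels toward the maximum
    # (1) or the minimum (0) it could possibly reach.
    z = d / (1 - p_h) if d >= 0 else d / p_h
    return {'d': d, 'r': r, 'l': l, 'z': z}

print(confirmation(p_h=0.3, p_e_given_h=0.9, p_e_given_not_h=0.2))
```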
In the first exposition of the doctrine of indeterminacy of translation, Quine asserted that the individuation and translation of truth-functional sentential connectives like 'and', 'or', 'not' are not indeterminate. He changed his mind later on, conjecturing that some sentential connectives might be interpreted in different non-equivalent ways. This issue has not been debated much by Quine, or in the subsequent literature, it is, as it were, an unsolved problem, not well understood. For the sake of the argument, I will adopt (...) Quine's background assumption that all the semantic features of a language can be reduced to the speakers' dispositions toward assent and dissent, as far as only the truth-conditional core of the meaning of sentences is concerned. I will put forward an argument to the effect that the speech dispositions of most, if not all, English (French, Italian, etc.) speakers constrain a unique translation of their connectives. This argument crucially relies on an empirical conjecture concerning the behaviour of these operators. (shrink)
This paper outlines an account of conditionals, the evidential account, which rests on the idea that a conditional is true just in case its antecedent supports its consequent. As we will show, the evidential account exhibits some distinctive logical features that deserve careful consideration. On the one hand, it departs from the material reading of ‘if then’ exactly in the way we would like it to depart from that reading. On the other, it significantly differs from the non-material accounts which hinge on the Ramsey Test, advocated by Adams, Stalnaker, Lewis, and others.
The so-called problem of irrelevant conjunction has been seen as a serious challenge for theories of confirmation. It involves the consequences of conjoining irrelevant statements to a hypothesis that is confirmed by some piece of evidence. Following Hawthorne and Fitelson, we reconstruct the problem with reference to Bayesian confirmation theory. Then we extend it to the case of conjoining irrelevant statements to a hypothesis that is disconfirmed by some piece of evidence. As a consequence, we obtain and formally present a novel and more troublesome problem of irrelevant conjunction. We conclude by indicating a possible solution based on a measure-sensitive approach and by critically discussing a major alternative way to address the problem.
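A toy numerical illustration (the numbers are mine) of the original problem: if X is irrelevant to H, and to E given H, then the probability ratio measure assigns the conjunction H-and-X exactly the same support as H, while the difference measure discounts the tacked-on conjunct. This is the kind of measure sensitivity the concluding proposal trades on.

```python
# E confirms H; X is irrelevant: independent of H, and P(E | H & X) = P(E | H).
p_h, p_x = 0.4, 0.5
p_e_given_h, p_e_given_not_h = 0.8, 0.3

p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)        # = 0.5
ratio_h  = p_e_given_h / p_e                                  # support for H
ratio_hx = p_e_given_h / p_e                                  # support for H & X
diff_h  = p_e_given_h * p_h / p_e - p_h                       # = 0.24
diff_hx = p_e_given_h * (p_h * p_x) / p_e - p_h * p_x         # = 0.12
print(ratio_h == ratio_hx)   # True: the ratio measure ignores the conjunct
print(diff_h, diff_hx)       # the difference measure penalises it
```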
Uso, significato e riferimento (Use, Meaning, and Reference) - This article is an exposition of W.V. Quine's doctrine of the indeterminacy of translation of terms. The aim is to provide a clear formulation of this doctrine, to distinguish it from the much stronger claim that the translation of sentences is indeterminate, and to outline the arguments put forward by Quine. The most systematic of these, the argument from proxy functions, is reconstructed in detail. Finally, it is argued that the ultimate ground of the doctrine is the acceptance of the semantic primacy of sentences. The claim that meaning has to be identified with language use is also discussed.
The conjunction fallacy has been a key topic in debates on the rationality of human reasoning and its limitations. Despite extensive inquiry, however, the attempt to provide a satisfactory account of the phenomenon has proved challenging. Here we elaborate the suggestion (first discussed by Sides, Osherson, Bonini, & Viale, 2002) that in standard conjunction problems the fallacious probability judgements observed experimentally are typically guided by sound assessments of confirmation relations, meant in terms of contemporary Bayesian confirmation theory. Our main formal result is a confirmation-theoretic account of the conjunction fallacy, which is proven robust (i.e., not depending on various alternative ways of measuring degrees of confirmation). The proposed analysis is shown to be distinct from contentions that the conjunction effect is in fact not a fallacy, and is compared with major competing explanations of the phenomenon, including earlier references to a confirmation-theoretic account.
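The core formal point can be shown with toy numbers of my own: the probability calculus forces P(T & F | E) <= P(T | E), yet a confirmation measure such as the probability ratio can still rank the conjunction above the lone conjunct, which is the pattern the account exploits.

```python
# Linda-style toy case: E = the description, T = "bank teller",
# F = "active feminist". All numbers are illustrative and jointly coherent.
p_t = 0.05                    # prior of T
p_tf = 0.015                  # prior of T & F (necessarily <= p_t)
p_t_given_e = 0.06            # E barely raises T...
p_tf_given_e = 0.05           # ...but raises T & F a lot

assert p_tf_given_e <= p_t_given_e          # the conjunction rule holds
support_t  = p_t_given_e / p_t              # = 1.2
support_tf = p_tf_given_e / p_tf            # ~ 3.33: stronger confirmation
print(support_t, support_tf)
```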
Probability ratio and likelihood ratio measures of inductive support and related notions have appeared as theoretical tools for probabilistic approaches in the philosophy of science, the psychology of reasoning, and artificial intelligence. In an effort at conceptual clarification, several authors have pursued axiomatic foundations for these two families of measures. Such results have been criticized, however, as relying on unduly demanding or poorly motivated mathematical assumptions. We provide two novel theorems showing that probability ratio and likelihood ratio measures can be axiomatized in a way that overcomes these difficulties.
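For reference, the two families of measures at issue, in their standard formulations:

```latex
\[
  \mathrm{PR}(H,E) \;=\; \frac{P(H \mid E)}{P(H)},
  \qquad
  \mathrm{LR}(H,E) \;=\; \frac{P(E \mid H)}{P(E \mid \lnot H)}.
\]
```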
Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people’s goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism.
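As a concrete anchor, here is a compact Python sketch of the Sharma-Mittal family (order r, degree t) in which Shannon, Rényi, and Tsallis entropies arise as special or limiting cases. Routing the computation through Rényi entropy is my own implementation choice, not the paper's notation; natural-log units throughout.

```python
import numpy as np

def renyi(p, r, eps=1e-9):
    """Rényi entropy of order r; the r -> 1 limit is Shannon entropy."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                              # convention: 0 * log 0 = 0
    if abs(r - 1) < eps:
        return -np.sum(p * np.log(p))         # Shannon
    return np.log(np.sum(p ** r)) / (1 - r)

def sharma_mittal(p, r, t, eps=1e-9):
    """Sharma-Mittal entropy of order r and degree t."""
    h_r = renyi(p, r, eps)
    if abs(t - 1) < eps:
        return h_r                            # t -> 1: Rényi of order r
    return (np.exp((1 - t) * h_r) - 1) / (1 - t)

p = [0.5, 0.25, 0.25]
print(sharma_mittal(p, 1, 1))   # Shannon entropy
print(sharma_mittal(p, 2, 1))   # Rényi of order 2 (collision entropy)
print(sharma_mittal(p, 2, 2))   # Tsallis of order 2: 1 - sum(p_i^2) = 0.625
```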
In his mature writings, Kuhn describes the process of specialisation as driven by a form of incommensurability, defined as a conceptual/linguistic barrier which promotes and guarantees the insularity of specialties. In this paper, we reject the idea that the incommensurability among scientific specialties is a linguistic barrier. We argue that the problem with Kuhn’s characterisation of the incommensurability among specialties is that he presupposes a rather abstract theory of semantic incommensurability, which he then tries to apply to his description of the process of specialisation. By contrast, this paper follows a different strategy: after criticising Kuhn’s view, it takes a further look at how new scientific specialties emerge. As a result, a different way of understanding incommensurability among specialties will be proposed.
In his late years, Thomas Kuhn became interested in the process of scientific specialization, which does not seem to possess the destructive element that is characteristic of scientific revolutions. It therefore makes sense to investigate whether and how Kuhn’s insights about specialization are consistent with, and actually fit, his model of scientific progress through revolutions. In this paper, I argue that the transition toward a new specialty corresponds to a revolutionary change for the group of scientists involved in such a transition. I will clarify the role of the scientific community in revolutionary changes and characterize the incommensurability across specialties as possessing both semantic and methodological aspects. The discussion of the discovery of the structure of DNA will serve both as an illustration of my main argument and as a reply to one criticism raised against Kuhn, namely that his model cannot capture cases of revolutionary yet non-disruptive episodes of scientific progress. Revisiting Kuhn’s ideas on specialization will shed new light on some often overlooked features of scientific change.
We prove that the unification type of Łukasiewicz logic and of its equivalent algebraic semantics, the variety of MV-algebras, is nullary. The proof rests upon Ghilardi's algebraic characterisation of unification types in terms of projective objects, recent progress by Cabrer and Mundici in the investigation of projective MV-algebras, the categorical duality between finitely presented MV-algebras and rational polyhedra, and, finally, a homotopy-theoretic argument that exploits lifts of continuous maps to the universal covering space of the circle. We discuss the background to such diverse tools. In particular, we offer a detailed proof of the duality theorem for finitely presented MV-algebras and rational polyhedra, a fundamental result that, albeit known to specialists, seems to appear in print here for the first time.
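For orientation, the standard MV-algebra on the real unit interval, which provides the intended semantics of Łukasiewicz logic:

```latex
\[
  x \oplus y = \min(1,\; x + y), \qquad
  \lnot x = 1 - x, \qquad
  x \odot y = \max(0,\; x + y - 1).
\]
```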
In this paper we are concerned with the ways GCH can fail in relation to rank-into-rank hypotheses, i.e., very large cardinals usually denoted by I3, I2, I1, and I0. The main results are a satisfactory analysis of the way the power function can vary on regular cardinals in the presence of rank-into-rank hypotheses, and the consistency under I0 of the existence of $j : V_{\lambda+1} \prec V_{\lambda+1}$ with the failure of GCH at $\lambda$.
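For readers unfamiliar with the hierarchy, the rank-into-rank hypotheses as they are standardly stated (each asserts the existence of a nontrivial elementary embedding; strength increases down the list):

```latex
\begin{align*}
  \mathrm{I3}:\;& \exists\, j : V_{\lambda} \prec V_{\lambda}\\
  \mathrm{I2}:\;& \exists\, j : V \prec M \text{ with } V_{\lambda} \subseteq M,
                 \text{ where } \lambda = \sup_{n} j^{n}(\operatorname{crit}(j))\\
  \mathrm{I1}:\;& \exists\, j : V_{\lambda+1} \prec V_{\lambda+1}\\
  \mathrm{I0}:\;& \exists\, j : L(V_{\lambda+1}) \prec L(V_{\lambda+1})
                 \text{ with } \operatorname{crit}(j) < \lambda
\end{align*}
```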
Incommensurability may be regarded as driving specialisation, on the one hand, and as posing some problems to interdisciplinarity, on the other. It may be argued, however, that incommensurability plays no role in either specialisation or interdisciplinarity. Scientific specialties could be defined as simply 'different' (that is, about different things), rather than 'incommensurable' (that is, competing for the explanation of the same phenomena). Interdisciplinarity could be viewed as the coordinated effort of scientists possessing complementary and interlocking skills, and not as the overcoming of some sort of incommensurable divide. This article provides a comprehensive evaluative examination of the relations between specialisation, interdisciplinarity, and incommensurability. Its aim is to defend the relevance of incommensurability to both specialisation and interdisciplinarity. At the same time, it aims at correcting the tendency, common among many philosophers, to regard incommensurability in a restrictive manner, for example as an almost purely semantic issue.
We offer a proof of the duality theorem for finitely presented MV-algebras and rational polyhedra, a folklore and yet fundamental result. Our approach first develops a general dual adjunction between MV-algebras and subspaces of Tychonoff cubes, endowed with the transformations that are definable in the language of MV-algebras. We then show that this dual adjunction restricts to a duality between semisimple MV-algebras and closed subspaces of Tychonoff cubes. The duality theorem for finitely presented objects is obtained by a further specialisation. Our treatment is aimed at showing exactly which parts of the basic theory of MV-algebras are needed in order to establish these results, with an eye towards future generalisations.
Interaction mining is about discovering and extracting insightful information from digital conversations, namely those human-human information exchanges mediated by digital network technology. We present in this article a computational model of natural arguments and its implementation for the automatic argumentative analysis of digital conversations, which allows us to produce relevant information for building interaction business analytics applications that overcome the limitations of standard text mining and information retrieval technology. Applications include advanced visualisations and abstractive summaries.
The paper lists several editions of Euclid’s Elements in the Early Modern Age, giving for each of them the axioms and postulates employed to ground elementary mathematics.
The problem of artificial precision is a major objection to any theory of vagueness based on real numbers as degrees of truth. Suppose you are willing to admit that, under sufficiently specified circumstances, a predication of “is red” receives a unique, exact number from the real unit interval [0, 1]. You should then be committed to explaining what it is that determines that value, settling for instance that my coat is red to degree 0.322 rather than 0.321. In this note I revisit the problem in the important case of Łukasiewicz infinite-valued propositional logic, which brings to the foreground the rôle of maximally consistent theories. I argue that the problem of artificial precision, as commonly conceived of in the literature, actually conflates two distinct problems of a very different nature.
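The standard Łukasiewicz connectives on [0, 1] make the setting concrete: once the atoms receive exact degrees, every compound sentence receives one too, which is precisely where the artificial-precision worry bites. A minimal Python sketch, with arbitrary example degrees:

```python
# Standard Łukasiewicz infinite-valued connectives on [0, 1].
def neg(x):           return 1 - x
def implies(x, y):    return min(1, 1 - x + y)
def strong_and(x, y): return max(0, x + y - 1)   # Łukasiewicz t-norm
def strong_or(x, y):  return min(1, x + y)

red, crimson = 0.322, 0.25       # exact degrees, e.g. "my coat is red"
print(implies(crimson, red))     # 1.0: fully true whenever x <= y
print(strong_and(red, neg(red))) # 0.0: contradictions get degree 0
```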
Theory change is a central concern in contemporary epistemology and philosophy of science. In this paper, we investigate the relationships between two ongoing research programs providing formal treatments of theory change: the (post-Popperian) approach to verisimilitude and the AGM theory of belief change. We show that appropriately construed accounts emerging from those two lines of epistemological research do yield convergences relative to a specified kind of theories, here labeled “conjunctive”. In this domain, a set of plausible conditions is identified which demonstrably captures the verisimilitudinarian effectiveness of AGM belief change, i.e., its effectiveness in tracking truth approximation. We conclude by indicating some further developments and open issues arising from our results.
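A minimal sketch, under simplifying assumptions of my own, of how "conjunctive" theories can be modelled as sets of literals, with AGM-style expansion and a toy verisimilitude score (true conjuncts minus false ones, normalised); the paper's official definitions may well differ.

```python
# A conjunctive theory is a set of literals over atomic propositions,
# e.g. {'a', '~c'} for the conjunction a & ~c.
TRUTH = {'a': True, 'b': True, 'c': False}   # the (unknown) complete truth

def verisimilitude(theory, truth=TRUTH):
    score = 0
    for lit in theory:
        atom, claimed = (lit[1:], False) if lit.startswith('~') else (lit, True)
        score += 1 if truth[atom] == claimed else -1
    return score / len(truth)

def expand(theory, lit):
    """AGM expansion: simply add the new conjunct (no consistency check)."""
    return theory | {lit}

t0 = {'a', '~c'}
t1 = expand(t0, 'b')                            # learning a true conjunct
print(verisimilitude(t0), verisimilitude(t1))   # ~0.67 -> 1.0: truth tracking
```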
This book reconstructs, from both the historical and the theoretical points of view, Leibniz's geometrical studies, focusing in particular on the research Leibniz ...
Aurama is a system designed to provide peace of mind and a sense of connectedness to adults who care for elderly parents living alone. Aurama monitors the elders at home using unobtrusive sensor technology and collects data about sleeping patterns, weight trends, cognitive abilities and presence at home. The system provides an unobtrusive ambient information display that presents the status of the elder and lets its users inspect long-term data about the well-being of the elder interactively. Aurama was designed iteratively with substantial user involvement through interviews, prototype evaluation, focus groups and lab tests. The final prototype was evaluated in two field trials, each involving an elder and their adult children. The input of users throughout the design process and during these tests demonstrates clearly the potential of awareness systems to support the target user group in obtaining peace of mind and feeling connected. Furthermore, the users indicate a clear need for information on long-term trends relating to the well-being of aging parents, in contrast to the current emphasis in this field of research on providing instantaneous status information about daily activities and context.
The year 2017 has ushered in a new era, one in which cyber terrorism and cyber extremism are increasingly going to be significant factors in our day-to-day lives. Whether we like it or not, social media platforms today are infiltrated by cyber terrorists and cyber extremists. In addition, cyber radicalization as a phenomenon is constantly on the rise.
This article reports on the International Conference on Cyberlaw, Cybercrime & Cybersecurity. The conference was addressed by more than 150 speakers and backed by more than 80 supporters, and it offered a wonderful opportunity to network with international thought leaders under one roof.
In this essay, I argue that the Indian state’s response to the Maoist insurgency has been ideologically shaped by the “new terrorism” discourse cultivated by Western powers, particularly by the United States. Following the post-9/11 othering of Islamic terrorism as a trope of a “civilizational clash” between East and West, the Indian state has strategically demarcated the regions affected by the Maoist armed insurgency as the “Red Corridor,” conceiving the insurgency as “the single biggest threat to the internal security of the nation.” The domestic othering of the Red Corridor as an “unpatriotic,” “undemocratic,” “contaminated,” or even a “diseased zone” is further exacerbated by the systemic demonization, criminalization, and depoliticization of the Maoist insurgency through state-sponsored propaganda. In the attempt to uncover the implied collusion and complicity between the U.S.-led “war on terror” and India’s “war with the Maoists,” I draw on Hamid Dabashi’s view of “post-Orientalism,” Giorgio Agamben’s notion of “bare life,” and Achille Mbembe’s coinage of “necropolitics.”