In 1969 Gilles Deleuze obtained his doctorate with two theses: the principal one, published as Différence et répétition, and a secondary one devoted to analysing the concept of expression in Spinoza's work. This second work, entitled Spinoza et le problème de l'expression, aims to specify the role this concept plays in Spinoza's thought and to show how it proves central to the understanding of the s...
Recently the complexity of discursive practices has been widely acknowledged by the humanities and social sciences. In fact, to know anything is to know it in terms of one or more discourses. The "discursive turn" in psychology may be considered a new paradigm suited to the study of (wo)man only if it is able to grasp the semiotic ground of psychic experience both as an "effort after meaning" and as a "struggle over meaning." In this sense the notion of "diatext" has been proposed as a contribution towards a psychosemiotic approach to understanding how discursive practices assign subject-positions to the agents of each interlocution scenario.
Artefacts do not always do what they are supposed to, for a variety of reasons, including manufacturing problems, poor maintenance, and normal wear and tear. Since software is an artefact, it should be subject to malfunctioning in the same sense in which other artefacts can malfunction. Yet whether software is on a par with other artefacts when it comes to malfunctioning crucially depends on the abstraction used in the analysis. We distinguish between "negative" and "positive" notions of malfunction. A negative malfunction, or dysfunction, occurs when an artefact token either does not or cannot do what it is supposed to. A positive malfunction, or misfunction, occurs when an artefact token may do what it is supposed to but, at least occasionally, also yields some unintended and undesirable effects. We argue that software, understood as type, may misfunction in some limited sense, but cannot dysfunction. Accordingly, one should distinguish software from other technical artefacts, in view of a design that makes dysfunction impossible for the former while possible for the latter.
The time marked by the clock hands, so-called "objective time," is deeply different from the time perceived by the individual. Starting from this hypothesis, directly connected to the subjective modality of "living" time and defined as time perspective, we try to understand how much it affects the various domains of people's lives, attitudes, and experiences. The research therefore investigates whether all our decisions can be influenced by one or more time perspectives beyond our awareness. Last, but not least, we try to understand whether some time perspectives are more functional and adaptive than others in specific contexts.
Accidents involving autonomous vehicles (AVs) raise difficult ethical dilemmas and legal issues. It has been argued that self-driving cars should be programmed to kill, that is, they should be equipped with pre-programmed approaches to the choice of which lives to sacrifice when losses are inevitable. Here we explore a different approach, namely, giving the user/passenger the task of deciding what ethical approach should be taken by AVs in unavoidable accident scenarios. We thus assume that AVs are equipped with what we call an "Ethical Knob", a device enabling passengers to ethically customise their AVs, namely, to choose between different settings corresponding to different moral approaches or principles. Accordingly, AVs would be entrusted with implementing users' ethical choices, while manufacturers/programmers would be tasked with enabling the user's choice and ensuring implementation by the AV.
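The division of labour described in this abstract can be illustrated with a minimal sketch. The setting names and the decision rule below are assumptions for illustration only, not the device specified in the paper:

```python
from enum import Enum

class KnobSetting(Enum):
    # Hypothetical knob settings illustrating passenger-chosen
    # moral approaches; the names are assumptions, not the paper's.
    EGOISTIC = "prefer the passengers"
    IMPARTIAL = "minimise total expected harm"
    ALTRUISTIC = "prefer third parties"

def choose_outcome(setting, harm_to_passengers, harm_to_others):
    """Decide which party the AV sacrifices in an unavoidable
    accident, given the knob setting and the expected harms."""
    if setting is KnobSetting.EGOISTIC:
        return "others"
    if setting is KnobSetting.ALTRUISTIC:
        return "passengers"
    # Impartial setting: sacrifice whichever side minimises harm.
    return "others" if harm_to_others < harm_to_passengers else "passengers"
```

On this sketch, the manufacturer ships only the dispatch mechanism; the morally salient choice of `setting` remains with the passenger, which is the division of responsibility the abstract proposes.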
This article reflects on a survey carried out at a non-profit organization providing health care for terminally ill oncological patients, in order to determine, for those involved in the project, each worker's time perspective and well-being class. The survey identified each team member's time perspective and well-being class and made it possible to build a pedagogical path for work orientation involving those same team members.
Computing, today more than ever before, is a multi-faceted discipline which collates several methodologies, areas of interest, and approaches: mathematics, engineering, programming, and applications. Given its enormous impact on everyday life, it is essential that its debated origins are understood, and that its different foundations are explained. On the Foundations of Computing offers a comprehensive and critical overview of the birth and evolution of computing, and it presents some of the most important technical results and philosophical problems of the discipline, combining both historical and systematic analyses.

The debates this text surveys are among the latest and most urgent ones: the crisis of foundations in mathematics and the birth of the decision problem, the nature of algorithms, the debates on computational artefacts and malfunctioning, and the analysis of computational experiments. By covering these topics, On the Foundations of Computing provides a much-needed resource to contextualize these foundational issues.

For practitioners, researchers, and students alike, the historical and philosophical approach this volume offers is essential to understanding the past of the discipline and to facing the challenges of its future.
A widespread opinion holds that norms and codes of conduct as such can only be established via words, that is, in some lexical form. This perspective can be criticized: some norms produced by human acts are not word-based at all. For example, many norms are actually conveyed through graphics, sounds, or a silent gesture. In this article, we focus on the norms that are created by means of drawings and can be termed "drawn norms" or "graphical norms." Specifically, we inquire into the phenomenon of graphical norms with particular regard to traffic signs and land-use plans, and we discuss the philosophical and legal problems to which these phenomena give rise.
The dependence of both present and future dynamics of life on history is a common intuition in biology and in the humanities. Historicity will be understood in terms of changes of the space of possibilities, as well as through the role of diversity in life's structural stability and of rare events in history formation. We hint at a rigorous analysis of "path dependence" in terms of invariants and invariance-preserving transformations, as may also be found in physics, while departing from the physico-mathematical analyses. The idea is that the invariant traces of the past under organismal or ecosystemic transformations contribute to the understanding of present and future states of affairs. This yields a peculiar form of unpredictability in biology, at the core of novelty formation: the changes of observables and pertinent parameters may depend also on past events. In particular, in relation to the properties of synchronic measurement in physics, the relevance of diachronic measurement in biology is highlighted. This analysis may apply a fortiori to cognitive and historical human dynamics, while allowing us to investigate some general properties of historicity in biology.
So-called Locke's thesis is the view that no two things of the same kind may coincide, that is, may be completely in the same place at the same time. A number of counter-examples to this view have been proposed. In this paper, some new and arguably more convincing counter-examples to Locke's thesis are presented. In these counter-examples, a particular entity (a string, a rope, a net, or similar) is interwoven to obtain what appears to be a distinct, thicker entity of the same kind. It is argued that anyone who subscribes to certain standard metaphysical arguments, which are generally taken for granted in the debate about Locke's thesis, is virtually compelled to accept the counter-examples.
We provide a full characterization of computational error states for information systems. The class of errors considered is general enough to include human rational processes, logical reasoning, scientific progress, and data processing in some functional programming languages. The aim is to reach a full taxonomy of error states by analysing the recovery and processing of data. We conclude by presenting machine-readable checking and resolution algorithms.
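As a loose illustration of what a machine-readable check over error states might look like, the sketch below classifies an error state by whether a stored datum can be recovered and then re-processed. The category names and the recovery/processing interface are assumptions for illustration, not the taxonomy the paper develops:

```python
def classify_error(datum, recover, process):
    """Classify an error state by attempting recovery and then
    re-processing of the stored datum (illustrative only)."""
    try:
        value = recover(datum)
    except Exception:
        # The datum cannot even be recovered.
        return "unrecoverable"
    try:
        process(value)
    except Exception:
        # Recovery succeeds, but processing still fails.
        return "recoverable-but-unprocessable"
    return "resolved"
```

For instance, `classify_error("42", int, lambda v: v + 1)` resolves, while a malformed datum such as `"x"` under the same recovery function is classified as unrecoverable.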
This personal, yet scientific, letter to Alan Turing reflects on Turing's personality in order to better understand his scientific quest. It then focuses on the impact of his work today. By joining a human attitude with a particular scientific method, Turing is able to "immerse himself" in the phenomena on which he works. This peculiar blend justifies the epistolary style. Turing makes himself a "human computer"; he lives the dramatic quest for an undetectable imitation of a man, a woman, a machine. He makes us see the continuous deformations of a material action/reaction/diffusion dynamics of hardware with no software. Each of these investigations opens the way to new scientific paths with major consequences for contemporary life and for knowledge. The uses and effects of these investigations are discussed: the passage from classical AI to today's neural nets, the relevance of non-linearity in biological dynamics, but also their abuses, such as the myth of a computational world, from a Turing-machine-like universe to a homunculus encoded in the DNA. It is shown that these latter ideas, sometimes even advanced in Turing's name, contradict his views.
This paper introduces a multi-modal polymorphic type theory to model epistemic processes characterized by trust, defined as a second-order relation affecting the communication process between sources and a receiver. In this language, a set of senders is expressed by a modal prioritized context, whereas the receiver is formulated in terms of a contextually derived modal judgement. Introduction and elimination rules for the modalities are based on the polymorphism of terms in the language. This leads to a multi-modal, non-homogeneous version of a type theory, in which we show the embedding of the modal operators into standard group-knowledge operators.
We analyze the results from three different risk-attitude elicitation methods: first, the widely used test by Holt and Laury (HL); second, the lottery-panel task by Sabater-Grande and Georgantzis (SG); and third, responses to a survey question on self-assessment of general attitude towards risk. The first and second tasks are implemented with real monetary incentives, while the third concerns all domains of life in general. As in previous studies, the correlation of decisions across tasks is low and usually statistically non-significant. However, when we consider only subjects whose behavior across the panels of the SG task is compatible with constant relative risk aversion (CRRA), the correlation between HL and self-assessed risk attitude becomes significant. Furthermore, the correlation between HL and SG also increases for CRRA-compatible subjects, although it remains statistically non-significant.
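The CRRA-compatibility idea behind this screening can be made concrete with a small sketch. The lottery payoffs used below are illustrative numbers in the style of a Holt–Laury row, not the paper's data:

```python
import math

def crra_utility(x, r):
    """Constant relative risk aversion utility with coefficient r;
    the logarithmic case handles r == 1."""
    return math.log(x) if r == 1 else x ** (1 - r) / (1 - r)

def expected_utility(lottery, r):
    # A lottery is a list of (probability, payoff) pairs.
    return sum(p * crra_utility(x, r) for p, x in lottery)

def prefers_safe(safe, risky, r):
    """True if an agent with CRRA coefficient r prefers the safe
    lottery. A subject counts as CRRA-compatible when a single r
    rationalises all of their observed choices."""
    return expected_utility(safe, r) > expected_utility(risky, r)
```

For example, with a safe pair paying 2.00 or 1.60 (each with probability 0.5) and a risky pair paying 3.85 or 0.10, a risk-neutral agent (r = 0) picks the risky lottery while a log-utility agent (r = 1) picks the safe one.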
This paper proposes a critical analysis of the interpretation of the Nāgārjunian doctrine of the two truths summarized, by both Mark Siderits and Jay L. Garfield, in the formula: "the ultimate truth is that there is no ultimate truth". This 'semantic reading' of Nāgārjuna's theory, despite its importance as a criticism of the 'metaphysical interpretations', is itself defective and improbable. Firstly, the semantic interpretation presents a formal defect: it fails to clearly and explicitly express that which it contains logically; the previously mentioned formula must necessarily be completed by: "the conventional truth is that nothing is conventional truth". Secondly, once what Siderits' and Garfield's analyses contain implicitly is recognized, other logical and philological defects in their position emerge: the existence of the 'conventional' appears, despite the efforts of semantic interpreters to demonstrate quite the contrary, definitively inconceivable without the presupposition of something 'real'; moreover, the number of verses in Nāgārjuna that are in opposition to the semantic interpretation (even if we grant semantic interpreters that these verses do not justify a metaphysical reconstruction of Nāgārjuna's doctrine) seems too great and significant to be ignored.
This paper presents a systematic discussion, mainly for non-economists, of economic approaches to the concept of sustainable development. As a first step, the concept of sustainability is extensively discussed. As a second step, the argument that sustainability cannot be considered from an economic or ecological point of view alone is defended; issues such as economic-ecological integration and inter-generational and intra-generational equity are considered of fundamental importance. Two different economic approaches to environmental issues, i.e. neo-classical environmental economics and ecological economics, are compared. Some key differences are pointed out, such as weak versus strong sustainability, commensurability versus incommensurability, and ethical neutrality versus the acceptance of different values.
We offer a formal treatment of the semantics of both complete and incomplete mistrustful or distrustful information transmissions. The semantics of such relations is analysed in view of rules that define the behaviour of a receiving agent. We justify this approach in view of human agent communications and secure system design. We further specify some properties of such relations.
Software-intensive science (SIS) challenges our current scientific methods in many ways. This significantly affects our notion of science and the scientific interpretation of the world, while driving the philosophical debate. We consider some issues prompted by SIS in the light of the philosophical categories of ontology and epistemology.
Coherent-ambiguity aversion is defined within the smooth-ambiguity (KMM) model as the combination of choice-ambiguity and value-ambiguity aversion. Five ambiguous decision tasks are analyzed theoretically, in which an individual faces two-stage lotteries with binomial, uniform, or unknown second-order probabilities. Theoretical predictions are then tested through a 10-task experiment. In tasks 1–5, risk aversion is elicited through both a portfolio-choice method and a BDM mechanism. In tasks 6–10, choice-ambiguity aversion is elicited through the portfolio-choice method, while value-ambiguity aversion is elicited through the BDM mechanism. The behavior of over 75% of classified subjects is in line with the KMM model in all tasks 6–10, independently of their degree of risk aversion. Furthermore, the percentage of coherent-ambiguity-averse subjects is lower in the binomial than in the uniform and unknown treatments, with only the latter difference being significant. Most coherent-ambiguity-loving subjects show high risk aversion.
This study investigates the acquisition of integrated object manipulation and categorization abilities through a series of experiments in which human adults and artificial agents were asked to learn to manipulate two-dimensional objects that varied in shape, color, weight, and color intensity. The analysis of the obtained results and the comparison of the behavior displayed by human and artificial agents allowed us to identify the key role played by features affecting the agent/environment interaction, the relation between category and action development, and the role of cognitive biases originating from previous knowledge.
The following article proposes an empirical study exploring communication strategies in the Cosa Nostra. Psychological studies on the characteristics of language within the criminal organization are undoubtedly recent, but they are crucial for a thorough understanding of the implicit and explicit communication it adopts in the various contexts in which it operates, as well as the power and value these assume. The data, obtained from videos of interviews and police interrogations of men of honor, have been analyzed through a method grounded in qualitative research in clinical psychology: the Grounded Theory of Glaser and Strauss. The analysis we have carried out, and the data it yields, show us a Mafia world in which care over linguistic choices, in both form and content, is a crucial activity, even when words are replaced by silence or gestures.
The usual representation of quantum algorithms, limited to the process of solving the problem, is physically incomplete. We complete it in three steps: extending the representation to the process of setting the problem, relativizing the extended representation to the problem solver from whom the problem setting must be concealed, and symmetrizing the relativized representation under time reversal to represent the reversibility of the underlying physical process. The third step projects the input state of the representation, in which the problem solver is completely ignorant of the setting and thus of the solution of the problem, onto one in which she knows half of the solution. Completing the physical representation shows that the number of computation steps required to solve any oracle problem in an optimal quantum way should be that of a classical algorithm endowed with advance knowledge of half of the solution.
This paper proposes an interpretation of Nāgārjuna's doctrine of the two truths that considers saṃvṛti and paramārtha-satya two visions of reality on which the Buddhas, for soteriological and pedagogical reasons, build teachings of two types: respectively in agreement with (for example, the teaching of the Four Noble Truths) or in contrast to (for example, the teaching of emptiness) the category of svabhāva. The early sections of the article show to what extent the various current interpretations of the Nāgārjunian doctrine of the dve satye, despite their sometimes even macroscopic differences, share a tendency to consider the notion of śūnyatā a teaching not based on, but equivalent to, supreme truth. This equivalence, philologically questionable, leads to interpretative paths that prove inevitably aporetic: indeed, depending on whether the interpretation of śūnyatā is 'metaphysical' or 'anti-metaphysical', it gives rise to readings of Nāgārjuna's thought incompatible, respectively, with the anti-metaphysical and the realistic types of verses traceable in the works of the author of the Mūla-madhyamaka-kārikā (MMK). On the contrary, by giving more emphasis to the expression samupāśritya ("based on"), which recurs in MMK.24.8, and therefore by epistemologically separating the notion of śūnyatā from the notion of paramārtha-satya (and from some of its conceptual equivalents, such as nirvāṇa, tattva and dharmatā), we may obtain an interpretation, at once realistic and anti-metaphysical, of the theory of the two truths compatible with the vast majority (or even the totality) of Nāgārjuna's verses.
In his classic introduction to the subject, Cognitive Therapy and the Emotional Disorders, Aaron Beck observes that "the philosophical underpinnings" of cognitive therapy's (CT) approach to the emotional disorders "go back thousands of years, certainly to the time of the Stoics, who considered man's conceptions (or misconceptions) of events rather than the events themselves as the key to his emotional upsets" (Beck 1976, 3). But beyond acknowledging that the Stoics anticipated the central insight of CT, Beck has very little to say about the philosophical underpinnings of CT, content, it would seem, for it to be an empirically grounded system of psychological principles and therapeutic methods. Yet even this little ...
This paper supports the thesis that nihilistic interpretations of Madhyamaka philosophy derive from generally antirealistic and/or metaphysical approaches to Nāgārjuna's thought. However, the arguments, and the many images, by way of which the author of the Mūlamadhyamakakārikā and his Indian commentators defend themselves from the charge of nihilism expose the limits of these approaches, and rather confirm that Nāgārjuna's philosophy should be read as a theoretical proposal that is at once realistic and antimetaphysical. The epistemology inherent in the soteriological dimension of the Buddha's teaching, of which Nāgārjuna presents himself as a faithful continuer, involves on the one hand the accomplishment of a 'cognitive revolution', consisting in the achievement of a new vision of reality, and on the other the avoidance of any metaphysical description of that same vision. Comprehension of the Madhyamaka philosophical enterprise through a realistic-antimetaphysical lens seems to hinder and prevent any nihilistic interpretation of Nāgārjuna.
For many years, psychological-clinical research has aimed at studying the Mafia from different viewpoints: the man of honor's inner world, the relational and psychopathological structures of his family matrices, the connections between inner and social worlds, interiorized and social rules. Today, however, a complex phenomenon has come to light, concerning the deep connection between the Mafia and financial crime; for us as researchers it is very interesting and complicated to analyze, because it involves the study of psychological peculiarities specific to these numerous illegalities. The crossbreeding of the various criminalities, which increasingly interact and merge rather than remain isolated, makes this picture even more complex.
In one of his last writings, Life: Experience and Science, Michel Foucault argued that twentieth-century French philosophy could be read as dividing itself into two divergent lines: on the one hand, a philosophical stream which takes individual experience as its point of departure, conceiving it as irreducible to science; on the other, an analysis of knowledge which takes into account the concrete productions of the mind, as found in science and human practices. In order to account for this division, Foucault opposed epistemologists such as Cavaillès and Canguilhem to phenomenologists such as Merleau-Ponty and Sartre, but also, and more particularly, he opposed Poincaré to Bergson. The latter was presented by Foucault as the key figure of the 'philosophy of experience' at the beginning of the twentieth century. Fifteen years later, in his Deleuze and in the Logics of Worlds, Alain Badiou again uses this dual structure in his interpretation of the past hundred years of French thought. He employs a series of oppositional couples: himself and Deleuze, Lautmann and Sartre, and, finally, Brunschvicg and Bergson. On the one hand a 'mathematical Platonism' and on the other a 'philosophy of vital interiority.' This Manichean reading of philosophy, and the strategic use of the figure of Bergson, has itself a long tradition. It was also proposed by Althusser who, following Bachelard, opposed his standpoint to any form of 'empiricism.' Althusser developed his thought from a tradition of Marxist thinkers and ideologists, which included Politzer's and Nizan's critique of bourgeois philosophy and, even before that, neo-Kantians such as the philosophers of the Revue de métaphysique et de morale. The aim of this essay is to deconstruct, and to put into its precise context of production, this series of genealogies which entails the mobilization of Bergsonism and of the name 'Bergson.'
By doing so, I hope to weigh the importance of Bergsonism in twentieth-century French philosophy, in both its 'positive' and its 'negative' aspects. The essay proceeds regressively, taking into account figures such as Althusser, Badiou, Deleuze, Foucault, Canguilhem, Cavaillès, Sartre, Merleau-Ponty, but also Politzer, Brunschvicg and Alain. The conclusion of the essay is an attempt at reading the 'Bergson renaissance' in the light of new discoveries in genetics and the cognitive sciences, and at tying it to the renewal of studies in the history of French philosophy.
Various conceptual approaches to the notion of information can currently be traced in the literature in logic and formal epistemology. A main issue of disagreement is the attribution of truthfulness to informational data, the so-called Veridicality Thesis (VT; Floridi 2005). The notion of Epistemic Constructive Information (ECI; Primiero 2007) is one of those rejecting VT. The present paper develops a formal framework for ECI. It extends the basic approach of Artemov's logic of proofs (Artemov 1994), representing an epistemic logic based on dependent justifications, where the definition of information relies on a strict distinction from factual truth. The definition obtained by comparison with a normal modal logic translates into a constructive logic for "becoming informed": its distinction from the logic of "being informed", which internalizes truthfulness, is essential to a general evaluation of information with respect to truth. The formal disentanglement of these two logics, and the description of the modal version of the former as a weaker embedding into the latter, allows for a proper understanding of the Veridicality Thesis with respect to epistemic states defined in terms of information.
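The contrast between the two modalities can be rendered schematically. This is a sketch of the intended axiomatic difference, with assumed operator names B ("being informed") and C ("becoming informed"), not the paper's exact system:

```latex
% "Being informed" internalizes truthfulness via a T-style axiom:
\[ B_a\varphi \rightarrow \varphi \]
% "Becoming informed" is the weaker constructive modality: it embeds
% into the former but does not validate veridicality:
\[ B_a\varphi \rightarrow C_a\varphi, \qquad \nvdash\ C_a\varphi \rightarrow \varphi \]
```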