The thesis investigates the connection between deconstruction and creativity with regard to three aesthetic fields, namely jazz music, photography, and architecture. The thesis consists of three chapters. Chapter 1 focuses on deconstruction and jazz music. First, the analysis draws a comparison between the linguistic sign and the musical sign in the light of Derrida's analysis of signifier and signified. This supports an investigation of the supplementary character of writing in the specific case of jazz music. Second, the analysis draws an analogy between the deconstructive reading of texts and jazz improvisation to show the relevance that creativity has for both. This is followed by an examination of the similarities between Derrida's notion of différance and the musical figure of syncopation. The analysis is completed by an argument that the jazz event of the jam session is an encounter and creative 'dialogue', with features similar to Derrida's conception of hospitality. Chapter 2 focuses on deconstruction and photography. First, the discussion explores the correlation between truth and photography. It will be argued that deconstruction challenges the logocentric organisation of photographs based on the prominence of what is immediately visible in images and fosters a more creative interpretation, which is based on the play between concealment and unconcealment within photographs. Second, it investigates the implications of Derrida's analysis of temporality for photography. This supports an investigation into how Derrida's notion of responsibility and the future to come can be applied to photography. Chapter 3 focuses on deconstruction and architecture. First, the analysis establishes the links between architecture and language by outlining the creative and transformative outcome that the correlation between function and meaning in the light of deconstruction produces.
Second, the investigation examines Bernard Tschumi's idea of architecture as event in Parc de La Villette. Tschumi's work is an example of how the deconstructive approach adopted by architects fosters creativity in users. Finally, the analysis focuses on the transformative and creative character of portable architecture by investigating the correlation between the creative character of deconstructive concepts such as freeplay, parergon, and the axiom of incompleteness, and the transformative features of tents.
This study draws a comparative framework between the deconstructive reading of texts and jazz standards. It will be argued that both are defined by the constant play of tradition and innovation. On the one hand, there is the repetition of a set of rules and a dominant understanding of texts/tunes, which generates tradition. On the other hand, there are invention and improvisation, which take on that tradition and generate innovation. The act of reading/playing also becomes an act of invention/improvisation that manifests a constant tension between the old that is handed down through writing/recording and the new that is generated by the reader/musician.
This article presents Francesco Patrizi's criticisms of the Aristotelian conception of time in his Physics, that is, Patrizi's critique of the principle that time is infinite in the sense of mathematical infinity. Patrizi's main thesis is that the "possible infinity" of mathematics leads to contradictions when applied to natural substances and to natural science in general.
I want to model a finite, fallible cognitive agent who imagines that p in the sense of mentally representing a scenario—a configuration of objects and properties—correctly described by p. I propose to capture imagination, so understood, via variably strict world quantifiers, in a modal framework including both possible and so-called impossible worlds. The latter secure lack of classical logical closure for the relevant mental states, while the variability of strictness captures how the agent imports information from actuality in the imagined non-actual scenarios. Imagination turns out to be highly hyperintensional, but not logically anarchic. Section 1 sets the stage and impossible worlds are quickly introduced in Sect. 2. Section 3 proposes to model imagination via variably strict world quantifiers. Section 4 introduces the formal semantics. Section 5 argues that imagination has a minimal mereological structure validating some logical inferences. Section 6 deals with how imagination under-determines the represented contents. Section 7 proposes additional constraints on the semantics, validating further inferences. Section 8 describes some welcome invalidities. Section 9 examines the effects of importing false beliefs into the imagined scenarios. Finally, Sect. 10 hints at possible developments of the theory in the direction of two-dimensional semantics.
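The role impossible worlds play in blocking classical logical closure can be illustrated with a small computational sketch. The model below (world names, valuations, and the selection function) is an illustrative assumption, not the paper's own formal semantics: at possible worlds, complex formulas are evaluated recursively; at impossible worlds, truth is stipulated formula by formula, so an agent can imagine p without thereby imagining its classical consequence p-or-q.

```python
# Toy model: possible worlds evaluate formulas recursively;
# impossible worlds assign truth directly, with no logical closure.

# Possible worlds: atom -> truth value.
POSSIBLE = {"w1": {"p": True, "q": True}}

# Impossible world i1: the set of formulas stipulated true there.
# p holds, but the classical consequence (p or q) is NOT in the set.
IMPOSSIBLE = {"i1": {"p"}}

def holds(formula, world):
    """Formulas are atoms (strings) or nested tuples like ("or", "p", "q")."""
    if world in IMPOSSIBLE:
        return formula in IMPOSSIBLE[world]  # no recursion: closure fails here
    vals = POSSIBLE[world]
    if isinstance(formula, str):
        return vals[formula]
    op, *args = formula
    if op == "or":
        return holds(args[0], world) or holds(args[1], world)
    if op == "and":
        return holds(args[0], world) and holds(args[1], world)
    if op == "not":
        return not holds(args[0], world)

# Variably strict operator: the agent imagines B, given antecedent A,
# iff B holds at all worlds selected for A (assumed selection function).
SELECTION = {"p": {"w1", "i1"}}

def imagines(antecedent, consequent):
    return all(holds(consequent, w) for w in SELECTION[antecedent])

print(imagines("p", "p"))               # True: p holds at every selected world
print(imagines("p", ("or", "p", "q")))  # False: fails at impossible world i1
```

Because i1 verifies p but not (p or q), imagining p does not commit the modelled agent to imagining every classical consequence of p, which is the hyperintensionality the abstract describes.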
This book is both an introduction to and a research work on Meinongianism. "Meinongianism" is taken here, in accordance with the common philosophical jargon, as a general label for a set of theories of existence – probably the most basic notion of ontology. As an introduction, the book provides the first comprehensive survey and guide to Meinongianism and non-standard theories of existence in all their main forms. As a research work, the book expounds and develops the most up-to-date Meinongian theory (called modal Meinongianism), applies it to specific fields, and discusses its open problems. Part I of the book provides a historical introduction to, and critical discussion of, the dominant philosophical view of existence: the "Kantian-Fregean-Quinean" perspective. Part II is the full-fledged introduction to the Meinongian theories of existence as a real property of individuals: after starting with the so-called naïve Meinongian conception and its problems, it provides a self-contained presentation of the main neo-Meinongian proposals, and a detailed discussion of their strengths and weaknesses. Part III develops a specific neo-Meinongian theory of existence employing a model-theoretic semantic framework. It discusses its application to the ontology and semantics of fictional objects, and its open problems. The methodology of the book follows the most recent trends in analytic ontology. In particular, the meta-ontological point of view is largely privileged.
Is there a notion of contradiction—let us call it, for dramatic effect, "absolute"—making all contradictions, so understood, unacceptable also for dialetheists? It is argued in this paper that there is, and that spelling it out brings some theoretical benefits. First, it gives us a foothold on undisputed ground in the methodologically difficult debate on dialetheism. Second, we can use it to express, without begging questions, the disagreement between dialetheists and their rivals on the nature of truth. Third, dialetheism has an operator allowing it, against the opinion of many critics, to rule things out and manifest disagreement: for unlike other proposed exclusion-expressing devices (for instance, the entailment of triviality), the operator used to formulate the notion of absolute contradiction appears to be immune both from crippling expressive limitations and from revenge paradoxes—pending a rigorous nontriviality proof for a formal dialetheic theory including it.
We outline a neo-Meinongian framework labeled as Modal Meinongian Metaphysics (MMM) to account for the ontology and semantics of fictional discourse. Several competing accounts of fictional objects originate from the fact that our talk about them mirrors incoherent intuitions: mainstream theories of fiction privilege some such intuitions, but are forced to account for others via complicated paraphrases of the relevant sentences. An ideal theory should resort to as few paraphrases as possible. In Sect. 1, we make this explicit via two methodological principles, called the Minimal Revision and the Acceptability Constraint. In Sect. 2, we introduce the standard distinction between internal and external fictional discourse. In Sects. 3–5, we discuss the approaches of (traditional) Meinongianism, Fictionalism, and Realism—and their main troubles. In Sect. 6 we propose our MMM approach. This is based upon (1) a modal semantics including impossible worlds (Subsect. 6.1); (2) a qualified Comprehension Principle for objects (Subsect. 6.2); (3) a notion of existence-entailment for properties (Subsect. 6.3). In Sect. 7 we present a formal semantics for MMM based upon a representation operator. And in Sect. 8 we have a look at how MMM solves the problems of the three aforementioned theories.
The experimental approach in economics is a driving force behind some of the most exciting developments in the field. The 'experimental revolution' was based on a series of bold philosophical premises which have remained until now mostly unexplored. This book provides the first comprehensive analysis and critical discussion of the methodology of experimental economics, written by a philosopher of science with expertise in the field. It outlines the fundamental principles of experimental inference in order to investigate their power, scope and limitations. The author demonstrates that experimental economists have a lot to gain by discussing openly the philosophical principles that guide their work, and that philosophers of science have a lot to learn from the ingenious techniques devised by experimenters in order to tackle difficult scientific problems.
Manufacturing and industry practices are undergoing an unprecedented revolution as a consequence of the convergence of emerging technologies such as artificial intelligence, robotics, cloud computing, and virtual and augmented reality, among others. This fourth industrial revolution is similarly changing the practices and capabilities of operators in their industrial environments. This paper introduces and explores the notion of the Operator 4.0, as well as how this novel way of conceptualizing the human operator necessarily implicates human values in the technologies that constitute it. The design approach known as value sensitive design (VSD) is used to explore how these Operator 4.0 technologies can be designed for human values. Expert elicitation surveys were used to determine the values of industry stakeholders, and examples of how the VSD methodology can be adopted by engineers in order to design for these values are illustrated. The results provide preliminary adoption strategies that industrial teams can take to design Operator 4.0 technology for human values.
This study considers the contribution of Francesco Patrizi da Cherso to the development of the concepts of void space and an infinite universe. Patrizi plays a greater role in the development of these concepts than any other single figure in the sixteenth century, and yet his work has been almost totally overlooked. I have outlined his views on space in terms of two major aspects of his philosophical attitude: on the one hand, he was a devoted Platonist and sought always to establish Platonism, albeit his own version of it, as the only correct philosophy; and on the other hand, he was more determinedly anti-Aristotelian than any other philosopher at that time. Patrizi's concept of space has its beginnings in Platonic notions, but is extended and refined in the light of a vigorous critique of Aristotle's position. Finally, I consider the influence of Patrizi's ideas in the seventeenth century, when various thinkers are seeking to overthrow the Aristotelian concept of place and the equivalence of dimensionality with corporeality. Pierre Gassendi, for example, needed a coherent concept of void space in which his atoms could move, while Henry More sought to demonstrate the reality of incorporeal entities by reference to an incorporeal space. Both men could find the arguments they needed in Patrizi's comprehensive treatment of the subject.
"There is a principle in things, about which we cannot be deceived, but must always, on the contrary, recognize the truth – viz. that the same thing cannot at one and the same time be and not be": with these words of the Metaphysics, Aristotle introduced the Law of Non-Contradiction, which was to become the most authoritative principle in the history of Western thought. However, things have recently changed, and nowadays various philosophers, called dialetheists, claim that this Law does not hold unrestrictedly – that in peculiar circumstances the same thing may at the same time be and not be, and contradictions may obtain in the world. This book opens with an examination of the famous logical paradoxes that appear to speak on behalf of contradictions (e.g., the Liar paradox, the set-theoretic paradoxes such as Cantor’s and Russell’s), and of the reasons for the failure of the standard attempts to solve them. It provides, then, an introduction to paraconsistent logics – non-classical logics in which the admission of contradictions does not lead to logical chaos – and their astonishing applications, going from inconsistent database management to contradictory arithmetics capable of circumventing Gödel’s celebrated Incompleteness Theorem. The final part of the book discusses the philosophical motivations and difficulties of dialetheism, and shows how to extract from Aristotle’s ancient words a possible reply to the dialetheic challenge. How to Sell a Contradiction will appeal to anyone interested in non-classical logics, analytic metaphysics, and philosophy of mathematics, and especially to those who consider challenging our most entrenched beliefs the main duty of philosophical inquiry.
Strong Reciprocity theorists claim that cooperation in social dilemma games can be sustained by costly punishment mechanisms that eliminate incentives to free ride, even in one-shot and finitely repeated games. There is little doubt that costly punishment raises cooperation in laboratory conditions. Its efficacy in the field, however, is controversial. I distinguish two interpretations of experimental results, and show that the wide interpretation endorsed by Strong Reciprocity theorists is unsupported by ethnographic evidence on decentralised punishment and by historical evidence on common pool institutions. The institutions that spontaneously evolve to solve dilemmas of cooperation typically exploit low-cost mechanisms, turning finite games into indefinitely repeated ones and eliminating the cost of sanctioning.
A logic is called 'paraconsistent' if it rejects the rule called 'ex contradictione quodlibet', according to which any conclusion follows from inconsistent premises. While logicians have proposed many technically developed paraconsistent logical systems, and contemporary philosophers like Graham Priest have advanced the view that some contradictions can be true and advocated a paraconsistent logic to deal with them, until recent times these systems have been little understood by philosophers. This book presents a comprehensive overview of paraconsistent logical systems to change this situation. The book includes almost every major author currently working in the field. The papers are at the cutting edge of the literature: some discuss current debates, while others present important new ideas. The editors have avoided papers about technical details of paraconsistent logic, concentrating instead upon works that discuss more 'big picture' ideas. Different treatments of paradoxes take centre stage in many of the papers, but there are also several papers on how to interpret paraconsistent logic, and some on how it can be applied to the philosophy of mathematics, the philosophy of language, and metaphysics.
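The rejection of ex contradictione quodlibet can be checked mechanically in a small many-valued setting. As a minimal sketch, the code below uses Priest's three-valued Logic of Paradox (LP) as the assumed example system and searches for a valuation where a contradiction holds but an arbitrary conclusion fails:

```python
# Priest's LP: three truth values, with both "true" and "both" designated.
T, B, F = 2, 1, 0          # true, both (true-and-false), false
DESIGNATED = {T, B}        # values that count as "holding"

def neg(v):
    return 2 - v           # negation flips T and F, leaves B fixed

def conj(a, b):
    return min(a, b)       # conjunction takes the lesser value

def entails_explosion():
    """Does p & ~p entail q in every valuation? (Classically: yes.)
    Returns (validity, counterexample)."""
    for p in (T, B, F):
        for q in (T, B, F):
            premise = conj(p, neg(p))
            if premise in DESIGNATED and q not in DESIGNATED:
                return False, (p, q)   # designated premise, undesignated conclusion
    return True, None

valid, counterexample = entails_explosion()
print(valid)           # False: explosion is invalid in LP
print(counterexample)  # (1, 0): p takes the value 'both', q is plain false
```

When p is assigned the glutty value B, the contradiction p & ~p is itself B, hence designated, while q can still be F; so inconsistent premises no longer entail everything, which is exactly the paraconsistency property the opening sentence defines.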
Current debates in social ontology are dominated by approaches that view institutions either as rules or as equilibria of strategic games. We argue that these two approaches can be unified within an encompassing theory based on the notion of correlated equilibrium. We show that in a correlated equilibrium each player follows a regulative rule of the form ‘if X then do Y’. We then criticize Searle's claim that constitutive rules of the form ‘X counts as Y in C’ are fundamental building blocks for institutions, showing that such rules can be derived from regulative rules by introducing new institutional terms. Institutional terms are introduced for economy of thought, but are not necessary for the creation of social reality.
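The claim that in a correlated equilibrium each player follows a rule of the form 'if X then do Y' can be given a small computational illustration. The game, payoffs, and signalling device below are illustrative assumptions (a standard Chicken-style example), not taken from the paper: a device recommends an action to each player, and we verify that obeying the recommendation is a best response given what the recommendation reveals about the other player.

```python
# Game of Chicken: profiles (row_action, col_action) -> (row_payoff, col_payoff).
# Payoffs are an assumed standard example.
PAYOFFS = {
    ("D", "D"): (0, 0),
    ("D", "C"): (7, 2),
    ("C", "D"): (2, 7),
    ("C", "C"): (6, 6),
}
ACTIONS = ["D", "C"]

# The signalling device draws a profile and privately tells each player
# their own component: the regulative rule "if told X, then do X".
DEVICE = {("D", "C"): 1/3, ("C", "D"): 1/3, ("C", "C"): 1/3}

def is_correlated_equilibrium(device, payoffs, actions):
    """Check that no player gains by deviating from any recommendation,
    conditional on the information that recommendation carries."""
    for player in (0, 1):
        for rec in actions:
            # Profiles in which this player is recommended `rec`.
            relevant = {p: q for p, q in device.items() if p[player] == rec}
            total = sum(relevant.values())
            if total == 0:
                continue
            def expected(act):
                s = 0.0
                for profile, prob in relevant.items():
                    deviated = list(profile)
                    deviated[player] = act
                    s += (prob / total) * payoffs[tuple(deviated)][player]
                return s
            obey = expected(rec)
            if any(expected(a) > obey + 1e-9 for a in actions):
                return False
    return True

print(is_correlated_equilibrium(DEVICE, PAYOFFS, ACTIONS))  # True
```

Following the rule is self-enforcing here even though the device's distribution is not a product of independent strategies, which is what lets correlated equilibrium subsume both the rule view and the equilibrium view of institutions.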
Recent debates on the nature of preferences in economics have typically assumed that they are to be interpreted either as behavioural regularities or as mental states. In this paper I challenge this dichotomy and argue that neither interpretation is consistent with scientific practice in choice theory and behavioural economics. Preferences are belief-dependent dispositions with a multiply realizable causal basis, which explains why economists are reluctant to make a commitment about their interpretation.
Philosophical dialetheism, whose main exponent is Graham Priest, claims that some contradictions hold, that they are true, and that it is rational to accept and assert them. Such a position is naturally portrayed as a challenge to the Law of Non-Contradiction (LNC). But all the classic formulations of the LNC are, in a sense, not questioned by a typical dialetheist, since she is (cheerfully) required to accept them by her own theory. The goal of this paper is to develop a formulation of the Law which appears to be unquestionable, in the sense that the Priestian dialetheist is committed to accepting it without also accepting something inconsistent with it, on pain of trivialism—that is to say, on pain of lapsing into the position according to which everything is the case. This will be achieved via (a) a discussion of Priest's dialetheic treatment of the notions of rejection and denial; and (b) the characterization of a negation via the primitive intuition of content exclusion. Such a result will not constitute a cheap victory for the friends of consistency. We may just learn that different things have been historically conflated under the label of 'Law of Non-Contradiction'; that dialetheists rightly attack some formulations of the Law, and that orthodox logicians and philosophers have been mistaken in assimilating them to the indisputable one.
Language evolution, understood as an open problem in the evolutionary research programme, will be analyzed here from the theoretical perspective advanced by the supporters of the Extended Evolutionary Synthesis. Four factors and two associated concepts will be matched with a selection of critical examples concerning the evolution of the genus Homo that are relevant for the evolution of language, such as the evolution of hominin life-history traits, the enlargement of the social group, increased cooperation among individuals, behavioral change and innovations, and heterochronic modifications leading to increased synaptic plasticity. A particular form of niche construction will be considered in a multilevel framework. It will be argued that the four factors mentioned above prove to be fundamental explanatory tools for understanding how language might have emerged as a result of gene-culture coevolutionary dynamics.
The fitting attitude analysis of value states that for objects to have value is for them to be the fitting targets of attitudes. Good objects are the fitting targets of positive attitudes, while bad objects are the fitting targets of negative attitudes. The following paper presents an argument to the effect that value and the fittingness of attitudes differ in terms of their explanations. Whereas the fittingness of attitudes is explained, inter alia, by both the properties of attitudes and those of their fitting targets, the explanation of value tends to have a different content. In particular, objects have value in virtue of the features that make them valuable, and these need not involve any attitudinal properties. If this is right, then there are reasons to doubt the claim that for objects to have value is just for them to be the fitting targets of attitudes. Insofar as value is a property, it appears to be distinct from the property of objects being the fitting targets of attitudes.
An interpretation of Wittgenstein’s much criticized remarks on Gödel’s First Incompleteness Theorem is provided in the light of paraconsistent arithmetic: in taking Gödel’s proof as a paradoxical derivation, Wittgenstein was drawing the consequences of his deliberate rejection of the standard distinction between theory and metatheory. The reasoning behind the proof of the truth of the Gödel sentence is then performed within the formal system itself, which turns out to be inconsistent. It is shown that the features of paraconsistent arithmetics fit with some of the intuitions underlying Wittgenstein’s philosophy of mathematics, such as its strict finitism and the insistence on the decidability of any mathematical question.
While corporate social responsibility (CSR) is becoming a mainstream issue for many organizations, most of the research to date addresses CSR in large businesses rather than in small- and medium-sized enterprises (SMEs), because it is too often considered a prerogative of large businesses only. The role of SMEs in an increasingly dynamic context is now being questioned, including what factors might affect their socially responsible behaviour. The goal of this paper is to compare SME and large-firm CSR strategies. Furthermore, the size of the firm is analyzed as a factor that influences specific choices in the CSR field, and studied by means of a sample of 3,680 Italian firms. Based on a multi-stakeholder framework, the analysis provides evidence that large firms are more likely to identify relevant stakeholders and meet their requirements through specific and formal CSR strategies.
This article investigates the effects of perceived supervisor support on ethical and unethical employee behavior using a multi-method approach. Specifically, we test the mediating mechanism and a boundary condition that moderates the relationship between support and ethical employee behaviors. We find that supervisor-based self-esteem fully mediates the relationship between supervisor support and ethical employee behavior, and that employee task satisfaction intensifies the relationship between supervisor support and supervisor-based self-esteem.
We address an argument by Floridi (Synthese 168(1):151–178, 2009; 2011a), to the effect that digital and analogue are not features of reality, only of modes of presentation of reality. One can therefore have an informational ontology, like Floridi’s Informational Structural Realism, without commitment to a supposedly digital or analogue world. After introducing the topic in Sect. 1, in Sect. 2 we explain what the proposition expressed by the title of our paper means. In Sect. 3, we describe Floridi’s argument. In the following three sections, we raise three difficulties for it, (i) an objection from intuitions: Floridi’s view is not supported by the intuitions embedded in the scientific views he exploits (Sect. 4); (ii) an objection from mereology: the view is incompatible with the world’s having parts (Sect. 5); (iii) an objection from counting: the view entails that the question of how many things there are doesn’t make sense (Sect. 6). In Sect. 7, we outline two possible ways out for Floridi’s position. Such ways out involve tampering with the logical properties of identity, and this may be bothersome enough. Thus, Floridi’s modus ponens will be our (and most ontologists’) modus tollens.
Berto’s highly readable and lucid guide introduces students and the interested reader to Gödel’s celebrated Incompleteness Theorem, and discusses some of the most famous - and infamous - claims arising from Gödel's arguments. It offers a clear understanding of this difficult subject by presenting each of the key steps of the Theorem in separate chapters, discusses interpretations of the Theorem made by celebrated contemporary thinkers, sheds light on the wider extra-mathematical and philosophical implications of Gödel’s theories, and is written in an accessible, non-technical style.
On the one hand, after Matteo d'Acquasparta's distinction between the three types of eternity and the temporal necessity of the past, Meyronnes radicalized Scotus's dynamic vision of duration, conceiving modality as a relation of implication between predicate and existing subject, and time as a relationship between Creator and creature. On the other hand, after Ockham denied the real simultaneity of opposed potencies, the Ockhamist extension of temporal necessity to the present was denied by Gregory of Rimini, who was favourable, together with Wodeham, to the mutability of the past in a divided sense. Mirecourt, well versed in the English subtleties, appears to follow Gregory and tries to find a solution to the interaction between the two contingencies, from top to bottom, which had been formalized by Gregory: if I, performing or not performing X, can act as if God, as the supreme intellect from eternity, could have known or not known X to come; and if God as agent, absolutely willing, omnipotent, and unimpedible from eternity, can act as if X happened or did not happen; then can I act as if X, which is from eternity, did not happen from eternity?
Corporate social responsibility (CSR) has acquired an unquestionably high degree of relevance for a large number of different actors. Among others, academics and practitioners are developing a wide range of knowledge and best practices to further improve socially responsible competences. Within this context, one frequent question is which theory should guide the development of general knowledge of CSR, and in particular of the relationship between CSR and small and medium-sized enterprises (SMEs). This paper suggests that research on large firms should be based on stakeholder theory, while research on CSR among SMEs should be based on the concept of social capital. The paper first provides a theoretical and practical perspective on CSR today; the focus then shifts to the specific literature on CSR and SMEs; some data and information follow on SMEs in Europe and Italy; finally, some conclusions and questions for future research are suggested.
Experimental “localism” stresses the importance of context‐specific knowledge, and the limitations of universal theories in science. I illustrate Latour's radical approach to localism and show that it has some unpalatable consequences, in particular the suggestion that problems of external validity (or how to generalize experimental results to nonlaboratory circumstances) cannot be solved. In the last part of the paper I try to sketch a solution to the problem of external validity by extending Mayo's error‐probabilistic approach.
European Journal of Political Theory, Ahead of Print. Starting from the ‘Dewey Lectures’, Rawls presents his conception of justice within a contextualist framework, as an elaboration of the basic ideas embedded in the political culture of liberal-democratic societies. But how are these basic ideas to be justified? In this article, I reconstruct and criticize Rawls’s strategy to answer this question. I explore an alternative strategy, consisting of a genealogical argument of a pragmatic kind – the kind of argument provided by authors like Bernard Williams, Edward Craig and Miranda Fricker. I outline this genealogical argument drawing on Rawls’s reconstruction of the origins of liberalism. Then, I clarify the conditions under which this kind of argument maintains vindicatory power. I claim that the argument satisfies these conditions and that pragmatic genealogy can thus partially vindicate the basic ideas of liberal-democratic societies.
The current analytical debate on time is full of attempts to adjudicate from a purely theoretical standpoint among competing temporal ontologies. Little attention has instead been devoted to the existential attitudes -- emotional or ethical -- that may lurk behind, or ensue from, the endorsement of one of them. Some interesting opinions have however been voiced regarding the two most prominent views in the arena, namely eternalism and presentism; it has been said that the former is nourished by a fear of death, or more generally by a desire of preservation for whatever we find precious and valuable, and that the latter is fuelled by a propensity to reap whatever fruits the present brings, as enshrined in the carpe diem motto. This paper explores such a territory by focusing on the reality of past sentience, whether joyful or painful, and on the open future. The first part contrasts the reality of past sentience that comes with eternalism with the denial of this reality that follows from presentism, and argues that from an emotional, or perhaps even moral, standpoint the latter is preferable to the former. The second part clarifies why the eternalist must renounce the open future, whereas presentism is consistent with it, and considers how its rejection or acceptance, as the case may be, could be emotionally, or even morally, significant for our conception of ourselves as free agents. The conclusion offers a tentative proposal regarding which temporal ontology is superior from an existential perspective and some ruminations on the impact that all this may have on the theoretical side of the issue.
This paper examines the view held by Francesco Piccolomini (1523-1607) on the relation between prime matter and extension. In his discussion of prime matter in the Libri ad scientiam de natura attinentes Piccolomini develops a theory of prime matter that incorporates crucial elements of the viewpoint adhered to by the Neoplatonist Simplicius. The originality of Piccolomini’s undertaking is highlighted by contrasting it with the ideas found in Jacopo Zabarella’s De rebus naturalibus. The case of Piccolomini shows that, in order to classify early modern metaphysical theories of prime matter, the category ‘prime matter as sheer dimensionality’ is indispensable.
The COVID-19 pandemic has placed an enormous burden on health systems, and guidelines have been developed to help healthcare practitioners when resource shortages impose choices about whom to treat. However, little is known about the public perception of these guidelines and the underlying moral principles. Here, we assess, in a sample of 1033 American citizens, moral views and agreement with the proposed guidelines. We find substantial heterogeneity in citizens’ moral principles, often not in line with the guidelines’ recommendations. As the guidelines are likely to directly affect a considerable number of citizens, our results call for policy interventions to inform people about the ethical rationale behind physicians’ or triage committees’ decisions, so as to avoid resentment and feelings of unfairness.
Thaler and Sunstein justify nudge policies from welfaristic premises: nudges are acceptable because they benefit the individuals who are nudged. A tacit assumption behind this strategy is that we can identify the true preferences of decision-makers. We argue that this assumption is often unwarranted, and that as a consequence nudge policies must be justified in a different way. A possible strategy is to abandon welfarism and endorse genuine paternalism. Another is to argue that the decision biases that choice architects attempt to eliminate create externalities. For example, in the case of intertemporal discounting, the costs of preference reversals are not always paid by the discounters, because they are transferred onto other individuals. But if this is the case, then nudges are best justified from a political rather than a welfaristic standpoint.
According to a well-established interpretive line, the Benthamic judge would be allowed no room for autonomous calculations of utility, and his or her task would only be that of mechanically applying substantive law, which expresses the legislator's will. For Gerald Postema, in contrast, Bentham's judge would be granted ample power to decide cases by directly applying the principle of utility. This article criticizes both views, by showing that mechanical adjudication was for Bentham utterly impossible, although this does not mean that judges should be free to decide according to direct utility calculations. Judges must be the tutors of the citizens’ expectations, which, under a system of statute law, will focus on the code. However, they can avoid suboptimality in cases where applying a general rule would not maximize utility, without preponderant damage to law-induced expectations: Bentham's suggestion is that they do so by proposing amendments of the code to the legislature.
The answer, in a nutshell, is: yes, five years ago, but nobody has noticed. Nobody noticed because the majority of social scientists subscribe to one of the following views: (1) the ‘anomalous’ behaviour observed in standard prisoner’s dilemma or ultimatum game experiments refuted standard game theory a long time ago; (2) game theory is flexible enough to accommodate any observed choices by ‘refining’ players’ preferences; or (3) it is just a piece of pure mathematics (a tautology). None of these views is correct. This paper defends the view that game theory as commonly understood is not a tautology, that it suffers from important (albeit very recently discovered) empirical anomalies, and that it is not flexible enough to accommodate all the anomalies in its theoretical framework. It also discusses the experiments that finally refuted game theory, and concludes by trying to explain why it took so long for experimental game theorists to design experiments that could adequately test the theory.