As the world watches the current crisis in Kosovo unfold through intensive daily media coverage, particularly by major networks in the US and Europe, one can only wonder why the same attention is not given to the crises in Africa. The military intervention by NATO allied forces, including the United States, to avert Milosevic's genocidal campaign against the Kosovo Albanians can only be characterized as an exclusive European mission to resolve Europe's problem. This is not, by any means, to subvert or undermine the suffering of ethnic Albanians in Kosovo, but to give some perspective as to how the superpowers are responding to other similar crises in the world. Today, any crime against humanity, in any part of the world, regardless of this nation's political or economic interest, should not be tolerated. Most Americans, perhaps most of the world, do not know about the mass genocide that took place in Rwanda and the civil war in Southern Sudan that has left more than a million and a half people dead. And now we can add to this list the recent border conflict between Ethiopia and Eritrea that has erupted into a full-scale war. The most recent conflict has already claimed thousands of lives while world superpowers are watching silently.
Background: The U.S. Food and Drug Administration traditionally has kept confidential significant amounts of information relevant to the approval or non-approval of specific drugs, devices, and biologics and about the regulatory status of such medical products in FDA's pipeline.
Objective: To develop practical recommendations for FDA to improve its transparency to the public that FDA could implement by rulemaking or other regulatory processes without further congressional authorization. These recommendations would build on the work of FDA's Transparency Task Force in 2010.
Methods: In 2016-2017, we convened a team of academic faculty from Harvard Medical School, Brigham and Women's Hospital, Yale Medical School, Yale Law School, and Johns Hopkins Bloomberg School of Public Health to develop recommendations through an iterative process of reviewing FDA's practices, considering the legal and policy constraints on FDA in expanding transparency, and obtaining insights from independent observers of FDA.
Results: The team developed 18 specific recommendations for improving FDA's transparency to the public. FDA could adopt all these recommendations without further congressional action.
Funding: The development of the Blueprint for Transparency at the U.S. Food and Drug Administration was funded by the Laura and John Arnold Foundation.
The authors consider reflexive games that describe the interaction of subjects making decisions based on an awareness structure, i.e., a hierarchy of beliefs about essential parameters, beliefs about beliefs, and so on. It is shown that the language of graphs of reflexive games provides a convenient uniform method for describing reflexion effects in belles-lettres.
The Mathematical Intelligencer recently published a note by Y. Sergeyev that challenges both mathematics and intelligence. We examine Sergeyev's claims concerning his purported Infinity computer. We compare his grossone system with the classical Levi-Civita fields and with the hyperreal framework of A. Robinson, and analyze the related algorithmic issues inevitably arising in any genuine computer implementation. We show that Sergeyev's grossone system is unnecessary and vague, and that whatever consistent subsystem could be salvaged is subsumed entirely within a stronger and clearer system. Lou Kauffman, who published an article on a grossone, places it squarely outside the historical panorama of ideas dealing with infinity and infinitesimals.
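A textbook sketch of the Levi-Civita field (standard background, not the paper's own notation) makes the comparison concrete: elements are formal series over a fixed positive infinitesimal, and an infinite unit of the grossone kind is already present as its reciprocal.

```latex
% Levi-Civita field: formal series over a positive infinitesimal d,
% with left-finite rational support (textbook background, not the
% paper's notation).
\[
x \;=\; \sum_{q \in Q} a_q \, d^{\,q}, \qquad a_q \in \mathbb{R}, \quad
Q \subset \mathbb{Q} \ \text{left-finite}.
\]
% d^{-1} is infinite (d^{-1} > n for every natural number n), so a
% grossone-style "infinite unit" is recovered, and ordinary series
% arithmetic applies, e.g. (3d^{-1} + 2) \cdot d = 3 + 2d.
```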
Significant associations have been found between specific human leukocyte antigen (HLA) alleles and organ transplant rejection, autoimmune disease development, and the response to infection. Traditional searches for disease associations have conventionally measured risk associated with the presence of individual HLA alleles. However, given the high level of HLA polymorphism, the pattern of amino acid variability, and the fact that most of the HLA variation occurs at functionally important sites, it may be that a combination of variable amino acid sites shared by several alleles (shared epitopes) is a better descriptor of the actual causative genetic variants. Here we describe a novel approach to genetic association analysis in which genes/proteins are broken down into smaller sequence features and then variant types defined for each feature, allowing for independent analysis of disease association with each sequence feature variant type. We have used this approach to analyze a cohort of systemic sclerosis patients and show that a sequence feature composed of specific amino acid residues in peptide binding pockets 4 and 7 of HLA-DRB1 explains much of the molecular determinant of risk for systemic sclerosis.
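A minimal sketch of this style of analysis, assuming hypothetical alleles, residue positions, and a simple case/control design (none of these are the study's data or code):

```python
# Sequence-feature association sketch: group alleles by the residues they
# carry at a feature's positions, then test case/control association.
# All alleles, positions, and sequences here are hypothetical placeholders.
from scipy.stats import fisher_exact

FEATURE_POSITIONS = (13, 70, 71, 74)    # informational: hypothetical residue sites
ALLELE_FEATURE = {                      # allele -> residues at those sites
    "DRB1*01:01": "FQRA",
    "DRB1*03:01": "SKKR",
    "DRB1*11:01": "SDRA",
}

def variant_type(allele: str) -> str:
    """Map an allele to its sequence-feature variant type."""
    return ALLELE_FEATURE[allele]

def test_feature(cases: list[str], controls: list[str], vt: str):
    """2x2 Fisher exact test for carriage of variant type vt."""
    a = sum(variant_type(x) == vt for x in cases)        # cases carrying vt
    b = len(cases) - a                                   # cases not carrying vt
    c = sum(variant_type(x) == vt for x in controls)     # controls carrying vt
    d = len(controls) - c                                # controls not carrying vt
    odds_ratio, p_value = fisher_exact([[a, b], [c, d]])
    return odds_ratio, p_value

# Example: is the 'SDRA' variant type enriched among (hypothetical) cases?
print(test_feature(["DRB1*11:01", "DRB1*11:01", "DRB1*01:01"],
                   ["DRB1*01:01", "DRB1*03:01"], "SDRA"))
```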
Throughout the biological and biomedical sciences there is a growing need for prescriptive 'minimum information' (MI) checklists specifying the key information to include when reporting experimental results, and such checklists are beginning to find favor with experimentalists, analysts, publishers and funders alike. These checklists aim to ensure that methods, data, analyses and results are described to a level sufficient to support the unambiguous interpretation, sophisticated search, reanalysis and experimental corroboration and reuse of data sets, facilitating the extraction of maximum value from them. However, MI checklists are usually developed independently by groups working within particular biologically- or technologically-delineated domains. Consequently, an overview of the full range of checklists can be difficult to establish without intensive searching, and even tracking the evolution of a single checklist may be a non-trivial exercise. Checklists are also inevitably partially redundant when measured one against another, and determining where they overlap is far from straightforward. Furthermore, conflicts in scope and arbitrary decisions on wording and sub-structuring inhibit their use in combination. Overall, these issues present significant difficulties for the users of checklists, especially those in areas such as systems biology, who routinely combine information from multiple biological domains and technology platforms. To address all of the above, we present MIBBI (Minimum Information for Biological and Biomedical Investigations): a web-based communal resource for such checklists, designed to act as a 'one-stop shop' for those exploring the range of extant checklist projects, to foster collaborative, integrative development, and ultimately to promote the gradual integration of checklists.
This new edition of Alexander Miller's highly readable introduction to contemporary metaethics provides a critical overview of the main arguments and themes in twentieth- and twenty-first-century metaethics. Miller traces the development of contemporary debates in metaethics from their beginnings in the work of G. E. Moore up to the most recent arguments between naturalism and non-naturalism, cognitivism and non-cognitivism. From Moore's attack on ethical naturalism, A. J. Ayer's emotivism and Simon Blackburn's quasi-realism to anti-realist and best opinion accounts of moral truth and the non-reductionist naturalism of the 'Cornell realists', this book addresses all the key theories and ideas in this field. As well as revisiting the whole terrain with revised and updated guides to further reading, Miller also introduces major new sections on the revolutionary fictionalism of Richard Joyce and the hermeneutic fictionalism of Mark Kalderon. The new edition will continue to be essential reading for students, teachers and professional philosophers with an interest in contemporary metaethics.
The Cell Ontology (CL) is designed to provide a standardized representation of cell types for data annotation. Currently, the CL employs multiple is_a relations, defining cell types in terms of histological, functional, and lineage properties, and the majority of definitions are written with sufficient generality to hold across multiple species. This approach limits the CL's utility for cross-species data integration. To address this problem, we developed a method for the ontological representation of cells and applied this method to develop a dendritic cell ontology (DC-CL). DC-CL subtypes are delineated on the basis of surface protein expression, systematically including both species-general and species-specific types and optimizing DC-CL for the analysis of flow cytometry data. This approach brings benefits in the form of increased accuracy, support for reasoning, and interoperability with other ontology resources.

Barry Smith, "Toward a Realistic Science of Environments", Ecological Psychology, 21(2), April-June 2009, 121-130. Abstract: The perceptual psychologist J. J. Gibson embraces a radically externalistic view of mind and action. We have, for Gibson, not a Cartesian mind or soul, with its interior theater of contents and the consequent problem of explaining how this mind or soul and its psychological environment can succeed in grasping physical objects external to itself. Rather, we have a perceiving, acting organism, whose perceptions and actions are always already tuned to the parts and moments, the things and surfaces, of its external environment. We describe how on this basis Gibson sought to develop a realist science of environments which will be 'consistent with physics, mechanics, optics, acoustics, and chemistry'.
The familiar Vendler-Kenny scheme of verb-types, viz., performances (further differentiated by Vendler into accomplishments and achievements), activities, and states, is too narrow in two important respects. First, it is narrow linguistically. It fails to take into account the phenomenon of verb aspect. The trichotomy is not one of verbs as lexical types but of predications. Second, the trichotomy is narrow ontologically. It is a specification in the context of human agency of the more fundamental, topic-neutral trichotomy event-process-state. The central component in this ontological trichotomy, event, can be sharply differentiated from its two flanking components by adapting a suggestion by Geoffrey N. Leech and others that the contrast between perfective and imperfective aspect in verbs corresponds to the count/mass distinction in the domain of nouns. With the help of two distinctions, of cardinal count adverbials versus frequency adverbials, and of occurrence versus associated occasion, two interrelated criteria for event predication are developed. Accordingly, 'Mary capsized the boat' is an event predication because (a) it is equivalent to 'There was at least one capsizing of the boat by Mary', or (b) it admits cardinal count adverbials, e.g., at least once, twice, three times. Ontologically speaking, events are defined as those occurrences that are inherently countable.
I offer examples showing that, pace G. E. Moore, it is possible to assert 'Q and I don't believe that Q' sincerely, truly, and without any absurdity. The examples also refute the following principles: (a) justification to assert p entails justification to assert that one believes p (Gareth Evans); (b) the sincerity condition on assertion is that one believes what one says (John Searle); and (c) to assert (to someone) something that one believes to be false is to lie (Don Fallis).
Many experiential properties are naturally understood as dispositions such that e.g. a cake tastes good to you iff you are disposed to get gustatory pleasure when you eat it. Such dispositional analyses, however, face a challenge. It has been widely observed that one cannot properly assert “The cake tastes good to me” unless one has tried it. This acquaintance requirement is puzzling on the dispositional account because it should be possible to be disposed to like the cake even if this disposition has never been manifested. We argue that familiar response strategies on behalf of the dispositionalist fail. These include appeals to conversational implicatures, expressivism, semantic presuppositions and norms of assertion. Against this background, we propose a new analysis in terms of what we call tendencies, where a tendency is a disposition that has been manifested. The acquaintance requirement comes out as an entailment. We point out a hitherto unnoticed parallel to sentences ascribing character traits such as “Hannah is brave,” and extend our tendency-based analysis to this domain.
In this paper we study generic complexity of undecidable problems. It turns out that some classical undecidable problems are, in fact, strongly undecidable, i.e., they are undecidable on every strongly generic subset of inputs. For instance, the classical Halting Problem is strongly undecidable. Moreover, we prove an analog of the Rice theorem for strongly undecidable problems, which provides plenty of examples of strongly undecidable problems. Then we show that there are natural super-undecidable problems, i.e., problems which are undecidable on every generic (not only strongly generic) subset of inputs. In particular, there are finitely presented semigroups with super-undecidable word problem. To construct strongly- and super-undecidable problems we introduce a method of generic amplification (an analog of the amplification in complexity theory). Finally, we construct absolutely undecidable problems, which stay undecidable on every non-negligible set of inputs. Their construction rests on generic immune sets.
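The generic-case notions used here have standard definitions, glossed below (my rendering, not quoted from the paper): a set of inputs is generic when its asymptotic density is 1.

```latex
% Standard generic-case definitions (editorial gloss). For S \subseteq \Sigma^*:
\[
\rho_n(S) \;=\; \frac{\bigl|\{\, w \in S : |w| \le n \,\}\bigr|}
                     {\bigl|\{\, w \in \Sigma^* : |w| \le n \,\}\bigr|},
\qquad
\rho(S) \;=\; \lim_{n \to \infty} \rho_n(S).
\]
% S is generic if \rho(S) = 1 and negligible if \rho(S) = 0; it is
% strongly generic if \rho_n(S) \to 1 exponentially fast. A problem is
% then strongly undecidable if it remains undecidable restricted to every
% strongly generic input set, and absolutely undecidable if it remains
% undecidable on every non-negligible input set.
```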
Those who hold that all fundamental sparse properties have dispositional essences face a problem with structural (e.g. geometrical) properties. In this paper I consider a further route for the dispositional monist that is enabled by the requirement that physical theories should be background-free. If this requirement is respected then we can see how spatial displacement can be a causally active relation and hence may be understood dispositionally.
The usual, comparative, conception of inference to the best explanation (IBE) takes it to be ampliative. In this paper I propose a conception of IBE ('Holmesian inference') that takes it to be a species of eliminative induction and hence not ampliative. This avoids several problems for comparative IBE (for example, how could it be reliable enough to generate knowledge?). My account of Holmesian inference raises the suspicion that it could never be applied, on the grounds that scientific hypotheses are inevitably underdetermined by the evidence (i.e., that any inference to them is inevitably ampliative). I argue that this concern may be resisted by acknowledging, as Timothy Williamson has shown, that all knowledge is evidence. The latter suggests an approach to resisting scepticism different from those (e.g. the reliabilist approach) that embrace fallibilism.
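Schematically, the eliminative shape of Holmesian inference can be displayed as follows (my rendering of the standard eliminative-induction pattern, not the author's own formulation):

```latex
% Schematic eliminative ('Holmesian') inference, editorial rendering.
\[
\underbrace{h_1 \lor h_2 \lor \dots \lor h_n}_{\text{known exhaustive alternatives}},
\qquad
\underbrace{\neg h_2,\; \neg h_3,\; \dots,\; \neg h_n}_{\text{eliminated by evidence}}
\;\;\vdash\;\; h_1 .
\]
% Given the premises, the step to h_1 is deductive, hence non-ampliative;
% the epistemic work lies in knowing that the disjunction is exhaustive,
% which is where the evidence-as-knowledge thesis is invoked.
```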
G.E. Moore, more than either Bertrand Russell or Ludwig Wittgenstein, was chiefly responsible for the rise of the analytic method in twentieth-century philosophy. This selection of his writings shows Moore at his very best. The classic essays are crucial to major philosophical debates that still resonate today. Amongst those included are:
* A Defense of Common Sense
* Certainty
* Sense-Data
* External and Internal Relations
* Hume's Theory Explained
* Is Existence a Predicate?
* Proof of an External World
In addition, this collection also contains the key early papers in which Moore signals his break with idealism, and three important previously unpublished papers from his later work which illustrate his relationship with Wittgenstein.
Science, and with it our understanding of evolutionary processes, is itself undergoing evolution. The evolutionary framework still most frequently used by the general public to describe and guide processes of societal development is erroneously grounded in Darwinian perspectives or, at the very least, draws facile analogies from biological evolution. The present inquiry incorporates fresh insights on the general systemic nature of developmental dynamics from the most recent advances in the transdisciplinary realm of the sciences of complexity (e.g., general evolution theory, cybernetics, information and communication theory, chaos theory, dynamical systems theory, and nonequilibrium thermodynamics). The description of the evolutionary trajectory of complex dynamical systems as irreversible, periodically chaotic, and strongly nonlinear agrees with certain features of the historical processes of societal development. But there are additional features of the evolutionary dynamic of natural systems that are seldom portrayed as part of human developmental deportment. These features include elements such as the convergence of existing systems at progressively higher levels of organization, the increasingly efficient utilization of environmental energy, and the complexification of system structures in states that are progressively further removed from chemical and thermodynamic equilibria. The sciences of complexity offer insight into the laws and dynamics that govern the evolution of complex systems across a variety of disciplinary areas of investigation. Through a study of the isomorphisms across disciplinary constructs in the theoretical analyses of the principles governing the evolution of human societies, it is possible to enrich the account of developmental dynamics at the socio-civilizational level. Such an account would further our understanding of the phenomenon of societal development and provide the means for the purposeful guidance of this phenomenon in accordance with general evolutionary principles. This article sets forth the type of considerations, and outlines a general research agenda, for inquiry toward an operational model of the evolutionary development of social systems.
While much has been written on specificity (e.g., in texts on new institutional economics, agency theory, and team production theory), there are still some insights to be learnt by business ethicists. This article approaches the issue from the perspective of team production, and will propose a new form of corporate governance: enlightened corporate governance, which takes into consideration the specific investments of employees. The article argues that, in addition to shareholders, employees also bear a residual risk which arises due to their specific investments. This residual risk presents a valid and legitimate basis for residual claims. In this way, employees can be seen as residual claimants due to the fact that their income depends upon a hazardous quasi-rent. Therefore, this article will call on the fiduciary duty of board members to protect those employees who are exposed to such residual risks and may thus be vulnerable as a result. This leads to a fundamental change of perspective on the “theory of the firm” – a change which will adopt the theories of new institutional economics, agency theory, and team production theory in order to promote business ethics research. Against this background, enlightened corporate governance aims to follow the criterion of specific investments as a legitimate basis for residual claims. Furthermore, it seeks to understand the consequences for board members, and to promote the sharing of control and ownership. The article will close with some discussion of the implications and future prospects for business ethics.
The following statement is a report of the Committee on Philosophy in Education of the American Philosophical Association and was approved by the Association's Board of Officers in September, 1959. The Committee was composed of the following: C. W. Hendel, Chairman, H. G. Alexander, R. M. Chisholm, Max Fisch, Lucius Garvin, Douglas Morgan, A. E. Murphy, Charner Perry, and R. G. Turnbull. Primary responsibility for the preparation of this report belonged to a subcommittee composed of Roderick M. Chisholm, Chairman, H. G. Alexander, Lewis Hahn, Paul C. Hayner, and Charles W. Hendel.
Hackles have been raised in biosemiotic circles by T. L. Short’s assertion that semiosis, as defined by Peirce, entails “acting for purposes” and therefore is not found below the level of the organism (2007a:174–177). This paper examines Short’s teleology and theory of purposeful behavior and offers a remedy to the disagreement. Remediation becomes possible when the issue is reframed in the terms of the complexity sciences, which allows intentionality to be understood as the interplay between local and global aspects of a system within a system. What is called “acting for purposes” is not itself a type of behavior so much as a relationship between a dynamic system that “exists for a purpose” and its microprocesses that “serve purposes.” The “intentional object” of philosophy is recast here as the holistic self-organized dynamics of a system, which exists for the purpose of self-maintenance, and that constrains the parts’ behaviors, which serve the purpose of forming the system. (A “system” can be any emergent, e.g. an abiotic form, an adapted species, a self, a conditioned response, thought, or a set of ideas.) The self-organized whole, which is represented to the parts in their own constrained behaviors, assumes the guiding function so long attributed to the mysterious “intentional object.” If emergent self-causation is not disallowed, creative originality, as well as directionality, becomes part of the definition of purposeful behavior. Thus, key tools used here, required for understanding emergence, come from poetics rather than semiotics. In the microprocesses of self-organization, I find what I call “accidental” indices and icons — which are poetic in the sense that they involve mere metonymic contiguity and metaphoric similarity — and which are preferentially selected under constrained conditions allowing radically new connections to habituate into an “intentional” self-organized system that, not coincidentally, has some of the emergent characteristics of a conventional symbolic system.
In C. S. Peirce, as well as in the work of many biosemioticians, the semiotic object is sometimes described as a physical “object” with material properties and sometimes described as an “ideal object” or mental representation. I argue that to the extent that we can avoid these types of characterizations we will have a more scientific definition of sign use and will be able to better integrate the various fields that interact with biosemiotics. In an effort to end Cartesian dualism in semiotics, which has been the main obstacle to a scientific biosemiotics, I present an argument that the “semiotic object” is always ultimately the objective of self-affirmation (of habits, physical or mental) and/or self-preservation. Therefore, I propose a new model for the sign triad: response-sign-objective. With this new model it is clear, as I will show, that self-mistaking (not self-negation as others have proposed) makes learning, creativity and purposeful action possible via signs. I define an “interpretation” as a response to something as if it were a sign, but whose semiotic objective does not, in fact, exist. If the response-as-interpretation turns out to be beneficial for the system after all, there is biopoiesis. When the response is not “interpretive,” but self-confirming in the usual way, there is biosemiosis. While the conditions conducive to fruitful misinterpretation (e.g., accidental similarity of non-signs to signs and/or contiguity of non-signs to self-sustaining processes) might be artificially enhanced, according to this theory, the outcomes would be, by nature, more or less uncontrollable and unpredictable. Nevertheless, biosemiotics could be instrumental in the manipulation and/or artificial creation of purposeful systems insofar as it can describe a formula for the conditions under which new objectives and novel purposeful behavior may emerge, however unpredictably.
The emergence of synthetic biology holds the potential of a major breakthrough in the life sciences by transforming biology into a predictive science. The dual-use characteristics of similar breakthroughs during the twentieth century have led to the application of benignly intended research in e.g. virology, bacteriology and aerobiology in offensive biological weapons programmes. Against this background the article raises the question of whether the precautionary governance of synthetic biology can aid in preventing this techno-science from witnessing the same fate. In order to address this question, this paper proceeds in four steps: it firstly introduces the emerging techno-science of synthetic biology and presents some of its potential beneficial applications. It secondly analyses contributions to the bioethical discourse on synthetic biology as well as precautionary reasoning and its application to life science research in general and synthetic biology more specifically. The paper then identifies manifestations of a moderate precautionary principle in the emerging synthetic biology dual-use governance discourse. Using a dual-use governance matrix as heuristic device to analyse some of the proposed measures, it concludes that the identified measures can best be described as “patchwork precaution” and that a more systematic approach to construct a web of dual-use precaution for synthetic biology is needed in order to guard more effectively against the field’s future misuse for harmful applications.
The Alexandrian emphasis on smallness, elegance, and slightness at the expense of grand themes in major poetic genres was not preciosity for its own sake: although the poetry was written by and for scholars, it had much larger sources than the bibliothecal context in which it was composed. Since the time of the classical poets, much had changed. Earlier Greek poetry was an intimate part of the life of the city-state, written for its religious occasions and performed by its citizens. But the conquests of Alexander had altered the structure and the boundaries of the Greek world to an astonishing degree. Alexandria, the center of the poetic culture of the new age, was a city that had not even existed at the time of Euripides; it was in Egypt, not in Greece, and was a huge, polyglot community. As immigrants immersed in a new, impersonal, and bureaucratic society, the poets not unreasonably sought out what was small, intimate, and personal in their verses. The heroes of early Greek poetry are larger than life; those of Alexandrian poetry are life-size. They are human, like us; they have a childhood and an old age; they are afraid or in love or caught in a rainstorm. It was simply one way of reducing the world to more manageable dimensions. At the same time, the new world of Alexandria needed a new poetry. To continue writing epics about a mythology that seemed very far away was senseless; it was impossible to recapture either the style or the immediacy of Homer, lyric poetry, or Attic tragedy. The scholar-poets of Alexandria admired the literature of classical Greece; for them Homer was incomparable and inimitable, to be studied—but not to be copied. Far better, then, to find a new voice on a more manageable scale: instead of oral epic, erudite epyllion; instead of lyric, epigram; instead of tragedy, mime. The poets of an urban and unheroic world might long for but could never re-create the grandeur of the past. James E. G. Zetzel is associate professor of classics at Princeton University and editor of the Transactions of the American Philological Association. He is the author of Latin Textual Criticism in Antiquity and, with Anthony T. Grafton and Glenn W. Most, has translated Friedrich August Wolf’s Prolegomena ad Homerum.
Selection explanations explain some non-accidental generalizations in virtue of a selection process. Such explanations are not particularizable: they do not transfer as explanations of the instances of such generalizations. This is unlike many explanations in the physical sciences, where the explanation of the general fact also provides an explanation of its instances (i.e. standard D-N explanations). Are selection explanations (e.g. in biology) therefore a different kind of explanation? I argue that to understand this issue, we need to see that a standard D-N explanation of some non-accidental generalization (all Fs are Gs) may also ipso facto explain its contrapositive (all non-Gs are non-Fs), but the explanation is particularizable with respect to the former but not to the latter. This can be seen by noting that the Raven Paradox counterexample to the H-D model of confirmation also generates a counterexample to the D-N model of explanation ('all ravens are black' does not explain why the non-black shoe is a non-raven). In such cases it is natural to take the generalization with the positive predicates to have a particularizable explanation. However, this need not be the case, and in selection explanations it is the generalization with the positive predicates whose explanation is not particularizable. Thus there is no need to suppose that selection explanations are fundamentally different.
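The underlying logical point is easy to display (my illustration, using the raven example from the abstract):

```latex
% Editorial illustration of the particularizability asymmetry.
% A generalization and its contrapositive are logically equivalent:
\[
\forall x\,(Rx \rightarrow Bx) \;\equiv\; \forall x\,(\neg Bx \rightarrow \neg Rx).
\]
% Yet the D-N explanation of the first particularizes naturally:
% from Ra and \forall x(Rx \rightarrow Bx), infer Ba (this raven is black
% because it is a raven and all ravens are black); while the matching
% instance of the contrapositive (this shoe is a non-raven because it is
% non-black) is no explanation at all, mirroring the Raven Paradox on
% the confirmation side.
```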
The strong and the weak deontic operators, O and P, have been employed as operators on names of acts (e.g., by G. H. von Wright) and on imperatives (e.g., by M. Fisher), but most commonly as proposition-forming operators on propositions (e.g., by A. N. Prior). But a strong case can be made out for the introduction of two kinds of adverbial deontic operator, operating respectively on a proposition and on a predicate. These two operators can be used to symbolize the Kantian notions of acting in accordance with the moral law and acting out of respect for the moral law. In this paper the syntactic properties of these adverbial operators are examined in detail and a start is made on the symbolization of a system of Kantian deontic logic containing them.
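For background, the strong and weak operators of standard propositional deontic logic are interdefinable duals (this is textbook material; the paper's novel adverbial operators go beyond this propositional setting):

```latex
% Duality of the strong (obligation, O) and weak (permission, P)
% deontic operators in standard propositional deontic logic.
\[
P\varphi \;\equiv\; \neg O \neg \varphi, \qquad
O\varphi \;\equiv\; \neg P \neg \varphi .
\]
```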
Continued confinement of those most vulnerable to COVID-19—e.g., the elderly, those with chronic diseases and other risk factors—is presented as an uncontroversial measure when planning exit strategies from lockdown measures. Policies for deconfinement assume that these persons will remain confined even when others will not. This, however, could last quite a long time, and for some this could mean that they will remain in confinement for the rest of their lives. In a policy brief on ethical, legal, and social issues of transition strategies, the Swiss national COVID-19 science task force stated that: 'Specific interventions should target the risks associated with isolation...'
I owe you a dinner invitation, you owe ten years on your mortgage, and the government owes billions. We speak confidently about these cases of debt, but is that concept clear in its meaning? This book aims to clarify the concept of debt so we can find better answers to important moral and political questions. This book seeks to accomplish two things. The first is to clarify the concept of debt by examining how the word is used in language. The second is to develop a general, principled account of how debts generate genuine obligations. This allows us to avoid settling each case by a bare appeal to moral intuitions, which is what we seem to currently do. It requires a close examination of many institutions, e.g. money, contract law, profit-driven finance, government fiscal operations, and central banking. To properly understand the moral and political nature of debt, we must understand how these institutions have worked, how they do work, and how they might be made to work. There have been many excellent anthropological and sociological studies of debt and its related institutions. Philosophy can contribute to the emerging discussion and help us to keep our language precise and to identify the implicit principles contained in our intuitions.
This dissertation is structured in such a way as to gradually home in on the true theory of welfare. I start with the whole field of possible theories of welfare and then proceed by narrowing down the options in a series of steps. The first step, undertaken in chapter 2, is to argue that the true theory of welfare must be what I call a partly response independent theory. First I reject the entirely response independent theories because there are widely-shared intuitions suggesting that some psychological responses are indeed relevant to welfare. Then I reject the entirely response dependent theories because there are other central intuitions suggesting that our welfare is not determined solely by our psychological responses. Thus I reach the preliminary conclusion that welfare must involve some response independent (or objective) component. The next step is to consider the most promising theories in the partly response independent category. In particular, I formulate, refine and ultimately reject what seem to be the main monistic theories that have been proposed in this category. In chapter 4, I reject the Adjusted-Enjoyment Theories of Welfare because they cannot account for the claim that a life containing no pleasure or pain can still contain a positive amount of welfare (e.g. if it’s a particularly successful life). Then in chapters 5-7, I discuss Desire Satisfaction theories of welfare. I argue that even the most promising of these theories – e.g. Worthiness Adjusted Desire Satisfactionism – are problematic because they cannot accommodate the claim that a life containing no success with respect to worthwhile projects can still contain a positive amount of welfare (e.g. if it’s a particularly pleasant life). Finally, I suggest that in order to accommodate the intuitions that led to the rejection of all these other theories of welfare, what is needed is a multi-component theory. In the final chapter, I formulate a multi-component theory that is particularly promising. Not only does it avoid the problems of the monistic theories discussed earlier, but, by incorporating a number of novel mathematical devices, it avoids problems that undermine several other initially promising multi-component theories of welfare.
Mark Jago has presented a dilemma for truthmaker non-maximalism—the thesis that some but not all truths require truthmakers. The dilemma arises because some truths that do not require truthmakers by the non-maximalist’s lights (e.g., that Santa Claus does not exist) are necessitated by truths that do (e.g., that Barack Obama knows that Santa Claus does not exist). According to Jago, the non-maximalist can supply a truthmaker for such a truth only by conceding the primary motivation for the view: that it allows one to avoid positing strange ‘negative’ entities without adopting a non-standard account of the necessary features of ordinary things. In this paper, I sketch out and defend two plausible non-maximalist proposals that evade Jago’s dilemma.
This paper examines the existence of strategic solutions to finite normal form games under the assumption that strategy choices can be described as choices among lotteries where players have security- and potential level preferences over lotteries (e.g., Cohen, Theory and Decision, 33, 101–104, 1992, Gilboa, Journal of Mathematical Psychology, 32, 405–420, 1988, Jaffray, Theory and Decision, 24, 169–200, 1988). Since security- and potential level preferences require discontinuous utility representations, standard existence results for Nash equilibria in mixed strategies (Nash, Proceedings of the National Academy of Sciences, 36, 48–49, 1950a, Non-Cooperative Games, Ph.D. Dissertation, Princeton University Press, 1950b) or for equilibria in beliefs (Crawford, Journal of Economic Theory, 50, 127–154, 1990) do not apply. As a key insight this paper proves that non-existence of equilibria in beliefs, and therefore non-existence of Nash equilibria in mixed strategies, is possible in finite games with security- and potential level players. But, as this paper also shows, rationalizable strategies (Bernheim, Econometrica, 52, 1007–1028, 1984, Moulin, Mathematical Social Sciences, 7, 83–102, 1984, Pearce, Econometrica, 52, 1029–1050, 1984) exist for such games. Rationalizability rather than equilibrium in beliefs therefore appears to be a more favorable solution concept for games with security- and potential level players.
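As a reference point, here is a minimal sketch of the rationalizability construction for a finite two-player game, assuming ordinary expected-utility payoffs and, for simplicity, beliefs concentrated on pure strategies; the paper's security- and potential-level preferences would replace the best-reply test, not the iteration. The payoff matrix is hypothetical.

```python
# Rationalizability via iterated elimination of never-best responses,
# restricted (for simplicity) to beliefs on pure strategies.
from itertools import product

# payoffs[(r, c)] = (row player's utility, column player's utility)
payoffs = {
    (0, 0): (3, 3), (0, 1): (0, 2),
    (1, 0): (2, 0), (1, 1): (1, 1),
}

def best_responses(rows, cols, player):
    """Strategies of `player` that best-respond to some surviving
    pure strategy of the opponent."""
    own, opp = (rows, cols) if player == 0 else (cols, rows)
    keep = set()
    for belief in opp:
        def u(s):
            profile = (s, belief) if player == 0 else (belief, s)
            return payoffs[profile][player]
        best = max(u(s) for s in own)
        keep |= {s for s in own if u(s) == best}
    return keep

rows, cols = {0, 1}, {0, 1}
while True:                                   # iterate to a fixed point
    new_rows = best_responses(rows, cols, 0)
    new_cols = best_responses(rows, cols, 1)
    if (new_rows, new_cols) == (rows, cols):
        break
    rows, cols = new_rows, new_cols

print("surviving profiles:", sorted(product(rows, cols)))
```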
We formalise a notion of dynamic rationality in terms of a logic of conditional beliefs on (doxastic) plausibility models. Similarly to other epistemic statements (e.g. negations of Moore sentences and of Muddy Children announcements), dynamic rationality changes its meaning after every act of learning, and it may become true after players learn it is false. Applying this to extensive games, we “simulate” the play of a game as a succession of dynamic updates of the original plausibility model: the epistemic situation when a given node is reached can be thought of as the result of a joint act of learning (via public announcements) that the node is reached. We then use the notion of “stable belief”, i.e. belief that is preserved during the play of the game, in order to give an epistemic condition for backward induction: rationality and common knowledge of stable belief in rationality. This condition is weaker than Aumann’s and compatible with the implicit assumptions (the “epistemic openness of the future”) underlying Stalnaker’s criticism of Aumann’s proof. The “dynamic” nature of our concept of rationality explains why our condition avoids the apparent circularity of the “backward induction paradox”: it is consistent to (continue to) believe in a player’s rationality after updating with his irrationality.
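For reference, the backward-induction computation that the epistemic condition is meant to underwrite, sketched on a hypothetical two-move centipede-style tree (generic textbook machinery, not the authors' plausibility-model formalism):

```python
# Backward induction on a finite perfect-information game tree.
# The tree below is a hypothetical two-move centipede-style example.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str = ""
    player: int | None = None                # None marks a terminal node
    payoffs: tuple | None = None             # payoff vector at a terminal node
    children: dict = field(default_factory=dict)   # action -> subtree

def solve(node):
    """Return (payoff vector, strategy profile) by backward induction."""
    if node.player is None:                  # terminal: nothing to choose
        return node.payoffs, {}
    best_action, best_pay, profile = None, None, {}
    for action, child in node.children.items():
        pay, sub = solve(child)
        profile.update(sub)
        if best_pay is None or pay[node.player] > best_pay[node.player]:
            best_action, best_pay = action, pay
    profile[node.name] = best_action
    return best_pay, profile

leaf = lambda *p: Node(payoffs=p)
root = Node("n1", player=0, children={
    "take": leaf(1, 0),
    "pass": Node("n2", player=1, children={
        "take": leaf(0, 2),
        "pass": leaf(3, 1),
    }),
})
print(solve(root))   # player 1 would take at n2, so player 0 takes at n1
```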
There are two ways in which Symbol and Myth are related to each other. Firstly, a certain class of symbols represents the remnant of myths. Such figures as, e.g., the Dragon, Leviathan, etc., which we find in Biblical literature, are not used in the full sense of the underlying mythological conception, but in a metaphorical sense. They are chosen by the author because of their mythical associations, but not in their mythical meaning. A metaphor of this kind is, as H. J. D. Astley put it, “broken-down mythology.” There are a great many symbols both in poetry and mysticism which must be understood as the relics of mythical thought. We owe a great deal to ethnology for having thrown light on this relation. The microcosm-macrocosm symbolism, for instance, becomes more intelligible if we consider that in primitive mythology the world emerged from the body of primordial man. The gifts to the dead appear in later forms of sacrificial cults as purely symbolical, but there is no doubt that originally they were intended for the real use of the dead. In these and in numerous other cases the symbol has only a reduced value as compared with the original myth from which it is borrowed. It is not self-evident, but relies on the mythical conception, without, however, taking it seriously. It is “merely” a symbol, and has no truth of its own.
In this article I take a loose, functional approach to defining induction: Inductive forms of reasoning include those prima facie reasonable inference patterns that one finds in science and elsewhere that are not clearly deductive. Inductive inference is often taken to be reasoning from the observed to the unobserved. But that is incorrect, since the premises of inductive inferences may themselves be the results of prior inductions. A broader conception of inductive inference regards any ampliative inference as inductive, where an ampliative inference is one where the conclusion ‘goes beyond’ the premises. ‘Goes beyond’ may mean (i) ‘not deducible from’ or (ii) ‘not entailed by’. Both of these are problematic. Regarding (i), some forms of reasoning might have a claim to be called ‘inductive’ because of their role in science, yet turn out to be deductive after all—for example eliminative induction (see below) or Aristotle’s ‘perfect induction’, which is an inference to a generalization from knowledge of every one of its instances. Interpretation (ii) requires that the conclusions of scientific reasoning are always contingent propositions, since necessary propositions are entailed by any premises. But there are good reasons from metaphysics for thinking that many general propositions of scientific interest and known by inductive inference (e.g. “all water is H2O”) are necessarily true. Finally, both (i) and (ii) fail to take account of the fact that there are many ampliative forms of inference one would not want to call inductive, such as counter-induction (exemplified by the ‘gambler’s fallacy’ that the longer a roulette wheel has come up red the more likely it is to come up black on the next roll). Brian Skyrms (1999) provides a useful survey of the issues involved in defining what is meant by ‘inductive argument’. Inductive knowledge will be the outcome of a successful inductive inference. But much discussion of induction concerns the theory of confirmation, which seeks to answer the question, “when and to what degree does evidence support an hypothesis?” Usually, this is understood in an incremental sense and in a way that relates to the rational credibility of a hypothesis: “when and by how much does e add to the credibility of h?”, although ‘confirms’ is sometimes used in an absolute sense to indicate total support that exceeds some suitably high threshold.
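The incremental/absolute contrast in the final sentences is standardly made precise in Bayesian terms (a textbook gloss, not this article's own formalism):

```latex
% Bayesian gloss on confirmation (textbook background).
% Incremental: e confirms h (given background b) iff
\[
P(h \mid e \wedge b) \;>\; P(h \mid b),
\]
% with degree of support measured, e.g., by the difference or log-ratio
% measures
\[
d(h,e) = P(h \mid e) - P(h), \qquad
r(h,e) = \log \frac{P(h \mid e)}{P(h)} .
\]
% Absolute: e confirms h iff P(h \mid e) exceeds some fixed high
% threshold t close to 1.
```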