Gers (Biol Philos, 2011) provides a positive and constructive view of the project to generalise Darwinian principles in Geoffrey Hodgson and Thorbjørn Knudsen’s Darwin’s Conjecture. We note considerable overlap with his work and ours, and also with important recent work of Godfrey-Smith (2009), which Gers cites extensively. But we also note that there are differences in research objectives between Gers and Godfrey-Smith, on the one hand, and ourselves, on the other. Gers and Godfrey-Smith focus on the elucidation of the most general principles possible. Our aim is to derive principles that are sufficiently abstract to span the natural and human social worlds, and then to add further principles that help us understand the Darwinian evolution of human society. Furthermore, Gers and Godfrey-Smith critique a replicator concept that is different from ours. Once these points are made apparent, the criticisms are essentially disabled, and we end up with different but complementary and overlapping research projects.
Advancing a general Darwinian framework to explain culture is an exciting endeavor. It requires that we face up to the challenge of identifying the specific components that are effective in replication processes in culture. This challenge includes the unsolved problem of explaining cultural inheritance, both at the level of individuals and at the level of social organizations and institutions. (Published Online November 9 2006).
The established definition of replication in terms of the conditions of causality, similarity and information transfer is very broad. We draw inspiration from the literature on self-reproducing automata to strengthen the notion of information transfer in replication processes. To the triple conditions of causality, similarity and information transfer, we add a fourth condition that defines a “generative replicator” as a conditional generative mechanism, which can turn input signals from an environment into developmental instructions. Generative replication must have the potential to enhance complexity, which in turn requires that developmental instructions are part of the information that is transmitted in replication. Demonstrating the usefulness of the generative replicator concept in the social domain, we identify social generative replicators that satisfy all of the four proposed conditions.
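The four conditions can be sketched as a toy program (an illustrative sketch only; the class, method names, and the signal-to-instruction mapping are hypothetical, not the authors' formal model):

```python
# Toy sketch of a "generative replicator" (illustrative; names and
# structure are hypothetical, not drawn from the authors' formal model).

class GenerativeReplicator:
    def __init__(self, instructions):
        # Developmental instructions: a mapping from environmental signals
        # to developmental responses (condition 4: a conditional generative
        # mechanism that turns input signals into instructions).
        self.instructions = dict(instructions)

    def develop(self, signal):
        # Turn an input signal from the environment into a developmental
        # instruction (with a default response for unrecognized signals).
        return self.instructions.get(signal, "default")

    def replicate(self):
        # Conditions 1-3: the copy is caused by the parent (causality),
        # resembles it (similarity), and carries the same instruction set,
        # including the generative mechanism itself (information transfer).
        return GenerativeReplicator(self.instructions)

parent = GenerativeReplicator({"cold": "grow thick coat", "warm": "grow thin coat"})
child = parent.replicate()
assert child.develop("cold") == parent.develop("cold")  # similarity preserved
```

The point of the fourth condition shows up in `replicate`: what is copied is not a fixed phenotype but the conditional mechanism itself, which is what gives replication the potential to sustain complexity.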
Hull et al.'s construction of operant learning as an instance of selection gives rise to problems that weaken this application of selection theory beyond acceptable limits. We point out that the most fundamental of these is a disregard for the need to include multiple concurrent replicators in any definition of selection, and we indicate how this problem may be solved.
The purpose of the present article is to strengthen the conceptualisation of the principle of selection in theories of economic evolution and to help clarify a number of unsettled issues regarding the meaning of variety and continuity. In order to achieve this, the emerging general mathematical selection theory is introduced to identify the requirements of a general principle of selection and the specification of variety and continuity that follows from it. It is indicated how general selection theory can help advance evolutionary theories of economic change by clarifying the meaning of selection, and the possible role of habits and routines in economic selection.
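The "general mathematical selection theory" invoked here is standardly associated with results such as the Price equation, which decomposes change in a mean trait into a selection term and a transmission term. As a hedged illustration (an example of the standard result, not the article's own formalism), a toy computation:

```python
# Toy illustration of the Price equation, a standard result in general
# selection theory:  delta z_bar = Cov(w, z)/w_bar + E(w * delta z)/w_bar

def mean(xs):
    return sum(xs) / len(xs)

def cov(xs, ys):
    mx, my = mean(xs), mean(ys)
    return mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])

# Hypothetical parent population of three types:
z  = [1.0, 2.0, 3.0]    # trait value of each type (e.g. a routine's intensity)
w  = [1.0, 2.0, 3.0]    # fitness: number of "offspring" copies of each type
dz = [0.1, 0.0, -0.1]   # transmission bias: trait change during copying

w_bar = mean(w)
selection = cov(w, z) / w_bar                                   # selection term
transmission = mean([wi * d for wi, d in zip(w, dz)]) / w_bar   # transmission term
delta_z_bar = selection + transmission

# Direct check: offspring mean trait minus parent mean trait
offspring_mean = sum(wi * (zi + d) for wi, zi, d in zip(w, z, dz)) / sum(w)
assert abs(delta_z_bar - (offspring_mean - mean(z))) < 1e-12
```

The decomposition is what licenses talk of "selection" abstracted from any particular substrate: the covariance term captures differential copying success, while the transmission term captures imperfect inheritance, such as drifting habits or routines.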
A survey of the mathematical tradition of a subcontinent. Book review by Toke Knudsen (Department of Mathematics, Computer Science, and Statistics, SUNY Oneonta, Fitzelle Hall 234, Oneonta, NY 13820, USA), Metascience, pp. 1–3, DOI 10.1007/s11016-011-9608-3 (Online ISSN 1467-9981; Print ISSN 0815-0796).
While a substantial amount of the literature describes corporate benefits of corporate social responsibility (CSR) initiatives, the literature is silent concerning why some companies announce CSR initiatives yet fail to implement them. The article examines company delistings from the UN Global Compact. Delistings are surprising because the CSR agenda is seen as having won the battle of ideas. The analysis proceeds in two parts. I first analyze firm-level characteristics, focusing on geography while controlling for sector and size; I find that geography is a significant factor, that small firms are more likely to be delisted than large firms, and that some sector characteristics determine delistings. Next, I proceed to uncover country-level characteristics, including the degree of international economic interdependence and the quality of governance institutions. Multivariate regression analysis shows that companies from countries where domestic governance institutions are well-functioning are less likely to be delisted. To a lesser extent, I find that firms from countries with internationally oriented economies are more willing to comply with the UN Global Compact requirements. Countries with a high share of outward FDI per capita have a lower share of delisted firms, as do countries that are internationally competitive.
Some have suggested that certain classical physical systems have undecidable long-term behavior, without specifying an appropriate notion of decidability over the reals. We introduce such a notion, decidability in μ (or d-μ) for any measure μ, which is particularly appropriate for physics and in some ways more intuitive than Ko's (1991) recursive approximability (r.a.). For Lebesgue measure μ, d-μ implies r.a. Sets with positive μ-measure that are sufficiently "riddled" with holes are never d-μ but are often r.a. This explicates Sommerer and Ott's (1996) claim of uncomputable behavior in a system with riddled basins of attraction. Furthermore, it clarifies speculations that the stability of the solar system (and similar systems) may be undecidable, for the invariant tori established by KAM theory form sets that are not d-μ.
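A hedged reconstruction of the notion in symbols (a paraphrase from the abstract; the paper's exact clauses may differ): a set is decidable in μ when some machine correctly decides membership for μ-almost every point.

```latex
% Hedged reconstruction (paraphrase only; not the paper's verbatim definition).
% S \subseteq \mathbb{R}^n is decidable in \mu (d-\mu) iff there is a machine M,
% given arbitrarily precise rational approximations to its input x, such that
\mu\bigl(\{\, x \in \mathbb{R}^n : M(x)\ \text{diverges, or}\ M(x) \neq \chi_S(x) \,\}\bigr) = 0
```

On this reading the "riddled basin" result is intuitive: if every neighborhood of a positive-measure set contains holes of positive measure, no single machine can be right almost everywhere, even though approximate answers (r.a.) remain available.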
Does transparency in doxastic deliberation entail a constitutive norm of correctness governing belief, as Shah and Velleman argue? No, because this presupposes an implausibly strong relation between normative judgements and motivation from such judgements, ignores our interest in truth, and cannot explain why we pay different attention to how much justification we have for our beliefs in different contexts. An alternative account of transparency is available: transparency can be explained by the aim one necessarily adopts in deliberating about whether to believe that p. To show this, I reconsider the role of the concept of belief in doxastic deliberation, and I defuse 'the teleologian's dilemma'.
Metaphysics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent philosophers in the field. We hear their views on metaphysics: its aim, its scope, the future direction of research, and how their own work fits in these respects. Interviews with Lynne Rudder Baker, Helen Beebee, Thomas Hofweber, Hugh Mellor, Peter Menzies, Stephen Mumford, Daniel Nolan, Eric T. Olson, L. A. Paul, Lorenz B. Puntel, Gonzalo Rodriguez-Pereyra, Gideon Rosen, Jonathan Schaffer, Peter Simons, Barry Smith, Michael Tooley, Peter van Inwagen, Dean Zimmerman.
A popular account of epistemic justification holds that justification, in essence, aims at truth. An influential objection against this account points out that it is committed to holding that only true beliefs could be justified, which most epistemologists regard as sufficient reason to reject the account. In this paper I defend the view that epistemic justification aims at truth, not by denying that it is committed to epistemic justification being factive, but by showing that, when we focus on the relevant sense of ‘justification’, it isn’t in fact possible for a belief to be at once justified and false. To this end, I consider and reject three popular intuitions speaking in favor of the possibility of justified false beliefs, and show that a factive account of epistemic justification is less detrimental to our normal belief forming practices than often supposed.
Philosophers have long been concerned about what we know and how we know it. Increasingly, however, a related question has gained prominence in philosophical discussion: what should we believe and why? This volume brings together twelve new essays that address different aspects of this question. The essays examine foundational questions about reasons for belief, and use new research on reasons for belief to address traditional epistemological concerns such as knowledge, justification and perceptually acquired beliefs. This book will be of interest to philosophers working on epistemology, theoretical reason, rationality, perception and ethics. It will also be of interest to cognitive scientists and psychologists who wish to gain deeper insight into normative questions about belief and knowledge.
For at least three decades, philosophers have argued that general causation and causal explanation are contrastive in nature. When we seek a causal explanation of some particular event, we are usually interested in knowing why that event happened rather than some other specified event. And general causal claims, which state that certain event types cause certain other event types, seem to make sense only if appropriate contrasts to the types of events acting as cause and effect are specified. In recent years, philosophers have extended the contrastive theory of causation to encompass singular causation as well. In this article, I argue that this extension of the theory was a mistake. Although general causation and causal explanation may well be contrastive in nature, singular causation is not.
In this paper I propose a teleological account of epistemic reasons. In recent years, the main challenge for any such account has been to explicate a sense in which epistemic reasons depend on the value of epistemic properties. I argue that while epistemic reasons do not directly depend on the value of epistemic properties, they depend on a different class of reasons which are value based in a direct sense, namely reasons to form beliefs about certain propositions or subject matters. In short, S has an epistemic reason to believe that p if and only if S is such that if S has reason to form a belief about p, then S ought to believe that p. I then propose a teleological explanation of this relationship. It is also shown how the proposal can avoid various subsidiary objections commonly thought to riddle the teleological account.
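The biconditional stated in the abstract can be rendered schematically (the predicate notation here is supplied for illustration, not the author's own):

```latex
% Schematic rendering of the abstract's biconditional (notation is supplied):
%   ER(S,p): S has an epistemic reason to believe that p
%   R(S,p):  S has reason to form a belief about whether p
%   O(S,Bp): S ought to believe that p
ER(S,p) \;\longleftrightarrow\; \bigl( R(S,p) \rightarrow O(S,Bp) \bigr)
```

On this rendering, epistemic reasons are conditional on a prior, directly value-based reason to take up the question whether p at all, which is what allows the account to remain teleological without tying each epistemic reason directly to epistemic value.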
Many philosophers have argued that an event is lucky for an agent only if it was suitably improbable, but there is considerable disagreement about how to understand this improbability condition. This paper argues for a hitherto overlooked construal of the improbability condition in terms of the lucky agent’s epistemic situation. According to the proposed account, an event is lucky for an agent only if the agent was not in a position to know that the event would occur. It is also explored whether this new account threatens the anti-luck program in epistemology. It is argued that although not detrimental to the anti-luck program, the epistemic account of luck sets certain important limits to its scope and feasibility.
Kathrin Glüer and Åsa Wikforss (2009) argue that any truth norm for belief, linking the correctness of believing p with the truth of p, is bound to be uninformative, since applying the norm to determine the correctness of a belief as to whether p, would itself require forming such a belief. I argue that this conflates the condition under which the norm deems beliefs correct, with the psychological state an agent must be in to apply the norm. I also show that since the truth norm conflicts with other possible norms that clearly are informative, the truth norm must itself be informative.
In a recent article, I criticized Kathrin Glüer and Åsa Wikforss's so-called “no guidance argument” against the truth norm for belief, for conflating the conditions under which that norm recommends belief with the psychological state one must be in to apply the norm. In response, Glüer and Wikforss have offered a new formulation of the no guidance argument, which makes it apparent that no such conflation is made. However, their new formulation of the argument presupposes a much too narrow understanding of what it takes for a norm to influence behaviour, and betrays a fundamental misunderstanding of the point of the truth norm. Once this is taken into account, it becomes clear that the no guidance argument fails.
Causation is of undeniable importance to our understanding of, and interaction with, our surroundings. Despite this, the correct understanding of causation remains subject to considerable philosophical controversy. In this article, I introduce the most influential philosophical theories of causation, and provide an overview of the main difficulties that have led to the currently most popular versions of these theories.
The theory of belief, according to which believing that p essentially involves having as an aim or purpose to believe that p truly, has recently been criticised on the grounds that the putative aim of belief does not interact with the wider aims of believers in the ways we should expect of genuine aims. I argue that this objection to the aim theory fails. When we consider a wider range of deliberative contexts concerning beliefs, it becomes obvious that the aim of belief can interact with and be weighed against the wider aims of agents in the ways required for it to be a genuine aim.
Nishi Shah has recently argued that transparency in doxastic deliberation supports a strict version of evidentialism about epistemic reasons. I argue that Shah's argument relies on a principle that is incompatible with the strict version of evidentialism Shah wishes to advocate.
A number of authors have recently developed and defended various versions of ‘normative essentialism’ about the mental, i.e. the claim that propositional attitudes are constitutively or essentially governed by normative principles. I present two arguments to the effect that this claim cannot be right. First, if propositional attitudes were essentially normative, propositional attitude ascriptions would require non-normative justification, but since this is not a requirement of folk-psychology, propositional attitudes cannot be essentially normative. Second, if propositional attitudes were essentially normative, propositional attitude ascriptions could not support normative rationality judgments, which would remove the central appeal of normative essentialism.
Semantic theories that violate semantic innocence, i.e. require reference-shifts when terms are embedded in ‘that’ clauses and the like, are often challenged by producing sentences where an anaphoric expression, while not itself embedded in a context in which reference shifts, is anaphoric on an antecedent expression that is embedded in such a context. This, in conjunction with a widely accepted principle concerning unproblematic anaphora, is used to show that such reference shifting has absurd consequences. We show that it is the widely accepted principle concerning anaphora that is to blame for these consequences, and not the supposed sin of reference shifting.
In this paper, I introduce and discuss a series of problems associated with answering the question of semantic unity, and argue that the truth-theoretical approach to semantics put forward by Donald Davidson suggests a possible solution. Although Davidson did not put it forward explicitly as such, I argue that his interpretation of Tarski's definition of truth provides the resources to illuminate and resolve the problem of unity.
Trust can be understood as a precondition for a well-functioning society or as a way to handle the complexities of living in a risk society, but also as a fundamental aspect of human morality. Interactions on the Internet pose some new challenges to issues of trust, especially connected to disembodiedness. Mistrust may be an important obstacle to Internet use, which is problematic as the Internet becomes a significant arena for political, social and commercial activities necessary for full participation in a liberal democracy. The Categorical Imperative lifts up trust as a fundamental component of human ethical virtues – first of all, because deception and coercion, the antitheses of trust, cannot be universalized. Mistrust is, according to Kant, a natural component of human nature, as we are social beings dependent on recognition by others but also prone to deceiving others. Only in true friendships can this tendency be overcome and give room for unconditional trust. Still we can argue that Kant must hold that trustworthy behaviour, as well as trust in others, is obligatory, as expressions of respect for humanity. The Kantian approach integrates political and ethical aspects of trust, showing that protecting the external activities of citizens is required in order to act morally. This means that security measures, combined with specific regulations, are important preconditions for building online trust, providing an environment that enables people to act morally and to form trust-based relationships.
In a recent paper (2008), I presented two arguments against the thesis that intentional states are essentially normative. In this paper, I defend those arguments from two recent responses, one from Nick Zangwill in his (2010), and one from Daniel Laurier in the present volume, and offer improvements of my arguments in light of Laurier’s criticism.
In his Knowledge and its Limits (2000) Timothy Williamson argues that knowledge can be causally efficacious and as such figure in psychological explanation. His argument for this claim figures as a response to a key objection to his overall thesis that knowing is a mental state. In this paper I argue that although Williamson succeeds in establishing that knowledge in some cases is essential to the power of certain causal explanations of actions, he fails to do this in a way that establishes knowledge itself as a causal factor. The argument thus fails to support his overall claim that knowledge should be conceived as a state of mind.
The point of departure for this article is a review of the discussion between Twaddle and Nordenfelt on the concepts of disease, illness, and sickness, and the objective is to investigate the fruitfulness of these concepts. It is argued that disease, illness, and sickness represent different perspectives on human ailment and that they can be applied to analyze both epistemic and normative challenges to modern medicine. In particular the analysis reveals epistemic and normative differences between the concepts. Furthermore, the article demonstrates, against Nordenfelt's claim, that the concepts of disease, illness, and sickness can exist without a general theory of health. Additionally, the complexity of different perspectives on human ailment explains why it is so difficult to give strict definitions of basic concepts within modern health care.
It is widely assumed that doxastic deliberation is transparent to the factual question of the truth of the proposition being considered for belief, and that this sets doxastic deliberation apart from practical deliberation. This feature is frequently invoked in arguments against doxastic voluntarism. I argue that transparency to factual questions occurs in practical deliberation in ways parallel to transparency in doxastic deliberation. I argue that this should make us reconsider the appeal to transparency in arguments against doxastic voluntarism, and the wider issue of distinguishing theoretical from practical rationality.
The relationship of the author's intention to the meaning of a literary work has been a persistently controversial topic in aesthetics. Anti-intentionalists Wimsatt and Beardsley, in the 1946 paper that launched the debate, accused critics who fueled their interpretative activity by poring over the author's private diaries and life story of committing the 'fallacy' of equating the work's meaning, properly determined by context and linguistic convention, with the meaning intended by the author. Hirsch responded that context and convention are not sufficient to determine a unique meaning for a text; to avoid radical ambiguity we must appeal to the author's intention, which actualizes one of the candidate meanings. Subsequent writers have defended refined versions of these views, and a variety of positions on the spectrum between them, in a debate that remains central to philosophical aesthetics. While much of the debate has focused on literature, similar questions arise with respect to the interpretation of visual artworks. Some of the readings listed below address this matter explicitly.

Author Recommends:

William K. Wimsatt and Monroe C. Beardsley, 'The Intentional Fallacy', Sewanee Review 54 (1946): 468–88. Locus classicus of the anti-intentionalist position: Wimsatt and Beardsley hold that appeal to the author's intention is always extraneous, since intention cannot override the role of linguistic convention and context in determining meaning. Criticism, they argue, should thus proceed by careful examination of the literary work rather than by sifting through biographical material that might hint at the author's intentions.

E. D. Hirsch, Jr., Validity in Interpretation (New Haven, CT: Yale University Press, 1967). The seminal statement of actual intentionalism: Hirsch holds that 'meaning is an affair of consciousness and not of physical signs or things' (23), though he allows that linguistic convention constrains the meanings the author can intend for a particular utterance. He argues that the author's intention is necessary to fix meaning, since the application of conventions alone would typically leave a text wildly indeterminate.

Alexander Nehamas, 'The Postulated Author: Critical Monism as a Regulative Ideal', Critical Inquiry 8 (1981): 133–49. Nehamas argues for a version of hypothetical intentionalism according to which interpretation is a matter of attributing an intended meaning to a hypothetical author, distinct from the historical writer. This view allows the interpreter to find meaning even in features of the work that may have been mere accidents on the part of the historical writer.

Gary Iseminger, ed., Intention and Interpretation (Philadelphia, PA: Temple University Press, 1992). Intention and Interpretation is an outstanding collection including both classic and new essays representing most of the major viewpoints in the debate.

Noël Carroll, 'Art, Intention, and Conversation', Intention and Interpretation, ed. Gary Iseminger (Philadelphia, PA: Temple University Press, 1992), 97–131. The essay defends modest actual intentionalism, according to which the work's meaning is one compatible both with the author's meaning intentions and with the conventionally allowable meanings of the text. Carroll holds that literature is on a continuum with ordinary conversation, to which an intentionalist analysis is apt; for this reason he rejects anti-intentionalism and hypothetical intentionalism, which emphasize the purported autonomy of literary works from their authors.

Daniel Nathan, 'Irony, Metaphor, and the Problem of Intention', Intention and Interpretation, ed. Gary Iseminger (Philadelphia, PA: Temple University Press, 1992), 183–202. Nathan argues that even irony and metaphor, which are often thought to require an analysis in terms of the author's actual intentions, are in fact best understood on an anti-intentionalist approach.

Jerrold Levinson, 'Intention and Interpretation in Literature', The Pleasures of Aesthetics: Philosophical Essays (Ithaca, NY: Cornell University Press, 1996), 175–213. Revised version of 'Intention and Interpretation: A Last Look', Intention and Interpretation, ed. Gary Iseminger (Philadelphia, PA: Temple University Press, 1992), 221–56. The essay defends a version of hypothetical intentionalism according to which the meaning of a literary work is the meaning that would be attributed to the actual author by members of the ideal audience. Levinson argues that literary works should be treated differently from everyday utterances, since it is a convention of literature that its works are substantially autonomous from their authors.

Paisley Livingston, Art and Intention: A Philosophical Study (Oxford: Clarendon Press, 2005). Livingston examines competing accounts of the nature of intentions as they pertain to a variety of issues in the philosophy of art, including the ontology of art, the nature of authorship, and art interpretation. In chapter 6, Livingston argues for partial intentionalism, according to which some, but not all, of a work's meanings are non-redundantly determined by the author's intentions.

Stephen Davies, 'Authors' Intentions, Literary Interpretation, and Literary Value', British Journal of Aesthetics 46 (2006): 223–47. Davies defends the value-maximizing view, according to which, when there is more than one conventional meaning consistent with the work's features, the meaning that should be attributed to the work is the one that makes the work out to be most aesthetically valuable. He allows for the attribution of multiple meanings when more than one candidate (approximately) maximizes the work's value.
Online Materials:

http://plato.stanford.edu/entries/beardsley-aesthetics/ Beardsley's Aesthetics (Michael Wreen)
http://plato.stanford.edu/entries/conceptual-art/ Conceptual Art (Elisabeth Schellekens)
http://plato.stanford.edu/entries/speech-acts/ Speech Acts (Mitchell Green)
http://plato.stanford.edu/entries/hermeneutics/ Hermeneutics (Bjørn Ramberg and Kristin Gjesdal)

Sample Syllabus:

Week 1: Foundations
1. Wimsatt and Beardsley, 'The Intentional Fallacy'.
2. Livingston, 'What Are Intentions?', Art and Intention, 1–30.

Weeks 2–3: Actual Intentionalism
1. Hirsch, Validity in Interpretation, ch. 1–2, 1–67.
2. Gary Iseminger, 'An Intentional Demonstration?', Intention and Interpretation, ed. Iseminger, 76–96.
Optional reading:
1. Stephen Knapp and Walter Benn Michaels, 'Against Theory', Critical Inquiry 8 (1982): 723–742.
2. Stephen Knapp and Walter Benn Michaels, 'Against Theory 2: Hermeneutics and Deconstruction', Critical Inquiry 14 (1987): 49–58.

Weeks 4–5: Modest, Moderate and Partial Intentionalism
1. Carroll, 'Art, Intention, and Conversation'.
2. Robert Stecker, Interpretation and Construction: Art, Speech, and the Law (Malden, MA: Blackwell, 2003), ch. 2, 29–51.
3. Livingston, 'Intention and the Interpretation of Art', Art and Intention, 135–74.
Optional reading:
1. Carroll, 'Interpretation and Intention: The Debate between Hypothetical and Actual Intentionalism', Metaphilosophy 31 (2000): 75–95.
2. Stecker, 'Moderate Actual Intentionalism Defended', Journal of Aesthetics and Art Criticism 64 (2006): 429–38.

Weeks 6–7: Hypothetical Intentionalism
1. William E. Tolhurst, 'On What a Text Is and How It Means', British Journal of Aesthetics 19 (1979): 3–14.
2. Nehamas, 'Postulated Author'.
3. Levinson, 'Intention and Interpretation in Literature'.
Optional reading:
1. Nehamas, 'What an Author Is', Journal of Philosophy 83 (1986): 685–91.
2. Nehamas, 'Writer, Text, Work, Author', Literature and the Question of Philosophy, ed. A. J. Cascardi (Baltimore, MD: Johns Hopkins University Press, 1987), 265–91.
3. Levinson, 'Hypothetical Intentionalism: Statement, Objections, and Replies', Is There a Single Right Interpretation?, ed. M. Krausz (University Park, PA: Pennsylvania State University Press, 2002), 309–18.

Week 8: The Value-Maximizing View
1. Davies, 'The Aesthetic Relevance of Authors' and Painters' Intentions', Journal of Aesthetics and Art Criticism 41 (1982): 65–76.
2. Davies, 'Authors' Intentions, Literary Interpretation, and Literary Value'.

Weeks 9–10: Anti-Intentionalism
1. Beardsley, 'The Authority of the Text', The Possibility of Criticism (Detroit: Wayne State University Press, 1970), 16–37.
2. Nathan, 'Irony, Metaphor, and the Problem of Intention'.
3. Nathan, 'Art, Meaning, and Artist's Meaning', Contemporary Debates in Aesthetics and the Philosophy of Art, ed. M. Kieran (Malden, MA: Blackwell, 2006), 282–95.
Optional reading:
1. Beardsley, 'Intentions and Interpretations: A Fallacy Revived', The Aesthetic Point of View: Selected Essays, ed. M. J. Wreen and D. M. Callen (Ithaca, NY: Cornell University Press, 1982), 188–207.
2. Nathan, 'Irony and the Author's Intentions', British Journal of Aesthetics 22 (1982): 246–56.

Sample Mini-Syllabus:

Week 1: Foundations
1. Wimsatt and Beardsley, 'The Intentional Fallacy'.
2. Livingston, 'What Are Intentions?', Art and Intention, 1–30.

Week 2: Actual and Modest Intentionalism
1. Hirsch, Validity in Interpretation, ch. 1–2, 1–67.
2. Carroll, 'Art, Intention, and Conversation'.

Week 3: Hypothetical Intentionalism and Anti-Intentionalism
1. Levinson, 'Intention and Interpretation in Literature'.
2. Nathan, 'Irony, Metaphor, and the Problem of Intention'.

Focus Questions:

1. Is the difficulty of ascertaining the author's intentions a good reason to reject actual intentionalism?
2. Should literary works be seen as largely autonomous from their authors, even if we think that interpretation of ordinary utterances is properly a matter of ascertaining the speaker's intentions?
3. Are linguistic context and convention sufficient to determine the meaning of a literary work, or is the author's intention required to stave off an unacceptable degree of ambiguity?
4. Should the author's intentions about the genre or category to which the work belongs have a different status than intentions about the work's meaning?
5. Can the author's intentions have a non-redundant role to play in fixing meaning even if we take the role of context and linguistic convention seriously?
6. Should we expect the author's intention to play the same role (if any) in the interpretation of visual artworks that it plays in the interpretation of literature, or do differences between these two art forms require distinct approaches?
The concept of Darwinian Happiness was coined to help people take advantage of knowledge of how evolution has shaped the brain, since processes within this organ are the main contributors to well-being. Fortuitously, the concept has implications that may prove beneficial for society: compassionate behavior offers more in terms of Darwinian Happiness than malicious behavior, and the probability of obtaining sustainable development may be improved by pointing out that consumption beyond sustenance is not important for well-being. It is difficult to motivate people to act against their own best interests. Darwinian Happiness offers a concept that, to some extent, combines the interests of the individual with the interests of society.
According to the moral theory of William Wollaston (1659–1724), the mark of a wrong action is that it signifies a falsehood. This theory rests, in part, on an unusual account of actions according to which they have propositional content: they "declare," "signify," "affirm," or "express" propositions (RN 8–13). To take an example from Wollaston, the act of firing on a band of soldiers affirms the proposition "Those soldiers are my enemies" (RN 8–9). Likewise, the act of breaking a promise signifies the proposition "I did not make that promise" (RN 10, 16). This account of actions, as well as the moral theory that rests on it, has many harsh critics. Unfortunately, some of them read Wollaston with little care, and ...
The concepts of health and disease are crucial in defining the aim and the limits of modern medicine. Accordingly it is important to understand them and their relationship. However, there appears to be a discrepancy between scholars in philosophy of medicine and health care professionals with regard to these concepts. This article investigates health care professionals' concepts of health and disease and the relationship between them. In order to do so, four different models are described and analyzed: the ideal model, the holistic model, the medical model and the disjunctive model. The analysis reveals that each model has its pros and cons, and that health care professionals appear to apply more than one model. Furthermore, the models and the way health care professionals use them may be helpful for scholars in philosophy of medicine with regard to developing theories and communicating them to health care professionals.
We have claimed that truth norms cannot provide genuine guidance for belief formation (Glüer and Wikforss 2009, pp. 43–4). Asbjørn Steglich-Petersen argues that our ‘no guidance argument’ fails because it conflates certain psychological states an agent must have in order to apply the truth norm with the condition under which the norm prescribes forming certain beliefs. We spell out the no guidance argument in more detail and show that there is no such conflation.
This article argues that we must abandon the still predominant view of modernity as based upon a separation between the secular and the religious - a “separation” which is allegedly now brought into question again in “postsecularity”. It is more meaningful to start from the premise that religion and politics have always co-existed in various fields of tension and will continue to do so. The question then concerns the natures and modalities of this tension, and how one can articulate a publicly grounded reason with reference to it. It will first be argued that this question cannot be articulated, let alone fully answered, from the position developed by John Rawls. A different approach will then be developed, building on the writings of Eric Voegelin. This involves a much more serious engagement with the classical tradition in thought and philosophy than found in Rawls. It also implies much more than a “pragmatic” recognition of religion as a possible source for overlapping consensus, since for Voegelin a true, balanced rationality can only depart from an experientially grounded encounter with the transcendent.
With the rise of multiple geometries in the nineteenth century, and in the last century the rise of abstract algebra, of the axiomatic method, the set-theoretic foundations of mathematics, and the influential work of the Bourbaki, certain views called “structuralist” have become commonplace. Mathematics is seen as the investigation, by more or less rigorous deductive means, of “abstract structures”, systems of objects fulfilling certain structural relations among themselves and in relation to other systems, without regard to the particular nature of the objects themselves. Geometric spaces need not be made up of spatial or temporal points or other intrinsically geometric objects; as Hilbert famously put it, items of furniture suitably interrelated could satisfy all the relevant axiomatic conditions as far as pure mathematics is concerned. A group, for instance, can be any multiplicity of objects with operations fulfilling the basic requirements of the binary group operation; indeed the very abstractness of the group concept allows for its remarkably wide applicability in pure and applied mathematics. Similar remarks can be made regarding other algebraic structures, and the many spaces of analysis, differential geometry, topology, etc. Of course, mathematicians distinguish between “abstract structures” and “concrete ones”, e.g. made up of familiar, basic items such as real or complex numbers or functions of such, or rationals, or integers, etc. (For example, the space L2 of square-integrable functions from R (or Rn) to C, with inner product (f, g) = ∫ f(x) g(x)* dx.)
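The “basic requirements of the binary group operation” mentioned above can be made explicit; a standard statement of the group axioms (supplied here for illustration, not drawn from the abstract itself) for a set G with operation · is:

```latex
\begin{align*}
&\text{Closure:} && \forall a, b \in G:\ a \cdot b \in G \\
&\text{Associativity:} && \forall a, b, c \in G:\ (a \cdot b) \cdot c = a \cdot (b \cdot c) \\
&\text{Identity:} && \exists e \in G\ \forall a \in G:\ e \cdot a = a \cdot e = a \\
&\text{Inverses:} && \forall a \in G\ \exists a^{-1} \in G:\ a \cdot a^{-1} = a^{-1} \cdot a = e
\end{align*}
```

Nothing in the axioms constrains what the elements of G are, which is precisely the structuralist point: any objects whatsoever, suitably interrelated, can realize the structure.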
In a 2005 paper Ólafur Páll Jónsson presents a puzzle that turns on intentional identity and definite descriptions. He considers eight solutions and rejects them all, thus leaving the puzzle unsolved. In this paper I put forward a solution. The puzzle is this. Little Lotta wants most of all a bicycle for her birthday, but she gets none. Distracted by the gifts she does receive, she at first does not think about the bike. But when seeing her tricycle, she is reminded of the bike. The question is how we are to analyse these two occurrences of ‘the bike’ in the absence of a unique bike that Lotta wants. So the semantics of ‘the bike’ needs to be spelt out, and it must be made explicit what the complements of Lotta’s attitudes are. My analysis shows that the attributer’s usage of ‘the bike’ blurs the distinction between a second-order and a first-order intension (a property concept and a property, respectively). My solution can be summed up in this two-premise argument. (a) In the state-of-affairs S, the property of being a bike is the extension of the property concept the property such that Lotta wants an instance of it more than any other; (b) in S, Lotta does not think about/is reminded of the property that she wants an instance of more than any other; (c) therefore, in S Lotta does not think about/is reminded of the property of being a bike. This solution requires looking beyond the confines of denotational semantics, which all of Jónsson’s eight solution candidates belong to.
Richard Rorty (1931–2007) developed a distinctive and controversial brand of pragmatism that expressed itself along two main axes. One is negative—a critical diagnosis of what Rorty takes to be defining projects of modern philosophy. The other is positive—an attempt to show what intellectual culture might look like, once we free ourselves from the governing metaphors of mind and knowledge in which the traditional problems of epistemology and metaphysics (and indeed, in Rorty's view, the self-conception of modern philosophy) are rooted. The centerpiece of Rorty's critique is the provocative account offered in Philosophy and the Mirror of Nature (1979, hereafter PMN). In this book, and in the closely related essays collected in Consequences of Pragmatism (1982, hereafter CP), Rorty's principal target is the philosophical idea of knowledge as representation, as a mental mirroring of a mind-external world. Providing a contrasting image of philosophy, Rorty has sought to integrate and apply the milestone achievements of Dewey, Hegel and Darwin in a pragmatist synthesis of historicism and naturalism. Characterizations and illustrations of a post-epistemological intellectual culture, present in both PMN (part III) and CP (xxxvii-xliv), are more richly developed in later works, such as Contingency, Irony, and Solidarity (1989, hereafter CIS), in the popular essays and articles collected in Philosophy and Social Hope (1999), and in the four volumes of philosophical papers, Objectivity, Relativism, and Truth (1991, hereafter ORT); Essays on Heidegger and Others (1991, hereafter EHO); Truth and Progress (1998, hereafter TP); and Philosophy as Cultural Politics (2007, hereafter PCP). In these writings, ranging over an unusually wide intellectual territory, Rorty offers a highly integrated, multifaceted view of thought, culture, and politics, a view that has made him one of the most widely discussed philosophers of our time.
It is widely held that the possibility of value-incomparability between alternatives poses a serious threat to comparativism. Some comparativists have proposed to avoid this problem by supplementing the three traditional value relations with a fourth value relation, variously identified as "roughly equal" or "on a par", which is supposed to hold between alternatives that are incomparable by the three traditional value relations. However, in a recent article in this journal, Nien-he Hsieh has proposed that the comparisons thought to require rough equality or parity could instead be understood in terms of the concept of "clumpiness". Against this suggestion, Martin Peterson has argued that the concept of clumpiness allows agents to be exploited in money-pumps, and thus that there is no way of linking clumpiness to rational choice. This would remove the central appeal of the concept. In this note, I show that Peterson’s argument fails to establish that the concept of clumpiness allows agents to be exploited in money-pumps.
The topic of this paper is the notion of technical (as opposed to biological) malfunction. It is shown how to form the property being a malfunctioning F from the property F and the property modifier malfunctioning (a mapping taking a property to a property). We present two interpretations of malfunctioning. Both interpretations agree that a malfunctioning F lacks the dispositional property of functioning as an F. However, its subsective interpretation entails that malfunctioning Fs are Fs, whereas its privative interpretation entails that malfunctioning Fs are not Fs. We chart several of their respective logical consequences and discuss some of the philosophical implications of both interpretations.
Within recent years, scientific misconduct has become an increasingly important topic, not only in the scientific community, but in the general public as well. Spectacular cases have been extensively covered in the news media, such as the cases of the Korean stem cell researcher Hwang, the German nanoscientist Schön, or the Norwegian cancer researcher Sudbø. In Science's latest annual "breakthrough of the year" report from December 2006, the descriptions of the year's hottest breakthroughs were accompanied by a similar description of "the breakdown of the year: scientific fraud". Official guidelines for dealing with scientific misconduct were introduced in the 1990s. At this time, research agencies, universities and other research institutions around the world developed guidelines for good scientific practice and formed committees to handle cases of scientific misconduct. In this process it was widely debated how to define scientific misconduct. Most definitions centered on falsification, fabrication, and plagiarism (the so-called FFP definition), but suggestions were also made for definitions that were broader and more open-ended, such as the 1995 suggestion from the US Commission of Research Integrity to replace FFP with misappropriation, interference and misrepresentation (the so-called MIM definition). The MIM definition was not adopted in the US, but MIM-like definitions have been adopted in several other countries. In this paper, I shall describe these MIM-related definitions of scientific misconduct and analyze the arguments that have been advanced in their favor. I shall discuss some of the difficulties inherent in the MIM-related definitions, such as the distinction between misrepresentation and mistake, and the demarcation of misrepresentation in areas characterized by uncertainty or by diverging research paradigms.
I shall illustrate the problems inherent in the MIM-definition through a particular case: the ruling of the Danish Committee on Scientific Dishonesty (DCSD) about Bjørn Lomborg's best-selling book The Skeptical Environmentalist in which he argued that contrary to what was claimed in the “litany” of the environmentalists, the state of the environment is getting better rather than worse. Lomborg was reported to the DCSD by several environmental scientists, and this controversial case from 2003 ended with a verdict that characterized Lomborg’s conclusions as misrepresentations, but acquitted Lomborg of misconduct due to his ignorance. I shall analyze this verdict and the problems it reveals with respect to the MIM-related definitions of misconduct.
How are we individually and as a society to handle new and emerging technologies? This challenging question underlies much of the bioethical debates of modern times. To address this question we need suitable conceptions of the new technology and ways of identifying its proper management and regulation. To establish conceptions and to find ways to handle emerging technologies we tend to use analogies extensively. The aim of this article is to investigate the role that analogies play or may play in the processes of understanding and managing new technology. More precisely we aim to unveil the role of analogies as analytical devices in exploring the "being" of the new technology as well as the normative function of analogies in conceptualizing the characteristics and applications of new technology. Umbilical cord blood biobanking will be used as a case to investigate these roles and functions.
The science/non-science distinction has become increasingly blurred. This paper investigates whether recent cases of fraud in science can shed light on the distinction. First, it investigates whether there is an absolute distinction between science and non-science with respect to fraud, and in particular with regards to manipulation and fabrication of data. Finding that it is very hard to make such a distinction leads to the second step: scrutinizing whether there is a normative distinction between science and non-science. This is done by investigating one of the recent internationally famous frauds in science, the Sudbø case. This case demonstrates that moral norms are not only needed to regulate science because of its special characteristics, such as its potential for harm, but moral norms give science its special characteristics. Hence, moral norms are crucial in differentiating science from non-science. Although this does not mean that ethics can save the life of science, it can play a significant role in its resuscitation.
Background: The knowledge of scientific dishonesty is scarce and heterogeneous. Therefore this study investigates the experiences with and the attitudes towards various forms of scientific dishonesty among PhD students at the medical faculties of all Norwegian universities. Method: Anonymous questionnaire distributed to all postgraduate students attending introductory PhD courses at all medical faculties in Norway in 2010/2011; descriptive statistics. Results: 189 of 262 questionnaires were returned (72.1%). 65% of the respondents had not, during the last year, heard or read about researchers who committed scientific dishonesty. One respondent had experienced pressure to fabricate and to falsify data, and one had experienced pressure to plagiarize data. On average 60% of the respondents were uncertain whether their department had a written policy concerning scientific conduct. About 11% of the respondents had experienced unethical pressure concerning the order of authors during the last 12 months. 10% did not find it inappropriate to report experimental data without having conducted the experiment and 38% did not find it inappropriate to try a variety of different methods of analysis to find a statistically significant result. 13% agreed that it is acceptable to selectively omit contradictory results to expedite publication and 10% found it acceptable to falsify or fabricate data to expedite publication, if they were confident of their findings. 79% agreed that they would be willing to report misconduct to a responsible official. Conclusion: Although there is less scientific dishonesty reported in Norway than in other countries, dishonesty is not unknown to doctoral students. Some forms of scientific misconduct are considered to be acceptable by a significant minority. There was little awareness of relevant policies for scientific conduct, but a high level of willingness to report misconduct.
Public policy on the development and use of genetically modified organisms (GMOs) has mainly been concerned with defining proper strategies of risk management. However, surveys and focus group interviews show that although lay people are concerned with risks, they also emphasize that genetic modification is ethically questionable in itself. Many people feel that this technology “tampers with nature” in an unacceptable manner. This is often identified as an objection to the crossing of species borders in producing transgenic organisms. Most scientists reject these opinions as based on insufficient knowledge about biotechnology, the concept of species, and nature in general. Some recent projects of genetic modification aim to accommodate the above mentioned concerns by altering the expression of endogenous genes rather than introducing genes from other species. There can be good scientific reasons for this approach, in addition to strategic reasons related to greater public acceptability. But are there also moral reasons for choosing intragenic rather than transgenic modification? I suggest three interrelated moral reasons for giving priority to intragenic modification. First, we should respect the opinions of lay people even when their view is contrary to scientific consensus; they express an alternative world-view, not scientific ignorance. Second, staying within species borders by strengthening endogenous traits reduces the risks and scientific uncertainty. Third, we should show respect for nature as a complex system of laws and interconnections that we cannot fully control. The main moral reason for intragenic modification, in our view, is the need to respect the “otherness” of nature.
Some irrational numbers are "random" in a sense which implies that no algorithm can compute their decimal expansions to an arbitrarily high degree of accuracy. This feature of (most) irrational numbers has been claimed to be at the heart of the deterministic, but chaotic, behavior exhibited by many nonlinear dynamical systems. In this paper, a number of now classical chaotic systems are shown to remain chaotic when their domains are restricted to the computable real numbers, providing counterexamples to the above claim. More fundamentally, the randomness view of chaos is shown to be based upon a confusion between a chaotic function on a phase space and its numerical representation in Rn.
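The point that chaotic behavior does not depend on random (uncomputable) reals can be illustrated with a minimal sketch (not a construction from the paper): iterating the logistic map, a classical chaotic system, with exact rational arithmetic keeps every orbit point computable, yet sensitive dependence on initial conditions still appears.

```python
from fractions import Fraction

def logistic(x):
    # The logistic map x -> 4x(1 - x) on [0, 1], a classical chaotic system.
    return 4 * x * (1 - x)

def orbit(x0, n):
    # Iterate the map n times with exact rational arithmetic, so every
    # point of the orbit is a computable (indeed rational) number.
    xs = [x0]
    for _ in range(n):
        xs.append(logistic(xs[-1]))
    return xs

# Two rational initial conditions differing by 1/1500 (< 0.001).
a = orbit(Fraction(1, 3), 12)
b = orbit(Fraction(167, 500), 12)  # 167/500 = 0.334

# Sensitive dependence: the tiny initial gap grows to order 1
# within a dozen iterations, with no uncomputable reals involved.
max_gap = max(abs(float(x - y)) for x, y in zip(a, b))
```

Exact rational iteration is used here only to keep the orbits manifestly computable; the denominators grow quickly, so a dozen steps suffice to show the divergence.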
Søren Kierkegaard (1813-55) was an almost unbelievably prolific writer. At his death he left not only a massive body of published work (25 volumes in the recently completed Princeton University Press edition), but also a sprawling mass of unpublished writings that rivaled the size of the published corpus. This book tells the story of the peculiar fate of this portion of Kierkegaard's literary remains, which flowed ceaselessly from his steel pen from his late teens to a week before his death. It is the story of packets and sacks of paper covered with words and images that, after a vagabond existence in various homes, finally landed at the Royal Danish Library, where they are today guarded with great care. Readers are also introduced to a selection of this enormous body of material, including drawings and doodlings (often human profiles with high foreheads) that escaped from Kierkegaard's pen in unguarded moments and complement the allure of the philosopher's strikingly variable, elusive handwriting. The authors of this book are among the editors of a modern critical edition of Kierkegaard's oeuvre currently being produced in Copenhagen. By the end of his life Kierkegaard had become a controversial figure, engaged in a furious assault upon "Christendom." From the very moment of their discovery in the days following his death, the unpublished words and images constituted a highly problematic bonanza, an intellectual and religious hot potato (or sack of potatoes) that was passed from hand to hand, suppressed, selectively and tendentiously published and republished. Written Images offers readers a fascinating tour of the misadventures of these written images that will, finally, soon be published in their entirety.
Sometimes it seems intuitively plausible to hold loosely structured sets of individuals morally responsible for failing to act collectively. Virginia Held, Larry May, and Torbjörn Tännsjö have all drawn this conclusion from thought experiments concerning small groups, although they apply the conclusion to large-scale omissions as well. On the other hand it is commonly assumed that (collective) agency is a necessary condition for (collective) responsibility. If that is true, then how can we hold sets of people responsible for not having acted collectively? This paper argues that loosely structured inactive groups sometimes meet this requirement if we employ a weak (but nonetheless non-reductionist) notion of collective agency. This notion can be defended on independent grounds. The resulting position on distribution of responsibility is more restrictive than Held's, May's or Tännsjö's, and this consequence seems intuitively attractive.
There is reasonable evidence suggesting that humans have an innate tendency toward being religious. Consequently, religion is unlikely to disappear; the question then is how this feature will impact on future society. Three scenarios are discussed: One, science will dominate; two, religion will dominate; and three, the present conflict between the two is resolved. The latter scenario may happen through a realization that religion has the potential for doing more good than bad, in terms of individual quality of life and in improving society. Obtaining maximum benefit of religion will require a concept of God that is compatible with science, and that can be accepted as a common core for the various faiths. Science may help in this endeavor.
New medical technologies provide us with new possibilities in health care and health care research. Depending on their degree of novelty, they may as well present us with a whole range of unforeseen normative challenges. Partly, this is due to a lack of appropriate norms to perceive and handle new technologies. This article investigates our ways of establishing such norms. We argue that in this respect analogies have at least two normative functions: they inform both our understanding and our conduct. Furthermore, as these functions are intertwined and can blur moral debates, a functional investigation of analogies can be a fruitful part of ethical analysis. We argue that although analogies can be conservative, because they bring old concepts to bear upon new ones, there are at least three ways in which they can be creative. First, understandings of new technologies come to differ from the analogies that established them, and may in turn become analogies themselves. Second, analogies transpose similarities from one area into another, where they previously had no bearing. Third, analogies tend to have a figurative function, bringing in something new and different from the content of the analogies. We use research-biobanking as a practical example in our investigations.
The objective of this article is to investigate ethical aspects of technology through the moral term “paternalism”. The field of investigation is medicine. The reason for this is twofold. Firstly, “paternalism” has gained moral relevance through modern medicine, where physicians have been accused of behaving paternalistically and threatening patients’ autonomy. Secondly, medicine is a brilliant area to scrutinise the evaluative aspects of technology. It is argued that paternalism is a morally relevant term for the ethics of technology, but that its traditional conception is not adequate to address the challenges of modern technology. A modification towards a “technological paternalism” is necessary. That is, “technological paternalism” is a fruitful term in the ethics of technology. Moreover, it is suited to point out the deficiencies of the traditional concept of paternalism and to reform and vitalise the conception of paternalism in ethics in order to handle the challenges of technology.
In his 2000 book Logical Properties Colin McGinn argues that predicates denote properties rather than sets or individuals. I support the thesis, but show that it is vulnerable to a type-incongruity objection, if properties are (modelled as) functions, unless a device for extensionalizing properties is added. Alternatively, properties may be construed as primitive intensional entities, as in George Bealer. However, I object to Bealer’s construal of predication as a primitive operation inputting two primitive entities and outputting a third primitive entity. Instead I recommend we follow Pavel Tichý in construing both predication and extensionalization as instances of the primitive operation of functional application.
This article discusses how the results of infant research challenge the assumptions of the classical sciences of social behaviour. According to A.J. Bergesen, the findings of infant research invalidate Durkheim's theory of mental categories, thus requiring a re-theorizing of sociology. This article argues that Bergesen's reading of Emile Durkheim is incorrect, and his review of the infant research in fact invalidates his argument. Reviewing the assumptions of sociology in the light of the findings of infant research, it is argued that the real challenge is to formulate a research strategy that combines the findings of the two sciences.
Demographic changes in high-income countries will increase the need for health care services but reduce the number of people available to provide them. Welfare technology is launched as an important measure to meet this challenge. As with all types of technologies we must explore its ethical challenges. A literature review reveals that welfare technology is a generic term for a heterogeneous group of technologies and there are few studies documenting their efficacy, effectiveness and efficiency. Many kinds of welfare technology break with the traditional organization of health care. It introduces technology in new areas, such as in private homes, and it provides new functions, e.g. offering social stimuli and entertainment. At the same time welfare technology is developed for groups that traditionally have not been extensive technology users. This raises a series of ethical questions with regard to the development and use of welfare technologies, which are presented in this review. The main challenges identified are: (1) Alienation when advanced technology is used at home, (2) conflicting goals, as welfare technologies have many stakeholders with several ends, (3) respecting confidentiality and privacy when third-party actors are involved, (4) guaranteeing equal access and just distribution, and (5) handling conflicts between instrumental rationality and care in terms of respecting dignity and vulnerability. Addressing these issues is important for developing and implementing welfare technologies in a morally acceptable manner.
This note sketches how a theory of procedural semantics may offer a solution to the problem of the unity of the proposition. The current revival of the notion of structured meaning has made the problem of propositional unity pressing. The problem, stated in its simplest form, is how an individual a and a property F combine into the proposition P that a is an F; i.e. how two different kinds of objects combine into a third kind of object capable of having properties that neither of its constituents could have. Constraints imposed on P include that P must be capable of being true/false, being known/believed to be true/false, and occurring as argument of propositional connectives, such as entailment.