Background The ideal basis of age estimation is considered to be a combination of clinical, skeletal and dental examinations. It is not easy to determine how forensic physicians take account of evidence-based data obtained from medical journals in their medical decision-making. The question of what is an ethically acceptable probability that adolescents are incorrectly considered to be over 18 has not been answered. Methods In a retrospective study over 1 year (2007), 498 files (for 141 female subjects and 357 male subjects) regarding age assessment requested by the public prosecutor's office for purposes of criminal or asylum proceedings were reviewed. Chronological age was estimated from a combination of physical examination, radiographic examination of the left hand and determination of dental status. Results Estimates of chronological age in 498 subjects claiming to be 9–14 years old were incompatible with the alleged age in 356 (71%) when made by the forensic physician but in only 17 (3%) when based on data from published studies on age estimation in adolescents. Conclusions The present study suggests that in most cases the forensic physician ignores the adolescent's word. Medical mission and ethics imply a need to listen to the claims of persons in custody, whatever the risk of false claims. This situation should prompt forensic physicians to keep up with published data on estimating the age of adolescents.
With a title like Luther et la philosophie, one could have expected, as since the eighteenth century and in the 'liberal' circles of the nineteenth, a complete exposition, of course, of the Reformer's philosophy. The expression is found, for example, in the analytical tables of the Encyclopédie, under the entry 'Lutheranism'. Although Philippe Büttgen has taken as his object, in other works, 'the confessionalization of philosophy'…
Upon learning that John C. Harsanyi was awarded the Bank of Sweden Prize in Economic Sciences in Memory of Alfred Nobel in 1994 for his pioneering work in game theory, few economists probably questioned the appropriateness of that choice. The Budapest-born social scientist had already been recognized as a first-rank contributor to non-cooperative game theory for some time. However, as many readers of this journal will be aware, Harsanyi first contributed to welfare economics, not game theory. More importantly, he was philosophically minded and accordingly has been “acknowledged as the most influential philosopher in economics” [1]. This is of some significance since, before Harsanyi became acquainted with economics around 1950, his main interest was philosophy and, to a lesser extent, sociology and psychology. Rather than an economist with philosophical leanings, Harsanyi was actually a philosopher turned economist.
Kaplan claims in Demonstratives that no operator may manipulate the context of evaluation of natural language indexicals. We show that this is not so. In fact, attitude reports always manipulate a context parameter (or, rather, a context variable). This is shown by (i) the existence of De Se readings of attitude reports in English (for which Kaplan has no account), and (ii) the existence of a variety of indexicals across languages whose point of evaluation can be shifted, but only in attitude reports. We develop an alternative account within an extensional framework with overt quantification over times, worlds and contexts. Various typological facts are discussed, especially the distinction between English, Amharic and Ewe pronouns, and that between English and Russian tenses.
In the 1980s, the analysis of presupposition projection contributed to a ‘dynamic turn’ in semantics: the classical notion of meanings as truth conditions was replaced with a dynamic notion of meanings as Context Change Potentials. We argue that this move was misguided, and we offer an alternative in which presupposition projection follows from the combination of a fully classical semantics and a new pragmatic principle, which we call Be Articulate. This principle requires that a meaning pp’ conceptualized as involving a pre-condition p should be articulated as … (p and pp’) … rather than as … pp’ …, unless the full conjunction is ruled out because the first or the second conjunct is semantically idle. In particular, … (p and pp’) … is infelicitous, and hence … pp’ … is acceptable, if one can determine as soon as ‘p and’ is uttered that no matter how the sentence ends these words could be eliminated without affecting its contextual meaning. An equivalence theorem guarantees that this condition suffices to derive Heim’s results in almost all cases. Extensions of the condition lead to several new predictions, in particular concerning some ‘symmetric readings’, as well as presupposition projection in quantified structures, which displays a complex interaction between the nature of the trigger and the monotonicity of the quantifier.
This paper argues that besides mechanistic explanations, there is a kind of explanation that relies upon “topological” properties of systems in order to derive the explanandum as a consequence, and which does not consider mechanisms or causal processes. I first investigate topological explanations in the case of ecological research on the stability of ecosystems. Then I contrast them with mechanistic explanations, thereby distinguishing the kind of realization they involve from the realization relations entailed by mechanistic explanations, and explain how both kinds of explanations may be articulated in practice. The second section, expanding on the case of ecological stability, considers the phenomenon of robustness at all levels of the biological hierarchy in order to show that topological explanations are indeed pervasive there. Reasons are suggested for this, in which “neutral network” explanations are singled out as a form of topological explanation that spans many levels. Finally, I appeal to the distinction of explanatory regimes to cast light on a controversy in philosophy of biology, the issue of contingency in evolution, which is shown to essentially involve issues about realization.
Recent semantic research has made increasing use of a principle, Maximize Presupposition, which requires that under certain circumstances the strongest possible presupposition be marked. This principle is generally taken to be irreducible to standard Gricean reasoning because the forms that are in competition have the same assertive content. We suggest, however, that Maximize Presupposition might be reducible to the theory of scalar implicatures. (i) First, we consider a special case: the speaker utters a sentence with a presupposition p which is not initially taken for granted by the addressee, but the latter takes the speaker to be an authority on the matter. Signaling the presupposition provides new information to the addressee; but it also follows from the logic of presupposition qua common belief that the presupposition is thereby satisfied (Stalnaker, Ling Philos 25(5–6):701–721, 2002). (ii) Second, we generalize this solution to other cases. We assume that even when p is common belief, there is a very small chance that the addressee might forget it (‘Fallibility’); in such cases, marking a presupposition will turn out to generate new information by re-establishing part of the original context. We also adopt from Raj Singh (Nat Lang Semantics 19(2):149–168, 2011) the hypothesis that presupposition maximization is computed relative to local contexts—and we assume that these too are subject to Fallibility; this accounts for cases in which the information that justifies the presupposition is linguistically provided. (iii) Finally, we suggest that our assumptions have benefits in the domain of implicatures: they make it possible to reinterpret Magri’s ‘blind’ (i.e. context-insensitive) implicatures as context-sensitive implicatures which just happen to be misleading.
According to one productive and influential approach to cognition, categorization, object recognition, and higher level cognitive processes operate on a set of fixed features, which are the output of lower level perceptual processes. In many situations, however, it is the higher level cognitive process being executed that influences the lower level features that are created. Rather than viewing the repertoire of features as being fixed by low-level processes, we present a theory in which people create features to subserve the representation and categorization of objects. Two types of category learning should be distinguished. Fixed space category learning occurs when new categorizations are representable with the available feature set. Flexible space category learning occurs when new categorizations cannot be represented with the features available. Whether fixed or flexible, learning depends on the featural contrasts and similarities between the new category to be represented and the individual's existing concepts. Fixed feature approaches face one of two problems with tasks that call for new features: If the fixed features are fairly high level and directly useful for categorization, then they will not be flexible enough to represent all objects that might be relevant for a new task. If the fixed features are small, subsymbolic fragments (such as pixels), then regularities at the level of the functional features required to accomplish categorizations will not be captured by these primitives. We present evidence of flexible perceptual changes arising from category learning and theoretical arguments for the importance of this flexibility. We describe conditions that promote feature creation and argue against interpreting them in terms of fixed features. Finally, we discuss the implications of functional features for object categorization, conceptual development, chunking, constructive induction, and formal models of dimensionality reduction.
Key Words: concept learning; conceptual development; features; perceptual learning; stimulus encoding.
The Pareto principle states that if the members of society express the same preference judgment between two options, this judgment is compelling for society. A building block of normative economics and social choice theory, and often borrowed by contemporary political philosophy, the principle has rarely been subjected to philosophical criticism. The paper objects to it on the ground that it indifferently applies to those cases in which the individuals agree on both their expressed preferences and their reasons for entertaining them, and those cases in which they agree on their expressed preferences, while differing on their reasons. The latter are cases of "spurious unanimity", and it is normatively inappropriate, or so the paper argues, to defend unanimity preservation at the social level for them, so the Pareto principle is formulated much too broadly. The objection seems especially powerful when the principle is applied in an ex ante context of uncertainty, in which individuals can disagree on both their probabilities and utilities, and nonetheless agree on their preferences over prospects.
Mirror self-experience is recast here, away from the cognitivist interpretation that has dominated discussions of the issue since the establishment of the mirror mark test. Ideas formulated by Merleau-Ponty on mirror self-experience point to the profoundly unsettling encounter with one’s specular double. These ideas, together with developmental evidence, are revisited to provide a new, psychologically and phenomenologically more valid account of mirror self-experience: an experience associated with deep wariness.
When do children become aware of themselves as differentiated and unique entities in the world? When and how do they become self-aware? Based on recent empirical evidence, five levels of self-awareness are presented and discussed as they chronologically unfold from the moment of birth to approximately 4–5 years of age. A natural history of children's developing self-awareness is proposed, as well as a model of adult self-awareness that is informed by the dynamics of early development. Adult self-awareness is viewed as the dynamic flux between basic levels of consciousness that develop chronologically early in life.
William Alston’s argument against the deontological conception of epistemic justification is a classic—and much debated—piece of contemporary epistemology. At the heart of Alston’s argument, however, lies a very simple mistake which, surprisingly, appears to have gone unnoticed in the vast literature now devoted to the argument. After having shown why some of the standard responses to Alston’s argument don’t work, we elucidate the mistake and offer a hypothesis as to why it has escaped attention.
Potts (2005, 2007) has argued that expressives such as honky must be analyzed using an entirely new dimension of meaning. We explore a more conservative theory in which expressives are presuppositional expressions (Macià 2002) that are indexical and attitudinal (and sometimes shiftable): they predicate something of the mental state of the agent of the context (and this need not always be the agent of the actual context). Following Stalnaker’s recent work on informative presuppositions (2002), we argue that the presuppositions triggered by expressives are automatically satisfied (= ‘self-fulfilling’), hence the impression that they are not standard presupposition triggers.
Based on the analysis of narrations in Free Indirect Discourse and the Historical Present, we argue that the grammatical notion of context of speech should be ramified into a Context of Thought and a Context of Utterance. Tense and person depend on the Context of Utterance, while all other indexicals are evaluated with respect to the Context of Thought. Free Indirect Discourse and the Historical Present are analyzed as special combinatorial possibilities that arise when the two contexts are distinct, and exactly one of them is presented as identical to the physical point at which the sentence is articulated.
According to a theorem recently proved in the theory of logical aggregation, any nonconstant social judgment function that satisfies independence of irrelevant alternatives (IIA) is dictatorial. We show that the strong and not very plausible IIA condition can be replaced with a minimal independence assumption plus a Pareto-like condition. This new version of the impossibility theorem likens it to Arrow’s and arguably enhances its paradoxical value.
We develop a formal semantic analysis of the alarm calls used by Campbell’s monkeys in the Tai forest and on Tiwai island, two sites that differ in the main predators that the monkeys are exposed to. Building on data discussed in Ouattara et al. (e7808, 2009a; PNAS 106:22026–22031, 2009b) and Arnold et al., we argue that on both sites alarm calls include the roots krak and hok, which can optionally be affixed with -oo, a kind of attenuating suffix; in addition, sentences can start with boom boom, which indicates that the context is not one of predation. In line with Arnold et al., we show that the meaning of the roots is not quite the same in Tai and on Tiwai: krak often functions as a leopard alarm call in Tai, but as a general alarm call on Tiwai. We develop models based on a compositional semantics in which concatenation is interpreted as conjunction, roots have lexical meanings, -oo is an attenuating suffix, and an all-purpose alarm parameter is raised with each individual call. The first model accounts for the difference between Tai and Tiwai by way of different lexical entries for krak. The second model gives the same underspecified entry to krak in both locations, but it makes use of a competition mechanism akin to scalar implicatures. In Tai, strengthening yields a meaning equivalent to non-aerial dangerous predator and turns out to single out leopards. On Tiwai, strengthening yields a nearly contradictory meaning due to the absence of ground predators, and only the unstrengthened meaning is used.
Institute for Biomedical Ethics, Geneva University Medical School. Corresponding author: Médecins Sans Frontières (OCG), rue de Lausanne 78, CH-1211 Geneva 21, Switzerland. Tel.: +41 (0)22 849 89 29; Fax: +41 (0)22 849 84 88; Email: philippe_calain{at}hotmail.com. Abstract: Outbreaks of filovirus (Ebola and Marburg) hemorrhagic fevers in Africa are typically the theater of rescue activities involving international experts and agencies tasked with reinforcing national authorities in clinical management, biological diagnosis, sanitation, public health surveillance and coordination. These outbreaks can be seen as a paradigm for ethical issues posed by epidemic emergencies, through the convergence of such themes as: isolation and quarantine, privacy and confidentiality, and the interpretation of ethical norms across different ethnocultural settings. With an emphasis on the boundaries between public health investigations and research, this article reviews specific challenges, past practices and current normative documents relevant to the application of ethical standards in the course of outbreaks of filovirus hemorrhagic fevers. Aside from commonly identified issues of informed consent and institutional review processes, we argue for more clarity over the specification of which communities are expected to share benefits, and we advocate for the use of collective definitions of duty to care and standard of care. We propose new elaborations around existing normative instruments, and we suggest some pathways toward more comprehensive approaches to the ethics of research in outbreak situations.
This volume handles from various perspectives the concept of function and the nature of functional explanations, topics much discussed since two major and conflicting accounts were put forward in Larry Wright's and Robert Cummins's papers in the 1970s. Here, both Wright's 'etiological' theory of functions and Cummins's 'systemic' conception of functions are refined and elaborated in the light of current scientific practice, with papers showing how the 'etiological' theory faces several objections and may in reply be revisited, while its counterpart became ever more sophisticated, as researchers discovered fresh applications for it. Relying on a firm knowledge of the original positions and debates, this volume presents cutting-edge research evincing the complexities that today pertain in function theory in various sciences. Alongside original papers from authors central to the controversy, work by emerging researchers taking novel perspectives will add to the potential avenues to be followed in the future. Not only does the book adopt no a priori assumptions about the scope of functional explanations, it also incorporates material from several very different scientific domains, e.g. neurosciences, ecology, or technology. In general, functions are implemented in mechanisms, and functional explanations in biology often have an essential relation with natural selection. These two basic claims set the stage for this book's coverage of investigations concerning both 'functional' explanations and the 'metaphysics' of functions. It casts new light on these claims by testing them through their confrontation with scientific developments in biology, psychology, and recent developments concerning the metaphysics of realization. Rather than debating a single theory of functions, this book presents the richness of philosophical issues raised by functional discourse throughout the various sciences.
Content Level: Research

Keywords: Causal role theory of functions; Determination of content; Ecosystem selection; Etiological theory of function; Evolutionary biology; Functional explanations; Historical concepts in biology; Larry Wright; Neurosciences; New mechanism; Selected effects functions; Systemic theory of functions; William Wimsatt

Related subjects: Anthropology & Archaeology; Epistemology & Philosophy of Science; Evolutionary & Developmental Biology; Neuroscience; Philosophy

TABLE OF CONTENTS
Introduction
Section I. Biological functions and functional explanations: genes, cells, organisms and ecosystems
Part 1.A. Functions, organization and development in life sciences
Chapter 1. William C. Wimsatt, Evolution and the Stability of Functional Architectures
Chapter 2. Denis M. Walsh, Teleological Emergence: The Autonomy of Evo-Devo
Chapter 3. Jean Gayon, Does oxygen have a function, or: where should the regress of biological functions stop?
Part 1.B. Functional pluralism for biologists?
Chapter 4. Frédéric Bouchard, How ecosystem evolution strengthens the case for functional pluralism
Chapter 5. Robert N. Brandon, A general case for functional pluralism
Chapter 6. Philippe Huneman, Weak realism in the etiological theory of functions
Section II. Psychology, philosophy of mind and technology: functions in a man's world
Part 2.A. Metaphysics, function and philosophy of mind
Chapter 7. Carl Craver, Functions and Mechanisms in Contemporary Neuroscience
Chapter 8. Carl Gillett, Understanding the sciences through the fog of 'functionalism'
Part 2.B. Philosophy of technology, design and functions
Chapter 9. Françoise Longy, Artifacts and Organisms: A Case for a New Etiological Theory of Functions
Chapter 10. Pieter Vermaas and Wybo Houkes, Functions as Epistemic Highlighters: An Engineering Account of Technical, Biological and Other Functions
Epilogue. Larry Wright, Revising teleological explanations: reflections three decades on
The paper develops an objection to the extensional model of time consciousness—the view that temporally extended events or processes, and their temporal properties, can be directly perceived as such. Importantly, following James, advocates of the extensional model typically insist that whole experiences of temporal relations between non-simultaneous events are distinct from mere successions of their temporal parts. This means, presumably, that there ought to be some feature(s) differentiating the former from the latter. I try to show why the extensional model offers no credible ground for positing such a difference.
We argue that some sign language loci (i.e. positions in signing space that realize discourse referents) are both formal variables and simplified representations of what they denote; in other words, they are simultaneously logical symbols and pictorial representations. We develop a 'formal semantics with iconicity' that accounts for their dual life; the key idea ('formal iconicity') is that some geometric properties of signs must be preserved by the interpretation function. We analyze in these terms three kinds of iconic effects in American and French Sign Language (ASL and LSF): (i) structural iconicity, where relations of inclusion and complementation among loci are directly reflected in their denotations; (ii) locus-external iconicity, where the high or low position of a locus in signing space has a direct semantic reflex, akin to the semantic contribution of gender features of pronouns; and (iii) locus-internal iconicity, where different parts of a structured locus are targeted by different directional verbs, as was argued by Liddell and Kegl. The resulting semantics combines insights of two traditions that have been sharply divided by recent debates. In line with the 'formalist camp' (e.g. Lillo-Martin and Klima, Neidle, and Sandler and Lillo-Martin), our theory treats loci as variables, and develops an explicit formal analysis of their behavior. But we also incorporate insights of the 'iconic camp', which emphasized the role of iconic constraints in sign language in general and in pronominals in particular (e.g. Cuxac, Taub, Liddell). However, this synthesis is only possible if formal semantics makes provisions for iconic requirements at the very core of its interpretive procedure. (An Appendix discusses relevant data from Italian Sign Language [LIS].)
In the tradition of quantified modal logic, it was assumed that significantly different linguistic systems underlie reference to individuals, to times and to 'possible worlds'. Various results from recent research in formal semantics suggest that this is not so, and that there is in fact a pervasive symmetry between the linguistic means with which we refer to these three domains. Reference to individuals, times and worlds is uniformly effected through generalized quantifiers, definite descriptions, and pronouns, and in each domain grammatical features situate the reference of terms as near, far or 'further' from the actual or from a reported speech act. We outline various directions in which a program of ontological symmetry could be developed, and we offer in the Appendix a symmetric fragment developed in a logic that can be seen as a compromise between an extensional and an intensional system.
Heim 1983 suggested that the analysis of presupposition projection requires that the classical notion of meanings as truth conditions be replaced with a dynamic notion of meanings as Context Change Potentials. But as several researchers (including Heim herself) later noted, the dynamic framework is insufficiently predictive: although it allows one to state that, say, the dynamic effect of F and G is to first update a Context Set C with F and then with G (i.e., C[F and G] = C[F][G]), it fails to explain why there couldn’t be a ‘deviant’ conjunction and* which performed these operations in the opposite order (i.e., C[F and* G] = C[G][F]). We provide a formal introduction to a competing framework, the Transparency theory, which addresses this problem. Unlike dynamic semantics, our analysis is fully classical, i.e., bivalent and static. And it derives the projective behavior of connectives from their bivalent meaning and their syntax. We concentrate on the formal properties of a simple version of the theory, and we prove that (i) full equivalence with Heim’s results is guaranteed in the propositional case (Theorem 1), and that (ii) the equivalence can be extended to the quantificational case (for any generalized quantifiers), but only when certain conditions are met (Theorem 2).
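To make the notation concrete, here is a minimal, hypothetical sketch of dynamic update in the Heim style, treating a context set as a set of possible worlds and update with a presupposition-free proposition as intersection; the toy worlds and the propositions `rain` and `wind` are illustrative assumptions, not from the paper. It also shows why, for presupposition-free propositions, nothing in the update notation itself distinguishes the attested order from the 'deviant' reversed one:

```python
# Sketch of Heim-style context update (illustrative, not the paper's formalism).
# A context set C is a set of possible worlds; update with a proposition keeps
# only the worlds where the proposition holds.

def update(context, proposition):
    """C[F]: keep only worlds where the proposition holds."""
    return {w for w in context if proposition(w)}

# Toy worlds: pairs (rain, wind) of booleans.
worlds = {(r, w) for r in (True, False) for w in (True, False)}
rain = lambda world: world[0]
wind = lambda world: world[1]

# Dynamic conjunction: C[F and G] = C[F][G] (left-to-right update).
left_right = update(update(worlds, rain), wind)
# The 'deviant' and*: C[F and* G] = C[G][F] (opposite order).
right_left = update(update(worlds, wind), rain)

# For presupposition-free propositions both orders coincide, so the
# left-to-right asymmetry relevant to presupposition projection must be
# stipulated rather than derived within the dynamic framework.
print(left_right == right_left)  # → True
```

The equality of the two orders here is precisely the point: the asymmetry only becomes visible once presuppositional expressions are added, and the dynamic framework encodes it by fiat.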
Public announcement logic is an extension of multiagent epistemic logic with dynamic operators to model the informational consequences of announcements to the entire group of agents. We propose an extension of public announcement logic with a dynamic modal operator that expresses what is true after any announcement: is there an announcement after which Kφ holds? We give various semantic results and show completeness for a Hilbert-style axiomatization of this logic. There is a natural generalization to a logic for arbitrary events.
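As a hedged illustration of the underlying announcement semantics (the toy model, proposition, and function names below are assumptions, not the paper's formalization), a public announcement of ψ can be modeled as deleting the ¬ψ-worlds from an epistemic model, after which knowledge is re-evaluated:

```python
# Single-agent sketch of public announcement semantics (illustrative).
# A model is a set of worlds plus an accessibility relation (set of pairs).

def announce(worlds, relation, psi):
    """Public announcement [!psi]: delete all non-psi worlds."""
    kept = {w for w in worlds if psi(w)}
    new_rel = {(u, v) for (u, v) in relation if u in kept and v in kept}
    return kept, new_rel

def K(relation, w, phi):
    """K phi holds at w iff phi is true at every accessible world."""
    return all(phi(v) for (u, v) in relation if u == w)

# Two worlds: p is true at w1, false at w2; the agent cannot tell them apart.
worlds = {"w1", "w2"}
rel = {(u, v) for u in worlds for v in worlds}  # total uncertainty
p = lambda w: w == "w1"

print(K(rel, "w1", p))      # before the announcement: agent does not know p
W2, R2 = announce(worlds, rel, p)
print(K(R2, "w1", p))       # after announcing p: K p holds
```

The arbitrary-announcement operator of the paper then quantifies over all such announcements: it asks whether some truthful announcement leads to a model where Kφ holds.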
Judgment aggregation theory, or rather, as we conceive of it here, logical aggregation theory generalizes social choice theory by having the aggregation rule bear on judgments of all kinds instead of merely preference judgments. It derives from Kornhauser and Sager’s doctrinal paradox and List and Pettit’s discursive dilemma, two problems that we distinguish emphatically here. The current theory has developed from the discursive dilemma, rather than the doctrinal paradox, and the final objective of the paper is to give the latter its own theoretical development along the line of recent work by Dietrich and Mongin. However, the paper also aims at reviewing logical aggregation theory as such, and it covers impossibility theorems by Dietrich, Dietrich and List, Dokow and Holzman, List and Pettit, Mongin, Nehring and Puppe, Pauly and van Hees, providing a uniform logical framework in which they can be compared with each other. The review goes through three historical stages: the initial paradox and dilemma, the scattered early results on the independence axiom, and the so-called canonical theorem, a collective achievement that provided the theory with its specific method of analysis. The paper goes some way towards philosophical logic, first by briefly connecting the aggregative framework of judgment with the modern philosophy of judgment, and second by thoroughly discussing and axiomatizing the ‘general logic’ built in this framework.
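The paradox and dilemma the theory starts from can be reproduced with a toy example (the judges and their votes are invented for illustration): propositionwise majority voting over p, q and their conjunction can yield a logically inconsistent collective judgment set:

```python
# Toy discursive dilemma (votes are illustrative, not from the paper).
# Each judge holds a logically consistent judgment set over p, q, p_and_q.

judges = {
    "A": {"p": True,  "q": True,  "p_and_q": True},
    "B": {"p": True,  "q": False, "p_and_q": False},
    "C": {"p": False, "q": True,  "p_and_q": False},
}

def majority(prop):
    """Propositionwise majority vote over all judges."""
    votes = [judgment[prop] for judgment in judges.values()]
    return votes.count(True) > len(votes) / 2

collective = {prop: majority(prop) for prop in ("p", "q", "p_and_q")}
consistent = collective["p_and_q"] == (collective["p"] and collective["q"])

# Majority accepts p and accepts q, yet rejects their conjunction:
print(collective, consistent)
```

Every individual judgment set is consistent, yet the majority outcome is not; the impossibility theorems surveyed in the paper generalize this observation to whole classes of aggregation rules.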
The paper has a twofold aim. On the one hand, it provides what appears to be the first game-theoretic modeling of Napoleon’s last campaign, which ended dramatically on 18 June 1815 at Waterloo. It is specifically concerned with the decision Napoleon made on 17 June 1815 to detach part of his army against the Prussians he had defeated, though not destroyed, on 16 June at Ligny. Military historians agree that this decision was crucial but disagree about whether it was rational. Hypothesizing a zero-sum game between Napoleon and Blücher, and computing its solution, we show that it could have been a cautious strategy on the former's part to divide his army, a conclusion which runs counter to the charges of misjudgement commonly heard since Clausewitz. On the other hand, the paper addresses methodological issues. We defend its case study against the objections of irrelevance that have been raised elsewhere against “analytic narratives”, and conclude that military campaigns provide an opportunity for successful application of the formal theories of rational choice. Generalizing the argument, we finally investigate the conflict between narrative accounts – the historians' standard mode of expression – and mathematical modeling.
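A minimal sketch of the solution concept involved, with an entirely hypothetical 2×2 payoff matrix standing in for the paper's actual model: a zero-sum game has a pure-strategy solution when the row player's maximin (the cautious guarantee) equals the column player's minimax:

```python
# Illustrative zero-sum game with a saddle point in pure strategies.
# The payoffs are made up; the paper specifies the actual Napoleon/Bluecher game.

payoff = [  # rows: Napoleon's options; columns: Bluecher's options
    [3, 2],  # detach a corps against the Prussians
    [4, 0],  # keep the army concentrated
]

# Row player's cautious guarantee: best worst-case payoff.
maximin = max(min(row) for row in payoff)
# Column player's cap: smallest column maximum (column player minimizes).
minimax = min(max(payoff[i][j] for i in range(2)) for j in range(2))

# When maximin == minimax there is a saddle point; here the maximin row is
# the first one, so 'detaching' is the cautious (maximin) strategy.
print(maximin, minimax)  # → 2 2
```

With these illustrative numbers, the cautious solution selects the 'detach' row, mirroring the structure of the paper's conclusion that dividing the army could have been a cautious strategy.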
Popper's well-known demarcation criterion has often been understood to distinguish statements of empirical science according to their logical form. Implicit in this interpretation of Popper's philosophy is the belief that when the universe of discourse of the empirical scientist is infinite, empirical universal sentences are falsifiable but not verifiable, whereas the converse holds for existential sentences. A remarkable elaboration of this belief is to be found in Watkins's early work on the statements he calls “all-and-some” (AS), such as: “For every metal there is a melting point.” All-and-some statements are both universally and existentially quantified, in that order. Watkins argued that AS statements should be regarded as both nonfalsifiable and nonverifiable, for they partake in the logical fate of both universal and existential statements. This claim is subject to the proviso that the bound variables are “uncircumscribed”; i.e., that the universe of discourse is infinite.
In dynamic theories of presupposition, a trigger pp′ with presupposition p and at-issue component p′ comes with a requirement that p should be entailed by the local context of pp′. We argue that some co-speech gestures should be analyzed within a presuppositional framework, but with a twist: an expression p co-occurring with a co-speech gesture G with content g comes with the requirement that the local context of p should guarantee that p entails g; we call such assertion-dependent presuppositions ‘cosuppositions’. We show that this analysis can be combined with earlier theories of local contexts to account for complex patterns of gesture projection in quantified and in attitudinal contexts, and we compare our account to two potential alternatives: one based on supervaluations, and one, due to Cornelia Ebert, that treats co-speech gestures as supplements. We argue that the latter is correct, but for ‘post-speech’ gestures, rather than for co-speech gestures.
This article explores the conceptual and practical gap between developed and developing countries in relation to corporate social responsibility (CSR), or the North–South 'CSR Divide', by analyzing its possible impact on the competitiveness of developing countries' and economies' SMEs and MNEs in globalization. To do so, the article first reviews the traditional wisdom on the concept of strategic CSR developed in the North and the role that CSR engagement can play in corporate competitiveness, and compares this with the impact on the competitive advantage of the South through supply chains. It points out that, among the many factors that could explain the 'CSR Divide', the negative impact of CSR on comparative advantage is the ultimate reason why developing countries are reluctant and defensive toward Western-style CSR. It also shows that developing countries are changing their approaches to make CSR work in favor of their competitive position in global trade; China, for example, has started to adopt a proactive approach by becoming a CSR standards-setter. The article concludes with two policy proposals that aim to bridge the CSR gap: the first is to improve CSR standard-setting participation from both sides, and the second is to search for solutions in the international investment legal framework, which would define corporate obligations relating to CSR in a more explicit way.
We argue that a formal semantics for music can be developed, although it will be based on very different principles from linguistic semantics and will yield less precise inferences. Our framework has the following tenets: Music cognition is continuous with normal auditory cognition. In both cases, the semantic content derived from an auditory percept can be identified with the set of inferences it licenses on its causal sources, analyzed in appropriately abstract ways. What is special about music semantics is that it aggregates inferences based on normal auditory cognition with further inferences drawn on the basis of the behavior of voices in tonal pitch space. This makes it possible to define an inferential semantics but also a truth-conditional semantics for music. In particular, a voice undergoing a musical movement m is true of an object undergoing a series of events e just in case there is a certain structure-preserving map between m and e. Aspects of musical syntax might be derivable on semantic grounds from an event mereology, which also explains some cases in which tree structures are inadequate for music. Intentions and emotions may be attributed at several levels, and we speculate on possible explanations of the special relation between music and emotions.
To what extent do early intuitions about ownership depend on cultural and socio-economic circumstances? We investigated the question by testing reasoning about third-party ownership conflicts in various groups of three- and five-year-old children (N = 176), growing up in seven highly contrasted social, economic, and cultural circumstances (urban rich, poor, very poor, rural poor, and traditional) spanning three continents. Each child was presented with a series of scripts involving two identical dolls fighting over an object of possession. The child had to decide which of the two dolls should own the object. Each script enacted various potential reasons for attributing ownership: creation, familiarity, first contact, equity, plus a control/neutral condition with no suggested reasons. Results show that across cultures, children are significantly more consistent and decisive in attributing ownership when one of the protagonists created the object. Development between three and five years is more or less pronounced depending on culture. The propensity to split the object in equal halves whenever possible was generally higher in certain locations (e.g., China) and virtually nonexistent in others (e.g., Vanuatu and street children of Recife). Overall, creation reasons appear to be more primordial and stable across cultures than familiarity, relative wealth, or first contact. This trend does not correlate with the passing of false-belief theory of mind.
Conceptualist accounts of the representational content of perceptual experiences have it that a subject _S_ can experience no object, property, relation, etc., unless _S_ (i) possesses and (ii) exercises concepts for such object, property, or relation. Perceptual experiences, on such a view, represent the world in a way that is conceptual.
The paper analyses economic evaluations by distinguishing evaluative statements from actual value judgments. On this basis, it compares four solutions to the value neutrality problem in economics. After rebutting the strong theses about neutrality (normative economics is illegitimate) and non-neutrality (the social sciences are value-impregnated), the paper settles the case between the weak neutrality thesis (common in welfare economics) and a novel, weak non-neutrality thesis that extends the realm of normative economics more widely than the other weak thesis does.
There are two main approaches to the problem of donkey anaphora (e.g. If John owns a donkey, he beats it). Proponents of dynamic approaches take the pronoun to be a logical variable, but they revise the semantics of quantifiers so as to allow them to bind variables that are not within their syntactic scope. Older dynamic approaches took this measure to apply solely to existential quantifiers; recent dynamic approaches have extended it to all quantifiers. By contrast, proponents of E-type analyses take the pronoun to have the semantics of a definite description (with it ≈ the donkey, or the donkey that John owns). While competing accounts make very different claims about the patterns of coindexation that are found in the syntax, these are not morphologically realized in spoken languages. But they are in sign language, namely through locus assignment and pointing. We make two main claims on the basis of ASL and LSF data. First, sign language data favor dynamic over E-type theories: in those cases in which the two approaches make conflicting predictions about possible patterns of coindexation, dynamic analyses are at an advantage. Second, among dynamic theories, sign language data favor recent ones because the very same formal mechanism is used irrespective of the indefinite or non-indefinite nature of the antecedent. Going beyond this debate, we argue that dynamic theories should allow pronouns to be bound across negative expressions, as long as the pronoun is presupposed to have a non-empty denotation. Finally, an appendix displays and explains subtle differences between overt sign language pronouns and all other pronouns in examples involving ‘disjunctive antecedents’, and suggests that counterparts of sign language loci might be found in spoken language.