The concept of distal similarity that plays a crucial role in Edelman's theory of representation is called into question in this commentary on theoretical as well as empirical grounds. A possible confusion between shape and (knowledge of) its referent, the problem of the subjective world, redundancy, and large individual differences in subjective space encountered in contrived universes are discussed.
Conventional categories of creativity are being deconstructed after the so-called postmodern debate. This article takes this process deeper, to what we will show is the hidden subtext of gender underlying how creativity has been socially constructed. It also proposes a more contextualized approach to creativity that takes into account both its individual and social dimensions and how this relates to what Eisler (1987) has called a partnership rather than dominator model of society.
This article develops a theory for how caring behavior fits into the makeup of humans and other mammals. Biochemical evidence for three major patterns of response to stressful or otherwise complex situations is reviewed. There is the classic fight-or-flight response; the dissociative response, involving emotional withdrawal and disengagement; and the bonding response, a variant of which Taylor et al. (2000) called tend-and-befriend. All three of these responses can be explained as adaptations that have been selected for in evolution and are shared between humans and other mammals. Yet each of us contains varying tendencies toward all of these responses. How does development interact with genes to influence these tendencies? How do individuals, societies, and institutions make choices between these types of responses? We review the evidence, based on behavioral, lesion, single-cell, and brain imaging studies, for cortical-subcortical interactions involved in all three of these response types, and propose partial neural network models for some of these interactions. We propose that the orbitomedial prefrontal cortex mediates this choice process. This area of prefrontal cortex performs this mediation through its connections with areas of sensory and association cortex that represent social contexts or stimuli, and with areas of the hypothalamus, limbic system, and autonomic nervous system that represent emotional states or classes of response patterns. The article concludes with implications of our theory for social interactions and institutions. We argue that despite the wide prevalence of fight-or-flight responses, the bonding, caring responses remain available. We show with historical and contemporary examples how social settings – whether in education, work places, families, politics, or informal social customs – can be designed to support and enhance the natural caring responses of the brain.
There is growing consensus that we need a new paradigm if we are to solve the global problems that are the result of actions and policies stemming from prevailing paradigms or cognitive maps. Theories are cognitive maps. This article summarizes cultural transformation theory, which proposes that to solve our mounting global problems we need a clearer understanding of the self-organizing interaction of two basic movements in cultural evolution. The first consists of technological phase changes, including the most recent shift from industrial to electronic, nuclear, and biochemical technologies. The second consists of shifts in a system's orientation to what, based on three decades of transdisciplinary research, the author identifies as the socio-economic, gender, and cultural configurations characteristic of the dominator and partnership models. The article calls for a reassessment of earlier theories as the basis for effective action to accelerate the shift to a world orienting to the partnership rather than dominator model as a basis for a sustainable, equitable, and peaceful future.
The ambiguous material identity of nanotechnology is a minor mystery of the history of contemporary science. This paper argues that nanotechnology functioned primarily in discourses of social, not physical or biological science, the problematic knowledge at stake concerning the economic value of state-supported basic science. The politics of taxonomy in the United States Department of Energy’s Office of Basic Energy Sciences in the 1990s reveals how scientists invoked the term as one of several competing and equally valid candidates for reframing materials sciences in ways believed consonant with the political tenor of the time. The resulting loss of conceptual clarity in the sociology of science traces ultimately to the struggle to bridge the disjunction between the promissory economy of federal basic science and the industrial economy, manifested in attempts to reconcile the precepts of linearity and interdisciplinarity in changing socio-economic conditions over a half century.
This volume is a collection of essays presented at the 31st International Wittgenstein Symposium, Kirchberg, in August 2008. It has the character of a high-quality journal issue. There is no introduction, and the papers do not all directly bear on the topic of the original conference, which was "Reduction and Elimination in Philosophy and the Sciences". In what follows, I offer a short description of each paper, and add critical remarks in some cases.
Rudolf Carnap’s Der logische Aufbau der Welt (The Logical Structure of the World) is generally conceived of as being the failed manifesto of logical positivism. In this paper we will consider the following question: How much of the Aufbau can actually be saved? We will argue that there is an adaptation of the old system which satisfies many of the demands of the original programme. In order to defend this thesis, we have to show how a new ‘Aufbau-like’ programme may solve or circumvent the problems that affected the original Aufbau project. In particular, we are going to focus on how a new system may address the well-known difficulties in Carnap’s Aufbau concerning abstraction, dimensionality, and theoretical terms.
In discussions about whether the Principle of the Identity of Indiscernibles is compatible with structuralist ontologies of mathematics, it is usually assumed that individual objects are subject to criteria of identity which somehow account for the identity of the individuals. Much of this debate concerns structures that admit of non-trivial automorphisms. We consider cases from graph theory that violate even weak formulations of PII. We argue that (i) the identity or difference of places in a structure is not to be accounted for by anything other than the structure itself and that (ii) mathematical practice provides evidence for this view. We want to thank Leon Horsten, Jeff Ketland, Øystein Linnebo, John Mayberry, Richard Pettigrew, and Philip Welch for valuable comments on drafts of this paper. We are especially grateful to Fraser MacBride for correcting our interpretation of two of his papers and for other helpful comments.
On the basis of impossibility results on probability, belief revision, and conditionals, it is argued that conditional beliefs differ from beliefs in conditionals qua mental states. Once this is established, it will be pointed out in what sense conditional beliefs are still conditional, even though they may lack conditional contents, and why it is permissible to still regard them as beliefs, although they are not beliefs in conditionals. Along the way, the main logical, dispositional, representational, and normative properties of conditional beliefs are studied, and it is explained how the failure to distinguish conditional beliefs from beliefs in conditionals can lead philosophical and empirical theories astray.
One of the fundamental problems of epistemology is to say when the evidence in an agent’s possession justifies the beliefs she holds. In this paper and its sequel, we defend the Bayesian solution to this problem by appealing to the following fundamental norm: Accuracy: An epistemic agent ought to minimize the inaccuracy of her partial beliefs. In this paper, we make this norm mathematically precise in various ways. We describe three epistemic dilemmas that an agent might face if she attempts to follow Accuracy, and we show that the only inaccuracy measures that do not give rise to such dilemmas are the quadratic inaccuracy measures. In the sequel, we derive the main tenets of Bayesianism from the relevant mathematical versions of Accuracy to which this characterization of the legitimate inaccuracy measures gives rise, but we also show that Jeffrey conditionalization has to be replaced by a different method of update in order for Accuracy to be satisfied.
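The quadratic inaccuracy measures mentioned in this abstract can be illustrated by their best-known instance, the Brier score; the notation below is an illustrative sketch, not quoted from the paper:

```latex
% Quadratic (Brier-style) inaccuracy of a credence function b at a world w,
% where F is a finite set of propositions and w(X) = 1 if X is true at w, else 0:
I(b, w) \;=\; \sum_{X \in F} \bigl( w(X) - b(X) \bigr)^{2}
```

On such a measure, perfect accuracy (credence 1 in every truth, 0 in every falsehood) yields inaccuracy 0, and inaccuracy grows with the squared distance of each credence from the actual truth value.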
What kinds of sentences with truth predicate may be inserted plausibly and consistently into the T-scheme? We state an answer in terms of dependence: those sentences which depend directly or indirectly on non-semantic states of affairs (only). In order to make this precise we introduce a theory of dependence according to which a sentence is said to depend on a set of sentences iff the truth value of the former supervenes on the presence or absence of the members of the latter in/from the extension of the truth predicate. Both the sentence and the members of the set are allowed to contain the truth predicate. On that basis we are able to define notions such as ungroundedness or self-referentiality within a classical semantics, and we can show that there is an adequate definition of truth for the class of sentences which depend on non-semantic states of affairs.
We investigate the research programme of dynamic doxastic logic (DDL) and analyze its underlying methodology. The Ramsey test for conditionals is used to characterize the logical and philosophical differences between two paradigmatic systems, AGM and KGM, which we develop and compare axiomatically and semantically. The importance of Gärdenfors’s impossibility result on the Ramsey test is highlighted by a comparison with Arrow’s impossibility result on social choice. We end with an outlook on the prospects and the future of DDL.
If an agent believes that the probability of E being true is 1/2, should she accept a bet on E at even odds or better? Yes, but only given certain conditions. This paper is about what those conditions are. In particular, we think that there is a condition that has been overlooked so far in the literature. We discovered it in response to a paper by Hitchcock (2004) in which he argues for the 1/3 answer to the Sleeping Beauty problem. Hitchcock argues that this credence follows from calculating her fair betting odds, plus the assumption that Sleeping Beauty’s credences should track her fair betting odds. We will show that this last assumption is false. Sleeping Beauty’s credences should not follow her fair betting odds due to a peculiar feature of her epistemic situation.
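The fair-betting-odds calculation behind the 1/3 answer can be reconstructed as a standard sketch (not quoted from Hitchcock's paper): suppose Sleeping Beauty is offered, at each waking, a ticket costing x that pays 1 if the coin landed Heads. With Heads (one waking) she nets 1 − x; with Tails (two wakings) she nets −2x. Setting her expected gain to zero gives her fair price:

```latex
\mathbb{E}[\text{gain}] \;=\; \tfrac{1}{2}(1 - x) \;+\; \tfrac{1}{2}(-2x)
\;=\; \tfrac{1}{2} \;-\; \tfrac{3}{2}\,x \;=\; 0
\quad\Longrightarrow\quad x \;=\; \tfrac{1}{3}
```

The paper's point is precisely that this fair betting price need not coincide with her rational credence.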
I started out as a student of physics, hard-working, interested, but alas, not ‘in love’ with my subject. Then logic struck, and having become interested in this subject for various reasons – including the fascinating personality of my first teacher – I switched after my candidate’s program, to take two master’s degrees, in mathematics and in philosophy. The beauty of mathematics was clear to me at once, with the amazing power, surprising twists, and indeed the music, of abstract arguments. As our professor of Analysis wrote at the time in our study guide, “Mathematics is about the delight in the purity of trains of thought”, and old-fashioned though this phrasing sounded in the revolutionary 1960s, it did resonate with me. Then I had the privilege of being taught set-theoretic topology by a group of brilliant students around De Groot, our leading expert at the time, who worked with Moore’s method of discovering a subject for oneself. Topology unfolded from a few definitions and examples to real theorems that we had to prove ourselves – and the take-home exam took sleepless nights, as it included proving some results from scratch which came from a recent dissertation (as it turned out later). Only at the very end did De Groot appear, to give one lecture on Tychonoff’s Theorem where an application was made of the Axiom of Choice, a sacral act only to be performed by tenured full professors.
Some authors have claimed that ante rem structuralism has problems with structures that have indiscernible places. In response, I argue that there is no requirement that mathematical objects be individuated in a non-trivial way. Metaphysical principles and intuitions to the contrary do not stand up to ordinary mathematical practice, which presupposes an identity relation that, in a sense, cannot be defined. In complex analysis, the two square roots of –1 are indiscernible: anything true of one of them is true of the other. I suggest that i functions like a parameter in natural deduction systems. I gave an early version of this paper at a workshop on structuralism in mathematics and science, held in the Autumn of 2006, at Bristol University. Thanks to the organizers, particularly Hannes Leitgeb, James Ladyman, and Øystein Linnebo, to my commentator Richard Pettigrew, and to the audience there. The paper also benefited considerably from a preliminary session at the Arché Research Centre at the University of St Andrews. I am indebted to my colleagues Craige Roberts, for help with the linguistics literature, and Ben Caplan and Gabriel Uzquiano, for help with the metaphysics. Thanks also to Hannes Leitgeb and Jeffrey Ketland for reading an earlier version of the manuscript and making helpful suggestions. I also benefited from conversations with Richard Heck, John Mayberry, Kevin Scharp, and Jason Stanley.
One of the fundamental problems of epistemology is to say when the evidence in an agent’s possession justifies the beliefs she holds. In this paper and its prequel, we defend the Bayesian solution to this problem by appealing to the following fundamental norm: Accuracy: An epistemic agent ought to minimize the inaccuracy of her partial beliefs. In the prequel, we made this norm mathematically precise; in this paper, we derive its consequences. We show that the two core tenets of Bayesianism follow from the norm, while the characteristic claim of the Objectivist Bayesian follows from the norm along with an extra assumption. Finally, we consider Richard Jeffrey’s proposed generalization of conditionalization. We show not only that his rule cannot be derived from the norm, unless the requirement of Rigidity is imposed from the start, but further that the norm reveals it to be illegitimate. We end by deriving an alternative updating rule for those cases in which Jeffrey’s is usually supposed to apply.
The so-called Paradox of Serious Possibility is usually regarded as showing that the standard axioms of belief revision do not apply to belief sets that are introspectively closed. In this article we argue to the contrary: we suggest a way of dissolving the Paradox of Serious Possibility so that introspective statements are taken to express propositions in the standard sense, which may thus be proper members of belief sets, and accordingly the normal axioms of belief revision apply to them. Instead, the paradox is avoided by making explicit, for any occurrence of an introspective modality in the object language, the belief state to which this occurrence refers; this makes it impossible for any doxastic modality to refer to two distinct belief sets within one and the same context of doxastic appraisal. By this move the standard derivation of a contradiction from the theory of belief revision in the presence of introspectively closed belief sets no longer goes through, and indeed the premisses of the Paradox of Serious Possibility become jointly consistent once they are reformulated with our amended introspective modalities only. Additionally, we present a probabilistic version of the Paradox of Serious Possibility which can be avoided in a perfectly analogous manner.
Focusing on early child pretend play from the perspective of developmental psychology, this article puts forward and presents evidence for two claims. First, such play constitutes an area of remarkable individual mastery of second-order intentionality (or 'theory of mind'): in pretence with others, young children grasp the basic intentional structure of pretending as a non-serious fictional form of action. Second, early social pretend play embodies shared or collective we-intentionality. Pretending with others is one of the ontogenetically primary instances of truly cooperative actions. And it is a, perhaps the, primordial form of cooperative action with rudimentary rule-governed, institutional structure: in joint pretence games, children are aware that objects collectively get assigned fictional status, 'count as' something, and that this creates a normative space of warranted moves in the game. Developmentally, pretend play might even be a cradle for institutional phenomena more generally.
We show that a set of prima facie plausible assumptions on the relation of meaning resemblance – one of which is a compositionality postulate – is inconsistent. On this basis we argue that either there is no theoretically useful notion of semantic resemblance at all, or the traditional conception of the compositionality of meaning has to be adapted. In the former case, arguments put forward by Nelson Goodman and Paul Churchland in favor of the concept of meaning resemblance are defeated. In the other case, it must be possible to account for 'degrees of compositionality' or for other refinements of compositionality that are compatible with meaning resemblance.
We show that finitely axiomatized first-order theories that involve some criterion of identity for entities of a category C can be reformulated as conjunctions of a non-triviality statement and a criterion of identity for entities of category C again. From this, we draw two conclusions: First, criteria of identity can be very strong deductively. Second, although the criteria of identity that are constructed in the proof of the theorem are not good ones intuitively, it is difficult to say what exactly is wrong with them once the modern metaphysical view of identity criteria is presupposed.
We investigate the conditions under which quasianalysis, i.e., Carnap's method of abstraction in his Aufbau, yields adequate results. In particular, we state both necessary and sufficient conditions for the so-called faithfulness and fullness of quasianalysis, and analyze adequacy as the conjunction of faithfulness and fullness. It is shown that there is no method of (re-)constructing properties from similarity that delivers adequate results in all possible cases, if the same set of individuals is presupposed for properties and for similarity, and if similarity is a relation of finite arity. The theory is applied to various examples, including Russell's construction of temporal instants and Carnap's constitution of the phenomenal counterparts to quality spheres. Our results explain why the former is adequate while the latter is bound to fail.
This monograph provides a new account of justified inference as a cognitive process. In contrast to the prevailing tradition in epistemology, the focus is on low-level inferences, i.e., those inferences that we are usually not consciously aware of and that we share with the nearby cat, which infers that the bird she sees picking grains from the dirt is able to fly. Presumably, such inferences are not generated by explicit logical reasoning, but logical methods can be used to describe and analyze them. Part 1 gives a purely system-theoretic explication of belief and inference. Part 2 adds a reliabilist theory of justification for inference, with a qualitative notion of reliability being employed. Part 3 recalls and extends various systems of deductive and nonmonotonic logic and thereby explains the semantics of absolute and high reliability. In Part 4 it is proven that qualitative neural networks are able to draw justified deductive and nonmonotonic inferences on the basis of distributed representations. This is derived from a soundness/completeness theorem with regard to cognitive semantics of nonmonotonic reasoning. The appendix extends the theory both logically and ontologically, and relates it to A. Goldman's reliability account of justified belief. This text will be of interest to epistemologists and logicians, to all computer scientists who work on nonmonotonic reasoning and neural networks, and to cognitive scientists.
If □ is conceived as an operator, i.e., an expression that, applied to a formula, gives another formula, the expressive power of the language is severely restricted when compared to a language where □ is conceived as a predicate, i.e., an expression that yields a formula if it is applied to a term. This consideration favours the predicate approach. The predicate view, however, is threatened mainly by two problems: some obvious predicate systems are inconsistent, and possible-worlds semantics for predicates of sentences has not been developed very far. By introducing possible-worlds semantics for the language of arithmetic plus the unary predicate □, we tackle both problems. Given a frame ⟨W,R⟩ consisting of a set W of worlds and a binary relation R on W, we investigate whether we can interpret □ at every world in such a way that □A holds at a world w ∈ W if and only if A holds at every world v ∈ W such that wRv. The arithmetical vocabulary is interpreted by the standard model at every world. Several paradoxes (like Montague's Theorem, Gödel's Second Incompleteness Theorem, McGee's Theorem on the ω-inconsistency of certain truth theories, etc.) show that many frames, e.g., reflexive frames, do not allow for such an interpretation. We present sufficient and necessary conditions for the existence of a suitable interpretation of □ at any world. Sound and complete semi-formal systems, corresponding to the modal systems K and K4, for the class of all possible-worlds models for predicates and all transitive possible-worlds models are presented. We apply our account also to nonstandard models of arithmetic and to other languages than the language of arithmetic.
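The central semantic clause described in this abstract can be displayed formally; the notation here is illustrative, with ⌜A⌝ standing for the code of the sentence A:

```latex
% Box is a predicate of (codes of) sentences, interpreted world by world,
% mimicking the usual operator clause of Kripke semantics:
w \models \Box(\ulcorner A \urcorner)
\;\Longleftrightarrow\;
\forall v \in W \,\bigl( wRv \;\Rightarrow\; v \models A \bigr)
```

The paradoxes cited in the abstract arise because, on the predicate reading, ⌜A⌝ is a term of the arithmetical language, so diagonalization can produce self-referential sentences that no such interpretation of □ can accommodate on certain frames.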
According to Tarski's Convention T, the adequacy of a truth definition is (implicitly) defined relative to a translation mapping from the object language to the metalanguage; the translation mapping itself is left unspecified. This paper restates Convention T in a form in which the relativity to translation is made explicit. The notion of an interpreted language is introduced, and a corresponding notion of a translation between interpreted languages is defined. The latter definition is stated both in an algebraic version and in an equivalent possible-worlds version. It is a consequence of our definition that translation is indeterminate in certain cases. Finally, we give an application of our revised version of Convention T and show that there exist interpreted languages which allow for vicious self-reference but which nevertheless contain their own truth predicate. This is possible if truth is based on a nonstandard translation mapping by which, e.g., the Liar sentence is translated to its own negation. In this part of the paper this existence result is proved only for languages without quantifiers; in Part B the result will be extended to first-order languages.
This paper deals with the class of axiomatic theories of truth for semantically closed languages, where the theories do not allow for standard models; i.e., those theories cannot be interpreted as referring to the natural number codes of sentences only (for an overview of axiomatic theories of truth in general, see Halbach). We are going to give new proofs for two well-known results in this area, and we also prove a new theorem on the nonstandardness of a certain theory of truth. The results indicate that the proof strategies for all the theorems on the nonstandardness of such theories are "essentially" of the same kind of structure.
Young children interpret some acts performed by adults as normatively governed, that is, as capable of being performed either rightly or wrongly. In previous experiments, children have made this interpretation when adults introduced them to novel acts with normative language (e.g. ‘this is the way it goes’), along with pedagogical cues signaling culturally important information, and with social-pragmatic marking that this action is a token of a familiar type. In the current experiment, we exposed children to novel actions with no normative language, and we systematically varied pedagogical and social-pragmatic cues in an attempt to identify which of them, if either, would lead children to normative interpretations. We found that young 3-year-old children inferred normativity without any normative language and without any pedagogical cues. The only cue they used was adult social-pragmatic marking of the action as familiar, as if it were a token of a well-known type (as opposed to performing it as if inventing it on the spot). These results suggest that – in the absence of explicit normative language – young children interpret adult actions as normatively governed based mainly on the intentionality (perhaps signaling conventionality) with which they are performed.
The difficulties with formalizing the intensional notions necessity, knowability and omniscience, and rational belief are well-known. If these notions are formalized as predicates applying to (codes of) sentences, then from apparently weak and uncontroversial logical principles governing these notions, outright contradictions can be derived. Tense logic is one of the best understood and most extensively developed branches of intensional logic. In tense logic, the temporal notions future and past are formalized as sentential operators rather than as predicates. The question therefore arises whether the notions that are investigated in tense logic can be consistently formalized as predicates. In this paper it is shown that the answer to this question is negative. The logical treatment of the notions of future and past as predicates gives rise to paradoxes due to the specific interplay between both notions. For this reason, the tense paradoxes that will be presented are not identical to the paradoxes referred to above.
Interpreted dynamical systems are dynamical systems with an additional interpretation mapping by which propositional formulas are assigned to system states. The dynamics of such systems may be described in terms of qualitative laws for which a satisfaction clause is defined. We show that the systems C and CL of nonmonotonic logic are adequate with respect to the corresponding description of the classes of interpreted ordered and interpreted hierarchical systems, respectively. Inhibition networks, artificial neural networks, logic programs, and evolutionary systems are instances of such interpreted dynamical systems, and thus our results entail that each of them may be described correctly and, in a sense, even completely by qualitative laws that obey the rules of a nonmonotonic logic system.
Werning applies a theorem by Hodges in order to put forward an argument against Quine’s thesis of the indeterminacy of translation (understood as a thesis on meaning, not on reference) and in favour of what Werning calls ‘semantic realism’. We show that the argument rests on two critical premises both of which are false. The reasons for these failures are explained and the actual place of this application of Hodges’ theorem within Quine’s philosophy of language is outlined.
According to Paul Snowdon, one directly perceives an object x iff one is in a position to make a true demonstrative judgement of the form “That is x”. Whenever one perceives an object x indirectly (or dependently, as Snowdon puts it) it is the case that there exists an item y (which is not identical to x) such that one can count as demonstrating x only if one acknowledges that y bears a certain relation to x. In this paper I argue that what we hear directly are sounds, and that material objects (such as violins and goldfinches) are only indirectly heard. However, there are cases of auditory object perception that should count as direct: some blind persons’ ears are so sensitive to the way sound waves are modified by things in their surroundings that they can detect objects such as other persons, fences or trees. Interestingly, objects localized in this way make themselves felt via a kind of pressure in the perceiver’s face (that is why the phenomenon is commonly called “facial vision”), though the perception is phenomenally quite different from hearing. Since, to some degree, most people are able to conclude from the way it sounds that, say, they stand at the foot of a concrete wall (when there is enough traffic noise around), we can imagine situations where two persons perceive the same wall, one indirectly (demonstratively apprehending sounds) and the other directly (demonstratively apprehending nothing but the wall). These cases invite us to discuss the role phenomenology plays in determining whether an object is perceived directly or indirectly.
If one looks in textbooks and the relevant reference works for a characterization of epistemology, one finds broad agreement concerning the basic questions of this discipline. In the first volume of the Enzyklopädie Philosophie und Wissenschaftstheorie edited by Jürgen Mittelstraß, for instance, one finds the following entry: epistemology, a fundamental discipline of philosophy whose subject matter is the answer to the question of the conditions of justified knowledge. In the classical sense this included the questions of the origin, the nature, and the limits of knowledge ('the science of the nature and the principles of knowledge, of its [logical] origin, its sources, conditions and presuppositions, and of the scope and the limits of knowledge', R. Eisler, Wörterbuch der philosophischen Begriffe 1, Berlin, 4th ed. 1927, 389). (576f.) And in the most recent standard work, the Routledge Encyclopedia of Philosophy edited by Edward Craig, Peter Klein writes.
In this paper we investigate two purely syntactical notions of circularity, which we call “self-application” and “self-inclusion.” A language containing self-application allows linguistic items to be applied to themselves. In a language allowing for self-inclusion there are expressions which include themselves as a proper part. We introduce axiomatic systems of syntax which include identity criteria and existence axioms for such expressions. The consistency of these axiom systems will be shown by providing a variety of different models – these models being our circular languages. Finally we will show what a possible semantics for these circular languages could look like.
This article suggests that scientific philosophy, especially mathematical philosophy, might be one important way of doing philosophy in the future. Along the way, the article distinguishes between different types of scientific philosophy; it mentions some of the scientific methods that can serve philosophers; it aims to undermine some worries about mathematical philosophy; and it tries to make clear why in certain cases the application of mathematical methods is necessary for philosophical progress.
In this paper, I work through the possible contours of an anti-genocide based on a framework informed by the work of Giorgio Agamben. Such a framework posits the inherent need to circumvent sovereign power within any form of normative activism. To begin, I show how the nascent anti-genocide movement promotes an ideal in which ‘Western’ states, particularly the USA, accept the global responsibility to protect persecuted life beyond national boundaries. Using Agamben, I argue that this vision also entails an acceptance of a sovereign framework for the valuation of life, thus failing to confront the inherent power of the sovereign to condemn life in the first place. I then highlight the limitations that Agamben's ontology places on us in dealing with this inherent problem within the sovereign-subject relationship. By positing an alternative ontology, I suggest the possibility of establishing communities of solidarity that challenge the sovereign's self-ascribed role as the absolute valuator of life. Counter to Agamben, I argue that the basis for such communities could be a dedication to the universal sacredness of human life, which is maintained independently of, and in challenge to, sovereign power.
Rau, D. Die Ethik R. Saadjas.--Neumark, D. Saadya's philosophy.--Vajda, G. Saadia Gaon et l'amour courtois.--Diesendruck, Z. Saadya's formulation of the time-argument for creation.--Altmann, A. Saadya's conception of the law.--Vajda, G. Saʻadyā commentateur du "Livre de la création."--Vajda, G. Études sur Saadia.--Harkavy, A. Fragments of anti-Karaite writings of Saadiah in the Imperial Public Library at St. Petersburg.--Eisler, M. Vorlesungen über die jüdischen Philosophen des Mittelalters.
We will present a new lottery-style paradox on counterfactuals and chance. The upshot will be: combining natural assumptions on (i) the truth values of ordinary counterfactuals, (ii) the conditional chances of possible but non-actual events, (iii) the manner in which (i) and (ii) relate to each other, and (iv) a fragment of the logic of counterfactuals leads to disaster. In contrast with the usual lottery-style paradoxes, logical closure under conjunction—that is, in this case, the rule of Agglomeration of (consequents of) counterfactuals—will not play a role in the derivation and will not be entailed by our premises either. We will sketch four obvious but problematic ways out of the dilemma, and we will end up with a new resolution strategy that is non-obvious but (as we hope) less problematic: contextualism about what counts as a proposition. This proposal will not just save us from the paradox, it will also save each premise in at least some context, and it will be motivated by independent considerations from measure theory and probability theory.
Humans have always played a crucial role in the evolutionary dynamics of agricultural biodiversity and thus there is a strong relationship between these resources and human cultures. These agricultural resources have long been treated as a global public good, and constitute the livelihoods of millions of predominantly poor people. At the same time, agricultural biodiversity is under serious threat in many parts of the world despite extensive conservation efforts. Ethical considerations regarding the collecting, research, and use of agricultural biodiversity are currently topics of great concern. For example, easy access to genetic resources for breeding purposes is important, but international agreements and legal frameworks are necessary to ensure adequate recognition of the contributions of local communities and traditional farmers in creating and nurturing these resources. Here, we assess ethical principles in the context of existing codes of conduct that are relevant for agro-biodiversity researchers. We aim to create awareness among scientists and policy makers who are concerned with agro-biodiversity research and its potential impact on local communities. We encourage a serious assessment of the ethical principles presented here and hope to facilitate an integration of these principles into the reader’s personal ethical framework. Key ethical principles considered here include the importance of obtaining prior informed consent, equity, and the inalienability of rights of local communities and farmers.
The Prisoner’s Dilemma (PD) is widely used to model social interaction between unrelated individuals in the study of the evolution of cooperative behaviour in humans and other species. Many effective mechanisms and promotive scenarios have been studied which allow for small founding groups of cooperative individuals to prevail even when all social interaction is characterised as a PD. Here, a brief critical discussion of the role of the PD as the most prominent tool in cooperation research is presented, followed by two new objections to such an exclusive focus on PD-based models of social interaction. It is highlighted that only 2 of the 726 combinatorially possible strategically unique ordinal 2x2 games have the detrimental characteristics of a PD and that the frequency of PD-type games in a space of games with random payoffs does not exceed about 3.5%. Although these purely mathematical considerations do not compellingly imply that the relevance of PDs is overestimated, it is proposed that, in the absence of convergent empirical information about the ancestral human social niche, this finding can be interpreted in favour of a so far rather neglected answer to the question of how the founding groups of human cooperation themselves came to cooperate: Behavioural and/or psychological mechanisms which evolved for other, possibly more frequent, social interaction situations might have been applied to PD-type dilemmas only later. Human cooperative behaviour might thus partly have begun as a cooptation.
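The frequency claim can be probed with a short simulation. The sketch below assumes one common formalization of a symmetric PD — independent uniform payoffs with the strict ordering T > R > P > S plus the condition 2R > T + S — which need not match the paper's exact game classification, so the resulting figure is only indicative:

```python
import random

def is_pd(t, r, p, s):
    """One common formalization of a symmetric Prisoner's Dilemma:
    strict ordering T > R > P > S, plus 2R > T + S (mutual cooperation
    beats taking turns exploiting each other)."""
    return t > r > p > s and 2 * r > t + s

def pd_frequency(trials=400_000, seed=0):
    """Monte Carlo estimate of how often four independent uniform
    payoffs realize a PD under the definition above."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        t, r, p, s = (rng.random() for _ in range(4))
        hits += is_pd(t, r, p, s)
    return hits / trials

print(f"estimated PD frequency: {pd_frequency():.4f}")
```

Under this particular definition the exact probability is 1/32 = 3.125%: the strict ordering of four i.i.d. values has probability 1/24, and, given that ordering, R (the larger of the two middle values) exceeds the midpoint of S and T with probability 3/4. The paper's ~3.5% figure presumably rests on a somewhat different game space, so the two numbers should not be expected to coincide exactly.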
Many studies show that punishment, although able to stabilize cooperation at high levels, destroys gains, which makes it less efficient than alternatives with no punishment. Standard public goods games (PGGs) in fact show exactly these patterns. However, both evolutionary theory and real world institutions give reason to expect institutions with punishment to be more efficient, particularly in the long run. Long-term cooperative partnerships with punishment threats for non-cooperation should outperform defection-prone non-punishing ones. This article demonstrates that fieldwork data from hunter-gatherers, common pool resource management cases and even PGGs support this hypothesis. Although earnings in PGGs with a punishment option may be lower at the beginning, efficiency increases dramatically over time. Most ten-period PGGs cannot capture this change because their time horizon is too short.
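The trade-off described here — punishment lowers earnings within a round while deterring future defection — can be made concrete with a standard linear public goods game. The parameters below (endowment 20, multiplier 1.6, a 3:1 fine-to-cost ratio) are illustrative choices in the spirit of common laboratory designs, not the paper's field data:

```python
def pgg_payoffs(contributions, endowment=20, multiplier=1.6):
    """Standard linear public goods game: each player keeps what they
    did not contribute plus an equal share of the multiplied pot."""
    share = multiplier * sum(contributions) / len(contributions)
    return [endowment - c + share for c in contributions]

def punish(payoffs, punishers, target, cost=1.0, fine=3.0):
    """Peer punishment: each punisher pays `cost` to impose `fine`
    on the target player."""
    out = list(payoffs)
    for i in punishers:
        out[i] -= cost
        out[target] -= fine
    return out

# Three full cooperators and one free rider (player 3):
# pot share = 1.6 * 60 / 4 = 24, so cooperators earn 24, the free rider 44.
base = pgg_payoffs([20, 20, 20, 0])

# If all three cooperators punish, the free rider loses 3 * 3 = 9 and
# each punisher pays 1; total group earnings drop from 116 to 104.
punished = punish(base, punishers=[0, 1, 2], target=3)
```

The within-round efficiency loss (116 vs. 104 here) is the short-run pattern the abstract mentions; the long-run gain arises only when the punishment threat deters defection in later rounds, which is exactly what short ten-period designs struggle to show.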
I explore Baier, Held, Okin, Code, Noddings, and Eisler on trust and distrust. This reveals a need for reflection on the analysis, ethics, and dynamics of trust and distrust, especially the distinction between trusting and taking for granted, the feasibility of choosing greater trust, and the possibility of moving from situations of warranted distrust to trust. It is impossible to overcome the need for trust through surveillance, recourse to contracts, or legal institutions.
We argue that giving up on the closure of rational belief under conjunction comes with a substantial price. Either rational belief is closed under conjunction, or else the epistemology of belief has a serious diachronic deficit over and above the synchronic failures of conjunctive closure. The argument for this, which can be viewed as a sequel to the preface paradox, is called the ‘review paradox’; it is presented in four distinct, but closely related versions.
The literatures on both authentic leadership and behavioral integrity have argued that leader integrity drives follower performance. Yet, despite overlap in conceptualization and mechanisms, no research has investigated how authentic leadership and behavioral integrity relate to one another in driving follower performance. In this study, we propose and test the notion that authentic leadership behavior is an antecedent to perceptions of leader behavioral integrity, which in turn affects follower affective organizational commitment and follower work role performance. Analysis of a survey of 49 teams in the service industry supports the proposition that authentic leadership is related to follower affective organizational commitment, fully mediated through leader behavioral integrity. Next, we found that authentic leadership and leader behavioral integrity are related to follower work role performance, fully mediated through follower affective organizational commitment. These relationships hold when controlling for ethical organizational culture.
The literature on common pool resource (CPR) governance lists numerous factors that influence whether a given CPR system achieves ecological long-term sustainability. Up to now there is no comprehensive model to integrate these factors or to explain success within or across cases and sectors. Difficulties include the absence of large-N studies (Poteete 2008), the incomparability of single case studies, and the interdependence of factors (Agrawal and Chhatre 2006). We propose (1) a synthesis of 24 success factors based on the current SES framework and a literature review; (2) the application of neural networks to a database of CPR management case studies in an attempt to test the viability of this synthesis. This method allows us to obtain an implicit quantitative and rather precise model of the interdependencies in CPR systems. Given such a model, every success factor in each case can be manipulated separately, yielding different predictions for success. This could become a fast and inexpensive way to analyze, predict and optimize performance for communities world-wide facing CPR challenges. Existing theoretical frameworks could be improved as well.
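As a rough illustration of the proposed counterfactual probing, the sketch below trains a single logistic unit — a minimal stand-in for the fuller neural-network model — on hypothetical binary success factors, then flips one factor of a case to obtain a different success prediction. All data, factor labels, and parameters here are invented for illustration:

```python
import math
import random

def train_logistic(X, y, lr=0.5, epochs=500, seed=0):
    """Minimal single-unit 'network' (logistic regression) fitted by
    stochastic gradient descent on the cross-entropy loss."""
    rng = random.Random(seed)
    n = len(X[0])
    w = [rng.uniform(-0.1, 0.1) for _ in range(n)]
    b = 0.0
    for _ in range(epochs):
        for xs, t in zip(X, y):
            z = b + sum(wi * xi for wi, xi in zip(w, xs))
            p = 1 / (1 + math.exp(-z))
            g = p - t  # gradient of the cross-entropy loss w.r.t. z
            b -= lr * g
            w = [wi - lr * g * xi for wi, xi in zip(w, xs)]
    return w, b

def predict(w, b, xs):
    """Predicted probability of 'success' for a factor profile."""
    z = b + sum(wi * xi for wi, xi in zip(w, xs))
    return 1 / (1 + math.exp(-z))

# Hypothetical toy cases: columns are binary success factors (e.g.
# "clear boundaries", "graduated sanctions", "external support"); the
# label records whether the CPR system was judged sustainable.
X = [[1, 1, 0], [1, 0, 1], [0, 0, 1], [0, 1, 0], [1, 1, 1], [0, 0, 0]]
y = [1, 1, 0, 0, 1, 0]
w, b = train_logistic(X, y)

# Manipulate a single factor of one case and compare predictions:
baseline = predict(w, b, [0, 1, 1])
tweaked = predict(w, b, [1, 1, 1])  # first factor flipped on
```

Flipping one input while holding the rest fixed yields a different predicted success probability, which is the kind of factor-by-factor counterfactual analysis the abstract envisions — though the actual proposal uses a richer network over 24 factors and real case data.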