Abstract. Suppose that several individuals who have separately assessed prior probability distributions over a set of possible states of the world wish to pool their individual distributions into a single group distribution, while taking into account jointly perceived new evidence. They have the option of (i) first updating their individual priors and then pooling the resulting posteriors or (ii) first pooling their priors and then updating the resulting group prior. If the pooling method that they employ is such that they arrive at the same final distribution in both cases, the method is said to be externally Bayesian, a property first studied by Madansky (1964). We show that a pooling method for discrete distributions is externally Bayesian if and only if it commutes with Jeffrey conditioning, parameterized in terms of certain ratios of new to old odds, as in Wagner (2002), rather than in terms of the posterior probabilities of members of the disjoint family of events on which such conditioning originates.
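The externally Bayesian property can be checked directly for geometric (log-linear) pooling, the classic example of such a method. The sketch below uses illustrative priors, weights, and likelihoods, none of which come from the paper:

```python
# Minimal sketch: geometric pooling commutes with Bayesian updating
# when the weights sum to 1 (the externally Bayesian property).
# All numbers below are hypothetical.

def normalize(p):
    s = sum(p)
    return [x / s for x in p]

def geometric_pool(dists, weights):
    """Pool distributions p_k via p(i) proportional to prod_k p_k(i)**w_k."""
    n = len(dists[0])
    pooled = [1.0] * n
    for p, w in zip(dists, weights):
        for i in range(n):
            pooled[i] *= p[i] ** w
    return normalize(pooled)

def bayes_update(prior, likelihood):
    """Posterior proportional to prior times likelihood."""
    return normalize([p * l for p, l in zip(prior, likelihood)])

priors = [[0.5, 0.3, 0.2], [0.2, 0.2, 0.6]]   # two individuals, three states
weights = [0.7, 0.3]                           # weights summing to 1
likelihood = [0.9, 0.5, 0.1]                   # jointly perceived evidence

# (i) update individually, then pool
pool_of_posteriors = geometric_pool(
    [bayes_update(p, likelihood) for p in priors], weights)
# (ii) pool first, then update
posterior_of_pool = bayes_update(geometric_pool(priors, weights), likelihood)

assert all(abs(x - y) < 1e-12
           for x, y in zip(pool_of_posteriors, posterior_of_pool))
```

The two routes agree because the common likelihood factors out of the weighted geometric product when the weights sum to one.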
Cornelius Castoriadis is one of the very few social and political philosophers - modern and ancient - for whom a concept of imagination is truly central. In his work, however, the role of imagination is so overarching that it becomes difficult to grasp its workings and consequences in detail, in particular in its relation to democracy as the political form in which autonomy is the core imaginary signification. This article will proceed by first suggesting some clarifications about Castoriadis's employment of the concept. This preparatory exploration will allow us in a second step to discuss why the idea of democracy is closely linked to tragedy, and why this linkage in turn is dependent on the centrality of imagination for human action. In a third conceptual step, finally, we suggest that any concept of imagination will need to take into account the plurality and diversity of the outcomes of the power of imagination. Thus, the question of the nature of the novelty that imagination creates needs to be addressed, as well as the question of the agon in the face of different imagined innovations in a given democratic political setting. As a consequence of this shift in emphasis, to be elaborated further, one will be able to say more about a question of which Castoriadis was well aware, though he never addressed it in detail himself: the decline and end of polities and political forms, the question of political mortality. Nathalie Karagiannis and Peter Wagner, University of Barcelona. Critical Horizons: A Journal of Philosophy & Social Theory 13(1), 2012, pp. 12-28. Online ISSN 1568-5160; Print ISSN 1440-9917.
A decision problem in which the values of the decision variables must sum to a fixed positive real number s is called an "allocation problem," and the problem of aggregating the allocations of n experts the "allocation aggregation problem." Under two simple axiomatic restrictions on aggregation, the only acceptable allocation aggregation method is based on weighted arithmetic averaging (Lehrer and Wagner, Rational Consensus in Science and Society, 1981). In this note it is demonstrated that when the values assigned to the variables are restricted to a finite set (as is always the case in practice), the aforementioned axioms allow only dictatorial aggregation.
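The weighted-averaging result, and the obstacle a finite value set creates for it, can be illustrated with hypothetical expert allocations (all numbers below are made up, not from the note):

```python
# Sketch: weighted arithmetic averaging of allocations preserves the
# constraint that the variables sum to s, but its output generally
# falls outside any finite grid of allowable values. Illustrative data.

s = 100.0
allocations = [[50.0, 30.0, 20.0],   # expert 1's allocation of s
               [20.0, 60.0, 20.0],   # expert 2's allocation
               [40.0, 40.0, 20.0]]   # expert 3's allocation
weights = [0.5, 0.3, 0.2]            # nonnegative weights summing to 1

group = [sum(w * a[i] for w, a in zip(weights, allocations))
         for i in range(3)]

# The sum constraint survives weighted arithmetic averaging.
assert abs(sum(group) - s) < 1e-9

# But if allowable values are restricted to multiples of 10 (a finite
# set), the average -- roughly [39, 41, 20] -- falls off the grid: the
# kind of obstacle behind the dictatorship result for finite value sets.
grid = set(range(0, 101, 10))
assert not all(round(v, 6) in grid for v in group)
```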
Garber (1983) and Jeffrey (1991, 1995) have both proposed solutions to the old evidence problem. Jeffrey's solution, based on a new probability revision method called reparation, has been generalized to the case of uncertain old evidence and probabilistic new explanation in Wagner 1997, 1999. The present paper reformulates some of the latter work, highlighting the central role of Bayes factors and their associated uniformity principle, and extending the analysis to the case in which an hypothesis bears on a countable family of evidentiary propositions. This extension shows that no Garber-type approach is capable of reproducing the results of generalized reparation.
We extend previous work of Lehrer and Wagner, and of McConway, on the consensus of probabilities, showing under axioms similar to theirs that (1) a belief function consensus of belief functions on a set with at least three members and (2) a belief function consensus of Bayesian belief functions on a set with at least four members must take the form of a weighted arithmetic mean. We observe that these results are unchanged when consensual uncertainty measures are allowed to take the form of Choquet capacities of low order monotonicity.
Taking his critique of totalitarianizing conceptions of community as a starting point, this text examines Jean-Luc Nancy's "ontology of plural singular being" for its political implications. It argues that while at first this ontology seems to advocate a negative or an anti-politics only, it can also be read as a "theory of communicative praxis" that suggests a certain ethos - in the form of a certain use of symbols (which is expressed only inaptly by the word "style") - that would render the ontological plurality of singulars perceptible and practically effective. Finally, some recent texts by Nancy even sidestep the ontology of being-with and face the question of what politics, faced with demands of justice, could be and what a democratic politics could provide. Both of these aspects in Nancy's work, however, still remain to be spelled out more politically.
By advocating an enlightened method of theorizing committed to thinking in terms of a system of differences, Luhmann has contributed to the development of sociology in a manner that cannot be praised enough. Nonetheless, he does not succeed in giving an account of his own position that satisfies the very logical preconditions that he himself has formulated for it. Instead, his systems theory paradigm of sociology is based on metaphysical premises characteristic of the identity-logical thought of "Old Europe." In fact, the only way to make Luhmann's approach truly comprehensible is to reconstruct it as a new version of Hegel's dialectic.
Part I: Dimensions of time's enigma -- Is time real? -- Eleaticism, temporality, and time -- The makings of a temporal universe -- Pastness and futurity -- Synchronicity and synchronicity -- Temporal pace and measurement -- Presentness or the present -- Aristotle's real account of time -- Parmenidean time and the impossible now -- Cosmic motion and the speed of time -- Time as the motion of the cosmos -- Time as the cosmos itself -- Time as motion and all change -- Temporal cognition and the return of the now -- Real temporality in an Aristotelian world -- Does Aristotle refute eleaticism? -- Bisection argument I -- Bisection argument II -- Bisection argument III -- Plotinus' vitalistic platonism and the real origins of time -- Temporality, eternality, and Plotinus' new metaphysic -- Plotinus' critique of Aristotelian motion -- Indefinite temporality and the measure of motion -- Plotinus' neoplatonic account of time.
This paper argues in defense of the anti-reductionist consensus in the philosophy of biology. More specifically, it takes issue with Alex Rosenberg's recent challenge of this position. We argue that the results of modern developmental genetics, rather than eliminating the need for functional kinds in explanations of development, actually reinforce their importance.
Systems involving many interacting variables are at the heart of the natural and social sciences. Causal language is pervasive in the analysis of such systems, especially when insight into their behavior is translated into policy decisions. This is exemplified by economics, but to an increasing extent also by biology, due to the advent of sophisticated tools to identify the genetic basis of many diseases. It is argued here that a regularity notion of causality can only be meaningfully defined for systems with linear interactions among their variables. For the vastly more important class of nonlinear systems, no such notion is likely to exist. This thesis is developed with examples of dynamical systems taken mostly from mathematical biology. It is discussed with particular reference to the problem of causal inference in complex genetic systems, systems for which often only statistical characterizations exist.
Individuals are faced with many opportunities to pirate. The decision to pirate or not may be related to an individual's attitudes toward other ethical issues. A person's ethical and moral predispositions and the judgments that they use to make decisions may be consistent across various ethical dilemmas and may indicate their likelihood to pirate software. This paper investigates the relationship between religion and a theoretical ethical decision making process that an individual uses when evaluating ethical or unethical situations. An ethical decision making model was studied for general unethical scenarios and for the unethical behavior of software piracy. The research model was tested via path analysis using structural equation modeling and was found to be appropriate for the sample data. The results suggest that there is a relationship between religion and the stages of an ethical decision making process regarding general ethical situations and software piracy.
In discussing the works of 16th-century theorists Francisco de Vitoria and Alberico Gentili, this article examines how two different conceptions of a global legal community affect the legal character of the international order and the obligatory force of international law. For Vitoria the legal bindingness of ius gentium necessarily presupposes an integrated character of the global commonwealth that leads him, as it were, to ascribe legal personality to the global community as a whole. But then its legal status and its consequences have to be clarified. For Gentili, on the other hand, sovereign states in their plurality are the pinnacle of the legal order(s). His model of a globally valid ius gentium then oscillates between being analogous to private law, dependent on individual acceptance by states, and being natural law, appearing in a certain sense rather as a form of morality than of law.
In this paper we argue that an operational organism concept can help to overcome the structural deficiency of mathematical models in biology. In our opinion, the structural deficiency of mathematical models lies mainly in our inability to identify functionally relevant biological characters in biological systems, and not so much in a lack of adequate mathematical representations of biological processes. We argue that the problem of character identification in biological systems is linked to the question of a properly formulated organism concept. Lastly, we demonstrate how a decomposition of an organism into independent characters in the context of a specific biological process--such as adaptation by means of natural selection--depends on the dynamical properties and invariance conditions of the equations that describe this process.
Descartes' procedure in "Meditation II" must be brought into line with his claim that "we must never ask about the existence of anything until we first understand its essence." And Descartes' "Meditation III" claim that he is aware of his mind's power to cause ideas must be grounded in a prior discovery of this power. Both demands are met by reading "Meditation II" as a progressive clarification of the nature of mind, with the investigation of the wax providing the discovery of the mind's generative power. This process of discovery also provides the meanings of "thinking" and "existing" -- as "causing ideas" and "exercising causal power", respectively. Thus the discovery of the mind's nature also grounds the cogito. I provide a close reading of the wax investigation which supports this view.
It has often been recommended that the differing probability distributions of a group of experts should be reconciled in such a way as to preserve each instance of independence common to all of their distributions. When probability pooling is subject to a universal domain condition, along with state-wise aggregation, there are severe limitations on implementing this recommendation. In particular, when the individuals are epistemic peers whose probability assessments are to be accorded equal weight, universal preservation of independence is, with a few exceptions, impossible. Under more reasonable restrictions on pooling, however, there is a natural method of preserving the independence of any fixed finite family of countable partitions, and hence of any fixed finite family of discrete random variables.
Jonathan Weisberg claims that certain probability assessments constructed by Jeffrey conditioning resist subsequent revision by a certain type of after-the-fact defeater of the reasons supporting those assessments, and that such conditioning is thus “inherently anti-holistic.” His analysis founders, however, in applying Jeffrey conditioning to a partition for which an essential rigidity condition clearly fails. Applied to an appropriate partition, Jeffrey conditioning is amenable to revision by the sort of after-the-fact defeaters considered by Weisberg in precisely the way that he demands.
It is shown that the Fisher smoking problem and Newcomb's problem are decision-theoretically identical, each having at its core an identical case of Simpson's paradox for certain probabilities. From this perspective, incorrect solutions to these problems arise from treating them as cases of decision-making under risk, while adopting certain global empirical conditional probabilities as the relevant subjective probabilities. The most natural correct solutions employ the methodology of decision-making under uncertainty with lottery acts, with certain local empirical conditional probabilities adopted as the relevant subjective probabilities.
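The Simpson's-paradox core can be exhibited with a toy genotype-smoking table; the counts below are purely illustrative, not taken from the paper:

```python
# Illustrative sketch of a Simpson reversal: within each genotype the
# disease rate is lower among smokers, yet the aggregate disease rate
# is higher among smokers. All counts are hypothetical.

data = {
    # genotype: {group: (disease_count, total)}
    "gene_present": {"smoker": (72, 90), "nonsmoker": (9, 10)},
    "gene_absent":  {"smoker": (1, 10),  "nonsmoker": (18, 90)},
}

def rate(disease, total):
    return disease / total

# Local (within-genotype) conditional probabilities: smoking looks better.
for genotype, cells in data.items():
    assert rate(*cells["smoker"]) < rate(*cells["nonsmoker"])

# Global (aggregate) conditional probabilities: smoking looks worse.
agg = {g: (sum(data[k][g][0] for k in data), sum(data[k][g][1] for k in data))
       for g in ("smoker", "nonsmoker")}
assert rate(*agg["smoker"]) > rate(*agg["nonsmoker"])
```

The reversal arises because genotype is correlated with smoking in these counts, which is why the local rather than the global conditional probabilities are the decision-relevant ones.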
Common wisdom holds that communication is impossible when messages are costless and communicators have totally opposed interests. This article demonstrates that such wisdom is false. Non-convergent dynamics can sustain partial information transfer even in a zero-sum signalling game. In particular, I investigate a signalling game in which messages are free, the state-act payoffs resemble rock–paper–scissors, and senders and receivers adjust their strategies according to the replicator dynamic. This system exhibits Hamiltonian chaos and trajectories do not converge to equilibria. This persistent out-of-equilibrium behaviour results in messages that do not perfectly reveal the sender's private information, but do transfer information as quantified by the Kullback–Leibler divergence. This finding shows that adaptive dynamics can enable information transmission even though messages at equilibria are meaningless. This suggests a new explanation for the evolution or spontaneous emergence of meaning: non-convergent adaptive dynamics.
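The Kullback–Leibler measure of information transfer mentioned here can be sketched for a generic sender strategy; the strategy matrix below is hypothetical and stands in for whatever out-of-equilibrium behaviour the dynamics produces:

```python
# Hedged sketch: quantifying partial information transfer as the KL
# divergence of the receiver's posterior over states, given a message,
# from the prior. The sender strategy is made up for illustration.
from math import log

def posterior(prior, sender, msg):
    """P(state | msg) by Bayes' rule from P(msg | state)."""
    joint = [prior[i] * sender[i][msg] for i in range(len(prior))]
    z = sum(joint)
    return [j / z for j in joint]

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    return sum(pi * log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

prior = [1/3, 1/3, 1/3]            # three states, as in rock-paper-scissors
sender = [[0.8, 0.1, 0.1],         # rows: P(message | state); noisy, so
          [0.1, 0.8, 0.1],         # messages only partially reveal the
          [0.1, 0.1, 0.8]]         # sender's private information

infos = [kl(posterior(prior, sender, m), prior) for m in range(3)]
assert all(i > 0 for i in infos)   # each message carries some information
```

A fully revealing strategy would push each divergence to log 3 nats; a fully pooling one would push them to zero, which is the sense in which the divergence interpolates between meaningless and perfectly informative signals.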
We establish a probabilized version of modus tollens, deriving from p(E|H) = a and p(Ē) = b the best possible bounds on p(H̄). In particular, we show that p(H̄) → 1 as a, b → 1, and also as a, b → 0. 1 Introduction 2 Probabilities of conditionals 3 Conditional probabilities 3.1 Adams' thesis 3.2 Modus ponens for conditional probabilities 3.3 Modus tollens for conditional probabilities.
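The limiting behaviour can be recovered from the feasibility constraints on the joint distribution. The bound below is derived here from those constraints and is a reconstruction, not the paper's own statement: writing h = p(H), nonnegativity of p(H̄ ∧ Ē) = b − (1−a)h and of p(H̄ ∧ E) = (1−b) − ah forces h ≤ min(1, b/(1−a), (1−b)/a), so p(H̄) is at least 1 minus that quantity.

```python
# Reconstructed sketch of the best lower bound on p(not-H) given
# p(E|H) = a and p(not-E) = b, derived from joint-distribution
# feasibility (not quoted from the paper).

def lower_bound_not_h(a, b):
    """Best possible lower bound on p(not-H)."""
    candidates = [1.0]
    if a < 1:
        candidates.append(b / (1 - a))      # from p(not-H and not-E) >= 0
    if a > 0:
        candidates.append((1 - b) / a)      # from p(not-H and E) >= 0
    return 1 - min(candidates)

# The limiting behaviour the abstract reports:
assert lower_bound_not_h(0.999, 0.999) > 0.99   # a, b -> 1  =>  bound -> 1
assert lower_bound_not_h(0.001, 0.001) > 0.99   # a, b -> 0  =>  bound -> 1
# At a = b = 0.5 the constraints are vacuous and the bound is trivial:
assert lower_bound_not_h(0.5, 0.5) == 0.0
```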
The outdated intentionalistic assumptions manifest in Habermas's Theory of Communicative Action undermine a solution to the problem of order in action theory beyond utilitarianism. An analysis of his intersubjectivistic conception, which is based on the theory of the speech act, shows that the incompleteness of Habermas's linguistic turn is due to his attempt to revive the older Critical Theory's concept of critique. The claims for a scientifically well-founded revival of a universal concept of reason, which are asserted in this concept, invalidate the intersubjectivistic paradigm in action theory and therefore obstruct the way to a de-individualized formulation of the theory of social contract that avoids the paradox of utilitarian models.
Jeffrey has devised a probability revision method that increases the probability of hypothesis H when it is discovered that H implies previously known evidence E. A natural extension of Jeffrey's method likewise increases the probability of H when E has been established with sufficiently high probability and it is then discovered, quite apart from this, that H confers sufficiently higher probability on E than does its logical negation H̄.
The right interpretation of subjective probability is implicit in the theories of upper and lower odds, and upper and lower previsions, developed, respectively, by Cedric Smith (1961) and Peter Walley (1991). On this interpretation you are free to assign contingent events the probability 1 (and thus to employ conditionalization as a method of probability revision) without becoming vulnerable to a weak Dutch book.
In axiomatic approaches to expert opinion aggregation, so-called independence conditions have been ubiquitous. Such conditions dictate that the group value assigned to each decision variable should depend only on the values assigned by individuals to that variable, taking no account of values that they assign to other variables. This radically anti-holistic stricture on the synthesis of expert opinion severely limits the set of allowable aggregation methods. As we show, the limitations are particularly acute in the case of three or more variables which must be assigned nonnegative real values summing to a fixed positive real number s. For if the subset V of [0, s] comprising the allowable values of the variables satisfies the closure conditions (i) 0 ∈ V; (ii) if x ∈ V, then s − x ∈ V; and (iii) if x, y ∈ V and x + y ∈ [0, s], then x + y ∈ V, then, if V is finite, which is always the case in practice, subjecting the aggregation of such s-allocations to an independence condition allows only for dictatorial or imposed (i.e., constant) aggregation.
The so-called "non-commutativity" of probability kinematics has caused much unjustified concern. When identical learning is properly represented, namely, by identical Bayes factors rather than identical posterior probabilities, then sequential probability-kinematical revisions behave just as they should. Our analysis is based on a variant of Field's reformulation of probability kinematics, divested of its (inessential) physicalist gloss.
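The point about Bayes factors can be made concrete: when successive Jeffrey revisions over a partition are specified by Bayes factors (ratios of new to old odds) rather than by target posteriors, revision is coordinate-wise multiplication and the order of learning cannot matter. The factors below are illustrative:

```python
# Sketch: Jeffrey conditioning parameterized by Bayes factors commutes.
# Revision over a partition by Bayes factors beta: q_i proportional to
# beta_i * p_i. Two such revisions multiply, hence order is irrelevant.
# Numbers are hypothetical.

def jeffrey_by_bayes_factors(p, beta):
    q = [b * pi for b, pi in zip(beta, p)]
    z = sum(q)
    return [x / z for x in q]

p = [0.2, 0.5, 0.3]        # prior over a three-cell partition
beta1 = [3.0, 1.0, 0.5]    # first learning experience (new-to-old odds)
beta2 = [0.4, 2.0, 1.0]    # second learning experience

order12 = jeffrey_by_bayes_factors(jeffrey_by_bayes_factors(p, beta1), beta2)
order21 = jeffrey_by_bayes_factors(jeffrey_by_bayes_factors(p, beta2), beta1)

assert all(abs(x - y) < 1e-12 for x, y in zip(order12, order21))
```

By contrast, fixing identical *posterior* probabilities for the second experience would overwrite whatever the first had done, which is the spurious "non-commutativity" the abstract dissolves.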
This article analyses the link between innovation with high social benefits and corporate social performance (CSP) and the role that family firms play in this. This theme is particularly relevant given the large number of firms that are family-owned. Also, the implicit potential of innovation to reconcile corporate sustainability aspects with profitability justifies an extended analysis of this link. Governments often support socially beneficial innovation with various policy instruments, with the intention of increasing international competitiveness and simultaneously supporting sustainable development. In parallel, firms pursue corporate social responsibility (CSR) and environmental management activities partly in the hope that this will foster such innovation in their organisation (alongside their main aim of improving CSP). Hence, the main research question of this article is about the association of CSP with innovation with high social benefits and the determinants of the potential moderation of this association. Based on panel data, the article analyses the link between CSP and innovation, and the effect of being a family firm using panel estimation techniques. The results point to a moderating role of family firms on the link between innovation with high social benefits and CSP. The article concludes by assessing the policy implications of this insight.
We define an R-group to be a stable group with the property that a generic element (for any definable transitive group action) can only be algebraic over a generic. We then derive some corollaries for R-groups and fields, and prove a decomposition theorem and a field theorem. As a nonsuperstable example, we prove that small stable groups are R-groups.
Divided into two parts this book examines the train of social theory from the 19th century, through to the `organization of modernity', in relation to ideas of social planning, and as contributors to the `rationalistic revolution' of the `golden age' of capitalism in the 1950s and 60s. Part two examines key concepts in the social sciences. It begins with some of the broadest concepts used by social scientists: choice, decision, action and institution and moves on to examine the `collectivist alternative': the concepts of society, culture and polity, which are often dismissed as untenable by postmodernists today. This is a major contribution to contemporary social theory and provides a host of essential insights into the task of social science today.