Dienes & Perner (D&P) argue that nondeclarative knowledge can take multiple forms. We provide empirical support for this from two related lines of research about the development of mathematical reasoning. We then describe how different forms of procedural and declarative knowledge can be effectively modeled in Anderson's ACT-R theory, contrasting this computational approach with D&P's logical approach. The computational approach suggests that the commonly observed developmental progression from more implicit to more explicit knowledge can be viewed as a consequence of accumulating and strengthening mental representations.
Gesture does not have a fixed position in the Dienes & Perner framework. Its status depends on the way knowledge is expressed. Knowledge reflected in gesture can be fully implicit (neither factuality nor predication is explicit) if the goal is simply to move a pointing hand to a target. Knowledge reflected in gesture can be explicit (both factuality and predication are explicit) if the goal is to indicate an object. However, gesture is not restricted to these two extreme positions. When gestures are unconscious accompaniments to speech and represent information that is distinct from speech, the knowledge they convey is factuality-implicit but predication-explicit.
This study investigated whether activating elements of prior knowledge can influence how problem solvers encode and solve simple mathematical equivalence problems (e.g., 3 + 4 + 5 = 3 + _). Past work has shown that such problems are difficult for elementary school students (McNeil & Alibali, 2000). One possible reason is that children’s experiences in math classes may encourage them to think about equations in ways that are ultimately detrimental. Specifically, children learn a set of patterns that are potentially problematic (McNeil & Alibali, 2005): the perceptual pattern that all equations follow an “operations = answer” format, the conceptual pattern that the equal sign means “calculate the total,” and the procedural pattern that the correct way to solve an equation is to perform all of the given operations on all of the given numbers. Upon viewing an equivalence problem, knowledge of these patterns may be reactivated, leading to incorrect problem solving. We hypothesized that these patterns may negatively affect problem solving by influencing what people encode about the problems. To test this hypothesis in children would require strengthening their misconceptions, and this could be detrimental to their mathematical development. Therefore, we tested this hypothesis in undergraduate participants. Participants completed either control tasks or tasks that activated their knowledge of the three patterns, and were then asked to reconstruct and solve a set of equivalence problems. Participants in the knowledge activation condition encoded the problems less well than control participants. They also made more errors in solving the problems, and their errors resembled the errors children make when solving equivalence problems. Moreover, encoding performance mediated the effect of knowledge activation on equivalence problem solving. Thus, one way in which experience may affect equivalence problem solving is by influencing what students encode about the equations.
According to the SIMS model, mimicry and simulation contribute to perceivers' understanding of smiles. We argue that similar mechanisms are involved in comprehending the hand gestures that people produce when speaking. Viewing gestures may elicit overt mimicry, or may evoke corresponding simulations in the minds of addressees. These real or simulated actions contribute to addressees' comprehension of speakers' gestures.
Suppose that several individuals who have separately assessed prior probability distributions over a set of possible states of the world wish to pool their individual distributions into a single group distribution, while taking into account jointly perceived new evidence. They have the option of (i) first updating their individual priors and then pooling the resulting posteriors or (ii) first pooling their priors and then updating the resulting group prior. If the pooling method that they employ is such that they arrive at the same final distribution in both cases, the method is said to be externally Bayesian, a property first studied by Madansky (1964). We show that a pooling method for discrete distributions is externally Bayesian if and only if it commutes with Jeffrey conditioning, parameterized in terms of certain ratios of new to old odds, as in Wagner (2002), rather than in terms of the posterior probabilities of members of the disjoint family of events on which such conditioning originates.
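A minimal numerical sketch of external Bayesianity (ours, not from the paper), using geometric pooling, a standard example of an externally Bayesian method, and ordinary Bayesian updating on a three-state space; all numbers and names are illustrative:

```python
# Sketch: geometric (logarithmic) pooling is externally Bayesian --
# updating-then-pooling equals pooling-then-updating.

def normalize(p):
    total = sum(p)
    return [x / total for x in p]

def bayes_update(prior, likelihood):
    # ordinary Bayesian conditioning on jointly perceived evidence
    return normalize([p * l for p, l in zip(prior, likelihood)])

def geometric_pool(dists, weights):
    # pooled(w) is proportional to the product of p_i(w)^w_i
    pooled = [1.0] * len(dists[0])
    for dist, w in zip(dists, weights):
        pooled = [p * (d ** w) for p, d in zip(pooled, dist)]
    return normalize(pooled)

p1 = [0.5, 0.3, 0.2]    # expert 1's prior (illustrative)
p2 = [0.2, 0.3, 0.5]    # expert 2's prior (illustrative)
like = [0.9, 0.5, 0.1]  # evidence, as a likelihood over the states
w = [0.6, 0.4]          # pooling weights, summing to 1

# route (i): update individually, then pool
route_i = geometric_pool([bayes_update(p1, like), bayes_update(p2, like)], w)
# route (ii): pool, then update
route_ii = bayes_update(geometric_pool([p1, p2], w), like)

assert all(abs(a - b) < 1e-12 for a, b in zip(route_i, route_ii))
```

Linear (weighted arithmetic) pooling fails this test in general, which is part of what makes the characterization in terms of Jeffrey conditioning and odds ratios substantive.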
Cornelius Castoriadis is one of the very few social and political philosophers - modern and ancient - for whom a concept of imagination is truly central. In his work, however, the role of imagination is so overarching that it becomes difficult to grasp its workings and consequences in detail, in particular in its relation to democracy as the political form in which autonomy is the core imaginary signification. This article will proceed by first suggesting some clarifications about Castoriadis's employment of the concept. This preparatory exploration will allow us in a second step to discuss why the idea of democracy is closely linked to tragedy, and why this linkage in turn is dependent on the centrality of imagination for human action. In a third conceptual step, finally, we suggest that any concept of imagination will need to take into account the plurality and diversity of the outcomes of the power of imagination. Thus, the question of the nature of the novelty that imagination creates needs to be addressed as well as the one of the agon in the face of different imagined innovations in a given democratic political setting. As a consequence of this shift in emphasis, to be elaborated further, one will be able to say more about one question of which Castoriadis was well aware, though he never addressed it himself in detail: the decline and end of polities and political forms, the question of political mortality. Nathalie Karagiannis and Peter Wagner (University of Barcelona), Critical Horizons: A Journal of Philosophy & Social Theory 13(1), 2012, pp. 12-28.
A decision problem in which the values of the decision variables must sum to a fixed positive real number s is called an "allocation problem," and the problem of aggregating the allocations of n experts the "allocation aggregation problem." Under two simple axiomatic restrictions on aggregation, the only acceptable allocation aggregation method is based on weighted arithmetic averaging (Lehrer and Wagner, Rational Consensus in Science and Society, 1981). In this note it is demonstrated that when the values assigned to the variables are restricted to a finite set (as is always the case in practice), the aforementioned axioms allow only dictatorial aggregation.
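A small sketch (ours, with illustrative numbers) of why weighted arithmetic averaging is the natural aggregation method here: it automatically preserves the sum constraint that defines an allocation problem.

```python
# Sketch: weighted arithmetic averaging of expert allocations preserves
# the constraint that each allocation sums to a fixed s.

def aggregate(allocations, weights):
    # component-wise weighted arithmetic mean of the expert allocations
    n_vars = len(allocations[0])
    return [sum(w * alloc[j] for w, alloc in zip(weights, allocations))
            for j in range(n_vars)]

s = 100.0
experts = [
    [50.0, 30.0, 20.0],   # each expert's allocation sums to s
    [20.0, 20.0, 60.0],
    [40.0, 40.0, 20.0],
]
weights = [0.5, 0.3, 0.2]  # nonnegative, summing to 1

group = aggregate(experts, weights)
assert abs(sum(group) - s) < 1e-9  # the group allocation still sums to s
```

Note the tension behind the note's result: if allowable values are restricted to a finite grid (say, multiples of 5), a non-trivially weighted average of on-grid allocations generally falls off the grid, which is why the axioms then force dictatorial aggregation.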
Garber (1983) and Jeffrey (1991, 1995) have both proposed solutions to the old evidence problem. Jeffrey's solution, based on a new probability revision method called reparation, has been generalized to the case of uncertain old evidence and probabilistic new explanation in Wagner 1997, 1999. The present paper reformulates some of the latter work, highlighting the central role of Bayes factors and their associated uniformity principle, and extending the analysis to the case in which an hypothesis bears on a countable family of evidentiary propositions. This extension shows that no Garber-type approach is capable of reproducing the results of generalized reparation.
We extend previous work of Lehrer and Wagner, and of McConway, on the consensus of probabilities, showing under axioms similar to theirs that (1) a belief function consensus of belief functions on a set with at least three members and (2) a belief function consensus of Bayesian belief functions on a set with at least four members must take the form of a weighted arithmetic mean. We observe that these results are unchanged when consensual uncertainty measures are allowed to take the form of Choquet capacities of low order monotonicity.
Taking his critique of totalitarianizing conceptions of community as a starting point, this text examines Jean-Luc Nancy's work of an "ontology of plural singular being" for its political implications. It argues that while at first this ontology seems to advocate a negative or an anti-politics only, it can also be read as a "theory of communicative praxis" that suggests a certain ethos - in the form of a certain use of symbols (which is expressed only inaptly by the word "style") that would render the ontological plurality of singulars perceptible and practically effective. Finally, some recent texts by Nancy even sidestep the ontology of being-with and face the question of what politics, faced with demands of justice, could be and what a democratic politics could provide. Both of these aspects in Nancy's work, however, still remain to be spelled out more politically.
By advocating an enlightened method of theorizing committed to thinking in terms of a system of differences, Luhmann has contributed to the development of sociology in a manner that cannot be praised enough. Nonetheless, he does not succeed in giving an account of his own position that satisfies the very logical preconditions that he himself has formulated for it. Instead, his systems theory paradigm of sociology is based on metaphysical premises characteristic of the identity-logical thought of "Old Europe." In fact, the only way to make Luhmann's approach truly comprehensible is to reconstruct it as a new version of Hegel's dialectic.
Part I: Dimensions of time's enigma -- Is time real? -- Eleaticism, temporality, and time -- The makings of a temporal universe -- Pastness and futurity -- Synchronicity and synchronicity -- Temporal pace and measurement -- Presentness or the present -- Aristotle's real account of time -- Parmenidean time and the impossible now -- Cosmic motion and the speed of time -- Time as the motion of the cosmos -- Time as the cosmos itself -- Time as motion and all change -- Temporal cognition and the return of the now -- Real temporality in an Aristotelian world -- Does Aristotle refute eleaticism? -- Bisection argument I -- Bisection argument II -- Bisection argument III -- Plotinus' vitalistic platonism and the real origins of time -- Temporality, eternality, and Plotinus' new metaphysic -- Plotinus' critique of Aristotelian motion -- Indefinite temporality and the measure of motion -- Plotinus' neoplatonic account of time.
This paper argues in defense of the anti-reductionist consensus in the philosophy of biology. More specifically, it takes issue with Alex Rosenberg's recent challenge of this position. We argue that the results of modern developmental genetics, rather than eliminating the need for functional kinds in explanations of development, actually reinforce their importance.
Systems involving many interacting variables are at the heart of the natural and social sciences. Causal language is pervasive in the analysis of such systems, especially when insight into their behavior is translated into policy decisions. This is exemplified by economics, but to an increasing extent also by biology, due to the advent of sophisticated tools to identify the genetic basis of many diseases. It is argued here that a regularity notion of causality can only be meaningfully defined for systems with linear interactions among their variables. For the vastly more important class of nonlinear systems, no such notion is likely to exist. This thesis is developed with examples of dynamical systems taken mostly from mathematical biology. It is discussed with particular reference to the problem of causal inference in complex genetic systems, systems for which often only statistical characterizations exist.
Individuals are faced with many opportunities to pirate. The decision to pirate or not may be related to an individual's attitudes toward other ethical issues. A person's ethical and moral predispositions and the judgments that they use to make decisions may be consistent across various ethical dilemmas and may indicate their likelihood to pirate software. This paper investigates the relationship between religion and a theoretical ethical decision making process that an individual uses when evaluating ethical or unethical situations. An ethical decision making model was studied for general unethical scenarios and for the unethical behavior of software piracy. The research model was tested via path analysis using structural equation modeling and was found to be appropriate for the sample data. The results suggest that there is a relationship between religion and the stages of an ethical decision making process regarding general ethical situations and software piracy.
In discussing the works of 16th-century theorists Francisco de Vitoria and Alberico Gentili, this article examines how two different conceptions of a global legal community affect the legal character of the international order and the obligatory force of international law. For Vitoria the legal bindingness of ius gentium necessarily presupposes an integrated character of the global commonwealth, which leads him, as it were, to ascribe legal personality to the global community as a whole. But then its legal status and its consequences have to be clarified. For Gentili, on the other hand, sovereign states in their plurality are the pinnacle of the legal order(s). His model of a globally valid ius gentium then oscillates between being analogous to private law, dependent on individual acceptance by states, and being natural law, appearing in a certain sense as a form of morality rather than of law.
In this paper we argue that an operational organism concept can help to overcome the structural deficiency of mathematical models in biology. In our opinion, the structural deficiency of mathematical models lies mainly in our inability to identify functionally relevant biological characters in biological systems, and not so much in a lack of adequate mathematical representations of biological processes. We argue that the problem of character identification in biological systems is linked to the question of a properly formulated organism concept. Lastly, we demonstrate how a decomposition of an organism into independent characters in the context of a specific biological process--such as adaptation by means of natural selection--depends on the dynamical properties and invariance conditions of the equations that describe this process.
Descartes' procedure in "Meditation II" must be brought into line with his claim that "we must never ask about the existence of anything until we first understand its essence." And Descartes' "Meditation III" claim that he is aware of his mind's power to cause ideas must be grounded in a prior discovery of this power. Both demands are met by reading "Meditation II" as a progressive clarification of the nature of mind, with the investigation of the wax providing the discovery of the mind's generative power. This process of discovery also provides the meanings of "thinking" and "existing" -- as "causing ideas" and "exercising causal power", respectively. Thus the discovery of the mind's nature also grounds the cogito. I provide a close reading of the wax investigation which supports this view.
It has often been recommended that the differing probability distributions of a group of experts should be reconciled in such a way as to preserve each instance of independence common to all of their distributions. When probability pooling is subject to a universal domain condition, along with state-wise aggregation, there are severe limitations on implementing this recommendation. In particular, when the individuals are epistemic peers whose probability assessments are to be accorded equal weight, universal preservation of independence is, with a few exceptions, impossible. Under more reasonable restrictions on pooling, however, there is a natural method of preserving the independence of any fixed finite family of countable partitions, and hence of any fixed finite family of discrete random variables.
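A toy illustration (ours, not the paper's) of why universal independence preservation fails under equal-weight state-wise pooling: two epistemic peers each judge events A and B independent, yet their pooled distribution does not.

```python
# Two experts over the four states (A, B), (A, ~B), (~A, B), (~A, ~B).
# Each expert treats A and B as independent; the equal-weight average
# of their distributions does not. Numbers are illustrative.

def joint(pa, pb):
    # joint distribution under which A and B are independent,
    # with marginals P(A) = pa and P(B) = pb
    return {(a, b): (pa if a else 1 - pa) * (pb if b else 1 - pb)
            for a in (True, False) for b in (True, False)}

e1 = joint(0.2, 0.2)
e2 = joint(0.8, 0.8)
pooled = {s: 0.5 * e1[s] + 0.5 * e2[s] for s in e1}  # equal weights

p_a = pooled[(True, True)] + pooled[(True, False)]   # marginal P(A) = 0.5
p_b = pooled[(True, True)] + pooled[(False, True)]   # marginal P(B) = 0.5
p_ab = pooled[(True, True)]                          # P(A and B) = 0.34

# 0.34 != 0.5 * 0.5: independence common to both experts is lost
assert abs(p_ab - p_a * p_b) > 0.05
```

The positive correlation introduced by averaging is no accident: the mixture "remembers" which expert generated the state, and A and B are informative about that.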
We establish a probabilized version of modus tollens, deriving from p(E|H) = a and p(Ē) = b the best possible bounds on p(H̄). In particular, we show that p(H̄) → 1 as a, b → 1, and also as a, b → 0. 1 Introduction 2 Probabilities of conditionals 3 Conditional probabilities 3.1 Adams' thesis 3.2 Modus ponens for conditional probabilities 3.3 Modus tollens for conditional probabilities.
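Reading the abstract's constraints as p(E|H) = a and p(not-E) = b, with bounds sought on p(not-H), the lower bound can be derived by elementary means. The closed form below is our own derivation sketch, not quoted from the paper; it does reproduce the stated limiting behavior at both corners.

```python
# Sketch (our derivation): given p(E|H) = a and p(not-E) = b, the least
# possible value of p(not-H) over all consistent probability assignments.

def mt_lower_bound(a, b):
    # Write p(E) = 1 - b = a*h + x*(1 - h), with h = p(H) and
    # x = p(E | not-H) in [0, 1]. Maximizing h subject to x in [0, 1]
    # gives h_max below, hence p(not-H) >= 1 - h_max.
    h_max = min((1 - b) / a, b / (1 - a), 1.0)
    return max(0.0, 1.0 - h_max)

# the bound approaches 1 both as a, b -> 1 and as a, b -> 0
assert mt_lower_bound(0.99, 0.99) > 0.98
assert mt_lower_bound(0.01, 0.01) > 0.98

# brute-force consistency check: every consistent (h, x) respects the bound
import random
random.seed(0)
for _ in range(1000):
    a = random.uniform(0.05, 0.95)
    h = random.uniform(0.01, 0.99)   # p(H)
    x = random.uniform(0.0, 1.0)     # p(E | not-H)
    b = 1 - (a * h + x * (1 - h))    # implied p(not-E)
    assert (1 - h) >= mt_lower_bound(a, b) - 1e-9
```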
The outdated intentionalistic assumptions manifest in Habermas's Theory of Communicative Action undermine a solution to the problem of order in action theory beyond utilitarianism. An analysis of his intersubjectivistic conception, which is based on the theory of the speech-act, shows that the incompleteness of Habermas's linguistic turn is due to his attempt to revive the older Critical Theory's concept of critique. The claims for a scientifically well-founded revival of a universal concept of reason, which are asserted in this concept, invalidate the intersubjectivistic paradigm in action theory and therefore obstruct the way to a de-individualized formulation of the theory of social contract that avoids the paradox of utilitarian models.
Jeffrey has devised a probability revision method that increases the probability of hypothesis H when it is discovered that H implies previously known evidence E. A natural extension of Jeffrey's method likewise increases the probability of H when E has been established with sufficiently high probability and it is then discovered, quite apart from this, that H confers sufficiently higher probability on E than does its logical negation H̄.
The right interpretation of subjective probability is implicit in the theories of upper and lower odds, and upper and lower previsions, developed, respectively, by Cedric Smith (1961) and Peter Walley (1991). On this interpretation you are free to assign contingent events the probability 1 (and thus to employ conditionalization as a method of probability revision) without becoming vulnerable to a weak Dutch book.
The so-called "non-commutativity" of probability kinematics has caused much unjustified concern. When identical learning is properly represented, namely, by identical Bayes factors rather than identical posterior probabilities, then sequential probability-kinematical revisions behave just as they should. Our analysis is based on a variant of Field's reformulation of probability kinematics, divested of its (inessential) physicalist gloss.
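A small sketch (ours, in the spirit of the Field-style reformulation mentioned above): a Jeffrey revision parameterized by a Bayes factor multiplies each atom's probability by the factor on the event and by 1 off it, then renormalizes, so successive revisions on different partitions commute.

```python
# Sketch: probability kinematics parameterized by Bayes factors commutes.
# The state space and numbers are illustrative.

def jeffrey_by_bayes_factor(p, event, beta):
    # multiply in the Bayes factor on the event, then renormalize;
    # the new odds on `event` are beta times the old odds
    revised = {w: pr * (beta if w in event else 1.0) for w, pr in p.items()}
    total = sum(revised.values())
    return {w: pr / total for w, pr in revised.items()}

prior = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}
E1, E2 = {1, 2}, {1, 3}   # events from two different partitions

ab = jeffrey_by_bayes_factor(jeffrey_by_bayes_factor(prior, E1, 3.0), E2, 0.5)
ba = jeffrey_by_bayes_factor(jeffrey_by_bayes_factor(prior, E2, 0.5), E1, 3.0)

assert all(abs(ab[w] - ba[w]) < 1e-12 for w in prior)  # order does not matter
```

Had the two learning experiences been represented by identical posterior probabilities for E1 and E2 instead of identical Bayes factors, the two orders would generally disagree, which is the source of the "non-commutativity" worry the abstract dissolves.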
This article analyses the link between innovation with high social benefits and corporate social performance (CSP) and the role that family firms play in this. This theme is particularly relevant given the large number of firms that are family-owned. Also the implicit potential of innovation to reconcile corporate sustainability aspects with profitability justifies an extended analysis of this link. Governments often support socially beneficial innovation with various policy instruments, with the intention of increasing international competitiveness and simultaneously supporting sustainable development. In parallel, firms pursue corporate social responsibility (CSR) and environmental management activities partly in the hope that this will foster such innovation in their organisation (alongside their main aim of improving CSP). Hence, the main research question of this article is about the association of CSP with innovation with high social benefits and the determinants of the potential moderation of this association. Based on panel data, the article analyses the link between CSP and innovation, and the effect of being a family firm using panel estimation techniques. The results point to a moderating role of family firms on the link between innovation with high social benefits and CSP. The article concludes by assessing the policy implications of this insight.
We define an R-group to be a stable group with the property that a generic element (for any definable transitive group action) can only be algebraic over a generic. We then derive some corollaries for R-groups and fields, and prove a decomposition theorem and a field theorem. As a nonsuperstable example, we prove that small stable groups are R-groups.