Heidegger's question, "How does God enter philosophy?", offers a frame for drawing some of the main lines of Josef Pieper's thought. We try here to lead philosophy back to its initial wonder, in which both a requirement to "neglect nothing" and an approval of the world may be discerned.
Ideas from cognitive science are increasingly influential and provide insight into the nature of moral judgement. Three core ideas are discussed: modern schema theory, the frequency of automatic decision-making, and implicit processes as the default mode of human information processing. The Defining Issues Test (DIT) measures the beginnings of moral understanding, which are largely non-verbal and intuitive, in contrast to the Moral Judgement Interview (MJI), which measures the highest level of verbal understanding. The positive attributes of the DIT and its conceptualisation of moral judgement schemas are more apparent in a time of increasing respect for implicit knowledge and processing. The DIT offers a means of measuring moral judgement that fits with current views in cognitive science. Although the MJI and interview techniques generally are worthwhile for measuring production competence, the DIT is better able to measure understanding at the level that drives most decisions for most people.
I argue against theories that attempt to reduce scientific representation to similarity or isomorphism. These reductive theories aim to radically naturalize the notion of representation, since they treat scientists' purposes and intentions as non-essential to representation. I distinguish between the means and the constituents of representation, and I argue that similarity and isomorphism are common but not universal means of representation. I then present four other arguments to show that similarity and isomorphism are not the constituents of scientific representation. I finish by looking at the prospects for weakened versions of these theories, and I argue that only those that abandon the aim to radically naturalize scientific representation are likely to be successful.
This paper defends an inferential conception of scientific representation. It approaches the notion of representation in a deflationary spirit, and minimally characterizes the concept as it appears in science by means of two necessary conditions: its essential directionality and its capacity to allow surrogate reasoning and inference. The conception is defended by showing that it successfully meets the objections that make its competitors, such as isomorphism and similarity, untenable. In addition, the inferential conception captures the objectivity of the cognitive representations used by science, it sheds light on their truth and completeness, and it explains the source of the analogy between scientific and artistic modes of representation.
With the aid of techniques such as functional magnetic resonance imaging, neuroscience is providing a new perspective on human behaviour. Many areas of psychology have recognised and embraced the new technologies, methodologies and relevant findings. But how do the tools of neuroscience affect the fields of moral development and moral education? This paper reviews neuroscience research germane to moral development using as an organisational framework Rest's Four Component Model of moral functioning, which proposes that moral behaviour requires moral sensitivity, moral judgement, moral motivation/focus and moral action skills. Issues such as the importance of early brain development and attachment are addressed. The authors conclude with a brief description of an integrative theory, Triune Ethics Theory, which provides an example of how moral development and neuroscience can be integrated.
Scientific representation is a currently booming topic, both in analytical philosophy and in history and philosophy of science. The analytical inquiry attempts to come to terms with the relation between theory and world; while historians and philosophers of science aim to develop an account of the practice of model building in the sciences. This article provides a review of recent work within both traditions, and ultimately argues for a practice-based account of the means employed by scientists to effectively achieve representation in the modelling sciences.
This paper defends the deflationary character of two recent views regarding scientific representation, namely RIG Hughes’ DDI model and the inferential conception. It is first argued that these views’ deflationism is akin to the homonymous position in discussions regarding the nature of truth. There, we are invited to consider the platitudes that the predicate “true” obeys at the level of practice, disregarding any deeper, or more substantive, account of its nature. More generally, for any concept X, a deflationary approach is then defined in opposition to a substantive approach, where a substantive approach to X is an analysis of X in terms of some property P, or relation R, accounting for and explaining the standard use of X. It then becomes possible to characterize a deflationary view of scientific representation in three distinct senses, namely: a “no-theory” view, a “minimalist” view, and a “use-based” view – in line with three standard deflationary responses in the philosophical literature on truth. It is then argued that both the DDI model and the inferential conception may be suitably understood in any of these three different senses. The application of these deflationary ‘hermeneutics’ moreover yields significant improvements on the DDI model, which bring it closer to the inferential conception. It is finally argued that what these approaches have in common – the key to any deflationary account of scientific representation – is the denial that scientific representation may be ultimately reduced to any substantive explanatory property of sources, or targets, or their relations.
Personality and social development begins before birth in the communication among mother, child and environment, during sensitive periods when the child’s brain and body are plastic and epigenetically co-constructed. Triune ethics theory postulates three evolved, neurobiologically-based ethics fostered by early life experience. The security ethic is self-protective. The engagement ethic is relationally attuned. The imagination ethic can abstract from the present moment and imagine alternatives. Climates and cultures can foster one or another ethic. Ancestral environments were more conducive to moral development. Individuals can adopt self-authorship of their moral character through the development of ethical expertise. Recommendations are made for research and policies that study and support optimal moral development.
In “The Toolbox of Science” (1995) together with Towfic Shomar we advocated a form of instrumentalism about scientific theories. We separately developed this view further in a number of subsequent works. Steven French, James Ladyman, Otavio Bueno and Newton Da Costa (FLBD) have since written at least eight papers and a book criticising our work. Here we defend ourselves. First we explain what we mean in denying that models derive from theory – and why their failure to do so should be lamented. Second we defend our use of the London model of superconductivity as an example. Third we point out both advantages and weaknesses of FLBD’s techniques in comparison to traditional Anglophone versions of the semantic conception. Fourth we show that FLBD’s version of the semantic conception has not been applied to our case study. We conclude by raising doubts about FLBD’s overall project.
This paper reviews four attempts throughout the history of quantum mechanics to explicitly employ dispositional notions in order to solve the quantum paradoxes, namely: Margenau’s latencies, Heisenberg’s potentialities, Maxwell’s propensitons, and the recent selective propensities interpretation of quantum mechanics. Difficulties and challenges are raised for all of them, and it is concluded that the selective propensities approach nicely encompasses the virtues of its predecessors. Finally, some strategies are discussed for reading dispositional notions into two other well-known interpretations of quantum mechanics, namely the GRW interpretation and Bohmian mechanics.
Every year in this country, some 10,000 college and university courses are taught in applied ethics. And many professional organizations now have their own codes of ethics. Yet social science has had little impact upon applied ethics. This book promises to change that trend by illustrating how social science can make a contribution to applied ethics. The text reports psychological studies relevant to applied ethics for many professionals, including accountants, college students and teachers, counselors, dentists, doctors, journalists, nurses, school teachers, athletes, and veterinarians. Each chapter begins with the research base of the cognitive-developmental approach--especially linked to Kohlberg and Rest's Defining Issues Test. Finally, the book summarizes recent research on the following issues:
* moral judgment scores within and between professions,
* pre- and post-test evaluations of ethics education programs,
* moral judgment and moral behavior,
* models of professional ethics education, and
* models for developing new assessment tools.
Researchers in different professional fields investigate different questions, develop different research strategies, and report different findings. Typically researchers of one professional field are not aware of research in other fields. An important aim of the present book is to bring this diverse research together so that cross-fertilization can occur and ideas from one field can transfer to another.
Several studies are reviewed that examine differences in moral schema development using techniques intermediate between measuring implicit knowledge (such as with the Defining Issues Test) and explicit knowledge (such as with the Moral Judgment Interview). Findings include significant differences in the comprehension of moral narratives based on age/education and on level of expertise. Also, intended moral themes in stories are not understood by younger children.
This paper argues for a broadly dispositionalist approach to the ontology of Bohmian mechanics. It first distinguishes the ‘minimal’ and the ‘causal’ versions of Bohm’s theory, and then briefly reviews some of the claims advanced on behalf of the ‘causal’ version by its proponents. A number of ontological or interpretive accounts of the wave function in BM are then addressed in detail, including configuration space, multi-field, nomological, and dispositional approaches. The main objection to each account is reviewed, namely the ‘problem of perception’, the ‘problem of communication’, the ‘problem of temporal laws’, and the ‘problem of under-determination’. It is then shown that a version of dispositionalism overcomes the under-determination problem while providing neat solutions to the other three problems. A pragmatic argument is thus furnished for the use of dispositions in the interpretation of the theory more generally. The paper ends on a more speculative note by suggesting ways in which a dispositionalist interpretation of the wave function is in addition able to shed light upon some of the claims of the proponents of the causal version of BM.
This paper expands on, and provides a qualified defence of, Arthur Fine's selective interactions solution to the measurement problem. Fine's approach must be understood against the background of the insolubility proof of the quantum measurement. I first defend the proof as an appropriate formal representation of the quantum measurement problem. The nature of selective interactions, and more generally selections, is then clarified, and three arguments in their favour are offered. First, selections provide the only known solution to the measurement problem that does not relinquish any of the explicit premises of the insolubility proofs. Second, unlike some no-collapse interpretations of quantum mechanics, selections suffer no difficulties with non-ideal measurements. Third, unlike most collapse interpretations, selections can be independently motivated by an appeal to quantum propensities.
Contents:
1. Introduction
2. The problem of quantum measurement
  2.1 The ignorance interpretation of mixtures
  2.2 The eigenstate–eigenvalue link
  2.3 The quantum theory of measurement
3. The insolubility proof of the quantum measurement
  3.1 Some notation
  3.2 The transfer of probability condition (TPC)
  3.3 The occurrence of outcomes condition (OOC)
4. A defence of the insolubility proof
  4.1 Stein's critique
  4.2 Ignorance is not required
  4.3 The problem of quantum measurement is an idealisation
5. Selections
  5.1 Representing dispositional properties
  5.2 Selections solve the measurement problem
  5.3 Selections and ignorance
6. Non-ideal selections
  6.1 No-collapse interpretations and non-ideal measurements
  6.2 Exact and approximate measurements
  6.3 Selections for non-ideal interactions
  6.4 Approximate selections
  6.5 Implications for ignorance
7. Selective interactions test quantum propensities
  7.1 Equivalence classes as physical ‘aspects’: a critique
  7.2 Quantum dispositions
  7.3 Selections as a propensity modal interpretation
  7.4 A comparison with Popper's propensity interpretation
Science is popularly understood as being an ideal of impartial algorithmic objectivity that provides us with a realistic description of the world down to the last detail. The essays collected in this book—written by some of the leading experts in the field—challenge this popular image right at its heart, taking as their starting point that science trades not only in truth, but in fiction, too. With case studies that range from physics to economics and to biology, _Fictions in Science_ reveals that fictions are as ubiquitous in scientific narratives and practice as they are in any other human endeavor, including literature and art. Of course scientific activity, most prominently in the formal sciences, employs logically precise algorithmic thinking. However, the key to the predictive and technological success of the empirical sciences might well lie elsewhere—perhaps even in scientists’ extraordinary creative imagination instead. As these essays demonstrate, within the bounds of what is empirically possible, a scientist’s capacity for invention and creative thinking matches that of any writer or artist.
This paper outlines a genuinely pragmatist conception of propensity, and defends it against common objections to the propensity interpretation of probability, prominently Humphreys’ paradox. The paper reviews the paradox and identifies one of its key assumptions, the identity thesis, according to which propensities are probabilities. The identity thesis is also involved in empiricist propensity interpretations deriving from Popper’s influential original proposal, and makes such interpretations untenable. As an alternative, I urge a return to Charles Peirce’s original insights on probabilistic dispositions, and offer a reconstructed version of his pragmatist conception, which rejects the identity thesis.
Most of human history and prehistory was lived in economic poverty but with social and ecological wealth, both of which are diminishing as commodification takes over almost everything. Human moral wealth has also deteriorated. Because humans are biosocially, dynamically, and epigenetically shaped, early experience is key for developing one’s moral capital. When early experience is species-atypical, meaning that it falls outside the evolved developmental niche, which is often the case in modern societies, biopsychosocial moral development is undermined, shifting one’s nature and worldview to self-protectionism. Individuals develop into self-regarding shadows of their potential selves, exhibiting threat-reactive moral mindsets that promote unjust treatment of other humans and nonhumans. Humanity’s moral wealth can be re-cultivated by taking up what indigenous people all over the world know: that a good life, a virtuous life, is one that is led by a well-cultivated heart, embodied in action that includes partnership with nonhumans. Moral educators can help students to revamp their capacities with self-calming skills, the development of social pleasure, and communal ecological imagination.
Commentators have generally approached the topic of intuition in Spinoza's philosophy from the perspective of what the Ethics calls intuitive science, or the third kind of knowledge. In the present paper, by contrast, we try to show that there is in Spinoza's writings a wider concept of intuition than the one implicit in intuitive science, of which the latter is no more than a sub-kind. As a preliminary step towards this aim, we try to show that Spinoza's conception of intuition is closely related to the Cartesian conception (of which we offer a brief sketch in the first section) and that both conceptions are grounded in the nature of the idea.
I argue against an account of scientific representation suggested by the semantic, or structuralist, conception of scientific theories. Proponents of this conception often employ the term “model” to refer to bare “structures”, which naturally leads them to attempt to characterize the relation between models and reality as a purely structural one. I argue instead that scientific models are typically “representations”, in the pragmatist sense of the term: they are inherently intended for specific phenomena. Therefore in general scientific models are not (merely) structures. I then explore some consequences of this pragmatist account of representation, and argue that it sheds light upon the distinction between theories and models. I finish by briefly addressing some critical comments due to Bas Van Fraassen.
This paper is divided into two parts. In part I, I argue against two attempts to naturalise the notion of scientific representation, by reducing it to isomorphism and similarity. I distinguish between the means and the constituents of representation, and I argue that isomorphism and similarity are common means of representation; but that they are not constituents of scientific representation. I look at the prospects for weakened versions of these theories, and I argue that only those that abandon the aim to naturalise scientific representation are likely to be successful. In part II of the paper, I present a deflationary conception of scientific representation, which minimally characterises it by means of two necessary conditions: representation is essentially intentional and it has the capacity to allow surrogate reasoning and inference. I then defend this conception by showing that it successfully meets the objections and difficulties that make its competitors, such as isomorphism and similarity, untenable. In addition the inferential conception explains the success of various means of representation in their appropriate domains, and it sheds light on the truth and accuracy of scientific representations.
The distinction between norms and norm-formulations commits legal theorists to treating legal norms as entities. In this article, I first explore the path from meaning to entities built by some analytical philosophers of language. Later, I present a set of problems produced by treating norms as entities. Whatever type of entities we deal with calls for a clear differentiation between the identification and individuation criteria of such entities. In the putative case of abstract entities, the differentiation collapses. By replacing the notions of the intension and extension of words with extensional and intensional aspects of what we talk about, I outline a methodological programme for Law and Legal Theory. That programme is based on the identification of normativity.
Depersonalization is characterised by a profound disruption of self-awareness, marked mainly by feelings of disembodiment and subjective emotional numbing. It has been proposed that depersonalization is caused by a fronto-limbic suppressive mechanism (presumably mediated via attention) which manifests subjectively as emotional numbing, and disables the process by which perception and cognition normally become emotionally coloured, giving rise to a subjective feeling of ‘unreality’. Our functional neuroimaging and psychophysiological studies support the above model and indicate that, compared with normal and clinical controls, DPD patients show increased prefrontal activation as well as reduced activation in insula/limbic-related areas to aversive, arousing emotional stimuli. Although a putative inhibitory mechanism on emotional processing might account for the emotional numbing and characteristic perceptual detachment, it is likely, as suggested by some studies, that parietal mechanisms underpin feelings of disembodiment and lack of agency.
This paper argues that if propensities are displayed in objective physical chances then the appropriate representation of these chances is as indexed probability functions. Two alternative formal models, or accounts, for the relation between propensity properties and their chancy or probabilistic manifestations, in terms of conditionals and conditional probability are first reviewed. It is argued that both confront important objections, which are overcome by the account in terms of indexed probabilities. A number of further advantages of the indexed probability account are discussed, which suggest that it is promising as a general theory of objective physical chance. The paper ends with a discussion of the indexical character of the objective chances that are grounded in propensities.
It is often assumed without argument that fictionalism in the philosophy of science contradicts scientific realism. This paper is a critical analysis of this assumption. The kind of fictionalism that is at present discussed in philosophy of science is characterised, and distinguished from fictionalism in other areas. A distinction is then drawn between forms of fictional representation, and two competing accounts of fiction in science are discussed. I then outline explicitly what I take to be the argument for the incompatibility of scientific realism with fictionalism. I argue that some of its premises are unwarranted, and are moreover questionable from a fictionalist perspective. The conclusion is that fictionalism is neutral in the realism-antirealism debate, pulling neither for nor against scientific realism.
There has been an intense discussion, albeit largely an implicit one, concerning the inference of causal hypotheses from statistical correlations in quantum mechanics ever since John Bell’s first statement of his notorious theorem in 1966. As is well known, its focus has mainly been the so-called Einstein-Podolsky-Rosen (“EPR”) thought experiment, and the ensuing observed correlations in real EPR-like experiments. But although implicitly the discussion goes as far back as Bell’s work, it is only in the last two decades that it has become recognizably and explicitly a debate about causal inference in the quantum realm. The bulk of this paper is devoted to a review of three influential arguments in the philosophical literature that aim to show that causal models for the EPR correlations are impossible, due to Bas Van Fraassen, Daniel Hausman and Huw Price. I contend that all these arguments are inconclusive since they contain premises or presuppositions that are false, unwarranted, or at least controversial. Five different common cause models are outlined that seem perfectly viable for the EPR correlations. These models are then employed to illustrate various difficulties with the premises and presuppositions underlying Van Fraassen’s, Hausman’s and Price’s arguments. In all these cases it is argued that the difficulties cut deep against these authors’ own theories of causation and causal inference. My conclusions are that causal models for the EPR correlations remain viable, that philosophical work is still required to assess their relative virtues, and that in any case the mere theoretical conceivability and empirical possibility of these models sheds doubts over Van Fraassen’s, Hausman’s and (important elements in) Price’s theories of causation and causal inference.
Peter Milne and Neal Grossman have argued against Popper's propensity interpretation of quantum mechanics, by appeal to the two-slit experiment and to the distinction between mixtures and superpositions, respectively. In this paper I show that a different propensity interpretation successfully meets their objections. According to this interpretation, the possession of a quantum propensity by a quantum system is independent of the experimental set-ups designed to test it, even though its manifestations are not.
This article provides a state-of-the-art review of the philosophical literature on scientific representation. It first argues that the topic emerges historically mainly out of what may be called the modelling tradition. It then introduces a number of helpful analytical distinctions, and goes on to divide contemporary approaches to scientific representation into two distinct kinds, substantive and deflationary. Analogies with related discussions of artistic representation in aesthetics, and of the nature of truth in metaphysics are pursued. It is finally urged that the most promising approaches - and the ones most likely to feature prominently in future developments - are deflationary. In particular, a defence is provided of a genuinely inferential conception of representation.
It is widely accepted in contemporary philosophy of science that the domain of application of a theory is typically larger than its explanatory covering power: theories can be applied to phenomena that they do not explain. I argue for an analogous thesis regarding the notion of empirical adequacy. A theory's domain of application is typically larger than its domain of empirical adequacy: theories are often applied to phenomena from which they receive no empirical confirmation.
I analyse critically what I regard as the most accomplished empiricist account of propensities, namely the long run propensity theory developed by Donald Gillies. Empiricist accounts are distinguished by their commitment to the ‘identity thesis’: the identification of propensities and objective probabilities. These theories are intended, in the tradition of Karl Popper’s influential proposal, to provide an interpretation of probability that renders probability statements directly testable by experiment. I argue that the commitment to the identity thesis leaves empiricist theories, including Gillies’ version, vulnerable to a variant of what is known as Humphreys’ paradox. I suggest that the tension may be resolved only by abandoning the identity thesis, and by adopting instead an understanding of propensities as explanatory properties of chancy objects.
This paper expands on, and provides a qualified defence of, Arthur Fine’s selective interactions solution to the measurement problem. Fine’s approach must be understood against the background of the insolubility proof of the quantum measurement. I first defend the proof as an appropriate formal representation of the quantum measurement problem. Then I clarify the nature of selective interactions, and more generally selections, and I go on to offer three arguments in their favour. First, selections provide the only known solution to the measurement problem that does not relinquish any of the premises of the insolubility proofs. Second, unlike some no-collapse interpretations of quantum mechanics, selections suffer no difficulties with non-ideal measurements. Third, unlike most collapse interpretations, selections can be independently motivated by an appeal to quantum dispositions.
I argue that the Causal Markov Condition (CMC) is in principle applicable to the Einstein–Podolsky–Rosen (EPR) correlations. This is in line with my defence in the past of the applicability of the Principle of Common Cause to quantum mechanics. I first review a contrary claim by Dan Hausman and Jim Woodward, who endeavour to preserve the CMC against a possible counterexample by asserting that the conditions for the application of the CMC are not met in the EPR experiment. In their view the CMC is inapplicable to the EPR correlations—i.e. it neither obtains nor fails. The view is grounded upon the non-separability of the quantum state, and the consequent unavailability of interventions. I urge that whether interventions are available in EPR—and why—is a complex and contextual question that does not have a unique or uniform answer. Instead, I argue that different combinations of causal hypotheses under test and interpretations of quantum mechanics yield different answers to the question.
This book features new essays by philosophers, psychologists, and a theologian on the important topic of virtue development. The essays engage with work from multiple disciplines and thereby seek to bridge disciplinary divides. The volume is a significant contribution to the emerging interdisciplinary field of virtue development studies.
It is still a matter of controversy whether the Principle of the Common Cause (PCC) can be used as a basis for sound causal inference. It is thus to be expected that its application to quantum mechanics should be a correspondingly controversial issue. Indeed the early 1990s saw a flurry of papers addressing just this issue in connection with the EPR correlations. Yet, that debate does not seem to have caught up with the most recent literature on causal inference generally, which has moved on to consider the virtues of a generalised PCC-inspired condition, the so-called Causal Markov Condition (CMC). In this paper we argue that the CMC is an appropriate benchmark for debating possible causal explanations of the EPR correlations. But we go on to take issue with some pronouncements on EPR by defenders of the CMC.
In the early 2000s, some scholars suggested integrative ethical education as an approach to reconcile the gap between cognitive-developmental education, based on rule ethics, and traditional character-ethics education, inspired by character ethics in Western ethical education. Darcia Narvaez also tried to establish a comprehensive and systematic model. Nonetheless, she has indicated four questions that need further research. This paper aims to respond to Narvaez's project and its questions from the angle of Xunzi's ritual education. It argues that Xunzi's thought may provide some insights for Narvaez's approach. To present this, it begins with an introduction to the main ideas of Xunzi's thought. It then tries to show the insights that certain notions and elements of Xunzi's ritual education, such as Junzi and reasoned judgment, offer for Narvaez's project of integrative ethical education. Some relevant questions are also discussed.
In the logic of theory change, the standard model is AGM, proposed by Alchourrón et al. (J Symb Log 50:510–530, 1985). This paper focuses on the extension of AGM that accounts for contractions of a theory by a set of sentences instead of only by a single sentence. Hansson (Theoria 55:114–132, 1989) and Fuhrmann and Hansson (J Logic Lang Inf 3:39–74, 1994) generalized Partial Meet Contraction to the case of contractions by (possibly non-singleton) sets of sentences. In this paper we present the possible worlds semantics for partial meet multiple contractions.
If the target article represents the summary findings of the field, reasoning research is deeply flawed. The vision is too narrow and seems to fall into biological determinism. Humans use reasoning in effective ways apparently not studied by researchers, such as reasoning for action. Moreover, as the brain develops through adulthood and from experience, so do reasoning capabilities.
In this paper we claim that the notion of cognitive representation (and scientific representation in particular) is irreducibly plural. By means of an analogy with the minimalist conception of truth, we show that this pluralism is compatible with a generally deflationary attitude towards representation. We then explore the extent and nature of representational pluralism by discussing the positive and negative analogies between the inferential conception of representation advocated by one of us and the minimalist conception of truth.
In an influential article published in 1982, Bas van Fraassen developed an argument against causal realism on the basis of an analysis of the Einstein–Podolsky–Rosen correlations of quantum mechanics. Several philosophers of science and experts in causal inference (including some causal realists like Wesley Salmon) have accepted van Fraassen's argument, interpreting it as a proof that the quantum correlations cannot be given any causal model. In this paper I argue that van Fraassen's article can also be interpreted as a good guide to the different causal models available for the EPR correlations, and their relative virtues. These models in turn give us insight into some of the unusual features that quantum propensities might have.
Kohlberg's work in moral judgement has been criticised by many philosophers and psychologists. Building on Kohlberg's core assumptions, we propose a model of moral judgement (hereafter the neo-Kohlbergian approach) that addresses these concerns. Using 25 years of data gathered with the Defining Issues Test (DIT), we present an overview of Minnesota's neo-Kohlbergian approach, using Kohlberg's basic starting points, ideas from Cognitive Science (especially schema theory), and developments in moral philosophy.
This article provides a state-of-the-art review of the philosophical literature on scientific representation. It first argues that the topic emerges historically mainly out of what may be called the modelling tradition. It then introduces a number of helpful analytical distinctions and goes on to divide contemporary approaches to scientific representation into two distinct kinds, substantive and deflationary. Analogies with related discussions of artistic representation in aesthetics and the nature of truth in metaphysics are pursued. It is finally urged that the most promising approaches—and the ones most likely to feature prominently in future developments—are deflationary. In particular, a defense is provided of a genuinely inferential conception of representation.
Scientific representation is a booming field nowadays within the philosophy of science, with many papers published regularly on the topic every year, and several yearly conferences and workshops held on related topics. Historically, the topic originates in two different strands in 20th-century philosophy of science. One strand begins in the 1950s, with philosophical interest in the nature of scientific theories. As the received or “syntactic” view gave way to a “semantic” or “structural” conception, representation progressively gained center stage. Yet there is another, older strand that links representation to fin de siècle modeling debates, particularly in the emerging Bildtheorie of Boltzmann and Hertz, and to the ensuing discussion among philosophers thereafter. Both strands feed into present-day philosophical work on scientific representation. There are a number of different orthogonal questions that philosophers ask regarding representation. One set of questions concerns the nature of the representational relation between theories or models, on the one hand, and the real-world systems they purportedly represent, on the other. Such questions lie at the more metaphysical and abstract end of the spectrum—and they are often addressed with the abstract tools of the analytical metaphysician. They constitute what we may refer to as the “analytical inquiry” into representation. On the other hand there are questions regarding the use that scientists put some representations to in practice—these are questions that are best addressed by means of some of the favorite tools of the philosopher of science, including descriptive analysis, illustration by means of case studies, induction, exemplification, inference from practice, etc., and are best referred to as the “practical inquiry” into representation.
The notion of representation invoked in such inquiries may be “deflationary” or “substantive”—depending on whether it construes representation as a primitive notion, or as susceptible to further reduction or analysis in terms of something else.