Sometimes, the availability of more evidence for a conclusion provides a reason to believe in its falsity. This counter-intuitive phenomenon is related to the idea of higher-order evidence, which has attracted broad interest in the recent epistemological literature. Occasionally, providing more evidence for something weakens the case in its favor by casting doubt on the probative value of other evidence of the same sort, or on the fact-finder’s cognitive performance. We analyze this phenomenon, discuss its rationality, and outline possible applications to evidence law and to the law in general. It is suggested, inter alia, that such higher-order evidence may explain how judicial experience-based expertise in fact-finding is possible despite the absence of a feedback mechanism, and that fact-finders’ self-doubt regarding their own competence in making ‘beyond-reasonable-doubt’ judgments might be reasonable and should not be ignored.
A contribution to the sixth installment of the Common Knowledge symposium “Apology for Quietism,” this article reflects on the challenges that understanding the Holocaust posed for Jews in Palestine and has posed for them in Israel. Ofer concentrates on the images of victims, fighters, and survivors as they were formulated during the last years of World War II and after the establishment of the State of Israel. Behind these images stood historical, concrete human beings who were classified according to concepts supplied by Zionist and historical Jewish culture, in which activism vs. quietism had long presented major issues for debate. A narrative and typology developed of fighting heroes, victims, and survivors; Ofer questions how different these categories were from each other. In pursuit of an answer, she examines institutions in Israel for commemoration and remembrance, especially Yad Vashem, a state-established institution centering on its often-reconfigured and redesigned historical museums. Finally, this article explores the impact of immigrations and wars on the approach of Israelis to activism and quietism, with special reference to attitudes toward the Holocaust.
Introducing the sixth and final installment of the Common Knowledge symposium “Apology for Quietism,” Allen looks at the symposium retrospectively and concludes that it has mainly concerned “sage knowledge,” defined as foresight into the development of situations. The sagacious knower sees the disposition of things in an early, incipient form and knows how to intervene with nearly effortless and undetectable (quiet) effectiveness. Whatever the circumstance, the sage handles it with finesse, never doing too much but also never leaving anything undone that must be accomplished. Quiet, when it is knowingly and effectively quiet (not pusillanimous or poor in spirit), is about what not to do, how not to approach a problem, what not to decide, what is not known, what will not work. Allen explains these principles in terms of traditional Chinese thought, Daoist and Confucian: wei wu wei, or “doing-not-doing,” means effective inaction. What makes such wisdom possible is not mystical insight, he argues, but discipline in a certain kind of art. The sage has no need of reasons (let alone doctrines), only effectiveness; and he does not need truth or justice, only subtlety. The detachment of a quietist has little, if anything, to do with transcending perspectives. Detachment is good as a means to flexibility, and instead of transcending perspectives, a sage is skilled in the quiet art of never getting stuck in one. To be effectively quiet is not so much to be silent as to be inaudible, invisible; the sage “vanishes into things.”
Sometimes, we are justified in adopting a certain procedure because it leads to a just outcome. A paradigmatic example is dividing a cake using the ‘you-cut-I-choose’ method: the one who cuts the cake is the last to get her share. This procedure is justified because it tends to lead to the just outcome—an equal division of the cake. However, at times the direction of justification is reversed. Think, for example, of a tennis match in which a coin toss is used to determine which player will serve first. In this case, the outcome is justified in virtue of its being the product of that (fair) procedure. We can call these two phenomena ‘justificational priority’ of the outcome and of the procedure, respectively. This article suggests that the concept of justificational priority can be applied to the legal classification of norms as ‘substantive’ or ‘procedural’. Such classification is required, for instance, in cases of conflict of laws. It is argued that if a certain substantive outcome has justificational priority over a certain norm which is conceptually (or philosophically) procedural, then this norm should be legally classified as ‘substantive’. In contrast, if a certain (conceptually) procedural norm has justificational priority over the substantive outcome, then, in general, it should be legally classified as ‘procedural’.
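The incentive structure behind the ‘you-cut-I-choose’ procedure can be made concrete with a small toy model (an illustrative sketch, not part of the article): since the chooser takes the larger piece, the cutter receives the smaller one, so the cutter’s guaranteed share is maximized by an equal cut.

```python
# Toy model of 'you-cut-I-choose' on a cake of size 1.
# The chooser always picks the larger piece, so the cutter is left
# with the smaller of the two -- min(cut, 1 - cut).

def cutter_share(cut: float) -> float:
    """Share the cutter ends up with after proposing pieces of
    size `cut` and `1 - cut`; the chooser takes the larger piece."""
    return min(cut, 1.0 - cut)

# Scan candidate cuts: the cutter's guaranteed share peaks at 1/2.
cuts = [i / 100 for i in range(101)]
best = max(cuts, key=cutter_share)
print(best, cutter_share(best))  # 0.5 0.5 -- the equal division
```

The fact that the cutter’s self-interested optimum coincides with the just outcome is exactly why the procedure is outcome-justified in the article’s sense.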
We present a four-valued approach for recovering consistent data from an inconsistent set of assertions. For a common family of knowledge-bases we also provide an efficient algorithm for doing so automatically. This method is particularly useful for making model-based diagnoses.
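The four-valued idea can be illustrated with a simplified sketch (the naming and the recovery step are assumptions for illustration, not the paper’s actual algorithm): an atom asserted with both polarities receives the value “both,” and the classically consistent part of the data is recovered by keeping only atoms with a classical value.

```python
# Simplified four-valued labelling of an inconsistent assertion set.
# Values: t (only asserted), f (only denied), both (asserted and
# denied), none (not mentioned). Hypothetical helper names.

TRUE, FALSE, BOTH, NONE = "t", "f", "both", "none"

def four_valued_assignment(assertions):
    """assertions: a set of (atom, polarity) pairs, polarity in {'+', '-'}."""
    atoms = {a for a, _ in assertions}
    val = {}
    for a in atoms:
        pos = (a, "+") in assertions
        neg = (a, "-") in assertions
        val[a] = BOTH if pos and neg else TRUE if pos else FALSE if neg else NONE
    return val

kb = {("rain", "+"), ("rain", "-"), ("wet", "+"), ("cold", "-")}
v = four_valued_assignment(kb)
# Recover the consistent part: atoms with a classical value only.
consistent = {a: t for a, t in v.items() if t in (TRUE, FALSE)}
print(v)           # rain -> both; wet -> t; cold -> f
print(consistent)  # {'wet': 't', 'cold': 'f'}
```

The point of the sketch is that inconsistency is localized to the atoms labelled “both,” rather than contaminating the whole knowledge-base.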
Getting physical: Empiricism’s medical history. Journal article by John Gascoigne (School of History and Philosophy, University of New South Wales, Sydney, NSW 2056, Australia), published in Metascience (DOI 10.1007/s11016-010-9474-4; Online ISSN 1467-9981; Print ISSN 0815-0796).
This collective volume, which stems in part from a workshop on embodied empiricism in early modern science held at the University of Sydney in February 2009, brings together fifteen papers grouped into three parts: “The Body as Object”, “The Body as Instrument”, and “Embodied Minds”. The authors’ aim is to correct the dominant conception that historians of science and of philosophy have of the emergence of experimental philosophy, and of empiricism…
Empiricism, as a mode of knowledge but also as a tradition of thought, has long been neglected, whether in the history of science or in the history of philosophy. Long opposed to rationalism, empiricism has been cast as a rhapsodic, unsystematic mode of knowledge. Associated with skepticism, it is regarded as a form of renunciation of knowledge, content to describe the appearance of things while the true…
From a theoretical perspective, most discussions of statistical learning (SL) have focused on the possible “statistical” properties that are the object of learning. Much less attention has been given to defining what “learning” is in the context of “statistical learning.” One major difficulty is that SL research has been monitoring participants’ performance in laboratory settings with a strikingly narrow set of tasks, where learning is typically assessed offline, through a set of two-alternative forced-choice questions that follow a brief visual or auditory familiarization stream. Is that all there is to characterizing SL abilities? Here we adopt a novel perspective for investigating the processing of regularities in the visual modality. By tracking online performance in a self-paced SL paradigm, we focus on the trajectory of learning. In a set of three experiments we show that this paradigm provides a reliable and valid signature of SL performance, and that it offers important insights for understanding how statistical regularities are perceived and assimilated in the visual modality. This demonstrates the promise of integrating different operational measures into our theory of SL.
The notion of bilattice was introduced by Ginsberg, and further examined by Fitting, as a general framework for many applications. In the present paper we develop proof systems which correspond to bilattices in an essential way. To this end we introduce the notion of logical bilattices. We also show how they can be used for efficient inferences from possibly inconsistent data. For this we incorporate certain ideas of Kifer and Lozinskii, which happen to suit the context of our work well. The outcome is a family of paraconsistent logics with many desirable properties.
Maximality is a desirable property of paraconsistent logics, motivated by the aspiration to tolerate inconsistencies while retaining as much of classical logic as possible. In this paper we introduce the strongest possible notion of maximal paraconsistency, and investigate it in the context of logics that are based on deterministic or non-deterministic three-valued matrices. We show that all reasonable paraconsistent logics based on three-valued deterministic matrices are maximal in our strong sense. This applies to practically all three-valued paraconsistent logics that have been considered in the literature, including a large family of logics developed by da Costa's school. We then show that, in contrast, paraconsistent logics based on three-valued properly non-deterministic matrices are not maximal, except for a few special cases (which are fully characterized). However, these non-deterministic matrices are useful for representing in a clear and concise way the vast variety of (deterministic) three-valued maximally paraconsistent matrices. The corresponding weaker notion of maximality, called premaximal paraconsistency, captures the "core" of maximal paraconsistency of all possible paraconsistent determinizations of a non-deterministic matrix, thus representing what is really essential for their maximal paraconsistency.
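For a concrete instance of a three-valued paraconsistent matrix of the kind discussed (a hedged sketch: Priest's LP is used here purely for illustration, not necessarily as one of the paper's own examples), one can verify by brute force that explosion fails while ordinary inferences such as conjunction elimination survive.

```python
# Priest's three-valued LP: truth order f < i < t, designated {t, i}.
from itertools import product

VALUES = ["f", "i", "t"]
DESIGNATED = {"t", "i"}
ORDER = {"f": 0, "i": 1, "t": 2}

def neg(x):
    return {"t": "f", "i": "i", "f": "t"}[x]  # i is its own negation

def conj(x, y):
    return x if ORDER[x] <= ORDER[y] else y   # meet in the truth order

def entails(prem, concl):
    # Brute-force check over all valuations of the two atoms p and q.
    for p, q in product(VALUES, repeat=2):
        v = {"p": p, "q": q}
        if prem(v) in DESIGNATED and concl(v) not in DESIGNATED:
            return False
    return True

# Explosion fails: at p = i, 'p and not-p' is designated while q need not be.
print(entails(lambda v: conj(v["p"], neg(v["p"])), lambda v: v["q"]))  # False
# Conjunction elimination still holds.
print(entails(lambda v: conj(v["p"], v["q"]), lambda v: v["p"]))       # True
```

The tension between these two checks (tolerating contradictions vs. keeping classical inferences) is exactly what the maximality results in the abstract quantify.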
The Optimal Innovation Hypothesis, which follows from the Graded Salience Hypothesis, is reviewed and revisited here. The aim is to expand the notion of Optimal Innovation so that it applies both to stimuli’s coded meanings and to their noncoded, constructed interpretations. According to the Optimal Innovation Hypothesis, Optimal Innovations, when devised, will be more pleasing than nonoptimally innovative counterparts. Unlike such competitors, Optimal Innovations deautomatize familiar coded alternatives, which invoke unconditional responses alongside novel but distinct ones, allowing both responses to interact. Conversely, the Revised Optimal Innovation Hypothesis, introduced and tested here, follows from the Defaultness Hypothesis. It posits that both default lexicalized meanings and default constructed interpretations might qualify for Optimal Innovation once they are deautomatized by nondefault, context-dependent counterparts. Such nondefault Optimal Innovations will be pleasing, more pleasing than default and nondefault counterparts that do not qualify for Optimal Innovation. Results of two experiments support the Revised Optimal Innovation Hypothesis, while further corroborating the Defaultness Hypothesis.
In recent decades, Systems Biology (including cancer research) has been driven by technology, statistical modelling and bioinformatics. In this paper we try to bring biological and philosophical thinking back. We thus aim at making different traditions of thought compatible: (a) causality in epidemiology and in philosophical theorizing—notably, the “sufficient-component-cause framework” and the “mark transmission” approach; (b) new acquisitions about disease pathogenesis, e.g. the “branched model” in cancer, and the role of biomarkers in this process; (c) the burgeoning of omics research, with a large number of “signals” and of associations that need to be interpreted. In the paper we first summarize the current views on carcinogenesis, and then explore the relevance of current philosophical interpretations of “cancer causes”. We try to offer a unifying framework to incorporate biomarkers and omic data into causal models, referring to a position called “evidential pluralism”. According to this view, causal reasoning is based both on “evidence of difference-making” (e.g. associations) and on “evidence of underlying biological mechanisms”. We conceptualize the way scientists detect and trace signals in terms of information transmission, which is a generalization of the mark transmission theory developed by philosopher Wesley Salmon. Our approach helps conceptualize how heterogeneous factors, such as micro- and macro-biological and psycho-social factors, are causally linked. This is important not only for understanding cancer etiology, but also for designing public health policies that target the right causal factors at the macro-level.
Scholars have long been captivated by the parallels between birdsong and human speech and language. In this book, leading scholars draw on the latest research to explore what birdsong can tell us about the biology of human speech and language and the consequences for evolutionary biology. They examine the cognitive and neural similarities between birdsong learning and speech and language acquisition, considering vocal imitation, auditory learning, an early vocalization phase, the structural properties of birdsong and human language, and the striking similarities between the neural organization of learning and vocal production in birdsong and human speech. After outlining the basic issues involved in the study of both language and evolution, the contributors compare birdsong and language in terms of acquisition, recursion, and core structural properties, and then examine the neurobiology of song and speech, genomic factors, and the emergence and evolution of language. Contributors: Hermann Ackermann, Gabriël J.L. Beckers, Robert C. Berwick, Johan J. Bolhuis, Noam Chomsky, Frank Eisner, Martin Everaert, Michale S. Fee, Olga Fehér, Simon E. Fisher, W. Tecumseh Fitch, Jonathan B. Fritz, Sharon M.H. Gobes, Riny Huijbregts, Eric Jarvis, Robert Lachlan, Ann Law, Michael A. Long, Gary F. Marcus, Carolyn McGettigan, Daniel Mietchen, Richard Mooney, Sanne Moorman, Kazuo Okanoya, Christophe Pallier, Irene M. Pepperberg, Jonathan F. Prather, Franck Ramus, Eric Reuland, Constance Scharff, Sophie K. Scott, Neil Smith, Ofer Tchernichovski, Carel ten Cate, Christopher K. Thompson, Frank Wijnen, Moira Yip, Wolfram Ziegler, Willem Zuidema.
Research technologies can now produce so much information that there is significant potential for incidental findings (IFs). These are findings generated in research that are beyond the aims of the study. Current law and federal regulations offer no direct guidance on how to deal with IFs in research, nor is there adequate professional or institutional guidance. We advocate a defined set of researcher duties based on law and ethics and recommend a pathway to be followed in handling IFs in research. This article traces the underlying ethical and legal theories supporting researcher duties to manage IFs, including duties to develop a plan for management in the research protocol, to discuss the possibility of and management plan for IFs in the informed consent process, and to address, evaluate, and ultimately offer to disclose IFs of potential clinical or reproductive significance to research participants when they arise.
Long after its alleged demise, phlogiston was still presented, discussed and defended by leading chemists. Even some of the leading proponents of the new chemistry admitted its ‘absolute existence’. We demonstrate that what was defended under the title ‘phlogiston’ was no longer a particular hypothesis about combustion and respiration. Rather, it was a set of ontological and epistemological assumptions and the empirical practices associated with them. Lavoisier's gravimetric reduction, in the eyes of the phlogistians, annihilated the autonomy of chemistry together with its peculiar concepts of chemical substance and quality, chemical process and chemical affinity. The defence of phlogiston was the defence of a distinctly chemical conception of matter and its appearances, a conception which reflected the chemist's acquaintance with details and particularities of substances, properties and processes and his skills of adducing causal relations from the interplay between their complexity and uniformity.
In this paper we provide a proof theoretical investigation of logical argumentation, where arguments are represented by sequents, conflicts between arguments are represented by sequent elimination rules, and deductions are made by dynamic proof systems extending standard sequent calculi. The idea is to imitate argumentative movements in which certain claims are introduced or withdrawn in the presence of counter-claims. This is done by a dynamic evaluation of sequences of sequents, in which the latter are considered ‘derived’ or ‘not derived’ according to the content of the sequence. We show that decisive conclusions of such a process correspond to well-accepted consequences of the underlying argumentation framework. The outcome is therefore a general and modular proof-theoretical approach for paraconsistent and non-monotonic reasoning with argumentation systems.
We introduce a general approach for representing and reasoning with argumentation-based systems. In our framework arguments are represented by Gentzen-style sequents, attacks between arguments are represented by sequent elimination rules, and deductions are made according to Dung-style skeptical or credulous semantics. This framework accommodates different languages and logics in which arguments may be represented, allows for a flexible and simple way of expressing and identifying arguments, supports a variety of attack relations, and is faithful to standard methods of drawing conclusions by argumentation frameworks. Altogether, we show that argumentation theory may benefit from incorporating proof theoretical techniques and that different non-classical formalisms may be used for backing up intended argumentation semantics.
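The Dung-style semantics mentioned here can be illustrated with a minimal computation of the grounded extension of an abstract argumentation framework (a textbook construction offered as background, not the paper's sequent-based calculus): iterate the characteristic function F(S) = {a : S defends a} from the empty set to its least fixpoint.

```python
# Grounded extension of an abstract (Dung-style) argumentation framework.

def grounded_extension(args, attacks):
    """args: a set of argument names; attacks: a set of (attacker, target) pairs."""
    def defended(s):
        # F(S): arguments each of whose attackers is counter-attacked by S.
        return {a for a in args
                if all(any((b, x) in attacks for b in s)
                       for (x, y) in attacks if y == a)}
    s = set()
    while True:                  # F is monotone, so this reaches the least fixpoint
        nxt = defended(s)
        if nxt == s:
            return s
        s = nxt

# a attacks b, b attacks c: a is unattacked and defends c against b.
print(sorted(grounded_extension({"a", "b", "c"}, {("a", "b"), ("b", "c")})))  # ['a', 'c']
```

The skeptical flavor of grounded semantics shows in the mutual-attack case: if a and b attack each other and nothing else intervenes, neither is accepted.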
Paradefinite logics are logics that can be used for handling contradictory or partial information. As such, paradefinite logics should be both paraconsistent and paracomplete. In this paper we consider the simplest semantic framework for introducing paradefinite logics. It consists of the four-valued matrices that expand the minimal matrix characteristic for first-degree entailment: the Dunn–Belnap matrix. We survey and study the expressive power and proof theory of the most important logics that can be developed in this framework.
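A small sketch of the Dunn–Belnap matrix (using an assumed two-bit encoding of the four values, not notation from the paper) shows why the framework is paradefinite: the value b ("both") blocks explosion, and the value n ("neither") blocks excluded middle.

```python
# Dunn-Belnap four-valued matrix: values t, b, n, f; designated {t, b}.
# Each value is encoded as (supports-truth, supports-falsity).
from itertools import product

ENC = {"t": (1, 0), "b": (1, 1), "n": (0, 0), "f": (0, 1)}
DEC = {v: k for k, v in ENC.items()}
DESIGNATED = {"t", "b"}

def neg(x):
    st, sf = ENC[x]
    return DEC[(sf, st)]          # negation swaps the two supports

def conj(x, y):
    (t1, f1), (t2, f2) = ENC[x], ENC[y]
    return DEC[(min(t1, t2), max(f1, f2))]  # meet in the truth order

def disj(x, y):
    return neg(conj(neg(x), neg(y)))        # join, via De Morgan

def entails(prem, concl):
    vals = [{"p": p, "q": q} for p, q in product(ENC, repeat=2)]
    return all(concl(v) in DESIGNATED for v in vals if prem(v) in DESIGNATED)

def valid(formula):
    return all(formula({"p": p, "q": q}) in DESIGNATED
               for p, q in product(ENC, repeat=2))

# Paraconsistent: a contradiction (value b) does not entail an arbitrary q.
print(entails(lambda v: conj(v["p"], neg(v["p"])), lambda v: v["q"]))  # False
# Paracomplete: excluded middle is not valid (it takes value n at n).
print(valid(lambda v: disj(v["p"], neg(v["p"]))))                      # False
```

Both failures in one matrix is precisely the paradefinite combination the abstract describes: paraconsistency and paracompleteness at once.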
The work of Arnon Avron and Ofer Arieli has shown a deep relationship between the theory of bilattices and the Belnap-Dunn logic FDE. This correspondence has been interpreted as evidence that FDE is “the” logic of bilattices, a consideration reinforced by the work of Yaroslav Shramko and Heinrich Wansing, in which FDE is shown to be similarly entrenched with respect to the theories of trilattices and, more generally, multilattices. In this paper, we export Melvin Fitting’s “cut-down” connectives—propositional connectives that “cut down” available evidence—to the case of multilattices and show that two related first-degree systems—Harry Deutsch’s four-valued Sfde and Richard Angell’s AC—emerge just as elegantly and are as intimately connected to the theories of bilattices and trilattices as the Belnap-Dunn logic.
Findings from two experiments argue in favor of the superiority of default, preferred interpretations over non-default, less favored counterparts, outshining degree of non-salience, non-literalness, contextual strength, and negation. They show that, outside of a specific context, the default interpretation of specific negative constructions is a non-salient interpretation; their non-default interpretation is a salience-based alternative. In contrast, the default interpretation of the affirmative counterparts is a salience-based interpretation; their non-default interpretation is a non-salient alternative. When in equally strongly supportive contexts, default yet non-salient negative sarcasm is processed faster than non-default, non-salient yet affirmative sarcasm and faster than non-default yet salience-based negative…
This collection opens a dialogue between process philosophy and contemporary consciousness studies. Approaching consciousness from diverse disciplinary perspectives—philosophy, psychology, neuroscience, neuropathology, psychotherapy, biology, animal ethology, and physics—the contributors offer empirical and philosophical support for a model of consciousness inspired by the process philosophy of Alfred North Whitehead (1861–1947). Whitehead’s model is developed in ways he could not have anticipated to show how it can advance current debates beyond well-known sticking points. This has trenchant consequences for epistemology and suggests fresh and promising new perspectives on such topics as the mind-body problem, the neurobiology of consciousness, animal consciousness, the evolution of consciousness, panpsychism, the unity of consciousness, epiphenomenalism, free will, and causation. Contents: Introduction, Michel Weber & Anderson Weekes. I. Setting the Stage: 1. Process Thought as a Heuristic for Investigating Consciousness, Michel Weber & Anderson Weekes; 2. Whitehead as a Neglected Figure of 20th Century Philosophy, Michel Weber & Anderson Weekes; 3. Consciousness as a Topic of Investigation in Western Thought, Anderson Weekes; 4. Whitehead’s Unique Approach to the Topic of Consciousness, Anderson Weekes. II. Psychology and Philosophy of Mind: 5. Consciousness as a Subjective Form, David Ray Griffin; 6. The Interpretation and Integration of the Literature on Consciousness from a Process Perspective, Michael W. Katzko; 7. Windows on Nonhuman Minds, Donald R. Griffin. III. From Metaphysics to (Neuro)Science and Back Again: 8. Panexperientialism, Quantum Theory, and Neuroplasticity, George W. Shields; 9. The Evolution of Consciousness, Max Velmans; 10. The Carrier Theory of Causation, Gregg H. Rosenberg. IV. Clinical Applications: Consciousness as Process: 11. The Microgenetic Revolution in Contemporary Neuropsychology and Neurolinguistics, Maria Pachalska and Bruce Duncan MacQueen; 12. From Coma to Consciousness, Avraham Schweiger, Michael Frost, Ofer Keren; 13. Consciousness and Rationality from a Process Perspective, Michel Weber. V. History (and Future?) of Philosophy: 14. Consciousness, Memory, and Recollection according to Whitehead, Xavier Verley; 15. Consciousness and Causation in Light of Whitehead’s Phenomenology of Becoming, Anderson Weekes.
Bereft of the illusion of an epistemic vantage point external to science, what should be our commitment towards the categories, concepts and terms of that very science? Should we, despairing of the possibility of founding these concepts on rock bottom, adopt empiricist skepticism? Or does the nonexistence of external foundations rather imply immunity for scientific ontology from epistemological criticism? Philosophy's "realism debate" died out without providing a satisfactory answer to this dilemma, which was taken over by the neighboring disciplines. The "symmetry principle" of the "Strong Programme" for the sociology of science (the requirement that truth and error receive the same kind of causal explanations) offered one bold metaphysical answer, under the guise of a methodological decree. Recently, however, it has been argued that this solution is not bold enough: the social constructivists replaced the naïve presumption of an independent nature which adjudicates our beliefs with a mirror-image presumption of a sui generis society which furnishes these beliefs autonomously. The proper metaphysics for a foundationless epistemology, argues Bruno Latour, is one which grants nature and society, object and subject, equal roles in the success and failure of science and technology; one in which the history of society merges with a history of things-in-themselves. The paper analyzes the philosophical and methodological motivations and ramifications of this extraordinary suggestion.