John Bell showed that a large class of local hidden-variable models conflicts with quantum mechanics and experiment. Recently, it has been suggested that empirically adequate hidden-variable models might exist which presuppose a weaker notion of local causality. We will show that a Bell-type inequality can also be derived from these weaker assumptions.
Institute of Theoretical Physics, University of Bern, Sidlerstrasse 5, CH-3012 Bern, Switzerland. Contents: 1. Introduction; 2. The EPR-Bohm experiment; 3. Local causality; 4. Bell's inequality from separate common causes (4.1 A weak screening-off principle; 4.2 Perfect correlation and ‘determinism’; 4.3 A minimal theory for spins; 4.4 No conspiracy); 5. Discussion.
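As a minimal sketch of the standard setting this entry weakens (not the paper's own derivation; the notation is assumed), local causality is usually expressed as a screening-off condition from which a Bell-type inequality in CHSH form follows:

```latex
% Minimal sketch, standard assumptions: conditional on the hidden state
% \lambda, the outcomes A, B of the two wings factorize (screening-off):
p(A, B \mid a, b, \lambda) = p(A \mid a, \lambda)\, p(B \mid b, \lambda)
% With correlations E(a,b) = \int \rho(\lambda)\, \bar{A}(a,\lambda)\,
% \bar{B}(b,\lambda)\, d\lambda, the CHSH form of a Bell-type inequality
% follows for any pairs of settings a, a' and b, b':
\lvert E(a,b) + E(a,b') + E(a',b) - E(a',b') \rvert \le 2
```

The entry's point is that an inequality of this form still follows when the single screener-off λ is weakened to separate common causes per correlation, as laid out in section 4 of the contents above.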
Falsification is no longer the cornerstone of the philosophy of science; but the view still looms large that scientists ought to drop an explanatory hypothesis in view of negative results. We shall argue that, on the contrary, negative empirical results are unable to disqualify causally explanatory hypotheses—not because of the shielding effect of auxiliary assumptions but because the causal irrelevance of a factor cannot be empirically established. This perspective is elaborated in a case study taken from the history of plant physiology: the formaldehyde model of photosynthesis, which dominated the field for about sixty years, despite the fact that throughout those sixty years all attempts to conclusively demonstrate even the presence of formaldehyde in plants failed.
In his Geography, Ptolemy recorded the geographical coordinates of more than 6,300 toponyms of the known oikoumenē. This study presents the type of geographical information that was used by Ptolemy as well as the methods he applied to derive his geographical coordinates. A new methodological approach was developed in order to analyse the characteristic deviations of Ptolemy’s data from their reconstructed reference locations. The clusters of displacement vectors establish that Ptolemy did not obtain his coordinates from astronomical observations at each geographical location. The characteristic displacement vectors reveal how Ptolemy derived the coordinates: he constructed locations on maps using a compass and ruler, for which he employed a small amount of astronomical reference data and geographical distance information; he made schematic drawings of coastlines, based on textual descriptions of coastal formations; and he situated additional locations within the established framework using reports of travel itineraries.
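As a hedged illustration of the displacement-vector idea only (the toponyms and coordinates below are invented placeholders, not the study's data), one might compare Ptolemaic coordinates with modern reference coordinates like this:

```python
import numpy as np

# (longitude, latitude) in degrees: Ptolemaic values vs. modern references.
# Three hypothetical toponyms, values invented for illustration.
ptolemy = np.array([[39.0, 21.0], [41.5, 20.3], [43.0, 19.8]])
modern  = np.array([[36.8, 21.4], [39.6, 20.9], [41.2, 20.2]])

displacement = ptolemy - modern          # per-toponym displacement vectors
mean_shift = displacement.mean(axis=0)   # systematic offset of the region
residual = displacement - mean_shift     # local scatter around that offset

print("mean shift (dlon, dlat):", mean_shift)
print("residual vectors:\n", residual)
```

Clusters of toponyms sharing a common mean shift, with small residuals, are what would indicate a regional framework constructed from a few reference points rather than per-location astronomical observation.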
In the light of recent discussions, we present the main results of our project, the aim of which was to derive a Bell-type inequality from the weakest possible assumptions. A principal outcome of the project is that a Bell-type inequality can be derived from the assumption of separate common causes (Graßhoff, Portmann and Wüthrich 2005), even without the assumption of perfectly anticorrelating event types (Portmann and Wüthrich 2007). We also address the critique that in Graßhoff et al. (2005) we implicitly assume a common common cause (Hofer-Szabó forthcoming).
The second volume of the series Philosophische Forschung brings together two core areas of analytic philosophy: semantics and ontology. What are the basic building blocks of our ontology? How do we refer to them linguistically and mentally? These and further questions are discussed by internationally renowned philosophers from historical and systematic perspectives. The contributions are written in German and English. They come from Christian Beyer, Johannes Brandl, Dagfinn Føllesdal, Dorothea Frede, Rolf George, Gerd Graßhoff, Peter Hacker, Andreas Kemmerling, Edgar Morscher, Kevin Mulligan, Rolf Puster, Richard Schantz, Benjamin Schnieder, Oliver Scholz, Severin Schröder, Peter Simons, Thomas Spitzley, Markus Stepanians, Ralf Stoecker, and Daniel von Wachter.
For a long time, regularity accounts of causation had virtually vanished from the scene. Problems encountered within other theoretical frameworks have recently induced authors working on causation, laws of nature, or methodologies of causal reasoning to direct their attention back to regularity theoretic analyses, such as May (Kausales Schliessen. Eine Untersuchung über kausale Erklärungen und Theorienbildung. Ph.D. thesis, Universität Hamburg, 1999), Ragin (Fuzzy-set social science. Chicago: University of Chicago Press, 2000), Graßhoff and May (Causal regularities. In W. Spohn, M. Ledwig, & M. Esfeld (Eds.), Current issues in causation (pp. 85–114). Paderborn: Mentis, 2001), Swartz (The concept of physical law (2nd ed.). http://www.sfu.ca/philosophy/physical-law/, 2003), and Halpin (Erkenntnis, 58, 137–168, 2003). In light of these latest proposals of regularity theories, the paper at hand reassesses the criticism raised against regularity accounts since Mackie's INUS theory of causation (The cement of the universe. A study of causation. Oxford: Clarendon Press, 1974). It is shown that most of these objections target strikingly over-simplified regularity theoretic sketches. By outlining ways to refute these objections, it is argued that the prevalent conviction as to the overall failure of regularity theories has been hasty.
The Minimal Theory of Causation, presented in Graßhoff and May (2001), aspires to be a version of a regularity analysis of causation able to correctly predict our causal intuitions. In my article, I will argue that it is unsuccessful in this respect. The second aim of the paper is to defend Hitchcock’s proposal concerning divisions of causal relations against criticism made in Jakob (2006) on the basis of the Minimal Theory of Causation.
Standard derivations of the Bell inequalities assume a common-common-cause-system, that is, a common screener-off for all correlations, together with some additional assumptions concerning locality and no-conspiracy. In a recent paper (Graßhoff et al., The British Journal for the Philosophy of Science, 56, 663–680), Bell inequalities have been derived via separate common causes, assuming perfect correlations between the events. In this paper it will be shown that the assumptions of this separate-common-cause-type derivation of the Bell inequalities in the case of perfect correlations can be reduced to the assumptions of a common-common-cause-system-type derivation. In the case of non-perfect correlations, however, a non-reducible separate-common-cause-type derivation of some Bell-like inequalities can be given. The violation of these Bell-like inequalities proves Szabó's conjecture concerning the non-existence of a local, non-conspiratorial, separate-common-cause model for a δ-neighborhood of perfect EPR correlations.
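For readers unfamiliar with the terminology, a schematic contrast (notation assumed here, not taken from the paper): a common-common-cause-system posits one screener-off for all correlated pairs at once, while separate common causes allow a different screener-off for each pair:

```latex
% Common-common-cause-system: a single variable \lambda screens off
% every correlated pair (A_i, B_j) simultaneously:
p(A_i, B_j \mid \lambda) = p(A_i \mid \lambda)\, p(B_j \mid \lambda)
    \quad \text{for all } i, j
% Separate common causes: each pair has its own screener-off \lambda_{ij},
% with no single variable required to do the screening for all pairs:
p(A_i, B_j \mid \lambda_{ij}) = p(A_i \mid \lambda_{ij})\, p(B_j \mid \lambda_{ij})
```

The reduction claim of the paper is that, under perfect correlations, the weaker second condition can be brought back to the first; under non-perfect correlations it cannot.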
In this paper, a multi-attribute decision-making problem based on grey relational analysis (GRA) in a neutrosophic cubic set environment is investigated. In the decision-making situation, the attribute weights are given as single-valued neutrosophic sets, and these neutrosophic weights are converted into crisp weights. Both positive and negative GRA coefficients, as well as weighted GRA coefficients, are determined. Hamming distances between the weighted GRA coefficients and the standard (ideal) GRA coefficients are computed. The relative closeness coefficients are derived in order to rank the alternatives, arranged in ascending order. Finally, a numerical example is solved to demonstrate the applicability of the proposed approach.
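A minimal Python sketch of the GRA-with-relative-closeness idea, under simplifying assumptions: crisp weights are taken as given (the abstract's neutrosophic-to-crisp conversion is skipped), and the Hamming-distance step is replaced by a simple weighted-grade ratio. The matrix, weights, and rho value are illustrative, not the paper's data.

```python
import numpy as np

def gra_coefficients(X, reference, rho=0.5):
    """Grey relational coefficients of each alternative against a reference series."""
    delta = np.abs(X - reference)          # absolute deviations from the reference
    dmin, dmax = delta.min(), delta.max()  # two-level (global) min and max
    return (dmin + rho * dmax) / (delta + rho * dmax)

# Decision matrix: rows = alternatives, columns = benefit-type attributes,
# assumed already normalized to [0, 1].
X = np.array([[0.7, 0.5, 0.9],
              [0.6, 0.8, 0.4],
              [0.9, 0.6, 0.7]])

w = np.array([0.4, 0.35, 0.25])      # crisp attribute weights (assumed given)

g_pos = gra_coefficients(X, X.max(axis=0))   # vs. positive ideal series
g_neg = gra_coefficients(X, X.min(axis=0))   # vs. negative ideal series

grade_pos = (g_pos * w).sum(axis=1)  # weighted grey relational grades
grade_neg = (g_neg * w).sum(axis=1)

closeness = grade_pos / (grade_pos + grade_neg)  # larger = better alternative
print("closeness:", closeness)
print("ranking (best first):", np.argsort(-closeness) + 1)
```

The distinguishing coefficient rho = 0.5 is the conventional default in GRA; shrinking it sharpens the contrast between alternatives near and far from the reference series.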
The Empire of Chance tells how quantitative ideas of chance transformed the natural and social sciences, as well as daily life, over the last three centuries. A continuous narrative connects the earliest applications of probability and statistics in gambling and insurance to the most recent forays into law, medicine, polling and baseball. Separate chapters explore the theoretical and methodological impact in biology, physics and psychology. Themes recur - determinism, inference, causality, free will, evidence, the shifting meaning of probability - but in dramatically different disciplinary and historical contexts. In contrast to the literature on the mathematical development of probability and statistics, this book centres on how these technical innovations remade our conceptions of nature, mind and society. Written by an interdisciplinary team of historians and philosophers, this readable, lucid account keeps technical material to an absolute minimum. It is aimed not only at specialists in the history and philosophy of science, but also at the general reader and scholars in other disciplines.
This is Michael Picard's translation of Gerd Achenbach’s _Philosophische Praxis_. Beset by life-problems you can neither get rid of nor solve? Stuck? Over- or under-burdened by reality? Not living up to your potential? Philosophical Praxis is the alternative to psychotherapy for people not satisfied to muddle through life or merely exist. Not a method or a teaching, not diagnosis, treatment, or therapy, not a ready-made rule of life for you to conform to. It is no preformed application, but the use of philosophy to aid the individual, organization and society to come to self-understanding, so that people may thrive and live up to being themselves. Philosophical Praxis is not the preserve of any trained philosopher, but a specialization in the practice of philosophy that aims to liberate guests (not ‘clients’) through sympathetic co-thinking toward their own fresh appraisals of their life and its circumstances. This book is the chief statement of the German philosopher Gerd Achenbach, founder of Philosophical Praxis, who in pioneering this revival of philosophy for life spawned a global movement.
In reply to H. Bußhoff's paper, I give another outline of Lakatos' approach to normative theories in order to reduce the misunderstandings Bußhoff seems to have fallen victim to. In particular, I try to show that he is wrong in claiming that there is a vicious circle in this approach or in my interpretation of it. Finally, I expose for criticism his alternative methodology of political science, which propagates a theory of a "third type", suggesting that he takes too little seriously problems he calls academic, although their importance has been shown not only by Lakatos but also by political philosophers such as Rawls and Nozick.
The mathematical and epistemological origins of Hermann Graßmann’s innovative work have always attracted the interest of mathematicians and historians. Since Friedrich Engel’s biography, a favourite source for these interpretations have been two curricula vitae, which were, however, previously known only from excerpts. The complete texts are edited here for the first time. They are presented and commented on in their respective contexts, namely the examinations required for a career as a Gymnasium teacher and as a Protestant pastor. Graßmann’s relations to Schleiermacher’s theology and philosophy, as well as Graßmann’s philosophical and mathematical motivations, are discussed in the light of these two documents.
Simple Heuristics That Make Us Smart invites readers to embark on a new journey into a land of rationality that differs from the familiar territory of cognitive science and economics. Traditional views of rationality tend to see decision makers as possessing superhuman powers of reason, limitless knowledge, and all of eternity in which to ponder choices. To understand decisions in the real world, we need a different, more psychologically plausible notion of rationality, and this book provides it. It is about fast and frugal heuristics--simple rules for making decisions when time is pressing and deep thought an unaffordable luxury. These heuristics can enable both living organisms and artificial systems to make smart choices, classifications, and predictions by employing bounded rationality. But when and how can such fast and frugal heuristics work? Can judgments based simply on one good reason be as accurate as those based on many reasons? Could less knowledge even lead to systematically better predictions than more knowledge? Simple Heuristics explores these questions, developing computational models of heuristics and testing them through experiments and analyses. It shows how fast and frugal heuristics can produce adaptive decisions in situations as varied as choosing a mate, dividing resources among offspring, predicting high school dropout rates, and playing the stock market. As an interdisciplinary work that is both useful and engaging, this book will appeal to a wide audience. It is ideal for researchers in cognitive psychology, evolutionary psychology, and cognitive science, as well as in economics and artificial intelligence. It will also inspire anyone interested in simply making good decisions.