Counterfactuals all the way down? Journal article, Metascience, Volume 20, Number 1. DOI: 10.1007/s11016-010-9437-9. Online ISSN 1467-9981, Print ISSN 0815-0796. Authors: Jim Woodward (History and Philosophy of Science, 1017 Cathedral of Learning, University of Pittsburgh, Pittsburgh, PA 15260, USA); Barry Loewer (Department of Philosophy, Rutgers University, New Brunswick, NJ 08901, USA); John W. Carroll (Department of Philosophy and Religious Studies, North Carolina State University, Raleigh, NC 27695-8103, USA); Marc Lange (Department of Philosophy, University of North Carolina at Chapel Hill, CB#3125—Caldwell Hall, Chapel Hill, NC 27599-3125, USA).
This book combines physics, philosophy, and history in a radical new approach to introducing the philosophy of physics. It leads the reader through several central problems in the philosophy of physics by tracing their connections to a single issue: whether a cause must be spatiotemporally local to its effect, or whether action at a distance can occur.
Introduction Friedrich Albert Lange has sometimes been mentioned in relation to the pyrotechnical writings of Nietzsche and, on occasion, has been said to ...
David Hilbert was one of the great mathematicians who expounded the centrality of their subject in human thought. In this collection of essays, Wilfried Sieg frames Hilbert's foundational work, from 1890 to 1939, in a comprehensive way and integrates it with modern proof theoretic investigations.
Lange issues a novel challenge to philosophical accounts of laws of nature. He notes that the laws of nature seem themselves to be governed by laws, in a way analogous to how the laws govern particular facts. These higher-order laws are the meta-laws of nature. He claims that if a philosophical account of laws aims to characterize the laws accurately, it should be able to account for these meta-laws. To generalize this challenge, I introduce the notion of roles played by laws of nature according to a philosophical account, and identify a number of salient roles. I then apply Lange’s challenge to two views: the regularity view and the universals view. I argue that the regularity view may be able to meet the generalized version of Lange’s challenge, and that the universals view is able to meet it. Contents: 1 Meta-laws; 2 Lange’s Challenge; 3 The Roles Played by Laws and Meta-laws; 4 Meta-laws and the Regularity View; 5 Meta-laws and the Universals View; 6 Summary and Conclusion.
In the fall of 1985 Carnegie Mellon University established a Department of Philosophy. The focus of the department is logic broadly conceived, philosophy of science, in particular of the social sciences, and linguistics. To mark the inauguration of the department, a daylong celebration was held on April 5, 1986. This celebration consisted of two keynote addresses by Patrick Suppes and Thomas Schwartz, seminars directed by members of the department, and a panel discussion on the computational model of mind moderated by Dana S. Scott. The various contributions, in modified and expanded form, are the core of this collection of essays, and they are, I believe, of more than parochial interest: they turn attention to substantive and reflective interdisciplinary work. The collection is divided into three parts. The first part gives perspectives on general features of the interdisciplinary enterprise in philosophy, and on a particular topic that invites such interaction, namely computational models of the mind. The second part contains reports on concrete research done within that enterprise; the research topics range from decision theory and the philosophy of economics through foundational problems in mathematics to issues in aesthetics and computational linguistics. The third part is a postscriptum by Isaac Levi, analyzing directions of work from his perspective.
In this article we give a unifying approach to the theory of fundamental sequences and their related Hardy hierarchies of number-theoretic functions, and we show that the new approach is equivalent to the classical one.
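The recursion behind a Hardy hierarchy can be illustrated with a minimal sketch for ordinals up to ω, using the standard fundamental sequence ω[n] = n; the function names and the restriction to ω are my own illustration, not the paper's general framework:

```python
# Hardy hierarchy sketch: H_0(n) = n, H_{a+1}(n) = H_a(n + 1),
# and at a limit ordinal H_λ(n) = H_{λ[n]}(n).

def hardy_finite(k, n):
    # For a finite ordinal k, unfolding H_{a+1}(n) = H_a(n + 1)
    # k times gives H_k(n) = n + k.
    for _ in range(k):
        n += 1
    return n

def hardy_omega(n):
    # With the fundamental sequence ω[n] = n, the limit clause gives
    # H_ω(n) = H_n(n) = 2n.
    return hardy_finite(n, n)

print(hardy_finite(3, 5), hardy_omega(4))  # 8 8
```

Beyond ω the same two clauses, fed by fundamental sequences for larger limit ordinals, generate the fast-growing number-theoretic functions the abstract refers to.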
First in a 2CV, then in a converted VW bus: the Zurich photographer couple Margrit and Ernst Baumann, born in 1929 and 1928 respectively, began travelling around the globe in the 1950s. They published their photographs in magazines and newspapers such as Stern, the Neue Zürcher Zeitung, and Das gelbe Heft, and so brought the world into people's living rooms. Cosmopolitan and curious, they captured motifs of rare value: colour portraits of Che Guevara as well as reportages on the last headhunters in the Ecuadorian jungle. One focus of this richly illustrated double monograph is their 1957-1959 journey along the "Panamericana", which connects South and North America. They shot reportages, portraits, and landscape photographs, and made what was probably the first colour film about the legendary route. That film, together with a film portrait of the two photographers, accompanies the book on DVD. The book presents finds from their photo archive, many images for the first time in their original framing, as well as their publications in magazines. The historian and writer Wilfried Meichtry recounts the eventful life of the couple, who throughout their lives saw themselves as journalistic craftsmen. The photo historian Markus Schürpf situates their work within the history of photography and the press, and Nadine Olonetzky contributes accompanying texts to the images. A discovery in Swiss photographic history.
We establish by elementary proof-theoretic means the conservativeness of two subsystems of analysis over primitive recursive arithmetic. One subsystem was introduced by Friedman [6]; the other is a strengthened version of a theory of Minc [14]; each has been shown to be of considerable interest for both mathematical practice and metamathematical investigations. The foundational significance of such conservation results is clear: they provide a direct finitist justification of the part of mathematical practice formalizable in these subsystems. The results are generalized to relate a hierarchy of subsystems, all contained in the theory of arithmetic properties, to a corresponding hierarchy of fragments of arithmetic. The proof-theoretic tools employed there are used to re-establish, in a uniform and elementary way, relationships between various fragments of arithmetic due to Parsons, Paris and Kirby, and Friedman.
Hilbert's finitist program was not created at the beginning of the twenties solely to counteract Brouwer's intuitionism, but rather emerged out of broad philosophical reflections on the foundations of mathematics and out of detailed logical work; that is evident from notes of lecture courses that were given by Hilbert and prepared in collaboration with Bernays during the period from 1917 to 1922. These notes reveal a dialectic progression from a critical logicism through a radical constructivism toward finitism; the progression has to be seen against the background of the stunning presentation of mathematical logic in the lectures given during the winter term 1917/18. In this paper, I sketch the connection of Hilbert's considerations to issues in the foundations of mathematics during the second half of the 19th century, describe the work that laid the basis of modern mathematical logic, and analyze the first steps in the new subject of proof theory. A revision of the standard view of Hilbert's and Bernays's contributions to the foundational discussion in our century has long been overdue. It is almost scandalous that their carefully worked out notes have not yet been used to understand more accurately the evolution of modern logic in general and of Hilbert's Program in particular. One conclusion will be obvious: the dogmatic formalist Hilbert is a figment of historical (de)construction! Indeed, the study and analysis of these lectures reveal a depth of mathematical-logical achievement and of philosophical reflection that is remarkable. In the course of my presentation many questions are raised and many more can be explored; thus, I hope this paper will stimulate interest in new historical and systematic work.
The Hygiene Hypothesis has been recognized as an important cornerstone for explaining the sudden increase in the prevalence of asthma and allergic diseases in modernized culture. The recent epidemic of allergic diseases contrasts with the gradual adaptation of Homo sapiens sapiens to present-day forms of civilization. This process of civilization was gradual, with cumulative effects on the human immune system, which co-developed with parasitic and commensal helminths. The clinical manifestation of this epidemic, however, became visible only in the second half of the twentieth century. In explaining these clinical effects in terms of the underlying IgE-mediated reactions to innocuous environmental antigens, the low biodiversity of antigens in the domestic environment plays a pivotal role. The skewing of antigen exposure, as a cumulative effect of reduced biodiversity in the immediate human environment and of changing food habits, provides a sufficient and parsimonious explanation for the rise in allergic diseases in a highly developed and helminth-free modernized culture. Socio-economic tendencies that incline towards a further reduction of environmental biodiversity may give serious cause for concern about future health. This article explains that the “Hygiene Hypothesis”, the “Old Friends Hypothesis”, and the “Skewed Antigen Exposure Hypothesis” are all required to explain more fully the rise of allergy in modern societies.
Marc Lange objects to scientific essentialists that they can give no better account of the counterfactual invariance of laws than Humeans. While conceding that this point succeeds ad hominem against some essentialists, I show that it does not undermine essentialism in general. Moreover, Lange's alternative account of the relation between laws and counterfactuals is, with minor modification, compatible with essentialism.
The incompleteness theorems constitute the mathematical core of Gödel’s philosophical challenge. They are given in their “most satisfactory form”, as Gödel saw it, when the formality of theories to which they apply is characterized via Turing machines. These machines codify human mechanical procedures that can be carried out without appealing to higher cognitive capacities. The question naturally arises whether the theorems justify the claim that the human mind has mathematical abilities that are not shared by any machine. Turing admits that non-mechanical steps of intuition are needed to transcend particular formal theories. Thus, there is a substantive point in comparing Turing’s views with Gödel’s, expressed by the assertion, “The human mind infinitely surpasses any finite machine”. The parallelisms and tensions between their views are taken as an inspiration for beginning to explore, computationally, the capacities of the human mathematical mind.
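The "mechanical procedures" that Turing machines codify can be made concrete with a toy machine; the interpreter below and its unary-successor rule table are a hypothetical illustration of the general idea, not anything from the paper:

```python
# A minimal Turing-machine interpreter. Rules map (state, symbol) to
# (symbol to write, head move, next state); the machine runs until it
# reaches the halting state.
def run_tm(tape, rules, state="scan", blank="_"):
    cells = dict(enumerate(tape))  # sparse tape: position -> symbol
    pos = 0
    while state != "halt":
        sym = cells.get(pos, blank)
        write, move, state = rules[(state, sym)]
        cells[pos] = write
        pos += {"R": 1, "L": -1}[move]
    return "".join(cells[i] for i in sorted(cells)).strip(blank)

# Unary successor: scan right past the 1s, append one more 1, halt.
rules = {
    ("scan", "1"): ("1", "R", "scan"),
    ("scan", "_"): ("1", "R", "halt"),
}
print(run_tm("111", rules))  # 1111
```

Every step consults only the current state and the scanned symbol, which is exactly the sense in which the procedure requires no "higher cognitive capacities".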
Dedekind’s structuralism is a crucial source for the structuralism of mathematical practice—with its focus on abstract concepts like groups and fields. It plays an equally central role for the structuralism of philosophical analysis—with its focus on particular mathematical objects like natural and real numbers. Tensions between these structuralisms are palpable in Dedekind’s work, but are resolved in his essay Was sind und was sollen die Zahlen? In a radical shift, Dedekind extends his mathematical approach to “the” natural numbers. He creates the abstract concept of a simply infinite system, proves the existence of a “model”, insists on the stepwise derivation of theorems, and defines structure-preserving mappings between different systems that fall under the abstract concept. Crucial parts of these considerations were added, however, only to the penultimate manuscript, for example, the very concept of a simply infinite system. The methodological consequences of this radical shift are elucidated by an analysis of Dedekind’s metamathematics. Our analysis provides a deeper understanding of the essay and, in addition, illuminates its impact on the evolution of the axiomatic method and of “semantics” before Tarski. This understanding allows us to make connections to contemporary issues in the philosophy of mathematics and science.
Using the concept of notations for infinitary derivations, we give an explanation of Takeuti's reduction steps on finite derivations (used in his consistency proof for Π¹₁-CA) in terms of the more perspicuous infinitary approach from [BS88].
Hilbert gave lectures on the foundations of mathematics throughout his career. Notes for many of them have been preserved and are treasures of information; they allow us to reconstruct the path from Hilbert's logicist position, deeply influenced by Dedekind and presented in lectures starting around 1890, to the program of finitist proof theory in the early 1920s. The development toward proof theory begins, in some sense, in 1917 when Hilbert gave his talk Axiomatisches Denken in Zürich. This talk is rooted in the past and points to the future. As to the future, Hilbert suggested: [...] we must[—]that is my conviction[—]take the concept of the specifically mathematical proof as an object of investigation, just as ...
This paper explores the decision-making and coordination mechanism of pricing and collection rate in a closed-loop supply chain with a capacity constraint in recycling channels, consisting of one manufacturer and one retailer. On the basis of game theory, the equilibrium decisions and profits in the centralized and decentralized scenarios are obtained and compared. The performance analysis across scenarios shows that a higher production-cost saving and lower competition intensity encourage the members to engage in remanufacturing. Furthermore, we propose a two-part tariff contract through bargaining to coordinate the supply chain and achieve a Pareto improvement. The results show that when the capacity constraints in recycling channels exceed a threshold, the decisions and profits change. Additionally, for the closed-loop supply chain, the selling price is more susceptible to the influence of the capacity constraint in the recycling channel than the members’ profits.
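The efficiency loss that a two-part tariff recovers can be sketched in a stylized one-manufacturer, one-retailer model with linear demand; the numbers and the model itself (no remanufacturing, collection rate, or capacity constraint) are illustrative assumptions, not the paper's formulation:

```python
# Stylized double marginalization with linear demand q = a - b*p.
a, b, c = 100.0, 1.0, 20.0  # hypothetical demand intercept, slope, unit cost

# Centralized: choose the retail price p to maximize (p - c) * (a - b*p).
p_c = (a + b * c) / (2 * b)
profit_c = (p_c - c) * (a - b * p_c)

# Decentralized: the manufacturer sets a wholesale price w, then the
# retailer best-responds with p(w) = (a + b*w) / (2b); anticipating this,
# the manufacturer maximizes (w - c) * (a - b*p(w)), giving w* = (a/b + c)/2.
w = (a / b + c) / 2
p_d = (a + b * w) / (2 * b)
q_d = a - b * p_d
profit_m = (w - c) * q_d        # manufacturer profit
profit_r = (p_d - w) * q_d      # retailer profit

# The decentralized chain earns strictly less than the centralized one.
# A two-part tariff (w = c plus a negotiated fixed fee F) restores the
# centralized price, and F can split the gain so both members improve.
print(profit_c, profit_m + profit_r)  # 1600.0 1200.0
```

The gap between the two totals (here 400) is the surplus that bargaining over the fixed fee can divide, which is why such a contract yields a Pareto improvement.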
Alonzo Church's mathematical work on computability and undecidability is well known indeed, and we seem to have an excellent understanding of the context in which it arose. The approach Church took to the underlying conceptual issues, by contrast, is less well understood. Why, for example, was "Church's Thesis" put forward publicly only in April 1935, when it had been formulated already in February/March 1934? Why did Church choose to formulate it then in terms of Gödel's general recursiveness, not his own λ-definability as he had done in 1934? A number of letters were exchanged between Church and Paul Bernays during the period from December 1934 to August 1937; they throw light on critical developments in Princeton during that period and reveal novel aspects of Church's distinctive contribution to the analysis of the informal notion of effective calculability. In particular, they allow me to give informed, though still tentative, answers to the questions I raised; the character of my answers is reflected by an alternative title for this paper, Why Church needed Gödel's recursiveness for his Thesis. In Section 5, I contrast Church's analysis with that of Alan Turing and explore, in the very last section, an analogy with Dedekind's investigation of continuity.