This paper offers a detailed account of the arguments in De Caelo I by which Aristotle tried to demonstrate the necessity of the perpetual existence and the perpetual rotation of the cosmos. On our interpretation, Aristotle’s arguments are naturalistic. Instead of being based (as many have thought) on rules of logic and language, they depend, we argue, on natural-scientific theories about the abilities (δυνάμεις), e.g., to move and to change, that things have by nature, and about the conditions under which these abilities can be exercised. Our interpretation locates the De Caelo arguments in the context of some central doctrines of the Organon, the Metaphysics, the Physics, and other texts. The De Caelo arguments fit a number of views developed in these texts. Aristotle’s treatments of local motion, of natural motion and change, of necessity and possibility, and of abilities and their exercises are examples. But, as we interpret them, the De Caelo arguments raise serious questions about the role of (and the need for) Metaphysics Λ’s soulful Unmoved Mover in Aristotle’s overall natural-scientific picture.
A central supposition of the "communicative ethics controversy" in modern social theory has been either that there exist universal standards against which we can judge the validity of speech and moral argumentation or, conversely, that there are no determinate standards to which moral claims can be held answerable, and hence no methods by which disputes over contested claims can rationally be resolved. In this paper it is argued that the basic terms of this debate are miscast. The "order without rules" thesis maintains both that the search for universal standards of valid moral arguments is likely to end in disappointment, and that nonetheless there are discoverable methods by which arguments are evaluated, facts constituted, and disputes settled, and for which appeals to general standards of validity are neither requested nor forthcoming. Wittgenstein's numerous remarks on rules and rule following are considered in support of this thesis.
Wegner vacillates between considering the experience of will as a directly sensed feeling and as a cognitive construct. Most of his book is devoted to examples of erroneous cognition. The brain basis of will as an immediately sensed emotion receives minimal attention.
For a writing to be a writing it must continue to "act" and to be readable even when what is called the author of the writing no longer answers for what he has written, for what he seems to have signed, be it because of a temporary absence, because he is dead or, more generally, because he has not employed his absolutely actual and present intention or attention, the plenitude of his desire to say what he means, in order to sustain what seems to be written "in his name" (Derrida 1988, p. 8). Ein Buch ist ein Spiegel; wenn ein Affe hineinguckt, so kann freilich kein Apostel heraussehen (A book is a mirror; if an ape peers into it, an apostle will never peer out.) (Lichtenberg 1949, p. 297).
This paper suggests and discusses an answer to the question of what distinguishes causal from non-causal or coincidental co-occurrences. The answer is based on Elizabeth Anscombe’s idea that causality is a highly abstract concept whose meaning derives from our understanding of specific causally productive activities (e.g., pulling, scraping, burning), and on her rejection of the assumption that causality can be informatively understood in terms of general regularities of some sort.
I claim that the Hodgkin-Huxley (HH) current equations owe a great deal of their importance to their role in bringing results from experiments on squid giant axon preparations to bear on the study of the action potential in other neurons in other in vitro and in vivo environments. I consider ideas from Weber and Craver about the role of Coulomb’s and other fundamental equations in explaining the action potential and in HH’s development of their equations. I also offer an embellishment to Schaffner’s emergent unifier conception of the HH model.
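For readers who want the equations at issue in view: a standard textbook statement of the HH total membrane current, in the form given in Hodgkin and Huxley's 1952 paper (supplied here for reference, not quoted from the abstract), is

\[
I \;=\; C_M \frac{dV}{dt} \;+\; \bar{g}_{K}\, n^{4}\,(V - V_{K}) \;+\; \bar{g}_{Na}\, m^{3} h\,(V - V_{Na}) \;+\; \bar{g}_{l}\,(V - V_{l}),
\]

where \(C_M\) is the membrane capacitance, \(V\) the displacement of the membrane potential, the \(\bar{g}\) terms the maximal ionic conductances, and \(n\), \(m\), \(h\) dimensionless gating variables governed by first-order kinetics.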
This paper provides a restatement and defense of the data/phenomena distinction introduced by Jim Bogen and me several decades ago (e.g., Bogen and Woodward 1988, The Philosophical Review, 303–352). Additional motivation for the distinction is introduced, ideas surrounding the distinction are clarified, and an attempt is made to respond to several criticisms.
Different positions in the debate on scientific realism yield different accounts of the phenomena of physics. For scientific realists like Bogen and Woodward, phenomena are matters of fact in nature, i.e., the effects explained and predicted by physical theories. For empiricists like van Fraassen, the phenomena of physics are the appearances observed or perceived by sensory experience. Constructivists, however, regard the phenomena of physics as artificial structures generated by experimental and mathematical methods. My paper investigates the historical background of these different meanings of phenomenon in the traditions of physics and philosophy. In particular, I discuss Newton’s account of the phenomena and Bohr’s view of quantum phenomena, their relation to the philosophical discussion, and to data and evidence in current particle physics and quantum optics.
A stalwart view in the philosophy of science holds that, even when broadly construed so as to include theoretical auxiliaries, theories cannot make direct contact with observations. This view owes much to Bogen and Woodward’s (1988) influential distinction between data and phenomena. According to them, data are typically the kind of things that are observable or measurable, like "bubble chamber photographs, patterns of discharge in electronic particle detectors and records of reaction times and error rates in various psychological experiments" (p. 306). Phenomena are physical processes that are typically unobservable. Examples of the latter category include "weak neutral currents, the decay of the proton, and chunking and recency effects in human memory" (ibid.). Theories, in Bogen and Woodward’s view, are utilised to systematically explain, infer and predict phenomena, not data (pp. 305-306). The relationship between theories and data is rather indirect: data count as evidence for phenomena, and phenomena in turn count as evidence for theories. This view is becoming increasingly influential (e.g. Prajit K. Basu (2003), Stathis Psillos (2004) and Mauricio Suárez (2005)). In this paper I argue, contrary to this view, that in various significant and well-known cases theories do make direct contact with observations with the help of suitable auxiliaries.
Bogen and Woodward characterized data as embedded in the context in which they are produced (‘local’) and claims about phenomena as retaining their significance beyond that context (‘nonlocal’). This view does not fit sciences such as biology, which successfully disseminate data via packaging processes that include appropriate labels, vehicles, and human interventions. These processes enhance the evidential scope of data and ensure that claims about phenomena are understood in the same way across research communities. I conclude that the degree of locality of both data and claims about phenomena varies depending on the packaging used to make them travel and on the research setting in which they are used.
Jim Bogen and James Woodward’s ‘Saving the Phenomena’, published only twenty years ago, has become a modern classic. Their centrepiece idea is a distinction between data and phenomena. According to them, data are typically the kind of things that are observable or measurable, like “bubble chamber photographs, patterns of discharge in electronic particle detectors and records of reaction times and error rates in various psychological experiments” (p. 306). Phenomena are physical processes that are typically unobservable. Examples of the latter category include “weak neutral currents, the decay of the proton, and chunking and recency effects in human memory” (ibid.). Theories, in Bogen and Woodward’s view, are utilised to systematically explain and predict phenomena, not data (pp. 305-306). The relationship between theories and data is rather indirect: data count as evidence for phenomena, and phenomena in turn count as evidence for theories. This view has been further elaborated in subsequent papers (see Bogen and Woodward 1992, 2005 and Woodward 1989) and is becoming increasingly influential (e.g. Prajit K. Basu 2003, Stathis Psillos 2004 and Mauricio Suárez 2005). In this paper I argue that in various significant and well-known cases theories accompanied by suitable auxiliary hypotheses are more proximal to observations than Bogen and Woodward would have us believe. This is especially true of cases involving novel predictions.
In this paper I try to shed some light on how one discerns a physical effect or phenomenon from experimental background ‘noise’. To this end I revisit the discovery of Weak Neutral Currents (WNC), which has been right at the centre of discussion in some of the most influential literature on this issue. Bogen and Woodward (1988) have claimed that the phenomenon of WNC was inferred from the data without the higher-level physical theory explaining this phenomenon (here: the Weinberg-Salam model of electroweak interactions) being involved in this process. Mayo (1994, 1996), in a similar vein, holds that the discovery of WNC was made on the basis of some piecemeal statistical techniques—again without the Weinberg-Salam model (predicting and explaining WNC) being involved in the process. Both Bogen and Woodward and Mayo have tried to back up their claims by referring to the historical work on the discovery of WNC by Galison (1983, 1987). Galison’s presentation of the historical facts, which can be described as realist, has however been challenged by Pickering (1984, 1988, 1989), who has drawn sociological-relativist conclusions from this historical case. Pickering’s conclusions, in turn, have recently come under attack by Miller and Bullock (1994), who delivered a defence of Galison’s realist account. In this paper I consider all of these historical studies in order to evaluate the philosophical claims that have been made on the basis of them. I conclude that—contrary to Bogen and Woodward (1988) and Mayo (1994)—statistical methods and other experimental inference procedures from the “bottom-up” (i.e. from the data to the phenomena) were insufficient for discerning WNC from their background noise. I also challenge Galison’s notion of the “end of experiments” and shall take the wind out of the sails of Miller and Bullock’s attack on some of Pickering’s claims, whilst rejecting Pickering’s sociological-relativist conclusions. Instead, I claim that an epistemic warrant from the ‘top down’ in the form of a theoretical postulate of the Weinberg-Salam model was necessary for “ending the experiments”, i.e. for the acceptance of WNC as a genuine phenomenon in the scientific community.