This paper argues for two related theses. The first is that mathematical abstraction can play an important role in shaping the way we think about, and hence understand, certain phenomena, an enterprise that extends well beyond simply representing those phenomena for the purpose of calculating or predicting their behaviour. The second is that much of our contemporary understanding and interpretation of natural selection has resulted from the way it has been described in the context of statistics and mathematics. I argue for these claims by tracing attempts to understand the basis of natural selection from its early formulation as a statistical theory to its later development by R. A. Fisher, one of the founders of modern population genetics. Not only did these developments put natural selection on a firm theoretical foundation, but its mathematization changed the way it was understood as a biological process. Instead of simply clarifying its status, mathematical techniques were responsible for redefining or reconceptualising selection. As a corollary, I show how a highly idealised mathematical law that seemingly fails to describe any concrete system can nevertheless contain a great deal of accurate information and enhance our understanding well beyond its predictive capabilities.
The paper presents an argument for treating certain types of computer simulation as having the same epistemic status as experimental measurement. While this may seem a rather counterintuitive view, it becomes less so when one looks carefully at the role that models play in experimental activity, particularly measurement. I begin by discussing how models function as "measuring instruments" and go on to examine the ways in which simulation can be said to constitute an experimental activity. By focussing on the connections between models and their various functions, simulation, and experiment, one can begin to see similarities in the practices associated with each type of activity. Establishing the connections between simulation and particular types of modelling strategies, and highlighting the ways in which those strategies are essential features of experimentation, allows us to clarify the contexts in which we can legitimately call computer simulation a form of experimental measurement.
Although the recent emphasis on models in philosophy of science has been an important development, the consequence has been a shift away from more traditional notions of theory. Because the semantic view defines theories as families of models, and because much of the literature on "scientific" modeling has emphasized various degrees of independence from theory, little attention has been paid to the role that theory plays in articulating scientific knowledge. This paper is the beginning of what I hope will be a redress of the imbalance. I begin with a discussion of some of the difficulties faced by various formulations of the semantic view, not only with respect to their account of models but also with their definition of a theory. From there I go on to articulate reasons why a notion of theory is necessary for capturing the structure of scientific knowledge and how one might go about formulating such a notion in terms of different levels of representation and explanation. The context for my discussion is the BCS account of superconductivity, a 'theory' that was, and still is, sometimes referred to as a 'model'. BCS provides a useful focus for the discussion because it illuminates various features of the theory/model relationship that seem to require a robust notion of theory, one not easily captured by the semantic account.
The paper examines philosophical issues that arise in contexts where one has many different models for treating the same system. I show why in some cases this appears relatively unproblematic (models of turbulence), while in others (nuclear models) it presents genuine difficulties for interpreting the information that the models provide. What the examples show is that while complementary models needn't be a hindrance to knowledge acquisition, the kind of inconsistency present in the nuclear cases is, since it is indicative of a lack of genuine theoretical understanding. It is important to note that the differences in modeling do not result directly from the status of our knowledge of turbulent flows as opposed to nuclear dynamics; both face fundamental theoretical problems in the construction and application of models. However, as we shall see, the 'problem context(s)' in which the modeling takes place plays a decisive role in evaluating the epistemic merit of the models themselves. Moreover, the theoretical difficulties that give rise to inconsistent as opposed to complementary models (in the cases I discuss) impose epistemic and methodological burdens that cannot be overcome by invoking philosophical strategies like perspectivism, paraconsistency or partial structures.
Prandtl's work on boundary layer theory is an interesting example for illustrating several important issues in the philosophy of science, such as the relation between theories and models and whether it is possible to distinguish, in a principled way, between pure and applied science. In what follows I discuss several proposals by the symposium participants regarding the interpretation of Prandtl's work and whether it should be characterized as an instance of applied science. My own interpretation of this example (1999) emphasised the degree of autonomy embedded in Prandtl's boundary layer model and the way it became integrated into the larger theoretical context of hydrodynamics. In addition to extending that discussion here, I also claim that the characterization of applied science that formed the basis for the symposium does not enable us to successfully distinguish applied science from the general practice of 'applying' basic scientific knowledge in a variety of contexts.
Ernst Mayr has criticised the methodology of population genetics for being essentialist: interested only in "types" as opposed to individuals. In fact, he goes so far as to claim that "he who does not understand the uniqueness of individuals is unable to understand the working of natural selection" (1982, 47). This is a strong claim indeed, especially since many of those responsible for the development of population genetics (especially Fisher, Haldane, and Wright) were avid Darwinians. In order to unravel this apparent incompatibility, I want to examine the possible sources and implications of essentialism in this context and show why the kind of mathematical analysis found in Fisher's work is better seen as extending the theory of natural selection to a broader context than as inhibiting its applicability.
In earlier work I argued that the frameworks and mechanisms that produce unification do not enable us to explain why the unified phenomena behave as they do; that is, we need to look beyond the unifying process for an explanation of these phenomena. Anya Plutynski has called into question my claim about the relationship between unification and explanation, as well as my characterization of it in the context of the early synthesis of Mendelism with Darwinian natural selection. In this paper I argue that her methodological criticisms rest on a misinterpretation of my views on explanation, and I defend my historical interpretation of the work of Fisher and Wright. The discussion proceeds in four parts: a statement of the problem; methodological differences (how to characterize explanation); historical matters (disagreements about details); and explanation revisited (the possible versus the 'merely actual').
In addition to its obvious successes within kinetic theory, the ideal gas law and the modeling assumptions associated with it have been used to treat phenomena in domains as diverse as economics and biology. One reason for this is that it is useful to model these systems using aggregates and statistical relationships. The issue I deal with here is the way R. A. Fisher used the model of an ideal gas as a methodological device for examining the causal role of selection in producing variation in Mendelian populations. The model enabled him to construct the kind of population in which one could measure the effects of selection in a way that could not be done empirically. Consequently, we are able to see how the model of an ideal gas was transformed into a biological model that functioned as an instrument both for investigating nature and for developing a new theory of genetics.
The debate between the Mendelians and the (largely Darwinian) biometricians has been referred to by R. A. Fisher as 'one of the most needless controversies in the history of science' and by David Hull as 'an explicable embarrassment'. The literature on this topic consists mainly of explanations of why the controversy occurred and what factors prevented it from being resolved; regrettably, little or no mention is made of the issues that figured in its resolution. This paper deals with the latter topic and in doing so recasts the debate as one between Karl Pearson and R. A. Fisher rather than between the biometricians and the Mendelians. One reason for this reorientation is that Pearson's own work in 1904 and 1909 suggested that Mendelism and biometry could, to some extent, be made compatible, yet he remained steadfast in his rejection of Mendelism. The interesting question, then, is why Fisher, who was also a proponent of biometric methods, was able to synthesise the two traditions in a way that Pearson either could not or would not. My answer to this question involves an analysis of the ways in which different kinds of assumptions were used in modelling Mendelian populations. I argue that it is these assumptions, which lay behind the statistical techniques of Pearson and Fisher, that can be isolated as the source of Pearson's rejection of Mendelism and of Fisher's success in the synthesis.
One of Nancy Cartwright's arguments for entity realism focuses on the non-redundancy of causal explanation. In How the Laws of Physics Lie she uses an example from laser theory to illustrate how we can have a variety of theoretical treatments governing the same phenomena while allowing just one causal story. In what follows I show that, in the particular example Cartwright chooses, causal explanation exhibits the same kind of redundancy present in theoretical explanation. In an attempt to salvage Cartwright's example, the causal explanation could be reinterpreted as a capacity claim, as outlined in her recent work Nature's Capacities and Their Measurement. However, I argue that capacities cannot be isolated in the way that Cartwright suggests, and consequently these capacity claims also fail to provide a unique causal story. We can, however, make sense of capacities by characterizing them in a relational way, and I offer some ideas as to how this approach would retain our intuitions about capacities while denying their ontological priority as dormant powers.
Some very persuasive arguments have been put forward in recent years in support of the disunity of science. Despite this, one is forced to acknowledge that unification, especially the practice of unifying theories, remains a crucial aspect of scientific practice. I explore specific aspects of this tension by examining the nature of theory unification and how it is achieved in the case of the electroweak theory. I claim that because the process of unifying theories is largely dependent on particular kinds of mathematical structures, it is possible to have a theory that displays a degree of unity at the level of theoretical structure without an accompanying ontological unity or reduction. As a result, unity and disunity can coexist not only within science but within the same theory.
This paper is intended as an extension of some of the recent discussion in the philosophical literature on the nature of experimental evidence. In particular, I examine the role of empirical evidence attained through the use of deductions from phenomena. This approach to theory construction has been widely used throughout the history of science, by Newton and Einstein as well as by Clerk Maxwell. I discuss a particular formulation of Maxwell's electrodynamics, one he claims was deduced from experimental facts. However, the deduction is problematic in that it is not immediately clear that one of the crucial parameters of the theory, the displacement current, can be given an empirical foundation. In outlining Maxwell's argument and his attempts to arrive at an empirically based account of the electromagnetic field equations, I draw attention to the philosophical implications of the constraints on theory that arise in this particular case of deduction from phenomena.
In The Foundations of Space-Time Theories Friedman argues for a literal, realistic interpretation of theoretical structures that participate in theory unification. His account of the relationship between observational and theoretical structure is characterized as that of model to submodel and involves a reductivist strategy that allows for the conjunction of certain theoretical structures with other structures which, taken together, form a truly unified theory. Friedman criticizes the representational account for its failure to allow for a literal interpretation and conjunction of theoretical structure. I argue that, contra Friedman, the representationalist account can sanction a literal interpretation and in fact presents a more accurate account of scientific practice than the model-submodel account. The strict reductivism characteristic of the model-submodel approach can in some cases be seen to prevent rather than facilitate a literal account of theoretical structure. Because of the dependence Friedman places on reduction for his account of conjunction, and because that reduction cannot be sustained, it would appear that Friedman's own account fails to achieve what it was designed to do.
Morrison offers an illuminating study of two linked traditions that have figured prominently in twentieth-century thought: Buddhism and the philosophy of Nietzsche. Nietzsche admired Buddhism but saw it as a dangerously nihilistic religion; he forged his own affirmative philosophy in reaction against the nihilism that he feared would overwhelm Europe. Morrison shows that Nietzsche's influential view of Buddhism was mistaken and that, far from being nihilistic, Buddhism has notable and perhaps surprising affinities with Nietzsche's own project of the transvaluation of all values.
In this book, Morrison discusses the process of aesthetic education as defined by Johann Joachim Winckelmann, on the basis of his status as arbiter of classical taste, and as applied to his teaching of two pupils. Morrison identifies the key features of Winckelmann's treatment of classical beauty and elucidates how Winckelmann taught the appreciation of beauty. He argues that Winckelmann's practice of aesthetic education fell short of his aesthetic theory. Morrison concludes by looking at Goethe's aesthetic self-education, which was strongly influenced by Winckelmann.
Linda Morrison brings the voices and issues of a little-known, complex social movement to the attention of sociologists, mental health professionals, and the general public. The members of this social movement work to gain voice for their own experience, to raise consciousness of injustice and inequality, to expose the darker side of psychiatry, and to promote alternatives for people in emotional distress. Talking Back to Psychiatry explores the movement's history, its complex membership, its strategies and goals, and the varied response it has received from psychiatry, policy makers, and the public at large.
This paper addresses the role of integrity in global leadership. It reviews the philosophical literature on ethics and suggests that both contractarianism and pluralism are particularly helpful for understanding ethics from a global leadership perspective. It also reviews the challenges to integrity that arise through interactions both external and internal to the company. Finally, the paper offers suggestions on how global leaders can define appropriate ethical standards for themselves and their organizations.
When we listen to music, what do we listen to and for? How do we listen? How well do we listen and how do we listen well? This paper suggests that ‘modes of engagement’ are the active, operational means by which listeners experience music and that listening experiences more often than not involve multiple interacting modes rather than a fixed mode throughout. Modes of engagement may be voluntarily employed or involuntarily adopted; they may be technical or descriptive; they may involve explicitly musical details and relationships, or they may seem more peripheral to the music. In the end, though, successive, simultaneous, and interacting modes of engagement are said to define unique and meaningful trajectories through music as heard.
Before a general cognitive model for recurrent complex visual hallucinations (RCVH) is accepted, there must be more research into the neuropsychological and cognitive characteristics of the various disorders in which they occur. Currently available data are insufficient to determine whether the similar phenomenology of RCVH across different disorders is in fact produced by a single cognitive mechanism or by multiple mechanisms.
In an attempt to gain some control over ever-escalating health care costs, many organizations have moved to a managed care concept of health benefits. Managed care health benefit strategies account for well over 90 percent of all employer-sponsored health benefit programs. In essence, managed care coverage usually demands, at a minimum, some form of utilization review with regard to provider services. Thus the privacy of the traditional doctor-patient relationship must inevitably be modified when managed care enters the picture.