One of the hallmarks of Kantian philosophy, especially in connection with its characterization of scientific knowledge, is the importance of unity, a theme that is also the driving force behind a good deal of contemporary high energy physics. There are a variety of ways that unity figures in modern science—there is unity of method where the same kinds of mathematical techniques are used in different sciences, like physics and biology; the search for unified theories like the unification of electromagnetism and optics by Maxwell; and, more recently, the project of grand unification or the quest for a theory of everything which involves a reduction of the four fundamental forces under the umbrella of a single theory. In this latter case it is thought that when energies are high enough, the forces, while very different in strength, range and the types of particles on which they act, become one and the same force. The fact that these interactions are known to have many underlying mathematical features in common suggests that they can all be described by a unified field theory. Such a theory describes elementary particles in terms of force fields which further unifies all the interactions by treating particles and interactions in a technically and conceptually similar way. It is this theoretical framework that allows for the prediction that measurements made at a certain energy level will supposedly indicate that there is only one type of force. In other words, not only is there an ontological reduction of the forces themselves but the mathematical framework used to describe the fields associated with these forces facilitates their description in a unified theory.
Specific types of symmetries serve an important function in establishing these kinds of unity, not only in the construction of quantum field theories but also in the classification of particles; classifications that can lead to new predictions and new ways of understanding properties like quantum numbers. Hence, in order to address issues about unification and reduction in contemporary physics we must also address the way that symmetries facilitate these processes.
This book is about the methods used for unifying different scientific theories under one all-embracing theory. The process has characterized much of the history of science and is prominent in contemporary physics; the search for a 'theory of everything' involves the same attempt at unification. Margaret Morrison argues that, contrary to popular philosophical views, unification and explanation often have little to do with each other. The mechanisms that facilitate unification are not those that enable us to explain how or why phenomena behave as they do. A feature of this book is an account of many case studies of theory unification in nineteenth- and twentieth-century physics and of how evolution by natural selection and Mendelian genetics were unified into what we now term evolutionary genetics.
Morrison and Morgan argue for a view of models as 'mediating instruments' whose role in scientific theorising goes beyond applying theory. Models are partially independent of both theories and the world. This autonomy allows for a unified account of their role as instruments that allow for exploration of both theories and the world.
Linda Morrison brings the voices and issues of a little-known, complex social movement to the attention of sociologists, mental health professionals, and the general public. The members of this social movement work to gain voice for their own experience, to raise consciousness of injustice and inequality, to expose the darker side of psychiatry, and to promote alternatives for people in emotional distress. Talking Back to Psychiatry explores the movement's history, its complex membership, its strategies and goals, and the varied response it has received from psychiatry, policy makers, and the public at large.
Morrison offers an illuminating study of two linked traditions that have figured prominently in twentieth-century thought: Buddhism and the philosophy of Nietzsche. Nietzsche admired Buddhism, but saw it as a dangerously nihilistic religion; he forged his own affirmative philosophy in reaction against the nihilism that he feared would overwhelm Europe. Morrison shows that Nietzsche's influential view of Buddhism was mistaken, and that far from being nihilistic, it has notable and perhaps surprising affinities with Nietzsche's own project of the transvaluation of all values.
We establish that, due to certain quantum indeterminacies, there must be foundational colours that do not reliably cause any particular experience. This report functions as an appendix to Morrison's "Colour in a Physical World".
At the Editors' request, I have given this paper the final revision which Mr. Morrison has not time to give. This was needed chiefly in II, in the establishment of the stemma, and in the early part of IV. In these parts Mr. Morrison must not be held responsible for the details, though I have endeavoured to give his conclusions. In II the credit is his for the identification of the sororis filius in Quintilian, Inst. Or. xi. 2. 14, as Antiochus, for the view that Antiochus is an Aleuad, and therefore the three Echecratidae also, and for the consequent interpretation of Thuc. i. 111. In IV I found it difficult to revise Mr. Morrison's detailed interpretation of ‘Herodes’ and have omitted much. It will be understood that this procedure does little justice to his views, though I have tried to suppress nothing which bore directly on his main argument.
After pointing out the great influence of this philosophical system, Mr. Morrison loses no time in going to the heart of it. He is well qualified for this study, having spent much time in reading and discussion concerning the great proponents of the system.
A project based at the Centre for Educational Development, Appraisal and Research at the University of Warwick is currently exploring the formal and informal ways in which children and adults experience food and eating in schools. Conducted by Burgess & Morrison during 1993‐94, the project forms part of the Economic and Social Research Council's Nation's Diet Initiative. Using data from the project, this paper explores food choice and consumption in relation to the institutional dynamics of two English secondary schools and to pupils’ interpretations of internal and external influences upon their understandings about food. Here, the apparent ordinariness of eating is considered alongside multiple perceptions of food as they link to educational experience, and to identities forged from family, gender and media interests. Descriptions and interpretations are explored in terms of coherence and clarity as well as ambivalence and contradiction. Data analysis offers a range of messages for educationists, nutritionists and sociologists. In particular, conclusions point to the need for a continuing reappraisal of the formal and informal mechanisms of food‐focused education. When more is being asked of nutrition education in schools, much remains to be evaluated systematically.
Jeffry H. Morrison offers readers the first comprehensive look at the political thought and career of John Witherspoon—a Scottish Presbyterian minister and one of America’s most influential and overlooked founding fathers. Witherspoon was an active member of the Continental Congress and was the only clergyman both to sign the Declaration of Independence and to ratify the federal Constitution. During his tenure as president of the College of New Jersey at Princeton, Witherspoon became a mentor to James Madison and influenced many leaders and thinkers of the founding period. He was uniquely positioned at the crossroads of politics, religion, and education during the crucial first decades of the new republic. Morrison locates Witherspoon in the context of early American political thought and charts the various influences on his thinking. This impressive work of scholarship offers a broad treatment of Witherspoon’s constitutionalism, including his contributions to the mediating institutions of religion and education, and to political institutions from the colonial through the early federal periods. This book will be appreciated by anyone with an interest in American political history and thought and in the relation of religion to American politics. “I have been waiting a long time for such a book on John Witherspoon. This book is not only well-researched, but well-written. The story Morrison tells is quite wonderful.” —_Michael Novak, American Enterprise Institute for Public Policy Research_ "Dr. John Witherspoon is at once an exceptionally influential figure in Early American history, and a sadly neglected one. Professor Morrison's book fills this gap in American political history brilliantly. It is especially revealing of 18th century views on the interrelationships between education, religion, and society. Morrison presents new insights into the Early American understanding of balancing faith, government, and society.
It will change our conceptions of this period and provide fresh perspectives on contemporary problems. Everyone interested in the American Founding era is indebted to Morrison for this illuminating book." —_Garrett Ward Sheldon, University of Virginia's College at Wise_ "At last we have a full and learned account, as the title states, of _John Witherspoon and the Founding of the American Republic_. Including discussion of Witherspoon's direct role in the crucial events of 1775-1790 as an advocate of Independence and friend of the Constitution, as a contributor to early American religious and political thought, and most important, as a mentor to James Madison and other Princeton revolutionaries and nation-builders, Morrison reveals Witherspoon's high standing in American religious, educational, and political history. Madison remembered Witherspoon's injunction to his students to 'Lead useful Lives;' he provided an excellent role model." —_Ralph Ketcham, Syracuse University_.
In this book, Morrison discusses the process of aesthetic education, as defined by Johann Joachim Winckelmann on the basis of his status as arbiter of classical taste and as applied to his teaching of two pupils. Morrison identifies the key features of Winckelmann's treatment of classical beauty and elucidates how Winckelmann taught the appreciation of beauty. He argues that Winckelmann's practice of aesthetic education fell short of his aesthetic theory. Morrison concludes by looking at Goethe's aesthetic self-education, which was strongly influenced by Winckelmann.
Models as Mediators discusses the ways in which models function in modern science, particularly in the fields of physics and economics. Models play a variety of roles in the sciences: they are used in the development, exploration and application of theories and in measurement methods. They also provide instruments for using scientific concepts and principles to intervene in the world. The editors provide a framework which covers the construction and function of scientific models, and explore the ways in which they enable us to learn about both theories and the world. The contributors to the volume offer their own individual theoretical perspectives to cover a wide range of examples of modelling, from physics, economics and chemistry. These papers provide ideal case-study material for understanding both the concepts and typical elements of modelling, using analytical approaches from the philosophy and history of science.
In 2012, CRISPR-Cas9, a new and promising gene manipulation technique, was announced; it seems likely to become a foundational technique in health care and agriculture. However, patents on the technique have already been granted. As with other technological developments, there are concerns of social justice regarding inequalities in access. Given the technology’s “foundational” nature and societal impact, it is vital for such concerns to be translated into workable recommendations for policymakers and legislators. Colin Farrelly has proposed a moral justification for the use of patents to speed up the arrival of technology by encouraging innovation and investment. While sympathetic to his argument, this article highlights a number of problems. By examining the role of patents in CRISPR and in two previous foundational technologies, we make some recommendations for realistic and workable guidelines for patenting and licensing.
The book examines issues related to the way modeling and simulation enable us to reconstruct aspects of the world we are investigating. It also investigates the processes by which we extract concrete knowledge from those reconstructions and how that knowledge is legitimated.
The paper presents an argument for treating certain types of computer simulation as having the same epistemic status as experimental measurement. While this may seem a rather counterintuitive view it becomes less so when one looks carefully at the role that models play in experimental activity, particularly measurement. I begin by discussing how models function as “measuring instruments” and go on to examine the ways in which simulation can be said to constitute an experimental activity. By focussing on the connections between models and their various functions, simulation and experiment one can begin to see similarities in the practices associated with each type of activity. Establishing the connections between simulation and particular types of modelling strategies and highlighting the ways in which those strategies are essential features of experimentation allows us to clarify the contexts in which we can legitimately call computer simulation a form of experimental measurement.
The Ontology for Biomedical Investigations (OBI) is an ontology that provides terms with precisely defined meanings to describe all aspects of how investigations in the biological and medical domains are conducted. OBI re-uses ontologies that provide a representation of biomedical knowledge from the Open Biological and Biomedical Ontologies (OBO) project and adds the ability to describe how this knowledge was derived. We here describe the state of OBI and several applications that are using it, such as adding semantic expressivity to existing databases, building data entry forms, and enabling interoperability between knowledge resources. OBI covers all phases of the investigation process, such as planning, execution and reporting. It represents information and material entities that participate in these processes, as well as roles and functions. Prior to OBI, it was not possible to use a single internally consistent resource that could be applied to multiple types of experiments for these applications. OBI has made this possible by creating terms for entities involved in biological and medical investigations and by importing parts of other biomedical ontologies such as GO, Chemical Entities of Biological Interest (ChEBI) and Phenotype Attribute and Trait Ontology (PATO) without altering their meaning. OBI is being used in a wide range of projects covering genomics, multi-omics, immunology, and catalogs of services. OBI has also spawned other ontologies (Information Artifact Ontology) and methods for importing parts of ontologies (Minimum information to reference an external ontology term (MIREOT)). The OBI project is an open cross-disciplinary collaborative effort, encompassing multiple research communities from around the globe. To date, OBI has created 2366 classes and 40 relations along with textual and formal definitions.
The OBI Consortium maintains a web resource providing details on the people, policies, and issues being addressed in association with OBI.
Enhanced indispensability arguments (EIA) claim that Scientific Realists are committed to the existence of mathematical entities due to their reliance on inference to the best explanation. Our central question concerns this purported parity of reasoning: do people who defend the EIA make appropriate use of the resources of Scientific Realism to achieve platonism? We argue that the fact that a variety of different inferential strategies can be employed by Scientific Realists does not mean that ontological conclusions concerning which things we should be Scientific Realists about are arrived at by any inferential route which eschews causes, nor is there any direct pressure for Scientific Realists to change their inferential methods. We suggest that in order to maintain inferential parity with Scientific Realism, proponents of the EIA need to give details about how and in what way the presence of mathematical entities directly contributes to explanations.
Perceptual Confidence is the view that perceptual experiences assign degrees of confidence. After introducing, clarifying, and motivating Perceptual Confidence, I catalogue some of its more interesting consequences, such as the way it blurs the distinction between veridical and illusory experiences, a distinction that is sometimes said to carry a lot of metaphysical weight. I also explain how Perceptual Confidence fills a hole in our best scientific theories of perception and why it implies that experiences don't have objective accuracy conditions.
Although the recent emphasis on models in philosophy of science has been an important development, the consequence has been a shift away from more traditional notions of theory. Because the semantic view defines theories as families of models and because much of the literature on “scientific” modeling has emphasized various degrees of independence from theory, little attention has been paid to the role that theory has in articulating scientific knowledge. This paper is the beginning of what I hope will be a redress of the imbalance. I begin with a discussion of some of the difficulties faced by various formulations of the semantic view not only with respect to their account of models but also with their definition of a theory. From there I go on to articulate reasons why a notion of theory is necessary for capturing the structure of scientific knowledge and how one might go about formulating such a notion in terms of different levels of representation and explanation. The context for my discussion is the BCS account of superconductivity, a `theory' that was, and still is, sometimes referred to as a `model'. BCS provides a nice focus for the discussion because it illuminates various features of the theory/model relationship that seem to require a robust notion of theory that is not easily captured by the semantic account.
Throughout the biological and biomedical sciences, prescriptive ‘minimum information’ (MI) checklists specifying the key information to include when reporting experimental results are beginning to find favor with experimentalists, analysts, publishers and funders alike. Such checklists aim to ensure that methods, data, analyses and results are described to a level sufficient to support the unambiguous interpretation, sophisticated search, reanalysis and experimental corroboration and reuse of data sets, facilitating the extraction of maximum value from them. However, such checklists are usually developed independently by groups working within particular biologically- or technologically-delineated domains. Consequently, an overview of the full range of checklists can be difficult to establish without intensive searching, and even tracking the evolution of a single checklist may be a non-trivial exercise. Checklists are also inevitably partially redundant when measured one against another, and where they overlap is far from straightforward. Furthermore, conflicts in scope and arbitrary decisions on wording and sub-structuring inhibit their use in combination. Overall, these issues present significant difficulties for the users of checklists, especially those in areas such as systems biology, who routinely combine information from multiple biological domains and technology platforms. To address all of the above, we present MIBBI (Minimum Information for Biological and Biomedical Investigations); a web-based communal resource for such checklists, designed to act as a ‘one-stop shop’ for those exploring the range of extant checklist projects, and to foster collaborative, integrative development and ultimately promote gradual integration of checklists.
In “Perceptual Confidence,” I argue that our perceptual experiences assign degrees of confidence. In “Precision, not Confidence, Describes the Uncertainty of Perceptual Experience,” Rachel Denison disagrees. In this reply I first clarify what I mean by ‘perceptual experiences’, ‘assign’ and ‘confidence’. I then argue, contra Denison, that perception involves automatic categorization, and that there is an intrinsic difference between a blurry perception of a sharp image and a sharp perception of a blurry image.
This paper is an examination of evidential holism, a prominent position in epistemology and the philosophy of science which claims that experiments only ever confirm or refute entire theories. The position is historically associated with W.V. Quine, and it is at once both popular and notorious, as well as being largely under-described. But even though there’s no univocal statement of what holism is or what it does, philosophers have nevertheless made substantial assumptions about its content and its truth. Moreover they have drawn controversial and important conclusions from these assumptions. In this paper I distinguish three types of evidential holism and argue that the most oft-cited and controversial thesis is entirely unmotivated. The other two theses are much overlooked, but are well-motivated and free from controversial implications.
The paper examines philosophical issues that arise in contexts where one has many different models for treating the same system. I show why in some cases this appears relatively unproblematic (models of turbulence) while others represent genuine difficulties when attempting to interpret the information that models provide (nuclear models). What the examples show is that while complementary models needn’t be a hindrance to knowledge acquisition, the kind of inconsistency present in nuclear cases is, since it is indicative of a lack of genuine theoretical understanding. It is important to note that the differences in modeling do not result directly from the status of our knowledge of turbulent flows as opposed to nuclear dynamics—both face fundamental theoretical problems in the construction and application of models. However, as we shall see, the ‘problem context(s)’ in which the modeling takes place plays a decisive role in evaluating the epistemic merit of the models themselves. Moreover, the theoretical difficulties that give rise to inconsistent as opposed to complementary models (in the cases I discuss) impose epistemic and methodological burdens that cannot be overcome by invoking philosophical strategies like perspectivism, paraconsistency or partial structures.
The indispensability argument is a method for showing that abstract mathematical objects exist. Various versions of this argument have been proposed. Lately, commentators seem to have agreed that a holistic indispensability argument will not work, and that an explanatory indispensability argument is the best candidate. In this paper I argue that the dominant reasons for rejecting the holistic indispensability argument are mistaken. This is largely due to an overestimation of the consequences that follow from evidential holism. Nevertheless, the holistic indispensability argument should be rejected, but for a different reason: in order for an indispensability argument relying on holism to work, it must invoke an unmotivated version of evidential holism. Such an argument will be unsound. Correcting the argument with a proper construal of evidential holism means that it can no longer deliver mathematical Platonism as a conclusion: such an argument for Platonism will be invalid. I then show how the reasons for rejecting the holistic indispensability argument importantly constrain what kind of account of explanation will be permissible in explanatory versions.
I suggest a solution to two puzzles in Spinoza's metaphysics. The first puzzle involves the mind and the idea of the mind, in particular how they can be identical, even though the mind thinks about bodies and nothing else, whereas the idea of the mind thinks about ideas and nothing else. The second puzzle involves the mind and the idea of a thing that belongs to an unknown attribute, in particular how they can be identical, even though the mind thinks about bodies and nothing else, whereas the idea thinks about things belonging to the unknown attribute and nothing else. I suggest that Spinoza would respond to both puzzles by rejecting the Indiscernibility of Identicals.
The debate between the Mendelians and the (largely Darwinian) biometricians has been referred to by R. A. Fisher as ‘one of the most needless controversies in the history of science’ and by David Hull as ‘an explicable embarrassment’. The literature on this topic consists mainly of explaining why the controversy occurred and what factors prevented it from being resolved. Regrettably, little or no mention is made of the issues that figured in its resolution. This paper deals with the latter topic and in doing so reorients the focus of the debate as one between Karl Pearson and R. A. Fisher rather than between the biometricians and the Mendelians. One reason for this reorientation is that Pearson's own work in 1904 and 1909 suggested that Mendelism and biometry could, to some extent, be made compatible, yet he remained steadfast in his rejection of Mendelism. The interesting question then is why Fisher, who was also a proponent of biometric methods, was able to synthesise the two traditions in a way that Pearson either could not or would not. My answer to this question involves an analysis of the ways in which different kinds of assumptions were used in modelling Mendelian populations. I argue that it is these assumptions, which lay behind the statistical techniques of Pearson and Fisher, that can be isolated as the source of Pearson's rejection of Mendelism and Fisher's success in the synthesis.
According to anti-atomism, we represent color properties (e.g., red) in virtue of representing color relations (e.g., redder than). I motivate anti-atomism with a puzzle involving a series of pairwise indistinguishable chips. I then develop two versions of anti-atomism.
Conception and causation are fundamental notions in Spinoza's metaphysics. I argue against the orthodox view that, due to the causal axiom, if one thing is conceived through another thing, then the second thing causes the first thing. My conclusion forces us to rethink Spinoza's entitlement to some of his core commitments, including the principle of sufficient reason, the parallelism doctrine and the conatus doctrine.
Many of the arguments against reductionism and fundamental theory as a method for explaining physical phenomena focus on the role of models as the appropriate vehicle for this task. While models can certainly provide us with a good deal of explanatory detail, problems arise when attempting to derive exact results from approximations. In addition, models typically fail to explain much of the stability and universality associated with critical point phenomena and phase transitions, phenomena sometimes referred to as "emergent." The paper examines the connection between theoretical principles like spontaneous symmetry breaking and emergent phenomena and argues that new ways of thinking about emergence and fundamentalism are required in order to account for the behavior of many phenomena in condensed matter and other areas of physics.
I will develop a new problem for almost all realist theories of colour. The problem involves fluctuations in our colour experiences that are due to visual noise rather than changes in the objects we are looking at.
This paper addresses the role of integrity in global leadership. It reviews the philosophy of ethics and suggests that both contractarianism and pluralism are particularly helpful in understanding ethics from a global leadership perspective. It also reviews the challenges to integrity that come through interactions that are both external and internal to the company. Finally, the paper provides helpful suggestions on how global leaders can define appropriate ethical standards for themselves and their organizations.
In this paper I argue for a distinction between subjective and value-laden aspects of judgements, showing why equating the former with the latter has the potential to confuse matters when the goal is uncovering political influences on scientific practice. I will focus on three separate but interrelated issues. The first concerns the issue of ‘verification’ in computational modelling. This is a practice that involves a number of formal techniques but as I show, even these allegedly objective methods ultimately rely on subjective estimation and evaluation of different types of parameters. This has implications for my second point which relates to uncertainty quantification—an assessment of the degree of uncertainty present in a particular modelling scenario. I argue that while this practice also involves subjective elements, in no way does that detract from its status as an epistemic exercise. Finally I discuss the relation between accuracy and uncertainty and how each relates to judgements that embody social/ethical/political concerns, in particular those associated with high consequence systems.
The Cambridge Companion to Socrates is a collection of essays providing a comprehensive guide to Socrates, the most famous Greek philosopher. Because Socrates himself wrote nothing, our evidence comes from the writings of his friends, his enemies, and later writers. Socrates is thus a literary figure as well as a historical person. Both aspects of Socrates' legacy are covered in this volume. Socrates' character is full of paradox, and so are his philosophical views. These paradoxes have led to deep differences in scholars' interpretations of Socrates and his thought. Mirroring this wide range of thought about Socrates, this volume's contributors are unusually diverse in their background and perspective. The essays in this volume were authored by classical philologists, philosophers and historians from Germany, Francophone Canada, Britain and the United States, and they represent a range of interpretive and philosophical traditions.