After the discovery of the structure of DNA in 1953, scientists working in molecular biology embraced reductionism—the theory that all complex systems can be understood in terms of their components. Reductionism, however, has been widely resisted by both nonmolecular biologists and scientists working outside the field of biology. Many of these antireductionists, nevertheless, embrace the notion of physicalism—the idea that all biological processes are physical in nature. How, Alexander Rosenberg asks, can these self-proclaimed physicalists also be antireductionists? With clarity and wit, Darwinian Reductionism navigates this difficult and seemingly intractable dualism with convincing analysis and timely evidence. In the spirit of the few distinguished biologists who accept reductionism—E. O. Wilson, Francis Crick, Jacques Monod, James Watson, and Richard Dawkins—Rosenberg provides a philosophically sophisticated defense of reductionism and applies it to molecular developmental biology and the theory of natural selection, ultimately proving that the physicalist must also be a reductionist.
Mechanistic models in molecular systems biology are generally mathematical models of the action of networks of biochemical reactions, involving metabolism, signal transduction, and/or gene expression. They can be either simulated numerically or analyzed analytically. Systems biology integrates quantitative molecular data acquisition with mathematical models to design new experiments, discriminate between alternative mechanisms and explain the molecular basis of cellular properties. At the heart of this approach are mechanistic models of molecular networks. We focus on the articulation and development of mechanistic models, identifying five constraints which guide the articulation of models in molecular systems biology. These constraints are not independent of one another, with the result that modeling becomes an iterative process. We illustrate the use of these constraints in the modeling of the mechanism for bistability in the lac operon.
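The bistability mentioned at the end of this abstract can be illustrated with a deliberately minimal sketch, not the model discussed in the paper: a single differential equation in which a product activates its own synthesis through a Hill function, so that two initial conditions settle into distinct stable steady states. All names and parameter values below are hypothetical choices for illustration only.

```python
def simulate(x0, basal=0.05, vmax=1.0, K=0.5, n=2, decay=1.0,
             dt=0.01, steps=5000):
    """Euler-integrate dx/dt = basal + vmax * x^n / (K^n + x^n) - decay * x,
    a toy positive-feedback loop in which x promotes its own production."""
    x = x0
    for _ in range(steps):
        x += dt * (basal + vmax * x**n / (K**n + x**n) - decay * x)
    return x

# Two initial conditions on either side of the unstable threshold (~0.25)
low = simulate(0.1)   # relaxes to the "off" steady state
high = simulate(0.5)  # relaxes to the "on" steady state
print(low, high)
```

With these parameters the system has two stable fixed points separated by an unstable one, so trajectories diverge depending on which side of the threshold they start from; this hysteresis is the qualitative signature of the bistability the abstract attributes to the lac operon.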
Understanding how scientific activities use naming stories to achieve disciplinary status is important not only for insight into the past, but for evaluating current claims that new disciplines are emerging. In order to gain a historical understanding of how new disciplines develop in relation to these baptismal narratives, we compare two recently formed disciplines, systems biology and genomics, with two earlier related life sciences, genetics and molecular biology. These four disciplines span the twentieth century, a period in which the processes of disciplinary demarcation fundamentally changed from those characteristic of the nineteenth century. We outline how the establishment of each discipline relies upon an interplay of factors that include paradigmatic achievements, technological innovation, and social formations. Our focus, however, is the baptism stories that give the new discipline a founding narrative and articulate core problems, general approaches and constitutive methods. The highly plastic process of achieving disciplinary identity is further marked by the openness of disciplinary definition, tension between technological possibilities and the ways in which scientific issues are conceived and approached, synthesis of reductive and integrative strategies, and complex social interactions. The importance – albeit highly variable – of naming stories in these four cases indicates the scope for future studies that focus on failed disciplines or competing names. Further attention to disciplinary histories could, we suggest, give us richer insight into scientific development.
The present paper analyzes the use and understanding of the homology concept across different biological disciplines. It is argued that in its history, the homology concept underwent a sort of adaptive radiation. Once it migrated from comparative anatomy into new biological fields, the homology concept changed in accordance with the theoretical aims and interests of these disciplines. The paper gives a case study of the theoretical role that homology plays in comparative and evolutionary biology, in molecular biology, and in evolutionary developmental biology. It is shown that the concept or variant of homology preferred by a particular biological field is used to bring about items of biological knowledge that are characteristic for this field. A particular branch of biology uses its homology concept to pursue its specific theoretical goals.
As opposed to the dismissive attitude toward reductionism that is popular in current philosophy of mind, a “ruthless reductionism” is alive and thriving in “molecular and cellular cognition”—a field of research within cellular and molecular neuroscience, the current mainstream of the discipline. Basic experimental practices and emerging results from this field imply that two common assertions by philosophers and cognitive scientists are false: (1) that we do not know much about how the brain works, and (2) that lower-level neuroscience cannot explain cognition and complex behavior directly. These experimental practices involve intervening directly with molecular components of sub-cellular and gene expression pathways in neurons and then measuring specific behaviors. These behaviors are tracked using tests that are widely accepted by experimental psychologists to study the psychological phenomenon at issue (e.g., memory, attention, and perception). Here I illustrate these practices and their importance for explanation and reduction in current mainstream neuroscience by describing recent work on social recognition memory in mammals.
The theory of concepts advanced in the dissertation aims at accounting for a) how a concept makes successful practice possible, and b) how a scientific concept can be subject to rational change in the course of history. Traditional accounts in the philosophy of science have usually studied concepts in terms only of their reference; their concern is to establish a stability of reference in order to address the incommensurability problem. My discussion, in contrast, suggests that each scientific concept consists of three components of content: 1) reference, 2) inferential role, and 3) the epistemic goal pursued with the concept's use. I argue that in the course of history a concept can change in any of these three components, and that change in one component—including change of reference—can be accounted for as being rational relative to other components, in particular a concept's epistemic goal. This semantic framework is applied to two cases from the history of biology: the homology concept as used in 19th and 20th century biology, and the gene concept as used in different parts of the 20th century. The homology case study argues that the advent of Darwinian evolutionary theory, despite introducing a new definition of homology, did not bring about a new homology concept (distinct from the pre-Darwinian concept) in the 19th century. Nowadays, however, distinct homology concepts are used in systematics/evolutionary biology, in evolutionary developmental biology, and in molecular biology. The emergence of these different homology concepts is explained as occurring in a rational fashion. The gene case study argues that conceptual progress occurred with the transition from the classical to the molecular gene concept, despite a change in reference. In the last two decades, change occurred internal to the molecular gene concept, so that nowadays this concept's usage and reference varies from context to context.
I argue that this situation emerged rationally and that the current variation in usage and reference is conducive to biological practice. The dissertation uses ideas and methodological tools from the philosophy of mind and language, the philosophy of science, the history of science, and the psychology of concepts.
Neurophysiological research suggests our mental life is related to the cellular processes of particular nerves. In the spirit of Occam’s razor, some authors take these connections as reductions of psychological terms and kinds to molecular-biological mechanisms and patterns. Bickle’s ‘intervene cellularly/molecularly and track behaviourally’ reduction is one example of this. Here the mental is reduced to the physical in two steps. The first is, through genetically altered mammals, to causally alter the activity of particular nerve cells, i.e. neurons, at the molecular level; the second is, under controlled experimental conditions, to use generally accepted behavioural tests within psychology to monitor the results of these manipulations. In this article, we argue that Bickle’s case example for molecular reduction, i.e. the reduction of long-term memory to its cellular-molecular mechanisms, cannot support his claims, because it turns out that his chosen molecular pathway is neither a sufficient nor a necessary condition for the memory consolidation switch; thus, instead of rejecting the multiple realization argument, Bickle’s argument actually speaks in favour of it. The idea of reductive connections between our mental life and the activity of particular nerves is therefore, at present, still more fiction than reality.
Traditional approaches to theory structure and theory change in science do not fare well when confronted with the practice of certain fields of science. We offer an account of contemporary practice in molecular biology designed to address two questions: Is theory change in this area of science gradual or saltatory? What is the relation between molecular biology and the fields of traditional biology? Our main focus is a recent episode in molecular biology, the discovery of enzymatic RNA. We argue that our reconstruction of this episode shows that traditional approaches to theory structure and theory change need considerable refinement if they are to be defended as generally applicable. This paper emerged from discussions between us, and we are both equally responsible for its errors. We would like to thank Yvonne Paterson for helpful comments.
The comprehension of living organisms in all their complexity poses a major challenge to the biological sciences. Recently, systems biology has been proposed as a new candidate in the development of such a comprehension. The main objective of this paper is to address what systems biology is and how it is practised. To this end, the basic tools of a systems biological approach are explored and illustrated. In addition, it is questioned whether systems biology ‘revolutionizes’ molecular biology and ‘transcends’ its assumed reductionism. The strength of this claim appears to depend on how molecular and systems biology are characterised and on how reductionism is interpreted. Doing credit to molecular biology and to methodological reductionism, it is argued that the distinction between molecular and systems biology is gradual rather than sharp. As such, the classical challenge in biology to manage, interpret and integrate biological data into functional wholes is further intensified by systems biology’s use of modelling and bioinformatics, and by its scale enlargement.
This paper considers the relationship between continuum hydrodynamics and discrete molecular dynamics in the context of explaining the behavior of breaking droplets. It is argued that the idealization of a fluid as a continuum is actually essential for a full explanation of the drop breaking phenomenon and that, therefore, the less "fundamental," emergent hydrodynamical theory plays an ineliminable role in our understanding.
A new event is defined as an intervention in the time-reversible dynamical trajectories of particles in a system. New events are then assumed to be quantum fluctuations in the spatial and momentum coordinates, and mental action is assumed to work by ordering such fluctuations. It is shown that when the cumulative values of such fluctuations in a mean free path of a molecule are magnified by molecular interaction at the end of that path, the momentum of a molecule can be changed from its original direction to any other direction. In this way mental action can produce effects through the ordering of thermal motions. Examples are given which show that the ordering of 10^4–10^5 molecules is sufficient to (a) produce detectable PK results and (b) open sufficient ion channels in the brain to initiate a physical action. The relationship of the above model to the arrow of time is discussed.
Since the 1930s, scientists studying the neurological disease scrapie had assumed that the infectious agent was a virus. By the mid 1960s, however, several unconventional properties had arisen that were difficult to reconcile with the standard viral model. Evidence for nucleic acid within the pathogen was lacking, and some researchers considered the possibility that the infectious agent consisted solely of protein. In 1982, Stanley Prusiner coined the term 'prion' to emphasize the agent's proteinaceous nature. This infectious protein hypothesis was denounced by many scientists as 'heretical'. This essay asks why the concept of an infectious protein was considered controversial. Some biologists justified their evaluation of this hypothesis on the grounds that an infectious protein contradicted the 'central dogma of molecular biology'. Others referred to vague theoretical constraints such as molecular biology's 'theoretical structure' or 'framework'. Examination of the objections raised by researchers reveals exactly what generalizations were being challenged by a protein model of infection. This two-part survey of scrapie and prion research reaches several conclusions: (1) A theoretical framework is present in molecular biology, exerting its influence in hypothesis formation and evaluation; (2) This framework consists of several related, yet separable, generalizations or 'elements', including Francis Crick's Central Dogma and Sequence Hypothesis, plus notions concerning infection, replication, protein synthesis, and protein folding; (3) The term 'central dogma' has stretched beyond Crick's original 1958 definition to encompass at least two other 'framework elements': replication and protein synthesis; and (4) From the study of scrapie and related diseases, biological information has been delineated into at least two classes: sequential and what I call 'conformational'. In Part I of this essay, a brief review of the central dogma, as outlined by both Francis Crick and James Watson, will be given. The developments in scrapie research from 1965 to 1972 will then be traced. This section will summarize many of the puzzling, non-viral-like properties of the scrapie agent. Alternative hypotheses to the viral explanation will also be presented, including early versions of a protein-only hypothesis. Part II of this essay will follow the developments in scrapie and prion research from the mid 1970s through 1991. The growing prominence of a protein-only model of infection will be balanced by continued objections from many researchers to a pathogen devoid of nucleic acid. These objections will help illuminate those generalizations in molecular biology that were indeed challenged by a protein-only model of infection.
McDonald and Kreitman (1991) propose a test of the neutral mutation-random drift (NM-RD) hypothesis, the central claim of the neutral theory of molecular evolution. The test involves generating predictions from the NM-RD hypothesis about patterns of molecular substitutions. Alternative selection hypotheses predict that the data will deviate from the predictions of the NM-RD hypothesis in specifiable ways. To conduct the test, McDonald and Kreitman examine the evolutionary dynamics of the alcohol dehydrogenase (Adh) gene in three species of Drosophila. The test compares the number of DNA sequence changes between species and within species. The number of DNA differences is an indicator of the evolutionary rate of the Adh gene. Based on the test they conclude that there is strong evidence for adaptive protein evolution at particular sites in the gene. Understanding the test requires some basic knowledge about molecular terms and the predictions of neutral theory. The two important terms are fixed differences and polymorphisms. These are determined by comparing DNA sequences made up of thousands of individual nucleotide sites. A site that is unchanged within a species but different from a related species counts as a fixed difference. These are mutations that occur in some common ancestor of the lineage such that all descendants inherit the change. A site that differs within a species counts as a polymorphism. Determining the number of fixed differences and polymorphisms requires placing each individual gene sequence onto a phylogenetic tree. A coalescent tree charts the ancestral relationships for a set of individual gene sequences. Sequences sampled from within a species form a within-species tree. The common ancestors of each within-species tree form a between-species tree. A detected difference counts as a polymorphism or a fixed difference depending on where it occurs in the phylogenetic tree (cf. Table 1). The test uses the numbers of polymorphisms and fixed differences as indicators of evolutionary rates.
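The site classification described in this abstract can be sketched in a few lines of code. This is a schematic illustration only, not McDonald and Kreitman's actual procedure, which further partitions changes into synonymous and nonsynonymous classes and uses the placement of each change on the phylogenetic tree; here a site simply counts as polymorphic if it varies within either species' sample, and as a fixed difference if each sample is internally uniform but the two species differ.

```python
def count_fixed_and_polymorphic(species_a, species_b):
    """Classify each aligned nucleotide site from two species' samples.

    A site is polymorphic if it varies within either sample, and a
    fixed difference if both samples are internally uniform but the
    two species carry different bases at that site.
    """
    fixed = polymorphic = 0
    for site in range(len(species_a[0])):
        bases_a = {seq[site] for seq in species_a}
        bases_b = {seq[site] for seq in species_b}
        if len(bases_a) > 1 or len(bases_b) > 1:
            polymorphic += 1
        elif bases_a != bases_b:
            fixed += 1
    return fixed, polymorphic

# Toy alignments (hypothetical sequences, not the Adh data)
species_a = ["ACGT", "ACGA"]   # final site varies within species A
species_b = ["TCGA", "TCGA"]   # first site differs between species
print(count_fixed_and_polymorphic(species_a, species_b))  # → (1, 1)
```

Comparing the ratio of these two counts across classes of sites is what lets the test detect departures from the neutral prediction that both counts reflect the same underlying mutation rate.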
One important aspect of biological explanation is detailed causal modeling of particular phenomena in limited experimental background conditions. Recognising this opens a new avenue for intertheoretic reduction. Reductions in biology are possible when one fully recognises that a sufficient condition for a reduction in biology is a molecular model of 1) only the demonstrated causal parameters of a biological model and 2) only within a replicable experimental background. These intertheoretic identifications, which are ubiquitous in biology and form the basis of ruthless reductions (Bickle 2003), are criticised as merely “local” (Sullivan 2009) or “fragmentary” (Schaffner 2006). However, in an instructive case, a biological model is preserved in molecular terms, and a complex biological phenomenon has been successfully reduced. In doing so, the molecular model remains valid in a broader range of background conditions and meaningfully unites disparate biological phenomena.
Mulliken proposed an Aufbauprinzip for molecules on the basis of molecular spectroscopy while establishing, point by point, his concept of the molecular orbit. The concept of the electronic state became the lever for his attribution of electronic configurations to a molecule. In 1932, the concept of the orbit was transmuted into that of the molecular orbital to integrate the probabilistic approach of Born and to achieve quantitative accuracy. On the basis of the quantum work of Hund, Wigner and Lennard-Jones, and of group theory, he suggested the fragment method to establish the characteristics of molecular orbitals for polyatomic molecules. These developments shed light on the relation between a molecular whole and its parts. An operational realism combined with the second law of thermodynamics can open up promising avenues in the mereological study of chemical systems.
An assessment is offered of the recent debate on information in the philosophy of biology, and an analysis is provided of the notion of information as applied in scientific practice in molecular genetics. In particular, this paper deals with the dependence of basic generalizations of molecular biology, above all the ‘central dogma’, on the so-called ‘informational talk’ (Maynard Smith [2000a]). It is argued that talk of information in the ‘central dogma’ can be reduced to causal claims. In that respect, the primary aim of the paper is to consider a solution to the major difficulty of the causal interpretation of genetic information: how to distinguish the privileged causal role assigned to nucleic acids, DNA in particular, in the processes of replication and protein production. A close reading is proposed of Francis H. C. Crick's On Protein Synthesis (1958) and related works, to which we owe the first explicit definition of information within the scientific practice of molecular biology.
Philosophical discussion of molecular and developmental biology began in the late 1960s with the use of genetics as a test case for models of theory reduction. With this exception, the theory of natural selection remained the main focus of philosophy of biology until the late 1970s. It was controversies in evolutionary theory over punctuated equilibrium and adaptationism that first led philosophers to examine the concept of developmental constraint. Developmental biology also gained in prominence in the 1980s as part of a broader interest in the new sciences of self-organization and complexity. The current literature in the philosophy of molecular and developmental biology has grown out of these earlier discussions under the influence of twenty years of rapid and exciting growth of empirical knowledge. Philosophers have examined the concepts of genetic information and genetic program, competing definitions of the gene itself and competing accounts of the role of the gene as a developmental cause. The debate over the relationship between development and evolution has been enriched by theories and results from the new field of 'evolutionary developmental biology'. Future developments seem likely to include an exchange of ideas with the philosophy of psychology, where debates over the concept of innateness have created an interest in genetics and development.
In their review essay (published in this issue), Looren de Jong and Schouten take my 2003 book to task for (among other things) neglecting to keep up with the latest developments in my favorite scientific case study (memory consolidation). They claim that these developments have been guided by psychological theorizing and have replaced neurobiology's traditional 'static' view of consolidation with a 'dynamic' alternative. This shows that my 'essential but entirely heuristic' treatment of higher-level cognitive theorizing is a mistaken view of actual scientific practice. In response I contend that, on the contrary, a closer look at the experiments and data on memory reconsolidation following reactivation suggests (1) a less revolutionary judgment about the proposed alternative, and (2) a now-complete reliance on ruthlessly reductive experimental methods from cellular and molecular neuroscience. These conclusions save the heuristic status I propose for higher-level investigations of behavior and brain. I close with a brief comment on their further charge that I 'sell out' philosophy of science to factual developments in science itself.
The increasing place of evolutionary scenarios in functional biology is one of the major indicators of the present encounter between evolutionary biology and functional biology (such as physiology, biochemistry and molecular biology), the two branches of biology which remained separated throughout the twentieth century. Evolutionary scenarios were not absent from functional biology, but their place was limited, and they did not generate research programs. I compare two examples of these past scenarios with two present-day ones. At least three characteristics distinguish present from past efforts: an excellent description of the systems under study, a rigorous use of the evolutionary models, and the possibility of experimentally testing the evolutionary scenarios. These three criteria allow us to distinguish the domains in which the encounter is likely to be fruitful from those where the obstacles to be overcome are high and the proposed scenarios have to be treated with considerable circumspection.
This paper investigates what molecular biology has done for our understanding of the gene. I base a new account of the gene concept of classical genetics on the classical dogma that gene differences cause phenotypic differences. Although contemporary biologists often think of genes in terms of this concept, molecular biology provides a second way to understand genes. I clarify this second way by articulating a molecular gene concept. This concept unifies our understanding of the molecular basis of a wide variety of phenomena, including the phenomena that classical genetics explains in terms of gene differences causing phenotypic differences.
This paper concerns the scale-related decoupling of the physics of breaking drops and considers the phenomenon from the point of view of both hydrodynamics and molecular dynamics at the nanolevel. It takes the shape of droplets at breakup to be an example of a genuinely emergent phenomenon---one whose explanation depends essentially on the phenomenological (non-fundamental) theory of Navier-Stokes. Certain conclusions about the nature of "fundamental" theory are drawn.
This paper examines Boltzmann’s responses to the Loschmidt reversibility objection to the H-theorem, as presented in his Lectures on Gas Theory. I describe and evaluate two distinct conceptions of the assumption of molecular disorder found in this work, and contrast these notions with the Stosszahlansatz, as well as with the predominant contemporary conception of molecular disorder. Both these conceptions are assessed with respect to the reversibility objection. Finally, I interpret Boltzmann as claiming that a state of molecular disorder serves as a necessary condition for the application of probabilistic arguments. This in turn offers a way to bridge the conceptual gap between the H-theorem and his combinatorial argument.
The concept of molecular structure is fundamental to the practice and understanding of chemistry, but the meaning of this term has evolved and is still evolving. The Born–Oppenheimer separation of electronic and nuclear motions lies at the heart of most modern quantum chemical models of molecular structure. While this separation introduces a great computational and practical simplification, it is neither essential to the conceptual formulation of molecular structure nor universally valid. Going beyond the Born–Oppenheimer approximation introduces new paradigms, bringing fresh insight into the chemistry of fluxional molecules, proteins, superconductors and macroscopic dielectrics, thus opening up new avenues for exploration. But it requires that our ideas of molecular structure evolve beyond simple ball-and-stick models.
This paper is devoted to an examination of the discovery, characterization, and analysis of the functions of microRNAs, which also serves as a vehicle for demonstrating the importance of exploratory experimentation in current (post-genomic) molecular biology. The material on microRNAs is important in its own right: it provides important insight into the extreme complexity of regulatory networks involving components made of DNA, RNA, and protein. These networks play a central role in regulating development of multicellular organisms and illustrate the importance of epigenetic as well as genetic systems in evolution and development. The examination of these matters yields principled arguments for the historicity of the functions of key biological molecules and for the indispensability of exploratory experimentation in contemporary molecular biology as well as some insight into the complex interplay between exploratory experimentation and hypothesis-driven science. This latter result is not only of importance for philosophy of science, but also of practical importance for the evaluation of grant proposals, although the elaboration of this latter claim must be left for another occasion.
Caenorhabditis elegans (C. elegans) is a tiny worm that has become the focus of a large number of worldwide research projects examining its genetics, development, neuroscience, and behavior. Recently several groups of investigators have begun to tie together the behavior of the organism and the underlying genes, neural circuits, and molecular processes implemented in those circuits. Behavior is quintessentially organismal, since it is the organism as a whole that moves and mates, but the explanations are devised at the molecular and neurocircuit levels, and tested in populations using protocols that span many levels of aggregation. Following a brief review of the main relevant features of C. elegans, I describe some of these circuits, and then discuss two contrasting approaches in behavioral genetics and neural network analysis of the worm. Finally, I outline the rudiments of a "field and focus" explanation model using the two contrasting approaches.
Quantum systems have a holistic structure, which implies that they cannot be divided into parts. In order to create (sub)objects like individual substances, molecules, nuclei, etc., in a universal whole, the Einstein-Podolsky-Rosen correlations between all the subentities, e.g. all the molecules in a substance, must be suppressed by perceptual and mental processes. Here the particular problems of Gestalt (shape) perception are compared with the attempts to attribute a shape to a quantum mechanical system like a molecule. Gestalt perception and quantum mechanics turn out (on an informal level) to show similar features and problems: holistic aspects, creation of objects, dressing procedures, influence of the observer, classical quantities and structures. The attribute 'classical' of a property or structure means that holistic correlations to any other quantity do not exist or that these correlations are considered as irrelevant and therefore eliminated (either deliberately and by declaration or in a mental process that is not under rational control). An example of an imposed classical structure is the nuclear frame of a molecule. Candidates for classical properties that are not imposed by the observer could be the charge of a particle or the handedness of a molecule. It is argued here that at least part of a molecule's shape can be generated automatically by the environment. A molecular shape of this sort arises in addition to Lamb shift-type energy corrections.
The “DNA is a program” metaphor is still widely used in Molecular Biology and its popularization. There are good historical reasons for the use of such a metaphor or theoretical model. Yet we argue that both the metaphor and the model are essentially inadequate, also from the point of view of Physics and Computer Science. Relevant work has already been done, in Biology, criticizing the programming paradigm. We will refer to empirical evidence and theoretical writings in Biology, although our arguments will be mostly based on a comparison with the use of differential methods (in Molecular Biology: a mutation or the like is observed or induced and its phenotypic consequences are observed) as applied in Computer Science and in Physics, where this fundamental tool for empirical investigation originated and acquired a well-justified status. In particular, as we will argue, the programming paradigm is not theoretically sound as a causal (as in Physics) or deductive (as in Programming) framework for relating the genome to the phenotype, in contrast to the physicalist and computational grounds that this paradigm claims to propose.
I defend the view that single experiments can provide a sufficient reason for preferring one among a group of hypotheses against the widely held belief that “crucial experiments” are impossible. My argument is based on the examination of a historical case from molecular biology, namely the Meselson-Stahl experiment. “The most beautiful experiment in biology”, as it is known, provided the first experimental evidence for the operation of a semi-conservative mechanism of DNA replication, as predicted by Watson and Crick in 1953. I use a mechanistic account of explanation to show that this case is best construed as an inference to the best explanation (IBE). Furthermore, I show how such an account can deal with Duhem's well-known arguments against crucial experiments as well as Van Fraassen's “bad lot” argument against IBE.
The study of mental illness by the methods of molecular genetics is still in its infancy, but the use of genetic markers in psychiatry may potentially lead to a Virchowian revolution in the conception of mental illness. Genetic markers may define novel clusters of patients having diverse clinical presentations but sharing a common genetic and mechanistic basis. Such clusters may differ radically from the conventional classification schemes of psychiatric illness. However, the reduction of even relatively simple Mendelian phenomena to molecular genetics has been shown to be a surprisingly complex and problematic enterprise. Mental illnesses exist at many levels, including social, environmental, and developmental interactions. Reductionistic shifts in the classification of such a disease entity will have to address the interlevel dynamics that take place within the structure of theories of mental illness. The question of how molecular analysis of psychiatric disease will impact on the structure of existing theories and classification systems is the central topic of this paper. Keywords: disease, philosophy of biology, psychiatry, reductionism
In the 1960s molecular population geneticists used Monte Carlo experiments to evaluate particular diffusion equation models. In this paper I examine the nature of this comparative evaluation and argue for three claims: first, Monte Carlo experiments are genuine experiments; second, Monte Carlo experiments can provide an important means for evaluating the adequacy of highly idealized theoretical models; and, third, the evaluation of the computational adequacy of a diffusion model with Monte Carlo experiments is significantly different from the evaluation of the empirical adequacy of the same diffusion model.
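The kind of comparison described in this abstract can be illustrated with a minimal sketch (not one of the actual 1960s models): a Wright-Fisher Monte Carlo simulation of neutral genetic drift, checked against the standard diffusion-theory prediction that a neutral allele fixes with probability equal to its initial frequency. The function name and parameter values below are illustrative assumptions, not drawn from the paper.

```python
import random

def wright_fisher_fixation(N, p0, reps, seed=1):
    """Monte Carlo estimate of the fixation probability of a neutral
    allele under Wright-Fisher drift in a diploid population of size N."""
    rng = random.Random(seed)
    copies = 2 * N                    # gene copies per generation
    fixed = 0
    for _ in range(reps):
        count = round(p0 * copies)    # initial number of A alleles
        while 0 < count < copies:
            p = count / copies
            # Binomial resampling of the next generation (pure drift).
            count = sum(1 for _ in range(copies) if rng.random() < p)
        fixed += (count == copies)
    return fixed / reps

# Diffusion theory predicts that a neutral allele fixes with probability
# equal to its initial frequency, independent of population size.
estimate = wright_fisher_fixation(N=20, p0=0.25, reps=2000)
print(estimate)
```

Comparing `estimate` with the diffusion prediction 0.25 is a (toy) instance of checking the computational adequacy of a diffusion model against a Monte Carlo experiment.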
Taking reduction in the traditional deductive sense, the programmatic claim that most of genetics can be reduced by molecular genetics is defended as feasible and significant. Arguments by Ruse and Hull that either the relationship is replacement or at best a weaker form of reduction are shown to rest on a mixture of historical and logical confusions about the nature of the theories involved.
A crucial part of the knowledge of molecular biologists is procedural knowledge, that is, knowledge of how to do things in laboratories. Procedural knowledge of molecular biologists involves both perceptual-motor skills and cognitive skills. We discuss such skills required in performing the most commonly used molecular biology techniques, namely, Polymerase Chain Reaction and gel electrophoresis. We argue that procedural knowledge involved in performing these techniques is more than just knowing their protocols. Creative exploration and experience are essential for the acquisition of procedural knowledge in molecular biology. With enough experience, molecular biologists make intuitive judgments without recourse to analytical reasoning. We propose that procedural knowledge is intuitive recognition of the patterns of one's environment that are the most relevant for making a decision or acting appropriately. Finally, we argue that knowledge of molecular biologists requires an integration of procedural knowledge and propositional knowledge.
It is argued that the conventional descriptions of chemical bonds as covalent, ionic, metallic, and Van der Waals are compromising the usefulness of quantum mechanics in the synthesis and design of new molecules and materials. Parallels are drawn between the state of chemistry now and when the idea that phlogiston was an element impeded the development of chemistry. Overcoming the current obstacles will require new methods to describe molecular structure and bonding, just as new concepts were needed before the phlogiston theory could be set aside.
Advances in molecular biology have generally been taken to support the claim that biology is reducible to chemistry. I argue against that claim by looking in detail at a number of central results from molecular biology and showing that none of them supports reduction because (1) their basic predicates have multiple realizations, (2) their chemical realization is context-sensitive and (3) their explanations often presuppose biological facts rather than eliminate them. I then consider the heuristic and confirmational implications of irreducibility and argue that purely biochemical approaches are likely to be unsound and to be unable to confirm an important range of statements. I conclude by sketching criteria for scientific unity that do not entail reducibility and yet leave an important place for identifying underlying mechanisms. Molecular biology, properly understood, provides an excellent paradigm of non-reductive unity between different explanatory levels.
Insights into the role of sleep in the molecular mechanisms of memory consolidation may come from studies of activity-dependent synaptic plasticity, such as long-term potentiation (LTP). This commentary posits a specific contribution of sleep to LTP stabilization, in which mRNA transported to dendrites during wakefulness is translated during sleep. Brain-derived neurotrophic factor may drive the translation of newly transported and resident mRNA.
Genes in Development is a collection of 13 stimulating essays on "post genomic" approaches to the concept of the gene. At the risk of caricaturing some complex balances, the contributors tend to be skeptical about genetic determinism, the central dogma of molecular biology, reductionism, genes as programs and the concept of the gene as a DNA sequence. They tend to like emergent properties, complexity theory, the parity thesis for developmental resources, developmental systems theory, and membranes. But within this broad weltanschauung the essays in Genes in Development vary widely in their interests and emphases––from the history of twentieth century genetics to the social and ethical issues raised by contemporary genetics––which makes for an attractive and valuable collection.
A recent literature review of commentaries and ‘state of the art’ articles from researchers in psychiatric molecular genetics (PMG) offers a consensus about progress in the science of genetics, disappointments in the discovery of new and effective treatments, and a general optimism about the future of the field. I argue that optimism for the field of PMG is overwrought, and consider progress in the field in reference to a sample estimate of US National Institute of Mental Health funding for this paradigm for the years 2008 and 2009. I conclude that the amount of financial investment in PMG is questionable from an ethical perspective, given other research and clinical needs in the USA.
The applicability of Nagel's concept of theory reduction, and related concepts of reduction, to the reduction of genetics to molecular biology is examined using the lactose operon in Escherichia coli as an example. Geneticists have produced the complete nucleotide sequence of two of the genes which compose this operon. If any example of reduction in genetics should fit Nagel's analysis, the lactose operon should. Nevertheless, Nagel's formal conditions of theory reduction are inapplicable in this case. Instead, it is argued that genetics has been partially reduced to molecular biology in the sense of token-token reduction.
Certain correspondences appear between the classifications, and between the classes, of various entities at the molecular genetic level: types of fundamental correspondences between classifications and between classes of normal entities, on the one hand, and of mutant entities, on the other; and ranks of correspondences between classifications and between classes of entities. The concept of the universality of the genetic code is reformulated on the basis of these correspondences.
“I myself was forced to call myself a molecular biologist because when inquiring clergymen asked me what I did, I got tired of explaining that I was a mixture of crystallographer, biophysicist, biochemist, and geneticist.” Thus explained Francis Crick, who with James Watson discovered in 1953 the double helical structure of DNA, the genetic material.
The convergence of biomedical sciences with nanotechnology as well as ICT has created a new wave of biomedical technologies, resulting in visions of a ‘molecular medicine’. Since novel technologies tend to shift concepts of disease and health, this paper investigates how the emerging field of molecular medicine may shift the meaning of ‘disease’ as well as the boundary between health and disease. It gives a brief overview of the development towards and the often very speculative visions of molecular medicine. Subsequently three views of disease often used in the philosophy of medicine are briefly discussed: the ontological or neo-ontological, the physiological and the normative/holistic concepts of disease. Against this background two tendencies in the field of molecular medicine are highlighted: (1) the use of a cascade model of disease and (2) the notion of disease as a deviation from an individual pattern of functioning. It becomes clear that molecular medicine pulls conceptualizations of disease and health in several, partly opposed directions. However, the resulting tensions may also offer opportunities to steer the future of medicine in more desirable directions.
A recurrent theme in ethnomethodological research is that of instructed actions. Contrary to the classic traditions in the social and cognitive sciences, which attribute logical priority or causal primacy to instructions, rules, and structures of action, ethnomethodologists investigate the situated production of actions which enable such formulations to stand as adequate accounts. Consequently, a recitation of formal structures cannot count as an adequate sociological description, when no account is given of the local production of what those structures describe. The natural sciences can be described as a domain of practical action in which the use of methods enables the intersubjective reproduction of naturalistic observations and experiments. As numerous sociological studies of laboratory practices have shown, the achievement of intersubjective order cannot be reduced to formal methods; instead, it arises from the work of custom-fitting relevant methods to the local circumstances of the research. In this paper we discuss a possible extension of this idea to cover two intertwined aspects of molecular biology: (1) the work of following instructions on how to perform routine laboratory procedures, and (2) the relationship between cellular orders and the encoded instructions contained in the DNA molecule. We suggest that a classic conception of scientific action is implied by the way formal instructions are treated as a primary basis, both for molecular biologists' actions and the cellular functions they study, and we envision an ethnomethodological alternative to those conceptions of social and biological order.
Scientific anomalies are observations and facts that contradict current scientific theories and they are instrumental in scientific theory change. Philosophers of science have approached scientific theory change from different perspectives as Darden (Theory change in science: Strategies from Mendelian genetics, 1991) observes: Lakatos (In: Lakatos, Musgrave (eds) Criticism and the growth of knowledge, 1970) approaches it as progressive “research programmes” consisting of incremental improvements (“monster barring” in Lakatos, Proofs and refutations: The logic of mathematical discovery, 1976), Kuhn (The structure of scientific revolutions, 1996) observes that changes in “paradigms” are instigated by a crisis from some anomaly, and Hanson (In: Feigl, Maxwell (eds) Current issues in the philosophy of science, 1961) proposes that discovery does not begin with hypothesis but with some “problematic phenomena requiring explanation”. Even though anomalies are important in all of these approaches to scientific theory change, there have been only a few investigations into the specific role anomalies play in scientific theory change. Furthermore, many of these approaches focus on the theories themselves and not on how the scientists and their experiments bring about scientific change (Gooding, Experiment and the making of meaning: Human agency in scientific observation and experiment, 1990). To address these issues, this paper approaches scientific anomaly resolution from a meaning construction point of view. Conceptual integration theory (Fauconnier and Turner, Cogn Sci 22:133–187, 1996; The way we think: Conceptual blending and mind’s hidden complexities, 2002) from cognitive linguistics describes how one constructs meaning from various stimuli, such as text and diagrams, through conceptual integration or blending.
The conceptual integration networks that describe the conceptual integration process characterize cognition that occurs unconsciously during meaning construction. These same networks are used to describe some of the cognition involved in resolving an anomaly in molecular genetics called RNA interference (RNAi) in a case study. The RNAi case study is a cognitive-historical reconstruction (Nersessian, In: Giere (ed) Cognitive models of science, 1992) that reconstructs how the RNAi anomaly was resolved. This reconstruction traces four relevant molecular genetics publications in describing the cognition necessary in accounting for how RNAi was resolved through strategies (Darden 1991), abductive reasoning (Peirce, In: Hartshorne, Weiss (eds) Collected papers, 1958), and experimental reasoning (Gooding 1990). The results of the case study show that experiments play a crucial role in formulating an explanation of the RNAi anomaly and the integration networks describe the experiments’ role. Furthermore, these results suggest that RNAi anomaly resolution is embodied, in the sense that the cognition described in the cognitive-historical reconstruction is experientially based.
Where there are cases of underdetermination in scientific controversies, such as the case of the molecular clock, scientists may direct the course and terms of dispute by playing off the multidimensional framework of theory evaluation. This is because assessment strategies themselves are underdetermined. Within the framework of assessment, there are a variety of trade-offs between different strategies as well as shifting emphases as specific strategies are given more or less weight in assessment situations. When a strategy is underdetermined, scientists can change the dynamics of a controversy by making assessments using different combinations of evaluation strategies and/or weighting whatever strategies are in play in different ways. Following an underdetermination strategy does not end or resolve a scientific dispute. Consequently, manipulating underdetermination is a feature of controversy dynamics and not controversy closure.
This article offers three contrasting cases of the use of neutrality and drift in molecular evolution. In the first, neutrality is assumed as a simplest case for modeling. In the second and third, concepts of drift and neutrality are developed within the context of population genetics testing and the development and application of the molecular clock.
Senior molecular geneticists were interviewed about their perceptions of the ethical and social implications of genetic knowledge. Inductive analysis of these interviews identified a number of strategies through which the scientists negotiated their moral responsibilities as they participated in generating knowledge that presents difficult ethical questions. These strategies included: further analysis and application of scientific method; clarification of multiple roles; negotiation with the public through public debate, institutional processes of funding, ethics committees and legislation; and personal responsibility.
Biological psychiatry has been dominated by a psychopharmacologically-driven neurotransmitter dysfunction paradigm. The objective of this paper is to explore a reductionist assumption underlying this paradigm, and to suggest an improvement on it. The methods used are conceptual analysis with a comparative approach, particularly using illustrations from the history of both biological psychiatry and molecular biology. The results are that complete reduction to physicochemical explanations is not fruitful, at least in the initial stages of research in the medical and life sciences, and that an appropriate (non-reducible) integrative principle - addressing a property of the whole system under study - is required for each domain of research. This is illustrated in Pauling's use of a topological integrative principle for the discovery of the functioning of proteins and in Watson and Crick's use of the notion of a genetic code as an integrative principle for the discovery of the structure of genes. The neurotransmitter dysfunction paradigm addresses single molecules and their neural pathways, yet their interactions within the CNS as a whole seem most pertinent to mental disorders such as schizophrenia. The lack within biological psychiatry of an integrative principle addressing a property of the CNS as a whole may be responsible for the empirical failure of orthomolecular psychiatry, as well as for the central role that serendipity has played in the study of mental disorders, which is dominated by the neurotransmitter paradigm. The conclusion is that research in biological psychiatry may benefit from using, at least initially, some integrative principle(s) addressing a property of the CNS as a whole, such as connectionism or a hierarchical notion.
Quantum mechanical and molecular dynamics methods were used to analyze the structure and stability of neutral and zwitterionic configurations of the extracted active-site sequence from a Burkholderia cepacia lipase, histidyl-seryl-glutamine (His86-Ser87-Gln88), and its mutated form, histidyl-cysteyl-glutamine (His86-Cys87-Gln88), in vacuum and in different solvents. The effects of solvent dielectric constant, explicit and implicit water molecules, and side-chain mutation on the structure and stability of this sequence in both neutral and zwitterionic forms are reported. The quantum mechanics computations show that the relative stability of the zwitterionic and neutral configurations depends on the solvent structure and its dielectric constant. Thus, in vacuum and the non-polar solvents considered, the neutral form of the sequences of interest is more stable than the zwitterionic form, while in aqueous solution and the polar solvents investigated the zwitterionic form is more stable than the neutral form in most cases. However, on the calculated potential energy surfaces there is a barrier to proton transfer from the positively charged ammonium group to the negatively charged carboxylate group, from the ammonium group to the adjacent carbonyl oxygen, or from the side-chain oxygen and sulfur to the negatively charged carboxylate group. Molecular dynamics (MD) simulations were also performed using periodic boundary conditions for the zwitterionic configuration of the hydrated molecules in a box of water molecules. The results obtained demonstrate that the presence of explicit water molecules yields more compact structures of the studied molecules. These simulations also indicate that side-chain mutation and replacement of sulfur with oxygen lead to reduced molecular flexibility and packing.
How to interpret the “molecular gene” concept is discussed in this paper. I argue that the architecture of biological systems is hierarchical and multi-layered, exhibiting striking similarities to that of modern computers. Multiple layers exist between the genotype and system level property, the phenotype. This architectural complexity gives rise to the intrinsic complexity of the genotype-phenotype relationships. The notion of a gene being for a phenotypic trait or traits lacks adequate consideration of this complexity and has limitations in explaining the genotype-phenotype relationships. I explore ways toward an integrative interpretation of the gene in the context of multi-layered biological systems. A gene, I argue, should be interpreted as a functional unit that is responsible for the trans-generation passage of the capacity to dynamically produce a biochemical activity or biochemical activities. At the molecular level, a gene is a genetic unit, a stretch of DNA sequence, which dictates the behavior and the dynamic production of the encoded cellular component(s). Embedded in a gene’s quadruple DNA code are the regulatory signals, such as those for RNA splicing and/or editing, as well as for transcription factor binding. A regulatory signal can be recognized by the gene expression machinery in one state, but not in another. The confusion caused by RNA splicing, editing, and a gene’s selective tissue distribution pattern is addressed. Instead of a context-dependent definition of the gene, I argue for the view that it is the same gene displaying multiple meanings, subject to differential interpretation by the cellular machinery in different states. In other words, the same gene gives rise to different products and expression levels under different conditions.
Molecular Weismannism is the claim that: In the development of an individual, DNA causes the production both of DNA (genetic material) and of protein (somatic material). The reverse process never occurs. Protein is never a cause of DNA. This principle underpins both the idea that genes are the objects upon which natural selection operates and the idea that traits can be divided into those that are genetic and those that are not. Recent work in developmental biology and in philosophy of biology argues that an acceptance of Molecular Weismannism requires the tacit assumption that genetic causes are different in kind from other developmental causes. On this view, if this assumption proves to be unwarranted then we should abandon, not just gene selectionism and gene centred functional solutions to the units of selection problem, but also the very notion that there is any such thing as a genetic trait. A group of possible causal distinctions (proximity, ultimacy and specificity) are explored and found wanting. It is argued that an extended version of information theory, while not strong enough to support Molecular Weismannism, will support both the claim that traits can be divided into those that are genetic and those that are not as well as the claim that there is good reason to privilege genetic causes within evolutionary and developmental explanations. The outcome of this for the units of selection debate is explored.
According to ‘standard histories’ of nanotechnology, the colorful pictures of atoms produced by scanning probe microscopists since the 1980s essentially inspired visions of molecular nanotechnology. In this paper, I provide an entirely different account that, nonetheless, refers to aesthetic inspiration. First, I argue that the basic idea of molecular nanotechnology, i.e., producing molecular devices, has been the goal of supramolecular chemistry that emerged earlier, without being called nanotechnology. Secondly, I argue that in supramolecular chemistry the production of molecular devices was inspired by an aesthetic phenomenon of gestalt switch, by certain images that referred to both molecules and ordinary objects, and thus symbolically bridged the two worlds. This opened up a new way of perceiving and drawing molecular images and new approaches to chemical synthesis. Employing Umberto Eco’s semiotic theory of aesthetics, I analyze the gestalt switch and the inspiration to build molecular devices and to develop a new sign language for supramolecular chemistry. More generally, I argue that aesthetic phenomena can play an important role in directing scientific research and that aesthetic theories can help understand such dynamics, such that they need to be considered in philosophy of science.
In my previous article on the benzene problem, I described how Pauling's valence bond (resonance) theory, sometimes regarded as a modernized version of Kekule's oscillation hypothesis, came to be accepted by chemists by the end of World War II. But the alternative molecular orbital theory, proposed by Mulliken, had already been developed and was regarded as quantitatively superior by many quantum chemists, though it was not as easy to visualize and did not seem to harmonize as well with traditional chemical concepts. During the 1950s and 1960s, thanks to the efforts of Charles Coulson and many other theorists, the molecular orbital approach not only dominated theoretical discussions but also started to be accepted by the chemical community as a whole and became the preferred description for benzene. Possible reasons were: its greater calculational convenience when applied to large molecules; better expository methods directed toward chemists; the spectacular success of the Woodward-Hoffmann rules for pericyclic reactions and Fukui's frontier orbital theory; and the development of a general theory of aromaticity, which predicted properties of similar molecules such as cyclobutadiene (C4H4). The relative importance of these reasons is explored through a mail survey of chemists.
Until the 1930s Germany had been the international leader in biochemistry, chemistry, and areas of biology. After WWII, however, molecular biology as a new interdisciplinary scientific enterprise was scarcely represented in Germany for almost 20 years. Three major reasons for the low performance of molecular biology are discussed: first, the forced emigration of Jewish scientists after 1933, which not only led to the expulsion of future distinguished molecular biologists, but also to a strong decline of "dynamic biochemistry", a field which contributed greatly to molecular biology. Second, German university structures that strongly impeded interdisciplinary research. Third, the international isolation and self-isolation of German scientists that was a major obstacle to the implementation of new fields of research developed elsewhere. Despite the fact that there was no official boycott against Germany as there had been after WWI and despite the Cold War policy of integrating Germans into the West, as a consequence of National Socialism and WWII for many years only very few German scientists gained access to the international community of molecular biologists. Max Delbruck played an important role in helping the Germans establish modern, mostly molecular, biology because he retained strong connections to Germany. Most importantly, it required a new generation of young scientists who had received part of their training in the US to establish modern molecular biology at German universities and Max Planck Institutes.
Biologists and historians often present natural history and molecular biology as distinct, perhaps conflicting, fields in biological research. Such accounts, although supported by abundant evidence, overlook important areas of overlap between these areas. Focusing upon examples drawn particularly from systematics and molecular evolution, I argue that naturalists and molecular biologists often share questions, methods, and forms of explanation. Acknowledging these interdisciplinary efforts provides a more balanced account of the development of biology during the post-World War II era.
When phylogenetic trees constructed from morphological and molecular evidence disagree (i.e. are incongruent) it has been suggested that the differences are spurious or that the molecular results should be preferred a priori. Comparing trees can increase confidence (congruence), or demonstrate that at least one tree is incorrect (incongruence). Statistical analyses of 181 molecular and 49 morphological trees show that incongruence is greater between than within the morphological and molecular partitions, and this difference is significant for the molecular partition. Because the level of incongruence between a pair of trees gives a minimum bound on how much error is present in the two trees, our results indicate that the level of error may be underestimated by congruence within partitions. Thus comparisons between morphological and molecular trees are particularly useful for detecting this incongruence (spurious or otherwise). Molecular trees have higher average congruence than morphological trees, but the difference is not significant, and both within- and between-partition incongruence is much lower than expected by chance alone. Our results suggest that both molecular and morphological trees are, in general, useful approximations of a common underlying phylogeny and thus, when molecules and morphology clash, molecular phylogenies should not be considered more reliable a priori.
Kincaid argues that molecular biology provides little support for the reductionist program, that biochemistry does not reveal common mechanisms, indeed that biochemical theory obstructs discovery. These assertions clash with biologists' stated advocacy of reductionist programs and their claims about the consequent unity of experimental biology. This striking disagreement goes beyond differences in meaning granted to the terms. More significant is Kincaid's misunderstanding of what biochemists do, for a closer look at scientific practice--and one of Kincaid's examples--reveals substantial progress toward explaining biological function with biochemical models. With the molecular detail emerge unifying generalizations as well as further aspects of the functional processes.
A general case about the insights and oversights of molecular genetics is argued for by considering two specific cases: the first concerns the bearing of molecular genetics on Mendelian genetics, and the second concerns the bearing of molecular genetics on the replicability of the genetic material. In the first case, it is argued that Mendel's law of segregation cannot be explained wholly in terms of molecular genetics--the law demands evolutionary scrutiny as well. In the second case, it is argued that an account of the replicability of the genetic material in terms of molecular genetics is not entirely independent of evolutionary considerations, in the sense that it raises further evolutionary questions. The limitations of the molecular-genetic approach in these cases point to the limitations of that approach in general.
Lately there has been a growing interest in evolutionary studies concerning how the regularities and patterns found in the living cell could have emerged spontaneously by way of self-assembly and self-organization. It is reasonable to postulate that the chemical compounds found in the primitive Earth would have mostly been very simple in nature, and would have been immersed in the natural dynamics of the physical world, some of which would have involved self-organization. It seems likely that some molecular processes self-organized spontaneously into a hierarchy of complex behaviours. Our conceptual search herein reaches back to the time when prebiotic phenomena began to take shape. This was before the origin of life, so in this paper we hope to shed new light on some of the theoretical issues that surround the ways in which cellular organization might have evolved without the aid of replicated information.
This article examines how a molecular "solution" to an important biological problem (how is antibody diversity generated?) was obtained in the 1970s. After the primarily biological clonal selection theory (CST) was accepted by 1967, immunologists developed several different contrasting theories to complete the CST. To choose among these theories, immunology had to turn to the new molecular biology, first to nucleic acid hybridization and then to recombinant DNA technology. The research programs of Tonegawa and Leder that led to the "solution" are discussed, and some of their strategies and heuristics are broadly characterized: (1) to what extent does the new recombinant DNA technology provide what the scientists claim is "direct evidence," what does that term mean, and what are the implications of that claim for biological "realism," and (2) is this episode one of reduction, partial reduction, or explanatory extension, and what do these terms mean in the context of a successful molecular "solution" to a biological problem.
An assessment is offered of the recent debate on information in the philosophy of biology, and an analysis is provided of the notion of information as applied in scientific practice in molecular genetics. In particular, this paper deals with the dependence of basic generalizations of molecular biology, above all the 'central dogma', on the so-called 'informational talk' (Maynard Smith [2000a]). It is argued that talk of information in the 'central dogma' can be reduced to causal claims. In that respect, the primary aim of the paper is to consider a solution to the major difficulty of the causal interpretation of genetic information: how to distinguish the privileged causal role assigned to nucleic acids, DNA in particular, in the processes of replication and protein production. A close reading is proposed of Francis H. C. Crick's On Protein Synthesis and related works, to which we owe the first explicit definition of information within the scientific practice of molecular biology.
In its first part, this paper seeks to make plausible (a) that molecular genetic diagnostics differs in ethically relevant ways from traditional types of medical diagnostics and (b) that the consequences of introducing this technology in broad screening programs to detect widespread genetic diseases in a population that is not at high risk may change our understanding of health and disease in a problematic way. In its second part, the paper discusses some aspects of public control of scientific and technological innovations in the field of molecular genetic diagnostics.
The Medical Research Council Laboratory of Molecular Biology (formerly the Medical Research Council Unit for the Study of Molecular Structure of Biological Systems) in Cambridge (England) played a key role in the postwar history of molecular biology. The paper, focussing on the early history of the institution, aims to show that the creation of the laboratory and the making of molecular biology were part of a new scientific culture set in place after World War II. In five interlinked parts it deals with the institutional creation of the MRC unit dedicated to the crystallographic analysis of biological molecules; the attraction of postwar biophysics, the heading under which the work of the unit initially fell; the people who joined the laboratory and their appropriation of new technologies, in particular the electronic computer for protein crystal structure determination; the cultural appeal of postwar crystallography, as exemplified in the use of crystal structure diagrams for a wide series of consumer goods at the Festival of Britain in 1951 and the display of molecular models at the Brussels World's Fair in 1958, a key site for the presentation of science and its role in the postwar world.
This case for discussion highlights some of the ethical difficulties that may arise in the use of molecular typing techniques in the control of infectious diseases. Molecular typing techniques offer evidence (stronger than regular epidemiological exploration of sources and contacts) for claims about infection routes. Such evidence will mean that public health authorities need to think about how to respond ethically to causal responsibility for contagion. In this context, questions are raised about the use of molecular typing methods for source and contact tracing in the control of infectious diseases.
The intellectual origins of molecular biology are usually traced back to the 1930s. By contrast, molecular biology acquired a social reality only around 1960. To understand how it came to designate a community of researchers and a professional identity, I examine the creation of the first institutes of molecular biology, which took place around 1960, in four European countries: Germany, the United Kingdom, France, and Switzerland. This paper shows how the creation of these institutes was linked to the results of post-war economic reconstruction. Then, it compares how the promoters of these different institutional projects delimited the goals of their discipline, reflected on its history, and suggested how research should be organised. I show how they carefully positioned their new discipline within the emerging national science policy discourse of the 1950s, and aligned it with the current vision of scientific modernity. In particular, I discuss how they articulated the meaning of molecular biology with respect to five common themes: the role of physics in the atomic age, the relations between fundamental research and medical applications, the 'Americanisation' of scientific research, the value of science in the reconstruction of national identities, and the drive towards interdisciplinary research. This paper thus demonstrates that beyond the local and national accounts there is a European history of molecular biology.
Molecular biologists use different kinds of reasoning strategies for different tasks, such as hypothesis formation, experimental design, and anomaly resolution. More specifically, the reasoning strategies discussed in this paper may be characterized as (1) abstraction-instantiation, in which an abstract skeletal model is instantiated to produce an experimental system; (2) the systematic scan, in which alternative hypotheses are systematically generated; and (3) modular anomaly resolution, in which components of a model are stated explicitly and methodically changed to generate alternatives for resolving an anomaly. This work grew out of close observation over a period of six months of an actively functioning molecular genetics laboratory.
The genome contains elements which are most easily understood as the products of selection operating at the level of the genome, without regard to phenotypic effect. The properties of such elements, and more general implications of molecular biological data, are discussed.
This paper combines naturalized metaphysics and a philosophical reflection on a recently evolving interdisciplinary branch of quantum chemistry, ab initio molecular dynamics. Bridging the gaps among chemistry, physics, and computer science, this cutting-edge research field explores the structure and dynamics of complex molecular many-body systems through computer simulations. These simulations are allegedly crafted solely by the laws of fundamental physics, and are explicitly designed to capture nature as closely as possible. The models and algorithms employed, however, involve many approximations and significant degrees of idealization of their target systems. For philosophers of science, therefore, the pivotal question arises of whether relying only on the fundamental laws of physics supports a reductionist or realist stance. One conceivable answer to this question is that the irreducible approximations and idealizations support rather anti-realist positions. After reviewing an influential attitude in the philosophy of computer simulations and the debate concerning scientific realism, I offer a fair interpretation of such ab initio modelling in quantum chemistry within a naturalistic metaphysical framework that gives rise to a specific type of ontic structural realism.
One important aspect of biological explanation is detailed causal modeling of particular phenomena in limited experimental background conditions. Recognising this allows one to appreciate that a sufficient condition for a reduction in biology is a molecular model of (1) only the demonstrated causal parameters of a biological model and (2) only within a replicable experimental background. These identities—which are ubiquitous in biology and form the basis of ruthless reductions (Bickle, Philosophy and neuroscience: a ruthlessly reductive account, 2003)—are criticised as merely “local” (Sullivan, Synthese 167:511–539, 2009) or “fragmentary” (Schaffner, Synthese, 151(3):377–402, 2006). However, in an instructive case, a biological model is preserved in molecular terms, demonstrating a complex phenomenon that has been successfully reduced.
This response to Rump and Woonink (2012) on ethical questions concerning the use of molecular typing techniques in the control of infectious diseases examines the use of typing in Canada and the legal framework that will govern its increasing use for source and contact tracing in provincial health systems. It examines whether current public health and privacy laws and constitutional protections provide the appropriate balance between public and individual interests in the control of infectious diseases.
Abir-Am has critiqued the standard view that the Rockefeller Foundation (RF) played a central role in the development of molecular biology during the 1960s. In her view, the RF accelerated the molecularization of the life sciences, but it did not directly contribute to building molecular biology’s disciplinary identity. Here I argue that Abir-Am’s critique has more consequences than she envisioned, and I show that the thesis of the centrality of the RF cannot be dismantled without also altering the view of molecular biology as a field oriented towards the solution of predefined problems.
From the mid-1960s onwards, a set of molecular biology research groups emerged in Spain. The factors contributing to this included: the return of a group of molecular biologists from their postdoctoral period abroad, the negotiations for the return of the Spanish-born Nobel prize winner Severo Ochoa from New York, the negotiations for Spanish membership in the European Conference of Molecular Biology, and national policy towards university reform. As a result, the early molecular biologists' research groups began to be recognised as research schools by the Spanish authorities, and postgraduate courses and new research centres for molecular biology were set up. Foreign influence in the whole process was crucial.
This article presents results from an empirical investigation of the role and importance of ethics in the daily work of Danish oncology physicians and Danish molecular biologists. The study is based on 12 semi-structured interviews with three groups of respondents: a group of oncology physicians working in a clinic at a public hospital and two groups of molecular biologists conducting basic research, one group employed at a public university and the other in a private biopharmaceutical company. We found that oncology physicians consider ethical evaluation as part of their daily work. They discuss in groups how to treat patients, and they hold interdisciplinary seminars. In contrast, molecular biologists employed at the university do not think that basic research causes significant ethical problems, they do not talk about ethics in their daily work, and they do not want to prioritise seminars on ethics. Molecular biologists employed in a private biopharmaceutical company also do not think that basic research causes significant ethical problems, but the private company prioritises ethical evaluation: if the company behaves unethically, it will ultimately be punished by consumers and investors. In general, oncology physicians working in the clinic experience a closer relationship between their daily work and ethical problems concerning human beings than molecular biologists conducting basic research.
This paper discusses the economic, health and social potential of nanotechnology. This involves the development of techniques for the manipulation of matter at the atomic and molecular levels. These new materials can perform multiple functions and be economically produced in large quantities, and hence have the potential for replacing existing technologies and materials in socially and economically disruptive ways. The paper raises some ethical, social and moral concerns arising from the capabilities of such materials, and discusses possible ways to regulate the growth of nanotechnology and allow society to reap the benefits.
This book is about the epistemologically different worlds (the hyperverse) in relation to the "I", the mind-body problem (Frith, Llinas), Bechtel's mechanisms, Clark's extended mind, Bickle's molecular and cellular cognition, Kauffman's life, quantum mechanics, gravity, and hyperspace versus hyperverse.
This book precis describes the motives behind my recent attempt to bring to bear “ruthlessly reductive” results from cellular and molecular neuroscience onto issues in the philosophy of mind. Since readers of this journal will probably be most interested in results addressing features of conscious experience, I highlight these most prominently. My main challenge is that philosophers (even scientifically-inspired ones) are missing the nature and scope of reductionism in contemporary neuroscience by focusing exclusively on higher-level cognitive neuroscience, and ignoring the discipline's cell-physiological and molecular-biological core.
This paper, which is based on recent empirical research at the University of Leeds, the University of Edinburgh, and the University of Bristol, presents two difficulties which arise when condensed matter physicists interact with molecular biologists: (1) the former use models which appear to be too coarse-grained, approximate and/or idealized to serve a useful scientific purpose to the latter; and (2) the latter have a rather narrower view of what counts as an experiment, particularly when it comes to computer simulations, than the former. It argues that these findings are related; that computer simulations are considered to be undeserving of experimental status, by molecular biologists, precisely because of the idealizations and approximations that they involve. The complexity of biological systems is a key factor. The paper concludes by critically examining whether the new research programme of ‘systems biology’ offers a genuine alternative to the modelling strategies used by physicists. It argues that it does not.
Current accounts of the relationship between classical genetics and molecular biology favor the ‘explanatory extension’ thesis, according to which molecular biology elucidates aspects of inheritance unexplained by classical genetics. I identify, however, an unresolved tension between the ‘explanatory extension’ account and examples of ‘explanatory interference’ (cases when the accommodation of data from molecular biology results in more precise genotyping and more adequate classical explanations). This paper provides a new way of analyzing the relationship between classical genetics and molecular biology capable of resolving this tension. The proposed solution makes use of the properties of mechanism schemas and sketches, which can be completed by elucidating some or all of their remaining ‘black boxes’ and instantiated via the filling-in of phenomenon-specific details. This result has implications for the reductionism-antireductionism debate since it shows that molecular elucidations have a positive impact on classical explanations without entailing the reduction of classical genetics to molecular biology.
Although molecular biology has meant different things at different times, the term is often associated with a tendency to view cellular causation as conforming to simple linear schemas in which macro-scale effects are specified by micro-scale structures. The early achievements of molecular biologists were important for the formation of such an outlook, one to which the discovery of recombinant DNA techniques, and a number of other findings, gave new life even after the complexity of genotype–phenotype relations had become apparent. Against this background we outline how a range of scientific developments and conceptual considerations can be regarded as enabling and perhaps necessitating contemporary systems approaches. We suggest that philosophical ideas have a valuable part to play in making sense of complex scientific and disciplinary issues.
This paper argues in defense of the anti-reductionist consensus in the philosophy of biology. More specifically, it takes issue with Alex Rosenberg's recent challenge to this position. We argue that the results of modern developmental genetics, rather than eliminating the need for functional kinds in explanations of development, actually reinforce their importance.
Recent philosophy of science has seen a number of attempts to understand scientific models by looking to theories of fiction. In previous work, I have offered an account of models that draws on Kendall Walton’s ‘make-believe’ theory of art. According to this account, models function as ‘props’ in games of make-believe, like children’s dolls or toy trucks. In this paper, I assess the make-believe view through an empirical study of molecular models. I suggest that the view gains support when we look at the way that these models are used and the attitude that users take towards them. Users’ interaction with molecular models suggests that they do imagine the models to be molecules, in much the same way that children imagine a doll to be a baby. Furthermore, I argue, users of molecular models imagine themselves viewing and manipulating molecules, just as children playing with a doll might imagine themselves looking at a baby or feeding it. Recognising this ‘participation’ in modelling, I suggest, points towards a new account of how models are used to learn about the world, and helps us to understand the value that scientists sometimes place on three-dimensional, physical models over other forms of representation.
Voles are attracting attention because genetic variation at a single locus appears to have a profound impact on a complex social behavior, namely monogamy. After briefly reviewing the state of the most relevant scientific literature, I examine the way that this research gets taken up by the popular media, by scientists, and by the notable philosopher of neuroscience Patricia Churchland and interpreted as having deeply revisionary implications for how we ordinarily understand ourselves as persons. We have all these big questions we would like to resolve about free will, consciousness, our understanding of persons, and the nature of morality, and there is a tendency to ask more of neuroscience than it can yet answer. I do not deny that advances in neuroscience may eventually bear on important philosophical issues. However, it is not at all clear that this research has many of the sweeping implications being claimed for it and, in communicating science responsibly to the public, there is reason to be cautious about suggesting that it does.
This paper provides an account of the experimental conditions required for establishing whether correlating or causally relevant factors are constitutive components of a mechanism connecting input (start) and output (finish) conditions. I argue that two-variable experiments, where both the initial conditions and a component postulated by the mechanism are simultaneously manipulated on an independent basis, are usually required in order to differentiate between correlating or causally relevant factors and constitutively relevant ones. Based on a typical research project in molecular biology, a flowchart model detailing typical stages in the formulation and testing of hypotheses about mechanistic components is also developed.
This paper approaches the scientific realism question from a naturalistic perspective. On the basis of a historical case study of the work of James Clerk Maxwell and Ludwig Boltzmann on the kinetic theory of gases, it shows that scientists’ views about the epistemological status of theories and models typically interact with their scientific results. Subsequently, the implications of this result for the current realism debate are analysed. The case study supports Giere’s moderately realist view of scientific models and theories, based on the notion of similarity, and it highlights the crucial role of model users. The paper concludes with a discussion of Boltzmann’s Bildtheorie, the sophisticated form of realism that he developed in response to the scientific problems of kinetic theory.