After the discovery of the structure of DNA in 1953, scientists working in molecular biology embraced reductionism—the theory that all complex systems can be understood in terms of their components. Reductionism, however, has been widely resisted by both nonmolecular biologists and scientists working outside the field of biology. Many of these antireductionists, nevertheless, embrace the notion of physicalism—the idea that all biological processes are physical in nature. How, Alexander Rosenberg asks, can these self-proclaimed physicalists also be antireductionists? With clarity and wit, Darwinian Reductionism navigates this difficult and seemingly intractable dualism with convincing analysis and timely evidence. In the spirit of the few distinguished biologists who accept reductionism—E. O. Wilson, Francis Crick, Jacques Monod, James Watson, and Richard Dawkins—Rosenberg provides a philosophically sophisticated defense of reductionism and applies it to molecular developmental biology and the theory of natural selection, ultimately proving that the physicalist must also be a reductionist.
Despite the transformation in biological practice and theory brought about by discoveries in molecular biology, until recently philosophy of biology continued to focus on evolutionary biology. When the Human Genome Project got underway in the late 1980s and early 1990s, philosophers of biology -- unlike historians and social scientists -- had little to add to the debate. In this landmark collection of essays, Sahotra Sarkar broadens the scope of current discussions of the philosophy of biology, viewing molecular biology as a unifying perspective on life that complements that of evolutionary biology. His focus is on molecular biology, but the overriding question behind these papers is what molecular biology contributes to all traditional areas of biological research. Molecular biology -- described with some foresight in a 1938 Rockefeller Foundation report as a branch of science in which "delicate modern techniques are being used to investigate ever more minute details" -- and its modeling strategies apparently argue in favor of physical reductionism. Sarkar's first three chapters explore reductionism -- defending it, but cautioning that reduction to molecular interactions is not necessarily a reduction to genetics. The next sections of the book discuss function, exploring how functional explanations pose a problem for reductionism; the informational interpretation of biology and how it interacts with reductionism; and the tension between the unifying framework of molecular biology and the received framework of evolutionary theory. The concluding chapter is an essay in the emerging field of developmental evolution, exploring what molecular biology may contribute to the transformation of evolutionary theory as evolutionary theory takes into account morphogenetic development.
Understanding how scientific activities use naming stories to achieve disciplinary status is important not only for insight into the past, but for evaluating current claims that new disciplines are emerging. In order to gain a historical understanding of how new disciplines develop in relation to these baptismal narratives, we compare two recently formed disciplines, systems biology and genomics, with two earlier related life sciences, genetics and molecular biology. These four disciplines span the twentieth century, a period in which the processes of disciplinary demarcation fundamentally changed from those characteristic of the nineteenth century. We outline how the establishment of each discipline relies upon an interplay of factors that include paradigmatic achievements, technological innovation, and social formations. Our focus, however, is the baptism stories that give the new discipline a founding narrative and articulate core problems, general approaches and constitutive methods. The highly plastic process of achieving disciplinary identity is further marked by the openness of disciplinary definition, tension between technological possibilities and the ways in which scientific issues are conceived and approached, synthesis of reductive and integrative strategies, and complex social interactions. The importance – albeit highly variable – of naming stories in these four cases indicates the scope for future studies that focus on failed disciplines or competing names. Further attention to disciplinary histories could, we suggest, give us richer insight into scientific development.
Biologists and historians often present natural history and molecular biology as distinct, perhaps conflicting, fields in biological research. Such accounts, although supported by abundant evidence, overlook important areas of overlap between these areas. Focusing upon examples drawn particularly from systematics and molecular evolution, I argue that naturalists and molecular biologists often share questions, methods, and forms of explanation. Acknowledging these interdisciplinary efforts provides a more balanced account of the development of biology during the post-World War II era.
Although it is now generally acknowledged that new biomedical technologies often produce new definitions and sometimes even new concepts of disease, this observation is rarely used in research that anticipates potential ethical issues in emerging technologies. This article argues that it is useful to start with an analysis of implied concepts of disease when anticipating ethical issues of biomedical technologies. It shows, moreover, that it is possible to do so at an early stage, i.e. when a technology is only just emerging. The specific case analysed here is that of ‘molecular medicine’. This group of emerging technologies combines a ‘cascade model’ of disease processes with a ‘personal pattern’ model of bodily functioning. Whereas the ethical implications of the first are partly familiar from earlier—albeit controversial—forms of preventive and predictive medicine, those of the second are quite novel and potentially far-reaching.
During the early 1960s, Morris Goodman used a variety of immunological tests to demonstrate the very close genetic relationships among humans, chimpanzees, and gorillas. Molecular anthropologists often point to this early research as a critical step in establishing their new specialty. Based on his molecular results, Goodman challenged the widely accepted taxonomic classification that separated humans from chimpanzees and gorillas in two separate families. His claim that chimpanzees and gorillas should join humans in family Hominidae sparked a well-known conflict with George Gaylord Simpson, Ernst Mayr, and other prominent evolutionary biologists. Less well known, but equally significant, were a series of disagreements between Goodman and other prominent molecular evolutionists concerning both methodological and theoretical issues. These included qualitative versus quantitative data, the role of natural selection, rates of evolution, and the reality of molecular clocks. These controversies continued throughout Goodman's career, even as he moved from immunological techniques to protein and DNA sequence analysis. This episode highlights the diversity of methods used by molecular evolutionists and the conflicting conclusions drawn from the data that these methods generated.
The convergence of biomedical sciences with nanotechnology as well as ICT has created a new wave of biomedical technologies, resulting in visions of a ‘molecular medicine’. Since novel technologies tend to shift concepts of disease and health, this paper investigates how the emerging field of molecular medicine may shift the meaning of ‘disease’ as well as the boundary between health and disease. It gives a brief overview of the development towards and the often very speculative visions of molecular medicine. Subsequently three views of disease often used in the philosophy of medicine are briefly discussed: the ontological or neo-ontological, the physiological and the normative/holistic concepts of disease. Against this background two tendencies in the field of molecular medicine are highlighted: (1) the use of a cascade model of disease and (2) the notion of disease as a deviation from an individual pattern of functioning. It becomes clear that molecular medicine pulls conceptualizations of disease and health in several, partly opposed directions. However, the resulting tensions may also offer opportunities to steer the future of medicine in more desirable directions.
A recent literature review of commentaries and ‘state of the art’ articles from researchers in psychiatric molecular genetics (PMG) offers a consensus about progress in the science of genetics, disappointments in the discovery of new and effective treatments, and a general optimism about the future of the field. I argue that optimism for the field of PMG is overwrought, and consider progress in the field in reference to a sample estimate of US National Institute of Mental Health funding for this paradigm for the years 2008 and 2009. I conclude that the amount of financial investment in PMG is questionable from an ethical perspective, given other research and clinical needs in the USA.
A political discourse of peace marked the distribution and use of radioisotopes in biomedical research and in medical diagnosis and therapy in the post-World War II period. This occurred during the era of expansion and strengthening of the United States' influence on the promotion of sciences and technologies in Europe as a collaborative effort, initially encouraged by the policies and budgetary distribution of the Marshall Plan. This article follows the importation of radioisotopes by two Spanish research groups, one in experimental endocrinology and one in molecular biology. For both groups foreign funds were instrumental in the early establishment of their laboratories. The combination of funding and access to previously scarce radioisotopes helped position these groups at the forefront of research in Spain.
In the context of 1960s research on biological membranes, scientists stumbled upon a curiously coloured material substance, which came to be called the “purple membrane.” Interactions with the material as well as chemical analyses led to the conclusion that the microbial membrane contained a photoactive molecule similar to rhodopsin, the light receptor of animals’ retinae. Until 1975, the find led to the formation of novel objects in science, and subsequently to the development of a field in the molecular life sciences that comprised biophysics, bioenergetics as well as membrane and structural biology. Furthermore, the purple membrane and bacteriorhodopsin, as the photoactive membrane transport protein was baptized, inspired attempts at hybrid bio-optical engineering throughout the 1980s. A central motif of the research field was the identification of a functional biological structure, such as a membrane, with a reactive material substance that could be easily prepared and manipulated. Building on this premise, early purple membrane research will be taken as a case in point to understand the appearance and transformation of objects in science through work with material substances. Here, the role played by a perceptible material and its spontaneous change of colour, or reactivity, casts a different light on objects and experimental practices in the late twentieth-century molecular life sciences. With respect to the impact of chemical working and thinking, the purple membrane and rhodopsins represent an influential domain straddling the life and chemical sciences as well as bio- and material technologies, which has received little historical and philosophical attention. Re-drawing the boundary between the living and the non-enlivened, this research explains and models organismic activity through the reactivity of macromolecular structures, and thus palpable material substances.
By comparing chemistry to art, chemists have recently made claims to the aesthetic value, even beauty, of some of their products. This paper takes these claims seriously and turns them into a systematic investigation of the aesthetics of chemical products. I distinguish three types of chemical products - materials, molecules, and molecular models - and use a wide variety of aesthetic theories suitable for an investigation of the corresponding sorts of objects. These include aesthetics of materials, idealistic aesthetics from Plato to Kant and Schopenhauer, psychological approaches of Ernst Gombrich and Rudolf Arnheim, and semiotic aesthetics of Nelson Goodman and Umberto Eco. Although the investigation does not support recent claims, I point out where aesthetics does and can play an important role in chemistry. Particularly, Eco's approach helps us understand that, and how, aesthetic experience can be a driving force in chemical research.
Vacuum radiation causes a particle to make a random walk about its dynamical trajectory. In this random walk the root mean square change in spatial coordinate is proportional to t^(1/2), and the fractional changes in momentum and energy are proportional to t^(−1/2), where t is time. Thus the exchange of energy and momentum between a particle and the vacuum tends to zero over time. At the end of a mean free path the fractional change in momentum of a particle in a gas is very small. However, at the end of the mean free path each particle undergoes an interaction that magnifies the preceding change, and the net result is that the momentum distribution of the particles in a gas is randomized in a few collision times. In this way the random action of vacuum radiation and its subsequent magnification by molecular interaction produces entropy increase. This process justifies the assumption of molecular chaos used in the Boltzmann transport equation.
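The t^(1/2) scaling invoked above is the generic signature of an unbiased random walk, and can be checked with a minimal simulation. This is a generic sketch, not the paper's model of vacuum radiation; the walker count, step length, and seed are arbitrary choices.

```python
import random
import math

def rms_displacement(n_walkers, n_steps, seed=0):
    """Root-mean-square displacement of unbiased 1-D random walkers."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_walkers):
        x = 0
        for _ in range(n_steps):
            x += rng.choice((-1, 1))  # unbiased unit step
        total += x * x
    return math.sqrt(total / n_walkers)

# If rms grows like t^(1/2), quadrupling the number of steps
# should roughly double the rms displacement.
r1 = rms_displacement(2000, 100)
r4 = rms_displacement(2000, 400)
print(r4 / r1)  # statistically close to 2
```

The same sqrt-of-time growth holds in any dimension; only the proportionality constant changes.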
In the 1960s molecular population geneticists used Monte Carlo experiments to evaluate particular diffusion equation models. In this paper I examine the nature of this comparative evaluation and argue for three claims: first, Monte Carlo experiments are genuine experiments; second, Monte Carlo experiments can provide an important means for evaluating the adequacy of highly idealized theoretical models; and, third, the evaluation of the computational adequacy of a diffusion model with Monte Carlo experiments is significantly different from the evaluation of the empirical adequacy of the same diffusion model.
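As a generic illustration of the kind of comparison at issue (not a reconstruction of the 1960s studies themselves), a Monte Carlo simulation of Wright-Fisher genetic drift can be checked against the classical prediction that expected heterozygosity decays by a factor of (1 − 1/2N) per generation. All parameter values below are arbitrary.

```python
import random

def wf_heterozygosity(n_pop, p0, generations, replicates, seed=1):
    """Monte Carlo estimate of expected heterozygosity 2p(1-p)
    after drift in a Wright-Fisher population of size n_pop."""
    rng = random.Random(seed)
    het = 0.0
    for _ in range(replicates):
        p = p0
        for _ in range(generations):
            # binomial sampling of 2N gene copies each generation
            k = sum(rng.random() < p for _ in range(2 * n_pop))
            p = k / (2 * n_pop)
        het += 2 * p * (1 - p)
    return het / replicates

N, t = 50, 30
mc = wf_heterozygosity(N, 0.5, t, 500)
theory = 0.5 * (1 - 1 / (2 * N)) ** t  # classical decay of heterozygosity
print(mc, theory)  # the two values agree within sampling error
```

Agreement here speaks only to the computational adequacy of the theoretical model, in the paper's terms; empirical adequacy is a separate question about real populations.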
The importance of viruses as model organisms is well-established in molecular biology and Max Delbrück's phage group set standards in the DNA phage field. In this paper, I argue that RNA phages, discovered in the 1960s, were also instrumental in the making of molecular biology. As part of experimental systems, RNA phages stood for messenger RNA (mRNA), genes and genome. RNA was thought to mediate information transfers between DNA and proteins. Furthermore, RNA was more manageable at the bench than DNA due to the availability of specific RNases, enzymes used as chemical tools to analyse RNA. Finally, RNA phages provided scientists with a pure source of mRNA to investigate the genetic code, genes and even a genome sequence. This paper focuses on Walter Fiers’ laboratory at Ghent University (Belgium) and their work on the RNA phage MS2. When setting up his Laboratory of Molecular Biology, Fiers planned a comprehensive study of the virus with a strong emphasis on the issue of structure. In his lab, RNA sequencing, now a little-known technique, evolved gradually from a means to solve the genetic code, to a tool for completing the first genome sequence. Thus, I follow the research pathway of Fiers and his ‘RNA phage lab’ with their evolving experimental system from 1960 to the late 1970s. This study illuminates two decisive shifts in post-war biology: the emergence of molecular biology as a discipline in the 1960s in Europe and of genomics in the 1990s.
Senior molecular geneticists were interviewed about their perceptions of the ethical and social implications of genetic knowledge. Inductive analysis of these interviews identified a number of strategies through which the scientists negotiated their moral responsibilities as they participated in generating knowledge that presents difficult ethical questions. These strategies included: further analysis and application of scientific method; clarification of multiple roles; negotiation with the public through public debate, institutional processes of funding, ethics committees and legislation; and personal responsibility.
In 1937, a group of researchers in Nazi Germany began investigating tobacco mosaic virus with the hope of using the virus as a model system for understanding gene behavior in higher organisms. They soon developed a creative and interdisciplinary work style and were able to continue their research in the postwar era, when they made significant contributions to the history of molecular biology. This group is significant for two major reasons. First, it provides an example of how researchers were able to produce excellent scientific research in the midst of dictatorship and war. Coupled with the group's ongoing success in postwar Germany, the German TMV investigators provide a dramatic example of how scientific communities deal with adversity as well as rapid political and social change. Second, since the researchers focused heavily on TMV, their story allows us to analyze how an experimental system other than phage contributed to the emergence of molecular biology.
The concept of molecular structure is fundamental to the practice and understanding of chemistry, but the meaning of this term has evolved and is still evolving. The Born–Oppenheimer separation of electronic and nuclear motions lies at the heart of most modern quantum chemical models of molecular structure. While this separation introduces a great computational and practical simplification, it is neither essential to the conceptual formulation of molecular structure nor universally valid. Going beyond the Born–Oppenheimer approximation introduces new paradigms, bringing fresh insight into the chemistry of fluxional molecules, proteins, superconductors and macroscopic dielectrics, thus opening up new avenues for exploration. But it requires that our ideas of molecular structure evolve beyond simple ball-and-stick models.
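The separation referred to above is conventionally written as a factorization of the molecular wavefunction (a textbook sketch, not the paper's own notation):

```latex
% Born--Oppenheimer factorization: the electronic problem is solved at
% fixed nuclear geometry R, and the nuclei then move on the resulting
% potential energy surface E_e(R).
\Psi(\mathbf{r}, \mathbf{R}) \approx \psi_e(\mathbf{r}; \mathbf{R})\,\chi(\mathbf{R}),
\qquad
\hat{H}_e\,\psi_e(\mathbf{r}; \mathbf{R}) = E_e(\mathbf{R})\,\psi_e(\mathbf{r}; \mathbf{R}).
```

It is precisely this factorization, and the picture of nuclei located on a single potential energy surface, that breaks down for the fluxional and strongly coupled systems the abstract mentions.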
Preparative and analytical methods developed by separation scientists have played an important role in the history of molecular biology. One such early method is gel electrophoresis, a technique that uses various types of gel as its supporting medium to separate charged molecules based on size and other properties. Historians of science, however, have only recently begun to pay closer attention to this material epistemological dimension of biomolecular science. This paper substantiates the historiographical thread that explores the relationship between modern laboratory practice and the production of scientific knowledge. It traces the historical development of gel electrophoresis from the mid-1940s to the mid-1960s, with careful attention to the interplay between technical developments and disciplinary shifts, especially the rise of molecular biology in this time-frame. Claiming that the early 1950s marked a decisive shift in the evolution of electrophoretic methods from moving boundary to zone electrophoresis, I reconstruct various trajectories in which scientists such as Oliver Smithies sought out the most desirable solid supporting medium for electrophoretic instrumentation. Biomolecular knowledge, I argue, emerged in part from this process of seeking the most appropriate supporting medium that allowed for discrete molecular separation and visualization. The early 1950s, therefore, marked not only an important turning point in the history of separation science, but also a transformative moment in the history of the life sciences as the growth of molecular biology depended in part on the epistemological access to the molecular realm available through these evolving technologies.
In the advertising discourse of human genetic database projects, of genetic ancestry tracing companies, and in popular books on anthropological genetics, what I refer to as the anthropological gene and genome appear as documents of human history, by far surpassing the written record and oral history in scope and accuracy as archives of our past. How did macromolecules become "documents of human evolutionary history"? Historically, molecular anthropology, a term introduced by Emile Zuckerkandl in 1962 to characterize the study of primate phylogeny and human evolution on the molecular level, asserted its claim to the privilege of interpretation regarding hominoid, hominid, and human phylogeny and evolution vis-à-vis other historical sciences such as evolutionary biology, physical anthropology, and paleoanthropology. This process will be discussed on the basis of three key conferences on primate classification and evolution that brought together exponents of the respective fields and that were held at approximately ten-year intervals between the early 1960s and the 1980s. I show how the anthropological gene and genome gained their status as the most fundamental, clean, and direct records of historical information, and how the prioritizing of these epistemic objects was part of a complex involving the objectivity of numbers, logic, and mathematics, the objectivity of machines and instruments, and the objectivity seen to reside in the epistemic objects themselves.
In this article, two issues regarding mechanisms are discussed. The first concerns the relationships between “mechanism description” and “mechanism explanation.” It is proposed that it is rather plausible to think of them as two distinct epistemic acts. The second deals with the different molecular biology explanatory contexts, and it is shown that some of them require physics and its laws.
In this paper, I will reread the history of molecular genetics from a psychoanalytical angle, analysing it as a case history. Building on the developmental theories of Freud and his followers, I will distinguish four stages, namely: (1) oedipal childhood, notably the epoch of model building (1943–1953); (2) the latency period, with a focus on the development of basic skills (1953–1989); (3) adolescence, exemplified by the Human Genome Project, with its fierce conflicts, great expectations and grandiose claims (1989–2003) and (4) adulthood (2003–present) during which revolutionary research areas such as molecular biology and genomics have achieved a certain level of normalcy—have evolved into a normal science. I will indicate how a psychoanalytical assessment conducted in this manner may help us to interpret and address some of the key normative issues that have been raised with regard to molecular genetics over the years, such as ‘relevance’, ‘responsible innovation’ and ‘promise management’.
Protistology, and evolutionary protistology in particular, is experiencing a golden research era. It is an extended one that can be dated back to the 1970s, which is when the molecular rebirth of microbial phylogeny began in earnest. John Archibald, a professor of evolutionary microbiology at Dalhousie University, focuses on the beautiful story of endosymbiosis in his book One Plus One Equals One: Symbiosis and the Origin of Complex Life. However, this historical narrative could be treated as synecdochal of how the molecular revolution has changed evolutionary biology forever, and that is how Archibald has structured his book. I will address the encompassing theme of molecular methods in detail, but also pay careful attention to the endosymbiosis thread in its own right.
Biologists employ a suggestive metaphor to describe the complexities of molecular interactions within cells and embryos: cytological components are said to be part of “ecosystems” that integrate them in a complex network of relations with many other entities. The aim of this essay is to scrutinize the molecular ecosystem, a metaphor that, despite its longstanding history, has seldom been articulated in detail. I begin by analyzing some relevant analogies between the cellular environment and the biosphere. Next, I discuss the applicability of the molecular ecosystem concept in actual scientific practice.
Scientific anomalies are observations and facts that contradict current scientific theories, and they are instrumental in scientific theory change. Philosophers of science have approached scientific theory change from different perspectives, as Darden (Theory change in science: Strategies from Mendelian genetics, 1991) observes: Lakatos (In: Lakatos, Musgrave (eds) Criticism and the growth of knowledge, 1970) approaches it as progressive “research programmes” consisting of incremental improvements (“monster barring” in Lakatos, Proofs and refutations: The logic of mathematical discovery, 1976), Kuhn (The structure of scientific revolutions, 1996) observes that changes in “paradigms” are instigated by a crisis from some anomaly, and Hanson (In: Feigl, Maxwell (eds) Current issues in the philosophy of science, 1961) proposes that discovery does not begin with hypothesis but with some “problematic phenomena requiring explanation”. Even though anomalies are important in all of these approaches to scientific theory change, there have been only a few investigations into the specific role anomalies play in scientific theory change. Furthermore, many of these approaches focus on the theories themselves and not on how the scientists and their experiments bring about scientific change (Gooding, Experiment and the making of meaning: Human agency in scientific observation and experiment, 1990). To address these issues, this paper approaches scientific anomaly resolution from a meaning construction point of view. Conceptual integration theory (Fauconnier and Turner, Cogn Sci 22:133–187, 1996; The way we think: Conceptual blending and mind’s hidden complexities, 2002) from cognitive linguistics describes how one constructs meaning from various stimuli, such as text and diagrams, through conceptual integration or blending.
The conceptual integration networks that describe the conceptual integration process characterize cognition that occurs unconsciously during meaning construction. These same networks are used to describe some of the cognition involved in resolving an anomaly in molecular genetics called RNA interference (RNAi) in a case study. The RNAi case study is a cognitive-historical reconstruction (Nersessian, In: Giere (ed) Cognitive models of science, 1992) that reconstructs how the RNAi anomaly was resolved. This reconstruction traces four relevant molecular genetics publications in describing the cognition necessary in accounting for how RNAi was resolved through strategies (Darden 1991), abductive reasoning (Peirce, In: Hartshorne, Weiss (eds) Collected papers, 1958), and experimental reasoning (Gooding 1990). The results of the case study show that experiments play a crucial role in formulating an explanation of the RNAi anomaly and the integration networks describe the experiments’ role. Furthermore, these results suggest that RNAi anomaly resolution is embodied. It is embodied in the sense that the cognition described in the cognitive-historical reconstruction is experientially based.
In the centennial of Ettore Majorana’s birth (1906–1938?), we re-examine some aspects of his fundamental scientific production in atomic and molecular physics, including a little-known short communication. There, Majorana critically discusses Fermi’s solution of the celebrated Thomas–Fermi equation for electron screening in atoms and positive ions. We argue that some of Majorana’s seminal contributions in molecular physics already prefigure the idea of exchange interactions (or Heisenberg–Majorana forces) in his later works on theoretical nuclear physics. In all his papers, he tended to emphasize the symmetries at the basis of a physical problem, as well as the limitations, rather than the advantages, of the approximations of the method employed.
We apply molecular code theory to a rule-based model of the human inner kinetochore and study how complex formation in general can give rise to molecular codes. We analyze 105 reaction networks generated from the rule-based inner kinetochore model in two variants: with and without dissociation of complexes. Interestingly, we found codes only when some but not all complexes are allowed to dissociate. We show that this is due to the fact that in the kinetochore model proteins can only bind at kinetochores by attaching to already attached proteins and cannot form complexes in free solution. Using a generalized linear mixed model we study which centromere protein can take which role in a molecular code. By this, associations between CENPs and code roles are found. We observed that CenpA is a major risk factor while CenpQ is a major protection factor. Finally, we show, using an abstract model of copolymer formation, that molecular codes can also be realized solely by the formation of stable complexes, which do not dissociate. For example, with particular dimers as context, a molecular code mapping from two different monomers to two particular trimers can be realized just by non-selective complex formation. We conclude that the formation of protein complexes can be utilized by the cell to implement molecular codes. Living cells thus facilitate a subsystem allowing for an enormous flexibility in the realization of mappings, which can be used for specific regulatory processes, e.g. via the context of a mapping.
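The copolymer example above — two monomers mapped to two trimers via a dimer acting as context — can be sketched as a toy mapping. All molecule names here are hypothetical placeholders, not species from the kinetochore model.

```python
# Toy sketch of a molecular code realized by non-selective complex
# formation: a fixed context dimer attaches either monomer, and the
# resulting trimer encodes which monomer was bound.
def bind(context_dimer, monomer):
    """Non-selective complex formation: the dimer simply attaches
    the monomer, yielding a trimer."""
    return context_dimer + (monomer,)

context = ("X", "Y")  # hypothetical dimer serving as the code's context
code = {m: bind(context, m) for m in ("A", "B")}
print(code)  # maps each monomer to a distinct trimer
```

Because binding is non-selective, the mapping is fixed entirely by the choice of context; a different context dimer would realize a different code over the same monomers, which is the flexibility the abstract emphasizes.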
The present paper analyzes the use and understanding of the homology concept across different biological disciplines. It is argued that in its history, the homology concept underwent a sort of adaptive radiation. Once it migrated from comparative anatomy into new biological fields, the homology concept changed in accordance with the theoretical aims and interests of these disciplines. The paper gives a case study of the theoretical role that homology plays in comparative and evolutionary biology, in molecular biology, and in evolutionary developmental biology. It is shown that the concept or variant of homology preferred by a particular biological field is used to bring about items of biological knowledge that are characteristic for this field. A particular branch of biology uses its homology concept to pursue its specific theoretical goals.
This paper argues in defense of the anti-reductionist consensus in the philosophy of biology. More specifically, it takes issue with Alex Rosenberg's recent challenge of this position. We argue that the results of modern developmental genetics, rather than eliminating the need for functional kinds in explanations of development, actually reinforce their importance.
We claim that in contemporary studies in molecular biology and biomedicine, the nature of ‘manipulation’ and ‘intervention’ has changed. Traditionally, molecular biology and molecular studies in medicine are considered experimental sciences, in which experiments take the form of material manipulation and intervention. By contrast, “big science” projects in biology focus on the practice of data mining of biological databases. We argue that the practice of data mining is a form of intervention although it does not require material manipulation. We also suggest that material manipulation, although still present in the practice of data mining, fulfills a different epistemic role.
We discuss foundational issues of quantum information biology (QIB), one of the most successful applications of the quantum formalism outside of physics. QIB provides a multi-scale model of information processing in bio-systems: from proteins and cells to cognitive and social systems. This theory must be sharply distinguished from “traditional quantum biophysics”. The latter concerns quantum bio-physical processes, e.g., in cells or brains; QIB models the dynamics of information states of bio-systems. We argue that the information interpretation of quantum mechanics is the most natural interpretation of QIB. Biologically, QIB is based on two principles: adaptivity and openness. These principles are mathematically represented in the framework of a novel formalism, quantum adaptive dynamics, which, in particular, contains the standard theory of open quantum systems.
As opposed to the dismissive attitude toward reductionism that is popular in current philosophy of mind, a “ruthless reductionism” is alive and thriving in “molecular and cellular cognition”—a field of research within cellular and molecular neuroscience, the current mainstream of the discipline. Basic experimental practices and emerging results from this field imply that two common assertions by philosophers and cognitive scientists are false: (1) that we do not know much about how the brain works, and (2) that lower-level neuroscience cannot explain cognition and complex behavior directly. These experimental practices involve intervening directly with molecular components of sub-cellular and gene expression pathways in neurons and then measuring specific behaviors. These behaviors are tracked using tests that are widely accepted by experimental psychologists to study the psychological phenomenon at issue (e.g., memory, attention, and perception). Here I illustrate these practices and their importance for explanation and reduction in current mainstream neuroscience by describing recent work on social recognition memory in mammals.
This paper investigates what molecular biology has done for our understanding of the gene. I base a new account of the gene concept of classical genetics on the classical dogma that gene differences cause phenotypic differences. Although contemporary biologists often think of genes in terms of this concept, molecular biology provides a second way to understand genes. I clarify this second way by articulating a molecular gene concept. This concept unifies our understanding of the molecular basis of a wide variety of phenomena, including the phenomena that classical genetics explains in terms of gene differences causing phenotypic differences.
Philosophers have proposed various kinds of relations between Mendelian genetics and molecular biology: reduction, replacement, explanatory extension. This paper argues that the two fields are best characterized as investigating different, serially integrated, hereditary mechanisms. The mechanisms operate at different times and contain different working entities. The working entities of the mechanisms of Mendelian heredity are chromosomes, whose movements serve to segregate alleles and independently assort genes in different linkage groups. The working entities of numerous mechanisms of molecular biology are larger and smaller segments of DNA plus related molecules. Discovery of molecular DNA mechanisms filled black boxes that were noted, but unilluminated, by Mendelian genetics.
The paper argues against the central dogma and its interpretation by C. Kenneth Waters and Alex Rosenberg. I argue that certain phenomena in the regulation of gene expression provide a break with the central dogma, according to which sequence specificity for a gene product must be template-derived. My thesis of 'molecular epigenesis', with its three classes of phenomena, sequence 'activation', 'selection', and 'creation', is exemplified by processes such as transcriptional activation, alternative cis- and trans-splicing, and RNA editing. I argue that other molecular resources share the causal role of genes; the sequence specificity for the linear sequence of any gene product is distributed between the coding sequence, cis-acting sequences, trans-acting factors, environmental signals, and the contingent history of the cell (thesis of distributed causal specificity). I conclude that the central dogma has unnecessarily restricted genetic research to the sequencing of protein-coding genes, unilinear pathway analyses, and the focus on exclusive specificity.
Taking reduction in the traditional deductive sense, the programmatic claim that most of genetics can be reduced by molecular genetics is defended as feasible and significant. Arguments by Ruse and Hull that the relationship is either replacement or, at best, a weaker form of reduction are shown to rest on a mixture of historical and logical confusions about the nature of the theories involved.