There is at present uneasiness about the conceptual basis of genetics. The gene concept has become blurred and there are problems with the distinction between genotype and phenotype. In the present paper I go back to the role of these concepts in the creation of modern genetics in the early twentieth century. The terms were introduced by the Danish botanist and geneticist Wilhelm Johannsen in his major textbook of 1909. Historical accounts usually concentrate on this book and his 1911 paper “The Genotype Conception of Heredity.” His bean selection experiment of 1900–1903 is generally assumed to be the source of his genotype theory. The present paper examines the scientific context and meaning of this experiment, how it was received, and how the genotype theory became securely established by the early 1910s. I argue in conclusion that the genotype/phenotype distinction, which provides the empirical basis for Johannsen's gene, was scientifically well founded when introduced and still is. Keith Baverstock's criticism does not consider the force of the bean selection experiment at the time, nor its role as a paradigm for subsequent investigations of heredity.
Recent trends in psychiatry involve a transition from categorical to dimensional frameworks, in which the boundary between health and pathology is understood as a difference in degree rather than as a difference in kind. A major tenet of dimensional approaches is that no qualitative distinction can be made between health and pathology. As a consequence, these approaches tend to characterize such a threshold as pragmatic or conventional in nature. However, dimensional approaches to psychopathology raise several epistemological and ontological issues. First, we review major sources of evidence usually recruited in support of the dimensional trend (focusing on clinical observation and biological data), and we show that these are connected to different conceptualizations of how dimensional traits extend across health and pathology. Second, we criticize two unquestioned assumptions that stand at the core of the dimensional trend: a) that there is continuity from health to pathology at the symptomatic level; b) that such continuity reflects an underlying continuity in the genetic liability for pathological conditions. Third, we argue against the idea of a conventional threshold by showing that such a view implies a linear relationship between the genotype and the phenotype. Fourth, drawing on epigenetics and developmental biology, we offer a characterization of mental disorders as stable and dynamic constellations of multi-level variables that differ qualitatively from ‘healthy states’. We conclude by showing that our account has several theoretical advantages over both categorical and dimensional approaches. Notably, it provides crucial insights into psychological development over time and individual differences, with major implications in terms of intervention and clinical decision-making.
Under the assumption that anticipatory models are required for anticipatory behavior, an important question arises about the different manners in which organisms acquire anticipatory models. This article aims to articulate four non-exhaustive ways in which anticipatory models might be acquired over both phylogenetic and ontogenetic timescales, and to explore the relationships among them. To articulate these different model-acquisition mechanisms, four schematics will be introduced, each of which represents a particular acquisition structure that can be used for the purposes of comparison, analysis, and hypothesis formulation. By bringing to the fore the differences and similarities between each of the four ways that anticipatory models are acquired, a more complete picture of both anticipatory behavior and its pervasive role in biological self-maintenance can be offered. In doing so, this article helps not only to shed light on how anticipatory behavior might arise in the wide range of organisms in which it has been observed but also to throw into relief the subtle and often still overlooked causal interplay between ontogenetic and phylogenetic plasticity.
The “Encyclopedia of DNA Elements” (ENCODE) project was launched by the US National Human Genome Research Institute in the aftermath of the Human Genome Project (HGP). It aimed to systematically map the human transcriptome, and held the promise that identifying potential regulatory regions and transcription factor binding sites would help address some of the perplexing results of the HGP. Its initial results published in 2012 produced a flurry of high-impact publications as well as criticisms. Here we put the results of ENCODE and the work on epigenomics that followed in a broad theoretical and historical context, focusing on three strands of research. The first is the history of thinking about the organization of genomes, both physical and regulatory. The second is the history of ideas about gene regulation, primarily in eukaryotes. Finally, and connecting these two issues, we suggest how to think about the role of genetic material in physiology and development.
We all have our own ideas about what it is like to be intelligent. Indeed, even the experts disagree on this topic. This has generated diverse theories on the nature of intelligence and its genetic and environmental bases. Many scientific and philosophical questions thus remain unaddressed: is it possible to characterize intelligence in scientific terms? What do IQ tests measure? How is intelligence influenced by genetics, epigenetics, and the environment? What are the ethical and social implications of the research on this topic? Beyond analyzing these problems, this book aims to provide readers with the conceptual resources to critically analyze scientific findings and orient themselves in this multi-faceted and fascinating debate across biology, psychology, neuroscience, philosophy, and anthropology.
Causation has multiple distinct meanings in genetics. One reason for this is meaning slippage between two concepts of the gene: Mendelian and molecular. Another reason is that a variety of genetic methods address different kinds of causal relationships. Some genetic studies address causes of traits in individuals, which can only be assessed when single genes follow predictable inheritance patterns that reliably cause a trait. A second sense concerns the causes of trait differences within a population. Whereas some single genes can be said to cause population-level differences, most often these claims concern the effects of many genes. Polygenic traits can be understood using heritability estimates, which estimate the relative contributions of genetic and environmental differences to trait differences within a population. Attempts to understand the molecular mechanisms underlying polygenic traits have been developed, although causal inference based on these results remains controversial. Genetic variation has also recently been leveraged as a randomizing factor to identify environmental causes of trait differences. This technique—Mendelian randomization—offers some solutions to traditional epidemiological challenges, although it is limited to the study of environments with known genetic influences.
This essay explores the relation between nature and culture and analyses it from the perspective of contemporary evolutionary theory. Both animals and humans are conceived of as possessing both natural and cultural features that interact with each other on a number of levels of varying complexity: nature as cultural, nature as influenced by culture, culture as natural, and culture as influenced by nature. “Nature as cultural” is meant to express a decoupling of behavioral/phenotypic changes of an organism from its genetic determination. “Nature as influenced by culture” is the idea of niche construction, wherein such decoupled changes can feed back causally to genetic reality, thereby influencing the evolutionary features of downstream species. “Culture as natural” portrays how cultural structures of humans and animals persist through the generations, accumulate incurred changes, and evolve in ways analogous to biological natural selection. “Culture as influenced by nature” is the notion that the cultural/linguistic capacities of animals and humans have evolutionarily emerged from precultural history. All this is meant to evaluate the viability of constructing a nature/culture divide. The conclusion is made that the divide seems arbitrary within and between human and animal life when considering how the differences between the natural and cultural dynamics of humans and animals are modelled as differences of degree, not kind. A potential approach in using the concept of consciousness to recontextualize a nature/culture divide in terms of the possession of consciousness is proposed at the end.
This paper challenges the common assumption that some phenotypic traits are quantitative while others are qualitative. The distinction between these two kinds of traits is widely influential in biological and biomedical research as well as in scientific education and communication. This is probably due to both historical and epistemological reasons. However, the quantitative/qualitative distinction involves a variety of simplifications concerning the genetic causes of phenotypic variability and the development of complex traits. Here, I examine three cases from the life sciences that show inconsistencies in the distinction: Mendelian traits, Mendelian diseases, and polygenic mental disorders. I show that these traits can be framed both quantitatively and qualitatively depending, for instance, on the methods through which they are investigated and on specific epistemic purposes. This suggests that the received view of quantitative and qualitative traits has a limited heuristic power—limited to some local contexts or to the specific methodologies adopted. Throughout the paper, I provide directions for framing phenotypes beyond the quantitative/qualitative distinction. I conclude by pointing to the necessity of developing a principled characterisation of what phenotypic traits, in general, are.
Roughly, the Central Dogma of molecular biology states that DNA codes for protein, not the other way around. This principle, which is still heralded as an important element of contemporary biological theory, has received much critical attention since its original formulation by Francis Crick in 1958. Some have argued that the principle should be rejected, on the grounds that it fails to fully capture the ins-and-outs of protein synthesis, while others have argued that the Dogma is predicated on notions of information that are simply implausible. Yet, despite all this criticism, there is much about the Dogma that has not been said. Existing discussions, for example, gloss over the many distinct, logically independent readings of the Central Dogma that have been defended in the philosophical and biological literature, making it difficult to see which dogma is being criticized. Additionally, this oversight makes it unclear what the overall upshot of these discussions should be taken to be. My aim in this paper is to fix this.
Noting the minimal philosophical attention paid to the shifting meanings of “genotype” and “phenotype,” and of their distinction, as well as to the variety of meanings that have co-existed over the last hundred years, this note invites readers to join in exploring the implications of shifts that have so far been left unexamined.
According to the proponents of Developmental Systems Theory and the Causal Parity Thesis, the privileging of the genome as “first among equals” with respect to the development of phenotypic traits is more a reflection of our own heuristic prejudice than of ontology - the underlying causal structures responsible for that specified development no more single out the genome as primary than they do other broadly “environmental” factors. Parting with the methodology of the popular responses to the Thesis, this paper offers a novel criterion for ‘causal primacy’, one that is grounded in the ontology of the unique causal role of dispositional properties. This paper argues that, if the genome is conceptualised as realising dispositional properties that are “directed toward” phenotypic traits, the parity of ‘causal roles’ between genetic and extra-genetic factors is no longer apparent, and further, that the causal primacy of the genome is both plausible and defensible.
Mehlman and Li offer a framework for approaching the bioethical issues raised by the military use of genomics that is compellingly grounded in both the contemporary civilian and military ethics of medical research, arguing that military commanders must be bound by the two principles of paternalism and proportionality. I agree fully. But I argue here that this is a much higher bar than we may fully realize. Just as the principle of proportionality relies upon a thorough assessment of harms caused and military advantage gained, the use of genomic research, on Mehlman and Li’s view, will require an accurate understanding of the connection between genotypes and phenotypes – accurate enough to ameliorate the risk undertaken by our armed forces in being subject to such research. Recent conceptual work in evolutionary theory and the philosophy of biology, however, renders it doubtful that such knowledge is forthcoming. The complexity of the relationship between genotypic factors and realized traits (the so-called ‘G→P map’) makes the estimation of potential military advantage, as well as potential harm to our troops, incredibly challenging. Such fundamental conceptual challenges call into question our ability to ever satisfy the demands of a sufficiently rigorous ethical standard.
This paper uses a 4 × 4 expansion of the Hawk–Dove Game to illustrate how sexual drift in a large genotype space can shift a population from one equilibrium in a smaller phenotype space to another. An equilibrium is only safe from being destabilized in this way when implemented by recessive alleles.
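The phenotype-space equilibrium that such a 4 × 4 genotype-space expansion builds on is the mixed evolutionarily stable strategy (ESS) of the standard 2 × 2 Hawk–Dove game. A minimal sketch, assuming illustrative payoff values V and C not taken from the paper:

```python
# Standard 2x2 Hawk-Dove game. V (resource value) and C (fight cost)
# are illustrative choices; any C > V yields a mixed equilibrium.
V, C = 2.0, 4.0
payoff = {
    ("H", "H"): (V - C) / 2,  # two hawks escalate: split value minus cost
    ("H", "D"): V,            # hawk takes the resource from a dove
    ("D", "H"): 0.0,          # dove retreats against a hawk
    ("D", "D"): V / 2,        # two doves share the resource
}

def expected(strategy, p_hawk):
    """Expected payoff of a strategy against a population playing
    Hawk with frequency p_hawk."""
    return p_hawk * payoff[(strategy, "H")] + (1 - p_hawk) * payoff[(strategy, "D")]

# Mixed ESS in phenotype space: play Hawk with probability V/C.
p_star = V / C
print(p_star)  # 0.5
# At the equilibrium, Hawk and Dove earn the same expected payoff.
print(expected("H", p_star), expected("D", p_star))  # 0.5 0.5
```

The paper's point can then be read as a claim about which genotypic implementations of p_star (e.g. via recessive versus dominant alleles) remain stable when drift moves the population through the larger genotype space.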
In the contemporary biomedical literature, every disease is considered genetic. This extension of the concept of genetic disease is usually interpreted either in a trivial or genocentrist sense, but it is never taken seriously as the expression of a genetic theory of disease. However, a group of French researchers defend the idea of a genetic theory of infectious diseases. By identifying four common genetic mechanisms (Mendelian predisposition to multiple infections, Mendelian predisposition to one infection, and major gene and polygenic predispositions), they attempt to unify infectious diseases from a genetic point of view. In this article, I analyze this explicit example of a genetic theory, which relies on mechanisms and is applied only to a specific category of diseases, what I call “a regional genetic theory.” I have three aims: to prove that a genetic theory of disease can be devoid of genocentrism, to consider the possibility of a genetic theory applied to every disease, and to introduce two hypotheses about the form that such a genetic theory could take by distinguishing between a genetic theory of diseases and a genetic theory of Disease. Finally, I suggest that network medicine could be an interesting framework for a genetic theory of Disease.
Editor's suggested further reading in BioEssays: “Evolution in response to climate change: In pursuit of the missing evidence” and “How will fish that evolved at constant sub-zero temperatures cope with global warming? Notothenioids as a case study.”
Soft X‐ray tomography (SXT) is an imaging technique capable of characterizing and quantifying the structural phenotype of cells. In particular, SXT is used to visualize the internal architecture of fully hydrated, intact eukaryotic and prokaryotic cells at high spatial resolution (50 nm or better). Image contrast in SXT is derived from the biochemical composition of the cell, and obtained without the need to use potentially damaging contrast‐enhancing agents, such as heavy metals. The cells are simply cryopreserved prior to imaging, and are therefore imaged in a near‐native state. As a complement to structural imaging by SXT, the same specimen can now be imaged by correlated cryo‐light microscopy. By combining data from these two modalities specific molecules can be localized directly within the framework of a high‐resolution, three‐dimensional reconstruction of the cell. This combination of data types allows sophisticated analyses to be carried out on the impact of environmental and/or genetic factors on cell phenotypes.
Genes are thought to have evolved from long-lived and multiply-interactive molecules in the early stages of the origins of life. However, at that stage there were no replicators, and the distinction between interactors and replicators did not yet apply. Nevertheless, the process of evolution that proceeded from initial autocatalytic hypercycles to full organisms was a Darwinian process of selection of favourable variants. We distinguish therefore between Neo-Darwinian evolution and the related Weismannian and Central Dogma divisions, on the one hand, and the more generic category of Darwinian evolution on the other. We argue that Hull’s and Dawkins’ replicator/interactor distinction of entities is a sufficient, but not necessary, condition for Darwinian evolution to take place. We conceive the origin of genes as a separation between different types of molecules in a thermodynamic state space, and employ a notion of reproducers.
Recombination is often considered a disruptive force for well‐adapted phenotypes, but recent evidence suggests that this cost of recombination can be small. A key benefit of recombination is that it can help create proteins and regulatory circuits with novel and useful phenotypes more efficiently than point mutation. Its effectiveness stems from the large‐scale reorganization of genotypes that it causes, which can help explore far‐flung regions in genotype space. Recent work on complex phenotypes in model gene regulatory circuits and proteins shows that the disruptive effects of recombination can be very mild compared to the effects of mutation. Recombination thus can have great benefits at a modest cost, but we do not understand the reasons well. A better understanding might shed light on the evolution of recombination and help improve evolutionary strategies in biochemical engineering.
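The contrast between the small steps of point mutation and the large-scale reorganization effected by recombination can be illustrated with bit-string genotypes, a common abstraction assumed here rather than taken from the cited work:

```python
import random

random.seed(0)

L = 40  # genotype length, arbitrary for illustration
parent_a = [random.randint(0, 1) for _ in range(L)]
parent_b = [random.randint(0, 1) for _ in range(L)]

def hamming(g, h):
    """Number of loci at which two genotypes differ."""
    return sum(x != y for x, y in zip(g, h))

# A point mutation moves exactly one step in genotype space.
mutant = list(parent_a)
site = random.randrange(L)
mutant[site] ^= 1

# Uniform recombination draws each locus from either parent, so the
# offspring can land far from both parents in a single generation.
child = [random.choice(pair) for pair in zip(parent_a, parent_b)]

print(hamming(parent_a, mutant))  # 1
print(hamming(parent_a, parent_b), hamming(parent_a, child), hamming(parent_b, child))
```

For binary strings, the offspring's distances to the two parents always sum to the distance between the parents, so a single recombination event can take a step of any size up to that parental distance, while a point mutation always takes a step of size one.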
In a now classic paper published in 1991, Alberch introduced the concept of genotype–phenotype (G→P) mapping to provide a framework for a more sophisticated discussion of the integration between genetics and developmental biology than was then available. The advent of evo-devo first and of the genomic era later would seem to have superseded talk of transitions in phenotypic space and the like, central to Alberch’s approach. On the contrary, this paper shows that recent empirical and theoretical advances have only sharpened the need for a different conceptual treatment of how phenotypes are produced. Old-fashioned metaphors like genetic blueprint and genetic programme are not only woefully inadequate but positively misleading about the nature of G→P, and are being replaced by an algorithmic approach emerging from the study of a variety of actual G→P maps. These include RNA folding, protein function and the study of evolvable software. Some generalities are emerging from these disparate fields of analysis, and I suggest that the concept of ‘developmental encoding’ (as opposed to the classical one of genetic encoding) provides a promising computational–theoretical underpinning to coherently integrate ideas on evolvability, modularity and robustness and foster a fruitful framing of the G→P mapping problem.
In this article we examine the “phenotype” concept in light of recent technological advances in Genome-Wide Association Studies (GWAS). By observing the technology and its presuppositions, we put forward the thesis that, at least in this case, genotype and phenotype are effectively coidentified one by means of the other. We suggest that the coidentification of genotype–phenotype couples in expression-based GWAS also indicates a conceptual dependence, which we call “co-definition.” We note that viewing these terms as codefined runs against possible expectations, viz., that genotypes and phenotypes could ultimately be expressed independently from one another. In addition, the co-definition of genotypes and phenotypes in this context emphasizes the correlative character of both genotypes and phenotypes in GWAS.
This paper describes the historical background and early formation of Wilhelm Johannsen's distinction between genotype and phenotype. It is argued that contrary to a widely accepted interpretation his concepts referred primarily to properties of individual organisms and not to statistical averages. Johannsen's concept of genotype was derived from the idea of species in the tradition of biological systematics from Linnaeus to de Vries: An individual belonged to a group - species, subspecies, elementary species - by representing a certain underlying type. Johannsen sharpened this idea theoretically in the light of recent biological discoveries, not least those of cytology. He tested and confirmed it experimentally combining the methods of biometry, as developed by Francis Galton, with the individual selection method and pedigree analysis, as developed for instance by Louis Vilmorin. The term "genotype" was introduced in W. Johannsen's 1909 treatise, but the idea of a stable underlying biological "type" distinct from observable properties was the core idea of his classical bean selection experiment published 6 years earlier. The individual ontological foundation of population analysis was a self-evident presupposition in Johannsen's studies of heredity in populations from their start in the early 1890s till his death in 1927. The claim that there was a "substantial but cautious modification of Johannsen's phenotype-genotype distinction" from a statistical to an individual ontological perspective derives from a misreading of the 1903 and 1909 texts. The immediate purpose of this paper is to correct this reading of the 1903 monograph by showing how its problems and results grow out of Johannsen's earlier work in heredity and plant breeding. Johannsen presented his famous selection experiment as the culmination of a line of criticism of orthodox Darwinism by William Bateson, Hugo de Vries, and others.
They had argued that evolution is based on stepwise rather than continuous change in heredity. Johannsen's paradigmatic experiment showed how stepwise variation in heredity could be operationally distinguished from the observable, continuous morphological variation. To test Galton's law of partial regression, Johannsen deliberately chose pure lines of self-fertilizing plants, a pure line being the descendants in successive generations of one single individual. Such a population could be assumed to be highly homogeneous with respect to hereditary type, and Johannsen found that selection produced no change in this type. Galton, he explained, had experimented with populations composed of a number of stable hereditary types. The partial regression which Galton found was simply an effect of selection between types, increasing the proportion of some types at the expense of others.
To construct a synthetic cell we need to understand the rules that permit life. A central idea in modern biology is that in addition to the four entities making up reality (matter, energy, space and time), a fifth one, information, plays a central role. As a consequence of this central importance of the management of information, the bacterial cell is organised as a Turing machine, where the machine, with its compartments defining an inside and an outside and its metabolism, reads and expresses the genetic program carried by the genome. This highly abstract organisation is implemented using concrete objects and dynamics, and this is at the cost of repeated incompatibilities (frustration), which need to be sorted out by appropriate «patches». After describing the organisation of the genome into the paleome (sustaining and propagating life) and the cenome (permitting life in context), we describe some chemical hurdles that the cell has to cope with, ending with the specific case of the methionine salvage pathway.
This essay examines the origin of genotype-environment interaction, or G×E. "Origin" and not "the origin" because the thesis is that there were actually two distinct concepts of G×E at this beginning: a biometric concept, or G×E_B, and a developmental concept, or G×E_D. R. A. Fisher, one of the founders of population genetics and the creator of the statistical analysis of variance, introduced the biometric concept as he attempted to resolve one of the main problems in the biometric tradition of biology - partitioning the relative contributions of nature and nurture responsible for variation in a population. Lancelot Hogben, an experimental embryologist and also a statistician, introduced the developmental concept as he attempted to resolve one of the main problems in the developmental tradition of biology - determining the role that developmental relationships between genotype and environment played in the generation of variation. To argue for this thesis, I outline Fisher's and Hogben's separate routes to their respective concepts of G×E; these separate interpretations of G×E are then drawn on to explicate a debate between Fisher and Hogben over the importance of G×E, the first installment of a persistent controversy. Finally, Fisher's G×E_B and Hogben's G×E_D are traced beyond their own work into mid-20th century population and developmental genetics, and then into the infamous IQ Controversy of the 1970s.
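The biometric concept G×E_B is the interaction term left over once genotype and environment main effects are removed from a table of phenotypic means, in the spirit of Fisher's analysis of variance. A minimal sketch, with invented numbers:

```python
# Two-way decomposition of phenotypic means into genotype main effects,
# environment main effects, and a G x E interaction term (the biometric
# sense). All numbers are invented for illustration.
means = [
    [10.0, 12.0],  # genotype 1 in environments 1 and 2
    [11.0, 17.0],  # genotype 2 in environments 1 and 2
]
n_g, n_e = len(means), len(means[0])
grand = sum(sum(row) for row in means) / (n_g * n_e)
g_eff = [sum(row) / n_e - grand for row in means]
e_eff = [sum(means[i][j] for i in range(n_g)) / n_g - grand for j in range(n_e)]
interaction = [
    [means[i][j] - grand - g_eff[i] - e_eff[j] for j in range(n_e)]
    for i in range(n_g)
]
# A nonzero residual is G x E in the biometric sense: genotype
# differences depend on the environment in which they are measured.
print(interaction)  # [[1.0, -1.0], [-1.0, 1.0]]
```

Hogben's developmental concept, by contrast, is not captured by such a variance bookkeeping at all: it concerns how genotype and environment jointly generate the phenotype during development, which is part of what made the Fisher–Hogben debate persistent.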
August Weismann rejected the inheritance of acquired characters on the grounds that changes to the soma cannot produce the kind of changes to the germ-plasm that would result in the altered character being transmitted to subsequent generations. His intended distinction, between germ-plasm and soma, was closer to the modern distinction between genotype and phenotype than to the modern distinction between germ cells and somatic cells. Recently, systems of epigenetic inheritance have been claimed to make possible the inheritance of acquired characters. I argue that the sense in which these claims are true does not challenge fundamental tenets of neo-Darwinism. Epigenetic inheritance expands the range of options available to genes but evolutionary adaptation remains the product of natural selection of ‘random’ variation.
The current implementation of the Neo-Darwinian model of evolution typically assumes that the set of possible phenotypes is organized into a highly symmetric and regular space. Most conveniently, a Euclidean vector space is used, representing phenotypic properties by real-valued variables. Computational work on the biophysical genotype-phenotype model of RNA folding, however, suggests a rather different picture. If phenotypes are organized according to genetic accessibility, the resulting space lacks a metric and can be formalized only in terms of a relatively unfamiliar structure. Patterns of phenotypic evolution—such as punctuation, irreversibility, and modularity—result naturally from the properties of the genotype-phenotype map, which, given the genetic accessibility structure, define accessibility in the phenotype space. The classical framework, however, addresses these patterns exclusively in terms of natural selection on suitably constructed fitness landscapes. Recent work has extended the explanatory level for phenotypic evolution from fitness considerations alone to include the topological structure of phenotype space as induced by the genotype-phenotype map. Lewontin’s notion of “quasi-independence” of characters can also be formalized in topological terms: it corresponds to the assumption that a region of the phenotype space is represented by a product space of orthogonal factors. In this picture, each character corresponds to a factor of a region of the phenotype space. We consider any region of the phenotype space that has a given factorization as a “type”, i.e., as a set of phenotypes that share the same set of phenotypic characters. Thus, a theory of character identity can be developed that is based on the correspondence of local factors in different regions of the phenotype space.
The method in human genetics of ascribing causal responsibility to genotype by the use of heritability estimates has been heavily criticized over the years. It has been argued that these estimates are rarely valid and do not serve the purpose of tracing genetic causes. Recent contributions strike back at this criticism. I present and discuss two opposing views on these matters, represented by Richard Lewontin and Neven Sesardic, and I suggest that some of the disagreement is based on differing concepts of genetic causation. I use the distinction between structuring and triggering causes to help clarify the basis for the opposing views.
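The heritability estimates at issue are ratios of variance components. A toy sketch of broad-sense heritability, with invented numbers and under the additive assumption (no gene-environment covariance or interaction) that Lewontin's criticism targets:

```python
# Broad-sense heritability H^2 = Var(G) / Var(P), under the toy
# assumption that phenotypic variance decomposes additively as
# Var(P) = Var(G) + Var(E). All numbers are invented for illustration.
var_g = 0.6  # variance attributable to genotypic differences
var_e = 0.9  # variance attributable to environmental differences
var_p = var_g + var_e
H2 = var_g / var_p
print(round(H2, 2))  # 0.4
```

Note that H2 is a property of variation within one population in one range of environments, which is why critics deny that it traces genetic causes in any individual: change the environmental variance and the estimate changes, with no change in any genotype.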
One of the most remarkable aspects of John Maynard Smith’s work was the fact that he devoted time both to doing science and to reflecting philosophically upon its methods and concepts. In this paper I offer a philosophical analysis of Maynard Smith’s approach to modelling phenotypic evolution in relation to three main themes. The first concerns the type of scientific understanding that ESS and optimality models give us. The second concerns the causal–historical aspect of stability analyses of adaptation. The third concerns the concept of evolutionary stability itself. Taken together, these three themes comprise what I call the natural philosophy of adaptation.
How to interpret the “molecular gene” concept is discussed in this paper. I argue that the architecture of biological systems is hierarchical and multi-layered, exhibiting striking similarities to that of modern computers. Multiple layers exist between the genotype and system level property, the phenotype. This architectural complexity gives rise to the intrinsic complexity of the genotype-phenotype relationships. The notion of a gene being for a phenotypic trait or traits lacks adequate consideration of this complexity and has limitations in explaining the genotype-phenotype relationships. I explore ways toward an integrative interpretation of the gene in the context of multi-layered biological systems. A gene, I argue, should be interpreted as a functional unit that is responsible for the trans-generation passage of the capacity to dynamically produce a biochemical activity or biochemical activities. At the molecular level, a gene is a genetic unit, a stretch of DNA sequence, which dictates the behavior and the dynamic production of the encoded cellular component(s). Embedded in a gene’s quadruple DNA code are the regulatory signals, such as those for RNA splicing and/or editing, as well as for transcription factor binding. A regulatory signal can be recognized by the gene expression machinery in one state, but not in another. The confusion caused by RNA splicing, editing, and a gene’s selective tissue distribution pattern is addressed. Instead of a context-dependent definition of the gene, I argue for the view that it is the same gene displaying multiple meanings, subject to differential interpretation by the cellular machinery in different states. In other words, the same gene gives rise to different products and expression levels under different conditions.
In this book review essay, Justus discusses The Birth of the Mind: How a Tiny Number of Genes Creates the Complexities of Human Thought (2004) by Gary Marcus. The review opens by contrasting the common architectural-blueprint metaphor for the genome with an alternative: the if-then statements of a computer program. The former leads to a seeming “gene shortage” problem while the latter are better suited to representing the cascades of genetic expression that give rise to exponential genotype-phenotype relationships. The essay then develops three conceptual issues of interest to cognitive scientists in light of this small, data-compressed genome: (1) distinguishing dissociations in the developmental process from the domain-specificity of the resulting mental representations, (2) the observation that no gene is specific to a mental representation, a cortical region, or even the nervous system, and (3) the complications that a small number of genes present to the adaptationist programme in evolutionary psychology. The review concludes by questioning the utility of the Swiss Army knife as a metaphor for cognitive development and evolution.
Phenotype, whether conventional or extended, is defined as a reflection of an underlying genotype. Adaptation and the natural selection that follows from it depend upon a progressively harmonious fit between phenotype and environment. There is in Richard Dawkins' notion of the extended phenotype a paradox that seems to undercut conventional views of adaptation and natural selection. In a nutshell, if the phenotype includes an organism's environment, how then can the organism adapt to itself? The paradox is resolvable through a physiological, as opposed to a genetic, theory of natural selection and adaptation.
Embryonic development and ontogeny occupy what is often depicted as the black box between genes – the genotype – and the features (structures, functions, behaviors) of organisms – the phenotype; the phenotype is not merely a one-to-one readout of the genotype. The gene's home, context, and locus of operation is the cell. Initially, in ontogeny, that cell is the single-celled zygote. As development ensues, multicellular assemblages of like cells (modules) progressively organized as germ layers, embryonic fields, anlage, condensations, or blastemata, enable genes to play their roles in development and evolution. As modules, condensations are fundamental developmental and selectable units of morphology (morphogenetic units) that mediate interactions between genotype and phenotype via evolutionary developmental mechanisms. In a hierarchy of emergent processes, gene networks and gene cascades (genetic modules) link the genotype with morphogenetic units such as condensations, while epigenetic processes such as embryonic inductions, tissue interactions and functional integration link morphogenetic units to the phenotype. To support these conclusions I distinguish units of heredity from units of transmission and discuss epigenetic inheritance by tracing the history of the relationship between embryology and evolution, especially the role(s) assigned to cells or to cellular components in generating theories of morphological change in evolution. The concept of cells as modular morphogenetic units is modeled and illustrated using the mammalian dentary bone.
In this paper I discuss one possible extension of Richard Lewontin’s proposal in The Triple Helix. After reviewing the theoretical commitments common to discussions that assume we will be able to compute an organism from its genes, I turn to Lewontin’s arguments that we will never be able to compute phenotype from genotype because the genotype specifies an organism’s phenotype relative to a range of environments. The focus of the discussion in this paper, however, is on what might follow if we take seriously the claim that genetic structure does not determine phenotypic structure. The question is: What becomes causally efficacious in an explanation of the development of a heritable trait if genes are not sufficient? Any answer to this question, and even the question itself, is central to an understanding of the types of relations and structures into which humans enter and which they create in an environment.
Many analogies exist between the process of evolution by natural selection and of learning by reinforcement and punishment. A full extension of the evolutionary analogy to learning to include analogues of the fitness, genotype, development, environmental influences, and phenotype concepts makes possible a single theory of the learning process able to encompass all of the elementary procedures known to yield learning.
Though the target article is not without fertile suggestions, at least two problems limit its overall validity: (1) the extended gene-culture coevolutionary framework is not an alternative to standard evolutionary theory; (2) the proposed model does not explain how much time is necessary for selective pressure to determine the stabilization of a new aspect of the genotype.
A wide range of ecological and evolutionary models predict variety in phenotype or behavior when a population is at equilibrium. This heterogeneity can be realized in different ways. For example, it can be realized through a complex population of individuals exhibiting different simple behaviors, or through a simple population of individuals exhibiting complex, varying behaviors. In some theoretical frameworks these different realizations are treated as equivalent, but natural selection distinguishes between these two alternatives in subtle ways. By investigating an increasingly complex series of models, from a simple fluctuating selection model up to a finite population hawk/dove game, we explore the selective pressures which discriminate between pure strategists, mixed at the population level, and individual mixed strategists. Our analysis reveals some important limitations to the ESS framework often employed to investigate the evolution of complex behavior.
In this paper, I examine an experimental technique, gene targeting, used for establishing genotype/phenotype relationships. Through analyzing a case study, I identify many pitfalls that may lead to false conclusions about these relationships. I argue that some of these pitfalls may seriously affect gene targeting's usefulness for associating phenotypes with genes cataloged by the Human Genome Project. This case also shows the use of gene-targeted mice as model systems for studying genotype/phenotype relationships in humans. Moreover, I argue that it reveals the weakness of one attempt to draw conclusions about the biological determination of sexual and aggressive behaviors in humans.
As a consequence of the problems caused by genetic discrimination, federal and state law makers are being pressured to pass a legislative remedy. A primary question is whether the Americans with Disabilities Act of 1990 applies to individuals with a potentially disabling genetic disorder who are pre-symptomatic or asymptomatic and may never become ill and to healthy individuals who are carriers of genetic conditions. At present, this question has relevance principally for individuals with the genotype for single gene disorders, like Huntington disease and hemochromatosis, and to asymptomatic carriers of single gene disorders such as cystic fibrosis. Although many such single gene conditions exist, the total incidence of these conditions in the U.S. population is less than 0.4 percent. However, the question concerning the applicability of the ADA will become increasingly important because genetic tests will almost certainly be developed in the near future for common multifactorial diseases like diabetes, heart disease, and certain forms of cancer.
It has been argued that natural selection does not explain the genotypic and phenotypic properties of individuals. On this view, natural selection explains the adaptedness of individuals, not by explaining why the individuals that exist have the adaptations they do, but rather by explaining why the individuals that exist are the ones with those adaptations. This paper argues that this ‘Negative’ view of natural selection ignores the fact that natural selection is a cumulative selection process. So understood, it explains how the genetic sequences that individuals inherit, and that are responsible for their complex (and co-adapted) adaptations, first arose in the gene-pool.
In 1941/42 Konrad Lorenz suggested that Kant's transcendental categories of a priori knowledge could be given an empirical interpretation in Darwinian material evolutionary terms: a priori propositional knowledge was an organ subject to natural selection for adaptation to its specific environments. D. Campbell extended the conception, and termed evolution a process of knowledge. The philosophical problem of what knowledge is became a descriptive one of how knowledge developed; the normative semantic questions have been sidestepped, as if the descriptive insights would automatically resolve them. This came at a time when the traditional concept of knowledge as universally true, justified beliefs had been challenged by subjectivist, intercommunicative coherence frameworks. Much of the literature on evolutionary epistemology claimed that knowledge in general, and science as its epitome in particular, evolved along lines analogous to organic biological evolution. I refer here only to the view of knowledge as an extension of material biological evolution. These theories of evolutionary epistemology, contrary to the relativist notions of naturalized epistemology, adopted strict realist positions. Although there is no contention with the claim that biological evolution provided the raw material and the constraints for human knowledge, cognition is not knowledge and knowledge is not constrained by it beyond some trivial truisms. The view that sees evolution as a knowledge/cognition process coerces a loosely defined term into the status of a phenotypic trait on which selection could act. This disregards the intricate many-to-many relationship between correlates of knowledge and biological capacities. But even if we grant the correlates of knowledge the status of selectable traits, the heritability of alternative phenotypes would be low and unpredictable due to the high, open-ended environmental malleability of such complex characters in the course of development. Such concepts are therefore biologically inconsequential.
The critics of "hereditarianism" often claim that any attempt to explain human behavior by invoking genes is confronted with insurmountable methodological difficulties. They reject the idea that heritability estimates could lead to genetic explanations by pointing out that these estimates are strictly valid only for a given population and that they are exposed to the irremovable confounding effects of genotype-environment interaction and genotype-environment correlation. I argue that these difficulties are greatly exaggerated, and that we would be wrong to regard them as presenting a fundamental obstacle to the search for genetic explanations. I also show that, to the extent they are cogent, these objections may prove to be even more damaging to the "environmentalist" standpoint.
Despite considerable interest in viral evolution, at least among virologists, viruses are rarely considered from the same evolutionary vantage point as other organisms. Early work of necessity emphasized phenotype and phenotypic variation (and therefore arguably was more oriented towards the broader biological and ecological perspectives). More recent work (essentially since the development of molecular evolution in the 1960's but beginning earlier) has concentrated on genotypic variation, with less clarity about the significance of such variations. Other aspects of evolutionary theory, especially considerations of natural selection and of evolutionary constraints, have not widely been applied to viruses, and an evolutionary framework for virology has long been lacking. This becomes apparent in considering 'emerging' viruses, which have often been treated on an ad hoc basis. It was often felt that, because previously unrecognized viruses are involved, mechanisms of viral emergence must mirror the unpredictability of mutations in the viral genome. However, most examples of viral emergence are independent of mutation, at least initially, and are often pre-existing viruses in changed circumstances ('viral traffic'). This conclusion also readily follows from ordinary Darwinian premises, which would require that, like other living species, 'new' organisms are descended only from existing species. In this respect, from a Darwinian perspective, viruses would appear to resemble other organisms.
Brandon (1984, 1990) has argued that Salmon's (1971) concept of screening-off can be used to characterize (i) the idea that natural selection acts directly on an organism's phenotype, only indirectly on its genotype, and (ii) the biological problem of the levels of selection. Brandon also suggests (iii) that screening-off events in a causal chain are better explanations than the events they screen off. This paper critically evaluates Brandon's proposals.