In the six decades since the publication of Julian Huxley's Evolution: The Modern Synthesis, spectacular empirical advances in the biological sciences have been accompanied by equally significant developments within the core theoretical framework of the discipline. As a result, evolutionary theory today includes concepts and even entire new fields that were not part of the foundational structure of the Modern Synthesis. In this volume, sixteen leading evolutionary biologists and philosophers of science survey the conceptual changes that have emerged since Huxley's landmark publication, not only in such traditional domains of evolutionary biology as quantitative genetics and paleontology but also in such new fields of research as genomics and EvoDevo. Most of the contributors to Evolution—The Extended Synthesis accept many of the tenets of the classical framework but want to relax some of its assumptions and introduce significant conceptual augmentations of the basic Modern Synthesis structure—just as the architects of the Modern Synthesis themselves expanded and modulated previous versions of Darwinism. This continuing revision of a theoretical edifice the foundations of which were laid in the middle of the nineteenth century—the reexamination of old ideas, proposals of new ones, and the synthesis of the most suitable—shows us how science works, and how scientists have painstakingly built a solid set of explanations for what Darwin called the "grandeur" of life.
What sets the practice of rigorously tested, sound science apart from pseudoscience? In this volume, the contributors seek to answer this question, known to philosophers of science as “the demarcation problem.” This issue has a long history in philosophy, stretching as far back as the early twentieth century and the work of Karl Popper. But by the late 1980s, scholars in the field began to treat the demarcation problem as impossible to solve and futile to ponder. However, the essays that Massimo Pigliucci and Maarten Boudry have assembled in this volume make a rousing case for the unequivocal importance of reflecting on the separation between pseudoscience and sound science.
Making Sense of Evolution explores contemporary evolutionary biology, focusing on the elements of theories—selection, adaptation, and species—that are complex and open to multiple possible interpretations, many of which are incompatible with one another and with other accepted practices in the discipline. Particular experimental methods, for example, may demand one understanding of “selection,” while the application of the same concept to another area of evolutionary biology could necessitate a very different definition.
Biological research on race has often been seen as motivated by or lending credence to underlying racist attitudes; in part for this reason, philosophers and biologists have recently taken great pains to essentially deny the existence of biological human races. We argue that human races, in the biological sense of local populations adapted to particular environments, do in fact exist; such races are best understood through the common ecological concept of ecotypes. However, human ecotypic races do not in general correspond with 'folk' racial categories, largely because many similar ecotypes have multiple independent origins. Consequently, while human natural races exist, they have little or nothing in common with 'folk' races.
Introduction: How hard is the "hard core" of a scientific program? / Massimo Piattelli-Palmarini -- pt. 1. The debate: 1. Opening the debate: The psychogenesis of knowledge and its epistemological significance / Jean Piaget -- On cognitive structures and their development: a reply to Piaget / Noam Chomsky -- 2. About the fixed nucleus and its innateness: Introductory remarks / Jean Piaget -- Cognitive strategies in problem solving / Guy Cellerier -- Some clarifications on innatism and constructivism / Guy Cellerier -- 3. Artificial intelligence and general development mechanisms: The role of artificial intelligence in psychology / Seymour Papert -- 4. Initial states and steady states: The linguistic approach / Noam Chomsky -- 5. Cognitive schemes and their possible relations to language acquisition: Language and knowledge in a constructivist framework / René Thom -- Appendix C: Localist hypothesis and theory of catastrophes: note on the debate / Jean Petitot.
Phenotypic plasticity integrates the insights of ecological genetics, developmental biology, and evolutionary theory. Plasticity research asks foundational questions about how living organisms are capable of variation in their genetic makeup and in their responses to environmental factors. For instance, how do novel adaptive phenotypes originate? How do organisms detect and respond to stressful environments? What is the balance between genetic or natural constraints (such as gravity) and natural selection? The author begins by defining phenotypic plasticity and detailing its history, including important experiments and methods of statistical and graphical analysis. He then provides extended examples of the molecular basis of plasticity, the plasticity of development, the ecology of plastic responses, and the role of costs and constraints in the evolution of plasticity. A brief epilogue looks at how plasticity studies shed light on the nature/nurture debate in the popular media.
The Modern Synthesis (MS) is the current paradigm in evolutionary biology. It was actually built by expanding on the conceptual foundations laid out by its predecessors, Darwinism and neo-Darwinism. For some time now there has been talk of a new Extended Evolutionary Synthesis (EES), and this article begins to outline why we may need such an extension, and how it may come about. As philosopher Karl Popper has noted, the current evolutionary theory is a theory of genes, and we still lack a theory of forms. The field began, in fact, as a theory of forms in Darwin’s days, and the major goal that an EES will aim for is a unification of our theories of genes and of forms. This may be achieved through an organic grafting of novel concepts onto the foundational structure of the MS, particularly evolvability, phenotypic plasticity, epigenetic inheritance, complexity theory, and the theory of evolution in highly dimensional adaptive landscapes.
The so-called “New Atheism” is a relatively well-defined, very recent, still unfolding cultural phenomenon with import for public understanding of both science and philosophy. Arguably, the opening salvo of the New Atheists was The End of Faith by Sam Harris, published in 2004, followed in rapid succession by a number of other titles penned by Harris himself, Richard Dawkins, Daniel Dennett, Victor Stenger, and Christopher Hitchens.
Evolutionary theory is undergoing an intense period of discussion and reevaluation. This, contrary to the misleading claims of creationists and other pseudoscientists, is no harbinger of a crisis but rather the opposite: the field is expanding dramatically in terms of both empirical discoveries and new ideas. In this essay I briefly trace the conceptual history of evolutionary theory from Darwinism to neo-Darwinism, and from the Modern Synthesis to what I refer to as the Extended Synthesis, a more inclusive conceptual framework containing, among other things, evo–devo, an expanded theory of heredity, elements of complexity theory, ideas about evolvability, and a reevaluation of levels of selection. I argue that evolutionary biology has never seen a paradigm shift, in the philosophical sense of the term, except when it moved from natural theology to empirical science in the middle of the 19th century. The Extended Synthesis, accordingly, is an expansion of the Modern Synthesis of the 1930s and 1940s, and one that—like its predecessor—will probably take decades to complete.
In recent years, biologists have increasingly been asking whether the ability to evolve — the evolvability — of biological systems, itself evolves, and whether this phenomenon is the result of natural selection or a by-product of other evolutionary processes. The concept of evolvability, and the increasing theoretical and empirical literature that refers to it, may constitute one of several pillars on which an extended evolutionary synthesis will take shape during the next few years, although much work remains to be done on how evolvability comes about.
The concept of burden of proof is used in a wide range of discourses, from philosophy to law, science, skepticism, and even in everyday reasoning. This paper provides an analysis of the proper deployment of burden of proof, focusing in particular on skeptical discussions of pseudoscience and the paranormal, where burden of proof assignments are most poignant and relatively clear-cut. We argue that burden of proof is often misapplied or used as a mere rhetorical gambit, with little appreciation of the underlying principles. The paper elaborates on an important distinction between evidential and prudential varieties of burdens of proof, which is cashed out in terms of Bayesian probabilities and error management theory. Finally, we explore the relationship between burden of proof and several (alleged) informal logical fallacies. This allows us to get a firmer grip on the concept and its applications in different domains, and also to clear up some confusions with regard to when exactly some fallacies (ad hominem, ad ignorantiam, and petitio principii) may or may not occur.
The “demarcation problem,” the issue of how to separate science from pseudoscience, has been around since fall 1919—at least according to Karl Popper’s (1957) recollection of when he first started thinking about it. In Popper’s mind, the demarcation problem was intimately linked with one of the most vexing issues in philosophy of science, David Hume’s problem of induction (Vickers 2010) and, in particular, Hume’s contention that induction cannot be logically justified by appealing to the fact that “it works,” as that in itself is an inductive argument, thereby potentially plunging the philosopher straight into the abyss of a viciously circular argument.
We present an account of semantics that is not construed as a mapping of language to the world but rather as a mapping between individual meaning spaces. The meanings of linguistic entities are established via a “meeting of minds.” The concepts in the minds of communicating individuals are modeled as convex regions in conceptual spaces. We outline a mathematical framework, based on fixpoints in continuous mappings between conceptual spaces, that can be used to model such a semantics. If concepts are convex, it will in general be possible for interactors to agree on joint meaning even if they start out from different representational spaces. Language is discrete, while mental representations tend to be continuous—posing a seeming paradox. We show that the convexity assumption allows us to address this problem. Using examples, we further show that our approach helps explain the semantic processes involved in the composition of expressions.
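A minimal sketch of the convexity point (my own illustration, not the authors' model; the one-dimensional interval representation and the "warm" example are assumptions made purely for demonstration): the intersection of two convex regions is itself convex, so whenever two individuals' regions for a concept overlap at all, there is a coherent shared region they can converge on.

```python
# Sketch: concepts as convex regions on a single quality dimension,
# here modeled as closed intervals. The intersection of two convex
# sets is convex, so any overlap yields a well-formed shared meaning.

def interval_intersection(a, b):
    """Intersection of two closed intervals, or None if they are disjoint."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None

# Two speakers' hypothetical regions for "warm" on a temperature scale:
speaker1 = (15.0, 28.0)
speaker2 = (20.0, 35.0)

shared = interval_intersection(speaker1, speaker2)
print(shared)  # (20.0, 28.0) -- a convex region both speakers accept
```

If the regions were not convex (say, a union of two disjoint intervals), an overlap could itself be fragmented, and there would be no guarantee of a single coherent region to agree on; this is the role the convexity assumption plays in the argument.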
In a pair of recent papers, Allen Buchanan has outlined an ambitious account of the ethics of revolution and its implications for military intervention. Buchanan’s account is bold and yet sophisticated. It is bold in that it advances a number of theses that will no doubt strike the reader as highly controversial; it is sophisticated in that it rests on a nuanced account of how revolutions unfold and the constraints that political self-determination places on intervention. He argues that, despite the importance of political self-determination, humanitarian intervention may be permissible without the consent of the rebelling population. Indeed, given certain structural features of revolutions, there are often reasons to disregard the consent of the oppressed population and intervene before the revolution starts. More controversially, he argues that military force may be employed to nullify the democratic constitutional choice of the newly liberated population and impose a particular form of democratic government, if this is necessary to guarantee the conditions for the future exercise of self-determination. In this paper, I further elaborate Buchanan’s account of political self-determination and argue that once correctly understood, it places tighter constraints on intervention than Buchanan allows. Thus, his bold conclusions should be resisted.
Discussions about the biological bases (or lack thereof) of the concept of race in the human species seem to be never ending. One of the latest rounds is represented by a paper by Neven Sesardic, which attempts to build a strong scientific case for the existence of human races, based on genetic, morphometric and behavioral characteristics, as well as on a thorough critique of opposing positions. In this paper I show that Sesardic’s critique falls far short of the goal, and that his positive case is exceedingly thin. I do this through a combination of analysis of the actual scientific findings invoked by Sesardic and of some philosophical unpacking of his conceptual analysis, drawing on a dual professional background as an evolutionary biologist and a philosopher of science.
Decision-making assisted by algorithms developed by machine learning is increasingly determining our lives. Unfortunately, full opacity about the process is the norm. Would transparency contribute to restoring accountability for such systems as is often maintained? Several objections to full transparency are examined: the loss of privacy when datasets become public, the perverse effects of disclosure of the very algorithms themselves, the potential loss of companies’ competitive edge, and the limited gains in answerability to be expected since sophisticated algorithms usually are inherently opaque. It is concluded that, at least presently, full transparency for oversight bodies alone is the only feasible option; extending it to the public at large is normally not advisable. Moreover, it is argued that algorithmic decisions preferably should become more understandable; to that effect, the models of machine learning to be employed should either be interpreted ex post or be interpretable by design ex ante.
Denying Evolution aims at taking a fresh look at the evolution–creation controversy. It presents a truly “balanced” treatment, not in the sense of treating creationism as a legitimate scientific theory (it demonstrably is not), but in the sense of dividing the blame for the controversy equally between creationists and scientists—the former for subscribing to various forms of anti-intellectualism, the latter for discounting science education and presenting science as scientism to the public and the media. The central part of the book focuses on a series of creationist fallacies (aimed at showing errors of thought, not at deriding) and of mistakes by scientists and science educators. The last part of the book discusses long-term solutions to the problem, from better science teaching at all levels to the necessity of widespread understanding of how the brain works and why people have difficulties with critical thinking.
The so-called “species problem” has plagued evolutionary biology since before Darwin’s publication of the aptly titled Origin of Species. Many biologists think the problem is just a matter of semantics; others complain that it will not be solved until we have more empirical data. Yet, we don’t seem to be able to escape discussing it and teaching seminars about it. In this paper, I briefly examine the main themes of the biological and philosophical literatures on the species problem, focusing on identifying common threads as well as relevant differences. I then argue two fundamental points. First, the species problem is not primarily an empirical one, but it is rather fraught with philosophical questions that require—but cannot be settled by—empirical evidence. Second, the (dis-)solution lies in explicitly adopting Wittgenstein’s idea of “family resemblance” or cluster concepts, and in considering species as an example of such concepts. This solution has several attractive features, including bringing together apparently diverging themes of discussion among biologists and philosophers. The current proposal is conceptually independent of (though not incompatible with) the pluralist approach to the species problem advocated by Mishler, Donoghue, Kitcher and Dupré, which implies that distinct aspects of the species question need to be emphasized depending on the goals of the researcher. From the biological literature, the concept of species that most closely matches the philosophical discussion presented here is Templeton’s cohesion idea.
Genes are often described by biologists using metaphors derived from computational science: they are thought of as carriers of information, as being the equivalent of “blueprints” for the construction of organisms. Likewise, cells are often characterized as “factories” and organisms themselves become analogous to machines. Accordingly, when the human genome project was initially announced, the promise was that we would soon know how a human being is made, just as we know how to make airplanes and buildings. Importantly, modern proponents of Intelligent Design, the latest version of creationism, have exploited biologists’ use of the language of information and blueprints to make their spurious case, based on pseudoscientific concepts such as “irreducible complexity” and on flawed analogies between living cells and mechanical factories. However, the living organism = machine analogy was criticized already by David Hume in his Dialogues Concerning Natural Religion. In line with Hume’s criticism, over the past several years a more nuanced and accurate understanding of what genes are and how they operate has emerged, ironically in part from the work of computational scientists who take biology, and in particular developmental biology, more seriously than some biologists seem to do. In this article we connect Hume’s original criticism of the living organism = machine analogy with the modern ID movement, and illustrate how the use of misleading and outdated metaphors in science can play into the hands of pseudoscientists. Thus, we argue that dropping the blueprint and similar metaphors will improve both the science of biology and its understanding by the general public.
Introduction : science versus pseudoscience and the "demarcation problem" -- Hard science, soft science -- Almost science -- Pseudoscience -- Blame the media? -- Debates on science : the rise of think tanks and the decline of public intellectuals -- Science and politics : the case of global warming -- Science in the courtroom : the case against intelligent design -- From superstition to natural philosophy -- From natural philosophy to modern science -- The science wars I : do we trust science too much? -- The science wars II : do we trust science too little? -- Who's your expert? -- Conclusion : so, what is science after all?
Most biologists and some cognitive scientists have independently reached the conclusion that there is no such thing as learning in the traditional “instructive” sense. This is, admittedly, a somewhat extreme thesis, but I defend it here in the light of data and theories jointly extracted from biology, especially from evolutionary theory and immunology, and from modern generative grammar. I also point out that the general demise of learning is uncontroversial in the biological sciences, while a similar consensus has not yet been reached in psychology and in linguistics at large. Since many arguments presently offered in defense of learning and in defense of “general intelligence” are often based on a distorted picture of human biological evolution, I devote some sections of this paper to a critique of “adaptationism,” providing also a sketch of a better evolutionary theory. Moreover, since certain standard arguments presented today as “knock-down” in psychology, in linguistics and in artificial intelligence are a perfect replica of those once voiced by biologists in favor of instruction and against selection, I capitalize on these errors of the past to draw some lessons for the present and for the future.
Mayr’s proximate–ultimate distinction has received renewed interest in recent years. Here we discuss its role in arguments about the relevance of developmental to evolutionary biology. We show that two recent critiques of the proximate–ultimate distinction fail to explain why developmental processes in particular should be of interest to evolutionary biologists. We trace these failures to a common problem: both critiques take the proximate–ultimate distinction to neglect specific causal interactions in nature. We argue that this is implausible, and that the distinction should instead be understood in the context of explanatory abstractions in complete causal models of evolutionary change. Once the debate is reframed in this way, the proximate–ultimate distinction’s role in arguments against the theoretical significance of evo-devo is seen to rely on a generally implicit premise: that the variation produced by development is abundant, small and undirected. We show that a “lean version” of the proximate–ultimate distinction can be maintained even when this isotropy assumption does not hold. Finally, we connect these considerations to biological practice. We show that the investigation of developmental constraints in evolutionary transitions has long relied on a methodology which foregrounds the explanatory role of developmental processes. It is, however, entirely compatible with the lean version of the proximate–ultimate distinction.
Evolutionary biology is a field currently animated by much discussion concerning its conceptual foundations. On the one hand, we have supporters of a classical view of evolutionary theory, whose backbone is provided by population genetics and the so-called Modern Synthesis (MS). On the other hand, a number of researchers are calling for an Extended Synthesis (ES) that takes seriously both the limitations of the MS (such as its inability to incorporate developmental biology) and recent empirical and theoretical research on issues such as evolvability, modularity, and self-organization. In this article, I engage in an in-depth commentary of an influential paper by population geneticist Michael Lynch, which I take to be the best defense of the MS-population genetics position published so far. I show why I think that Lynch’s arguments are wanting and propose a modification of evolutionary theory that retains but greatly expands on population genetics.
This article focuses on the concept of social solidarity, as it is used in the Report of the International Bioethics Committee on Social Responsibility and Health. It is argued that solidarity plays a major role in supporting the whole framework of social responsibility, as presented by the IBC. Moreover, solidarity is not limited to members of particular groups, but potentially extended to all human beings on the basis of their inherent dignity; this sense of human solidarity is a necessary presupposition for a genuinely universalistic morality of justice and human rights.
In a now classic paper published in 1991, Alberch introduced the concept of genotype–phenotype (G→P) mapping to provide a framework for a more sophisticated discussion of the integration between genetics and developmental biology than was then available. The advent of evo-devo first and of the genomic era later would seem to have superseded talk of transitions in phenotypic space and the like, central to Alberch’s approach. On the contrary, this paper shows that recent empirical and theoretical advances have only sharpened the need for a different conceptual treatment of how phenotypes are produced. Old-fashioned metaphors like genetic blueprint and genetic programme are not only woefully inadequate but positively misleading about the nature of G→P, and are being replaced by an algorithmic approach emerging from the study of a variety of actual G→P maps. These include RNA folding, protein function and the study of evolvable software. Some generalities are emerging from these disparate fields of analysis, and I suggest that the concept of ‘developmental encoding’ (as opposed to the classical one of genetic encoding) provides a promising computational–theoretical underpinning to coherently integrate ideas on evolvability, modularity and robustness and foster a fruitful framing of the G→P mapping problem.
Twenty years have passed since Gould and Lewontin published their critique of ‘the adaptationist program’ – the tendency of some evolutionary biologists to assume, rather than demonstrate, the operation of natural selection. After the ‘Spandrels paper’, evolutionists were more careful about producing just-so stories based on selection, and paid more attention to a panoply of other processes. Then came reactions against the excesses of the anti-adaptationist movement, which ranged from a complete dismissal of Gould and Lewontin’s contribution to a positive call to overcome the problems. We now have an excellent opportunity for finally affirming a more balanced and pluralistic approach to the study of evolutionary biology.
In addition to considerable debate in the recent evolutionary literature about the limits of the Modern Synthesis of the 1930s and 1940s, there has also been theoretical and empirical interest in a variety of new and not so new concepts such as phenotypic plasticity, genetic assimilation and phenotypic accommodation. Here we consider examples of the arguments and counterarguments that have shaped this discussion. We suggest that much of the controversy hinges on several misunderstandings, including unwarranted fears of a general attempt at overthrowing the Modern Synthesis paradigm, and some fundamental conceptual confusion about the proper roles of phenotypic plasticity and natural selection within evolutionary theory.
Pigliucci, Massimo: A recent New York Times article has noted a new trend in secular writings, what the author, James Atlas, termed 'Can't-Help-Yourself books'. This trend includes writings by prominent scientists and secularists that are characterised by two fundamental - and equally misguided - ideas: an over-enthusiastic embrace of science, and the dismissal of much of human experience under the generic label of 'illusion'.
The term “scientism” is used in a variety of ways with both negative and positive connotations. I suggest that some of these uses are inappropriate, as they aim simply at dismissing without argument an approach that a particular author does not like. However, there are legitimate negative uses of the term, which I explore by way of an analogy with the term “pseudoscience.” I discuss these issues by way of a recent specific example provided by a controversy in the field of bioethics concerning the value, or lack thereof, of homeopathy. I then frame the debate about scientism within the broader context of C.P. Snow’s famous essay on the “two cultures.”
The meta-semiotic ideology that underpins most contemporary semiotics seems at odds with the one that underlies the attempt at planning and creating a new language. Semiotics, as well as modern linguistics, has increasingly evolved into a substantially descriptive endeavor, excluding any consistent normative purpose. Faithful to the epistemology of Ferdinand de Saussure, semiotics does not primarily aim either at pointing out supposed flaws of this or that language or at proposing some new linguistic forms meant to fix them. The article analyses linguistic utopias from the perspective of present-day semiotics.
In this paper I outline a theory of legitimacy that grounds the state’s right to rule on a natural duty not to harm others. I argue that by refusing to enter the state, anarchists expose those living next to them to the dangers of the state of nature, thereby posing an unjust threat. Since we have a duty not to pose unjust threats to others, anarchists have a duty to leave the state of nature and enter the state. This duty correlates to a claim-right possessed by those living next to them, who also have a right to act in self-defence to enforce this obligation. This argument, if successful, would be particularly attractive, as it provides an account of state legitimacy without importing any normative premises that libertarians would reject.
This paper outlines a critique of the use of the genetic variance–covariance matrix (G), one of the central concepts in the modern study of natural selection and evolution. Specifically, I argue that for both conceptual and empirical reasons, studies of G cannot be used to elucidate so-called constraints on natural selection, nor can they be employed to detect or to measure past selection in natural populations – contrary to what is assumed by most practicing biologists. I suggest that the search for a general solution to the difficult problem of identifying causal structures given observed correlations has led evolutionary quantitative geneticists to substitute statistical modeling for the more difficult, but much more valuable, job of teasing apart the many possible causes underlying the action of natural selection. Hence, the entire evolutionary quantitative genetics research program may be in need of a fundamental reconsideration of its goals and how they correspond to the array of mathematical and experimental techniques normally employed by its practitioners.
Crimes against humanity are supposed to have a collective dimension with respect both to their victims and their perpetrators. According to the orthodox view, these crimes can be committed by individuals against individuals, but only in the context of a widespread or systematic attack against the group to which the victims belong. In this paper I offer a new conception of crimes against humanity and a new justification for their international prosecution. This conception has important implications as to which crimes can be justifiably prosecuted and punished by the international community. I contend that the scope of the area of international criminal justice that deals with basic human rights violations should be wider than is currently acknowledged, in that it should include some individual violations of human rights, rather than only violations that have a collective dimension.
Provided that traditional jus ad bellum principles are fulfilled, military humanitarian intervention to stop large scale violations of human rights (such as genocide, crimes against humanity or war crimes) is widely regarded as morally permissible. In cases of “supreme humanitarian emergency”, not only are the victims morally permitted to rebel, but other states are also permitted to militarily intervene. Things are different if the human rights violations in question fall short of supreme humanitarian emergency. Because of the importance of respecting political self-determination, in cases of “ordinary oppression”, we normally think that rebellion might be permissible, but not military humanitarian intervention. Thus, according to the received view, the conditions for the permissibility of intervention coincide with the conditions for the permissibility of revolution in cases of supreme humanitarian emergency, but not in cases of ordinary oppression. In cases of ordinary oppression there is an asymmetry between the conditions for the permissibility of revolution and intervention (call this the Asymmetry View). Should we accept the Asymmetry View? I answer this question by outlining an account of political self-determination and by illustrating the complex role that this notion should play in discussing the morality of revolution and intervention.
The idea of phenotypic novelty appears throughout the evolutionary literature. Novelties have been defined so broadly as to make the term meaningless and so narrowly as to apply only to a limited number of spectacular structures. Here I examine some of the available definitions of phenotypic novelty and argue that the modern synthesis is ill equipped to explain novelties. I then discuss three frameworks that may help biologists gain better insight into how novelties arise during evolution but warn that these frameworks should be considered in addition to, and not as potential substitutes for, the modern synthesis.
Science and philosophy have a very long history, dating back at least to the 16th and 17th centuries, when the first scientist-philosophers, such as Bacon, Galilei, and Newton, were beginning the process of turning natural philosophy into science. Contemporary relationships between the two fields are still to some extent marked by the distrust that maintains the divide between the so-called “two cultures.” An increasing number of philosophers, however, are making conceptual contributions to sciences ranging from quantum mechanics to evolutionary biology, and a few scientists are conducting research relevant to classically philosophical fields of inquiry, such as consciousness and moral decision-making. This article will introduce readers to the borderlands between science and philosophy, beginning with a brief description of what philosophy of science is about, and including a discussion of how the two disciplines can fruitfully interact not only at the level of scholarship, but also when it comes to controversies surrounding public understanding of science.
Phenotypic Evolution explicitly recognizes organisms as complex genetic-epigenetic systems developing in response to changing internal and external environments. As a key to a better understanding of how phenotypes evolve, the authors have developed a framework that centers on the concept of the Developmental Reaction Norm. This encompasses their views: (1) that organisms are better considered as integrated units than as disconnected parts (allometry and phenotypic integration); (2) that an understanding of ontogeny is vital for evaluating evolution of adult forms (ontogenetic trajectories, epigenetics, and constraints); and (3) that environmental heterogeneity is ubiquitous and must be acknowledged for its pervasive role in phenotypic expression.
There are two natural ways of thinking about negation: (i) as a form of complementation and (ii) as an operation of reversal, or inversion (to deny that p is to say that things are “the other way around”). A variety of techniques exist to model conception (i), from Euler and Venn diagrams to Boolean algebras. Conception (ii), by contrast, has not been given comparable attention. In this note we outline a twofold geometric proposal, where the inversion metaphor is understood as involving a rotation or a reflection, respectively. These two options are equivalent in classical two-valued logic but they differ significantly in many-valued logics. Here we show that they correspond to two basic sorts of negation operators—Post’s and Kleene’s—and we provide a simple group-theoretic argument demonstrating their generative power.
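The contrast between the two operators can be sketched computationally (my own illustration, not the note's group-theoretic argument): with n truth values represented as 0, ..., n-1 (0 = false, n-1 = true), Kleene's negation reflects a value across the midpoint of the scale, while Post's negation rotates it cyclically by one step.

```python
# Sketch: reflection-style (Kleene) versus rotation-style (Post) negation
# on n truth values 0, 1, ..., n-1, where 0 is "false" and n-1 is "true".

def kleene_neg(x, n):
    """Reflection: mirror the value across the midpoint of the scale."""
    return (n - 1) - x

def post_neg(x, n):
    """Rotation: cyclically shift the value by one step."""
    return (x + 1) % n

# In classical two-valued logic the two operators coincide...
two_valued = all(kleene_neg(x, 2) == post_neg(x, 2) for x in range(2))

# ...but in three-valued logic they come apart:
three_valued = [(x, kleene_neg(x, 3), post_neg(x, 3)) for x in range(3)]

print(two_valued)     # True
print(three_valued)   # [(0, 2, 1), (1, 1, 2), (2, 0, 0)]
```

Note that the reflection is an involution (applying it twice gives the identity), whereas the rotation only returns to the identity after n applications; this difference is what the note's group-theoretic treatment exploits.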