The law has traditionally taken a “hands off” approach to the regulation of basic scientific research and knowledge (Goldberg 1994). While the applications of science and technology are often regulated as they reach the marketplace, the inquiry into basic scientific knowledge has generally escaped direct regulation. Scientific research and knowledge have of course been suppressed based on substantive content in other historical and geographical contexts, but one of the fundamental principles of modern democratic societies is the freedom of scientific research and inquiry.

To be sure, scientific research is subject to a growing number of regulations to control direct risks created by the research for the researchers, their research subjects, or surrounding communities. While the cumulative impact of these regulations of science is substantial, approaching what one leading scientist describes as “chaotic strangulation” (Kennedy 1998), they are still for the most part “incidental” regulations that limit where, when and how scientific research is done, but do not impose substantive limits that restrict certain types of research from being done based on concerns regarding the knowledge created by the research. The current regulatory restrictions on science thus address the means rather than the ends of scientific inquiry, much like the “place and time” restrictions on free speech permitted under the First Amendment, which allows government to place reasonable controls on when and where free speech is exercised but much more strongly prohibits restrictions on the content of free speech.

In recent years, the premise that scientific research and knowledge should be free from substantive regulation and moral impediments has increasingly been called into question (see Part I below). One response to the growing chorus of calls to restrict or prohibit certain areas of scientific inquiry is to argue that such restrictions on scientific research would be unconstitutional restrictions on free speech and inquiry (Andrews 1998; Charo 2006; National Research Council (NRC) 2003; Santosuosso et al. 2007). The general thought behind this idea is that, if the First Amendment protects a marketplace of ideas, it is likely that it would also protect the generation of information to be included in the marketplace. In fact, the Supreme Court, in Branzburg v. Hayes, specifically compared the act of acquiring information by science researchers to that of the press (Branzburg v. Hayes 1972). However, the protection of scientific research by either the First Amendment or the Fourteenth Amendment of the U.S. Constitution is untested and highly uncertain (Irwin 2005; Weinstein 2009).

There are, however, other, non-constitutional problems with using law to restrict or prohibit certain types of scientific research. These limitations include (i) the inherent imprecision of law for regulating complex and rapidly evolving activities such as scientific research; (ii) the difficulties of enforcing legal restrictions on an activity that is international in scope; (iii) the difficulty of predicting the consequences of restricting specific branches of scientific research; (iv) legislative inertia; and (v) the susceptibility of legislators and regulators to inappropriate factors and influences (see Part II below). These concerns are discussed below and suggest that great caution should be used in applying the law to restrict scientific inquiry in all but the most extreme cases, and that a combination of alternatives to traditional legal tools, such as norms, codes of conduct, and voluntary standards, may be more effective in regulating problematic scientific research (see Part III below).

The Growing Chorus for Restricting Science

There is a strong presumption in modern industrialized democracies, endorsed by most scientists, in favor of minimal government interference in the content of basic scientific research (Baltimore 2005). An example of this presumption is National Security Decision Directive 189, issued by U.S. President Reagan in 1985 and adhered to by every U.S. Administration since, which states that it is the government’s policy that, “to the maximum extent possible, the products of fundamental research [shall] remain unrestricted” (National Security Decision Directive 1985). As John Marburger, Director of the White House Office of Science and Technology Policy under the George W. Bush administration, trenchantly summarized this common view, “[w]here the marketplace of ideas is regulated, the quality of thought diminishes, and science suffers” (Marburger 2003). Under this traditional paradigm, science is regulated by scientists rather than lawmakers: “The central reality…is that the scientific community in America today is a self-governing republic. Scientists, not governments or voters, decide what is good and what is bad in science” (Goldberg 1994).

In recent years, the regulatory immunity of basic scientific research has come under increasing attack. These challenges to the “hands off” approach to basic science stem from several sources. One key determinant is the enormous power of the knowledge and technologies that could be unleashed as a result of scientific research in emerging fields such as genomics, biotechnology, synthetic biology, cognitive science and nanotechnology. While these emerging technologies have the potential for many benefits, they also tend to be dual-use, capable of both good and pernicious applications. Unlike potentially dangerous dual-use research of the past such as nuclear research, these new types of research can be undertaken with relatively little equipment and infrastructure, and thus are much less subject to centralized control and monitoring than has been the case in the past (Atlas 2005). As the possible risks of pernicious applications of these new emerging technologies increase with the power, availability, and accessibility of the technology, the progress of scientific research may reach a point where the results of that research could have devastating consequences, possibly even risking the continued existence of the species (so-called “existential” or “catastrophic” risks) (Bostrom 2002; Posner 2004). This is the argument for foregoing the risks along with the benefits by prohibiting the research altogether (Jonas 1984). As Bill Joy, co-founder of Sun Microsystems, wrote in his influential article published in Wired magazine, “if our own extinction is a likely, or even possible, outcome of our technological development, shouldn’t we proceed with great caution?” (Joy 2000). Such catastrophic risks have only become apparent in recent decades, and create a new and compelling case for restricting some types of scientific research (Atkinson 1978; Rees 2003).

A second and related concern is based on a type of technological determinism, and embodies the fear that if the research community develops a technology with potentially destructive applications, someone somewhere will eventually and inevitably use it in such a way. In a speech to the National Academy of Sciences in 2003, George Poste, Director of the Biodesign Institute at Arizona State University, warned: “If you actually look at the history of the assimilation of technological advance into the calculus of military affairs, you cannot find a historical precedent in which dramatic new technologies that redress military inferiority are not deployed” (Quoted in Williams 2006). Thus, it may be prudent never to let the genie out of the bottle in the first place. For example, shortly after the 9/11 terrorist attacks on New York City, the New York Times printed a discussion among several technology experts about the implications of those attacks for technology. One expert stated that 9/11 demonstrated that humans were capable of unimaginable evil, and thus the species might be better off foregoing powerful new technologies such as nanotechnology that could potentially be used in future, even-more-terrible terrorist acts (Kolata 2001).

Unfortunately, many of the technologies that may be used for malevolent purposes are initially developed for peaceful commercial applications. An example is the development and use of bioengineered weapons. As Brian Rappert, an international expert on codes of conduct at the University of Exeter, pointed out (Rappert 2003), knowledge and techniques gained from genetics, genomics, chemistry, and bioinformatics have increased rapidly, and are quickly becoming widely commercialized. With this increase in commercialization comes an increased risk that these technologies will be used for malevolent purposes. Some specific examples include (Fraser and Dando 2001):

  1. Survivability of a particular organism in various harsh environmental conditions can be improved through gene splicing, which could make eradication of that organism much more difficult.

  2. Pathogenic bacterial agents can be made resistant to current antibiotics. Reports regarding the Soviet weapons program indicate that it is possible to engineer the causal agent of plague to be resistant to several antibiotics (Barnaby 1997).

  3. Genes responsible for an organism’s pathogenicity can be transferred into non-pathogenic bacteria, causing the newly engineered bacteria to exhibit pathogenic qualities. Further, the transfer of pathogenic genes into an already pathogenic organism can result in an organism with increased virulence.

  4. Pathogenic bacteria can be modified to produce greater quantities of toxins.

  5. Pathogenic bacteria can be genetically engineered to hide the telltale signatures used for identification, which would otherwise allow an appropriate method of eradication to be selected. It has been reported that Russian scientists were able to produce a strain of anthrax that rendered anthrax detection tests ineffective.

All of these potential malevolent uses of molecular biology could result from, or at least be enabled by, well-intentioned health research intended to improve the detection and treatment of disease. Such examples strengthen the case for foregoing some kinds of promising beneficial research that could also be used, intentionally or accidentally, in ways that are destructive and even catastrophic.

A third reason for increasing calls for restricting some research, closely related to the second reason above, relates to several recent high-profile incidents involving publication of scientific studies that many scientists and non-scientists believe should not have been undertaken, or at least not published, undermining confidence in scientific self-regulation and restraint as an effective control on dangerous science. These incidents have created growing interest and pressure for more formal and mandatory limits on scientific inquiry that may have dangerous applications. A prominent example is the publication of an Australian study in which the mousepox virus was modified in a way that unexpectedly enabled it to overcome vaccine-induced immunity (Jackson et al. 2001), a result that could potentially be used to genetically engineer a more dangerous human virus (e.g., smallpox) (Stephenson 2001). Other examples of controversial studies include the published synthesis of poliovirus from its genomic sequence (Cello et al. 2002; Wimmer 2006), the reconstruction of the 1918 influenza virus (Kurzweil 2005; Tumpey et al. 2005), a published study showing how terrorists could contaminate the milk supply with botulinum toxin (Alberts 2005; Wein and Liu 2005), and studies involving implantation of human embryonic cells into primates (Greene et al. 2005; Ourednik et al. 2001). The failure of scientific self-restraint to forgo such ethically problematic studies, along with the increasing commercialization of basic science, has convinced some critics that scientists are too self-interested and permissive to have sole discretion on what studies are done. As one critic, Sue Mayer of Genewatch in the United Kingdom, recently stated with respect to experiments involving the synthesis of novel life forms, “[s]cientists creating new life forms cannot be allowed to act as judge and jury…Public debate and policing is needed” (Sample 2006).

Fourth, scientific research is increasingly venturing into areas that evoke intense religious, ethical and social concerns. These scientific developments are seen as intruding on fundamental beliefs about “what it means to be human,” and create fears of Aldous Huxley’s Brave New World becoming a reality. For example, biotechnology research has been described as having the potential “to change human nature and therefore the way that we think of ourselves as a species” (Fukuyama 2002). A loose network of individuals and groups from both the left and right of the political spectrum, including pro-life Christians, environmentalists concerned with the sanctity of nature, and people concerned about the potential return of eugenics, have proposed banning technologies such as in vitro fertilization, stem cell research, transgenic crops, and human cloning based on ethical and moral concerns (Fukuyama 2002; Jasanoff 2005). Some jurisdictions have already banned certain types of scientific research as a result of such concerns. For example, Germany and France were among the first nations to ban research that could lead to human reproductive cloning. Germany, taking a strict stance, banned all genetic research on human embryos in 1990, and France adopted legislation which severely restricted the use of frozen embryos in medical research (effectively banning human reproductive cloning) in June of 2001.

Fifth, there is now an organized constituency of public interest organizations and scholars in fields such as science studies who actively advocate restrictions on science. These groups and individuals perceive a history and pattern of science being misused and misdirected, and call for a more intrusive public control of science. Growing out of the Science-for-the-People movement in the 1970s, these activists and scholars challenge the privileged position that science has enjoyed over the past half-century, and call for democratization of scientific priorities and directions that in many cases include prohibiting lines of research that are perceived as inconsistent with a broader public interest.

Finally, public opinion about scientific research has shifted markedly over the past 50 years (Piller 1991). In the scientific exuberance of the post-World War II era, science was seen as both pure and beneficial, the stepping stone to progress that would make life more enjoyable, convenient and healthy. Today, the view of science is more jaded, with growing emphasis in the popular media and in public opinion on the sinister side of science, which is increasingly portrayed as a potentially dangerous undertaking that can be manipulated to serve the narrow, self-interested perspectives of the powerful interests that fund research. For example, according to a survey conducted by Virginia Commonwealth University in 2004, while 90% of American respondents agreed that “developments in science have helped make society better,” 61% of respondents agreed that “scientific research these days doesn’t pay enough attention to the moral values of society” (VCU Center for Public Policy 2004). A study by the National Science Foundation found that while 84% of Americans are positive overall about the prospects of science, more than half of respondents nevertheless agreed that “scientific research these days doesn’t pay enough attention to the moral values of society” and that “scientific research has created as many problems for society as it has solutions” (NSF 2006).

The coalescence of these forces and trends has created a groundswell of support for piercing the once sacrosanct, self-governing kingdom of science, and for imposing legal prohibitions on the types of science that are conducted. Prominent scholars such as Francis Fukuyama have called for legislation banning some areas of cognitive science and biotechnology, arguing that “it is time to move from thinking to acting, from recommending to legislating. We need institutions with real enforcement powers” (Fukuyama 2002). The U.S. House of Representatives responded to such calls by passing legislation in 2001 and again in 2003 that would have criminalized the creation of embryonic stem cells even for therapeutic uses, because it involves the destruction of an embryo, but both bills died in the Senate without being enacted (H.R. 534 2003). With recent changes in the composition of Congress, federal legislative efforts to criminalize embryonic stem cell research have stalled, and indeed the momentum is in favor of expanding funding for it. Nevertheless, a substantial minority of both chambers of Congress currently supports a criminal ban on embryonic stem cell research, and such efforts could be resurrected in the future if the balance of power again shifts. Meanwhile, most of the efforts to ban such research have shifted to the states. For example, a coalition of organizations attempted (unsuccessfully) to put an initiative on the ballot in the state of Missouri for the 2008 election to preemptively ban most human engineering (Elliot Institute 2009).

While virtually everyone can imagine some scientific experiments that they believe should not be pursued, reaching decisions on precisely which research should not be undertaken and implementing prohibitions of such research are more problematic.

Potential Problems with Restricting Science

When people speak of the need to prohibit certain lines of scientific research, they are almost always explicitly or implicitly recommending that law, in the form of legislation or regulation, would be the instrument used to impose the restriction. Law is indeed a powerful and often effective tool for achieving societal change and controlling human behavior. At the same time, law has important limitations in its capabilities, and some of these limitations are particularly salient for the regulation of science.

Unenforceability

One of the classic norms of science is that it is universal, in that the geographical sites at which research is conducted should not matter in terms of the meaning, applicability or contribution of the research results (Merton 1973). Given that the knowledge and risks generated by controversial research will be generated regardless of where the research is conducted, any attempt by one country to ban certain types of research may be undermined if the research proceeds in other jurisdictions. As a recent report by the U.S. National Research Council (NRC) noted, “any serious attempt to reduce the risks associated with biotechnology must ultimately be international in scope, because the technologies that could be misused are available and being developed throughout the globe” (National Research Council (NRC) 2003).

Scientists who reside in a country that prohibits a particular type of research can move to a country with less stringent policies, where their research may be not only allowed but even actively supported and valued. An example is the geographical migration of embryonic stem cell research to hospitable jurisdictions, where already some prominent scientists have changed locations and jobs to conduct their research in a more favorable regulatory and economic environment (Philipkoski 2006). No sovereign nation-state, many argue, can thus regulate or ban any technological innovation, because the research and development will simply move to another jurisdiction (Fukuyama 2002).

The problem of circumventing scientific prohibitions by off-shoring the research to other jurisdictions could in principle be addressed by international regulations prohibiting the same research. This has proven enormously difficult to accomplish in practice, however, due to important differences between nations in political, religious, legal, economic and scientific perspectives and institutions (Fukuyama 2002). A recent example is the unsuccessful attempt to impose an international prohibition on cloning through the United Nations. Although every nation on record was in favor of a United Nations declaration banning reproductive cloning, the effort faltered due to disagreement on whether to also include a ban on therapeutic cloning, an issue on which nations differed in their religious and ethical positions (Todres et al. 2006; Cameron and Henderson 2008).

Even within a jurisdiction that seeks to ban specific lines of research, enforcing the prohibition could be problematic. Renegade scientists could defy the prohibition, and the ability to verify what type of research a laboratory is conducting is limited, particularly for privately funded laboratories and for dual-use technologies.

Legislative Imprecision

Congressional legislation always faces the problem of addressing future, unanticipated situations. This problem is particularly acute for rapidly evolving scientific fields. Legislation necessarily imposes a static definition of, and restriction on, certain types of dangerous research, and such definitions are prone to being quickly outdated by unanticipated developments. As Nobel Prize-winning scientist David Baltimore warned, “So if you wanted to cut off an area of fundamental research, how would you be able to devise the controls? I contend that it would be impossible” (Baltimore 2005). Any prohibition on scientific research is likely to be made obsolete by new scientific developments that either circumvent the existing prohibition or create complexity and confusion about the definition and applicability of the research prohibitions.

The challenge of regulating a dynamic field such as science is exacerbated by most legislators’ lack of scientific expertise. Legislators unfamiliar with complex scientific issues are likely to include errors or ambiguities in legislation that may not become apparent until well after enactment. For example, in response to the public outcry about the cloning of Dolly, many state legislatures enacted poorly defined and structured legislation that inadvertently had the effect of criminalizing virtually all genomic and biotechnology research (by prohibiting cloning of any person, cell or molecule) or of criminalizing the birth of natural monozygotic twins (by prohibiting creation of genetically identical individuals) (Greely 1998). Legislation to ban human cloning has been introduced in at least twenty-six states, with at least 14 different definitions of human cloning being used (Greely 1998; see NCSL 2008). In five states (Indiana, North Carolina, South Carolina, Tennessee, and New York), proposed legislation defined human cloning as the “growing or creation of a human being from a single cell or cells of a genetically identical human being through asexual reproduction” (Greely 1998). The definition used in those bills applies more directly to botanical than to human cloning, in which a cutting from a parent plant is used to create genetically identical offspring. This definition does not, however, prohibit cloning via somatic cell nuclear transfer, as was intended. Beyond that, the language of these bills appears to criminalize the creation of monozygotic twins, surely not what the legislators intended.

Another example, more innocuous in effect but again illustrating the problem of lay politicians regulating complex scientific areas, is the ordinance (Measure H) passed in March 2004 by Mendocino County in California. This provision prohibited the growing of organisms genetically modified by the transfer of non-species-specific DNA, but erroneously defined DNA as “a complex protein that is present in every cell of an organism…” (Campaign for a GMO Free Mendocino County 2003). These anecdotal examples create doubts about the capabilities of legislators to craft carefully limited prohibitions of certain types of scientific research.

Moreover, once the door to legislative restrictions on fundamental research is opened, there will be a temptation for legislators to intrude further and further into legitimate scientific inquiry. As historian of science Loren Graham warned, policymakers should “resist slipping inadvertently into increasing controls over fundamental science, since such controls can easily lead to abuses” (Graham 1979).

Serendipity and Limited Predictability

Another inherent problem with trying to regulate science is that scientific discovery almost always proceeds in unanticipated directions (Wolpert 2007). It is therefore highly unlikely that anyone, whether scientist or policy maker, will have the foresight to determine accurately which technologies might result from a given line of research, including which research directions will or will not result in unacceptable risks (Baltimore 2005). By cutting off specific branches of scientific inquiry, society may not only foreclose potentially harmful or unethical advances, but could also be preventing the discovery of very beneficial yet unanticipated knowledge (Fraser and Dando 2001).

Many important scientific discoveries have been made by accident. Insulin, for example, was discovered as a result of two German scientists’ experimentation on the pancreas in 1889 as part of an investigation into the process of digestion (Roberts 1989). Further, in the 1960s, few would have predicted that research on bacterial restriction endonucleases would lead to powerful new research tools that formed the backbone of all biotechnology research by making it possible to cut and splice DNA at specific known sequences. There are many similar stories of accidental discoveries that have influenced several fields of science and medicine, for example, penicillin, nitrous oxide and ether as anesthetics, oxygen, the effect of light on infant jaundice, the presence of cholesterol receptors, the chirality or ‘handedness’ of molecules, X-rays, radiation, and a multitude of drugs that are now used to treat human ailments (Roberts 1989).

Molecular biologist Robert Weinberg of the Massachusetts Institute of Technology, after noting over half a dozen examples of “serendipitous discoveries” that have significantly advanced understanding of cancer, noted that “[n]o one could have predicted how these discoveries would arise and play themselves out” (Weinberg 1994). Similarly, John Pike of the Federation of American Scientists has stated: “Half of the money [spent on scientific research] is being wasted. You just don’t know which half. Much of that money is not going to have any practical benefit, but there’s no way of knowing in advance what will and what won’t” (Price 1999).

In at least some cases, the very type of research society might seek to ban as too risky may produce breakthroughs that could help address those very risks (Ehrlich 2006). For example, biotechnological research offers the promise of new diagnostics and therapeutics that may offer the best protection not only against bioweapons created by biotechnology, but also against naturally occurring contagions such as avian flu or SARS (Niiler 2002). Thus, banning certain areas of scientific research may make the world more, not less, dangerous (Freitas 2006). Similarly, while embryonic stem cell research is morally objectionable to many people because it involves the destruction of human embryos, advances in embryonic research have produced promising breakthroughs that may permit the isolation of embryonic stem cells without the need to destroy embryos, thus alleviating many of the ethical objections to such technologies (Devolder 2006; President’s Council on Bioethics 2005).

The reverse is also true of course. A scientist may pursue a clearly beneficial scientific objective with pure and altruistic intentions, yet unintentionally unleash new knowledge with potentially undesirable or immoral applications. For example, Ian Wilmut’s goal was to replicate and perpetuate animals carrying a valuable genome (for instance, genetically modified sheep that produce medically valuable proteins in their milk), which resulted in the birth of Dolly on July 5, 1996, a key enabling step to the potential future cloning of humans (Wilmut et al. 2000). Indeed, it is possible to imagine some unexpected, horrendous finding or result that could conceivably result from many, perhaps most, scientific studies, making delineation of experiments that should proceed from those that should not exceedingly difficult if not impossible because of the inherent uncertainty of scientific inquiry (Epstein 2001).

Given that serendipity is an inherent trait of scientific exploration, any attempt to block a line of scientific inquiry carries serious risks of depriving society of beneficial and important new knowledge. As the then Director of the National Science Foundation wrote in 1978, “[s]ince the results of basic research cannot be predicted in advance, scientists, to be effective, must be in a position to follow up on research leads as their training, intuition and ingenuity compel them. This is the primary reason why freedom of inquiry is essential to the conduct of truly innovative research” (Atkinson 1978). Any prohibition of scientific research risks foreclosing unanticipated and highly valuable new lines of discovery. This inherent trade-off again argues for caution in prohibiting certain types of research altogether, and for tailoring any such restrictions as narrowly as possible.

Legislative Inertia

Once a law has been promulgated, it is often very difficult to get that law revised, even when it has become obviously obsolete. This problem is particularly acute with respect to regulation of rapidly emerging scientific fields, where laws can quickly become outdated. As far back as 1986, the U.S. Office of Technology Assessment (OTA) noted that “[o]nce a relatively slow and ponderous process, technological change is now outpacing the legal structure that governs the system, and is creating pressures on Congress to adjust the law to accommodate these changes” (Office of Technology Assessment 1986). Many attributes of the legal system suggest that it may encounter problems in keeping up with rapidly emerging technologies.

The legislative process is notoriously slow, with Congress and state legislatures only capable of addressing a small subset of the plethora of potential issues before them in any legislative session. Issues are often not addressed on the basis of their importance, but rather as a function of headlines, opportunity, and political expediency. Thus, a given issue may only be addressed by Congress during an infrequent “window” when various factors converge to elevate the issue to the front of the priority line (Kingdon 1995). Once Congress has acted on an issue during the window of opportunity, it may be years or even decades before it revisits the issue, creating the risk of outdated legislation that remains in effect simply as a reflection of legislative inertia. Statutes also fail to adapt because of political gridlock, where legislators agree that an existing statute is out-of-date, but cannot agree on how it should be changed, resulting in the prolonged life of an outdated statute.

Regulatory processes by federal and state agencies have become slower at the same time that science and technology are speeding up. In the United States, regulatory agencies are required, both to meet legislatively imposed requirements and to survive judicial review, to undertake an ever-increasing burden of analytical requirements to support their regulatory decisions. The increasing complexity of the European Union regulatory processes similarly slows regulatory initiatives. As issues involving technology become more complex, more stakeholders become involved in regulatory processes, further slowing the potential for rapid regulatory action. These and other requirements have resulted in what is referred to as the “ossification” of rulemaking (McGarity 1992), whereby promulgation of new regulations becomes increasingly delayed and difficult, resulting in ineffective and out-dated rules.

The system of judicial case-law is deliberately structured to provide a conservative brake on rapid change in order to provide stability and predictability in the legal system. Thus, common law courts adhere to precedent, often following cases decided decades or even centuries earlier (albeit with some flexibility to depart from such historical decisions in light of new facts, laws, and social views), and the Supreme Court adheres to the principle of stare decisis, that is, it does not depart from previous Supreme Court decisions except in exceptional circumstances. The process of litigation is also often lengthy, as a single case can take many years to progress through the process from filing of a complaint to a final appellate decision, further increasing the likelihood that a judicial opinion might be outdated even at the time it is issued.

These dynamics of legislative, regulatory and judicial legal actors all suggest that legal restrictions on science may have problems keeping pace with exponentially changing technologies. A prime example of the legal system responding too slowly to changes in science and technology is the Delaney Clause, a 1958 amendment to the Food, Drug, and Cosmetic Act sponsored by Congressman James Delaney of New York. The Delaney Clause prohibited any food additive that was “found to induce cancer in man, or, after tests, found to induce cancer in animals.” At the time the clause was enacted, carcinogens were viewed as relatively rare substances that could be completely eliminated from the human diet. Soon thereafter, though, evolving scientific knowledge suggested that at least half of all chemicals could cause some form of cancer in animal tests at very high doses, and that almost every food additive and most “natural” foods contained potential carcinogens at trace levels. Regulatory agencies such as the Food and Drug Administration (FDA) and the Environmental Protection Agency (EPA) attempted to circumvent the rigid language of the Delaney Clause by proposing that additives with trivial cancer risks be exempted, but were repeatedly rebuffed by courts, which insisted that only Congressional action could change the outdated assumptions underlying the Delaney Clause (Merrill 1988). It was not until 1996 that Congress finally stepped in to update the statute, decades after it was known to be scientifically obsolete and untenable in practice.

These dynamics suggest that legal restrictions on areas of scientific research, particularly in the form of legislation imposing criminal prohibitions and penalties, may be too blunt, inflexible and permanent to deal effectively with rapidly developing scientific fields (Caulfield 2001).

Parochial Interests

The strength of the legislative process, and to a somewhat lesser extent regulation, is that it is responsive and accountable to public pressures and influences. Of course, this strength is also a weakness to the extent that vested interests distort the outcome of legislative or regulatory decisions to favor partisan or parochial interests over the broader public interest. Any issue involving science, ethics and policy is likely to become a “political football” in the regulatory venue, which can produce an outcome that is partisan, unpredictable, or overly broad (Wolf 1997). One example is the effort by many members of Congress in 2001 and subsequent years to hold hostage proposed legislation prohibiting reproductive cloning, a ban that is almost universally supported, unless and until a more controversial ban on therapeutic cloning was also enacted simultaneously. Another example is the FDA’s lengthy delay in approving over-the-counter sales of the “morning after” birth-control pill, apparently based on ideological and political reasons. Perhaps because of the potential for partisanship, courts have been very skeptical of legislators relying on general ethical and moral concerns as a legitimate rationale for restricting the rights of others (Irwin 2005). Moreover, today’s ethical consensus may become tomorrow’s outdated sensibilities, as evidenced by examples such as heart transplants, in vitro fertilization (IVF), and even sperm donation, which were all once ethically objectionable to many but are now commonly accepted (Caulfield 2001).

This tendency to let political and ideological interests override the public interest is connected to what has been called the “action bias” of legislators and regulators, which involves a tendency to take short-term action in response to a perceived immediate crisis while overlooking longer-term repercussions (Patt and Zeckhauser 2002; Stern 2002–2003). The driving force seems to be the desire to take actions that will give the decision-maker the greatest and most immediate credit. The result of these forces is laws that may be superficially appealing, but which create many long-term problems and difficulties, a problem that has been referred to by one commentator as the “pathology of symbolic legislation” (Dwyer 1990).

Legislative or regulatory restrictions on science might be particularly prone to such contamination by extrinsic influences and short-term interests. Given that the potential benefits of any particular line of research are likely to be highly uncertain, far in the future, and dependent on highly complex, technical knowledge, there is unlikely to be much of a constituency among the general public for pursuing any particular line of research, with the possible exception of research with a direct and promising health benefit. On the other side of the balance, it is relatively easy for politicians to demagogue potentially controversial research, as exemplified by the late Senator Proxmire’s “Golden Fleece” award, which singled out dubious-sounding federal research projects for public ridicule, many of which actually turned out to be highly beneficial (Leshner 2003).

More recently, some members of Congress publicly attacked the funding of a series of peer-reviewed grants involving human sexuality by the National Institutes of Health (NIH), and it was subsequently disclosed that the list of allegedly objectionable studies had been compiled by the Traditional Values Coalition, a conservative advocacy group (Brainard 2003; Kaiser 2004). While the importance of democratic oversight cannot be denied, the highly partisan, symbolically charged, short-sighted and often superficial approach to issues demonstrated by some members of Congress suggests that legislatures may not be the best forum for making delicate and carefully delineated restrictions on science. Indeed, the long history of science and its suppression has demonstrated that mixing political ideology and scientific inquiry is a dangerous and counter-productive temptation that should be resisted (Baltimore 2005). As even Robert Sinsheimer, one of the most outspoken advocates of the need to put some regulatory restrictions on science, concedes, “Our experience with constraint upon science has hardly been encouraging. From the Inquisition to Lysenko such constraint has been the work of bigots and charlatans” (Sinsheimer 1979).

Alternatives to Legal Restrictions

Given the nature of scientific inquiry, attempts to prohibit specific types of research through legislation or regulation are prone to be both ineffective and counter-productive for the reasons discussed above. These factors do not compel the conclusion that science should never be restricted, or that society must resign itself to a grim future based on technological determinism. Rather, we simply suggest that attempts to prohibit science through legal instruments should be undertaken only in extreme cases involving the most severe and unavoidable risks, and even then the restrictions should be configured as narrowly as possible.

What then are the alternatives to legal restrictions of controversial science? It is important to first distinguish two primary motivations for attempting to prohibit science. The first motivation applies to “dual-use” research that has both constructive and destructive applications. This research in and of itself is not morally objectionable, but because the research results could potentially be used inadvertently or deliberately to harm society, some would argue that the research should not be conducted at all, even at the cost of sacrificing the potentially beneficial uses of the research. A prototypical example is research involving the smallpox vaccine, which could be used to better understand and protect against a future smallpox outbreak, but also could be used by terrorists to launch a more deadly bioterrorism attack.

The second motivation for prohibiting some science is that the particular line or area of scientific research and all its potential applications are morally objectionable to at least some citizens. Unlike dual-use research, the beneficial and objectionable aspects of the research overlap and cannot be separated. As is the case for many opinions based on moral views, there may be significant disagreement within the population on whether such research is moral or not. Examples include embryonic research that involves the destruction of human embryos, research into the relationship between race and intelligence, and research that involves the formation of human–animal chimeras.

The dynamic of the debate over these two main categories of controversial science is very different. For dual-use research, there is generally little disagreement that the beneficial uses of a technology are desirable and the destructive uses are not. The controversy is about the relative likelihood and weighing of the constructive and destructive applications, and how risk averse society should be toward the destructive applications. For morally objectionable research, on the other hand, the debate tends to focus instead on whether the very conduct of the research is moral or not. Of course these distinctions are not absolute, but they do suggest that different approaches may be needed for addressing the two categories of controversial research.

One obvious option for addressing dual-use research is that instead of trying to block the science altogether, society could seek to manage and control the undesirable applications of the knowledge while allowing the beneficial applications. In other words, society could regulate scientific knowledge in the same way other potentially harmful products such as automobiles, firearms and dynamite are regulated. As one prominent scientist recently wrote, “Although some controls on technology may be sensible, the research lab isn’t the place to put them: it’s just too difficult to predict which scientific discovery will later lead to good applications or bad ones…The scientist’s job is to shine light in the darkness, and if we occasionally burn our fingers on the candle, so be it” (Anderson 2006). The NRC report, Biotechnology Research in an Age of Terrorism, concluded that “the key issue” with respect to dual-use research “is whether the risks associated with misuse can be reduced while still enabling critical research to go forward” (National Research Council (NRC) 2003). The report provides a series of recommendations involving education, oversight, and reporting to achieve this objective. Nevertheless, there may be some technologies whose potential misuse would be so likely and pernicious that allowing the technology to be developed unfettered and then trying to control its misuse may not be acceptable. And of course, this approach will not work for morally objectionable research, where it is the research itself rather than the potential applications that is problematic.

Another approach for dual-use research is to restrict the publication of scientific results rather than prohibiting the science from being undertaken. For example, research on sequencing the genome of a pathogenic organism could be conducted and relevant findings reported without publishing the entire sequence of the genome, which could potentially be used by terrorists to create or modify the organism using the sequence data (Kurzweil and Joy 2005). This approach would allow society to still obtain the benefits of dual-use research, while minimizing the risks of maleficent uses of that same knowledge by restricting dissemination of the information to those with appropriate credentials and/or a need to know. Some research institutions, scientific societies and journals have already put in place policies to restrict publication in certain extreme circumstances (Bhattacharjee 2006; Congressional Research Service 2006), and the National Science Advisory Board for Biosecurity (NSABB) has been charged with recommending whether some types of federally funded research should be subject to publication restrictions (Fox 2005). The NSABB released a draft report in 2007 recommending that dual-use research “of concern” be reviewed prior to publication and that, in rare cases where the risks of disclosure exceed the benefits, the publication be restricted by either: (i) adding information that explains the context of the research; (ii) modifying the manuscript to delete sensitive material; (iii) delaying publication; (iv) limiting distribution of publications to qualified experts or those who “need to know”; or (v) foregoing any communication and publication (NSABB 2007).

In cases where managing the harmful applications or publication of scientific knowledge is not feasible or effective, including research that is morally objectionable in and of itself, tools other than legislative or regulatory bans may be more effective and flexible for discouraging certain lines of research from being conducted. Legal restrictions that allow the research to proceed but within certain limits, such as the protection of human subjects or research animals, are already in place to channel research in more morally acceptable directions, and indeed to prohibit the most reprehensible research. Various types of “informal constraints,” including peer pressure exerted through “unspoken rules” shared by a research community, the fear of social sanctions such as the potential for activist groups or journalists to stir up controversy, and the researcher’s own moral standards, all play a role in constraining controversial scientific research (Kempner et al. 2005). Another important tool is the adoption of consensus codes and guidelines (Jones 2007; Somerville and Atlas 2005). Self-regulation has the benefit of being more flexible and adaptive to changing or unforeseen contexts (Malakoff and Enserink 2003). Scientists are generally very cognizant of their professional reputation and standing among their peers, and responsible scientists are likely to adhere to codes and guidelines promulgated by credible scientific institutions such as the National Academy of Sciences (Epstein 2001; Goldberg 1994). Recent examples of such scientific self-regulatory initiatives include the code of practice recommended for stem cell research by the NRC, which discourages certain types of research such as putting human embryonic stem cells in primate brains (National Research Council (NRC) 2005), and the oversight framework for synthetic biology suggested by a group of leading scientific and industry experts in that field (Bugl et al. 2007). Such non-legalistic codes and guidelines can help create what Gerald Epstein describes as a “culture of responsibility” in the scientific community (Fox 2005).

The best-known example is the guidelines on recombinant DNA research that flowed out of the Asilomar conference. Scientists may be as likely, or more likely, to adhere to these types of self-regulatory restrictions as to legal prohibitions, which can be avoided by simply moving the research to another jurisdiction. In contrast, a scientist’s professional reputation would be damaged by failing to adhere to scientific norms regardless of where the research was done. Moreover, scientific guidelines and codes would likely be more credible to scientists, and would also be more flexible and adaptive in that they could be more readily amended to adjust to changing scientific information and circumstances than can legislation or regulation.

While scientific self-restraint depends in significant part on the education and ethical decisions of individual scientists, it also requires a community ethic and responsibility that goes beyond individual decision-making. Scientists must also be more alert to and willing to “make a fuss” when other scientists engage in ethically problematic research (Beckwith and Huang 2005). Moreover, while scientists should play a leading role in developing self-regulatory mechanisms, one of the most important lessons learned from the past three decades of scientific and technology regulation is that other interested parties and the public must also be involved (Frankel 2005). As Daniel Sarewitz observed:

We know that scientists negotiate not only with nature to advance knowledge, but with each other, with their funders, with politicians, corporate executives, various publics. We know that the directions and velocities of science reflect decisions made by people, and decisions emerge within a context. We know that context is strongly embodied by the institutions where science is conducted and planned. …[I]t seems unavoidable to me that responsibility must be located in the processes by which decisions about science are made and implemented in the institutions of public science, rather than in the motives and norms of individuals who conduct science (Sarewitz 2006).

One might argue that the community of research scientists has done an acceptable job in policing itself in such areas as human experimentation and the safety of recombinant DNA technology up until now, but that commercial interests are now too prevalent for self-regulation to continue to work well in the future. For example, some critics suggest that biotechnology companies do not and will not have significant incentives to regulate scientific research for ethical purposes, and that therefore the government will certainly have to step in to draw up and enforce rules for them (Fukuyama 2002). Further, it has been noted that researchers vary in their interpretations of what constitutes ethical research behavior, which has resulted in a call for the development of ethical guidelines by scientific societies to prevent scientific misconduct and the “normalization of deviance” by scientists (AAAS 2000). Other critiques include assertions of unethical behavior by entrepreneurs, biotechnology companies, and scientists in the face of conflicts of interest (Brockway and Furcht 2006) and the race to profitability (Krimsky 2003).

Notwithstanding such valid criticisms, however, it fortunately remains the case that most scientists in both industrial and academic settings are generally committed to doing “the right thing” and are increasingly aware that their treasured scientific freedom is contingent on acting in a responsible and socially acceptable manner (Royal Society and Wellcome Trust 2004; Wolpert 2007). Specifically, major scientific organizations and most individual scientists have recognized the importance of public trust and public risk perceptions, and have responded by engaging in discussions of how best to inform the public, how to make decisions regarding scientific research responsive to the desires and concerns of the public (AAAS 2007; Committee on Science, Engineering, and Public Policy 1995), and which types of scientific misconduct should be punishable (Korenman et al. 1998). An evaluation of focus group responses has suggested that scientists value honesty, integrity, service, sharing, openness, mentoring, and meticulous work habits. The data also suggested that participating scientists perceived that the greatest harm from public disclosure of scientific misconduct is a loss of public trust and funding (Wenger et al. 1997). Generally, there is a strong awareness among scientists that their freedom from content-based regulation and restrictions depends in large part on their own actions in ensuring that they and their colleagues proceed in an ethical and socially responsible manner (Atkinson 1978), an awareness which has resulted in growing interest and support among many scientific leaders and societies for scientific codes of ethics in an attempt to foster research integrity (Atlas 2002; Somerville and Atlas 2005). It is on this simple realization that society’s best hopes and expectations may have to rest.