The Gettier problem has stymied epistemologists. But, whether or not this problem is resolvable, we still must face an important question: Why does the Gettier problem arise in the first place? So far, philosophers have seen it as either a problem peculiar to the concept of knowledge, or else an instance of a general problem about conceptual analysis. But I would like to steer a middle course. I argue that the Gettier problem arises because knowledge is a thick concept, and a Gettier-like problem is just what we should expect from attempts at analyzing a thick concept. Section 2 is devoted to establishing the controversial claim that knowledge is thick, and, in Section 3, I show that there is a general problem for analyzing thick concepts of which the Gettier problem is a special instance. I do not take a stand on whether the Gettier problem, or its general counterpart, is resolvable. My primary aim is to bring these problems into better focus.
Anonymized reflection was employed as an innovative way of teaching ethics in order to enhance students' ability in ethical decision making during a 'Care of the Dying Patient and Family' module. Both qualitative and quantitative data were collected from the first two student cohorts who experienced anonymized reflection (n = 24). The themes identified were the richness and relevance of scenarios, small-group work, and a team approach to teaching. Students indicated that they preferred this style of teaching. This finding was verified by a postal questionnaire conducted four months later. The conclusions drawn from this study suggest that anonymized reflection is an effective method for teaching ethics to nurses and that learning about ethical issues in this way reduces uncertainties.
Ethicists are typically willing to grant that thick terms (e.g. 'courageous' and 'murder') are somehow associated with evaluations. But they tend to disagree about what exactly this relationship is. Does a thick term's evaluation come by way of its semantic content? Or is the evaluation pragmatically associated with the thick term (e.g. via conversational implicature)? In this paper, I argue that thick terms are semantically associated with evaluations. In particular, I argue that many thick concepts (if not all) conceptually entail evaluative contents. The Semantic View has a number of outspoken critics, but I shall limit discussion to the most recent, Pekka Väyrynen, who believes that objectionable thick concepts present a problem for the Semantic View. After advancing my positive argument in favor of the Semantic View (section II), I argue that Väyrynen's attack is unsuccessful (section III). One reason ethicists cite for not focusing on thick concepts is that such concepts are supposedly not semantically evaluative whereas traditional thin concepts (e.g. good and wrong) are. But if my view is correct, then this reason must be rejected.
The doctrine of penal substitution claims that it was good (or required) for God to punish in response to human sin, and that Christ received this punishment in our stead. I argue that this doctrine's central factual claim, that Christ was punished by God, is mistaken. In order to punish someone, one must at least believe the recipient is responsible for an offense. But God surely did not believe the innocent Christ was responsible for an offense, let alone the offense of human sin. So, the central factual claim is mistaken. In the final section, I show that this critique of penal substitution does not apply to the closely related Anselmian satisfaction theory.
Disclosure of financial interests in scientific research is the centerpiece of the new conflict of interest regulations issued by the U.S. Public Health Service and the National Science Foundation that became effective October 1, 1995. Several scientific journals have also established financial disclosure requirements for contributors. This paper measures the frequency of selected financial interests held among authors of certain types of scientific publications and assesses disclosure practices of authors. We examined 1105 university authors (first and last cited) from Massachusetts institutions whose 789 articles, published in 1992, appeared in 14 scientific and medical journals.
Despite the amount of public investment in nanotechnology ventures in the developed world, research shows that there is little public awareness of nanotechnology, and public knowledge is very limited. This is concerning given that nanotechnology has been heralded as 'revolutionising' the way we live. In this paper, we articulate why public engagement in debates about nanotechnology is important, drawing on the literature on public engagement, science policy debate, and deliberation about public policy development. We also explore the significance of timing in engaging the public, and we make some suggestions concerning how to effectively engage publics. Our conclusions indicate the significance of scientific researchers, policy makers and representative consumer groupings in public reasoning towards a better public policy framework for debate about technological development.
Feminist care theorists Virginia Held and Joan Tronto have suggested that care is relevant to political issues concerning distant others and that care can provide the basis for a more comprehensive moral approach. I consider their approaches with regard to the policy issue of military humanitarian intervention, and raise concerns about exceptionalist attitudes toward international law that entail a collection of costs that I refer to as "the problem of global worldlessness." I suggest that an ethic of care can overcome these concerns, and offer an Arendt-inflected rereading of some of Tronto's work to show how this is possible.
What is the significance of the wicked problems framework for environmental philosophy? In response to wicked problems, environmental scientists are starting to welcome the participation of social scientists, humanists, and the creative arts. We argue that the need for interdisciplinary approaches to wicked problems opens up a number of tasks that environmental philosophers have every right to undertake. The first task is for philosophers to explore new and promising ways of initiating philosophical research through conducting collaborative learning processes on environmental issues. The second task is for philosophers to recognize the value of philosophical skills in their engagements with members of other disciplines and walks of life in addressing wicked problems. The wicked problems framework should be seen as an important guide for facilitating philosophical research that is of relevance to problems like climate change and sustainable agriculture. Paul B. Thompson and Kyle Powys Whyte, Journal of Agricultural and Environmental Ethics, DOI 10.1007/s10806-011-9344-0.
As Paul B. Thompson suggests in his recent seminal paper, "'There's an App for That': Technical Standards and Commodification by Technological Means," technical standards restructure property (and other social) relations. He concludes with the claim that the development of technical standards of commodification can serve purposes with bad effects such as "the rise of the factory system and the deskilling of work" or progressive effects such as how "technical standards for animal welfare… discipline the unwanted consequences of market forces." In this reply, we want to append several points to his argument. We suggest that he rightly points out that standards can promote various goods; however, there are peculiar powers wielded by standardization processes that might profitably be unpacked more systematically than Thompson's article seems to suggest. First, the concealment of the technopolitics around standards is largely due to their peculiar ontological status as recipes for reality. Second, technical standards can and do commit violence against persons, but such violence is often suffered not in the formation of class consciousness, as Marx might have put it, but as a failure to conform to the laws of nature. Lawrence Busch and Kyle Powys Whyte, Philosophy & Technology, DOI 10.1007/s13347-011-0048-1.
Prediction markets are low-volume speculative markets whose prices offer informative forecasts on particular policy topics. Observers worry that traders may attempt to mislead decision makers by manipulating prices. We adapt a Kyle-style market microstructure model to this case, adding a manipulator with an additional quadratic preference regarding the price. In this model, when other traders are uncertain about the manipulator's target price, the mean target price has no effect on prices, and increases in the variance of the target price can increase average price accuracy, by increasing the returns to informed trading and thereby the incentives for traders to become informed.
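The two ingredients the abstract names, a linear Kyle-style price rule and a manipulator with a quadratic preference over the price, can be sketched concretely. The toy below is not the paper's equilibrium model: the helper names (`price`, `manipulator_utility`), the price-impact coefficient `lam`, the preference weight `c`, the target price, and the trade values are all illustrative assumptions.

```python
# Toy sketch of the abstract's setup (not the paper's equilibrium model).
# Assumed for illustration: lam (price impact), c (preference weight),
# and the particular values of v (true value), x (informed order), u (noise).

lam = 0.5  # market maker's linear price-impact coefficient
c = 1.0    # weight on the manipulator's quadratic price preference

def price(order_flow):
    """Kyle-style market maker: price is linear in net order flow."""
    return lam * order_flow

def manipulator_utility(m, v, x, u, target):
    """Trading profit plus a quadratic preference for price near the target."""
    p = price(x + u + m)
    return m * (v - p) - c * (p - target) ** 2

v, x, u = 1.0, 0.8, 0.2
# Grid-search the manipulator's best order given a target price of 2.0.
grid = [i / 1000 for i in range(-5000, 5001)]
best_m = max(grid, key=lambda m: manipulator_utility(m, v, x, u, target=2.0))
print(best_m)  # 1.333: above the purely profit-maximizing order (0.5 when c = 0)
```

With these assumed numbers the price preference pulls the manipulator's order beyond what private information alone would justify, which is exactly the distortion other traders must price in.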
P. Kyle Stanford (2000) attempts to offer a truth-linked explanation of the success of science which, he thinks, can be welcome to antirealists. He proposes an explanation of the success of a theory T1 in terms of its predictive similarity to the true theory T of the relevant domain. After raising some qualms about the supposed antirealist credentials of Stanford's account, I examine his explanatory story in some detail and show that it fails to offer a satisfactory explanation of the success of science.
The thought that there is a way to reconcile empiricism with a realist stance towards scientific theories, avoiding instrumentalism and without fearing that this will lead straight to metaphysics, seems very promising. This paper aims to articulate this thought. It consists of two parts. The first (sections 2 and 3) articulates how empiricism can go for scientific realism without metaphysical anxiety. It draws on the work of Moritz Schlick, Hans Reichenbach and Herbert Feigl to develop an indispensability argument for the adoption of the realist framework. This argument, unlike current realist arguments, has a pragmatic ring to it: there is no ultimate argument for the adoption of the realist framework. The guiding thought here is that fundamental ontic questions are not dealt with in the same way in which questions about the reality of ordinary entities (be they stones or electrons) are dealt with: the ontic framework must already be in place before questions about the reality of specific entities are raised. The second part (sections 4 and 5) articulates reasons for avoiding instrumentalism. Most space is devoted to offering reasons to refrain from adopting P. Kyle Stanford's (2006) neo-instrumentalism, a very sophisticated version of instrumentalism that seems to work within the realist framework and promises empiricists a way to avoid scientific realism. Scientific realism is alive and well because of Ti(a)na: there is (almost) no alternative. However, in section 6, it is argued that there is room for rapprochement between contextualist instrumentalism and scientific realism. The paper is accompanied by an appendix in which Reichenbach's argument for scientific realism is presented and discussed.
In this article, against the background of a notion of 'assembled' truth, the evolutionary progressiveness of a theory is suggested as a novel and promising explanation for the success of science. A new version of realism in science, referred to as 'naturalised realism', is outlined. Naturalised realism is 'fallibilist' in the unique sense that it captures and mimics the self-corrective core of scientific knowledge and its progress. It is argued that naturalised realism disarms Kyle Stanford's anti-realist 'new induction' threats by showing that 'explanationism' and his 'epistemic instrumentalism' are just two positions among many on a constantly evolving continuum of options between instrumentalism and full-blown realism. In particular, it is demonstrated that not only can naturalised realism redefine the terms of the realist debate in such a way that no talk of miracles need enter the debate, but it also promises interesting defenses against inductive- and underdetermination-based anti-realist arguments.
Introduction: Laughter as an expression of human nature in the Middle Ages and the early modern period: literary, historical, theological, philosophical, and psychological reflections -- Judith Hagen. Laughter in Procopius's wars -- Livnat Holtzman. "Does God really laugh?": appropriate and inappropriate descriptions of God in Islamic traditionalist theology -- Daniel F. Pigg. Laughter in Beowulf: ambiguity, ambivalence, and group identity formation -- Mark Burde. The parodia sacra problem and medieval comic studies -- Olga V. Trokhimenko. Women's laughter and gender politics in medieval conduct discourse -- Madelon Köhler-Busch. Pushing decorum: uneasy laughter in Heinrich von Dem Türlîn's Diu crône -- Connie L. Scarborough. Laughter and the comic in a religious text -- John Sewell. The son rebelled and so the father made man alone: ridicule and boundary maintenance in The Nizzahon vetus -- Birgit Wiedl. Laughing at the beast: the judensau: anti-Jewish propaganda and humor from the Middle Ages to the early modern period -- Fabian Alfie. Yes . . . but was it funny? Cecco Angiolieri, Rustico Filippi and Giovanni Boccaccio -- Nicolino Applauso. Curses and laughter in medieval Italian comic poetry -- Feargal Béarra. Tromdhámh guaire: a context for laughter and audience in early modern Ireland -- Jean E. Jost. Humorous transgression in the non-conformist fabliaux: a Bakhtinian analysis of three comic tales -- Gretchen Mieszkowski. Chaucerian comedy: Troilus and Criseyde -- Sarah Gordon. Laughing and eating in the fabliaux -- Christine Bousquet-Labouérie. Laughter and medieval stalls -- Scott L. Taylor. Esoteric humor and the incommensurability of laughter -- Jean N. Goodrich. The function of laughter in The second shepherds' play -- Albrecht Classen. Laughing in late-medieval verse and prose narratives -- Rosa Alvarez Perez. The workings of desire: Panurge and the dogs -- Elizabeth Chesney Zegura.
Laughing out loud in the Heptaméron: a reassessment of Marguerite de Navarre's ambivalent humor -- Lia B. Ross. You had to be there: the elusive humor of the Sottie -- Kyle Diroberto. Sacred parody in Robert Greene's Groatsworth of wit -- Martha Moffitt Peacock. The comedy of the shrew: theorizing humor in early modern Netherlandish art -- Jessica Tvordi. The comic personas of Milton's Prolusion VI: negotiating masculine identity through self-directed humor -- John Alexander. Ridentum dicere verum (using laughter to speak the truth): laughter and the language of the early modern clown "pickelhering" in German literature of the late seventeenth century (1675-1700) -- Thomas Willard. Andreae's ludibrium: Menippean satire in The chymische hochzeit -- Diane Rudall. The comic power of illusion-allusion -- Allison P. Coudert. Laughing at credulity and superstition in the long eighteenth century.
A good Christian can be a good liberal, and perhaps should be, because liberalism is the political theory most consistent with the biblical mandate concerning the role of the state and its officers. The argument for this is made in terms that any good Christian should find acceptable, and then two policy implications are briefly discussed.
John Hare has proposed "prescriptive realism" in an attempt to stake out a middle-ground position in the twentieth-century Anglo-American metaethical debates between substantive moral realists and antirealist-expressivists. The account is supposed to preserve both the normativity and the objectivity of moral judgments. Hare defends a version of divine command theory. The proposal succeeds in establishing the middle-ground position Hare intended. However, I argue that prescriptive realism can be strengthened in an interesting way.
In this paper, we examine Shaun Gallagher's project of "naturalizing" phenomenology with the cognitive sciences: front-loaded phenomenology (FLP). While we think it is a productive proposal, we argue that Gallagher does not employ genetic phenomenological methods in his execution of FLP. We show that without such methods, FLP's attempt to locate neurological correlates of conscious experience is not yet adequate. We demonstrate this by analyzing Gallagher's critique of cognitive neuropsychologist Christopher Frith's functional explanation of schizophrenic symptoms. In "constraining" Gallagher's FLP program, we discuss what genetic phenomenological method is and why FLP ought to embrace it. We also indicate what types of structures a genetically modified FLP will consider, and how such an approach would affect the manner in which potential neurological correlates of conscious experience are conceptually understood and experimentally investigated.
Skeptical theists argue that no seemingly unjustified evil (SUE) could ever lower the probability of God's existence at all. Why? Because God might have justifying reasons for allowing such evils (JuffREs) that are undetectable. However, skeptical theists are unclear regarding whether or not God's existence is relevant to the existence of JuffREs, and whether or not God's existence is relevant to their detectability. But I will argue that, no matter how the skeptical theist answers these questions, it is undeniable that the skeptical theist is wrong; SUEs lower the probability of God's existence. To establish this, I will consider the four scenarios regarding the relevance of God's existence to the existence and detectability of JuffREs, and show that in each, after we establish our initial probabilities and then update them given the evidence of a SUE, the probability of God's existence drops.
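The updating the abstract describes is an application of Bayes' rule. The sketch below is a minimal numerical illustration; the prior and both likelihoods are assumed values (not from the paper), chosen only to satisfy the structural premise that a SUE is at least somewhat less expected given God's existence than given God's nonexistence.

```python
# Minimal Bayesian sketch: observing a seemingly unjustified evil (SUE)
# lowers P(God) whenever a SUE is less likely given theism than given atheism.
# All numbers below are illustrative assumptions, not values from the paper.

def update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Bayes' rule: posterior probability of hypothesis h after the evidence."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

prior_god = 0.5
# Even if God might have undetectable justifying reasons (JuffREs),
# a SUE is assumed here to be somewhat less expected on theism.
p_sue_given_god = 0.3     # assumed likelihood
p_sue_given_no_god = 0.9  # assumed likelihood

posterior = update(prior_god, p_sue_given_god, p_sue_given_no_god)
print(round(posterior, 3))  # 0.25: down from the 0.5 prior
```

On these assumed numbers the probability drops from 0.5 to 0.25; any pair of likelihoods with the first smaller than the second yields some drop, which is the structural point at issue between the author and the skeptical theist.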
This paper revisits Herbert Kliebard's figure of John Dewey in Kliebard's The Struggle for the American Curriculum. The paper argues that, while there are indeed reasons for the disembodied picture of Dewey that emerges from Struggle, such figuration ultimately has an effect that is overly reproductive: it ignores Dewey's efforts to live within and across institutional boundaries so as to reconstruct the practices and interests of the society in which he lived. Using the work of Bakhtin and Dewey, I argue that it is only by such a Deweyan engagement that our own voices will ultimately be able to "ring" or "sound" in novel and potentially radical ways.
We need a better theory of movement. The present theories harbor stipulations and give little traction on understanding why movement has the properties it does. A presently popular theory of movement has the following ingredients.
There is a serious flaw in The Terminator which pretty much ruins the storyline. The problem concerns Kyle Reese, who must enter the time-displacement equipment in the future, sometime after the Terminator has already entered it. We call this the "Bad Timing Problem".
This article reviews some of the technological devices and ideas which have been used over the years to answer the question, how does the brain work? It describes some of the early technology-based analogies and models of nerve fibers, and then discusses other analogies and models of the brain based on mechanical and electrical technologies. There are also short sections on cybernetics, telephone exchanges, and computers. Although all of these ideas are flawed to some extent, this article offers a brief argument on the usefulness of analogy and abstraction in brain science.
Executive response functions can be affected by preceding events, even if those events are no longer associated with the current task at hand. For example, studies utilizing the stop signal task have reported slower response times to 'GO' stimuli when the preceding trial involved the presentation of a 'STOP' signal. However, the neural mechanisms that underlie this behavioral after-effect are unclear. To address this, behavioral and electroencephalography (EEG) measures were examined in 18 young adults (18-30 yrs) on 'GO' trials following a previously 'Successful Inhibition' trial (pSI), a previously 'Failed Inhibition' trial (pFI), and a previous 'GO' trial (pGO). As in previous research, slower response times were observed during both pSI and pFI trials (i.e., 'GO' trials preceded by a successful and an unsuccessful inhibition trial, respectively) compared to pGO trials (i.e., 'GO' trials preceded by another 'GO' trial). Interestingly, response time slowing was greater during pSI trials compared to pFI trials, suggesting executive control is influenced by both task-set switching and persisting motor inhibition processes. Follow-up behavioral analyses indicated that these effects resulted from between-trial control adjustments rather than repetition priming effects. Analyses of inter-electrode coherence (IEC) and inter-trial coherence (ITC) indicated that both pSI and pFI trials showed greater phase synchrony during the inter-trial interval compared to pGO trials. Unlike the IEC findings, differential ITC was present within the beta and alpha frequency bands in line with the observed behavior (pSI > pFI > pGO), suggestive of more consistent phase synchrony involving motor inhibition processes during the ITI at a regional level. These findings suggest that between-trial control adjustments involved with task-set switching and motor inhibition processes influence subsequent performance, providing new insights into the dynamic nature of executive control.
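For readers unfamiliar with the ITC measure used above: inter-trial coherence is the magnitude of the mean unit-length phase vector across trials at a given frequency and time point, ranging from 0 (random phases) to 1 (perfect phase locking). The sketch below uses synthetic phases rather than real EEG; in actual pipelines, per-trial phases are typically extracted with wavelet or Hilbert transforms.

```python
# Minimal sketch of inter-trial coherence (ITC) on synthetic phase data.
# The trial counts and jitter values below are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

def inter_trial_coherence(phases):
    """ITC = |mean over trials of exp(i*phase)|: 0 = random, 1 = perfect locking."""
    return float(np.abs(np.mean(np.exp(1j * np.asarray(phases)))))

n_trials = 18  # echoing the study's sample size, purely for flavor
# Phase-locked condition: phases cluster near pi/4 with small jitter.
locked = rng.normal(np.pi / 4, 0.2, n_trials)
# Unlocked condition: phases drawn uniformly on the circle.
unlocked = rng.uniform(-np.pi, np.pi, n_trials)

print(inter_trial_coherence(locked))    # close to 1
print(inter_trial_coherence(unlocked))  # typically much smaller
```

Greater ITC during the inter-trial interval, as reported for pSI and pFI trials, means the oscillatory phase at a given latency was more consistent from trial to trial, not that amplitude was larger.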
In the concluding chapter of Exceeding our Grasp Kyle Stanford outlines a positive response to the central issue raised brilliantly by his book, the problem of unconceived alternatives. This response, called "epistemic instrumentalism", relies on a distinction between instrumental and literal belief. We examine this distinction and with it the viability of Stanford's instrumentalism, which may well be another case of exceeding our grasp.
Does science successfully uncover the deep structure of the natural world? Or are the depths forever beyond our epistemic grasp? Since the decline of logical positivism and logical empiricism, scientific realism has become the consensus view: of course our scientific theories apprehend the deep structure of the world. What else could explain the remarkable success of science? This is the explanationist defense of scientific realism, the "ultimate argument." Kyle Stanford starts here and, using the history of theorizing about biological inheritance as his case study, constructs a convincing argument against the realist consensus in his thought-provoking book, Exceeding Our Grasp. Here I will review the core of Stanford's new argument for instrumentalism (§ 1) and discuss his considered view of theoretical science (§ 2).
In our paper, 'Escaping hell: divine motivation and the problem of hell', we defended a theory of hell that we called 'escapism'. We argued that, given God's just and loving character, it would be most rational for God to maintain an open door policy to those who are in hell, allowing them an unlimited number of chances to be reconciled with God and enjoy communion with God. In this paper we reply to two recent objections to our original paper. The first is an argument from religious luck offered by Rusty Jones. The second is an argument from Kyle Swan alleging that our commitments about the nature of reasons for action still leave escapism vulnerable to an objection we labeled the 'Job objection' in our original paper. We argue that escapism has the resources built into it needed to withstand the objections from Jones and Swan.
Kyle Stanford’s arguments against scientific realism are assessed, with a focus on the underdetermination of theory by evidence. I argue that discussions of underdetermination have neglected a possible symmetry which may ameliorate the situation.
Kyle Stanford has recently claimed to offer a new challenge to scientific realism. Taking his inspiration from the familiar Pessimistic Induction (PI), Stanford proposes a New Induction (NI). Contra Anjan Chakravartty's suggestion that the NI is a 'red herring', I argue that it reveals something deep and important about science. The Problem of Unconceived Alternatives, which lies at the heart of the NI, yields a richer anti-realism than the PI. It explains why science falls short when it falls short, and so it might figure in the most coherent account of scientific practice. However, this best account will be antirealist in some respects and about some theories. It will not be a sweeping antirealism about all or most of science.
The problem of underdetermination is thought to hold important lessons for philosophy of science. Yet, as Kyle Stanford has recently argued, typical treatments of it offer only restatements of familiar philosophical problems. Following suggestions in Duhem and Sklar, Stanford calls for a New Induction from the history of science. It will provide proof, he thinks, of "the kind of underdetermination that the history of science reveals to be a distinctive and genuine threat to even our best scientific theories" (Stanford 2001, p. S12). This paper examines Stanford's New Induction and argues that it – like the other forms of underdetermination that he criticizes – merely recapitulates familiar philosophical conundra.
Kyle Stanford (2006) argues that the most serious and powerful challenge to scientific realism has been neglected. The problem of unconceived alternatives (PUA), as he calls it, holds that throughout history scientists have failed to conceive of alternative theories roughly equally well-confirmed (by the available evidence) to the theories of the day and, crucially, that such alternatives eventually were conceived and adopted by some section of the scientific community. PUA is a version of the argument from the underdetermination of theories by evidence (UTE) but departs from it in two significant ways: (i) there is a shift from artificially produced rival theories - of the kind typically talked about in the underdetermination debate - to actual rivals and (ii) there is a shift from empirically equivalent rivals to rivals that are equally well-confirmed by the available evidence at a given point in time. In this talk I will argue that by these shifts Stanford successfully manages to find more historical evidence for PUA (than do proponents of UTE), but only at the expense of making his thesis ineffectual.
Increasingly, epistemologists are becoming interested in social structures and their effect on epistemic enterprises, but little attention has been paid to the proper distribution of experimental results among scientists. This paper will analyze a model first suggested by two economists, which nicely captures one type of learning situation faced by scientists. The results of a computer simulation study of this model provide two interesting conclusions. First, in some contexts, a community of scientists is, as a whole, more reliable when its members are less aware of their colleagues' experimental results. Second, there is a robust tradeoff between the reliability of a community and the speed with which it reaches a correct conclusion.
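The learning situation this abstract describes can be sketched as a two-armed bandit played on a communication network, in the spirit of the economists' model the paper analyzes. The sketch below is a minimal reconstruction under stated assumptions: the payoff probabilities, the two network layouts, the myopic choice rule, and all parameter values are illustrative, not the paper's calibration.

```python
# Minimal sketch (illustrative assumptions throughout) of a networked bandit
# model of a scientific community: agents repeatedly choose between an old and
# a slightly better new method, keep beta-style success/failure counts, and
# update on their own results plus those of their network neighbors.
import random

def simulate(neighbors, p_old=0.5, p_new=0.55, rounds=300, trials=10, seed=1):
    """Return True if every agent ends up favoring the objectively better arm."""
    rng = random.Random(seed)
    n = len(neighbors)
    # Per agent: [successes, failures] pseudo-counts for arm 0 (old), arm 1 (new).
    beliefs = [[[1, 1], [1, 1]] for _ in range(n)]
    for _ in range(rounds):
        results = []
        for i in range(n):
            a, b = beliefs[i]
            # Myopic rule: play the arm with the higher estimated success rate.
            arm = 0 if a[0] / sum(a) >= b[0] / sum(b) else 1
            p = p_old if arm == 0 else p_new
            succ = sum(rng.random() < p for _ in range(trials))
            results.append((arm, succ))
        for i in range(n):
            # Update on one's own result and on each neighbor's result.
            for j in [i] + list(neighbors[i]):
                arm, succ = results[j]
                beliefs[i][arm][0] += succ
                beliefs[i][arm][1] += trials - succ
    def prefers_new(i):
        a, b = beliefs[i]
        return b[0] / sum(b) > a[0] / sum(a)
    return all(prefers_new(i) for i in range(n))

# Complete network (everyone sees everyone's results) vs. a sparse cycle.
complete = [[j for j in range(6) if j != i] for i in range(6)]
cycle = [[(i - 1) % 6, (i + 1) % 6] for i in range(6)]
```

Comparing `simulate(cycle, seed=s)` against `simulate(complete, seed=s)` over many seeds probes the abstract's first conclusion: a community whose members see fewer of their colleagues' results can be more reliable at settling on the better method, at the cost of converging more slowly.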
Sherri Roush and I have each argued independently that the most significant challenge to scientific realism arises from our inability to consider the full range of serious alternatives to a given hypothesis we seek to test, but we diverge significantly concerning the range of cases in which this problem becomes acute. Here I argue against Roush's further suggestion that the atomic hypothesis represents a case in which scientific ingenuity has enabled us to overcome the problem, showing how her general strategy is undermined by evidence I have already offered in support of what I have called the 'problem of unconceived alternatives'. I then go on to show why her strategy will not generally (if ever) allow us to formulate and test exhaustive spaces of hypotheses in cases of fundamental scientific theorizing.
The distinction between negative and positive liberty is familiar to political philosophers. The negative variety is freedom as noninterference. The positive variety is freedom as self-mastery. However, recently there has been an attempt on the part of a growing number of philosophers, historians, and legal scholars to recapture a third concept of political liberty uncovered from within the rich tradition of civic republicanism. Republican political liberty is freedom as nondomination. I argue that the features that distinguish it from noninterference and self-mastery highlight the theoretical and practical advantages of liberty as nondomination. It is, among these candidates, best suited to serve as the guiding principle for the State's basic institutions and rules. The principle says that the State should secure nondomination among its citizens.
Trust is a central concept in the philosophy of science. We highlight how trust is important in the wide variety of interactions between science and society. We claim that examining and clarifying the nature and role of trust (and distrust) in relations between science and society is one principal way in which the philosophy of science is socially relevant. We argue that philosophers of science should extend their efforts to develop normative conceptions of trust that can serve to facilitate trust between scientific experts and ordinary citizens. The first project is the development of a rich normative theory of expertise and experience that can explain why the various epistemic insights of diverse actors should be trusted in certain contexts and how credibility deficits can be bridged. The second project is the development of concepts that explain why, in certain cases, ordinary citizens may distrust science, which should inform how philosophers of science conceive of the formulation of science policy when conditions of distrust prevail. The third project is the analysis of cases of successful relations of trust between scientists and non-scientists that leads to understanding better how 'postnormal' science interactions are possible using trust.