1 Introduction

Among academics and commentators, there is a growing sense that public trust in science and scientists is at a low ebb.Footnote 1 While reports of the death of expertise (Nichols, 2017) seem to be greatly exaggerated,Footnote 2 it is undeniable that, in the United States (as well as in other countries), there are sizeable pockets of distrust in the scientific consensus on issues ranging from the contribution of human activity to climate changeFootnote 3 to the safety of vaccines.Footnote 4 Given the social and environmental ramifications of some of these issues, the question of public trust in science has gained a new urgency. If a sizeable proportion of our fellow citizens do not believe in the reality of anthropogenic climate change or take its possible consequences seriously, how can we expect them to support the sort of social, political, and economic reforms required to prevent the worst-case scenarios? If people do not trust vaccines to be safe, how can we expect them to have their children vaccinated?

To the extent that distrust in science is both unwarranted and socially harmful (as it seems to be in the two cases just mentioned),Footnote 5 it gives rise to what I call the problem of harmful distrust. In this paper, I discuss three general approaches to public trust in science and their proposed solutions to the problem of harmful distrust. In Sect. 2, I outline a general taxonomy of relationships of trust and use it to distinguish those three general approaches, which I call the individual approach, the semi-social approach, and the social approach. In Sects. 3 and 4, I critically examine, respectively, the individual approach and the semi-social approach. I argue that, despite their differences, both approaches embrace individualistic solutions to the problem of harmful distrust and that, as a result, both face two general problems, which I call the problem of overidealizing science and the problem of overburdening citizens. In Sect. 5, I argue that in order to avoid these problems we need to embrace a (thoroughly) social approach to public trust in science, one that emphasizes the social dimensions of the reception, transmission, and uptake of scientific knowledge in society and the ways in which social forces influence the trustworthiness of science, both positively and negatively. The social approach encourages social solutions to the problem of harmful distrust—in order to improve public trust in science, we need to improve what I call the socio-epistemic infrastructure of society.

2 Three Approaches to Public Trust in Science

In this section, I sketch a general taxonomy of relationships of trust (or distrust) based on the occupants of the trustor and trustee positions and use it to distinguish three approaches to public trust in science.Footnote 6 In what follows, I refer to the occupants of a trust relationship (i.e., ‘x (dis)trusts y’) as, respectively, the trustor and the trustee. So, for example, if Jane distrusts Joe, then Jane is the trustor and Joe is the trustee in that trust relationship.Footnote 7 For the sake of simplicity, I first focus only on relationships of trust that have individuals or groups as their relata. This gives rise to four kinds of trust relationships:

  1. individual-to-individual trust relationships (e.g., Jane might trust Donald Trump but distrust Hillary Clinton),

  2. individual-to-group trust relationships (e.g., Jane might distrust politicians as a group (even if she might still trust specific politicians)),

  3. group-to-individual trust relationships (e.g., evangelicals might trust Donald Trump but distrust Hillary Clinton), and

  4. group-to-group trust relationships (e.g., evangelicals might distrust politicians in general (even if not all evangelicals distrust all politicians)).

In addition to these four kinds of trust relationships, we can add trust relationships in which the trustee is an institution, which gives us two additional kinds of trust relationshipFootnote 8:

  5. individual-to-institution trust relationships (e.g., Jane might trust the Supreme Court but distrust Congress), and

  6. group-to-institution trust relationships (e.g., evangelicals might trust the Supreme Court but might distrust Congress).

In what follows, I use the label ‘individual trust relationships’ for trust relationships in which the trustor is an individual and ‘social trust relationships’ for those in which the trustor is a group. Also, I refer to trust relationships as ‘individual-oriented,’ ‘group-oriented,’ or ‘institution-oriented’ depending on the sort of entity that plays the role of the trustee in the relevant trust relationship. The taxonomy and its associated terminology are summarized in Table 1.

Table 1 A taxonomy of relationships of trust
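For readers who prefer to see the classification spelled out explicitly, the following minimal sketch (purely illustrative; the code and its labels are mine and are not part of the paper's presentation) encodes the two dimensions of the taxonomy and enumerates the six resulting kinds of trust relationships:

```python
# Illustrative sketch of the taxonomy (not from the paper): a trust relationship
# is classified by the kind of entity occupying the trustor position and the
# kind of entity occupying the trustee position.
from enum import Enum

class Trustor(Enum):
    INDIVIDUAL = "individual"    # 'individual trust relationships'
    GROUP = "group"              # 'social trust relationships'

class Trustee(Enum):
    INDIVIDUAL = "individual"    # 'individual-oriented'
    GROUP = "group"              # 'group-oriented'
    INSTITUTION = "institution"  # 'institution-oriented'

# The six kinds of trust relationships, in the order introduced in the text (cf. Table 1).
KINDS = [
    (Trustor.INDIVIDUAL, Trustee.INDIVIDUAL),   # 1. Jane trusts Trump
    (Trustor.INDIVIDUAL, Trustee.GROUP),        # 2. Jane distrusts politicians
    (Trustor.GROUP, Trustee.INDIVIDUAL),        # 3. evangelicals trust Trump
    (Trustor.GROUP, Trustee.GROUP),             # 4. evangelicals distrust politicians
    (Trustor.INDIVIDUAL, Trustee.INSTITUTION),  # 5. Jane trusts the Supreme Court
    (Trustor.GROUP, Trustee.INSTITUTION),       # 6. evangelicals trust the Supreme Court
]

for n, (trustor, trustee) in enumerate(KINDS, start=1):
    print(f"{n}. {trustor.value}-to-{trustee.value} trust relationship")
```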

Once we look at the question of public trust in science through the lens of the above taxonomy, it becomes apparent that it is possible to understand public trust in science in a number of different ways. Here, we can focus on two crucial distinctions. The first (and, for our purposes, most crucial) distinction is the one between approaches that understand public trust in science as, primarily, an individual trust relationship (i.e., the individual and the semi-social approaches) and approaches that conceive of it as, primarily, a social trust relationship (i.e., the social approach, which I defend in this paper). One of the main differences between these approaches is that, according to the individual and the semi-social approaches, social trust in science depends on (and is explained by) individual trust in science, while the reverse is true of the social approach. As a first approximation, the individual and the semi-social approaches maintain that certain societies (and certain social groups within them) (dis)trust science (or scientists) because their members tend to (dis)trust science (or scientists), while the social approach maintains that individuals tend to (dis)trust science (or scientists) because they belong to societies (or to specific social groups within them) in which (dis)trust in science is prevalent. One of the most important consequences of these two different understandings of public trust in science is that they suggest different solutions to the problem of harmful distrust. The individual approach and the semi-social approach tend to favor individualistic solutions to the problem of harmful distrust (i.e., solutions at the individual level), while the social approach tends to favor social solutions (i.e., solutions at the social level).

The second crucial distinction is the one between approaches that understand public trust in science as, primarily, individual-oriented (as does the individual approach) and those that understand it as, primarily, institution-oriented (as do the semi-social and the social approaches).Footnote 9 According to the individual approach, institution-oriented trust depends on (and is explained by) individual-oriented trust, while the reverse is true for the social and the semi-social approaches. As a first approximation, the individual approach maintains that people trust science only insofar as they trust individual scientists, while the semi-social and the social approaches maintain that people trust individual scientists only insofar as they trust science as an institution (and they see the individual scientist as acting on behalf of that institution).

Somewhat schematically, we can thus summarize the main differences between the three approaches as follows. The individual approach conceives of public trust in science as, primarily, an individual-to-individual trust relationship—i.e., as a trust relationship between individual citizens and individual scientists. The semi-social approach conceives of public trust in science as, primarily, an individual-to-institution trust relationship—i.e., a trust relationship between individual citizens and the institutions of science. Finally, the social approach understands it as, primarily, a group-to-institution trust relationship—i.e., a trust relationship between society in general (and certain social groups within society in particular) and the institutions of science. On each of these approaches, other forms of trust in science depend on (and can be explained in terms of) what the account takes to be the fundamental trust relationship. In the next three sections, I discuss each of these approaches in turn.

3 The Individual Approach

As we have seen in the previous section, the individual approach understands public trust in science as, primarily, a trust relationship between individual citizens and individual scientists. Perhaps the simplest version of the individual approach is what we might call the naïve account of public trust in science. According to the naïve account, ordinary citizens should trust scientists because (individual) scientists are disinterested and objective seekers of the truth who follow the scientific method, which is the most reliable way to establish the truth about the relevant domain. While, to my knowledge, nobody nowadays explicitly accepts (let alone defends) the naïve account, something along these lines still seems to be popular among scientists and the general (trustful) public.

One of the theses of this paper is that accounts of public trust in science tend to face two general problems, which I call, respectively, the problem of overidealizing science and the problem of overburdening citizens. The naïve account faces a particularly severe version of the problem of overidealizing science. This is because the naïve account relies on a picture of science which, nowadays, is almost universally rejected by science scholars, and which we might call the naïve view of science. Since the challenges to the naïve view of science are well-known, I only mention the three most relevant for our purposes here. The first challenge is that the task of explaining exactly what the scientific method is has proven exceptionally difficult, so difficult that today most philosophers of science seem to have given up entirely on the project of identifying a set of principles that are both sufficiently informative and sufficiently general to deserve the honorary title of ‘scientific method.’ The second challenge is that scientists who have access to the same body of empirical evidence often seem to disagree with each other about which theory or hypothesis is best supported by the available evidence. These disagreements seem to suggest that, even if there is such a thing as the scientific method, either it is not applied correctly by all scientists, or it does not yield univocal results. The third challenge is that the picture of individual scientists as objective and disinterested truth-seekers is, at best, highly unrealistic. Since the 1960s, a growing body of work in the history and sociology of science has clearly established that individual scientists are susceptible to the influence of non-epistemic values (such as foibles, idiosyncrasies, social prejudices, personal, political, or economic interests, or metaphysical or religious views) and that these non-epistemic values often affect their scientific decisions.Footnote 10 This third challenge seems to be particularly worrisome at a time when the direct and indirect influence of private funding on science is growing and it is increasingly common for scientists to operate under more or less open forms of conflict of interest.

In light of these challenges, supporters of the individual approach are likely to reject the naïve account (and the naïve view of science on which it relies). However, if they do so, they seem to run into the second of the general problems I mentioned above—i.e., the problem of overburdening citizens. If one accepts the individual approach without assuming that scientists are always and everywhere disinterested and objective truth-seekers who dispassionately apply the scientific method, then the relationship between scientists and ordinary citizens is similar to the relationships between other non-disinterested experts (e.g., mechanics or lawyers) and their clients, which tend to give rise to what Alexander Guerrero (2016) calls strategic expertise contexts. Let me call the resulting version of the individual approach the consultative account of public trust in science. The consultative account faces a number of problems that characterize strategic expertise contexts in general. Since these problems are well-known, I only mention two of them here. The first is the problem of how non-experts can (without acquiring a significant level of expertise themselves) distinguish the genuine experts on a certain topic from pseudo-experts, quacks, and charlatans; the second is the problem of how non-experts can (again, without acquiring a significant level of expertise) decide whom to believe in cases of disagreement among genuine experts.Footnote 11 While it is possible to find a number of attempts to mitigate the problems that arise from strategic expertise contexts, most of these attempts seem to be based on unrealistic expectations about the (epistemic and non-epistemic) resources that ordinary citizens are both willing and able to devote to ascertaining who the genuine experts are or whom to believe among disagreeing experts.Footnote 12

The individual approach, thus, seems to be caught between two sets of unrealistic expectations. On the one hand, the naïve account seems to rely too heavily on the objectivity and disinterestedness of individual scientists and on the existence of a set of unambiguous norms that deserve the honorary title of ‘the scientific method’ (which is an instance of what I have called the problem of overidealizing science). On the other hand, the consultative account seems to rely too heavily on the limited resources of ordinary citizens to settle disagreements between (genuine or self-proclaimed) experts (which is an instance of the problem of overburdening citizens).

4 The Semi-social Approach

If the individual approach understands public trust in science as, primarily, based on a trust relationship between individual citizens and individual scientists, the semi-social approach understands it as, primarily, based on a trust relationship between individual citizens and science.

The semi-social approach typically rejects the naïve view of science in favor of what we might call a social view of science. The social view of science meets the challenges that beset the naïve view by adopting something along the lines of the following three-step strategy. The first step is to distinguish the goals of science from the goals of individual scientists. Even if individual scientists have their own personal goals (such as professional success), these goals can be harnessed in the pursuit of the goals of science. Scientists know that, insofar as they want to be professionally successful, they need to act so as to promote the goals of science as their own goals and to act in accordance with the ethical and epistemic norms of the scientific community.Footnote 13 The second step is to adopt a broadly pragmatic understanding of the goals of science. As a society, we might be content if science can provide us with reliable knowledge that can be used to pursue our personal and collective goals with a reasonable degree of success (i.e., with a greater degree of success than relying on common sense, intuitions, or divination).Footnote 14 The third step is to maintain that, insofar as there is anything that deserves the label ‘scientific method,’ it is best understood as a set of social norms that regulate the activities of the scientific community as a whole (rather than a set of normative principles to which individual scientists adhere). On this view, the scientific method can be understood as a social system of epistemic checks and balances whose function is to safeguard the reliability of scientific results. For example, scientific communities rely on mechanisms such as peer review or replication to deter unscrupulous scientists from trying to game the system, and they apply very severe sanctions to those scientists who are found to have violated these norms.

One of the most popular versions of the social view of science is the one proposed by Helen Longino.Footnote 15 According to Longino, a scientific community is objective to the extent to which it meets certain conditions, which include sharing a set of standards, displaying a sufficient level of social and epistemic diversity, granting comparable epistemic authority to all of its members in good standing (irrespective of their professional status or social identity), providing recognized avenues for criticism of theories, assumptions, and methods, and engaging with those criticisms (as opposed to merely ignoring them or side-lining their proponents) (see, in particular, Longino, 1990, chap. 4).

The semi-social approach to public trust in science builds on the social view of science by claiming that, while one cannot necessarily assume that individual scientists are disinterested and objective truth-seekers, the scientific community as a whole can be trusted to pursue the goals that society expects it to pursue (whether it is truth or some lesser goal, such as empirical adequacy). The most popular variant of the semi-social approach to public trust in science is what we might call the consensus account of public trust in science [which is more or less explicitly embraced by Oreskes (2019) and Anderson (2011)].Footnote 16 According to the consensus account, the consensus of the relevant scientific community over a certain hypothesis is a sufficient (and, possibly, necessary) condition for ordinary citizens to be justified in accepting (or rejecting) that hypothesis. For example, Anderson writes:

Science needs a balance of diverse inquirers to formulate and investigate a wide range of hypotheses, uncover a wide range of relevant evidence, and check one another’s biases […]. When the vast majority of diverse inquirers converge on certain conclusions, as in evolutionary theory, a robust scientific consensus obtains. Before a consensus, the best course for laypersons is to suspend judgment. Once a consensus of trustworthy experts is consolidated, laypersons are well advised to accept the consensus even in the face of a handful of dissenting scientists, or a few instances of error or dishonesty among a few of the participants in the consensus. (Anderson, 2011, 149)

The consensus account explicitly formulates one of the implicit premises of an argument that is often deployed in the context of public debates about anthropogenic climate change [and whose use was spearheaded by Oreskes herself (Oreskes, 2004)]. The argument appeals to the fact that a vast majority of climate scientists believe in anthropogenic climate change, as evidenced by the fact that, according to an oft-quoted study, 97% of peer-reviewed climate science papers support anthropogenic climate change (Cook et al., 2013). That argument, however, does not provide us with any explicit reason to think that the consensus of the relevant scientific community over the anthropogenicity of climate change gives ordinary citizens reasons to accept that hypothesis and, on the face of it, it is not obvious that it does. After all, as Stephen John (2018, 76–77) notes, the fact that astrologers unanimously believe the hypothesis that a Jupiter-Saturn conjunction ushers in the Age of Aquarius does not seem to be a reason to accept that hypothesis. According to the consensus account, however, the missing premise is that, given the system of epistemic checks and balances implemented by the community of climate scientists, the fact that that community has reached a nearly unanimous consensus over a certain hypothesis after subjecting it (and its main alternatives) to critical examination gives ordinary citizens a strong (if defeasible) reason to accept that hypothesis. In contrast, the astrological community does not seem to have any system of epistemic checks and balances in place.Footnote 17

On the face of it, the consensus account seems to avoid the problems that beset the individual approach. On the one hand, it seems to avoid the problem of overidealizing science by adopting a more realistic view of science. On the other hand, it seems to sidestep the problem of overburdening citizens by relying on a seemingly simple criterion (i.e., the consensus of the relevant scientific community) for lay acceptance of scientific hypotheses.

On closer scrutiny, however, the consensus account turns out to be vulnerable to different versions of the same problems. Like the naïve account, it, too, seems to presuppose an excessively idealized picture of the workings of real-world scientific communities (which gives rise to a version of the problem of overidealizing science) and, like the consultative account, it, too, seems to rely excessively on the resources of ordinary citizens (which gives rise to a version of the problem of overburdening citizens). The relevant version of the problem of overidealizing science is that it is not clear how closely real-world scientific communities approximate the ideal of an objective scientific community that underlies social views of science such as Longino’s.Footnote 18 Just to pick one facet of this problem, consider cases of what we might call vicious consensus. The history of science shows that scientific communities can achieve a significant level of consensus over false hypotheses, a consensus that seems to be based more on shared prejudices and biases than on the empirical support for those hypotheses. A classic example of vicious consensus is the broad consensus among 19th-century physical anthropologists over the hypothesis that there are innate differences in intelligence between members of different “races.”Footnote 19 Cases of vicious consensus seem to show that the prevalence of prejudices and biases in a scientific community can severely inhibit the effectiveness of the self-correcting mechanisms presupposed by the social view of science. Given that all sorts of implicit or explicit prejudices and biases (including racist, sexist, and classist prejudices) are still widespread in contemporary societies, it is unlikely that our scientific communities are completely immune from their influence.Footnote 20

Unsurprisingly, Oreskes, who is an accomplished historian of science, anticipates this objection. Her response to it is that ‘in all cases [of vicious consensus], there was significant, important and empirically informed dissent within the scientific community’ (Oreskes, 2019, 128; emphasis in the original). Even if, for the sake of the argument, we grant the truth of Oreskes’ broad claim, this response seems to give rise to a dilemma. Either the consensus account adopts unanimity as the criterion of lay acceptance, or it adopts a less stringent criterion. If it adopts unanimity, then it is likely that this criterion will rarely be met even in cases of virtuous consensus, as scientific communities rarely achieve full unanimity over any interesting scientific hypothesis. Moreover, from the perspective of social views of science, this lack of unanimity is to be encouraged, as scientific disagreement is what fuels the self-correcting mechanisms of science. Finally, it is not obvious that unanimity would actually help to foster trust in science, as outsiders might worry that the apparent consensus is based on the suppression of dissent.Footnote 21 However, if the consensus account adopts a less stringent criterion for lay acceptance, then Oreskes’ response fails to distinguish between vicious and virtuous consensus, as, in both cases, the consensus is likely to be less-than-unanimous. Advocates of the consensus account would therefore have to identify additional criteria to distinguish cases of (less-than-unanimous) virtuous consensus from cases of (less-than-unanimous) vicious consensus, and it is unclear what these criteria might be (see, e.g., de Melo-Martín & Intemann, 2018). Moreover, as Oreskes and her co-author Erik Conway have abundantly demonstrated (Oreskes & Conway, 2010), even in cases of virtuous and near-unanimous consensus, it only takes a handful of highly motivated, unscrupulous, and outspoken scientists to mislead the general public into believing that a certain hypothesis is scientifically controversial.Footnote 22 Oreskes might reply that these cases are not cases of empirically informed dissent. However, the task of distinguishing cases of spurious dissent from the sort of genuine dissent that, according to the social view, drives the self-correcting mechanisms of science is far from simple (see, e.g., de Melo-Martín & Intemann, 2018). Moreover, even assuming, for the sake of the argument, that supporters of the consensus account could provide the general public with a set of more specific criteria to distinguish between virtuous and vicious consensus and between spurious and genuine dissent, the consensus account would still run into the problem of overburdening citizens. It seems unrealistic to expect ordinary citizens to have the (epistemic and non-epistemic) resources to apply any such hypothetical set of criteria to every single case ranging from the reality of anthropogenic climate change to the safety of vaccines.

It is important to note that these two problems are not specific to Oreskes’ particular version of the consensus account—they seem to be a consequence of two general features of the semi-social approach. The semi-social approach seems to presuppose that the problem of harmful distrust is primarily the problem of persuading individual citizens that science is a trustworthy institution. However, there seem to be two general problems with this. The first is that it is far from obvious that science as it is currently practiced is fully trustworthy (which is, once again, an instance of the general problem of overidealizing science). Just to focus on one facet of this problem for illustrative purposes, it seems that science’s current system of external and internal incentives is not always best suited to promoting the goals that society expects it to achieve. As far as external incentives are concerned, science is increasingly reliant on private sources of funding (whether directly or indirectly).Footnote 23 A field in which the pernicious effects of this reliance on private funding are particularly obvious is biomedical research. As Jim Brown (2017) has argued, private funding influences biomedical research negatively in two ways—it biases it (as it creates a conflict of interest for the researchers, who are funded by the very corporations that stand to gain or lose financially from their research),Footnote 24 and it skews it (as it promotes certain research directions (such as the development of new patentable drugs) over others (such as research on off-label uses of old drugs, non-pharmaceutical interventions, or lifestyle changes) on the basis of considerations of profitability). The influence of private funding on biomedical research has resulted in a number of notorious scandals that have contributed to tarnishing the public’s perception of the trustworthiness of that research in particular and of science in general.Footnote 25

The internal incentives of scientific communities, however, can also contribute to skewing and biasing their research. Consider, for example, the replication crisis that has recently engulfed social psychology. The crisis originated with a large-scale attempt to replicate some of the most influential results in social psychology, which failed to replicate most of the original findings.Footnote 26 Of the many factors that are likely to have contributed to the crisis, one illustrates particularly clearly both the biasing and skewing effects of the internal incentives (and disincentives) adopted by the community. Scientific communities tend to have a preference for surprising results, which means that such results are more likely to get published and cited, with all that this entails in terms of professional prestige and academic careers. This contributes both to biasing research (as researchers have an incentive to actively search the data for surprising results that might, in fact, be just an artefact of poor experimental design or a statistical fluke) and to skewing it (as replication studies are considered much less prestigious than original studies).Footnote 27 The replicability of scientific results is considered a cornerstone of the scientific method. However, replicability without replication is unlikely to serve the self-correcting function it is supposed to serve, and it is unlikely that the scientific community as it is currently organized will achieve the optimal balance between finding surprising results and probing the robustness and replicability of those results.Footnote 28
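To make vivid how a community-wide preference for surprising results can degrade the reliability of the published record, consider the following minimal simulation. It is a sketch of the standard publication-bias argument rather than anything drawn from the studies cited above, and the prior, power, and significance level are illustrative assumptions:

```python
# Illustrative simulation (not from the paper): how a preference for
# "surprising" (statistically significant) results can inflate the share of
# false positives among published findings. All numbers are assumptions
# chosen for illustration.
import random

random.seed(0)

N_STUDIES = 10_000
PRIOR_TRUE = 0.1   # assume only 10% of tested hypotheses are actually true
POWER = 0.8        # probability of a significant result when the effect is real
ALPHA = 0.05       # probability of a significant result when there is no effect

published_true, published_false = 0, 0
for _ in range(N_STUDIES):
    effect_is_real = random.random() < PRIOR_TRUE
    significant = random.random() < (POWER if effect_is_real else ALPHA)
    if significant:  # only "surprising" results get published
        if effect_is_real:
            published_true += 1
        else:
            published_false += 1

total_published = published_true + published_false
print(f"Published findings: {total_published}")
print(f"Share of published findings that are false positives: "
      f"{published_false / total_published:.0%}")
# With these assumptions, roughly a third of published findings are false
# positives, even though every individual study used a 5% significance level.
```

The point of the sketch is simply that if publication tracks surprise rather than replication, a community can follow its official evidential norms at the level of each study while the published record as a whole drifts away from reliability.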

While the semi-social approach rightly emphasizes how the norms adopted by scientific communities can function as a system of epistemic checks and balances, it seems to largely ignore the fact that scientific communities do not operate fully independently from society and that the norms they currently adopt are not always best suited to the achievement of the goals of science. In particular, the semi-social approach fails to openly acknowledge that the trustworthiness of science as an institution is partly shaped by external and internal social forces and that these forces often limit the ability of science to self-regulate so as to be fully trustworthy. Moreover, by discounting the fact that science as it is currently practiced is not always fully trustworthy, the semi-social account fails to address some of the legitimate concerns that fuel distrust in science in certain sectors of the public. An adequate account of public trust in science should acknowledge that, as it is currently practiced, science is not always fully trustworthy and should develop proposals to improve the trustworthiness of science rather than simply presupposing it. As philosophers working on trust often note (see, e.g., Hardin, 2006), trust is not intrinsically good, as we should not fully trust those who are not fully trustworthy. While, overall, science might be more trustworthy than many other institutions, this does not mean that an account of public trust in science should ignore the ways in which its trustworthiness can be improved.

The second general problem with the semi-social approach stems from its oddly divided epistemology. The semi-social approach seems to presuppose that it is social epistemology for scientists but individualistic epistemology for everyone else. While it correctly emphasizes the social dimensions of the production of knowledge within the scientific community, it relies primarily on an inadequate individualistic picture of the reception, transmission, and uptake of that knowledge by society, a picture that places the epistemic burden almost exclusively on individual citizens (which, again, leads to an instance of the problem of overburdening citizens). As a result, the semi-social approach largely ignores the social dimensions of the reception, transmission, and uptake of that knowledge.Footnote 29

Let me focus on one specific facet to illustrate the general problem. It arises from the communication channels between science and the general public. Most people do not get their scientific information directly (e.g., from scientists or scientific institutions) but indirectly, through traditional media and, increasingly, through social media (see, e.g., Funk et al., 2019). This gives rise to a number of issues that the semi-social approach is ill-equipped to address. Let me mention one issue here, which concerns the effectiveness of trying to persuade citizens who are distrustful of science of its trustworthiness. The problem is that, to some extent, we all live in an epistemic bubble that largely determines what information (and, even more critically, misinformation) reaches us. This includes not only scientific (mis)information but also the very attempts to persuade distrustful citizens of the trustworthiness of science, which means that it is, at best, doubtful that such attempts will even reach their targets. Even worse, many distrustful citizens seem to be part of echo chambers that actively undermine the credibility of outside sources.Footnote 30 In light of these considerations, the sort of persuasion attempts advocated by the semi-social approach are likely to be ineffective (in the case of epistemic bubbles) or counterproductive (in the case of echo chambers).Footnote 31

In the next section, I argue that, in order to overcome these two general problems with the semi-social approach, we need to embrace a thoroughly social approach to public trust in science, one that understands both the production of scientific knowledge and its reception and uptake by society as thoroughly social and that sees science as completely embedded in society and dependent on it to improve its trustworthiness.

5 The Social Approach

In this section, I argue that, in order to avoid the problems faced by the individual and semi-social approaches, we need to adopt a (thoroughly) social approach to public trust in science. The outline of the social approach that I sketch here is admittedly incomplete and underdeveloped. The details will need to be worked out elsewhere (and, in some cases, they are already being worked out).Footnote 32 However, I hope that, in spite of its limitations, the outline that I sketch here can help lay the groundwork for a unified social approach to public trust in science that weaves together a number of strands of research on the social epistemology of science broadly construed, stimulate more work in this direction, and, hopefully, promote the development of different and more effective solutions to the problem of harmful distrust.

The social approach rejects the two assumptions that underlie the semi-social approach and that, I claimed, were at the root of its problems. First, the social approach rejects the divided epistemology presupposed by the semi-social approach in favor of a unified social epistemology, one that acknowledges the social dimensions of not only the production of scientific knowledge by the scientific community but also its reception, transmission, and uptake by the rest of society. According to the social approach, it is social epistemology for all. Second, the social approach rejects the semi-social approach’s implicit assumption that science is largely insulated from society and that it is already fully trustworthy as it is currently practiced. According to the social approach, science is fully embedded in society and, as a result, society can affect the trustworthiness of science both positively and negatively.

As we have seen in Sect. 2, the social approach conceives of public trust in science as, primarily, based on a trust relationship between society (and various social groups within it) and the scientific community. Of course, to say that trust in science is primarily based on group trust is not to deny that social groups trust science only insofar as a suitable proportion of their members trusts science. Rather, it is to deny that members of the group come to trust or distrust science independently of one another. One of the most distinctive characteristics of a social approach to public trust in science is that, while the individual and semi-social approaches conceive of trust in science as, primarily, the result of a rational and voluntary decision made by each individual citizen independently of others, the social approach conceives of individual trust in science as a habit that is neither fully voluntary nor fully rational and that is largely shaped by the attitudes towards science held by those in one’s social and epistemic network.

As we have seen, according to the individual and semi-social approaches, individual citizens should trust science because science is a reliable source of knowledge and, if they do not trust science, they need to be persuaded that science is trustworthy. On these accounts, the solution to the problem of harmful distrust requires educating individual citizens about science and persuading them to trust it. Even when the semi-social approach acknowledges the influence of social factors, it still sees them as, primarily, interfering with the otherwise rational and voluntary decisions of individual citizens.

The social approach, on the other hand, conceives of trust in science as, primarily, social, and it maintains that attempts to solve the problem of harmful distrust should focus primarily on social and structural changes rather than on interventions at the individual level. Consider, for example, the extreme case of someone (let’s call him Aaron) who grew up in a traditionalist religious community that denounces science as inimical to religion and as a gateway to atheism and who, as a result, is distrustful of science. For Aaron, distrust in science is a profoundly held attitude, and it is unlikely that people or institutions from outside Aaron’s community could persuade him to trust science. In fact, as we have seen, any efforts to persuade Aaron are likely to be ineffective if not counterproductive (if they even reach him). Of course, this does not mean that it is impossible for Aaron to bring himself to trust science. Rather, it means that, for Aaron, bringing himself to trust science would require a concerted and conscious effort that we cannot expect him to be motivated to make insofar as he is a full member of a community that is profoundly distrustful of science.

Admittedly, the case of Aaron might be an extreme case.Footnote 33 However, according to the social approach, the difference is a matter of degree—whether individuals trust (or distrust) science is, by and large, a function of the trust in science in their social environment and their epistemic network. This contention is indirectly supported by the evidence. For example, while science is a global institution, distrust in science seems to vary from country to country (see, e.g., Wellcome Global Monitor, 2018) and, within a specific country, from social group to social group (see, e.g., Funk et al., 2019). These variations suggest that public trust in science is largely a social and cultural phenomenon. Moreover, if one is part of a social group (or a society) that (dis)trusts science, a number of social and cognitive mechanisms are likely to contribute to entrenching that (dis)trust,Footnote 34 and it is difficult for external forces to change that.

But what does it mean for a society (or a social group) to trust science? I think that it is important to emphasize that a society that trusts science is not necessarily one whose members have a positive opinion of science or scientists. A society can trust science without most of its members having any explicit views about the trustworthiness of science or scientists—a society that trusts science is, first and foremost, a society that collectively relies on science to inform its actions and decisions (and those of its members). Public trust in science can be understood as a collective (non-doxastic) disposition (Kappel, 2014) or as a collective (unquestioning) attitude (Nguyen, forthcoming) toward science. Whenever the question of whether we should trust science arises, trust in science has usually already been eroded to some extent. On this view, full trust is (typically)Footnote 35 implicit trust. As Annette Baier aptly put it: ‘We inhabit a climate of trust as we inhabit an atmosphere and notice it as we notice air, only when it becomes scarce or polluted.’ (Baier, 1986, 234) Consider an analogy. If a friend invites you for dinner, you implicitly trust that they will not poison you. The question of whether they have poisoned your food should not even cross your mind. If the question were to genuinely cross your mind, that would be evidence that you do not fully trust your friend.

On the social account, a society that trusts science is not necessarily one in which the citizens have a positive opinion of science or scientists. But, if trust in science is not necessarily exemplified by the explicit opinions of citizens, what does it consist in? According to the social account, a society trusts science insofar as it exemplifies an efficient division of epistemic labor between scientists and ordinary citizens. As social epistemologists argue, a significant degree of division of epistemic labor is essential to the functioning of any society and, particularly, to that of contemporary societies. A society that exemplifies a perfectly efficient division of epistemic labor is one in which people rely on the knowledge of the relevant experts for their individual and collective decisions. They rely on the knowledge of mechanics to determine what’s wrong with their cars and on the knowledge of doctors to diagnose their medical conditions. A society that trusts science is one that collectively relies on the knowledge of climate scientists to settle the question of whether human activity is causing climate change and on the knowledge of biomedical researchers to settle the question of whether vaccines are safe.

Obviously, there are both theoretical and practical limits to how closely real-world societies can (or should) approximate a perfectly efficient division of epistemic labor. Let me mention two here. A first limit arises from the state of what we might call the socio-epistemic infrastructure of society. Many experts offer their services through the market, and this might give rise to conflicts of interest between experts and their clients (e.g., a mechanic might recommend a more expensive repair when a cheaper one would be equally effective, or a lawyer might recommend litigation even when settling would be in the client’s best interest). The extent to which a society can approximate a perfectly efficient division of epistemic labor is largely determined by the quality of its socio-epistemic infrastructure. The social approach has a very broad understanding of what counts as part of this infrastructure. For example, the social approach would include a country’s healthcare system in its socio-epistemic infrastructure, as many of the potential conflicts of interest that might arise in the relationship between doctors and their patients in the context of private or semi-private healthcare systems do not arise in the context of a fully public healthcare system. Similarly, a society in which scientific research is primarily funded privately is a society with a poor socio-epistemic infrastructure, one that tends to undermine trust in science.

A second limit concerns the role played by non-epistemic values in science. There is a growing consensus among philosophers of science that even scientific decisions that seem to be purely epistemic often require scientists to take non-epistemic values into account.Footnote 36 A standard example of this is inductive risk—i.e., the risk involved in accepting or rejecting a scientific hypothesis (see Rudner, 1953; Douglas, 2009). The idea is that, since scientific hypotheses are never conclusively verified or falsified, whenever scientists accept or reject a hypothesis, they should consider the non-epistemic consequences of a potential error (i.e., the non-epistemic consequences of accepting a false hypothesis or rejecting a true one). For example, if scientists incorrectly accept the hypothesis that the widely used herbicide glyphosate is carcinogenic to humans, this might result in an unnecessary ban on the substance, which might affect crop yields and food prices. If, on the other hand, they incorrectly reject the hypothesis, this might result in a higher incidence of certain cancers, which would result in unnecessary deaths and human suffering. When deciding how strict the epistemic standards for accepting or rejecting the hypothesis that glyphosate is a human carcinogen should be, scientists should therefore weigh the potential consequences of error. It seems that, as a society, we should try to ensure that the values employed in this process are representative of those held by society at large (see, e.g., Elliott, 2017; Schroeder, 2021). However, this seems to limit the extent to which we can have a perfectly efficient division of epistemic labor.
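The trade-off at the heart of the inductive risk argument can be stated schematically. The following is a sketch of the standard decision-theoretic rendering of inductive risk, not a formula from the paper, and the cost terms are illustrative placeholders:

```latex
% Schematic rendering of inductive risk (illustrative, not from the paper).
% Let \tau be the evidential threshold for accepting the hypothesis H that
% glyphosate is carcinogenic to humans. Raising \tau makes false acceptance
% less likely but false rejection more likely.
\[
  \mathbb{E}[\mathrm{cost}](\tau)
  \;=\; c_{\mathrm{FA}}\,\Pr(\text{accept } H \mid H\ \text{false};\,\tau)
  \;+\; c_{\mathrm{FR}}\,\Pr(\text{reject } H \mid H\ \text{true};\,\tau)
\]
% Here c_FA might include the economic costs of an unnecessary ban and c_FR
% the health costs of continued exposure to a carcinogen. Which non-epistemic
% values should fix c_FA and c_FR is precisely the question the inductive
% risk argument raises.
```

The schematic form makes the point in the text explicit: choosing the threshold is not a purely epistemic matter, since it requires weighing the two kinds of error against each other.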

In general, according to the social approach, the problem of harmful distrust is usually the result of the breakdown of an efficient division of epistemic labor caused by a degraded socio-epistemic infrastructure. On this view, the issue with the solutions to the problem of harmful distrust put forward by the individual and the semi-social accounts is not only that they have unrealistic expectations about the resources ordinary citizens are willing and able to devote to sorting through the scientific (mis)information they receive, or that they rely on an excessively idealized picture of the trustworthiness of science, but that their proposed solutions might actually contribute to the problem. According to the social approach, to expect ordinary citizens to do more than their basic epistemic due diligence on issues such as vaccine safety or climate change (as the individual and semi-social approaches suggest) is likely to have the unintended effect of further undermining the efficient division of epistemic labor. The division of epistemic labor exists exactly so that we do not have to “do our own research” on each and every topic that is relevant to our own good, or that of our family, our community, our society, or the world. It is practically impossible for each citizen to do their own research on such a vast range of topics and, even if it were not, it would be unlikely that most would reach warranted conclusions on the basis of such research. In fact, anecdotal evidence seems to suggest that people who “do their own research” are the ones who are more likely to reject the scientific consensus on specific topics. If a doctor prescribes you a drug, you should be able to assume that the drug is relatively effective, that it does not cause serious side effects, and that the specific batch you are taking has undergone a sufficient level of quality control to ensure that the right amount of the medicinal ingredient is present, etc. If you had to do your own research on each of these issues before taking a prescription drug, you would probably never end up taking the medicine.

On the social approach, a society that fully trusts science is one that trusts it implicitly. It is a society in which science informs public policy and public debate without becoming itself the object of public debate. Is this an unattainable ideal, like that of a perfectly efficient division of epistemic labor? I do not necessarily think so. While a lot of the philosophical literature on public trust in science focuses on cautionary tales, we can perhaps learn as much from comparing them to the success stories. A rare but very instructive success story is the one that led to the discontinuation of the production of chlorofluorocarbons (CFCs). CFCs were once widely used as refrigerants in products such as refrigerators and air-conditioning units as well as propellants in spray cans. However, in the mid-1970s, scientists started to realize that the chlorine released by the decomposition of CFCs in the stratosphere was rapidly depleting the stratospheric ozone responsible for absorbing some of the harmful radiation from the sun. After the scientists sounded the alarm, political leaders of 27 countries, realizing the urgency of the problem, signed the Montreal Protocol in 1987, which heavily regulated the production of CFCs. In the meantime, industry developed a number of substitutes for CFCs, including hydrofluorocarbons (HFCs), which are now commonly used, and the production of CFCs was largely phased out by the mid-1990s.

In the case of CFCs, the global community instantiated something very close to the ideal of a perfectly efficient division of epistemic labor—the scientific community was quick to raise the alarm about a serious and rapidly developing environmental problem, the global political community was quick to take collective political action on that problem, and the private sector was quick to find a technological solution to it. The difference between the case of CFCs and the case of climate change is striking. Why did the division of epistemic labor work so well in one case and so poorly in the other? On a superficial reading, the crucial difference between the two cases is the result of luck. In the case of CFCs, the affected industries were able to quickly identify a relatively inexpensive technical solution to the problem, while, in the case of global warming, no easy technical solution is in sight.

However, to see this merely as a matter of luck is to miss the more important lesson of comparing the success story with the cautionary tale, which is that, in the case of CFCs, private economic interests did not have a strong incentive to forge an alliance with broadcasters, commentators, and politicians to undermine public trust in the relevant science, as they did in the case of climate change.Footnote 37 The differences between success stories like that of the ban on CFCs and cautionary tales like that of climate change suggest that one of the main factors that disrupt a proper division of epistemic labor is the interference of private economic interests with the production, reception, and uptake of scientific knowledge. This hypothesis is supported by the fact that, in most cases of distrust, economic interests have played a crucial role in hindering the production, reception, and uptake of the relevant bits of scientific knowledge [as, for example, in the many other cases discussed in Oreskes and Conway (2010)]. If this hypothesis is correct, then it seems that part of the solution to the problem of harmful distrust requires breaking the anti-scientific alliance between corporate interests and broadcasters, commentators, and politicians. On the social account, the solution to the problem of harmful distrust might have more to do with public policies, such as anti-lobbying legislation and campaign finance reform, than with trying to persuade ordinary citizens of the trustworthiness of science.

One might argue that we should not overgeneralize on the basis of one success story and a handful of cautionary tales. After all, the objection goes, economic interests do not seem to play a role in all the standard cautionary tales. For example, one might argue that distrust of vaccines cannot be explained in terms of economic interests, as no one stands to benefit from lower vaccination rates. However, even leaving aside the issue of whether the leading figures of the anti-vaxx movement benefit from it,Footnote 38 it seems plausible to claim that economic interests have played a crucial, if less direct, role in distrust of vaccines, too. In particular, it is tempting to believe that public trust in biomedical research has been eroded by its exceedingly cozy relationship with the pharmaceutical industry and by the many scandals to which this relationship has given rise. This erosion of trust has provided fertile ground for distrust in vaccines to propagate.Footnote 39

According to the social approach, however, economic interests are not the only social force that can negatively affect the trustworthiness of science. Scientists and scientific institutions, for example, have also contributed to the distrust of science among members of specific communities and social groups (including women, people with disabilities, racial and ethnic minorities, gender non-conforming individuals, etc.) by perpetuating and validating prejudices against members of these communities and groups and by perpetrating injustices against them. Examples of this are the distrust towards the biomedical profession in sectors of the African-American community (see, e.g., Boulware et al., 2003) and the gay community (see, e.g., Hoyt et al., 2012). The distrust in these communities is largely rooted in a history of discrimination and injustice [perhaps best exemplified by the now infamous Tuskegee Study in the case of the African-American community (see, e.g., Brandt, 1978) and by the problematic response to the HIV/AIDS epidemic in the 1980s (see, e.g., France, 2016)]. In these cases, it seems that improving the social diversity of scientific communities and fostering constructive relationships with stakeholders might be a crucial step towards restoring trust in science among members of these groups and communities.

As these all-too-brief remarks suggest, according to the social approach, the problem of harmful distrust calls for primarily social solutions, not individualistic ones. According to the social approach, if we have to resort to persuading individuals to trust science, it is already too late—we already live in a society in which a proper division of epistemic labor has broken down due to a degraded socio-epistemic infrastructure and, unfortunately, it is extraordinarily difficult to unring the bell of scientific misinformation.Footnote 40 Remedying this situation requires implementing large-scale social and political reforms aimed at improving the socio-epistemic infrastructure as well as the (actual and perceived) trustworthiness of science, rather than trying to persuade individual citizens of the trustworthiness of science.

To be clear, the social approach does not necessarily reject all interventions at the individual level. However, it emphasizes that effective interventions at the individual level tend to be social in nature as well. For example, while the social approach insists that an adequate account of public trust in science should not rely too heavily on the epistemic and non-epistemic resources of individual citizens, it still supports efforts to improve the critical thinking skills and the scientific literacy of ordinary citizens. However, the social approach emphasizes that even measures that seem to target individuals should be understood as social. The scientific literacy and the critical thinking skills of individuals can only be improved in a society that is fully committed to providing all of its citizens with a high-quality education. The social approach understands the epistemic resources of ordinary citizens to be themselves partially a social product—i.e., they are largely a product of the resources that society is willing to invest in the education of its citizens.Footnote 41 This fact seems to be better understood by those who try to undermine public trust in science than by those who try to foster it. For example, the campaign to teach creationism in schools can be seen as part of a more general effort to sabotage public science education as a means of undermining both the public understanding of science and public trust in science.

According to the social approach, however, interventions aimed at improving the scientific literacy and critical thinking skills of individual citizens can only play a limited role. This is partly because, as the social approach emphasizes, many of the issues to which scientific knowledge is relevant are not issues that can be effectively addressed through the actions of individual citizens—they are issues that can only be effectively addressed through public policy. While we have already seen a successful case of this sort of collective action in the case of the ban on CFCs, the clearest example of this in the contemporary context is the case of climate change. Even if we were to convince the vast majority of citizens that human activities are causing climate change and that, if not addressed in time, climate change is likely to have profound and irreversible consequences for the planet and for humans in particular, the challenge of climate change cannot be met solely (or even mainly) through individual decisions (such as the consumption choices of individuals). Climate change requires large-scale social and structural responses that can only be brought about through public policy and international agreements (as in the case of the ban on CFCs). Similar considerations apply even in cases that seem to be based mostly on individual choices, such as the case of vaccines. Since vaccines are not completely effective at the individual level, their effectiveness is partly due to herd immunity. So, even what seems to be a personal choice (whether to have one’s children vaccinated) has social ramifications—vaccinating one’s kids in a society in which most people do not vaccinate theirs is only going to provide partial protection against the diseases the vaccine is meant to protect against. Finally, even cases that seem to involve only individual choices and that have significantly less dramatic social consequences might, on closer inspection, require social solutions. Consider, for example, the case of flossing discussed by Oreskes (2019, 118–127). Oreskes notes that the best evidence so far of the effectiveness of flossing comes from studies in which children received flossing from a dental hygienist in school. While this might be just an artefact of study design,Footnote 42 it might also be that flossing is only effective (or is significantly more effective) when performed by a dental hygienist. If this is the case and we want to follow the evidence where it leads, then we should conclude that the question is not so much how regularly individuals floss at home, but whether people should have more access to professional flossing. Oreskes implicitly concedes this when she half-jokingly suggests that flossing bars might be part of the solution to the health issues that result from poor dental hygiene. However, the truth is that, if the evidence does indeed suggest that the benefits of flossing are limited to professional flossing, then what is required is not simply a change in individual flossing habits, but changes to the way we floss as a society. The point is that even in cases that seem to involve only individual choices with no apparent social ramifications (such as flossing one’s teeth), the best solution to the problem might be one that requires new social norms, policies, institutions, or services.Footnote 43

According to the social approach, public trust in science is primarily a form of social trust that is exemplified by an efficient division of epistemic labor between scientists and non-scientists and that depends on the state of the socio-epistemic infrastructure of society. While the individual and semi-social approaches focus almost exclusively on interventions at the individual level to solve the problem of harmful distrust, the social approach identifies at least four distinct (though sometimes overlapping) levels of possible intervention. The first is the individual level. For example, as we have seen, the social approach agrees with the individual and the semi-social approaches that efforts should be made to improve the scientific literacy of ordinary citizens through high-quality public education. The second level is the social level. For example, as we have seen, the social approach advocates a greater inclusion of stakeholders (and especially of stakeholders from marginalized groups) in decisions about the direction of publicly funded scientific research. The third level is the interface between science and society. For example, the social approach advocates improving the quality of science communication, possibly by funding public news sources devoted to providing high-quality science journalism [as opposed to the sort of sensationalistic science journalism often provided by private media sources, or journalism that mistakenly pursues the journalistic norm of balance at all costs (see, e.g., Gerken, 2020)]. The fourth level is that of scientific communities. For example, society could protect science from the undue influence of economic interests by effectively socializing scientific research while shielding it from the interference of political interests by letting scientists and a diversity of stakeholders be primarily responsible for the allocation of public funds.

6 Conclusion

In this paper, I distinguished three general approaches to public trust in science. I argued that the first two approaches, the individual approach and the semi-social approach, conceive of public trust in science as primarily based on an individual trust relationship and that, as a result, they conceive of the solution to the problem of harmful distrust as attempting to persuade individual citizens that science is trustworthy. I argued that both approaches face two general problems, which I called the problem of overidealizing science and the problem of overburdening citizens. I argued that, to avoid these problems, we should adopt a (thoroughly) social approach to public trust in science instead. On this approach, public trust in science is primarily based on a social trust relationship between society and science. I also argued that a society that fully trusts science is one that trusts it implicitly by adopting an efficient division of epistemic labor. The problem of harmful distrust arises from a poor socio-epistemic infrastructure that undermines the efficient division of epistemic labor. To solve the problem of harmful distrust, we need to improve the state of society’s socio-epistemic infrastructure rather than try to persuade citizens to trust science. While this includes interventions at the individual level, these interventions need to be supplemented with interventions at the level of social groups, at the interface between science and society, and at the level of the (actual and perceived) trustworthiness of science itself.

The outline of the social approach that I provided in this paper is admittedly just a sketch, and many of the details will need to be worked out elsewhere (or are already being worked out). However, I hope that, in spite of its shortcomings, the outline I provided here contributes to laying the groundwork for a unified social approach to public trust in science, an approach that weaves together a number of strands of research on the social epistemology of science broadly construed, that stimulates more work in that direction, and that, hopefully, promotes the adoption of a more constructive and effective approach to the problem of harmful distrust.