Introduction

In recent years, the world suffered the global spread of the novel coronavirus SARS-CoV-2 (COVID-19) into almost all populated regions of our planet. It will most likely mark a historical caesura. Former German chancellor Angela Merkel called Corona “the greatest challenge since World War II”. Most aspects of social and public life changed gravely—from curfews to international border closures—and the sanitary, economic, political and cultural consequences cannot be foreseen, especially for poorer countries and regions.

COVID-19 is highly contagious and spreads more quickly than many other viruses. But information about and around COVID-19—correct reports and references as well as fake news—spread even faster through the (social) media. The WHO (World Health Organization) published an article entitled “How to Fight an Infodemic” (UN.org 2020) to combat a global epidemic of disinformation, spreading rapidly through social media platforms and other outlets, which can also pose a serious problem for public health.

In a joint work with the WHO, the Pan American Health Organization (PAHO) describes this infodemic as “an overabundance of information—some accurate and some not—that makes it hard for people to find trustworthy sources and reliable guidance when they need it” (PAHO 2020). Users of social media can be led astray by consuming and following information, posts, tweets or articles which are “intentionally and verifiably false”, so-called “fake news” (Allcott and Gentzkow 2017: 213). Quite in accordance with that (but in a more general manner), Caroline Jack defines “disinformation” (Jack 2017: 3), a term which one could use synonymously with fake news within the context of this investigation. Disinformation and fake news, however, should be separated conceptually from “misinformation”, which is verifiably wrong as well, but not intentionally deceptive.

It is possible to diagnose a relative increase in cultural attention in times of crisis and uncertainty (So et al. 2019: 665f). Evidence has been gathered that humans are disposed to give more attention to negative information because it serves as a warning sign for potential dangers and provides crucial information for avoiding negative outcomes, which suggests that a negativity bias is present in all human populations (see Shoemaker 1996; Skowronski 1989; Irwin et al. 1967, cited in Soroka et al. 2019). Fake news tends to produce these negative emotions, which makes it culturally attractive and supports its spread through heightened attentiveness. On the other hand, especially at the beginning of the COVID-19 pandemic in early 2020, fake news on social media tended to play down the danger of the virus, influencing people to this day not to follow protective measures (see Wahidie et al. 2021: 622).

Since COVID-19 comes with a certain amount of risk for a person’s individual health, and can even be lethal, fake news of such a kind can be seen as a ‘maladaptive trait’ from a cultural evolutionary perspective (see De Oliveira and Albuquerque 2021). What does that mean? In analogy to Charles Darwin’s theory of natural selection, Cultural Evolutionary Theory (CET) provides the foundation for understanding cultural or social processes while observing the individual level, using the tools of its biological equivalent. Philosopher of science Grant Ramsey’s definition of ‘culture’ seems helpful in this context. According to him, “Culture is information transmitted between individuals or groups […]. The information must bring about the reproduction of a behavioral trait.” (Ramsey 2012: 463).

Fake news consists of pieces of cultural information which—potentially just like any other social information—are transmitted from person to person, from one cultural agent to another. Following this line of thinking, fake news behaves much like the virus itself, a phenomenon that makes ‘pandemic’ and ‘infodemic’ comparable. According to Ramsey’s definition, this information brings about and reproduces certain behavioral traits, such as non-compliance with protective measures or refusal to be vaccinated. Moreover, since these behavioral traits can negatively influence the affected agents’ chances of survival, the spread and transmission of this information and the adoption of such behavioral patterns can be seen as maladaptive. Since COVID-19 is contagious, this affects not only the individuals themselves, but also their surroundings, ultimately affecting the population as a whole.

However, following Simon and Camargo (2021) and others, one should be careful not to use the term ‘infodemic’ unquestioningly, regardless of its high popularity. Three fundamental shortcomings of the analogy show this in detail: (1) biological viruses are much easier to identify than beliefs and fake news, i.e. cultural information; (2) diseases spread ‘randomly’, while fake news always spreads intentionally; (3) the means to fight and control the spread of fake news differ greatly from those to fight and control a virus. We think that providing a background theory for the use of the conceptual analogy between ‘infodemic’ and ‘pandemic’ is helpful in addressing these three shortcomings, and we identify cultural evolutionary theory (CET) as a possible candidate for such a background theory. CET can provide an appropriate framework to study how fake news successfully spreads (as a maladaptive trait) and thus to analyze these cultural phenomena more deeply. Although we recognize that the pandemic/infodemic analogy is in fact far from perfect, we believe that CET could provide a theoretical underpinning that gives much more depth to the concept of an infodemic. Accordingly, in section “Three major issues of the analogy between ‘infodemic’ and ‘pandemic’” of this paper, we will first cover the ground of criticism, especially referring to Simon and Camargo’s (2021) critique of the infodemic metaphor. We then investigate some (micro-)evolutionary processes of CET in section “Cultural evolutionary biases and how they influence the spread of fake news”, in particular concerning cultural selection. We hope to find out in which ways specific ‘biases’, which are well known and studied by proponents of CET, may influence individual agents to consume and copy fake news, both within and between cultural groups (diffusion). In section “How the infodemic influences fitness: immunization and inoculation”, we shall inquire into the potential consequences for an agent’s fitness of transmitting fake news and being impaired by it. We will also address the question of how belief in fake news can be backed up by mimicry: within the context of the pandemic, fake news tends to mimic actual scientific data, which is why we can often subsume it under the umbrella term ‘pseudo-science’. Such beliefs are strengthened by ‘immunization strategies’ of the respective agents. Finally, in section “The prebunking strategy: inoculating against fake news”, we will point to the psychological theory of ‘inoculation’, presenting a possible way to counteract the spread of fake news, which has lately been investigated within the context of the COVID-19 infodemic (Van der Linden et al. 2020).

Three major issues of the analogy between ‘infodemic’ and ‘pandemic’

As Simon and Camargo (2021) point out in their paper, it is important to be careful when using metaphors, particularly the term “infodemic”.

Although the term has enjoyed great popularity in the media and in scientific publications across disciplines since the outbreak of the COVID-19 pandemic, as the two authors illustrate with the 14,301 articles they found (ibid: 3), they show that its mostly unquestioned replication can be misleading. In order to reveal the necessary background, they first address the question of what metaphors are in the first place and what purpose they serve (ibid: 6). According to Mercier (2020: 96), comparisons of information and communication with epidemiology, such as that of Gustave Le Bon, who said that ‘ideas [...] and beliefs possess a power of contagion as intense as that of microbes’, have been around since at least 1897 (cited in Simon and Camargo 2021: 6).

Paraphrasing various authors, Simon and Camargo (ibid) explain the benefits of metaphors: they are central to human cognition and play a role in partially ‘determining an agent’s judgements or choice behaviour’ (Slupska 2020: 3), for example those of journalists and politicians, especially if they are used frequently, contain a lot of background information and have a stronger impact than simple comparisons. They also help to understand abstract concepts (Lakoff 1993) and make “chaotic situations feel controllable” (Young 2001, cited in Draper 2020: 8). In the analogy of disinformation and viruses, Simon and Camargo (2021: 6ff) see three fundamental shortcomings, concerning accuracy, the nature of the ‘infection’ and the dynamics of dissemination in social networks.

First, while epidemiology clearly defines what constitutes a virus, information often comes from unclear sources of varying quality and is interpreted differently. It is true that laboratory studies make it possible to define SARS-CoV-2 precisely using, inter alia, its RNA, whereas the definition of units of cultural information is quite controversial. Nevertheless, we do not think that this poses too serious a problem for the analogy of ‘infodemic’ and ‘pandemic’, and one does not have to solve the old and general ‘unitization problem’ of cultural information, as for instance the colourful field of ‘memetics’ tried to do. Cultural information can be given in a discrete or continuous manner; it can come in clearly defined packages, or not. In the case of fake news, it is mostly represented in words, sentences and grammatical structures, such that (semantic and syntactic) units of it can be identified easily. This, however, is not what we are interested in in this paper. Instead, we try to clearly identify our epistemic interest, i.e. maladaptive fake news about COVID-19, as for example De Oliveira and Albuquerque (2021) did in their study addressed earlier, more specifically using Twitter posts as their source. Concerning differing interpretations, one may draw an analogy between the various ways people interpret certain online content depending on their character, beliefs etc. and the divergent effects of a virus on distinct carriers depending on their blood type (see Kim et al. 2021).

Second, diseases are usually not spread intentionally, whereas news tends to be. Moreover, the claim that people are infected against their will disregards the cognitive mechanisms of information intake; it ignores studies showing that people actively decide what information they consume, what they believe and with whom they share it (Mercier 2020; Phillips and Milner 2017, cited in Simon and Camargo 2021: 7). However, it should be noted that both the biological transmission of viruses and the cultural transmission of information involve factors that increase the likelihood of uptake. To claim that people are exposed to viral infection indiscriminately, without any possibility of influence, disregards the fact that some people, in the course of their leisure activities, voluntarily decide to take the risk of infection or of infecting others, for example when visiting an (indoor) concert or a bar. While the sharing of fake news is of course (in most cases) not unintentional, it is important not to disregard the biases discussed in the next section, which play a decisive role in whether one internalizes or even shares a piece of information (which one may also frequently encounter at random). What is meant by the dynamics of dissemination is that content in social networks is subject to other social rules that determine how quickly and extensively it spreads; see also O’Connor and Weatherall (2019). Just because content spreads quickly and widely does not mean that it is ‘viral’, as this spread still happens selectively. Before summarizing the dangers of metaphors, Simon and Camargo mention as the main reasons for their use that they are intuitive and easy to understand and serve as a “trading zone” for communication between scientists of all fields, journalists, policy makers and the public (Bensaude 2014: 250, cited in Simon and Camargo 2021: 10). They (ibid.) therefore liken the uncritical adoption of the concept of the “infodemic” to the bandwagon effect described by Shannon (1956), at the cost of accuracy.

Third, the comparison of fake news with viruses suggests that the infodemic can be controlled by simple means such as public health measures, which is not possible (Simon and Camargo 2021: 11). Metaphors that portray fake news as an intentional virus to be fought against further narrow the focus, as structural or contextual problems and the failures of other actors involved, such as the government, fade into the background (Southwell et al. 2019, cited in Simon and Camargo 2021: ibid.). In addition, metaphors can create what Jungherr and Schroeder (2021) call “moral panic”, which is used by politicians to restrict fundamental rights such as freedom of speech or expression, as for example Viktor Orbán did with the “Anti-Coronavirus Act”, which allowed him to issue decrees or arrest people for spreading fake news without the consent of the parliament (Novak 2020, cited in Simon and Camargo 2021: 12).

In summary, it seems that we face three major points where the analogy between viruses and cultural information (i.e. ‘pandemic’ and ‘infodemic’) is not perfect or might reach its limits. The first issue is well known to cultural evolutionary theorists and depicts the unitization problem of cultural information, i.e. the difficulty of operationalizing this information for empirical research. The second disanalogy arises when we accept that cultural (dis)information is mostly transmitted intentionally by cultural agents, while biological viruses spread without “the consent” of their hosts. This calls for a description of the respective cognitive mechanisms that might be involved in an infodemic. Last but not least, fighting an infodemic might be much more difficult and complicated than the (already challenging) enterprise of providing health measures against a biological pandemic. This point is related to the first two and calls for possible descriptions of how, and in what ways, an infodemic could be fought. In what follows, we will try our best to make sense of these three points and provide a cultural evolutionary background for the term infodemic, because we think that this specific theory can add a lot of theoretical material in this regard. Ultimately, although we agree with the justified criticism of the term ‘infodemic’ on the whole, we see much more potential in the analogy than has been described in the literature so far. In the following, we use CET as a framework to analyze this phenomenon from a different perspective and will also point out practical applications later on, in particular when it comes to how to fight an infodemic.

Cultural evolutionary biases and how they influence the spread of fake news

In this section, we inquire into specific cultural ‘biases’, which are well known and studied by proponents of CET, and into how they may influence individual agents to consume and copy fake news, both within and between cultural groups (diffusion), in order to shed light on the spread of fake news from a cultural evolutionary perspective.

CET aims at reconstructing the dynamical change of specific cultural phenomena, just as biological evolutionary theory aims at reconstructing change in various organisms over long timescales. Cultural phenomena are understood as aggregations within a population of individual agents, which come about through their mutual interactions (Acerbi 2016: 2). These individuals each have a particular set of cultural traits (Mesoudi 2011: 55). These traits, containing the cultural information, are acquired via social learning or imitation (in the wide sense). Some information spreads faster and more often within or between groups than other information, and this fact constitutes its particular cultural fitness. By using the idea of such cultural selection or diffusion (which works more or less analogously to genetic selection) one can make sense of cultural information reproducing via imitation and social learning. During the transmission process within or between groups, small variations of this information occur, and often this variation is transmitted, too. In this way, societies do both: they maintain their cultural patterns and change them in the course of time.

Several microevolutionary ‘forces’ that can influence cultural selection have been identified by proponents of CET. According to classical prominent authors in the field, like Cavalli-Sforza and Feldman (1981), Boyd and Richerson (1985), Henrich and McElreath (2003), Mesoudi (2011) and Lewens (2015), cultural microevolutionary forces can be divided into several categories such as transmission, (guided) variation, cultural selection, drift and diffusion. These structure the way cultural information is passed on, where one can differentiate further between pathways, scope and mechanisms of transmission, e.g. one-to-one or one-to-many transmission.

Within the context of digital and social media, it is possible to share cultural information in the original, meaning that the rate of mutation is reduced to a minimum (which does not mean that cultural agents do not interpret the information differently, which might then lead to guided variation), and cultural transmission is of high fidelity. Because of that, digital copying is a good candidate for studying large-scale proliferation processes (see Acerbi 2016: 7). Selection and diffusion in digital networks can often be observed quite precisely.

Several ‘biases’ influence cultural selection by changing the likelihood of a cultural trait being transmitted. If, for instance, a very prominent agent like a celebrity spreads cultural information, this information will most likely be copied more often and therefore spread more quickly through the population than it would if the agent were not prominent. In such a case, the transmission of this information is subject to a ‘model-based bias’, in which a single person whose characteristics appear appealing is the focus of consideration (Mesoudi 2011: 73). This may be because the person appears particularly prestigious (= prestige bias), resembles you in some respect (= similarity bias) or is characterized by their age (= age bias).
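
To make the dynamics of such a model-based bias concrete, the following toy simulation may help (a minimal sketch of our own, not a model from the cited literature; the prestige scores and population size are assumed values). Agents repeatedly copy the trait of a prestige-weighted model; starting from a single prominent carrier, the piece of information sweeps through the population, whereas with equal prestige scores it would be very unlikely to spread.

```python
import random

random.seed(0)

N = 1_000
# Hypothetical prestige scores: agent 0 stands in for a 'celebrity'.
prestige = [50.0] + [1.0] * (N - 1)
# Initially only the celebrity carries the piece of (fake) information.
trait = [True] + [False] * (N - 1)

for generation in range(10):
    # Every other agent picks one model to copy, weighted by prestige,
    # while the celebrity keeps broadcasting its trait unchanged.
    models = random.choices(range(N), weights=prestige, k=N - 1)
    trait = [True] + [trait[m] for m in models]
    print(f"gen {generation}: {sum(trait) / N:.2%} carry the trait")
```

Because the celebrity is sampled far more often than any single ordinary agent, the trait’s frequency ratchets upward every generation; replacing the prestige list with uniform weights leaves it hovering near zero, which is exactly the ‘boost’ the bias refers to.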

Another bias is content bias. As the name indicates, here the content of the information is the primary focus of interest for cultural agents. Backed up by experimental evidence, CET is able to structure content bias into several subcategories. As an illustrative example, Mesoudi mentions the disgust-based content bias, referring to Heath et al. (2001), according to which stories spread particularly quickly if they trigger strong feelings of disgust, as this makes them easier to remember.

Regarding fake news in general and pseudo-science in particular, it is of great interest to identify specific cultural transmission biases within social media. Acerbi (2019), as well as Stubbersfield (2021), who puts a specific focus on conspiracy theories and modern urban myths, both investigated such biases. In the following, we shall inquire which content biases might help to explain the success of fake news about COVID-19.

Content biases: negativity bias and threat bias

Two content biases are of particular interest to us. Since negative fake news affecting survival in the case of the COVID-19 pandemic largely relates to perceived threats (see WHO mythbusters, 2022), “negativity bias” and “threat bias” are presented together here. Acerbi (2019) as well as Stubbersfield (2021) relied on data from two experimental US studies by Fessler et al. (2014), which showed an increased belief in negatively connoted information over positive information, even if the content is (ceteris paribus) identical. If people feel positive or negative emotions towards a topic, they often rely on information corresponding to these emotions (see Van Bavel 2020: 461). This is negativity bias. It seems to be particularly strong in cases where participants see the world as “dangerous” (ibid.). According to a recent German poll, 40% of the participants reported strong anxiety about a COVID-19 infection; in Vietnam this value rises to as much as 86% (see Statista 2022). The constant fear of an infection might contribute to the fact that people subjectively rate the likelihood of negative information being true as higher. What makes this specific situation even more difficult is the observation that, according to Kramer et al. (2014), these negative emotions tend to go viral in social media. The virus itself is contagious, and so is negative information about it. It is of great interest to CET that Bebbington et al. (2016: 9) also observed such a negativity bias in the transmission of negative information, whose likelihood of being transmitted is also higher; see also De Oliveira and Albuquerque (2021). Possibly, this might be explained by Stubbersfield et al.’s (2015) finding that information pertaining to our survival is conveyed more faithfully than information that is not. In addition, negative information online might (for several reasons) have better chances of reproduction because less direct social connection between individuals is often involved; in the case of stronger connections, individuals tend to share less negative information (see Fay et al. 2021: 14).

Studies on transmission chains of different kinds of information (neutral, negative, threat-related) show that, in comparison, participants tend to spread threat-related information more frequently, probably because it is assumed to contribute to the decision-making of other cultural agents, regardless of the likelihood of the threats actually occurring (Blaine and Boyer 2017: 4). The authors rely on a previous study (see Boyer and Parren 2015) documenting an increased likelihood of transmission for threat-related information, compared to negative but non-threat-related information. Blaine and Boyer speculated that participants probably behave in this way because they aim at delivering valuable and competent pieces of knowledge.
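
The logic of such transmission chains can be sketched in a few lines of code (our own illustration; the per-link retransmission probabilities are assumed values, not estimates from the cited experiments). The point is only that a modest per-link advantage for threat-related and negative items compounds along the chain:

```python
import random

random.seed(1)

# Toy linear transmission chain, loosely inspired by the chain experiments
# cited above. Each surviving story item is retransmitted at every link
# with a probability that depends on its content category.
P_RETRANSMIT = {"threat": 0.9, "negative": 0.8, "neutral": 0.6}

# Start with ten items of each category.
story = [(category, i) for category in P_RETRANSMIT for i in range(10)]

for link in range(1, 9):  # an eight-person chain
    story = [item for item in story if random.random() < P_RETRANSMIT[item[0]]]
    counts = {c: sum(1 for cat, _ in story if cat == c) for c in P_RETRANSMIT}
    print(f"after person {link}: {counts}")
```

After a handful of links, the surviving material is dominated by threat-related items, mirroring the qualitative pattern reported in the chain studies.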

Acerbi (2019: 4) showed that threat-related information constitutes the second largest part (28%) of all information investigated in his paper, right after social information/information about celebrities. Additionally, negative information outnumbered positive information five to one (ibid.: 3).

Prestige bias

Brennen et al. (2020: 5) diagnose “top-down misinformation” when fake news is shared by public figures (including politicians) and reaches major parts of society, analogous to the previously mentioned one-to-many scope of transmission. While accounting for only 20% of total fake news about COVID-19, this information, due to its greater reach (model-based bias), caused 69% of online interactions (likes, shares and comments). 36% of the false information was shared by politicians (ibid.). Politicians are therefore of central importance: many political leaders have large followings and thus enjoy the trust of sections of the population. This can be devastating when following the shared information is harmful to public and personal health.

Additionally, information shared by such public figures might be subject to what CET calls ‘prestige bias’. This kind of model-based bias indicates that a person who is well known by many people and copied on a regular basis might be copied even more because of their prestige and influence. This kind of ‘boost’ in transmission rates is well studied in the CET literature (see Mesoudi 2011: 73). Examples include the former US President Donald Trump (see Levy 2016, https://qz.com/626022/donald-trumps-fans-may-be-influenced-an-evolutionary-strategy-called-prestige-bias) and the former President of Brazil, Jair Bolsonaro, who denied the usefulness of face masks and delivered scientifically unjustified suggestions for the supposed treatment of or protection against COVID-19. After Trump announced to the public that he aimed to have the injection of disinfectant examined as protection against the virus, the Maryland Department of Emergency Management received over 100 calls regarding the use of disinfectant (cf. Rose 2020: 816). In their Twitter analysis, Germani and Biller-Andorno (2021) found that Donald Trump was the driving force behind the spread of misinformation before his Twitter account was blocked.

Another possible example of the impact of a model-based bias, which Stubbersfield (2021: 14f) mentions in the context of conspiracy ideologies, is the prominence of the former virologist Judy Mikovits. In an interview, she was portrayed as one of the most accomplished researchers of her generation and was said to have revolutionized the treatment of HIV/AIDS (cf. Enserink and Cohen 2020). This interview gained notoriety as a trailer for a conspiracy film called “Plandemic: The Hidden Agenda behind Covid-19”; it was viewed 1.8 million times within a few days and shared 150,000 times before it was deleted from Facebook (cf. Gebel 2020). Here, the alleged expertise of a supposedly high-ranking virologist could indicate a prestige bias. However, a broad evaluation of COVID-19-related fake news on Twitter (see De Oliveira and Albuquerque 2021) showed that truthful and false information is in fact shared to a comparable degree, regardless of whether it was sent by verified/public figures or by regular users.

Minimally counterintuitive bias (MCI) and hyperactive agency detection (HAD)

Minimally counterintuitive bias occurs when information is easier to remember because it is largely tailored to human intuitions (“common sense”) but contains at least one counterintuitive point. As an illustration, Mesoudi (2011: 66) cites ghosts, which possess human characteristics (such as the need for revenge) but also hold counterintuitive characteristics due to their ability to float through walls, and are thus more likely to be remembered than information about “a person who cannot walk through walls and is dying of old age” (ibid.).

Related to COVID-19 fake news, this could be illustrated, for example, by the success of the claim that Bill Gates wants to use vaccination to implant microchips in as many humans as possible for surveillance (cf. Ball and Maxmen 2020). It is intuitive to assume that vaccinating as many people as possible is important for pandemic control and health security; the alleged reason that Bill Gates would want to monitor nearly all of humanity using injected microchips, however, is arguably quite counterintuitive. Nevertheless, a YouTube video on this ‘insight’ was viewed almost two million times. Furthermore, when a former advisor to ex-US President Donald Trump announced this thesis in a radio broadcast, stating that he would never trust a vaccine that Gates had co-financed, the statement was taken up unquestioningly by the conservative tabloid newspaper “New York Post” in an online article with which a total of one million people eventually interacted on Facebook (cf. ibid.). According to a recent survey, 50% of all “Fox News” viewers in the USA believe this claim (cf. Sanders 2020).

The spread of these or similar ideas could additionally be explained by the so-called hyperactive agency detection bias, which, for example in the case of belief in ghosts, might often work hand in hand with minimally counterintuitive bias. It describes the tendency to attribute events or circumstances to an agent although there is none (see Barrett 2004, cited in Stubbersfield 2021: 6f). Assuming an actor behind events may have had an evolutionary advantage for our ancestors’ survival (cf. ibid.). Citing Hofstadter (1966), van Prooijen and Douglas emphasized that blaming “powerful and evil enemy group[s]” for certain crisis events reduces their complexity, thus making them easier to grasp psychologically (cf. 2017: 327). After it became known that COVID-19 originated in Wuhan, allegations emerged that Jews were linked to Chinese laboratories where the virus was produced as a “Zionist bioweapon” or “designed” so that only non-Jewish people could become infected (cf. Comerford and Gerster 2021: 16). In a German Telegram channel with over 34,000 subscribers, which presents itself as a Corona information source, a video was shared in which two Jewish men say that the virus is “for non-Jews, […] not for the Jews” (ibid.). Already during the Black Death in the fourteenth century, the accusation spread that Jewish people had intentionally spread the plague by poisoning rivers and wells, as they seemed to have died from it less frequently than Christian people (cf. Clamp 2020). Such explanations for natural events have been shown to be intuitively satisfying to human consciousness (cf. Barrett 2007, as cited in Boudry et al. 2015: 1186).

Lin Wood, a Republican lawyer for Donald Trump, also referred countless times to the narrative that COVID-19 is a bioweapon and serves to decimate the world’s population on his Telegram channel with almost 800,000 subscribers (cf. Chhetri 2021).

We have shown how different types of fake news serve certain cognitive biases through diverse stimuli and are consequently more likely to be taken up by individuals. They therefore have an increased ‘diffusion potential’ (cf. Ramsey and De Block 2015: 13). Relying on Richerson and Boyd (2005), Ramsey and De Block highlight the following: cultural traits which are appealing because they are associated with certain models, or because of their catchiness and simplicity, are inclined to be transmitted more than average. Values between 0 and 1 can be used to indicate the transmissibility range, with 0 implying that acquisition is very unlikely and 1 implying that the trait is almost certainly acquired.
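
One simple way to read this 0-to-1 scale (our own illustrative formalization, not a formula given by Ramsey and De Block) is as a per-exposure acquisition probability: a trait with transmissibility t is then acquired after k independent exposures with probability 1 - (1 - t)^k.

```python
# Minimal sketch of the transmissibility scale, read as a per-exposure
# acquisition probability (an assumption of ours for illustration).

def acquisition_probability(t: float, exposures: int) -> float:
    """Chance of holding the trait after `exposures` independent exposures."""
    assert 0.0 <= t <= 1.0
    return 1.0 - (1.0 - t) ** exposures

for t in (0.0, 0.1, 0.5, 1.0):
    row = [round(acquisition_probability(t, k), 3) for k in (1, 3, 10)]
    print(f"t={t}: after 1/3/10 exposures -> {row}")
```

Under this reading, even a trait with modest per-exposure transmissibility becomes very likely to be acquired after repeated exposure, which matters in social media environments where the same item is encountered again and again.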

Transmission isolating mechanisms within social filter bubbles

Earlier in this section, we mentioned cultural diffusion as a microevolutionary process. It refers to the transfer of cultural information between groups or individuals, parallel to gene flow in biological evolution, in which genes pass from one population to another (Mesoudi 2011: 81). A more fine-grained distinction can be made between demic diffusion and cultural diffusion, where the latter is of particular interest to this paper. The former describes individuals moving from one population to another, and with them their cultural traits. In cultural diffusion, by contrast, traits are passed on via different transmission pathways. While demic diffusion is better suited to tracing prehistoric transmission processes, such as migrations, cultural diffusion is more applicable to our postmodern globalized world, which is permanently connected through mass communication, and occurs digitally, often in the form of one-to-many transmission. A theoretical concept that supports this CET argument is the so-called small-world phenomenon (see Schnettler 2009): the principle that all people on earth are linked by short chains of acquaintances, a basic statement about the abundance of short paths in a social graph. As Kleinberg (2004) shows, a square grid can be enriched with a second dimension of transmission between non-neighboring nodes/cultural agents. This second layer can be given by digital transmission.
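
The effect of such a second layer can be illustrated with a small network experiment (a simplified sketch of our own: we add uniformly random long-range ties rather than Kleinberg’s distance-dependent ones). A single ‘digital’ long-range contact per agent already collapses the average path length of the grid:

```python
import random
from collections import deque

def grid_adjacency(n):
    """Adjacency sets for an n x n grid of 'cultural agents'."""
    adj = {(i, j): set() for i in range(n) for j in range(n)}
    for i in range(n):
        for j in range(n):
            for di, dj in ((1, 0), (0, 1)):
                if i + di < n and j + dj < n:
                    adj[(i, j)].add((i + di, j + dj))
                    adj[(i + di, j + dj)].add((i, j))
    return adj

def avg_path_length(adj, samples=30):
    """Mean BFS distance from a sample of source nodes to all others."""
    nodes = list(adj)
    total = count = 0
    for src in random.sample(nodes, samples):
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        count += len(dist) - 1
    return total / count

random.seed(3)
adj = grid_adjacency(30)
print("grid only:", round(avg_path_length(adj), 2))

# Second, 'digital' layer: one random long-range contact per agent.
nodes = list(adj)
for u in nodes:
    v = random.choice(nodes)
    if v != u:
        adj[u].add(v)
        adj[v].add(u)
print("grid + long-range links:", round(avg_path_length(adj), 2))
```

On the bare grid, average distances grow with the grid size; with the long-range layer added, they drop to a small number of hops, which is the small-world effect that makes one-to-many digital diffusion so fast.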

While Boyd and Richerson (1985) assumed that the better memorability of cultural information was sufficient for its wide dissemination, Morin (2013) recognized that the frequency of transmission and certain motivations behind it matter as well (see also Blaine and Boyer 2017: 2). Acerbi, in “A Cultural Evolution Approach to Digital Media” (2016: 8), records that cultural traits in digital media can achieve enormous cultural diffusion, due to fast information transmission and high connectivity between cultural individuals. He illustrates this with the music video “Gangnam Style” on YouTube, which has been viewed over 4.6 billion times. Hamel et al. (2021) found that 31% of U.S. adults used social networks as a source of information about the pandemic, and that 78% of respondents were aware of at least one of eight COVID-19-related items of fake news, either believing it to be true or at least being undecided about it.

However, the world of digital information transmission does not only show small-world phenomena and facilitated diffusion (due to rapid spread without the need for physical contact), but also processes of demarcation and grouping, similar to speciation and population splitting in biological evolution. According to Del Vicario et al. (2016: 1f), so-called “content-selective exposure” is the core factor of diffusion and reinforces the formation of certain filter bubbles. This concept goes back to Eli Pariser (2012: 17) and describes the process of algorithms constantly theorizing about the personality of users, resulting in the crystallization of their own “information universes” that determine how they access information. Since the personalized information universe only produces news that largely corresponds to the individual’s worldview, preferences and convictions, there is a danger of forming a distorted image of reality and thereby impairing balanced discourse (Messingschlager et al. 2020: 91). Internet users outside such a bubble, who perceive reality differently and do not believe the disseminated fake news, contribute to limiting its diffusion. At the same time, however, the bubble helps the spread of fake news within itself, e.g., due to positive content, context or model biases of the “in-group” members.

If a cultural agent is, however, not part of a specific filter bubble (which can be seen as the cultural analogue of a biological population of conspecific individuals that exchange genetic information via sexual reproduction), it is (a) harder to access bubble-specific information and (b) easier to resist fake news transmitted in the bubble. In this way, information spreads mostly within bubbles, not so much between them. It is most illuminating that it is precisely this criterion of increased in-group causal connectivity that has often been used to define biological populations in the life sciences. For example, according to Roberta Millstein’s definition of a population (cf. Millstein 2010, 67): “Populations […] consist of […] conspecific organisms that, over the course of a generation, are actually engaged in survival or reproductive interactions, or both. The boundaries of the population are the largest grouping for which the rates of interaction are much higher within the grouping than outside.” It is this restriction of interactions that is important for the hierarchically structured emergence of biological complexity (see Simon 2002), since it protects a certain gene pool from too much variation. Here, too, biological and cultural evolution show many parallels.
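
Millstein’s boundary criterion can be operationalized quite directly, as the following minimal sketch shows (the interaction data are invented for illustration): the rate of interaction inside a candidate grouping, here a ‘bubble’, is compared with the rate across its boundary.

```python
# Minimal operationalization of Millstein's boundary criterion (our own
# sketch with invented interaction data): compare the realized interaction
# rate inside a candidate grouping with the rate across its boundary.

def interaction_rates(edges, group, everyone):
    inside = sum(1 for u, v in edges if u in group and v in group)
    across = sum(1 for u, v in edges if (u in group) != (v in group))
    n_in = len(group)
    n_out = len(everyone) - n_in
    rate_inside = inside / (n_in * (n_in - 1) / 2)  # share of possible in-group ties
    rate_across = across / (n_in * n_out)           # share of possible boundary ties
    return rate_inside, rate_across

everyone = set(range(8))
bubble = {0, 1, 2, 3}
edges = [(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (0, 3),  # dense inside the bubble
         (3, 4),                                          # one cross-boundary tie
         (4, 5), (5, 6), (6, 7)]                          # interactions outside

inside, across = interaction_rates(edges, bubble, everyone)
print(f"rate inside: {inside:.2f}, rate across: {across:.2f}")  # 1.00 vs 0.06
```

When the inside rate is much higher than the boundary rate, the grouping qualifies as a population in Millstein’s sense; read culturally, the same asymmetry is what marks a filter bubble.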

With this in mind, filter bubbles can be conceptualized as installing a transmission isolating mechanism (TRIM). Durham (1991) defined TRIMs as cultural barriers that impede diffusion and the concomitant mixing of cultural information between groups, allowing cultural identities to remain relatively stable over time (as cited in Smaldino 2014: 250). According to Thies (2017: 1), the reduction of such exchange gives the impression that one’s personal opinion is also the “correct” one, about which there is general consensus.

Similar things can be said of so-called echo chambers (see Acerbi 2016), which further isolate users from other points of view. One explanatory approach that CET can provide here is the assumption that individuals preferentially absorb information from people who are similar to them (= ‘self-similarity bias’), since people from the same group are more likely to encounter the same challenges, according to Henrich (2016). In the case of the COVID-19 pandemic, such shared challenges could be, for example, the widespread media disparagement of vaccination opponents or of demonstrators against the measures.

In their paper on the diffusion of fake news about COVID-19, De Oliveira and Albuquerque (2021: 8) introduce the egocentric bias in reference to Segovia-Martín et al. (2020), defining it as the “attachment to [one’s] own cultural trait”. The egocentric bias can cause individuals to maintain their own cultural trait, but transferred to a group-level perspective, it can also reduce cultural exchange and favor the isolation of populations (ibid.). How is that possible? Interaction with people who factually demonstrate the falsity of one’s belief system, thus devaluing it, might lead to a loss of prestige for the cultural agent. By keeping information within the group itself and thereby ‘immunizing’ it against external counterarguments/counterevidence, i.e. creating a blockade for the “social bubble”, as De Oliveira and Albuquerque (2021: 9) call it, individuals can avoid losing their in-group’s self-perceived prestige (prestige inside a filter bubble). This dynamic can ultimately lead to negationism (“the rejection of technical and scientific information”) regarding information from ‘outsiders’.

Media “parallel societies” with different worldviews are then formed (Reckwitz 2017: 264). In groups and channels of the messaging service Telegram, for example, vaccine refusers network in order to establish populations of like-minded people or to find dating partners. Users in their own community are ascribed strongly positive characteristics, while views deviating from theirs are strongly devalued (cf. Gentzkow 2016: 13). In the case of the current pandemic, this valuation can be seen in the rhetoric of so-called “Covid deniers” or “Querdenker:innen”, as they are called in Germany. They label persons who trust the news reports of the mainstream media as puppets of the “Merkel dictatorship” or as “sleep sheep”, considering their own group, in contrast, as “awakened” and enlightened.

Additionally, in their study on the belief in fake news among conservatives and liberals in the U.S., Harper and Baguley (2019) used the concept of collective narcissism, which originated in psychology, to explain stronger belief in fake news by the fact that people collectively idealize the group to which they feel the greatest similarity. As a sad result, members of other groups are devalued. Within the various groups, the same articles are often spread and shared with further acquaintances, Facebook friends, etc.

Retention and diffusion potential of cultural information regarding fake news

We think that social networks in which posts or articles are frequently replicated in the original through the “share” function have what Ramsey and De Block call retention potential (2015: 13; see above). Here they rely on mathematical results of Strimling et al. (2009), who found that in cultural evolution, the number of learning opportunities within one cultural domain usually changes during repeated learning. In such dynamics, individuals tend to copy variants with a high diffusion potential first (i.e., variants that are attractive for various reasons) but later replace them with traits that have a higher retention potential. Retention potential indicates the probability that a variant will be passed on unchanged, i.e., the variant’s ‘stickiness’ (cf. Ramsey and De Block 2015: 13). In filter bubbles or echo chambers, COVID-19-related fake news and the TRIMs further reinforce like-minded people in their beliefs, independent of the rationality of these beliefs. This is because in filter bubbles, variants tend towards higher and higher retention potential. Since within filter bubbles the number of learning opportunities decreases (not precisely the number, but the number of types of learning opportunities: agents in filter bubbles actively decrease the types of information sources that they regard as ‘reliable’), variants with a higher retention potential will be positively selected in the long run. This means that “[a] new variant may be very tempting without actually being good [diffusion potential], whereas an individual addicted to something will have difficulties abandoning it [retention potential], even in the face of an objectively better alternative” (Strimling et al. 2009: 13870). We think that within-group pressure and conformity increase retention potential, thereby making individuals ‘addicted’ to certain in-group variants. The authors’ results also suggest that rational/functional traits are more likely to be favored when the number of learning trials increases; in our case, when people are presented with alternative opinions that are not yet part of their filter bubble. Sadly, there are strategies of immunization against such out-group information, as we will see in the next section.
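
The interplay of the two potentials can be made vivid with a deliberately stylized two-variant model (our own sketch, far simpler than the actual model in Strimling et al. 2009; all parameter values are assumptions). A ‘catchy’ variant with high diffusion but low retention potential dominates early adoptions, while a ‘sticky’ variant wins out as learning is repeated:

```python
import random

random.seed(7)

# Stylized two-variant model in the spirit of (but far simpler than)
# Strimling et al. (2009). Each trial, an agent keeps its current variant
# with probability equal to that variant's retention potential; otherwise
# it re-samples a variant with probability proportional to diffusion
# potential. All parameter values are illustrative assumptions.
VARIANTS = {
    "catchy": {"diffusion": 0.9, "retention": 0.2},   # tempting, not sticky
    "sticky": {"diffusion": 0.1, "retention": 0.98},  # unattractive, 'addictive'
}

def resample():
    names = list(VARIANTS)
    weights = [VARIANTS[name]["diffusion"] for name in names]
    return random.choices(names, weights=weights, k=1)[0]

population = [resample() for _ in range(2_000)]  # first adoptions favor 'catchy'
for trial in range(101):
    share = population.count("catchy") / len(population)
    if trial in (0, 1, 10, 100):
        print(f"trial {trial}: {share:.2f} hold 'catchy'")
    population = [v if random.random() < VARIANTS[v]["retention"] else resample()
                  for v in population]
```

Roughly 90% of agents start out holding the catchy variant, yet after repeated learning trials the sticky variant dominates, illustrating how a filter bubble’s high retention potential can entrench content regardless of its initial appeal.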

Additionally, one may argue that in this context an infodemic can even affect the biological fitness of its carriers. The number of subscribers, likes or shares can, in a way, provide information on how many people refuse vaccination or resist health measures. Their belief in fake news can endanger their personal health or, from the biological evolutionary point of view, the biological fitness of the carriers. Besides conspiracy theories, another quite drastic example is Islamist ideology causing radical devotees to carry out suicide bombings and thereby ending their biological existence. The reproductive fitness of such individuals is thus affected in such a way that they will not be able to produce any (biological) offspring. This point leads us to the next section.

How the infodemic influences fitness: immunization and inoculation

In evolutionary theory, fitness is regarded as the ability to survive and reproduce. Individuals that are better adapted to their environment than other individuals have better chances of survival. Adaptability is determined by certain genetically produced traits, which thus influence the fitness of an individual (Hallgrímsson and Hall 2005).

Boyd and Richerson (1995: 134) illustrated that culture and social learning can improve agents’ adaptability and thus their fitness, provided they reduce the effort of individual learning. For example, through cumulative social learning over many generations, members of our species know which mushrooms are poisonous to us and do not have to figure this out anew each generation via individual learning processes. This is ‘the secret of our success’ (Henrich 2016).

However, not all cultural information that we absorb through social learning is beneficial to our survival. Advertisements for sweets or cigarettes are rather harmful to health. Information that induces health risks, such as fake news that influences our behavior, is maladaptive, i.e., a maladaptation (cf. De Oliveira and Albuquerque 2021: 429), because it results in “behavioral traits” (Ramsey 2012) that endanger not only the individuals but their social environment, as well.

Strategies of immunization in pseudo-scientific explanations

What can also harm people is false belief in so-called ‘pseudo-scientific information’. Information is considered pseudo-scientific if it (a) relates to an area within science in the broad sense, but (b) can only be insufficiently verified, and is at the same time (c) part of a doctrine whose goal is to create the impression that it offers the most reliable source on the associated topic (cf. Hansson 2017: 40). One possible doctrine is science denialism, for example related to climate change or vaccinations (ibid.). The “doctrine” point (c) is of central importance. It allows us to distinguish between pseudo-science and those scientific hypotheses or theories that are merely hard to verify (or falsify), such as quantum gravity, string theory or Big Bang theory.

Boudry and Braeckman (2012) describe the resilience of pseudo-science by examining it from a cognitive-psychological and epistemological perspective. They apply the theory of cognitive dissonance, which goes back to Aronson (1992). Cognitive dissonance results from agents being confronted with information that contradicts their original beliefs (ibid.: 347). The more they are convinced of their view and present it to the outside world, the more likely they are to try to “rationalize away” this information, i.e., to provide reasons for their conviction and to cover up deviations in order to create an illusion of objectivity (ibid.). Based on publications by Dawkins (1976, 1993) and Dennett (1996), who show how self-confirming belief structures are made unassailable by criticism and adverse evidence, Boudry and Braeckman (2012) deal with the question of epistemic defence mechanisms (ibid.: 350). In analogy to biological evolution and epidemiology, one can speak of “immunization”.

A vivid example is the incorporation of objections into the belief itself. Followers of conspiracy theories (just like ultra-religious persons) treat disagreement with their beliefs as part of the conspiracy itself (or of the religion, for instance as a temptation by the devil). Criticism therefore serves as further confirmation of the belief system (ibid.: 351). This may lead to a general lack of understanding of why one should change it in the first place (cf. Sunstein 2018: 96f.). In order to consolidate a belief system in the face of objections, belief configurations may be necessary, which occur in various areas of pseudo-science (cf. Boudry and Braeckman 2012: 352).

Pseudo-scientific fake news within the context of COVID-19

Belief configurations are of central importance. The English conspiracy theorist and anti-vaccination activist Vernon Coleman, who titles himself “Dr. Coleman”, claimed in his widely distributed video “Covid-19 Va@xines Are Weapons of Mass Destruction” that 90–95% of the world’s population would be killed by vaccination and that those already vaccinated would die on contact with the virus (cf. Czopek 2021). The so-called “Great Reset” on which this thesis draws was originally an initiative of the World Economic Forum (WEF) to reshape major parts of the global economy in a more egalitarian and sustainable way after the pandemic; it has been reinterpreted by conspiracy ideologists as a plan for human extinction. A former German professor of medical microbiology named Sucharit Bhakdi also supports this thesis in a video with over 268,000 views on “Rumble”, originally published by the conservative magazine “New American” and widely shared in articles on conspiracy-ideology websites (see Funke 2021).

Should predictions or claims finally be falsified, a plausible rationalization or reinterpretation must be presented. In his book “The Power of Us”, psychologist Jay Van Bavel explored how believers in a doomsday setting react when their prophecy turns out to be false. He said in an interview with the American news website Vox that if people really want to believe in something, they will always try to post-rationalize (Scott 2021), also referring to distrust in vaccinations. An obvious line of argument for many corona deniers or ‘Querdenker’ is to claim that the mainstream media is intentionally hiding COVID-19 vaccine-related deaths and that only the “free media” are truthfully documenting them. The same claim can be found, for example, on the website “infowars.com”, managed by right-wing American radio host Alex Jones. In a stream from January 22, 2022, right-wing conspiracy theorist Owen Shroyer postulated there that the U.S. Vaccine Adverse Event Reporting System (VAERS) records over 21,000 deaths from the corona vaccine and over 200,000 vaccine deaths filed under false categories of other medical causes such as heart attack, and that the mainstream media withholds this information from the population (see infowars.com 2022). Another article on Jones’ website states that the year 2021 has shown the mainstream media to be “a propaganda arm of political and corporate elites” (Smith 2022). Thus, by rejecting mainstream media, falsification, or the opinions of ‘outsiders’, belief systems can become immune to empirically based refutation (Boudry and Braeckman 2012: 353). The belief in the truth of various fake news items defends itself against external falsification with the help of rationalizations that aim for an impression of objectivity (e.g. by copying science-style communication) and intrinsic self-affirmation.

The prebunking strategy: inoculating against fake news

We have seen how deep the analogy between the spread of a virus and the spread of fake news can become, i.e., how cultural evolution and biological evolution can resemble each other. In this last section, we want to draw attention to a particular strategy for counteracting the spread of, and the belief in, fake news. Van der Linden et al. (2020) named it the ‘prebunking strategy’, and it can be described as a kind of ‘mental vaccination’, since participants’ mental immune systems (which we described in the last section) are actively exposed to fake news. However, this takes place in a controlled and artificial environment, such that agents are able to develop alternative strategies of immunization (i.e., a specific kind of critical thinking) against more dangerous versions of fake news to which they might be exposed in the future. The strategy here is ‘psychological inoculation’.

Developing such strategies is of central importance. Since the pandemic began, international organizations such as the WHO or the United Nations (UN) as well as state governments have endeavored to offer trustworthy information on measures to contain the pandemic (cf. Rose 2020: 813). In the fight against fake news, social networks add references to WHO information to almost all content relating to the virus or vaccination. Likewise, content on Facebook (Meta) platforms is checked by so-called third-party fact-checkers; content proven false is marked as such or restricted in its dissemination (Cellan-Jones, 2020). A fact check by the international news agency “Reuters”, for example, refuted factual claims related to VAERS data on the grounds that the data were misinterpreted (see Reuters.com 2021). However, access to the channels spreading fake news is limited, as it is often impossible to monitor private channels or closed groups (Brennen et al. 2020: 2). Because of this barrier (which we understand as a cultural TRIM), the cultural identities of such groups are less likely to be compromised. Moreover, according to Vosoughi et al. (2018), fact-checks spread less quickly than fake news, making it harder for them to be truly effective (Van der Linden et al. 2020).

There are striking differences between platforms in how posts declared false are handled. While 24% of such posts on Facebook and 27% on YouTube remain without a warning, on Twitter the figure is as high as 59% (ibid.: 7). Moreover, a study by Greene and Murphy (2021: 775) suggests that the effectiveness of generic alerts is doubtful. They might even tend to make people trust the “right” news less (cf. Clayton et al. 2020: 21), since they see the generic alerts as part of the larger conspiracy. In the last subsection, we described this effect of immunization. Van der Linden et al. (2020) draw attention to the problem that repeated exposure to fake news also reinforces belief in it, which is particularly problematic given that 66% of respondents, according to a 2020 study by the UK media regulator, the Office of Communications (Ofcom), reported being exposed to fake news on a daily basis.

It might nevertheless be possible to counteract the spread of and belief in fake news. Van der Linden et al. (2020) propose the application of inoculation theory from psychology, which goes back to McGuire (1964). Analogously to medical inoculation, in which a weakened pathogen is injected into a person so that antibodies are formed, it describes the process by which “challenges” (in this case: fake news) are undermined by defense mechanisms, in the sense of critical thinking, in such a way that the position of the affected individual cannot be influenced. Individuals are seen as carriers of the “virus” in the form of maladaptive fake news, which is able to ‘infect’ their social environment as well.

Relying on Compton (2013), Van der Linden et al. (2020) name two elements of which psychological inoculation usually consists: (a) forewarning of a threat or attack on one’s views, and (b) preemptively refuting counterarguments (= “prebunking”). Prebunking is thus intended to replace the “debunking” of fact-checking that takes place after the fact (ibid.).

Governments of numerous countries have made use of the Telegram platform, which is considered a hotbed of conspiracy narratives, by providing factual info channels to users (Leong, 2020). For example, the “Corona Info Channel of the Federal Ministry of Health” in Germany was created as early as April 2020 to provide, by its own account, reliable and empirically verified sources alongside the proliferating fake news on Telegram. The channel thus informs nearly 305,000 subscribers about which assertions and narratives are true or false, and for what reasons. But this is a reactive strategy, and it is sometimes quite ineffective.

Psychological inoculation can therefore help as a ‘preemptive vaccination’ in the form of sensitization towards fake news. Thus, the intention to be vaccinated against COVID-19 improves if subjects are confronted with appropriate critical objections before they see a conspiracy video rather than after they have seen it, as Van der Linden et al. (2020) show. Sensitization is promoted when individuals’ attention switches from their own position to the manipulation strategies of fake news creators, for example when they notice that deliberately emotionalizing language is being used. Van der Linden et al. (2020) have shown that such a strategy can improve a person’s sensitivity towards fake news when confronted with it at a later stage or within a different context. Metaphorically speaking, the person develops “cultural antibodies”.

Two browser games proved useful in this context: “Bad News”, which was awarded the educational media prize in 2020 and has been translated internationally, and “Go Viral”, developed by the UK government in cooperation with the WHO. In both, players actively learn which concrete manipulation strategies are used (ibid.: 4). “Bad News” encourages players to create their own conspiracy theory about COVID-19 and then to understand, based on the responses of users on the social networks, the negative consequences and the process of spreading it. A junior version of “Bad News” was also created for children aged eight and up.

These techniques help agents not only to differentiate true from false information; ultimately, they also make them more likely to engage in health-promoting behaviors, such as wearing medical masks or social distancing (ibid.: 5). Figuratively speaking, people can increase their immunity through repeated use (“booster vaccination”), and it is not necessary for everyone to be inoculated: “societal herd immunity” (ibid.) can also be achieved when enough people develop antibodies against the manipulation techniques, distributing prebunking information or sharing the mentioned browser games with, e.g., Facebook friends or acquaintances via cultural diffusion.

Conclusion

In this paper we depicted the development of fake news as the spread of cultural variants of information from the individual agent to the level of society, and finally to the current ‘infodemic’. After unfolding the critical aspects of the metaphor, we embedded these infodemic phenomena into the larger framework of cultural evolutionary theory (CET), since we believe that the latter provides an excellent macro-toolkit for understanding them. The essential characteristics of CET were introduced and we briefly explained the most general and basic terminology of its micro-evolutionary processes. It was conveyed that processes of social learning and imitation are central to CET, since they transmit cultural information units in various ways between individuals or groups. Whether or not cultural information is copied depends (to a large extent) on cognitive biases of cultural agents, which thus determine the probability of cultural selection. With reference to Mesoudi’s (2011) examples of biases and previous studies by Acerbi (2019) and Stubbersfield (2021) on fake news and conspiracy ideologies, we explored which specific biases may be active in the cultural selection of fake news, on the basis of empirical findings by these authors.

After that, we dealt with the phenomenon of cultural diffusion, i.e. between-group transmission (as opposed to within-group transmission), in the context of the infodemic. A study by Hamel et al. (2021) illustrated how widespread fake news about COVID-19 actually is. The danger of ‘filter bubbles’ according to Pariser (2012) was pointed out; these arise from the algorithmization of online content and produce echo chambers, which complicate the transmission of cultural information units between different groups. Relying on CET, we identified these filter bubbles as ‘TRIMs’ (transmission isolating mechanisms). TRIMs prevent the spread of valid scientific knowledge in certain groups due to restricted communication, and they often foster networking within Telegram groups or channels where a variety of fake news regarding possible vaccine harm is shared on a daily basis. This disinformation can be maladaptive for its carriers, as it lures people into forgoing vaccination or resisting protective measures. This danger was illustrated by the huge numbers of demonstrators and Telegram group members, which also pose a threat to public health.

Finally, we pursued the question of how an infodemic might be fought. This task is of utmost importance, and it is made more difficult by certain defense strategies of pseudo-science: these information variants are able to ‘immunize’ themselves against external criticism. Before measures such as rationalization or belief configuration seem necessary at all, this development could be prevented at a stage where beliefs are not yet so firmly established. Reports on COVID-19 or on vaccinations are accompanied by redirects to WHO information websites, and false reports are regularly recognized and marked with warnings by fact-checkers. However, these notifications have not proven to be particularly effective and do not seem to reach every agent. Alternative options should also be used that help users to recognize fake news in advance, for example by raising awareness of its typical characteristics and thus forming ‘antibodies’ (van der Linden, 2020). Based on inoculation theory, which can be understood analogously to a medical vaccination, participants can thereby be immunized against the absorption of disinformation as maladaptive “viruses”. Specific browser games (specially developed for this purpose) could, for example, be played in teaching institutions as a prophylactic measure, to protect young people or young adults from being “infected” with disinformation within an artificial and controlled environment. Nationwide mental immunization (just like medical immunization) also offers the prospect of securing health and improving the crisis situation in the case of an infodemic. Measures against pandemic and infodemic can go hand in hand, and CET can help to explain why there are so many parallels and analogies between pandemic and infodemic, between biological and cultural evolution. This paper can be seen as a small contribution to this complex process of understanding.