“Everyone who is born holds dual citizenship in the kingdom of the well and in the kingdom of the sick. Although we all prefer to use only the good passport, sooner or later each of us is obliged, at least for a spell, to identify ourselves as citizens of that other place.” (Sontag 1979, 3)

Introduction

In early January of 2020, news outlets began reporting on a particularly contagious and severe coronavirus in Wuhan, China; by the next month, the first known case of COVID-19 in the United States – the disease caused by the SARS-CoV-2 virus – was identified in Snohomish County, Washington; and by March, the World Health Organization had declared the spread of the virus a global pandemic. While some countries, such as New Zealand, took more impactful public health measures, the United States was one of the most afflicted countries in the world, creating a strong dissonance with its status as one of the world’s leaders in medical research. COVID-19 was not the United States’ first pandemic, of course. While public health outcomes have varied among other historical examples like the 1918 flu, polio, HIV and more, one common rhetorical goal threads public health messaging in these pandemics together: As members of the public, we are asked to stay safe so that we might avoid becoming victims of a communicable illness and spreading it to others. While such messaging may seem like a banal and expected trope in public health discourse, it is doing much more than asking us to modify our behavior.

In this project, therefore, I propose to unpack how discourses of identity figure within infectious disease public health rhetoric in the United States. In order to better understand this type of public health rhetoric, I seek to trace prevention-based language that calls us to stop ourselves and others from embodying the identity of victim in cases of new infectious diseases, namely, polio, AIDS and COVID-19.1 Importantly, these case studies all involved scenarios where public health experts knew the illness was viral, so communicating information and prescribing behavior change to the public was much clearer than when the cause was unknown (as in cases like the bubonic plague or cholera). Polio, HIV and COVID-19 all share other characteristics that complicate how the public perceives the riskiness of its behavior, too; specifically, long incubation periods, the fact that each disease can spread both asymptomatically and pre-symptomatically, and the broad range of severity of each disease (with the exception of HIV’s early years) might encourage people not to see themselves as potential victims. As a point of contrast, diseases like the bubonic plague or the 1918 flu were more broadly conceived of as indiscriminate with their victims: both were highly virulent, symptoms showed quickly, and death rates were extraordinarily high – as such, people were more likely to abide by public health directives. The cases of polio, HIV and COVID-19, though, are different in this respect: at the onset of these pandemics, the public associated each disease with particular populations – respectively, low-income and immigrant families, the gay community, and the elderly. The question arises, then: how did members of the public who saw themselves as not at risk for polio, HIV and COVID-19 get repositioned as susceptible?

To answer this question, I offer two main arguments. First, I argue that public health experts don’t simply attempt to persuade us to change our behavior in these cases, but rather they attempt to awaken a subject position through the process of interpellation – something Louis Althusser describes as ideological discourse that “recruits” and “transforms the individuals into subjects” (1971, 174). While Maurice Charland notes that, “One must already be an interpellated subject and exist as a discursive position in order to be a part of the audience of a rhetorical situation in which persuasion could occur” (1987, 138), in the case of public health crises and prevention, the interpellation of identity and the persuasion of action appear to occur not sequentially but simultaneously. Not being a victim is tied both to who we want to be and to our preventative actions – we become a potential victim either when we see ourselves as at risk or when we violate public health recommendations. The invitation to inhabit the identity of potential victim moves the use of constitutive rhetoric into a negative void where we are both called to inhabit and resist the identity of potential victim, in turn positioning us on the other side of the mirror of who we could be but are not yet (and hopefully won’t become). Second, I illustrate how this form of constitutive rhetoric in public health is inextricably tied to ideological presumptions that uphold wealthy, heteronormative whiteness as the accepted articulation of victimhood.
When underserved populations are identified as victims of the disease through marginalizing rhetoric (with polio, the poor and immigrants; with HIV, the gay community and drug users; and with COVID-19, the elderly), public discourse dissociates victimhood from the privileged, effectively telling white, wealthy, heteronormative, young communities that “it’s not your problem, this isn’t you.” When we see the rhetoric shift to articulate that these illnesses do not just impact marginalized populations but can, in fact, infect anyone, the articulation of potential victim becomes one inseparably tied to wealth, whiteness, straightness, and youth, reinforcing assumptions about whose bodies are valued and whose are not.

Constitutive rhetoric and public health

As the 2014 special issue on medical publics in the Journal of Medical Humanities demonstrates, the “public” that is addressed in health discourse is not monolithic but multiple. Specifically, Lisa Keränen notes: “We can appreciate biomedical and health discourses and practices as the result of complex sets of interacting rhetorical performances that bridge public, private, institutional, and technical concerns” (2014, 104). We can further unpack the multiplicity of publics by drawing out the connections between Charland’s work (1987) on constitutive rhetoric and interpellation – a process that scholars have shown to happen frequently in civic and consumer discourse – and public health rhetoric. Such interpellation happens both within health sciences, as Colleen Derkatch shows us in her analysis of how science journal editors call forth a biomedical community in a struggle over disciplinary boundaries (2012), and in the ways that health practitioners interface with patients, as Judy Segal demonstrates in her case study of migraine sufferers, where constitutive rhetoric “makes patients” (2005, 39). Some scholars have further identified how constitutive rhetoric functions as a means of framing identities for susceptible populations specifically. In particular, Eric Rodriguez and Dawn Opel’s work (2020) on the Healthy People 2020 initiative unpacks the ways in which discourses of resiliency and vulnerability functionally quantify and objectify communities at risk for disease; Tasha Dubriwny’s work (2009) shows us how “ideal patients” get constituted in breast cancer discourse; and Bonnie Dow’s project (1994) on AIDS activism literature uses perspective by incongruity to reposition gay denialists of AIDS into identities of both victims and advocates.
Paula Treichler further unpacks how the discursive dichotomies of AIDS function to stratify the public between “self and not-self” (1999, 35), underscoring the centrality of identity articulation in public health rhetoric. And, in discussing the COVID-19 pandemic, Michael Bernard-Donals further notes how “vulnerability is constitutive” (2020, 229), but such vulnerability is often compounded by additional situational vulnerabilities in pandemics.

Other scholars have noted specifically how constitutive processes function to shape messages about responsibility in public health. M.M. Brown illustrates how H1N1 public health rhetoric called for an “individualistic model of public health” (2019, 212) that used hand hygiene practices as forms of both constitutive and stigmatizing rhetoric where “the collective identity called into existence is that of the health citizen for whom participation in containing an outbreak is a personal responsibility” (216). Further, Colleen Derkatch and Philippa Spoel show us how public health rhetoric in Canada constructs the “‘good’ health citizen” (2017, 155) as a means of tying together citizenship, health, and local food consumption to create a sense of personal obligation to care for the economy and the environment. The authors extend this discussion, further pointing out the importance of tracing “how forms of health citizenship are called forth and prioritized in public health discourse” (2020, 26) to illuminate moments where responsibility for health becomes the individual’s rather than the state’s. As a point of contrast, Huiling Ding’s study on public responses to SARS shows how China’s “official call for participation with its appeals to patriotism, grassroots participation, and social responsibility helped to facilitate mass mobilization campaigns” (2014, 197) where even college students embraced a patriotic call for social responsibility. Ding’s comparison of public responses to quarantine in China for SARS with those in the US underscores this cultural difference in responses to public health measures, where “SARS was often racialized as a ‘Chinese disease’” (201) in the US, and people of Asian descent were subject to repeated hate speech. Jonathan Metzl and Anna Kirkland’s work on stigma and health connects to this work as well, where they identify how marks of health differences function as sites of exclusion (2010, 5).
In these studies, we see how constitutive rhetoric is intertwined with ideological assumptions that either place full responsibility for health issues on the individual rather than the state or perpetuate a form of individualism that generates indifference to others, particularly along the lines of race and class, highlighting how our health and bodies function as sites of power where our identities become articulated in ideologically driven ways.

What does it look like, though, when public health rhetoric focuses on prevention, where we are called to avoid becoming an identity altogether? How can one articulate potential victimhood for those who do not see themselves as being at much risk? And what do such moments of interpellation reflect about our cultural values and views of people, health, and disease? In light of the COVID-19 pandemic, looking back at the cases of polio and HIV allows us to better understand how such identities are articulated and the ideological implications of asking people to see themselves as potential victims. To illustrate the role of the potential victim identity in infectious disease rhetoric, I analyze fragments of language used by United States public health experts, victims, and advocates prior to the establishment of the polio vaccine, effective treatment for AIDS, and the COVID-19 vaccine, when the risks to the public were at their height. I first discuss the assumptions of victimhood, filth, and polio in the early 1900s and offer an analysis of how public health language began to articulate middle- and upper-class populations as potential victims. I then discuss Franklin Delano Roosevelt as an embodiment of the polio victim for both adult and wealthy populations. Then, my analysis of HIV rhetoric centers around the case of Ryan White, the young hemophiliac who contracted HIV via blood transfusion, and Mary Fisher’s speech to the Republican National Convention in 1992. Both cases again represent an embodiment of who could be a victim and called on children and wealthy, white Republicans to inhabit the potential victim identity. Lastly, I offer a discussion of how the identity of potential victim has figured differently, and met with more resistance, within the COVID-19 pandemic. In each case study, I unpack how constitutive rhetoric of the potential victim is ideologically tied to presumptions about what kinds of victims are considered worthy of attention and care.

Polio and embodied contagion

Prior to the 1900s, a handful of countries reported sporadic cases of what was then referred to as “infantile paralysis,” a disease largely known to target children by attacking the nervous system, often resulting in full or partial paralysis. By 1908, researchers Karl Landsteiner and Erwin Popper had isolated the poliovirus, but it would take several more decades for scientists to discover the transmission route and to learn that three strains of the virus, not one, were causing outbreaks. A dramatic change came when the US began seeing outbreaks of the illness in the early 1900s, with 1916 being a particularly devastating year that began a pattern of recurrent summer outbreaks thereafter. After decades of research, scientists finally developed two different vaccines for the public – one by Jonas Salk, approved in 1955, and the other by Albert Sabin, approved in 1961 – and by 1979 the illness was eradicated in the US. While the disease still remains endemic in some regions of the world, its near global eradication is one of the crowning achievements of modern medicine.

In this section of the paper, I identify how potential victims of polio had to be articulated to show seemingly less susceptible populations that they could contract the disease. First by looking at how white, middle- and upper-class families were called to see that they too could become “polios,” and then by analyzing Franklin Delano Roosevelt’s role in embodying the disease as an adult, I show how the articulation of potential victim relies on a constitutive rhetoric that exists in a void, as it asks people to not be something, rather than embody a new identity. I then show how such rhetoric articulates the identity of potential victim in ways that are inextricably tied to race, class and power.

Polio and the articulation of middle-class risk

While the March of Dimes campaign of the mid-1900s, one of the most successful fundraising efforts for scientific progress in history, is what most of us recall when we think of polio and public health efforts in the US, this campaign’s primary objective was not to warn the public of the dangers of polio as a means of preventing them or their children from getting it; rather, it was largely designed to raise awareness and funding for victims who already had the disease. Both the prevalence of the disease in the US and the campaign peaked in the 1950s; as such, responses to outbreaks of the illness prior to the 1950s tell us more about how new audiences and potential victims had to be articulated for the public to first see this virus as a threat.

For centuries prior to the rise of polio, disease had largely been associated with filth – fleas on rats that infested cities spread plague; fecal matter spread cholera; and miasma was thought to spread smallpox. One of the most robust responses to disease and filth, the Sanitation Movement of the 1830s and 40s, started in Great Britain as a city clean-up program and was gradually emulated in other countries like the United States (Snowden 2019). But even when the filth theory of disease was supplanted by Louis Pasteur and Robert Koch’s germ theory in the late 1800s (not because filth cannot breed disease, but because germs became a newly discovered cause of disease), remnants of the ideological connection between filth and disease remained in public consciousness. The successful sanitation measures of the early 1900s helped eradicate several diseases, reinforcing the belief “that poor living conditions – filth, poverty, overcrowding, and ignorance – were responsible for breeding epidemic disease” (Oshinsky 2005, 22) – and such beliefs underscored nationalist-based fears as disease was assumed to be “…linked to both dirt and immigrants who came into contact with native-born middle-class Americans” (Rogers 1996, 10). Disease and illness, therefore, became inextricably tied to filth, waste, and the disenfranchised populations who were denied access to cleaner living and working conditions. Such assumptions, not surprisingly, carried explicitly racist overtones not too different from the “imperial contagion” that Jeff Bass (1998) notes in his analysis of contemporary representations of colonized populations and disease.
Looking back to British colonial rule, Bass writes that “The stigmatization of cultural ‘Others’ as sources of disease contamination, for example, generally involves an overtly racist projection of images of filth and defilement onto subject populations, usually for the purpose of validating the ‘purity’ of the imperial race” (431), which he argues persists in contemporary media representations of new disease strains like Ebola. While Bass’s work focuses more directly on colonized populations, such narratives are evident in discourse related to disease and immigrant populations as well. As sanitation measures of the late 1800s and early 1900s moved from public health magistrates to the home and became increasingly adopted by middle- and upper-class families, these families began to perceive themselves as more immune to disease than immigrant families living in tenement housing. This is a familiar sentiment that Eula Biss, in allusion to Michael Willrich’s Pox: An American History, notes was prevalent in the last nationwide smallpox epidemic, when “…some people believed that whites were not susceptible to the disease” (2014, 25) – and as such, the disease acquired a variety of overtly racist names and associations and could be used as a discriminatory marker.

While cases of polio in wealthier families did happen, they were initially seen as outliers “…best explained by an unlucky association with infected members of the urban poor” (Rogers 2007, 786). As the illness continued to spread, health officials then began to define “middle- and upper-class families as innocent victims, and poor immigrant families as guilty carriers” (Rogers 1996, 21). It is precisely this association of polio with the “urban poor” which created an exigency for constituting an identity of potential victim for the families who did not see themselves as such due at least in part to the persistence of the filth theory of disease and its ideological implications that associated disease with marginalized populations. Upper- and middle-class families saw their homes and neighborhoods as safe harbor from the disease because of clean-up efforts both in public spaces and campaigns to promote vigilant housecleaning.

This crossroads reveals the exigency of communicating to the public that the norms of sanitation that white middle- and upper-class families embraced at home did not actually protect them from the virus; rather, “…it seemed perversely to be the exact reverse of a disease of poverty – as it had a predilection for affluent neighborhoods, suburbs, and rural areas” (Snowden 2019, 390). As early as 1912, researchers began to notice the virus’s success in more affluent areas, noting a prevalence of cases in “families which are in comparatively well-to-do circumstances and in which the children enjoy every comfort and care” (Peabody, Draper and Dochez 1912, 31). And during the 1916 outbreak, when around six thousand people died from polio in the US (one-third of whom were in New York City), “the wealthiest areas with the lowest population density had the most paralytic disease per 1,000 population…” (Bollett 2004, 124). The statistics alone communicated the virus’s disregard for cleanliness and class to some degree, naturally; when wealthier families in, say, Staten Island, began to see homes quarantined or saw the daily polio tally in the newspaper, they saw that the virus could come for them.

But it is not simply numbers of nearby victims that awakened an identity of potential victims. When white middle- and upper-class people began to see their neighbors contract polio, the new victims physically sent an embodied message. “This could be you,” they seemed to say, not through words but through their pain, their braces, their death and the “high visibility of polio’s crippling aftereffects” (Sills 1957, 127). In some cases, renowned community members were named in the media when their families were affected by polio, further reflecting a new understanding that the disease could strike anywhere. In a piece from the New York Times, a prominent magazine publisher is named as a notable figure impacted by the illness: “E.J. Ridgeway’s Son Stricken. Boy Returns from School with Infantile Paralysis” (Special to the New York Times 1916), effectively telling affluent Americans that their children could be victims, too. And in a 1916 poem entitled, “The Bond of Motherhood,” also published in the New York Times, the author articulates the irrelevance of class in who was a potential victim: “‘O Thou, with mercy mild,/ Keep this dread spectre from our door’/ Is the one prayer of rich and poor” (Barry 1916 cited in Rogers 1996).

As Rogers notes, “by the late 1920s, polio was reconceived as ‘everyone’s disease’” (2007, 786). Snowden further shows us how popular discussions in outlets like Ladies Home Journal began articulating this realization in the 1930s as well: “‘Poliomyelitis is not the penalty of poverty…Once the terror stalks, mere wealth cannot buy immunity. The well-fed babies of the boulevards are no safer than the gamins from the gutter from mysterious universality of the crippling midget, once it’s on the rampage…There’s hardly a human plague…that is as mysterious as this subvisible torturer and maimer of our children’” (de Kruif 1935 cited in Snowden 2019). While the association between polio and immigrant families persisted beyond many of these remarks, gradually more of the public saw that viruses like polio could be devastatingly equalizing in their indiscriminate nature – no longer could the middle- and upper-class families pretend they were immune or that polio was a problem of the tenements; rather it was in their homes and they had become its victims.

Roosevelt and adults as potential victims of polio

In 1921, this kind of embodied rhetoric took the national stage. Franklin Delano Roosevelt, a well-off thirty-nine-year-old, contracted the illness in the prime of his physicality and at an age that much of the public did not associate with polio. Since polio statistics showed that the disease largely impacted children, adults were often left feeling somewhat immune to the virus. From 1912 to 1916, 70% of cases were in children aged four or younger, and only 10% of cases were in children ages ten and older. But even by 1916, researchers began to notice that “wealthier areas also had the highest frequency of older children and adults becoming affected by the disease” (Bollett 2004, 124). Largely due to the decline in maternal immunity, by the late 1940s, “approximately 55 percent were ten and older,” which led to many more severe cases of the illness (Wilson 2005, 14). Nonetheless, polio retained its association with both its early name, “infantile paralysis,” and young children, conferring a perceived immunity on older populations.

Retrospective polio victim narratives often reveal an initial lack of identification with the potential victim identity, too. For example, in Daniel Wilson’s book on polio survivors, he highlights how John Tindall, an adult in Nebraska, “dismissed it, believing only children got it” (2005, 19) prior to his diagnosis. Larry Alexander, a polio survivor, is quoted in the Smithsonian’s online archive of its exhibit, showing how he gradually grew to understand that the disease could impact anyone: “The fear of polio was a fear of something you had no defense against, something that hit without logic or reason. Yesterday, it was the man down the block. Today it could be you or your children” (Alexander 1954 cited in Smithsonian 2005).

As such, Roosevelt’s diagnosis was striking. The 1921 headlines announcing his illness must have evoked, as Oshinsky writes, questions like: “How, then, could [polio] possibly victimize a man like Franklin D. Roosevelt: thirty-nine years old, robust and athletic, with a long pedigree and a cherished family name?” (2005, 24). As a prominent political figure, Roosevelt took great measures to hide his paralysis, instructing photographers not to show him in a wheelchair or being heavily assisted in activities like getting out of a car. Family members were complicit in this narrative as well, with one remarking, “It is too silly for you to have an ‘infantile’ disease” (quoted in Oshinsky 2005, 32), underscoring popular perceptions about how invulnerable adults were to the virus.

Roosevelt returned to his political career in 1922, eventually serving as governor of New York from 1929 until his election to the presidency in 1932. While Roosevelt was largely successful in convincing the public of his health and recovery – Amos Kiewe notes his active campaign life allowed him to use “his body’s presence as the very ‘proof’ of his recovery” (1999, 159) – he nonetheless still embodied the “proof” that other affluent, white adults could be afflicted with the virus as well, even if they rarely saw visible evidence of it (Fig. 1). On occasion, he would acknowledge his condition, too, sometimes using a collective voice in discussing the need for rehabilitation of other victims: “People well know that restoring one of us cripples – because as some of you know, I walk around with a cane and with the aid of somebody’s arm myself …” (Roosevelt 1931), but such language would typically direct attention to his recovery. While Roosevelt’s embodiment as an adult victim reflected only a minority of those who contracted polio in terms of age, it nonetheless functioned as a national symbol of the disease’s indiscriminate nature. Moving from the tenements to suburban homes to political elites, the virus now signaled to everyone that it could come for them, too.

Fig. 1

A rare image of Franklin D. Roosevelt shows him on crutches (far right) with three other polio patients in Warm Springs, Georgia. Photo ID: 7755(565). Franklin D. Roosevelt Presidential Library & Museum. 1925. http://www.fdrlibrary.marist.edu/archives/collections/franklin/?p=digitallibrary/digitalcontent&id=2975

The whitening and generalizing of polio victims

While the embodiment of polio in white middle- and upper-class children told other families that “this could be your child, too,” Roosevelt did the same thing for adults, even though he concealed his paralysis. In both cases, the articulation of potential victim calls others to see themselves as possibly at risk and to avoid becoming a victim. Instead of enlivening a dormant identity, as constitutive rhetoric typically does, it calls us to remain something we already are – a non-victim – and asks us to remain so by seeing that we could fall prey to an illness. In both cases, too, we see how the embodiment of victims begins to articulate victimhood as one connected to whiteness and affluence. No longer was polio tied in racist and classist ways to immigrant families and filth; instead, it became an indiscriminate disease, one that attacked everyone from wealthy children in clean neighborhoods to the President of the United States.

Gradually, the disease became more and more of a “white disease” (Rogers 2007), which was formalized when Roosevelt established his whites-only polio rehabilitation center in 1927 as part of the National Foundation for Infantile Paralysis (later known as the March of Dimes) in Warm Springs, Georgia. The whitening of this disease permeated public consciousness so much that scientists themselves relied on a susceptibility argument to rationalize how only white patients would need these services: “Blacks, medical experts insisted, were not susceptible to this disease, and therefore research and treatment efforts that focused on Black patients were neither medically necessary nor fiscally justified” (Rogers 2007, 784). While numbers of Black polio victims were lower, what’s likely is that “Black polio cases were missed as the result of medical racism and neglect: families had limited access to doctors and hospitals, and inadequately trained Black health professionals were unable to diagnose polio’s ambiguous early symptoms” (Rogers 2007, 785). With such poor data collection among Black Americans paired with the disenfranchisement of immigrant communities, polio became a “white disease” that did not in reality attack white families with any significant disproportion; rather, the interpellation of white families and individuals became the means by which those in power saw themselves as potential victims and therefore took action against the disease, as we see in Fig. 2, where a white child in a wheelchair says “I could be your child” while the billboard calls for fundraising efforts. The identity of victim – once a shameful indicator of filth – became a sanitized symbol of the need to fundraise, do research, and develop a vaccine, and it ultimately functioned as a warning to the public not to become like the victims they now pitied instead of scorned.

Fig. 2

Billboard sponsored by the National Foundation for Infantile Paralysis (later called the March of Dimes), California, 1942. Courtesy of Franklin D. Roosevelt Library. http://www.fdrlibrary.marist.edu/archives/collections/franklin/?p=digitallibrary/digitalcontent&id=4281

By 1939, the National Foundation for Infantile Paralysis decided to fund the development of a polio center at the Tuskegee Institute to treat Black patients and train Black providers, making “visible the neglect of disabled patients of color” (Rogers 2007, 790). But as Rogers notes, this was a rather insufficient facility, so by 1945, the foundation began to integrate its programs and facilities, finally acknowledging that there was “no evidence of any racial susceptibility to the disease” (Bynum 1947 cited in Rogers 2007, 791). Gradually, though inadequately, the organization began to rearticulate the identity of victim for Black Americans by featuring Black children on the March of Dimes posters and by supporting their inclusion in the 1954 Salk vaccine trial – important steps, but ones that glossed over the “Jim Crow medicine” (Rogers 2007, 793) that persisted in segregated hospitals and schools. To this day, as John Lynch and Mary Stuckey argue, the memorializing of FDR at his Little White House reinscribes polio with “whiteness” in ways that deflect attention away from the racial inequities during FDR’s presidency (2017, 391).

Overall, then, we see the constituting of potential victims as a shape-shifting form of public health rhetoric that reflects ideological values tied to race and class. As potential victims of polio in tenement housing, immigrant families were reviled for this identity; when white, middle- and upper-class families were called to be potential victims, polio became a “white disease,” which then effectively denied African Americans the identity of potential victims and by extension their needed care and treatment. And when FDR embodied victimhood as a political elite, the risk became even more real for the well-off and we finally saw power and money directed toward this public health threat through FDR’s fundraising balls and the March of Dimes campaigns. The constituting of potential victims, while an important component of public health rhetoric, also reveals to us the cultural values of self-determination, wealth, and whiteness that permeate our treatment of bodies and illness.

AIDS, the good victim and Republican embodiment

In 1981, two years after the eradication of polio in the United States, doctors and the Centers for Disease Control began reporting various aggressive afflictions – sometimes presenting as pneumonia and other times as cancer – in gay men in California and New York (HIV.gov n.d.). Scientists believe transmission of HIV from animals to humans likely happened much earlier, some say as early as 1959, but it was at this moment that HIV entered the public stage. Beginning in May, scientists connected this mysterious condition to gay communities in New York City and San Francisco, as seen in the Morbidity and Mortality Weekly Report in Fig. 3.

Fig. 3

July 3, 1981 Morbidity and Mortality Weekly Report: “Kaposi’s Sarcoma and Pneumocystis Pneumonia Among Homosexual Men – New York City and California,” Sally Hughes AIDS research collection, MSS 2001–04. Archives and Special Collections, University of California, San Francisco. Used with permission

As more cases were reported over the summer in gay men, scientists and outlets frequently began to use language that tied the disease to the identities of its victims. On July 2, 1981, The Bay Area Reporter first mentioned “Gay Men’s Pneumonia;” on July 3, the CDC and New York Times used the term “gay cancer” (HIV.gov n.d.); and the disease was commonly referred to as Gay-Related Immune Deficiency (GRID) and the “gay plague” in the early 1980s. By August of 1981, nearly 100% of the cases were men, and of those, 94% were gay or bisexual, and 40% of the patients had already died (HIV.gov n.d.). President Ronald Reagan took office just six months prior to this initial outbreak in 1981, and his silence on the spread of HIV until 1987 was a deafening nod of inaction to the Christian Right. While his administration eventually contributed funding to research, Reagan let public perception of HIV as a “gay problem” and a “drug user problem” grow. Fanning the flames of Moral Majority rhetoric, HIV became defined “not as a disease but a sin” (Snowden 2019, 436). And even though researchers and most of the media stopped referring to AIDS as “gay cancer” or “GRID” after 1982, the stigma of such labeling remained and told the public that the gay community was at risk, and almost everyone else was not. As Randy Shilts notes, “By its very name, GRID was a homosexual disease, not a disease of babies or their mothers” (1987, 124), something that Paula Treichler (1999) further unpacks in her analysis of how public health research and directives largely ignored women as a population vulnerable to HIV. J. Blake Scott (2003) offers further insight into how public health rhetoric around HIV testing invokes Foucauldian forms of disciplinary power when it constitutes both “normal” and “risky” subjects in ways that potentially perpetuate risk.
When heterosexual, white, monogamous, and non-drug using subjects are categorized by discourse as “normal,” not only do those who fall outside of these conditions become further marginalized and controlled, but those who are “normal” also end up at more risk given the lack of preventative direction to these populations.

Research on media coverage of HIV in the 1980s illustrates how much the media shaped the public agenda around HIV/AIDS as well. Everett Rogers, James Dearing and Soonbum Chang’s study of AIDS and agenda-setting in six prominent news outlets shows that news coverage prior to 1985 was only a tenth as frequent as coverage afterward, largely because at first “the media perceived the disease mainly as a gay story and a scientific story” (1991, 13). Based on their findings, they argue that “Unless the American media’s core constituency of middle-class individuals is perceived to be at risk, a rampant disease like AIDS does not constitute a news story with high news value” (1991, 13). As they note, such coverage changed after 1985, during what they call the “human era” of HIV news coverage.

Given these conditions, a few noteworthy ruptures in this narrative illuminate for us the ways in which the potential victim of HIV became articulated in new ways in public discourse. While other case studies could be included here – for example, Paula Treichler alludes to this constitutive process when discussing Rock Hudson’s embodiment of the “regular guy” as victim or Life magazine’s 1985 cover story “Now No One is Safe from AIDS” (1999) – I look specifically at the ways that two famous cases of HIV – in a child and in a woman – reframed who had the potential to become a victim. Specifically, I analyze here the ways that Ryan White embodied the innocent or good victim of HIV as a child and hemophiliac and how Mary Fisher embodied the white, wealthy, Republican female that few imagined could become a victim. In both cases, this embodiment of victimhood shook the ground of perception of HIV as a “gay disease” in ways that called for everyone to reconsider their own identities as potential victims.

Ryan White, Ronald Reagan and the good victim

On September 18, 1985, President Ronald Reagan discussed AIDS for the first time in public remarks when he was asked at a press conference whether or not he would send his own children to school with a child who had AIDS. Reagan replied that he was “glad I’m not faced with that problem today” and that he could “understand both sides of it” (Boffey 1985), comments that effectively lent credence to the minimal evidence that the disease could be spread through surface contamination. Reagan’s comments came less than a year after Ryan White, a thirteen-year-old boy from Indiana, was diagnosed with the disease in December 1984. While cases of children with HIV had been reported by scientists as early as 1981, they were largely ignored in public discourse until this moment. White, a hemophiliac, was infected with the virus via a blood transfusion, and he took the national stage to show us what victims could look like in a way that reconstituted the subject position of potential victim of AIDS. His mother’s own initial reactions to his diagnosis reflect public perceptions of the illness at the time. As she recalled in an interview, “I felt like, ‘How could he have AIDS?’ He was a hemophiliac since birth, and I just felt like ‘How could he be one of the first ones?’ I felt like somehow, in some way, it was going to be something else. I really never really believed he had AIDS for quite a while” (White Ginder 2016). As national attention grew around White, though, both his family and the nation began to see that victims could look different than what they had assumed. He was “the Indiana teen-ager who put the face of a child on AIDS” (Johnson 1990), making real for families around the nation that this virus could come for anyone – and to underscore this familiarity, his depictions in the media often showed him doing the everyday things of childhood (Fig. 4).

Fig. 4

Ryan White pictured in his bedroom reading and surrounded by toys. Photo permissions granted by the Health Resources & Services Administration’s Ryan White HIV/AIDS program. Who Was Ryan White? | HIV/AIDS Bureau (hrsa.gov)

White’s case was a particularly interesting example of constitutive rhetoric within infectious disease discourse. While initial associations with HIV had led the public to dismiss it as a disease that only impacted the gay community, when White’s case made national headlines, the response was an overzealous frenzy of paranoia that resulted in him being denied entry into his school. Parents and teachers worried: could White or others like him infect his classmates? What if they shared a bathroom? What if he spit near them? Or bit them? After a hard-won court battle to allow him to attend public school, life did not get better. As White shared with the President’s Commission on AIDS in 1988, the lack of public education on AIDS made him the victim of jokes, lies, vandalism, homophobic name-calling, and people’s refusal to be near him or shake his hand (White 1988). White’s testimony showed that the public response had shifted from apathy to a hysteria in which anyone was a potential victim if they were in the vicinity of an AIDS victim. Reagan’s remarks in 1985 bolstered such suspicions, leaving the public to rely more on fear and less on science. The way White contracted the disease – through blood transfusion – showed that the illness could spread through means other than the “sins” of drugs and sex that Reagan and others had come to rely on in their rationalization of ignoring the epidemic. But “When the nation was still grappling with homophobia, unsubstantiated fears of how the virus was transmitted, and a great deal of prejudice towards a growing number of terribly sick individuals, Ryan White’s case became a national antidote” (Markel 2016). Part of that had to do with the fact that White lived longer than doctors had predicted – he died in 1990, and in the intervening years he was able to change such problematic perceptions of himself and other HIV victims as dangerous vectors of the disease.
Given what Rogers, Dearing and Chang found regarding the increase in coverage of HIV after White’s case was made public in 1985, it’s perhaps no surprise that funding soon followed as well: 190 million dollars from Congress in 1985, 17.2 million dollars from the Robert Wood Johnson Foundation in 1986, the U.S. Health Resources and Services Administration’s AIDS Service Demonstration Grants program in 1986, and, in 1990, the Ryan White Comprehensive AIDS Resources Emergency (CARE) Act, through which Congress granted 220.5 million dollars to HIV care and treatment (“A Timeline”).

But for most of White’s life with AIDS, Reagan continued to remain silent on the matter until his 1987 remarks to the American Foundation for AIDS Research, where he ironically called for “urgency” on the issue after around 21,000 Americans had already died. Much of the speech focused on Reagan’s commitment to funding research and developing treatments and a vaccine, but he likewise attempted to articulate for the public that this issue was not just one for the gay community and drug users: “…I don’t want Americans to think AIDS simply affects only certain groups. AIDS affects all of us.” In universalizing the illness this way, Reagan then shifted our attention and testing needs to “those who seek marriage licenses” (Reagan 1987). In doing so, Reagan effectively cut the gay community out of this call for preventative measures, given that gay marriage was decades away from becoming legalized in the US. This is further underscored by his emphasis on “good” decision-making: when he claimed that AIDS education would “not be value-neutral,” he implied an ideological commitment to responsibility and autonomy on the part of individuals rather than speaking to the larger systemic issues that drive people to drug use or that limit access to safe sex education. For Reagan, getting AIDS was a moral decision – and making the right choices in life would result in our protection. The CDC’s America Responds to AIDS campaign that ran from 1987–1996 further pushed the universalizing “everyone is at risk” messaging, but in a way that diverted funds away from the populations most in need, “leaving gay and minority communities to compete with one another for the little money that remained” (Geiling 2013). By shifting the potential victim population from the gay community to a straight, marriage-seeking population, Reagan’s potential victim is the one that affirms the primacy of heteronormative culture.
Capitalizing on the existence of good victims like Ryan White, Reagan effectively told the world that while everyone should now see themselves at risk, some potential victims were more valued than others.

Whiteness, wealth and the unexpected victim

While the public was shaken by the example of Ryan White, cases caused by blood transfusion were generally less publicized, and HIV therefore remained stigmatized as a “gay disease.” During the 1992 Republican National Convention for George H. W. Bush’s reelection, however, the story changed. Taking the stage that night, Mary Fisher, a young, white, Republican mother from a well-off family, told the world she was one of the 150,000 Americans who had AIDS (Fig. 5). At this moment, the call for public health became one that was not focused on the alleged immorality of the gay community or drug users; rather, Fisher embodied the virus in a way that allowed her fellow, largely white, Republicans to see themselves as both like her and like other victims. They could all be victims in that room, as Fisher’s words so poignantly articulated. Fisher’s embodiment of HIV could be seen as part of a period of news coverage about HIV and AIDS that focused on politics and policy – Rogers et al. (1991) identify this period as taking place from 1987–1988, when there was a political focus on testing, but Fisher’s speech in 1992 could be seen as an extension of this policy-centered discourse about HIV.2

Fig. 5

Mary Fisher at the 1992 Republican National Convention in Houston, TX. Photo permissions granted by the Mary Fisher/Abataka Foundation, Inc

Beyond Fisher’s visual embodiment of the unexpected victim – straight, affluent, female – her message to her fellow Republicans served as a warning of what they could become. We see her at different moments interpellate the audience and awaken a new subject position in them. Speaking of the virus, she says, “… the AIDS virus is not a political creature. It does not care whether you are Democrat or Republican; it does not ask whether you are black or white, male or female, gay or straight, young or old” and underscores the indiscriminate nature of the virus: “HIV asks only one thing of those it attacks. Are you human?” (Fisher 1992). Fisher then calls attention to her own identity as one that might not be expected to inhabit the HIV-victim subject position: “Because I was not hemophiliac, I was not at risk. Because I was not gay, I was not at risk. Because I did not inject drugs, I was not at risk” (1992). And yet, there she stood, a victim of the disease. She then pleads with the audience to see how they too are at risk, every single one of them: “If you believe you are safe, you are at risk. If you do not see this killer stalking your children, look again. There is no family or community, no race or religion, no place left in America that is safe. Until we genuinely embrace this message, we are a nation at risk” (1992). In one of the most celebrated speeches of the twentieth century, Fisher embodies the unexpected but takes her audience to that place of embodiment too. While she is already a victim, the largely white, Republican audience in the room is really no different from her. Her words and presence ask them to see in themselves what they see in her – they are all potential victims of this virus, and therefore, should do something about it.

While mortality rates from AIDS have improved dramatically since the 1980s and 1990s, the virus still continues to wreak havoc on the public. Around 1.2 million people in the US are currently living with HIV, and 14% of them are unaware they have it (HIV.gov 2021). Given the virulence of the illness, the need for more of the public to realize their identity as potential victims is gravely important. And, at the same time, we need to recognize how the ideologically driven constituting of good victims like Ryan White or heteronormative victims like Mary Fisher reflects cultural presumptions about who gets care and which illnesses demand our attention.

Constituting vulnerability in the COVID-19 pandemic

The COVID-19 pandemic offered a set of circumstances where we saw potential victims charged with both new and old meanings. As scientists and doctors quickly learned in 2020, the disease was particularly ruthless with the elderly, as evidenced by the first major outbreak and death in the US at a Kirkland, Washington nursing home, where one hundred sixty-seven residents, staff and visitors were infected between February and March. As more outbreaks occurred at nursing homes across the world – which were quickly seen to be “tinderboxes” for the virus – news coverage began showing dramatic footage of families separated, unable to see elderly parents in their homes, and of the elderly living in fear and isolation (Fig. 6). On March 11, 2020, when the World Health Organization declared COVID-19 a global pandemic, the Secretary General of the United Nations, Antonio Guterres, demanded that “we must show solidarity with the most vulnerable – the elderly, the sick, those without reliable healthcare, and those on the edge of poverty” (Guterres 2020). These early images and messages reflected the reality that the elderly were among the most susceptible to serious disease from the virus; but they also contributed to a narrative of invulnerability for those who didn’t look like them.

Fig. 6

Scott Morrow visits his mother, Claudette Stasik, at the Bria of Geneva nursing home in the western suburbs of Chicago. Since mid-April, seventy-five of the nursing home’s ninety-one residents and thirty-seven of its one hundred twenty workers have tested positive for the coronavirus. (Anjali Pinto for ProPublica) – used with permission. A Quarter of the Residents at This Nursing Home Died From COVID-19. Families Want Answers. — ProPublica

To add to this narrative of invulnerability, news coverage in March and April 2020 also showed crowded spring breaks where young people decried fear over the virus – to them, it would seem, the virus wouldn’t harm them – so why should they worry about it? One viral video of college students being interviewed by Reuters demonstrated this lack of regard – perhaps most famously, Brady Sluder from Ohio remarked, “If I get corona, I get corona…At the end of the day, I'm not going to let it stop me from partying” (Bella 2020). Sluder has since apologized for his remark, but the sentiment has remained in public discourse that those who are not at heightened risk for COVID should be able to go about their lives without restrictions or fear; the idea that the virus can’t stop Sluder – or other young spring breakers – from partying shows this sense of invulnerability, not to the virus exactly, but to debilitating and life-threatening disease from it. From the start of the pandemic, then, we see the idea of potential victim take a different form – rather than invoking realization and fear about what could become, the potential victims who saw themselves as low-risk sometimes simply did not care if they got it.

In addition to disregarding the elderly, apathy toward others was highly racialized as well. President Donald Trump – among others – repeatedly referred to the virus as the “Wuhan virus” or the “Chinese virus,” something that Sheng Zou notes is a strategy to confine the virus to the Other rather than acknowledging the susceptibility of everyone: “The deliberate use of such names reflects an attempt to spatialize and racialize the disease not only to deflect blame but also to symbolically confine it to an externalized Other” (2021, 524). While blame and confinement functioned to construct a binary of invulnerable/vulnerable that mapped onto colonialist narratives of West/East, this same binary reflected the systemic issues of race, class and healthcare in the US. Broadly, systemic restrictions on access to healthcare, jobs, housing, and more produced rates of hospitalization and death among most non-white demographics that were two or more times those of white people (CDC 2021). When we see SARS-CoV-2 as “a pathogen feared for not discriminating while it exposes those systems that do” (Kennedy 2020, 286), we likewise see how the shirking of social responsibility by Sluder, Trump and others is a direct rejection of people’s worth and dignity.

To underscore this lack of regard for others, it is important to note that researchers also knew very quickly what kind of virus we were dealing with – unlike polio or HIV, which took much longer to understand. This meant the public had a reasonable set of precautions to take as researchers learned whether the virus was transmitted through surfaces or the air, how it attacked the lungs, and more. Wearing masks, staying home, and keeping physical distance all became both calls to become, and signifiers of, the socially responsible citizen, who was asked to act not just because the virus might attack them and they might be potential victims, but because their actions might protect others. As Jeffrey Bennett notes, implicit in public health directives are moral mandates about good/bad citizens: “The binary foretells the existence of good subjects, and the failure to meet those expectations implies poor citizenship” (2021, 351), something that is seen by some as a threat to their identities and liberties. Social responsibility of course figured heavily in the AIDS epidemic as well once transmission was better understood, with calls for safe sex and clean needles; with polio, too, social responsibility was present, but more in the form of contributing to the March of Dimes to help those who had already been afflicted. But with COVID-19, the call to see oneself as a socially responsible citizen came so quickly that it may have pulled too abruptly away from concerns about one’s self. In effect, the call for social responsibility with COVID-19 embodied “…the paradigmatic clash between communitas and immunitas” (Esposito 2012, 49) – the tension between care for the common good and our autonomy – in a way where potential victimhood was not yet constituted for all.

Even as public figures were infected, little seemed to move the skeptical public. When President Donald Trump was infected in fall 2020, one might have thought his embodiment of the disease, as with Roosevelt and polio, would reframe how the public saw potential victimhood. But his remarks and actions – even as someone in his seventies and therefore more susceptible to the disease – minimized the virus and framed public health measures as an overreaction. Rather than advocating for a more robust public health response, Trump exploited what David Garland describes as “…the problem of trust, and the public’s relation to ‘the authorities’” (2003, 58) among those who see a clashing of interests between their government and themselves. Latkin et al.’s recent study (2020) affirms this decline in trust – in a longitudinal study of trust and public health information sources, the authors found that trust regarding COVID-19 information from the media, public health organizations, the White House and other sources all declined during a four-month period early in the pandemic. And to compound this issue of trust and risk communication, COVID-19 is the first pandemic of the information age, meaning that those who were already skeptical are now equipped with more pundits, YouTube videos, anti-vaccine memes on social media, and more, most of which serve to promote fear – not of getting sick but of limits to our autonomy. As Ian Hacking notes, “…what we really fear after brute physical injury is the invasion of our privacy” (2003, 38), which aligned precisely with the calls for personal liberties and rejections of public health mandates for masks and vaccines. Undoubtedly, “risk codes danger as a threat to liberty” (Ericson and Doyle 2003, 5), and through maximum amplification of such alleged threats on social media, the awakening of the potential victim identity in some subjects will never happen.

Conclusion

There are two primary conclusions to draw from these case studies. The first has to do with how we conceptualize the role of constitutive rhetoric in public health discourse. With Charland’s definition of constitutive rhetoric, we can easily imagine language that attempts to awaken an identity, like the kind we see in pharmaceutical advertisements that articulate a subject the viewer might identify with (the grandma with arthritis enjoying her grandkids, the young adult suffering from situational anxiety, etc.). “This is you,” the ads call. But with public health rhetoric focused on preventing the spread of communicable illness, constitutive rhetoric takes a different shape. In cases where diseases appear to target particular populations, other groups still need to be invited to see themselves as potential victims given that they can still carry and spread the illness, let alone suffer health consequences themselves. What we see in the above cases, then, are constitutive appeals that effectively ask us to do what we can to remain how we are rather than engage in behavior that makes us become victims, showing us that the constitutive function of this rhetoric names us as potential victims and asks us to remain in the identity of non-victim at the same time. The tensions within these cases of health prevention rhetoric augment our understanding of both public health rhetoric and constitutive rhetoric. They also, by extension, help us better understand why calls for behavior change fail to land with science denialists and conspiracy theorists – for many, such calls are about not just actions but identities as well. Knowing how we are articulated and emerge as subjects within health communication shows us not just how identities tied to our health are awakened in us, but also how we are called to not become something.

Secondly, the constituting of potential victim as an identity is ideologically fraught. In the cases of polio, AIDS, and COVID-19, we see marginalized populations – poor and immigrant communities, African Americans, the gay community, and the elderly – all stigmatized, left without care, and rejected by an affluent, white, youth-centric culture because of either their associations with an illness or their fictionalized immunity to it. But when we see constitutive rhetoric function to articulate potential victimhood for affluent, white, heteronormative or young people, we see how calls for action, research, and policy are responses to the concerns of those at the center, not the margins, of society. To create action, it seems, we have had to repeatedly show the privileged that they too can be victims – and the need to do so has had devastating consequences. To conclude, Anthony Fauci’s words to Congress on the apathy of young adults contracting COVID-19 underscore our need to see not just ourselves but others as potential victims: “You should care not only for yourself, but for the impact you might have on the dynamics of the outbreak” (Blake 2020). Perhaps we can one day hope for narratives that no longer have to remind the public of this moral obligation.