Right after completing my doctorate, I took a job as a political reporter. The pay was lousy, the position had little to do with the health sciences, and the newspaper, though respected, wasn’t big enough to compete with the (then) two local dailies and major television stations. I worked odd hours and wore many hats, from writer and copy editor to production artist and all-around assistant. It was, perhaps, my “year of the intern.” And I loved every second of it.

I don’t know if I could say the same had I worked at a major publication. For one thing, no major publication would have hired me. Though I have been tied to a newspaper since the age of fifteen (thanks to English teacher Bonnie Kelly and all the open-minded editors who came after), the “real” world doesn’t often know what to do with an academic. Many think we don’t know an honest day’s work, that we’re all theory and no praxis, and that in general we’re just out of touch. To be sure, there are some who fit this bill, but the percentage likely isn’t much greater than in any other profession. That said, I will be the first to admit that I am probably not “major daily” material. I am not a fast writer (the old adage about “opening veins” better describes my ever-tortured state), and I struggle with brevity (as anyone who reads these JBI editorials already knows!).

On the other hand, I probably wouldn’t have chosen to work for them. At The Colorado Statesman (and papers such as the Intermountain Jewish News and the Tucson Weekly), I could be the queen of the 1,500-plus-word article, and the weekly format and special supplements of these publications allowed for more research and reflection. My editor at the IJN in Denver, when asked by writers how many column-inches an article should be, would always respond, “As long as it takes to tell the story.” I also have been fortunate to work for editors with advanced degrees in law, religion, political science, and literature, although such editors exist at the dailies, too. But what I appreciated most was a commitment to journalistic independence—perhaps easier to come by at smaller papers, since the lawmakers and the rainmakers hardly ever vie for your attention and few of the other print and TV reporters see you as real competition. True independence is probably not possible, but a journalist who is contemplative and transparent about his or her biases, uninterested in power and favours, and sceptical as a matter of course has at least a shot. (And for those who think journalists must kowtow to advertisers, another editor often reminded me that though advertisers help fund our salaries, our articles provide them a visible and attractive conveyance. Otherwise, you’d see more “pennysavers.”)

As with all the publications in my life, I was proud to be a part of The Colorado Statesman, in particular because it is purposely non-partisan, focusing mainly on local and state politics and never endorsing candidates. While I tend to lean liberal (not a big surprise for an anthropologist interested in ethics), I have never been wont to fully buy into any institution or ideal, choosing instead to ever orbit from an outer ring. And when it comes to modern-day U.S. politics, it’s not that hard to be non-partisan: I am not sure I can stand either side.

Walking to work most days, I spent my “intern” year in an office half a block from the state capitol and tried to navigate its corridors, and the issues of the day, as often as possible. While it was exciting to chase after elected leaders and get to know them face-to-face, my sense of importance came from doing a job intended to serve the public, being the (perhaps mythical) “watchdog” of the government.

Unfortunately, my love for politics and policy, and even journalism, has waned. The people in my life once tired of my never-ending wonk conversation—my being and mind in rhythm with the latest bills and laws and chamber goings-on pumped out of the heart of the body politic—but those days are long gone. Physically and occupationally distanced from any capitol steps, I have tried to keep at least an eye on the political pond and even dip a toe back in every now and then. But I find myself veering in the direction of those around me now, who have given up watching political “news” and even the Comedy Central cable program The Daily Show. As I write this, the first U.S. 2016 Republican primary debates took place last night on Fox News—even though there are still fifteen months before the actual presidential election—and much-loved Daily Show host Jon Stewart signed off after sixteen and a half years in the position, one in which he has ironically become one of “the most trusted [journalists] in America” (Kakutani 2008, article headline; see also Pew Research Center 2007). Watching his last few months of shows, however, it is clear that he, too, can’t quite stomach any of it any more. (He’s even giddier than I was rotating off of the university faculty senate.)

It’s not hard to see why. And none of this should come as a surprise. Neil Postman (1985) warned us all three decades ago, but we have been, to use Postman’s phrase, “amusing ourselves to death” ever since. And we’re only getting better at it. This year marks the thirtieth anniversary of the publication of Postman’s book, which was decidedly ahead of its time. Perhaps too much so. The full weight of Postman’s premise is probably only becoming clear now. But even in 1985, Postman’s warning came too late. The “medium-metaphor” of modern culture had already changed.

Postman’s thesis, like so many groundbreaking ideas, is so simple and obvious—though identified and argued with Postman’s erudition and signature panache—that it was right there in front of our faces all along. Like the frog in warming water, we are swimming in example after example, yet we have failed to understand and pay attention to the environment we now live in. (Loved ones in my life now, like those during my Statesman year, physically grimace every time I utter the first syllable of Postman’s name. If he were a wrestler, his stage name would be “The Hammer,” prevailing in a society where there is no dearth of nails.)

Amusing Ourselves to Death began, Postman concedes, with a “gimmick.” The year 1984 came and went, he explains, with none of the dystopian, totalitarian horrors (at least in certain parts of the more democratic world) that George Orwell depicted thirty-five years earlier. “We were keeping our eye on 1984,” Postman wrote in 1985. “When the year came and the prophecy didn’t, thoughtful Americans sang softly in praise of themselves. The roots of liberal democracy had held. Wherever else the terror had happened, we, at least, had not been visited by Orwellian nightmares” (Postman 2006, xix). But it was not merely Orwell’s 1984 that we should have been watching (and, it can be readily argued, we have allowed such “dark vision[s]” to become waking realities in many regions without so much as the bat of an eyelid). “[T]here was another—slightly older, slightly less well known, equally chilling” forecast for the future: Aldous Huxley’s Brave New World (Postman 2006, xix). Postman elucidates:

Contrary to common belief even among the educated, Huxley and Orwell did not prophesy the same thing. Orwell warns that we will be overcome by an externally imposed oppression. But in Huxley’s vision, no Big Brother is required to deprive people of their autonomy, maturity and history. As he saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think. …

What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. … Huxley feared those who would give us so much [information] that we would be reduced to passivity and egoism. … Huxley feared the truth would be drowned in a sea of irrelevance … [and that] we would become a trivial culture, preoccupied with some equivalent of the feelies, the orgy porgy, and the centrifugal bumblepuppy. As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny “failed to take into account man’s almost infinite appetite for distractions.” … In Brave New World, [people] are controlled by inflicting pleasure. In short, Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us (Postman 2006, xix–xx).

And, sadly, so it seems. Who knows if Postman, presciently looking ahead, anticipated the web-networked world, the suffocating swath of “selfies” and Facebook updates and tweets, the greedily self-imposed panopticon that has been solidly built on consumer culture? When he was writing, the seeds of these technologies and practices were certainly present; now in full bloom, they have just upped the ante on Postman’s wager. In 1985, Postman didn’t worry about “junk” TV—saying “[t]he best things on television are its junk, and no one and nothing is seriously threatened by it” (Postman 2006, 16, emphasis original)—and in 2015, he might not be so concerned with individuals’ obsessive navel-gazing, either.

There also are certainly positives to enabling more voices to be heard and to increasing the ease of interaction. It was, and is, however, the use of these media as the prime vehicle for “important cultural conversations” that frightened Postman (2006, 16), and in this he has been chillingly correct. We should all be shaking in our boots, demanding from our political, media, and academic leaders something other than de-contextualized titbits of shallow, fleeting “facts” for knowledge. We have, after all, multitudes of channels, increasing means of access, and twenty-four-hour broadcasting. Yet, thanks to the rhetoric achievable via television (et al.) technology, we are often more focused on in-the-moment trivia, stripped of milieu and isolated from history like useless quanta, packaged to fit between commercial breaks or within 140 characters. In this medium-metaphor, we rarely connect the “now” with the past, and there isn’t enough time (as we are oft reminded) to contemplate what this means for the future and anticipate potential consequences. Neither looking back nor forward, we are engaged in the facsimile of conversation and even unconsciously mistake mediated encounters for real social interaction (Reeves and Nass 1996). Whether or not social media leave us isolated, depressed, distracted, and always out of time (the research is still out on many of the health risks and benefits), we are encouraged to focus on superficial flubs and snubs—but only until the next one surfaces—and to be content with sound bites, flimsy quick fixes, and a never-ending line of consumables. We certainly can’t recognize that we are shaking in our boots when we are constantly being told there is new footwear for sale. (Or, with the move towards living room settings and even glass desks, what legs fit into this fancy footwear on the newsroom dais [see, e.g., Dimiero and Hananoki 2014].)

But, Postman argues, this medium-metaphor cannot be any other way; we only dig a deeper hole by trying to use it for education and serious conversation. If he is correct, the ramifications are severe, particularly as our modern medium-metaphor expands and we gladly integrate it ever more deeply into our political, journalistic, and university institutions. As Postman cautions,

we do not measure a culture by its output of undisguised trivialities but by what it claims as significant. Therein is our problem, for television [or now social media] is at its most trivial and, therefore, most dangerous when its aspirations are high. … The irony here is that this is what intellectuals and critics are constantly urging television [and such technology] to do (Postman 2006, 16).

Thus the recent political debate on Fox News (or any of the countless others to come in the United States or elsewhere) is anything but a debate. Every modern candidate knows that, whenever queried, the best strategy is to deny the premise of the question and respond instead with talking points on whatever subject is preferable—using a political prestidigitation so that the sleight-of-hand goes unnoticed—and to do so concisely. It is not necessarily the fault of the politicians and their “handlers,” however; the medium-metaphor, designed to attend to ratings, advertisers, and a populace that now rarely reads, prevents anything else. With seventeen Republican candidates in the running (at present count), Fox split the debate in two: seven debated to an empty arena for one hour (with four commercial breaks) at five o’clock, and ten took the stage for two hours before a live audience at nine o’clock. The rules were “simple,” as Bret Baier, Fox News chief political anchor, explained. “One minute for answers, 30 seconds for follow-ups. And if a candidate runs over, you’ll hear this [bell chimes]. Pleasant, no?” (Washington Post Staff 2015, ¶40–¶41). In Amusing Ourselves to Death, Postman juxtaposes today’s type of set-up with the 1858 Lincoln–Douglas debates, as Abraham Lincoln and Stephen A. Douglas vied to represent Illinois in the U.S. Senate. This was a series of seven debates throughout the state, each of which was planned to last three hours (one hour for the opening candidate, an hour and a half for reply by the other, and a half hour for rebuttal), although this arrangement

was considerably shorter than those to which the two men were accustomed. … For example, [in an earlier encounter] on October 16, 1854, in Peoria, Illinois, Douglas delivered a three-hour address to which Lincoln, by agreement, was to respond. When Lincoln’s turn came, he reminded the audience that it was already 5 p.m., that he would probably require as much time as Douglas and that Douglas was still scheduled for a rebuttal. He proposed, therefore, that the audience go home, have dinner, and return refreshed for four more hours of talk. The audience amiably agreed, and matters proceeded as Lincoln had outlined (Postman 2006, 44).

In comparison, nothing can be said, let alone debated and discussed, in one minute’s time, and all that the public can hope to hear from seventeen candidates during a not quite three-hour window amounts to little more than a string of tweets. Unlike others, including Jon Stewart, I am not raising an eyebrow here or poking fun at the crowdedness of the primary field; I think we should always have many options, in the primaries and in general elections, beyond two or three candidates. What is problematic is the format we have chosen, influenced by our medium-metaphor, within which to discuss the most pressing issues of the day.

And Fox is not alone. The debates that will be hosted by others won’t be much, if at all, different, and what we have come to think of as “post-game” coverage has been one-dimensional and thin. For example, in an article published the following day, CNN “asked a range of contributors to give their take,” which seems to have been a request to write a few paragraphs on who was the “winner” and who was the “loser” (CNN 2015, ¶1). The article also included studio-style photographs of each of the thirteen commentators (as if they, too, are part of the “story”), as well as multiple typos. Granted, these were mostly ones that perhaps few but other editors would notice, but they suggest, along with the content, that publishing this piece as soon as possible was more important than ensuring depth, consistency, and accuracy. Again, this is what our medium-metaphor both demands and feeds on. (And, thus, we might ask, “What else could an article like this have been?” Perhaps a refusal to participate by all invited until there is something real to write about? To be fair, two of the respondents named “women” and “American democracy” as the real losers.) The academic literature also isn’t immune. Researchers are being pressured to submit papers quickly, peer reviewers to turn around assessments in less than a month’s time, editors to truncate final decision-making and editing processes, and publishers to disseminate rapidly and widely.

And CNN is not alone. Much of the post-debate talk has centred on a negligible tête-à-tête, or at least the wrong aspects of it, between blustery businessman Donald Trump and Fox News anchor and debate moderator Megyn Kelly, with the former calling the latter “very hostile and unprofessional” for asking pointed questions (Wattles and Stelter 2015, ¶5). The exchange grew from there and took over the airwaves (in the United States, but even overseas), with headlines like “Donald Trump late-night angry-tweets Megyn Kelly, and it is epic” (Sinderbrand 2015), “Megyn Kelly Gets Last Word in Donald Trump Comment Kerfuffle” (de Moraes 2015), “Donald Trump supporters reportedly flooded Megyn Kelly with death threats following debate” (Tesfaye 2015), “Trump: Megyn Kelly’s ‘Surprising’ Vacation Due to Debate Flap” (Hoffmann 2015), and “Megyn Kelly vs. Donald Trump will go another round” (Groden 2015).

These are but silly (and simultaneously not-so-silly) distractions. Our leaders and candidates should be welcoming tougher questions, not whining about them, and we should be demanding the reporting of in-depth dialogue—while being eagerly willing to attend to it ourselves. Looking back at 1854, Postman asks, “What kind of audience was this? Who were these people who could so cheerfully accommodate themselves to seven hours of oratory?” (2006, 44). Today, even guidelines for information about academic programs on a university website prioritize brevity and promotion over substance (with recommendations like “Make Your Copy Scannable”; use “50 to 150 words for a home page, 200 to 300 words for a landing page”; use “one idea per paragraph and keep it short”; use “bulleted lists”; “[u]se colloquial wording”; and “be warm and friendly … [i]nstead of sounding authoritative” [Content Manager Resource Guide, pers. comm., under “Writing for the Web”]).

But, to continue this one example in a much-polluted sea, Trump isn’t the only one in the wrong. First, Fox News asked the candidates to make a (rather undemocratic) pledge on stage that, if not later chosen as the Republican nominee, they would “not run an independent campaign against that person … [because] experts say an independent run would almost certainly hand the race over to the Democrats and likely another Clinton” (Washington Post Staff 2015, ¶49–¶57). Second, reporters like Kelly frequently fail to probe deeper with even obvious follow-ups. (And not just for reasons of time: doing so might make more visible the biases and illogicalities of media producers and their viewers that could destabilize entrenched stances on complex issues. This is one reason why no news outlet should be allowed to unabashedly take a slant or endorse political candidates.) For instance, in questioning Wisconsin Governor Scott Walker about his views on abortion, Kelly asked:

You’ve consistently said that you want to make abortion illegal even in cases of rape, incest, or to save the life of the mother. You recently signed an abortion law in Wisconsin that does have an exception for the mother’s life, but you’re on the record as having objected to it. Would you really let a mother die rather than have an abortion, and … are you too out of the mainstream on this issue to win the general election? (Washington Post Staff 2015, ¶142).

Walker responded by emphasizing his belief that “that is an unborn child that’s in need of protection out there” (Washington Post Staff 2015, ¶144). The moderators, who included Baier and Fox anchor Chris Wallace, then asked former Arkansas Governor Mike Huckabee to comment on “social issues” such as “same sex marriage” and being in “favor [of] a constitutional amendment banning abortions” (Washington Post Staff 2015, ¶149). Similar to Walker, Huckabee asserted (based on DNA evidence) “that we clearly know that that baby inside the mother’s womb is a person at the moment of conception” but “that we just continue to ignore the personhood of the individual … [and] that unborn child’s Fifth and 14th Amendment rights for due process and equal protection under the law. … It’s time that we recognize the Supreme Court is not the supreme being” (Washington Post Staff 2015, ¶151–¶153). Rather than posing any follow-up questions to either candidate regarding the complex subject of personhood, the problematic contradictions in these statements (e.g., that we should safeguard constitutional rights but that the body charged with doing so has no bearing on this matter), or whether American society should ensure real access to healthcare, paid family leave, quality education, genuine safety nets, etc., as part of this “protection out there,” the moderators simply moved on, turning abruptly to questions about “ISIS.”

What we are left with, then, is entertainment masquerading as significance. Donald Trump’s candidacy, as unnecessarily hyperbolic as it is, at least throws some light on that. And if anyone should wonder whether Postman got at least this much right, at the beginning of the CNN “who won?” article there is a seventeen-photo slideshow of the “[t]op quotes from the Republican debate.” The first, always visible unless clicked, is of Trump avowing, “What I say is what I say” (CNN 2015)—an ineloquent begging of the question that feigns transparency wrapped in nonsense … and not worthy of quoting. (Instead, perhaps it should be neurosurgeon Ben Carson’s quip, thrown at Hillary Clinton but applicable to them all, that they “count[] on the fact that people are uninformed.” Postman says it better, of course: that our medium-metaphor and all those helping to bolster it “deprive people of their autonomy, maturity and history” [Postman 2006, xix].)

This circus-camouflaged-as-seriousness does not occur only in infrequent political debates but everywhere, every day, around the globe. We have lost all sense of content and context, and this affects our lives and health and those of the body politic, and thus of our communities, as well. One need only watch a few shows with medical correspondents or hosted by medical professionals, who within the current medium-metaphor can only offer discrete solutions to distinct problems—whether an overactive thyroid, an underactive libido, diabetes, depression, allergies, acne, anaemia, etc.—unconnected to a broader picture or even the previous day’s counsel. It is no wonder, then, that most of us—even those fortunate enough to have access to certain capital and other forms of power—feel overwhelmed and unable to improve the status of our lives.

Postman is probably right—in both his thesis about what has occurred and his ultimately inconclusive conclusions about what can be done. “We must … not,” he says, “delude ourselves with preposterous notions such as … [a] straight Luddite position,” for to suggest “shut[ting] down any part of [our] technological apparatus … is to make no suggestion at all” (Postman 2006, 158). But after thirty years, not only has the current medium-metaphor refused to budge, it has grown in size and ferocity. Even when the personnel behind programs like The Daily Show (and others) comb media archives and pose bold questions in order to develop stories that clearly connect what leaders, institutions, journalists, and scientists said and did yesterday with what they say and do today, and thus bring inconsistencies and inanities into the light, the bulb of truth is quickly dimmed. It is amusing, if unsettling, then dismissed as ephemera, and it appears that no “real” news teams or politicians fear this attention for very long.

But Postman probably would be proud of Jon Stewart’s final words to “camera three”:

Bullshit is everywhere. There is very little that you will encounter in life that has not been, in some ways, infused with bullshit. Not all of it bad. Your general, day-to-day, organic, free-range bullshit is often necessary. Or at the least very innocuous. “Oh, what a beautiful baby. I’m sure he’ll grow into that head.” That kind of bullshit in many ways provides important social-contract fertilizer and keeps people from making each other cry all day. But then there’s the more pernicious bullshit. Your premeditated, institutional bullshit, designed to obscure and distract (Stewart 2015, 00:54–01:37).

Stewart warns that this “comes in three basic flavors. One, making bad things sound like good things,” like the Patriot Act or the Freedom of Marriage Act; two, “[h]iding the bad things under mountains of bullshit,” like the multitude of Tolstoy-esque user agreements to which we consent with a simple click; and “finally, finally … the bullshit of infinite possibility. These bullshitters cover their unwillingness to act under the guise of unending inquiry. We can’t do anything because we don’t yet know everything” (Stewart 2015, 01:40–03:50).

Stewart leaves his post perhaps a bit more optimistic than Postman. For Stewart, “the good news is this. Bullshitters have gotten pretty lazy. And their work is easily detected” (Stewart 2015, 04:02–04:10), but Stewart’s own show and Postman’s Amusing Ourselves to Death demonstrate that the modern medium-metaphor can withstand “easy detection.” Postman suggested, at wit’s end, that we might “[r]equire all political commercials to be preceded by a short statement to the effect that common sense has determined that watching political commercials is hazardous to the intellectual health of the community” (2006, 159). Sadly, the 2002 Bipartisan Campaign Reform Act (also known as the McCain–Feingold Act), which requires candidates, interest groups, and political parties to add an “I approve this message” statement to every advertisement, has made only memes—and not a dent of difference—in our discourse.

All hope is perhaps not lost, but, as with Stewart’s, we should not keep failing to heed Postman’s final words:

The problem, in any case, does not reside in what people watch. The problem is in that we watch. The solution must be found in how we watch. … What I suggest here as a solution is what Aldous Huxley suggested, as well. And I can do no better than he. He believed with H. G. Wells that we are in a race between education and disaster, and he wrote continuously about the necessity of our understanding the politics and epistemology of media. For in the end, he was trying to tell us that what afflicted the people in Brave New World was not that they were laughing instead of thinking, but that they did not know what they were laughing about and why they had stopped thinking (Postman 2006, 160–163, emphasis original).