Abstract
The invention of agriculture 12,000 years ago has been called the worst mistake in human history. Alongside the social, political, and technological innovations that stemmed from it came a litany of drawbacks, ranging from social inequality and a decline in human health to the concentration of power in the hands of a few. Millennia after the invention of agriculture, another revolution—the digital revolution—is having a similar impact on humanity, albeit at a scale and speed measured in decades. Despite the tremendous advances brought about by this revolution, there is today a rapidly expanding gulf within and between societies along technological lines; alarming effects on sociality and cognition due to a persistent online presence; and a concentration of power in the form of wealth and data within a handful of tech companies, the likes of which have never been seen before. While the effects of agriculture can now be discerned with thousands of years of hindsight, those of the digital revolution can be witnessed in real time. Is the digital revolution paving the way for a more equitable and stable world, or is it leading humanity down a road that will prove to be more detrimental the more ensconced in technology we become?
1 Introduction
In 1987, the geographer and physiologist Jared Diamond wrote an article titled “The Worst Mistake in the History of the Human Race”, outlining the consequences of our shift from a hunter-gatherer existence to subsistence agriculture, a shift he described as “a catastrophe from which we have never recovered” (1987). The move away from a nomadic, hunter-gatherer lifestyle roughly 12,000 years ago to a settled, farming one brought a stable source of calories and an increase in population, and laid the foundation for trade and innovation, all of which shaped the world in which we live today.
And yet, it also brought untold changes that proved irreversible: social inequality through divisions of labor (Shin et al. 2012); deforestation and soil erosion (Wright and White 1996); an array of diseases and malnutrition that locked societies into reliance on a far more limited diet than their hunter-gatherer ancestors enjoyed (Martin and Goodman 2002); as well as the concentration of political and ritual authority (Laneri et al. 2015).
In a later work, Diamond noted, “If you could choose between being a middle-class American, a Bushman hunter, and peasant farmer in Ethiopia, the first choice would undoubtedly be the healthiest, but the third choice might be the least healthy” (2002). What he was pointing to is the conundrum of innovation: there is no such thing as a free lunch in evolution, be it biological, social, or technological. With the benefits accrued from one invention comes a litany of drawbacks. This is the so-called ‘progress trap’ (Wright 2004; O’Leary 2007), in which material innovation gives rise to problems and uncertainty that societies are incapable of solving, resulting in stagnation and possible collapse.
This essay examines the developments stemming from another paradigm shift in technological innovation—the digital revolution—or what is popularly called the third industrial revolution. A revolution that is perhaps as profound as its ancient agricultural counterpart with equally far-reaching implications. Diamond identified three areas where agriculture has made a significant and negative impact: class division and inequality; adverse effects on health; and the concentration of power in the hands of the few.
The digital revolution is producing a similar impact along associated lines but at a far more rapid rate given the advances in artificial intelligence (AI). Specifically, this includes the digital divide along sociodemographic lines as well as between the developed and developing world; the social and physical effects of a near constant online existence; and the centralization and control over data (personal and otherwise) by a handful of technology corporations, and the perhaps inevitable transfer of that control over to the state.
This is not the first essay to raise such issues. Similar concerns about the geometric growth of technology and our ability to control it have been voiced over the years by people such as the mathematician Irving John Good (1966) and philosopher Martin Heidegger (1977). There are also issues that simply cannot be covered within a single essay. Two, specifically, include the looming threat of mass unemployment due to automation, and the potential (some argue likelihood) for the rise of sentient machines, or what is often called the technological singularity. What this essay aims to do, however, is examine the immediate and potentially long-term effects of the digital revolution against those of agriculture which, with thousands of years of hindsight, we know forever changed humanity.
Despite the concerns over technology, there is also no denying that the digital revolution has brought about remarkable changes over the past half century, in areas ranging from the democratization of higher education and personalized medicine to advances in defense, logistics, and transportation (Scott 2006; Hui-Chen et al. 2021; Hamet and Tremblay 2017; Lau and Haugh 2018). But along with these developments have come an array of drawbacks, from online ‘doxing’ and Internet addiction to the threats to individual liberties posed by the concentration of power in the hands of a few tech firms.
The first nomadic tribes experimenting with planting seeds could not foresee the physical and social detriments that would follow, and it has only been through archaeological excavations and forensic analyses over the past several decades that we can see the true impact of agriculture. The changes unleashed by the digital revolution, by contrast, are occurring in real time. Societal destiny is not always in the hands of individuals, though, and the dramatic environmental shifts that contributed to the demise of agriculture-based civilizations of the past such as the Anasazi in the southwestern US (Benson and Berry 2009), Angkor in Cambodia (Buckley et al. 2010), and possibly the Maya in Central America (Kennett et al. 2012) may await us still. But outside of such calamities and our own reckoning with a changing climate, it is highly unlikely that the digital revolution will ever reverse itself with societies choosing to adopt a pre-digital lifestyle, just as most societies never willingly gave up agriculture. If Diamond is correct in his description of the peasant farmer as worse off compared to hunter-gatherers and those in economically advanced societies, are the bulk of those whose lives revolve around digital technology his equivalent?
2 Class divisions and a perpetual developing world
The adoption of agriculture varied widely around the world and involved diverse crops, ranging from the cultivation of cereals in the Middle East more than 12,000 years before present (B.P.); the cultivation of rice in China around 9000 B.P.; domesticated maize in Mexico at 8700 B.P.; bananas and taro roughly 7000 B.P. in New Guinea; and up to 4500 B.P. with the cultivation of wild seed crops in the Eastern Woodlands of North America (Fuller et al. 2014). The picture that emerges is one of incremental change as societies moved from a mobile, hunter-gatherer existence to a more settled one that focused on food production. And while there are historical examples of societies abandoning farming due to the effects of war, disease, famine, or natural disasters, for most, once they became locked into a dependence on agriculture, there was no going back.
As populations grew, so too did inequality. Accumulated resources, seen in an increase in grave goods and the size of dwellings in the Near East, or control over land tenureship by individuals and powerful families in Polynesia, came with attendant dominance over economic and political matters (Price and Bar-Yosef 2010; Hayden and Villeneuve 2010). In the larger city-states of the Basin of Mexico, elites could hold sway over vast territories, acting as the bond between centers of power and outlying areas through intermarriage, trade, or other alliances (Smith 1986, 2021).
Status differentiation in and of itself is not unknown in non-agricultural societies (Borgerhoff Mulder et al. 2009) and has been witnessed in non-human primates for decades (Di Bitetti 1997; Sparks 2006 [1967]). However, it was the monopolization of predictable resources and intergenerational wealth that played a role in persistent inequality (Mattison et al. 2016). A commanding individual could rise to prominence through displays of power and prosperity, acquiring followers along the way through patron–client relationships, and eventually establishing hereditary rule by passing that ledger of reciprocal relationships on to one’s heirs. Over time, social inequalities became ‘baked in’, part of a social fabric out of which individuals could rarely move.
Unlike agriculture, the digital revolution was not a set of experiments separated by thousands of years in varied locations around the world. Rather, it was spearheaded by those countries already at the forefront of technology within the span of a few decades following the end of WWII. However, whereas the monumental shifts initiated by agriculture were slow to accumulate, the inequality brought about by the digital revolution has been ushered in quickly given the global transition to digitization. The post-war interconnectedness of nations in terms of trade, defense, transportation, and diplomacy, to name a few, ushered in this transition. But so, too, did Cold War-era fears and the prospect of being left behind in the steady march toward modernization.
This new form of technological inequality has centered on the so-called ‘digital divide’, an issue that has been around for decades and originally described as the advantage conferred to those with access to computers compared to those without, as well as socioeconomic variables of residence (urban versus rural), education, income, English language capabilities, and access to pre-existing technology (Rao et al. 1999; Kleinman 2001). Critics, however, have argued that issues of differential use along with deterministic assumptions that the presence of technology facilitates learning are more salient features of the divide, while the more basic needs of literacy may be the real determinant as to whether access to the Internet has any real meaning (Warschauer 2003; Chun-Yao and Hau-Ning 2010).
As technological innovations such as smartphones became more ubiquitous, particularly among developing nations that saw a greater increase across technologies (Dewan et al. 2010), the access divide decreased. Yet, globally, there continues to be a gap between sexes, ages, education levels, and economic stability, with lower paying jobs not requiring the use of the Internet and higher paying ones requiring it (Calderón-Gómez et al. 2020), a distinction the COVID-19 pandemic made all the more obvious. There are also continuing issues regarding inequality and class division with regard to Internet access and social networks with significant social capital (Winseck 2017; Zhao and Elesh 2007).
The divide also falls along generational lines. The degradation of historical knowledge may be inevitable the more distant in time one gets from historical events and those who lived through them, but the rapid development of the Internet and its outgrowths such as social media have accelerated this process to an unprecedented extent. The result has been a shift in the transfer of cultural knowledge away from generational sources (parents, grandparents, teachers, and so on) and toward those found online. Those born in the mid-90s and onward (Millennials and Generation Z) are less likely to be raised in a religious household compared to their parents, but also less likely to be familiar with the more seminal events of the twentieth century, and more likely to form their identity and opinions via influence from social media (Sultan 2017).
Looking at the early twentieth century by way of comparison, the generation of Americans that fought in WWI was 50 years removed from the US Civil War, but the cause of the latter continued to be vociferously debated (Ramsdell 1937). Today, more than 50 years after the collapse of South Vietnam, younger generations in the US and Vietnam express little interest or knowledge of the Vietnam War (or the American War as it is called in Vietnam), despite the momentous ramifications it had on both countries. As with their American counterparts, younger Vietnamese instead turn to social media to form their opinions around contemporary issues (Rosen 2015).
However, it is between nations that the divide may be of greater concern for countries on the lower end of the socioeconomic ladder. According to the International Telecommunications Union (ITU), urban populations and younger generations have greater access to the Internet, while developed countries have nearly double the number of users as developing countries, and four and a half times that of the least developed countries (LDCs) (ITU 2020). Social media, likewise, has become a ubiquitous and even mundane part of our online experience, with 3 billion of the 3.8 billion Internet users engaged with social media at any point in time (Kemp 2017).
The digital age has, thus, led to inequality on two levels: individually and nationally. Individuals with greater skills and know-how regarding the use of and access to information (generally middle-class and college educated) will have a distinct advantage in terms of employment, social networks, and economic advancement. Nationally, the situation is potentially even more dire. While access within developing countries is still an issue, the real challenge in the future, and one that will impact developmental trajectory, is the divide between countries with technological prowess and those without.
A country such as Niger, with 6.62 children per woman, will have far different needs in terms of supportive care and employment than countries such as Singapore and Taiwan that hover around one child per woman. However, as poorer nations develop and become more prosperous relative to their former state, that development also brings better health outcomes, such as greater infant survivability and longer lives. That is, greater numbers of people striving for a better life.
Thus, it is countries such as Nigeria, whose population has increased tenfold since 1950, that will likely be the wellsprings from which future migrants hail: countries that have made it out of the bottom rungs of development but are not so prosperous that their populations would have no reason to leave. Throw into the mix ongoing conflicts and environmental change, and those rising populations will likely be less inclined to stay at home and both more motivated and more fiscally capable of making the journey to more prosperous regions of the world (King 2017). But given the backlash over globalization throughout the Western world and severe restrictions on immigration in many countries in Asia, that journey has become all the more difficult.
A far more desperate future could just as well be a technical and economic chasm the likes of which have not been seen since the Age of Discovery. Within richer nations, the social divide may lie between those whose jobs can be conducted online, and a mass of humanity that has been made redundant by automation; the so-called ‘non-essential’ workers of the COVID-19 pandemic. Poorer countries, meanwhile, could become forever chained to the bottom rung of development, dependent on the developed world acting as paternalistic overseers that dole out technology according to the latter’s estimation of which country is capable of handling what and when, or in exchange for whatever resources that they can extract from the ground. Indeed, an even more ominous scenario could see poorer nations cease to functionally develop, becoming ever more dependent on powerful patrons who guard their technological secrets as ancient China once guarded the manufacture of silk.
3 Social and cognitive change
In his 1987 piece, Diamond wrote “From the progressivist perspective on which I was brought up, to ask ‘Why did almost all our hunter-gatherer ancestors adopt agriculture?’ is silly. Of course they adopted it because agriculture is an efficient way to get more food for less work…While the case for the progressivist view seems overwhelming, it's hard to prove.” Indeed, the opposite is the case. The amount of work hunter-gatherers spend acquiring food is less than that required for agricultural production (Sahlins 1972; Dyble et al. 2019). The fruits of their labor, so to speak, are also more diversified, leading to a far healthier diet and lower food-related pathologies compared to their agriculturally dependent counterparts.
Research from around the world has demonstrated a general decline in dental health as well as an overall increase in morbidity with the shift to agriculture and the sedentism that followed (Ubelaker 1992; Gualandi 1992; Lukacs 1996). The narrowing of diets and reliance on domesticated plants compared to animal protein led to a retardation in bone length, thickness, and robusticity (Larsen 1995), not to mention the dangers of relying on a limited number of crops should a bad harvest or disease wipe them out. A carbohydrate-based diet satisfied the need to feed a growing number of hungry mouths, but the lack of nutritional value weakened the bodies attached to them.
And so, too, for our dopamine-driven clicks on the Internet. Chamath Palihapitiya, a former vice president for user growth at Facebook, has publicly stated that the short-term, dopamine effects of social media are having long-term consequences on civil society through the manipulation and overproduction of shared information. At a talk at the Stanford Graduate School of Business in 2017, Palihapitiya said that “Bad actors can now manipulate large swaths of people to do anything you want. And we compound the problem. We curate our lives around this perceived sense of perfection, because we get rewarded in these short-term signals—hearts, likes, thumbs up—and we conflate that with value and we conflate it with truth” (Wang 2017).
While there is evidence that moderate use of social media may not pose a risk (Przybylski and Weinstein 2017), there is also evidence that heavy use of social media (5+ hours a day) among adolescents leads to higher rates of unhappiness and suicidal risk factors (Twenge and Campbell 2019). The risk for teenage girls in particular is much higher than for boys, with rates of depression, suicide-related outcomes, and suicide rates spiking after 2012 when use of social media among teens became common (Twenge et al. 2018; Haidt and Allen 2020). The explosive growth in the number of teenage girls coming out as transgender—70 times the expected prevalence rate—alongside increased use of social media has also been cited by parents of girls who belong to peer groups in which one or even all friends become gender dysphoric within the same timeframe (Littman 2018).
The primacy of place of social media for many has led to the rise of digital narcissistic behavior (Faucher 2018) through the amplification of one’s profile to millions online, where the latest Tweet, Instagram photo, or YouTube video elicits an almost immediate response and cycles of engagement that occupy, for many, most of their waking hours. As the use of digital media has more than doubled from close to 3 h per day in 2009 to more than 6 h per day in 2018, it is perhaps no surprise that living arrangements have begun to reflect this trend, with upscale “physical social network” (Hansen-Bundy 2018) residential apartment blocks offering all the services one needs. As venturing out into other parts of a city to complete daily tasks becomes unnecessary, interacting with others outside of one’s immediate sphere of influence likewise becomes obsolete.
But while these physical spaces are designed to bring individuals together and provide an antidote for the impersonality of social media, a remedy is only needed when there is a problem. Specifically, the dwindling importance many people, especially young people, place on face-to-face interaction. Instead, there is the increasing importance of immersing oneself in a sea of images, videos, and text that can be digested and curated quickly to make room for the next in an endless stream of social consciousness from all points of the globe.
An increasingly isolated populace also leads to broader repercussions for the future demographic health of a society. W. Bradford Wilcox (2018) of the National Marriage Project at the University of Virginia cites online porn and video games as contributing to the decline in birthrates in the US, as well as to a declining interest in sex among younger people. Although the US is nowhere near Japan, where one-third of Japanese adults between the ages of 18 and 34 have never had sex, movements such as the #MeToo movement have the capacity to drive down sexual activity even further as men fear the aftermath of sexual encounters in an age where definitions of what constitutes consent remain ambiguous (ibid), and online ‘doxing’ campaigns can quickly upend lives.
The necessity of the Internet and handheld devices has also given rise to new ailments and an expanding catalog of acronyms to account for them. Internet Addiction Disorder (IAD) and problematic smartphone use (PSU) involve primarily visual stimuli, but those stimuli share similar characteristics with substance-use disorders such as rapid impulsive processes with little reflection by the user (Roh et al. 2018). PSU has been shown to be positively related to anxiety, time spent on one’s phone, and, interestingly, the number of selfies taken, but negatively related to a connection with nature. The latter, by contrast, has been positively related to photos of nature (Richardson et al. 2018). In other words, the more time spent on smartphones, the less one is likely to take a photo of, say, the Grand Canyon for the sake of its own natural splendor, but a photo of oneself at the Grand Canyon as if it were simply the background on which one’s persona is layered and validated through likes, retweets, and shares.
Our reliance on AI for tasks that were once part of the normal repertoire of human activities has saturated modern societies. These include the well-known ‘Google effect’, or relying on the Internet in place of our own memory, and a reduction in the desire to engage in demanding mental tasks and encode new information in our brains (Sparrow et al. 2011; Bohannon 2011; Storm et al., 2017). Where once our friends and family members were part of a network of transactive memory partners with whom we shared information, the Internet increasingly fills that role and has become for many the sole arbiter of knowledge (Wegner and Ward 2013). Surveys of university students have also found a similar reliance on the search engine over library services, even when students are physically present in university libraries (O’Connor and Lundstrom 2011).
These trends stand alongside more serious issues such as the increase in addictive disorders and associated complications. Massively multiplayer online role-playing games (MMORPGs) can lead to Internet gaming disorder (IGD), which has been linked to negative perceptions of past experiences (So-Kum Tang et al. 2017; Lukavská 2018), as well as lower densities of gray matter in the cortex and corresponding behaviors such as impulsivity, distorted decision-making capabilities, depression, and anxiety disorders (Lee et al. 2018). And what occurs at the individual level possibly translates to the population level, meaning that greater amounts of information consumed at greater rates are indeed resulting in lower attention spans for the public at large (Firth et al. 2020).
And it is these latter developments that are perhaps the most alarming of all. As Diamond noted of average human height within skeletons found in Greece and Turkey pre- and post-agriculture, whereas hunter-gatherers reached 5′ 9″ and 5′ 5″ for men and women, respectively, their agricultural successors saw those heights plummet to 5′ 3″ and 5′ (1987). And this drop in height was in addition to associated increases in malnutrition, infectious diseases, and degenerative conditions of the spine (ibid). From 1990 to today, use of the Internet went from close to 0 to nearly 50% of the world’s population, with increases in IAD and IGD seeing similar exponential growth. As with agriculture’s impact on height, the digital revolution is having an equally profound effect on those with IAD and IGD, who exhibit a decrease in frontal lobe functions and gray-matter volume (Jun et al. 2013; Pan et al. 2018). And when considering that more than 50% of US children aged 8–18 have smartphones and spend roughly 5 to 7 h a day on them (5 h a day for those 8–12; 7 h a day for those 13–18) (Rideout and Robb 2019), the long-term consequences for the cognitive abilities of younger generations are an open question.
However, acronyms and clinical analyses have a way of blinding us to the generality of such conditions, and the true number of those living in modern societies who may accurately be described as having IAD or IGD is unknown. A simple observational experiment can bear this out. Look around the coffee shop, train, airport terminal, or other public space you happen to be in now—what do you see? More than likely, it is individuals immersed in the soft glow of a screen next to others who are similarly wrapped in a constellation of pixels. A community of people who have walled their minds off from a shared reality which they dip in and out of between clicks and swipes. Our switch to agriculture and what we consumed with our mouths did not make our bodies bigger and stronger but smaller and weaker. The transition to the digital age and our sensory immersion into what for many is a perpetual online existence, is well on its way to having an equally profound effect on our neurological health.
4 Toward a universal social credit system?
The conglomeration of power through the invention of agriculture in many ways can be said to be self-fulfilling. A reliable and increased source of calories means a greater number of individuals that can be fed and survive. An increase in the number of individuals making it into adulthood means a greater number of offspring which will likely be born to those people in the following years. As agriculture requires competent organization if a community has any reasonable chance of reaping its benefits year after year, those who have the skills and charisma necessary to lead will usually rise to the top of the social hierarchy. But what historically solidified their position was an overarching religious system and cadre of specialists capable of reading the signs of nature and the heavens, thus, legitimizing the power of the sovereign, and ultimately the political and economic system in which commoner and elite all were entwined. For the average peasant, the workings of those who could divine the whims of the gods were clouded in secrecy, hidden by sacred incantations and magical objects. But above all else, they were not to be questioned.
The digital sorcery of today has its own veil that shrouds its intrigues. Social media companies, at least within the United States, fall under a protected class of companies that are not held responsible for the content that can be accessed on them. Section 230, a provision in the Communications Decency Act passed by the US Congress in 1996, states that as platforms and not publishers of information, they are protected from lawsuits over users’ posts while being allowed to moderate posts that express violent or misleading content. With the growth of the Internet and social media since then, this has also meant that companies such as Google, Facebook, and Twitter have become the de facto mediums through which everyone from ordinary citizens to world leaders expresses their views. The result has been the unprecedented acquisition of power and influence by private companies over what individuals read, see, and share online (Lee 2018).
Although bemoaned in the West for its totalitarian underpinnings, the Chinese Communist Party’s (CCP) rollout of its Social Credit System may be a hallmark of things to come. The system quantifies all online actions from opinions posted online, shopping habits, to the amount of time spent watching videos or playing video games. Planning documents for the new system cite that it would “allow the trustworthy to roam everywhere under heaven while making it hard for the discredited to take a single step” (Chin 2016). As Denyer writes, “your score becomes the ultimate truth of who you are—determining whether you can borrow money, get your children into the best schools or travel abroad; whether you get a room in a fancy hotel, a seat in a top restaurant—or even just get a date” (2016).
A high score means greater ease when making online transactions; a low score means expensive purchases via Alipay (a Chinese online payment system also connected to the government) could be blocked. Those with low scores may also be prohibited from boarding planes and high-speed trains (Wade 2017). Perhaps the most pernicious aspect of the system is that an individual’s social credit score can be lowered through the actions of his friends. You may be a firm believer in the CCP, but should your friend say something that could be construed as less than positive, your score would be in jeopardy for being associated with him.
But how different is this from the ‘court of Twitter opinion’ or other methods used by groups to silence and shame those with whom they disagree, or pressure social media to adopt their definitions of what constitutes hate (Dibble 2020; Perman 2021; Capatides 2020)? The banning of individuals from Twitter, Facebook, and YouTube has become routine, while ‘sandboxing’ (limiting access to videos that are deemed controversial or hateful) is used against ordinary people and world leaders alike (YouTube 2017; Alba et al. 2021). Such bans have been criticized across the political spectrum by politicians around the world (Brennan 2021), raising concerns of the specter of a “digital oligarchy” that in many ways has greater power than the state to silence individuals with whom it disagrees (Jennen and Nussbaum 2021).
There are those who argue that far from being a limit on information, online platforms have democratized information more than at any other time in history. To that end, they are partially correct. While it is true that virtually anyone is capable of putting anything online, should search engines or social media sites choose not to index it (or, in the case of Amazon, refuse to let companies use its servers), that material will likely never be seen, or at least never be seen by a consequential number of people. It is akin to the US Library of Congress filing a book within its vast depository but never assigning it a reference number. Sure, the information is there—but who will ever find it?
The information that does get the approval of corporate gatekeepers is recycled at a speed and degree never before seen, meaning that a post, comment, or picture can be rediscovered and repackaged, gaining a life of its own and demonstrating information’s pathogenic quality. Information comes to light and becomes ‘viral’, spreading throughout a population only to die down before being rediscovered once again, sometimes years after the fact, and resurfacing with perhaps even greater virulence than before. Appropriate behavior and acceptable conduct are policed by a vigilant and unforgiving online community who judge a person’s past deeds against the shifting sands of what is and is not considered appropriate. Authoritarianism, thus, becomes the purview of everyone online, or what Jaron Lanier, one of the original creators of virtual reality, calls “digital Maoism” (in Appleyard 2011).
These threats to individual liberty are eased along by the migration of government and businesses in developed economies to an almost wholly online existence, meaning that some form of online activity is a necessity for most people. The individual is thus forced to collaborate in his slow transformation into a ‘datavidual’: an aggregation of online information accessed by codes, the combination of which is unlikely to be expressed in anyone else (van der Meulen and Bruinsma 2018). One could envision these trends only continuing, with measures such as the ‘vaccine passports’ and smartphone-based tracking of infected individuals introduced during the COVID-19 pandemic inevitably becoming part of our everyday lives. An ID card or social security number could be replaced by an individualized Internet protocol, or IP, address, without which one would have no way of conducting basic transactions, from buying milk at the corner store to registering to vote. A continuous online presence—which many people essentially already have via their smartphones—may soon not be an option but a necessary part of everyday life.
Primates are extremely adaptable, and humans are the most adaptable among them. Countless individuals walking among us would not be here if not for advances in everything from eyeglasses to antibiotics. However, our reliance on technology to fill the gaps of our own physical limitations also allows for a continued redefinition of what it means to be human. If we accept the premise that a continuous online presence is more rather than less likely in the future, then invasive procedures to augment those limitations may similarly become routine. As Maguire and McGee write, “As enhancements become more widespread, enhancement becomes the norm, and there is increasing social pressure to avail oneself of the ‘benefit.’ Thus, even those who initially shrink from the surgery may find it a necessity” (1999).
Technology has already moved from smartphones to smart glasses, with smart contact lenses just over the horizon. How farfetched is the idea of implanted chips becoming a necessity of daily life, or worse, even required via state decree, much as it is required of pet owners to implant a chip in the family dog?
5 Conclusion
And so we are left with the question posed in the title of this essay: Is the digital revolution our second worst mistake (assuming that Diamond was correct about the first)? The answer, I believe, is that both agriculture and the digital revolution are not exceptions to a rule but necessary outcomes of the rule itself: that human beings have an inexhaustible capacity to tinker and will do so. Our extinct cousin, Homo erectus, lived from around two million to 100,000 years ago, more than six times as long as our species has existed. Yet during its time on Earth, its toolkit never advanced beyond rudimentary stone tools. By contrast, within the past 40,000 years, our species, Homo sapiens, went from cave paintings to putting a man on the moon, with the most advanced innovations occurring after the adoption of agriculture.
The transformations brought about by the digital revolution have had and will continue to have far-reaching ramifications not only for our social and political structures, but also for education, the way we consume information, and the digitization of economic exchange. However, as with agriculture, the revolution is also leading to changes in our physiology and reproduction, and to a new range of ailments stemming from our novel online existence. These are in addition to, and likely compounded by, longer working hours as the divide between home and work becomes increasingly blurred. It is the young, though, who will bear the brunt of any long-term negative effects of innovation. The impact on their brains alone is reason enough to reconsider the steady entrenchment of younger generations in a persistent online presence, and what this may mean for societies writ large in the years to come.
The digital revolution has so thoroughly reorganized modern life that it resembles agriculture in another way: it is extremely difficult to conceive of a future in which an online presence would be obviated, much less optional. However, as the COVID-19 pandemic’s impact on childhood literacy has shown (World Bank 2022), the pitfalls that come with an interconnected world can chip away at the very fundamentals on which that world rests, regardless of how technologically sophisticated a country may be. And the poorer the nation, the more impoverished the learning, meaning that succeeding generations and the societies they live in may be left even further behind the digital divide.
And while democratic nations in the West lament the restrictions on freedom implemented by China through its Social Credit System, those societies have also come to rely on technology platforms to surveil and collect data on their own citizens as never before. Where once a slip of the tongue or a casual observation would be quickly forgotten, today a post on social media can be dissected and reinterpreted—weaponized—even years after the fact. Freedom of speech and association are most at risk, ironically through the very online platforms created for sharing ideas and associating with like-minded individuals.
Thousands of years ago, human beings simply could not foresee all that would unfold in the distant future with the planting of the first seeds. Millennia after its invention, we can look back on agriculture and clearly see the costs and benefits of that turn in our collective development. Agriculture led to a settled way of life, greater populations, and the centralization of power in political and religious institutions, the latter two of which, until the modern age, were quite often of the oppressive variety. Our digital revolution is no less radical in the changes that it is bringing to our species and the societies we live in, only at a speed and scale that dwarfs agriculture.
But even if agriculture was the worst mistake in history, that does not mean that everything flowing downstream from it—including the digital revolution—is necessarily a mistake as well, any more than the offspring of a doomed marriage is. What we can say with certainty is that inequality, human health, and individual liberties have been affected by the digital revolution both positively and negatively. How these three spheres will develop in 50 or 100 years is anyone’s guess, but there are three possible outcomes: they will improve, go unchanged, or worsen. If the drawbacks stemming from the digital revolution continue on their current trajectory as digital technology becomes even more embedded in our lives, then perhaps we can be equally certain of the last.
Data availability
Data sharing is not applicable to this article as no datasets were generated or analyzed during the current study.
References
Alba D, Koeze E, Silver J (2021) What happened when Trump was banned on social media. The New York Times 7 June 2021. https://www.nytimes.com/interactive/2021/06/07/technology/trump-social-media-ban.html. Accessed 28 June 2021
Appleyard B (2011) The digital generation. RSA J 157(5548):18–23
Benson LV, Berry MS (2009) Climate change and cultural response in the prehistoric American southwest. Kiva 75(1):87–117
Bohannon J (2011) Searching for the Google effect on people’s memory. Science 333:277
Borgerhoff Mulder M, Bowles S, Hertz T et al (2009) Intergenerational wealth transmission and the dynamics of inequality in small-scale societies. Science 326:682–688
Brennan D (2021) Donald Trump's Twitter ban concerns world leaders, officials. Newsweek 1 Dec. 2021. https://www.newsweek.com/donald-trump-twitter-ban-concerns-world-leaders-officials-1560771. Accessed 1 May 2022
Buckley BM, Anchukaitis KJ, Penny D, Fletcher R, Cook ER, Sano M, Le NC, Wichienkeeo A, Ton MT, Hong HM, Marcus J (2010) Climate as a contributing factor in the demise of Angkor, Cambodia. Proc Natl Acad Sci USA 107(15):6748–6752
Calderón-Gómez D, Casas-Mas B, Urraco-Solanilla M, Carlos Revilla J (2020) The labour digital divide: digital dimensions of labour market segmentation. Work Organ Labour Glob 14(2):7–30
Capatides C (2020) White silence on social media: why not saying anything is actually saying a lot. CBS News, 03 June 2020. https://www.cbsnews.com/news/white-silence-on-social-media-why-not-saying-anything-is-actually-saying-a-lot/. Accessed 10 Nov 2021
Chin J (2016) China’s new tool for social control: a credit rating for everything. The Wall Street Journal, 28 November 2016. https://www.wsj.com/articles/chinas-new-tool-for-social-control-a-credit-rating-for-everything-1480351590. Accessed 22 May 2020
Chun-Yao H, Hau-Ning C (2010) Global digital divide: a dynamic analysis based on the Bass model. J Public Policy Mark 29(2):248–264
Denyer S (2016) China’s plan to organize its society relies on ‘big data’ to rate everyone. The Washington Post 22 October 2016. https://www.washingtonpost.com/world/asia_pacific/chinas-plan-to-organize-its-whole-society-around-big-data-a-rating-for-everyone/2016/10/20/1cd0dd9c-9516-11e6-ae9d-0030ac1899cd_story.html?noredirect=on&utm_term=.9191ee7391dd. Accessed 22 May 2021
Dewan S, Ganley D, Kraemer KL (2010) Complementarities in the diffusion of personal computers and the Internet: implications for the global digital divide. Inf Syst Res 21(4):925–940
Di Bitetti MS (1997) Evidence for an important social role of allogrooming in a platyrrhine primate. Anim Behav 54:199–211
Diamond J (1987) The worst mistake in the history of the human race. Discover Mag, May, 64–66
Diamond J (2002) The rise and fall of the third chimpanzee: how our animal heritage affects the way we live. Vintage Books, London
Dibble M (2020) Dean of Massachusetts nursing school fired after saying ‘everyone’s life matters’. Washington Examiner 2 July 2020. https://www.washingtonexaminer.com/news/dean-of-massachusetts-nursing-school-fired-after-saying-everyones-life-matters. Accessed 15 June 2021.
Dyble M, Thorley J, Page AE, Smith D, Bamberg Migliano A (2019) Engagement in agricultural work is associated with reduced leisure time among Agta hunter-gatherers. Nat Hum Behav 3:792–796
Faucher KX (2018) Alienation 2.0—symptoms of narcissism and aggression. In: Fuchs C (ed) Social capital online: alienation and accumulation. University of Westminster Press, London, pp 87–108
Firth JA, Torous J, Firth J (2020) Exploring the impact of Internet use on memory and attention processes. Int J Environ Res Public Health 17(9481):1–12
Fuller DQ, Denham T, Arroyo-Kalin M, Lucas L, Stevens CJ, Qin L, Allaby RG, Purugganan MD (2014) Convergent evolution and parallelism in plant domestication revealed by an expanding archaeological record. PNAS 111(17):6147–6152
Good IJ (1966) Speculations concerning the first ultraintelligent machine. Adv Comput 6:31–88
Gualandi PB (1992) Food habits and dental disease in an iron-age population. Anthropol Anz 50(1/2):67–82
Haidt J, Allen N (2020) Scrutinizing the effects of digital technology on mental health. Nature 578:226–227
Hamet P, Tremblay J (2017) Artificial intelligence in medicine. Metab Clin Exp 69:S36–S40
Hansen-Bundy B (2018) A week inside WeLive, the utopian apartment complex that wants to disrupt city living. GQ 27 February 2018. https://www.gq.com/story/inside-welive. Accessed 17 May 2020
Hayden B, Villeneuve S (2010) Who benefits from complexity? A view from Futuna. In: Feinman GM, Price TD (eds) Pathways to power: new perspectives on the emergence of social inequality. Springer, New York, pp 95–145
Heidegger M (1977) The question concerning technology. The question concerning technology and other essays. Harper & Rowe Inc, New York, pp 3–35
Hui-Chen L, Yun-Fang T, Gwo-Jen H, Hsin H (2021) From precision education to precision medicine. Educ Technol Soc 24(1):123–137
International Telecommunication Union (ITU) (2020) Measuring digital development, facts and figures, 2020. International Telecommunication Union Development Sector. https://www.itu.int/en/ITU-D/Statistics/Pages/ff2020interactive.aspx. Accessed 29 May 2021
Jennen B, Nussbaum A (2021) Germany and France oppose Trump’s Twitter exile. Bloomberg 11 January 2021. https://www.bloomberg.com/news/articles/2021-01-11/merkel-sees-closing-trump-s-social-media-accounts-problematic. Accessed 28 June 2021
Jun L, Esmail F, Lingjiang L, Zhifeng K et al (2013) Decreased frontal lobe function in people with internet addiction disorder. Neural Regen Res 5(34):3225–3232
Kemp S (2017) Three billion people now use social media. Wearesocial.com. https://wearesocial.com/blog/2017/08/three-billion-people-now-use-social-media. Accessed 29 May 2021
Kennett DJ, Breitenbach SFM, Aquino VV, Asmerom Y, Awe J, Baldini JUL, Bartlein P, Culleton BJ, Ebert C, Jazwa C, Macri MJ, Marwan N, Polyak V, Prüfer KM, Ridley HE, Sodemann H, Winterhaider B, Haug GH (2012) Development and disintegration of Maya political systems in response to climate change. Science 338(6108):788–791
King SD (2017) Grave new world: the end of globalization, the return of history. Yale University Press, New Haven
Kleinman SS (2001) Understanding the digital divide: implications for college teaching. Transform J Incl Sch Pedagog 12(2):51–67
Laneri N, Schwartz M, Ur J et al (2015) Ritual and identity in rural Mesopotamia: Hirbemerdon Tepe and the upper Tigris River Valley in the middle Bronze Age. Am J Archaeol 119(4):533–564
Larsen CS (1995) Biological changes in human populations with agriculture. Annu Rev Anthropol 24:185–213
Lau CG, Haugh BA (2018) Megatrend issues in artificial intelligence and autonomous systems. Institute for Defense Analyses. http://www.jstor.org/stable/resrep22645. Accessed 20 June 2022
Lee D, Park J, Namkoong K, Kim IY, Jung Y (2018) Gray matter differences in the anterior cingulate and orbitofrontal cortex of young adults with Internet gaming disorder: surface-based morphometry. J Behav Addict 7(1):21–30
Lee D (2018) Facebook’s data-sharing deals exposed. BBC News 19 December 2018. https://www.bbc.com/news/technology-46618582. Accessed 20 Dec 2021
Littman L (2018) Parent reports of adolescents and young adults perceived to show signs of a rapid onset of gender dysphoria. PLoS ONE 13(8):1–44
Lukacs JR (1996) Sex differences in dental caries rates with the origin of agriculture in South Asia. Curr Anthropol 37(1):147–153
Lukavská K (2018) The immediate and long-term effects of time perspective on Internet gaming disorder. J Behav Addict 7(1):44–51
Maguire GQ Jr, McGee EM (1999) Implantable brain chips? Time for debate. Hastings Cent Rep 29(1):7–13
Martin DL, Goodman AH (2002) Health conditions before Columbus: paleopathology of native North Americans. West J Med 176(1):65–68
Mattison SM, Smith EA, Shenk MK, Cochrane EE (2016) The evolution of inequality. Evol Anthropol 25:184–199
O’Connor L, Lundstrom K (2011) The impact of social marketing strategies on the information seeking behaviors of college students. Ref User Serv Q 50(4):351–365
O’Leary DB (2007) Escaping the progress trap. Geozone Communications, Montreal
Pan N, Yongxin Y, Xin D et al (2018) Brain structures associated with Internet addiction tendency in adolescent online game players. Front Psychiatry 9(67):1–8. https://doi.org/10.3389/fpsyt.2018.00067. Accessed 19 June 2022
Perman S (2021) After NBC calls on HFPA to oust ex-president, citing ‘racist rhetoric,’ Phil Berk is out. Los Angeles Times 20 April 2021. https://www.latimes.com/entertainment-arts/business/story/2021-04-20/nbc-dick-clark-productions-condemn-former-hfpa-presidents-actions-ask-for-his-ouster. Accessed 1 May 2021
Price DT, Bar-Yosef O (2010) Traces of inequality at the origins of agriculture in the ancient Near East. In: Feinman GM, Price TD (eds) Pathways to power: new perspectives on the emergence of social inequality. Springer, New York, pp 147–168
Przybylski AK, Weinstein N (2017) A large-scale test of the Goldilocks hypothesis: quantifying the relations between digital-screen use and the mental well-being of adolescents. Psychol Sci 28(2):204–215
Ramsdell CW (1937) The changing interpretation of the civil war. J South Hist 3(1):3–27
Rao M, Sanjib RB, Iqbal SM et al (1999) Struggling with the digital divide: Internet infrastructure, policies and regulations. Econ Pol Wkly 34(46/47):3317–3320
Richardson M, Hussain Z, Griffith MD (2018) Problematic smartphone use, nature connectedness, and anxiety. J Behav Addict 7(1):109–116
Rideout V, Robb MB (2019) The common sense census: media use by tweens and teens, 2019. Common Sense Media. https://www.commonsensemedia.org/sites/default/files/research/report/2019-census-8-to-18-full-report-updated.pdf. Accessed 15 June 2022
Roh D, Bhang SY, Choi JS et al (2018) The validation of implicit association test measures for smartphone and Internet addiction in at-risk children and adolescents. J Behav Addict 7(1):79–87
Rosen E (2015) How young Vietnamese view the Vietnam War. The Atlantic 30 April 2015. https://www.theatlantic.com/international/archive/2015/04/youth-vietnam-war-fall-saigon/391769/. Accessed 3 June 2021
Sahlins M (1972) The original affluent society. In: Stone age economics. Aldine-Atherton, Inc, Chicago, pp 1–39
Scott JC (2006) The mission of the university: medieval to postmodern transformations. J High Educ 77(1):1–39
Shin SC, Rhee SN, Aikens CM (2012) Chulmun Neolithic intensification, complexity, and emerging agriculture in Korea. Asian Perspect 51(1):68–109
Smith ME (1986) The role of social stratification in the Aztec empire: a view from the provinces. Am Anthropol 88:70–91
Smith ME (2021) Durable inequality in Aztec society. J Anthropol Res 77(2):162–186
So-Kum Tang C, Yee WK, Gan YQ (2017) Addiction to Internet use, online gaming, and online social networking among young adults in China, Singapore, and the United States. Asia Pac J Public Health 29(8):673–682
Sparks J (2006 [1967]) Allogrooming in primates: a review. In: Morris D (ed) Primate ethology. Aldine Transaction, Piscataway, pp 148–175
Sparrow B, Liu J, Wegner DM (2011) Google effects on memory: cognitive consequences of having information at our fingertips. Science 333(6043):776–778
Storm BC, Stone SM, Benjamin AS (2017) Using the Internet to access information inflates future use of the Internet to access other information. Memory 25(6):717–723
Sultan O (2017) Combatting the rise of ISIS 2.0 and terrorism 3.0. Cyber Defense Rev 2(3):41–50
Twenge JM, Campbell KW (2019) Media use is linked to lower psychological well-being: evidence from three datasets. Psychiatr Q 90:311–331
Twenge JM, Joiner TE, Rogers ML, Martin GN (2018) Increases in depressive symptoms, suicide-related outcomes, and suicide rates among US adolescents after 2010 and links to increased new media screen time. Clin Psychol Sci 6:3–17
Ubelaker DH (1992) Temporal trends of dental disease in ancient Ecuador. Anthropologie 30(1):99–102
Van der Meulen S, Bruinsma M (2018) Man as ‘aggregate of data.’ AI Soc 34:343–354
Wade S (2017) China’s Social Credit System: black mirror or red herring? China Digital Times 16 Feb. 2017. https://chinadigitaltimes.net/2017/02/qa-shazeda-ahmed-on-chinas-social-credit-system/. Accessed 23 May 2021
Wang A B (2017) Former Facebook VP says social media is destroying society with ‘dopamine-driven feedback loops’. The Washington Post 12 December 2017. https://www.washingtonpost.com/news/the-switch/wp/2017/12/12/former-facebook-vp-says-social-media-is-destroying-society-with-dopamine-driven-feedback-loops/?noredirect=on&utm_term=.eed38ca274ac. Accessed 29 Aug 2021
Warschauer M (2003) Demystifying the digital divide. Sci Am 289(2):42–47
Wegner DM, Ward AF (2013) How Google is changing your brain. Sci Am 309(6):58–61
Wilcox WB, Sturgeon S (2018) Too much Netflix, not enough chill: why young Americans are having less sex. Politico Magazine 08 Feb. 2018. https://www.politico.com/magazine/story/2018/02/08/why-young-americans-having-less-sex-216953. Accessed 24 May 2021
Winseck D (2017) The geopolitical economy of the global Internet infrastructure. J Inf Policy 7:228–267
World Bank (2022) 70% of 10-year-olds now in learning poverty, unable to read and understand a simple text. The World Bank. https://www.worldbank.org/en/news/press-release/2022/06/23/70-of-10-year-olds-now-in-learning-poverty-unable-to-read-and-understand-a-simple-text. Accessed 16 July 2022
Wright R (2004) A short history of progress. Carroll & Graf Publishers, New York
Wright LE, White CD (1996) Human biology in the classic Maya collapse: evidence from paleopathology and paleodiet. J World Prehist 10(2):147–198
YouTube (2017) An update on our commitment to fight terror content. YouTube.com. https://youtube.googleblog.com/2017/08/an-update-on-our-commitment-to-fight.html. Accessed 13 Sept 2021
Zhao S, Elesh D (2007) The second digital divide: unequal access to social capital in the online world. Int Rev Mod Sociol 33(2):171–192
Ethics declarations
Conflict of interest
The author has no financial or non-financial interests directly or indirectly related to this work.
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
Cite this article
O’Lemmon, M. The worst mistake 2.0? The digital revolution and the consequences of innovation. AI & Soc (2022). https://doi.org/10.1007/s00146-022-01599-5