People increasingly form beliefs based on information gained from automatically filtered Internet sources such as search engines. However, the workings of such sources are often opaque, preventing subjects from knowing whether the information provided is biased or incomplete. Users’ reliance on Internet technologies whose modes of operation are concealed from them raises serious concerns about the justificatory status of the beliefs they end up forming. Yet it is unclear how to address these concerns within standard theories of knowledge and justification. To shed light on the problem, we introduce a novel conceptual framework that clarifies the relations between justified belief, epistemic responsibility, action, and the technological resources available to a subject. We argue that justified belief is subject to certain epistemic responsibilities that accompany the subject’s particular decision-taking circumstances, and that one typical responsibility is to ascertain, so far as one can, whether the information upon which the judgment will rest is biased or incomplete. What this responsibility comprises is partly determined by the inquiry-enabling technologies available to the subject. We argue that a subject’s beliefs that are formed based on Internet-filtered information are less justified than they would be if she either knew how filtering worked or relied on additional sources, and that the subject may have the epistemic responsibility to take measures to enhance the justificatory status of such beliefs.
We present a novel model of individual people, online posts, and media platforms to explain the online spread of epistemically toxic content such as fake news and suggest possible responses. We argue that a combination of technical features, such as the algorithmically curated feed structure, and social features, such as the absence of stable social-epistemic norms of posting and sharing in social media, is largely responsible for the unchecked spread of epistemically toxic content online. Sharing constitutes a distinctive communicative act, governed by a dedicated norm and motivated to a large extent by social identity maintenance. But confusion about this norm and its lack of inherent epistemic checks lead readers to misunderstand posts, attribute excess or insufficient credibility to posts, and allow posters to evade epistemic accountability—all contributing to the spread of epistemically toxic content online. This spread can be effectively addressed if people and platforms add significantly more context to shared posts and platforms nudge people to develop and follow recognized epistemic norms of posting and sharing.
Information providing and gathering increasingly involve technologies like search engines, which actively shape their epistemic surroundings. Yet, a satisfying account of the epistemic responsibilities associated with them does not exist. We analyze automatically generated search suggestions from the perspective of social epistemology to illustrate how epistemic responsibilities associated with a technology can be derived and assigned. Drawing on our previously developed theoretical framework that connects responsible epistemic behavior to practicability, we address two questions: first, given the different technological possibilities available to searchers, the search technology, and search providers, who should bear which responsibilities? Second, given the technology’s epistemically relevant features and potential harms, how should search terms be autocompleted? Our analysis reveals that epistemic responsibility lies mostly with search providers, which should eliminate three categories of autosuggestions: those that result from organized attacks, those that perpetuate damaging stereotypes, and those that associate negative characteristics with specific individuals.
My aim in this paper is to give a philosophical analysis of the relationship between contingently available technology and the knowledge that it makes possible. My concern is with what specific subjects can know in practice, given their particular conditions, especially available technology, rather than what can be known “in principle” by a hypothetical entity like Laplace’s Demon. The argument has two parts. In the first, I’ll construct a novel account of epistemic possibility that incorporates two pragmatic conditions: responsibility and practicability. For example, whether subjects can gain knowledge depends in some circumstances on whether they have the capability of gathering relevant evidence. In turn, the possibility of undertaking such investigative activities depends in part on factors like ethical constraints, economic realities, and available technology. In the second part of the paper, I’ll introduce “technological possibility” to analyze the set of actions made possible by available technology. To help motivate the problem and later test my proposal, I’ll focus on a specific historical case, one of the earliest uses of digital electronic computers in a scientific investigation. I conclude that the epistemic possibility of gaining access to scientific knowledge about certain subjects depends (in some cases) on the technological possibility for making responsible investigations.
David Chalmers thinks his iPhone exemplifies the extended mind thesis by meeting the criteria that he and Andy Clark established in their well-known 1998 paper. Andy Clark agrees. We take this proposal seriously, evaluating the case of the GPS-enabled smartphone as a potential mind extender. We argue that the “trust and glue” criteria enumerated by Clark and Chalmers are incompatible with both the epistemic responsibilities that accompany everyday activities and the practices of trust that enable users to discharge them. Prospects for revision of the original criteria are dim. We therefore call for a rejection of the trust criterion and a reevaluation of the extended mind thesis.
Leading prescriptions for addressing the spread of fake news, misinformation, and other forms of epistemically toxic content online target either the platform or platform users as a single site for intervention. Neither approach attends to the intense feedback between people, posts, and platforms. Leading prescriptions boil down to the suggestion that we make social media more like traditional media, whether by making platforms take active roles as gatekeepers, or by exhorting individuals to behave more like media professionals. Both approaches are impracticable and wrong.
To one side of the wide third-floor hallway of Victoria College, just outside the offices of the Institute for the History and Philosophy of Science and Technology, lies the massive carcass of a 1960s-era electron microscope. Its burnished steel carapace has lost its gleam, but the instrument is still impressive for its bulk and spare design: binocular viewing glasses, beam control panel, specimen tray, and a broad work surface. Edges are worn; desiccated tape still feebly holds instructive reminders near control dials; this was once a workhorse in some lab. But it exists now out of time and place; like many of the scientific instruments we study, it has not been touched by knowing hands in decades.
Internet Alley is much more a book about regional history than about politics, economics, or the history of technology, yet it draws extensively on all of these fields. The book is stronger for its interdisciplinarity, but as a result it does not sit comfortably within any traditional historical discourse. Historians of science or technology not concerned with northern Virginia in the twentieth century will find little here to help them.
Since Robert Hooke published Micrographia, scientists have been expanding the boundaries of science to new scales, giving rise to questions about epistemology and ontology and challenging perceptions of objectivity, life, and artifact. Recent developments in areas such as nanotechnology and synthetic life have not only pushed these boundaries, but have called their very existence into question. In this issue, Spontaneous Generations examines science at the nanoscale from ten perspectives...