Peer review is a widely accepted instrument for raising the quality of science. Peer review limits the enormous unstructured influx of information and the sheer amount of dubious data, which in its absence would plunge science into chaos. In particular, peer review offers the benefit of eliminating papers that suffer from poor craftsmanship or methodological shortcomings, especially in the experimental sciences. However, we believe that peer review is not always appropriate for the evaluation of controversial hypothetical science. We argue that the process of peer review can be prone to bias towards ideas that affirm the prior convictions of reviewers and against innovation and radical new ideas. Innovative hypotheses are thus highly vulnerable to being “filtered out” or made to accord with conventional wisdom by the peer review process. Consequently, having introduced peer review, the Elsevier journal Medical Hypotheses may be unable to continue its tradition as a radical journal allowing discussion of improbable or unconventional ideas. Hence we conclude by asking the publisher to consider re-introducing the system of editorial review to Medical Hypotheses.
Matthias Vogel challenges the belief, dominant in contemporary philosophy, that reason is determined solely by our discursive, linguistic abilities as communicative beings. In his view, the medium of language is not the only force of reason. Music, art, and other nonlinguistic forms of communication and understanding are also significant. Introducing an expansive theory of mind that accounts for highly sophisticated, penetrative media, Vogel advances a novel conception of rationality while freeing philosophy from its exclusive attachment to linguistics. Vogel's Media of Reason treats all kinds of understanding and thought, propositional and nonpropositional, as important to the processes and production of knowledge and thinking. By developing an account of rationality grounded in a new conception of media, he raises the profile of the prelinguistic and nonlinguistic dimensions of rationality and advances the Enlightenment project, buffering it against the postmodern critique that the movement fails to appreciate aesthetic experience. Guided by the work of Jürgen Habermas, Donald Davidson, and a range of media theorists, including Marshall McLuhan, Vogel rebuilds, if he does not remake, the relationship among various forms of media -- books, movies, newspapers, the Internet, and television -- while offering an original and exciting contribution to media theory.
How do people decide which claims should be considered mere beliefs and which count as knowledge? Although little is known about how people attribute knowledge to others, philosophical debate about the nature of knowledge may provide a starting point. Traditionally, a belief that is both true and justified was thought to constitute knowledge. However, philosophers now agree that this account is inadequate, due largely to a class of counterexamples (termed “Gettier cases”) in which a person’s justified belief is true, but only due to luck. We report four experiments examining the effect of truth, justification, and “Gettiering” on people’s knowledge attributions. These experiments show that: (1) people attribute knowledge to others only when their beliefs are both true and justified; (2) in contrast to contemporary philosophers, people also attribute knowledge to others in Gettier situations; and (3) knowledge is not attributed in one class of Gettier cases, but only because the agent’s belief is based on “apparent” evidence. These findings suggest that the lay concept of knowledge is roughly consistent with the traditional account of knowledge as justified true belief, and also point to a major difference between the epistemic intuitions of laypeople and those of philosophers.
The philosophical work of Jean-Luc Marion has opened new ways of speaking about religious convictions and experiences. In this exploration of Marion’s philosophy and theology, Christina M. Gschwandtner presents a comprehensive and critical analysis of the ideas of saturated phenomena and the phenomenology of givenness. She claims that these phenomena do not always appear in the excessive mode that Marion describes and suggests instead that we consider degrees of saturation. Gschwandtner covers major themes in Marion’s work—the historical event, art, nature, love, gift and sacrifice, prayer, and the Eucharist. She works within the phenomenology of givenness, but suggests that Marion himself has not considered important aspects of his philosophy.
In this paper, I present and explore some ideas about how factive emotional states and factive perceptual states each relate to knowledge and reasons. This discussion will shed light on the so-called ‘perceptual model’ of the emotions.
This unique collection brings together internationally recognized scholars of film, philosophy, and the philosophy of perception and aesthetics, as well as many established philosophers working on the Film as Philosophy problem. It also includes several young scholars currently working in the philosophy and film genre. It is especially well suited for use in university undergraduate and graduate courses, but it will also appeal to a more general audience, as well as to those working in these particular areas of specialization. Philosophy in motion...
Subjunctivism is the doctrine that what is distinctive about knowledge is essentially modal in character, and thus is captured by certain subjunctive conditionals. One principal formulation of subjunctivism invokes a “sensitivity condition” (Nozick, DeRose), the other invokes a “safety condition” (Sosa). It is shown in detail how defects in the sensitivity condition generate unwanted results, and that the virtues of that condition are merely apparent. The safety condition is untenable also, because it is too easily satisfied. A powerful motivation for adopting subjunctivism would be that it provides a solution to the problem of misleading evidence, but in fact, it does not.
For some time, it seemed that one had to choose between two sharply different theories of epistemic justification, foundationalism and coherentism. Foundationalists typically held that some beliefs were certain and, hence, basic. Basic beliefs could impart justification to other, non-basic beliefs, but needed no such support themselves. Coherentists denied that there are any basic beliefs; on their view, all justified beliefs require support from other beliefs. The divide between foundationalism and coherentism has narrowed lately, and Susan Haack attempts to synthesize these competing accounts into a view she calls “foundherentism.”
This remarkably clear and comprehensive account of empirical knowledge will be valuable to all students of epistemology and philosophy. The author begins from an explanationist analysis of knowing—a belief counts as knowledge if, and only if, its truth enters into the best explanation for its being held. Defending common sense and scientific realism within the explanationist framework, Alan Goldman provides a new foundational approach to justification. The view that emerges is broadly empiricist, counteracting the recently dominant trend that rejects that framework entirely. Topics treated include the Gettier problem, the nature of explanation and inductive inference, the justification of foundations for knowledge in terms of inference to the best explanation, the possibility of realist interpretations of contemporary science, reference, and the relations between empirical psychology and epistemology. Professor Goldman defends the need for a foundational theory of justification and presents a version that refutes standard criticisms of that doctrine. His defense of realism takes into account contemporary advances in semantics and philosophy of science. It attempts to clarify the kinds of skeptical argument the philosopher must take seriously, without succumbing to them. While recent epistemology has tended to dismiss the traditional foundational approach, it has not provided a suitable alternative. Goldman breaks new ground by adapting that approach within his explanationist, inductive theory.
We present a conceptual framework on the experience of time and provide a coherent basis for further qualitative inquiry into time experience. We propose two Time-Layers and two Time-Formats forming four Time-Domains. Micro-Flow and Micro-Structure represent the implicit phenomenal basis, from which the explicit experiences of Macro-Flow and Macro-Structure emerge. Complementary to this theoretical proposal, we present empirical results from qualitative content analysis obtained from 25 healthy participants. The data essentially corroborate the theoretical proposal. With respect to Flow, the phenomenally accessible time experience appeared as a continuous passage reaching from the past through the present into the future. With respect to Structure, the individual present was embedded in the individual biography, emerging from past experiences and comprising individual plans and goals. New or changing plans and goals were integrated into the existing present, thus forming a new present. The future appeared as changeable within the present, by means from the past, and therefore as a space of potential opportunities. As an example, we discuss these results in relation to previous empirical findings on deviant experiences of time in Autism Spectrum Disorder, which is presumably characterized by a breakdown of Flow and concomitant compensatory repetition resulting in an overly structured time. Finally, we speculate about possible implications of these findings for both psychopathological and neuroscientific research.
Philosophers such as Eric Katz and Robert Elliot have argued against ecological restoration on the grounds that restored landscapes are no longer natural. Katz calls them “artifacts,” but the sharp distinction between nature and artifact doesn’t hold up. Why should the products of one particular natural species be seen as somehow escaping nature? Katz’s account identifies an artifact too tightly with the intentions of its creator: artifacts always have more to them than what their creators intended, and furthermore the intention behind some artifacts might explicitly be to allow things to happen unpredictably. Indeed, to build any artifact is to employ forces that go beyond the builder: in this sense all artifacts are natural. Recognizing the naturalness of artifacts can help encourage the key environmental virtues of self-knowledge and humility.
In recent years, most political theorists have agreed that shame shouldn't play any role in democratic politics because it threatens the mutual respect necessary for participation and deliberation. But Christina Tarnopolsky argues that not every kind of shame hurts democracy. In fact, she makes a powerful case that there is a form of shame essential to any critical, moderate, and self-reflexive democratic practice. Through a careful study of Plato's Gorgias, Tarnopolsky shows that contemporary conceptions of shame are far too narrow. For Plato, three kinds of shame and shaming practices were possible in democracies, and only one of these is similar to the form condemned by contemporary thinkers. Following Plato, Tarnopolsky develops an account of a different kind of shame, which she calls "respectful shame." This practice involves the painful but beneficial shaming of one's fellow citizens as part of the ongoing process of collective deliberation. And, as Tarnopolsky argues, this type of shame is just as important to contemporary democracy as it was to its ancient form. Tarnopolsky also challenges the view that the Gorgias inaugurates the problematic oppositions between emotion and reason, and rhetoric and philosophy. Instead, she shows that, for Plato, rationality and emotion belong together, and she argues that political science and democratic theory are impoverished when they relegate the study of emotions such as shame to other disciplines.
Microaggressions are a new moral category that refers to the subtle yet harmful forms of discriminatory behavior experienced by members of oppressed groups. Such behavior often results from implicit bias, leaving individual perpetrators unaware of the harm they have caused. Moreover, microaggressions are often dismissed on the grounds that they do not constitute a real or morally significant harm. My goal is therefore to explain why microaggressions are morally significant and argue that we are responsible for their harms. I offer a conceptual framework for microaggressions, exploring the central mechanisms used for identification and the empirical research concerning their harm. The cumulative harm of microaggressions presents a unique case for understanding disaggregation models for contributed harms, blame allocation, and individual responsibility within structural oppression. Our standard moral model for addressing cumulative harm is to hold all individual contributors blameworthy for their particular contributions. However, if we aim to hold people responsible for their unconscious microaggressions and address cumulative harm holistically, this model is inadequate. Drawing on Iris Marion Young's social connection model, I argue that we, as individual perpetrators of microaggressions, have a responsibility to respond to the cumulative harm to which we have individually contributed.
Introduction: Fundamental Ontology as a "Fundamental Ethics." In his "Letter on Humanism," Martin Heidegger claims that the fundamental ontology he works out ...
Positive emotions are highly valued and frequently sought. Beyond just being pleasant, however, positive emotions may also lead to long-term benefits in important domains, including work, physical health, and interpersonal relationships. Research thus far has focused on the broader functions of positive emotions. According to the broaden-and-build theory, positive emotions expand people’s thought–action repertoires and allow them to build psychological, intellectual, and social resources. New evidence suggests that positive emotions—particularly gratitude—may also play a role in motivating individuals to engage in positive behaviors leading to self-improvement. We propose and offer supportive evidence that expressing gratitude leads people to muster effort to improve themselves via increases in connectedness, elevation, humility, and specific negative states including indebtedness.
This book is an unusually readable and lucid account of the development of Derrida's work, from his early writings on phenomenology and structuralism to his most recent interventions in debates on psychoanalysis, ethics, and politics. Christina Howells gives a clear explanation of many of the key terms of deconstruction - including différance, trace, supplement, and logocentrism - and shows how they function in Derrida's writing. She explores his critique of the notion of self-presence through his engagement with Husserl, and his critique of humanist conceptions of the subject through an account of his ambivalent and evolving relationship to the philosophy of Sartre. The question of the relationship between philosophy and literature is examined through an analysis of the texts of the 1970s, and in particular Glas, where Derrida confronts Hegel's totalizing dialectics with the fragmentary and iconoclastic writings of Jean Genet. The author addresses directly the vexed questions of the extreme difficulty of Derrida's own writing and of the passionate hostility it arouses in philosophers as diverse as Searle and Habermas. She argues that deconstruction is a vital stimulus to vigilance in both the ethical and political spheres, contributing significantly to debate on issues such as democracy, the legacy of Marxism, responsibility, and the relationship between law and justice. Comprehensive, cogently argued, and up to date, this book will be an invaluable text for students and scholars alike.
Nagel, San Juan, and Mar report an experiment investigating lay attributions of knowledge, belief, and justification. They suggest that, in keeping with the expectations of philosophers, but contra recent empirical findings [Starmans, C., & Friedman, O. (2012). The folk conception of knowledge. Cognition, 124, 272–283], laypeople consistently deny knowledge in Gettier cases, regardless of whether the beliefs are based on ‘apparent’ or ‘authentic’ evidence. In this reply, we point out that Nagel et al. employed a questioning method that biased participants to deny knowledge. Moreover, careful examination of participants’ responses reveals that they attributed knowledge in Gettier cases. We also note that Nagel et al. misconstrue the distinction between ‘apparent’ and ‘authentic’ evidence, and use scenarios that do not feature the structure that characterizes most Gettier cases. We conclude that Nagel et al.’s findings are fully compatible with the claim that laypeople attribute knowledge in Gettier cases in general, but are significantly less likely to attribute knowledge when a belief is generated based on apparent evidence.