This article develops an interpretation of Hegel that aims to show how a proper understanding of the nature of speculative sentences might achieve what Kant set out to do: to vindicate our most fundamental claims to knowledge as actual knowledge, rather than mere acts of believing. To this end, it develops a conception of speculative geographies (or “maps”) as an interpretive tool and introduces a Hegelian-inspired distinction between empirical, generic, and speculative sentences. On this reading, Kant’s employment of the “boundary concept” of a noumenon is bound to fail, as it needs to employ a contrast between our human point of view and that of an omniscient God – which turns out to be an aperspectival “view from nowhere” and thus an incoherent notion. The article ends by suggesting ways in which Hegel’s logical analysis can help us better comprehend the reflective ascent necessary to make our conceptual differentiations and typical ways of understanding intelligible to ourselves.
That knowledge is factive – knowing that p entails p – is quite likely the most sacrosanct principle in epistemology and is near-universally accepted. Recently, however, Bricker, Buckwalter, and Turri have all argued that we can and often do know approximations that are strictly speaking false. My goal with this paper is to advance this nascent non-factive project in two key ways. First, I provide a critical review of these recent arguments against the factivity of knowledge, allowing us to observe that elements of these arguments mutually compensate for their respective weaknesses, thereby offering the non-factive project a much stronger foundation than when these arguments were isolated. Next, I argue tentatively in favor of Bricker’s truthlikeness framework over the representational adequacy account favored by Buckwalter and Turri. Taken together, while none of this constitutes a knock-down argument against factivity, it does allow us to quiet some of the more immediate worries surrounding the non-factive project.
How does Quine fare in the first decades of the twenty-first century? In this paper I examine a cluster of Quinean theses that, I believe, are especially fruitful in meeting some of the current challenges of epistemology and ontology. These theses offer an alternative to the traditional bifurcations of truth and knowledge into factual and conceptual-pragmatic-conventional, the traditional conception of a foundation for knowledge, and traditional realism. To make the most of Quine’s ideas, however, we have to take an active stance: accept some of his ideas and reject others, sort different versions of the relevant ideas, sharpen or revise some of the ideas, connect them with new, non-Quinean ideas, and so on. As a result, the paper pits Quine against Quine, in an attempt to identify those Quinean ideas that have a lasting value and to sketch potential developments.
Seeking a decision theory that can handle both the Newcomb problems that challenge evidential decision theory and the unstable problems that challenge causal decision theory, some philosophers have recently turned to ‘graded ratifiability’. However, the graded ratifiability approach to decision theory is, despite its virtues, unsatisfactory; for it conflicts with the platitude that it is always rationally permissible for an agent to knowingly choose their best option.
A number of philosophers have recently proposed several alleged cases of “knowledge from falsehood,” i.e., cases of inferential knowledge epistemised by an inference with a false crucial premise. This paper examines such cases and argues against interpreting them as cases of knowledge from falsehood. Specifically, I argue that the inferences in play in such cases are in no position to epistemise their conclusions.
The principle of epistemic closure is the claim that what is known to follow from knowledge is known to be true. This intuitively plausible idea is endorsed by a vast majority of knowledge theorists. There are significant problems, however, that have to be addressed if epistemic closure – closed knowledge – is endorsed. The present essay locates the problem for closed knowledge in the separation it imposes between knowledge and evidence. Although it might appear that all that stands between knowing the truth of the premises of a valid inference and knowledge of its conclusion is inferring it from the premises, the evidence for each of the premises may jointly count against the conclusion. The intuitive view regarding inferred knowledge says one thing, the evidence says another. One epistemological framework that seems to have the resources to resolve this tension endorses the view that knowledge always requires conclusive evidence. A second framework resolves the tension by limiting the scope of the closure principle. Only inferences drawn directly from propositions contained in the scope of a single knowledge operator are considered closed. The aim of the present essay is to revive the unpopular third option, the idea that knowledge is open. The essay proceeds by arguing that in different ways the two former frameworks only succeed in relocating the problem, not in resolving it. The first framework, the infallibilist view, relocates the problem to a sharp separation of knowledge of the occurrence of events from knowledge of their chance of occurring, a separation leading to several significant additional problems. The fallibilist view, the second framework, in endorsing closure neglects to take into full account the ways in which evidence fails to be transitive. For instance, evidence can count in favor of a conjunction while counting against each of its conjuncts.
This fact, which is argued for in the essay on probabilistic as well as non-probabilistic grounds, is used as the foundation of an argument against closed knowledge that can be used as a way to understand several of the most fundamental challenges of epistemology. Not only can an open knowledge view that is based on open evidence resolve all these problems in a simple and natural way, it can also respond to formidable challenges that significantly hinder other open knowledge views. There are good reasons, then, to view both knowledge and evidence as open.
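That evidence can favor a conjunction while disfavoring each conjunct is probabilistically coherent; the following quick numerical check (with illustrative numbers, not taken from the essay) exhibits a prior and a posterior distribution over two propositions A and B where conditioning lowers the probability of each conjunct yet raises that of the conjunction.

```python
# Each distribution assigns weights to the four truth-value combinations
# of (A, B). Numbers are made up purely for illustration: under the
# posterior, A and B become rarer individually but perfectly correlated.
prior     = {(1, 1): 0.1, (1, 0): 0.4, (0, 1): 0.4, (0, 0): 0.1}
posterior = {(1, 1): 0.4, (1, 0): 0.0, (0, 1): 0.0, (0, 0): 0.6}

def p(dist, pred):
    """Probability of the event picked out by pred under dist."""
    return sum(w for (a, b), w in dist.items() if pred(a, b))

for name, pred in [("A", lambda a, b: a),
                   ("B", lambda a, b: b),
                   ("A and B", lambda a, b: a and b)]:
    print(name, p(prior, pred), "->", p(posterior, pred))
# A goes 0.5 -> 0.4, B goes 0.5 -> 0.4, yet A-and-B goes 0.1 -> 0.4
```

So the evidence lowers each conjunct's probability while raising the conjunction's, just as the abstract claims.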
This paper has two goals. The first is to argue that the field of bioethics in general, and the literature on ideal vs. nonideal theory in particular, has underemphasized a primary problem for normative theorizing: the role of conditioning principles. I define these as principles that implicitly or explicitly ground, limit, or otherwise determine the construction and function of other principles, and that, as a result, profoundly impact concept formation, perception, judgment, and action. The second is to demonstrate that ableism is one such conditioning principle and that it prevents the field of bioethics and the practice of biomedicine from achieving the aim of justice as fairness. After briefly addressing the history and critiques of principlism in bioethics, I lay out and defend my account of conditioning principles. I then argue that ableism is one such principle and demonstrate it at work through an analysis of a storied debate between Eva Kittay, Peter Singer, and Jeff McMahan. In conclusion, I contend that the ethical and philosophical dangers of conditioning principles are too easily exacerbated by ideal theory frameworks, and I do so by demonstrating how they are especially liable to generate epistemic injustice, especially contributory and hermeneutical injustice.
Is reality the basis of everything, or does reality itself have another basis? What makes reality – not the real things – active, existent? The question of what is real seems to be an easy one, because in our daily lives we are and must be naive realists. We ourselves, the things around us, the world, the facts: all that is real. Yet there must be several concepts of reality if we want to say that not only the physical or material things of everyday life are real, but also, e.g., numbers, π, Dr. Faustus, thoughts, emotions, and other things. On the other hand, given the difference between classical and modern physics, we see that even the form of knowledge which seems to be most responsible for reality, natural science, cannot give the desired uniqueness in terms of what it itself wants or needs to understand as real. Alternatively, when we see that nothing can be and nothing can be real without being in a world, and when we understand the world as the order of things, which I call world-I, then this leads us to the speculative answer that it is exactly the “unreal” world-I which is the reason why everyday reality, world-II, is real. World-I is the basis for the reality of our empirical world-II. The considerations presented here have nothing to do with the idealistic conception of possibility, founded in the power of the subject, nor with the existential concept of potentiality, founded in the Entwurf des Daseins.
This essay argues against David Carr’s relativism by clarifying the in-principle requirements appropriate to non-relative truths and showing that de facto differences of conceptual frameworks threaten none of them. Non-relative truths are not threatened by history. This defense of non-relative truth belongs to a larger defense of Husserlian “science” that shows how essences, even those “delivered” by history, have a universal “governance” and can be affirmed in non-relative truths – as such science requires. If history also allows the other qualities of Husserlian science to obtain, then, the essay concludes, such science can exist even as a “situated science.”
We explore consequences of the view that to know a proposition your rational credence in the proposition must exceed a certain threshold. In other words, to know something you must have evidence that makes rational a high credence in it. We relate such a threshold view to Dorr et al.’s (2014: 277–287) argument against the principle they call fair coins: “If you know a coin won’t land tails, then you know it won’t be flipped.” They argue for rejecting fair coins because it leads to a pervasive skepticism about knowledge of the future. We argue that the threshold view of evidence and knowledge gives independent grounds to reject fair coins.
What is the task of the university? We can offer the following definition: to unite ideas and carry them forward. Why is it necessary to unite and advance ideas? To make a breakthrough, the events that are to take place must be thought through in advance and comprehensively. Ideas are the beginning of a breakthrough, and they do not develop only within the university. Ideas may develop within a unit of participation in governance, and they may also take shape within circles of friends. Moreover, they can be formed by a single person and presented to the world in a book.
The leading idea of this article is that one cannot acquire knowledge of any non-epistemic fact by virtue of knowing that one knows something. The lines of reasoning involved in the surprise exam paradox and in Williamson’s _reductio_ of the KK-principle, which demand that one can, are thereby undermined, and a new type of counterexample to epistemic closure emerges.
The paper attempts to give a solution to Fitch's paradox through the strategy of reformulating the paradox in temporal logic, together with a notion of knowledge as a kind of ceteris paribus modality. An analogous solution has been offered, in a different context, to solve the problem of metaphysical determinism.
The prequel to this paper introduced the topic of iteration principles in epistemology and surveyed some arguments in support of them. In this sequel, I'll consider two influential families of objection to iteration principles. The first turns on the idea that they lead to some variety of skepticism, and the second turns on ‘margin for error’ considerations adduced by Timothy Williamson.
Epistemic iteration principles are principles according to which one or another epistemic operator automatically iterates – e.g., if it is known that P, then it is known that it is known that P; or, if there is evidence that P, then there is evidence that there is evidence that P. This article provides a survey of various arguments for and against epistemic iteration principles, with a focus on arguments relevant to a wide range of such principles.
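In standard epistemic-logic notation (a rendering assumed here for illustration, not quoted from the article), the two examples read:

```latex
\begin{align*}
  Kp &\rightarrow KKp && \text{(the KK principle: knowledge iterates)}\\
  Ep &\rightarrow EEp && \text{(evidence iterates)}
\end{align*}
```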
One of the key tenets of Linda Zagzebski’s book “Epistemic Authority” is the Preemption Thesis. It says that, when an agent learns that an epistemic authority believes that p, the rational response for her is to adopt that belief and to replace all of her previous reasons relevant to whether p by the reason that the authority believes that p. I argue that such a “Hobbesian approach” to epistemic authority yields problematic results. This becomes especially virulent when we apply Preemption to cases in which the agent and the authority share their belief, maybe even for the same reasons, or in which both have either a positive or a negative graded doxastic attitude toward a given proposition. As an alternative I propose a “Socratic account”, according to which the authority will not only motivate us to adopt her belief, but also provide us with higher-order reasons for re-assigning our own considerations their proper place in the web of reasons for and against the view in question.
The topic of this article is the closure of a priori knowability under a priori knowable material implication: if a material conditional is a priori knowable and if the antecedent is a priori knowable, then the consequent is a priori knowable as well. This principle is arguably correct under certain conditions, but there is at least one counterexample when completely unrestricted. To deal with this, Anderson proposes to restrict the closure principle to necessary truths and Horsten suggests restricting it to formulas that belong to less expressive languages. In this article it is argued that Horsten’s restriction strategy fails, because one can deduce that knowable ignorance entails necessary ignorance from the closure principle and some modest background assumptions, even if the expressive resources do not go beyond those needed to formulate the closure principle itself. It is also argued that it is hard to find a justification for Anderson’s restricted closure principle, because one cannot deduce it even if one assumes very strong modal and epistemic background principles. In addition, there is an independently plausible alternative closure principle that avoids all the problems without the need for restriction.
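Writing $\mathcal{K}\varphi$ for "it is a priori knowable that $\varphi$" (notation assumed here for illustration, not the article's own), the unrestricted closure principle at issue can be stated as:

```latex
\bigl(\mathcal{K}(\varphi \rightarrow \psi) \land \mathcal{K}\varphi\bigr) \rightarrow \mathcal{K}\psi
```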
What is the epistemological structure of situations where many small risks amount to a large one? Lottery and preface paradoxes and puzzles about quantum-mechanical blips threaten the idea that competent deduction is a way of extending our knowledge. Seemingly, everyday knowledge involves small risks, and competently deducing the conjunction of many such truths from them yields a conclusion too risky to constitute knowledge. But the dilemma between scepticism and abandoning MPC (multi-premise closure) is false. In extreme cases, objectively improbable truths are known. Safety is modal, not probabilistic, in structure, with closure and factiveness conditions. It is modelled using closeness of worlds. Safety is analogous to knowledge. It suggests an interpretation of possible worlds semantics for epistemic logic. To avoid logical omniscience, a relation of epistemic counterparthood between formulas is introduced. This supports a safety conception of knowledge and formalizes how extending knowledge by deduction depends on logical competence.
No reader of The Relevance of Charles Peirce will fail to be impressed by what Max Fisch calls "The Range of Peirce's Relevance." This exciting volume invites scholars in many of the fields of contemporary philosophy to see what Peirce has to contribute to their methods and their conclusions. Articles in the collection offer a more divided interpretation, however, of the meaning of Peirce's relevance. For some, Peirce's relevance is "extensive": like a Renaissance genius, his intellect surveys the universe of human expression, and, by Jove, he has something smart to say about everything! The authors of these articles show us how Peirce enriches the various, established disciplines of philosophy of interest to them, pointing out both the significance and the limitations of his contribution. For others, Peirce's relevance is "intensive": like some philosophic physician, Peirce struggles to cure a malady that infects the Cartesian-Kantian tradition to which he ultimately belongs. Selecting particular instances of this struggle, the authors of these articles try to show how Peirce challenges accepted practices in contemporary philosophy, succeeding in his critical task without necessarily offering unproblematic alternatives. As a whole, the collection successfully promotes the first interpretation of Peirce's relevance, but fails to give sufficient attention to the second. Perhaps the failure is prudent. For the sake of attracting interest among the uninitiated, The Relevance of Charles Peirce offers the most Peirce with the least offense. For that, we are indebted to Eugene Freeman, editor of the collection, as well as of The Monist Library of Philosophy. Freeman first proposed devoting two issues of The Monist to "Peirce's Relevance" (Vol. 63, No. 3, and Vol. 65, No. 2), then did us the service of publishing the present volume by binding The Monist articles together with five other already published pieces.
In this thesis we are concerned with two major issues in knowledge representation: the semantics of negation in knowledge representation languages, and combining knowledge bases. We take two different approaches to characterizing the semantics of negation in knowledge representation languages. The first approach is based on an iterated fixpoint computation of the semantics. We present a uniform framework for iterated fixpoint semantics of logic programs. Based on this framework we study three particular instances in detail: Generalized Well-Founded Semantics (GWFS), WF³ semantics, and Disjunctive Well-Founded Semantics. For WF³ semantics we present an equivalent procedural characterization, and for GWFS and WF³ semantics we present equivalent model-theoretic characterizations. The second approach is based on extending the traditional way of characterizing default theories and auto-epistemic theories. We introduce the concept of "classes" to be able to characterize the class of all theories when they are represented as normal logic programs, default theories, auto-epistemic theories, and non-monotonic modal theories. We study various aspects of the "class" semantics and show that the concept of classes can also be used to give a simple specification of the well-founded semantics. We study the problem of combining knowledge bases with respect to three knowledge representation languages: logic programs, first-order theories, and default theories. We formalize the various properties that the combined knowledge base should satisfy: mainly consistency with respect to the integrity constraints, and maximality and correctness with respect to the union of the knowledge bases. To satisfy the maximality criterion, the combined knowledge base of a set of Horn programs may have to be a disjunctive logic program. Unlike normal and disjunctive logic programs, the union of a set of first-order theories and the union of a set of default theories may be inconsistent. We present methods that take these aspects into account and construct the combined knowledge base of a set of knowledge bases. We compare combining knowledge bases with the problem of updating a knowledge base.
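The core of the iterated fixpoint idea can be sketched for the simplest case: a definite (negation-free) logic program evaluated by repeatedly applying the immediate consequence operator T_P until the set of derived atoms stabilises. This is only the baseline the thesis builds on (the semantics it studies handle negation and disjunction, which this sketch does not); rule and atom names below are illustrative.

```python
# Minimal sketch of iterated fixpoint evaluation for a definite logic
# program. A rule is a pair (head, body), where body is a set of atoms.

def t_p(rules, atoms):
    """One application of T_P: heads of rules whose bodies already hold."""
    return {head for head, body in rules if body <= atoms}

def least_fixpoint(rules):
    """Iterate T_P from the empty set until no new atoms are derived."""
    atoms = set()
    while True:
        new = atoms | t_p(rules, atoms)
        if new == atoms:
            return atoms
        atoms = new

# Program: a.   b :- a.   c :- a, b.   d :- e.
rules = [
    ("a", set()),
    ("b", {"a"}),
    ("c", {"a", "b"}),
    ("d", {"e"}),
]

print(sorted(least_fixpoint(rules)))  # -> ['a', 'b', 'c']  (d is not derivable)
```

With negation in rule bodies, a single monotone iteration no longer suffices, which is exactly what motivates the alternating and iterated constructions behind well-founded-style semantics.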
This paper starts with an analysis of the maker’s knowledge principle as one of the main characteristics of Modern epistemology. We start by showing that maker’s knowledge can be understood in two ways: 1) in a negative sense, as a way of establishing limits to human knowledge: we can only know what we create; and 2) in a positive sense, as legitimizing human knowledge: we effectively know what we create. We then proceed to examine the roots of the maker’s knowledge principle in the context of the transition from Greek philosophy to early Christian thought, seeing Philo of Alexandria as perhaps the first to formulate an early version of the principle. We conclude that it is the Christian conception of God as creator that makes possible a redefinition of the relation between knowing and creating, opening the way to the Modern formulation of the principle.
Radical skepticism is the view that we know nothing or at least next to nothing. Nearly no one actually believes that skepticism is true. Yet it has remained a serious topic of discussion for millennia and it looms large in popular culture. What explains its persistent and widespread appeal? How does the skeptic get us to doubt what we ordinarily take ourselves to know? I present evidence from two experiments that classic skeptical arguments gain potency from an interaction between two factors. First, people evaluate inferential belief more harshly than perceptual belief. Second, people evaluate inferential belief more harshly when its content is negative (i.e., that something is not the case) than when it is positive (i.e., that something is the case). It just so happens that potent skeptical arguments tend to focus our attention on negative inferential beliefs, and we are especially prone to doubt that such beliefs count as knowledge. That is, our cognitive evaluations are biased against this specific combination of source and content. The skeptic sows seeds of doubt by exploiting this feature of our psychology.
Herein I investigate how four dogmas underpinning the traditional concepts of universality – the genus, the class, and the abstract universal – generate four paradoxes of self-reference. The four dogmas are the following: (1) that contradiction entails the total absence of determinacy, (2) the necessary finitude of the concept, (3) the separation of the principles of universality and particularity, and (4) the necessity of appealing to foundations. In section III I show how these dogmas underpin the paradoxes of self-reference and how one cannot make progress on these paradoxes as long as these four dogmas are in place. Corresponding to the abovementioned dogmas are the four paradoxes of self-reference: (1a) the problem of the …
In “Formal Problems about Knowledge,” Roy Sorensen examines epistemological issues that have logical aspects. He uses Fitch's proof for unknowables and the surprise test paradox to illustrate the hopes of the modal logicians who developed epistemic logic, and he considers the epistemology of proof with the help of the knower paradox. One solution to this paradox is that knowledge is not closed under deduction. Sorensen reviews the broader history of this maneuver along with the relevant alternatives model of knowledge, which assumes that “know” is an absolute term like “flat.” Sorensen argues that the difference between epistemic absolute terms and extensional absolute terms gives rise to an asymmetry that undermines recent claims that there is a structural parallel between the supervaluational and epistemicist theories of vagueness, and he suggests that we have overestimated the ability of logical demonstration to produce knowledge.
The Knower Paradox belongs to the class of paradoxes of self-reference. It demonstrates that any theory Σ which (1) extends Robinson arithmetic Q, (2) includes a unary knowledge predicate K, and (3) contains certain elementary epistemic principles involving K is inconsistent. In this paper I present different versions of the Knower Paradox (both in the framework of first-order arithmetic and in modal logic). There are several solutions to the paradox. Some of them I discuss in detail, namely the solution developed within modal logic, the solution proposed by C. A. Anderson, and the solution proposed by P. Égré. The common defect of these proposals is the connection they establish between the concepts of knowledge and provability. Finally, I suggest a solution using the basic ideas of the revision theory of definitions.
Nozick is the author of the conditional definition of knowledge, in which two subjunctive conditionals replace the internalistic notion of justification. If you know that p, you have a true belief that p, and in close possible worlds you would accept p when p is true and would not accept p when p is false. Nozick agrees with skeptics that we do not know that we are not brains in a vat. But he claims that we do know all the trivial things we think we know. The only way to accept both theses is to deny the Principle of Closure: according to Nozick, knowledge is not closed under known logical implication. But is it right to deny the principle? Our everyday knowledge implies that the skeptic is wrong. If I know that I am reading a text on Earth, it is false that I am on Alpha Centauri floating in a tank. To reject the skeptic it is enough to deny the transparency principle (if I know, I know that I know). When knowledge is possible without knowledge about that knowledge, we can know even if we are not able to prove that we know.
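The two subjunctive conditionals mentioned are standardly set out as follows (the familiar textbook rendering of Nozick's tracking conditions, with $\Box\!\!\rightarrow$ for the subjunctive conditional; not quoted from this abstract):

```latex
S \text{ knows that } p \iff
\begin{cases}
  (1)\; p \text{ is true}\\
  (2)\; S \text{ believes that } p\\
  (3)\; \lnot p \mathrel{\Box\!\!\rightarrow} \lnot(S \text{ believes that } p)\\
  (4)\; p \mathrel{\Box\!\!\rightarrow} S \text{ believes that } p
\end{cases}
```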
An ambitious work, based on a lifetime of reading and research, Science, Language and the Human Condition provides a strong defense of a realist theory of knowledge, opposing various forms of contemporary positivism and subjectivism. Kaplan identifies with the pragmatic tradition of Peirce, James, and Dewey, and acknowledges a particular intellectual debt to Morris Cohen. He views that tradition as fundamentally Aristotelian in orientation, as one that recognizes a plurality of methods of inquiry as well as the open-ended character of science. Encyclopedic in his treatment of contemporary figures and movements, Kaplan calls for a synoptic view of the world and the recognition of objective moral values in that world. He affirms the possibility of knowledge both of nature and of the moral order. But as he puts it, knowledge is not a seamless whole. Knowledge is pragmatically constructed in terms of multiple purposes and levels of certainty. While our knowledge of the world is objective, no single way of knowing yields absolute certainty. Thus Kaplan is equally critical of those positivists who would adhere to the model of mathematical physics as the only source of reliable knowledge and of those system builders such as Hegel and Marx whose global vision is inevitably obscurantist. He is just as harsh in dealing with Derrida and with the deconstructionist movement.
This volume of original essays assesses Nozick's analyses of knowledge and evidence and his approach to skepticism. Several of the contributors claim that Nozick has not succeeded in rebutting the skeptic; some offer fresh accounts of skepticism and its flaws; others criticize Nozick's externalist accounts of knowledge and evidence; still others welcome externalism but attempt to replace Nozick's accounts of knowledge and evidence with more plausible analyses.
We discuss our surgical philosophy concerning the subtle interplay between the size of the surgical margin taken and the resultant morbidity from ablative oncological. procedures, which is ever more evident in the treatment of head and neck malignancy. The extent of tissue resection is determined by the "trade off" between cancer control and the perioperative, functional and aesthetic morbidity and mortality of the surgery. We also discuss our dilemmas concerning recent minimally invasive endoscopic microsurgical. techniques for the trans-oral laser removal. (...) or co-ablation of aero-digestive tract tumours, which result in a minimal. surgical margin of oncological clearance. By a process of inductive argument as to the nature of the surgical margin, we consider whether the risks of taking a lesser margin with adjuvant therapy is justified by the attendant gain in reduced surgical morbidity and the possible costs in tumour control. (c) 2006 Elsevier Ltd. All rights reserved. (shrink)