Jonathan Kvanvig's book, The Value of Knowledge and the Pursuit of Understanding (Kvanvig, 2003), is a wonderful example of doing epistemology in a style that Kvanvig himself has termed "value-driven epistemology." On this approach, one takes questions about epistemic value to be central to theoretical concerns, including the concern to provide an adequate account of knowledge. This approach yields the demand that theories of knowledge must provide, not just an adequate account of the nature of knowledge, but also an account of the value of knowledge. Given the near-universal assumption that knowledge has a special kind of value, this demand seems reasonable, though surprisingly hard to satisfy. Another consequence of this approach to doing epistemology is that certain assumptions about epistemic value, like what sorts of things have it and what sorts of things don't, and where such value comes from, become much more salient to the epistemic enterprise. In his book, Kvanvig challenges the assumption that knowledge has some unique store of epistemic value. And he investigates the matter by asking questions about what the bearers of epistemic value are and where they get it. He concludes, of course, that knowledge as we have come to conceive it in 21st century epistemology has no such special value.
It is nearly universally acknowledged among epistemologists that a belief, even if true, cannot count as knowledge if it is somehow largely a matter of luck that the person so arrived at the truth. A striking feature of this literature, however, is that while many epistemologists are busy arguing about which particular technical condition most effectively rules out the offensive presence of luck in true believing, almost no one is asking why it matters so much that knowledge be immune from luck in the first place. I argue that the best explanation for the consensus that luck undermines knowledge is that knowledge is, complications aside, credit-worthy true believing. To make this case, I develop both the notions of luck and credit, and sketch a theory of knowledge in those terms. Furthermore, this account also holds promise for being able to solve the “value problem” for knowledge, and it explains why both internal and external conditions are necessary to turn true belief into knowledge.
This paper defends the theory that knowledge is credit-worthy true belief against a family of objections, two instances of which were leveled against it in a recent paper by Jennifer Lackey. Lackey argues that both innate knowledge (if there is any) and testimonial knowledge are too easily come by for it to be plausible that the knower deserves credit for it. If this is correct, then knowledge would appear not to be a matter of credit for true belief. I will attempt to neutralize these objections by drawing a distinction between credit as praiseworthiness and credit as attributability.
Open-mindedness is typically at the top of any list of the intellectual or "epistemic" virtues. Yet, providing an account that simultaneously explains why open-mindedness is an epistemically valuable trait to have and how such a trait is compatible with full-blooded belief turns out to be a challenge. Building on the work of William Hare and Jonathan Adler, I defend a view of open-mindedness that meets this challenge. On this view, open-mindedness is primarily an attitude toward oneself as a believer, rather than toward any particular belief. To be open-minded is to be aware of one's fallibility as a believer, and to acknowledge the possibility that anytime one believes something, one could be wrong. In order to see that such an attitude is epistemically valuable even to an already virtuous agent, some details of the skills and habits of the open-minded agent are elucidated.
It is generally assumed that there are (at least) two fundamental epistemic goals: believing truths, and avoiding the acceptance of falsehoods. As has been often noted, these goals are in conflict with one another. Moreover, the norms governing rational belief that we should derive from these two goals depend on how we weight them relative to one another. However, it is not obvious that there is one objectively correct weighting for everyone in all circumstances. Indeed, as I shall argue, it looks as though there are circumstances in which a range of possible weightings of the two goals are all equally epistemically rational.
Current epistemological dogma has it that the twin goals of believing truths and avoiding errors exhaust our cognitive aspirations. On such a view (call it the TG view), the only evaluations that count as genuinely epistemological are those that evaluate something (a belief, believer, set of beliefs, a cognitive trait or process, etc.) in terms of its connection to these two goods. In particular, this view implies that all the epistemic value of knowledge must be derived from the value of the two goals cited in TG. I argue that this implication is false, and thus that the TG view must be abandoned. I propose a candidate to replace the TG view that makes better sense of the value of knowledge.
Coherence theorists have universally defined justification as a relation only among (the contents of) belief states, in contradistinction to other theories, such as some versions of foundationalism, which define justification as a relation on belief states and appearance states.
This paper focuses on several issues that arise in Miranda Fricker's book Epistemic injustice surrounding her claims about our (moral) culpability for perpetrating acts of testimonial injustice. While she makes frequent claims about moral culpability with respect to specific examples, she never addresses the issue in its full generality, and we are left to extrapolate her general view about moral culpability for acts of testimonial injustice from these more restricted and particular claims. Although Fricker never describes testimonial injustice in such terms, I argue that the fundamental wrong done in acts of testimonial injustice is a form of negligence. Once we understand testimonial injustice in this way, it is easier to see when and why we are culpable for perpetrating such injustices. Indeed, explicitly recognizing testimonial injustice as a form of negligence solves several problems for Fricker's view, which are elucidated briefly along the way. However, construing testimonial injustice as a form of negligence has a cost as well. It highlights the fact that the normative core of Fricker's view is deontological, rather than virtue-theoretic. Fricker claims to be offering a theory of the virtue of testimonial justice along the model of current virtue theories in epistemology, yet it seems that there is no compelling reason to think of what she has offered as a virtue theory, at least not on the model of virtue theories that one finds in epistemology. This is not to say that her view is any less plausible for not being a virtue theory. But calling it a virtue theory affects how one interprets its various claims, and tends to lead one away from, rather than toward, a proper understanding of the deep deontological nature of her account.
I am interested in epistemic virtues for reasons rather different than most. I do not offer a virtue theory of anything, I don't argue that we can solve various long-standing problems in epistemology by appeal to epistemic virtues, nor am I an opponent of any of these things (though I certainly find some of these projects more plausible than others). Rather, my interest in the epistemic virtues stems from a long-standing commitment to epistemic value pluralism, and a belief that, until recently, epistemology has been stifled by an implicit commitment to the hegemony of truth in the realm of epistemic value. Since the language of epistemic virtue theory adds a welcome richness to the vocabulary of epistemic evaluation, I find it appealing for that reason at least.
As a proponent and practitioner of value-driven epistemology, I am very gratified that this collection of essays on epistemic value has been put together. I believe that the recent emphasis on epistemic value within epistemology has already borne fruit, with the promise of much more to come. One reason for this promise is that a value-driven approach to epistemology invites one to ask kinds of questions that, while certainly not prohibited by more traditional epistemological method, do not naturally arise. Twentieth-century analytic epistemology became rather myopically focused on getting the conditions under which one has knowledge just right, leaving aside for the most part questions about the role and value of knowledge in our lives more broadly. After all, while humans are knowers, we are many other things besides. Getting clear on precisely why we care about knowledge, whatever exactly it is, helps us understand how our cognitive lives are entwined with our moral lives, our prudential lives, etc.
Is knowledge more valuable than mere true belief? Few question the value of having true beliefs, and insofar as having knowledge entails (at least) having a true belief, we value knowledge. But traditionally it has been assumed that whatever it takes to turn true belief into knowledge has some additional value. Traditionally, then, philosophers have been committed to what I will call the 'Value Principle'.
In this paper I defend the theory that knowledge is credit-worthy true belief against a family of objections, one of which was leveled against it in a recent paper by Jennifer Lackey. In that paper, Lackey argues that testimonial knowledge is problematic for the credit-worthiness theory because when person A comes to know that p by way of the testimony of person B, it would appear that any credit due to A for coming to believe truly that p belongs to the testifier, B, rather than the hearer, A. If so, then knowledge would appear not to be a matter of credit for true belief. I think that the problem this raises actually has little to do with the fact that the knowledge comes by way of testimony, and that similar objections can be formulated in terms of perceptual and memorial knowledge. I will attempt to neutralize these objections by drawing a distinction between credit as praiseworthiness and credit as attributability.
In this essay, Riggs demonstrates how heterosexism shapes foster-care assessment practices in Australia. Through an examination of lesbian and gay foster-care applicants' assessment reports and with a focus on the heteronormative assumptions contained within them, Riggs demonstrates that foster-care public policy and research on lesbian and gay parenting both promote the idea that lesbian and gay parents are always already "just like" heterosexual parents. To counter this idea of "sameness," Riggs proposes an approach to both assessing and researching lesbian and gay parents that privileges the specific experiences of lesbians and gay men and resists the heterosexualization of lesbian and gay families by focusing on some potentially radical differences shaping lesbian and gay lives.
The deBroglie–Bohm quantum potential is the potential energy function of the wave field. The quantum potential facilitates the transference of energy from wave field to particle and back again which accounts for energy conservation in isolated quantum systems. Factors affecting energy exchanges and the form of the quantum potential are discussed together with the related issues of the absence of a source term for the wave field and the lack of a classical back reaction.
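For reference, the quantum potential discussed in this abstract has a standard textbook form (this is the conventional de Broglie–Bohm expression, not a quotation from the paper): writing the wave function in polar form, the quantum potential appears as an extra term alongside the classical potential in a modified Hamilton–Jacobi equation.

```latex
% Polar decomposition of the wave function:
%   \psi = R\, e^{iS/\hbar}, with R, S real.
% The quantum potential is
\[
  Q \;=\; -\frac{\hbar^{2}}{2m}\,\frac{\nabla^{2}R}{R},
\]
% and the phase S obeys the modified Hamilton–Jacobi equation
\[
  \frac{\partial S}{\partial t}
  \;+\; \frac{(\nabla S)^{2}}{2m}
  \;+\; V \;+\; Q \;=\; 0 .
\]
```

On this reading, Q mediates the energy exchange between wave field and particle that the abstract describes: where Q varies, the particle's kinetic plus classical potential energy changes correspondingly, keeping the total constant in an isolated system.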
Recent work on emergence in physics has focused on the presence of singular limit relations between basal and upper-level theories as a criterion for emergence. However, over-emphasis on the role of singular limit relations has somewhat obscured what it means to say that a property or behaviour is emergent. This paper argues that singular limits are not central to emergence and develops an alternative account of emergence in terms of the failure of basal explainability. As a consequence, emergence and reduction, long held to be two sides of the same coin in the emergentist tradition, are largely decoupled.
This paper begins by tracing interest in emergence in physics to the work of condensed matter physicist Philip Anderson. It provides a selective introduction to contemporary philosophical approaches to emergence. It surveys two exciting areas of current work that give good reason to re-evaluate our views about emergence in physics. One area focuses on physical systems wherein fundamental theories appear to break down. The other area is the quantum-to-classical transition, where some have claimed that a complete explanation of the behaviors and features of the objects of classical physics entirely in quantum terms is now within our grasp. We suggest that the most useful way to approach the emergent/non-emergent distinction is in epistemic terms, and more specifically that the failure of reductive explanation is constitutive of emergence in physics.
Field theories have been central to physics over the last 150 years, and there are several theories in contemporary physics in which physical fields play key causal and explanatory roles. This paper proposes a novel field trope-bundle (FTB) ontology on which fields are composed of bundles of particularized property instances, called tropes, and goes on to describe some virtues of this ontology. It begins with a critical examination of the dominant view about the ontology of fields, that fields are properties of a substantial substratum.
Many explanations in physics rely on idealized models of physical systems. These explanations fail to satisfy the conditions of standard normative accounts of explanation. Recently, some philosophers have claimed that idealizations can be used to underwrite explanation nonetheless, but only when they are what have variously been called representational, Galilean, controllable or harmless idealizations. This paper argues that such a half-measure is untenable and that idealizations not of this sort can have explanatory capacities.
The aim of the current research is to develop an instrument that accurately measures individuals' adherence or nonadherence to both Protestant Ethic and contemporary work values. The study confirms that the traditional Protestant Ethic work values and the contemporary work values are different, and that the instrument used to measure the work values that individuals actually support is valid and reliable. Two scales were developed based on Protestant Ethic work values and contemporary work values. A four-point Likert scale was used to indicate the extent of agreement or disagreement with statements written to represent Protestant Ethic and contemporary work values. Face and content validities of the instrument were established by using two panels of experts — one consisted of authorities in the area of work values; the other consisted of editorial critics. Reliability of the instrument was confirmed by the Kuder-Richardson and test-retest methods. Four sets of work values emerged with significant discrimination among them.
Nick Huggett and Robert Weingard (1994) have recently proposed a novel approach to interpreting field theories in physics, one which makes central use of the fact that a field generally has an infinite number of degrees of freedom in any finite region of space it occupies. Their characterization, they argue, (i) reproduces our intuitive categorizations of fields in the classical domain and thereby (ii) provides a basis for arguing that the quantum field is a field. Furthermore, (iii) it accomplishes these tasks better than does a well-known rival approach due to Paul Teller (1990, 1995). This paper contends that all three of these claims are mistaken, and suggests that Huggett and Weingard have not shown how counting degrees of freedom provides any insight into the interpretation or the formal properties of field theories in physics.
In this paper, criticisms are made of the main tenets of Professor Mellor's argument against ‘backwards’ causation. He requires a closed causal chain of events if there is to be ‘backwards’ causation, but this condition is a metaphysical assumption which he cannot totally substantiate. Other objections to Mellor's argument concern his probabilistic analysis of causation, and the use to which he puts this analysis. In particular, his use of conditional probability inequality to establish the ‘direction’ of causation is shown to be in error.
A signal development in contemporary physics is the widespread use, in explanatory contexts, of highly idealized models. This paper argues that some highly idealized models in physics have genuine explanatory power, and it extends the explanatory role for such idealizations beyond the scope of previous philosophical work. It focuses on idealizations of nonlinear oscillator systems.
The performance of 93 children aged 3 and 4 years on a battery of different counterfactual tasks was assessed. Three measures (short causal chains, location change counterfactual conditionals, and false syllogisms), but not a fourth (long causal chains), were correlated, even after controlling for age and receptive vocabulary. Children's performance on our counterfactual thinking measure was predicted by receptive vocabulary ability and inhibitory control. The role that domain general executive functions may play in 3- to 4-year-olds' counterfactual thinking development is discussed.
How do therapists learn to manage sexual feelings in the therapeutic relationship in an ethical, responsible manner? Data from 293 university-based psychotherapists show that the minority who report that their training prepared them to do so "very well" were more likely to have received "content-specific" training related to the topic or an opportunity to explore themselves as sexual beings, or both. In addition, they had experience with supervisors who modeled the belief that sexual feelings are a normal, expected part of any human relationship and must be anticipated and planned for by therapists.
Recent work by Robert Batterman and Alexander Rueger has brought attention to cases in physics in which governing laws at the base level “break down” and singular limit relations obtain between base- and upper-level theories. As a result, they claim, these are cases with emergent upper-level properties. This paper contends that this inference—from singular limits to explanatory failure, novelty or irreducibility, and then to emergence—is mistaken. The van der Pol nonlinear oscillator is used to show that there can be a full explanation of upper-level properties entirely in base-level terms even when singular limits are present. Whether upper-level properties are emergent depends not on the presence of a singular limit but rather on details of the ampliative approximation methods used. The paper suggests that focusing on explanatory deficiency at the base level is key to understanding emergence in physics.
A common methodological adage holds that diverse evidence better confirms a hypothesis than does the same amount of similar evidence. Proponents of Bayesian approaches to scientific reasoning such as Horwich, Howson and Urbach, and Earman claim to offer both a precise rendering of this maxim in probabilistic terms and an explanation of why the maxim should be part of the methodological canon of good science. This paper contends that these claims are mistaken and that, at best, Bayesian accounts of diverse evidence are crucially incomplete. This failure should lend renewed force to a long-neglected global worry about Bayesian approaches.
Potential ethical issues can arise during the process of epidemiological classification. For example, unnatural infant deaths are classified as accidental deaths or homicides. Societal sensitivity to the physical abuse and neglect of children has increased over recent decades. This enhanced sensitivity could impact reported infant homicide rates. Infant homicide and accident mortality rates in boys and girls in the USA from 1940 to 2005 were analysed. In 1940, infant accident mortality rates were over 20 times greater than infant homicide rates in both boys and girls. After about 1980, when the ratio of infant accident mortality rates to infant homicide rates decreased to less than five, and the sum of infant accident and homicide rates became relatively constant, further decreases in infant accident mortality rates were associated with increases in reported infant homicide rates. These findings suggest that the dramatic decline of accidental infant mortality and recent increased societal sensitivity to child abuse may be related to the increased infant homicide rates observed in the USA since 1980 rather than an actual increase in societal violence directed against infants. Ethical consequences of epidemiological classification, involving the principles of beneficence, non-maleficence and justice, are suggested by observed patterns in infant accidental deaths and homicides in the USA from 1940 to 2005.
This discussion provides a brief commentary on each of the papers presented in the symposium on the conceptual foundations of field theories in physics. In Section 2 I suggest an alternative to Paul Teller's (1999) reading of the gauge argument that may help to solve, or dissolve, its puzzling aspects. In Section 3 I contend that Sunny Auyang's (1999) arguments against substantivalism and for "objectivism" in the context of gauge field theories face serious worries. Finally, in Section 4 I claim that Gordon Fleming's (1999) proposal for hyperplane-dependent Newton-Wigner fields differs importantly from his previous arguments about hyperplane-dependent properties in quantum mechanics.
Are animals not ours to use? According to proponents of veganism such as Gary Francione, any and all use of animals by humans is exploitative and wrong. It is wrong because animals have intrinsic worth and humans' use of animals fails to respect that worth. Contra Francione, I argue that there are conditions under which it may be morally appropriate to collect, consume, sell, or otherwise use animal products. Francione is mistaken in his belief that assigning intrinsic worth to a being is impossible if said being is also conceived as a resource. Using and (non-instrumental) valuing are not mutually exclusive; if they were, many if not most human relationships would be deemed morally unacceptable. Through a series of thought experiments involving intra-human relationships, I suggest that moral condemnation of relationships within which a less dependent party regularly takes from a more dependent party is indefensible. In fact, relationships of use between asymmetrically dependent parties are essential to the functioning of cooperative society, and are therefore desirable. My aims with this article are to convince readers of the need to reject principled veganism, and to garner support for new philosophical accounts of morally appropriate human-nonhuman animal relationships.