I argue that believing that p implies having a credence of 1 in p. This is true because the belief that p involves representing p as being the case; representing p as being the case involves not allowing for the possibility of not-p; and having a credence greater than 0 in not-p involves regarding not-p as a possibility.
This book argues that correspondence theories of truth fail because the relation that holds between a true thought and a fact is that of identity, not correspondence. Facts are not complexes of worldly entities which make thoughts true; they are merely true thoughts. According to Julian Dodd, the resulting modest identity theory, while not defining truth, correctly diagnoses the failure of correspondence theories, and thereby prepares the ground for a defensible deflation of the concept of truth.
Introduction -- The type/token theory introduced -- Motivating the type/token theory : repeatability -- Nominalist approaches to the ontology of music -- Musical anti-realism -- The type/token theory elaborated -- Types I : abstract, unstructured, unchanging -- Types introduced and nominalism repelled -- Types as abstracta -- Types as unstructured entities -- Types as fixed and unchanging -- Types II : platonism -- Introduction : eternal existence and timelessness -- Types and properties -- The eternal existence of properties reconsidered -- Types and patterns -- Defending the type/token theory I -- Unstructuredness and analogical predication -- Musical works as fixed and unchanging -- Abstractness and audibility (again) -- Works and interpretations -- Conclusion and resumé -- Defending the type/token theory II : musical platonism -- Platonism it is : replies to Anderson and Levinson -- The existence conditions of works of music -- Composition as creative discovery -- The nature of the compositional process : replies to objections -- Composition and aesthetic appraisal : a reply to Levinson -- Composition and aesthetic appraisal : understanding, interpretation, and correctness -- Musical works as continuants : a theory rejected -- A theory introduced -- Explicating and motivating the continuant view -- The continuant view and repeatability -- Further objections to the continuant view -- Musical works as compositional actions : a critique -- Currie's action-type hypothesis -- Davies's performance theory -- Sonicism I : against instrumentalism -- Sonicism introduced -- Sonicism motivated : moderate empiricism -- Instrumentation : timbral sonicism introduced -- Scores -- Instrumentation, artistic properties, and aesthetic content -- Levinson's rejoinder -- Sonicism II : against contextualism -- Introduction : formulating contextualism -- Contextualist ontological proposals -- Levinson's doppelgänger thought-experiments -- Artistic, representational, and object-directed expressive properties -- Aesthetic and non-object-directed expressive properties -- Conclusion : the place of context.
This paper argues that a consideration of the problem of providing truthmakers for negative truths undermines truthmaker theory. Truthmaker theorists are presented with an uncomfortable dilemma. Either they must take up the challenge of providing truthmakers for negative truths, or else they must explain why negative truths are exceptions to the principle that every truth must have a truthmaker. The first horn is unattractive since the prospects of providing truthmakers for negative truths do not look good: neither absences, nor totality states of affairs, nor Graham Priest and J.C. Beall’s ‘polarities’ (Beall, 2000; Priest, 2000) are up to the job. The second horn, meanwhile, is problematic because restricting the truthmaker principle to atomic truths, or weakening it to the thesis that truth supervenes on being, undercuts truthmaker theory’s original motivation. The paper ends by arguing that truthmaker theory is, in any case, an under-motivated doctrine because the groundedness of truth can be explained without appeal to the truthmaker principle. This leaves us free to give the deflationary explanation of negative truths that common sense suggests.
This book pursues the problem of whether violence can be understood to be constitutive of its own sense or meaning, as opposed to being merely instrumental. Dodd draws on the resources of phenomenological philosophy, and the study takes the form of a series of dialogues between figures both inside and outside of this tradition. The central figures considered include Carl von Clausewitz, Carl Schmitt, Hannah Arendt, Jean-Paul Sartre, Ernst Jünger, and Martin Heidegger, and the study concludes with an analysis of the philosophy of Jan Patocka.
Newcomers and more experienced feminist theorists will welcome this even-handed survey of the care/justice debate within feminist ethics. Grace Clement clarifies the key terms, examines the arguments and assumptions of all sides to the debate, and explores the broader implications for both practical and applied ethics. Readers will appreciate her generous treatment of the feminine, feminist, and justice-based perspectives that have dominated the debate. Clement also goes well beyond description and criticism, advancing the discussion through the incorporation of a broad range of insights into a new integration of the values of care and justice. Care, Autonomy, and Justice marks a major step forward in our understanding of feminist ethics. It is both direct and helpful enough to work as an introduction for students and insightful and original enough to make it necessary reading for scholars.
In this paper I argue for a doctrine I call ‘infallibilism’, which I stipulate to mean that if S knows that p, then the epistemic probability of p for S is 1. Some fallibilists will claim that this doctrine should be rejected because it leads to scepticism. Though it's not obvious that infallibilism does lead to scepticism, I argue that we should be willing to accept it even if it does. Infallibilism should be preferred because it has greater explanatory power than fallibilism. In particular, I argue that an infallibilist can easily explain why assertions of ‘p, but possibly not-p’ (where the ‘possibly’ is read as referring to epistemic possibility) are infelicitous in terms of the knowledge rule of assertion. But a fallibilist cannot. Furthermore, an infallibilist can explain the infelicity of utterances of ‘p, but I don't know that p’ and ‘p might be true, but I'm not willing to say that for all I know, p is true’, and why, when a speaker thinks p is epistemically possible for her, she will agree (if asked) that for all she knows, p is true. The simplest explanation of these facts entails infallibilism. Fallibilists have tried and failed to explain the infelicity of ‘p, but I don't know that p’, and have not even attempted to explain the last two facts. I close by considering two facts that seem to pose a problem for infallibilism, and argue that they don't.
What is John Cage's 4′33″? This paper disambiguates this question into three sub-questions concerning, respectively, the work's ontological nature, the art form to which it belongs, and the genre it is in. We shall see that the work's performances consist of silence, that it is a work of performance art, and that it belongs to the genre of conceptual art. Seeing the work in these ways helps us to understand it better, and promises to assuage somewhat the puzzlement and irritation of those who are at first resistant to its charms.
This paper argues that, within the Western ‘classical’ tradition of performing works of music, there exists a performance value of authenticity that is distinct from that of complying with the instructions encoded in the work's score. This kind of authenticity—interpretive authenticity—is a matter of a performance's displaying an understanding of the performed work. In the course of explaining the nature of this norm, two further claims are defended: that the respective values of interpretive authenticity and score compliance can come into conflict; and that when this happens, compromising ideal score compliance for the sake of making the performance more interpretively authentic can make for a better performance.
Concessive knowledge attributions (CKAs) are knowledge attributions of the form ‘S knows p, but it’s possible that q’, where q obviously entails not-p (Rysiew, Noûs 35: 477–514, 2001). The significance of CKAs has been widely discussed recently. It’s agreed by all that CKAs are infelicitous, at least typically. But the agreement ends there. Different writers have invoked them in their defenses of all sorts of philosophical theses; to name just a few: contextualism, invariantism, fallibilism, infallibilism, and the thesis that the knowledge rules for assertion and practical reasoning are false. In fact, there is a lot of confusion about CKAs and their significance. I try to clear some of this confusion up, as well as show what their significance is with respect to the debate between fallibilists and infallibilists about knowledge in particular.
Descriptivism in the ontology of art is the thesis that the correct ontological proposal for a kind of artwork cannot show the nascent ontological conception of such things embedded in our critical and appreciative practices to be substantially mistaken. Descriptivists believe that the kinds of revisionary art ontological proposals propounded by Nelson Goodman, Gregory Currie, Mark Sagoff, and me are methodologically misconceived. In this paper I examine the case that has been made for a local form of descriptivism in the ontology of art: a form that does not quarrel with the possibility of revisionism in matters of ‘fundamental metaphysics’, but which argues that special features of the arts make descriptivism in this particular sphere obligatory. David Davies, Andrew Kania and Stephen Davies are local descriptivists in this sense. I argue that the burden of proof lies with the local descriptivist, but that this burden is too heavy for him to carry. Specifically, it emerges that the only way in which the local descriptivist can motivate his position is by arguing that our artistic practices determine the art ontological facts: a thesis that local descriptivists typically appeal to, but have not been able to argue for successfully. My conclusion is that the methodological debate in the ontology of art should now proceed by focussing on the case for global descriptivism: i.e. that form of descriptivism that opposes the possibility of revisionism in ontological matters across the board.
Several philosophers have claimed that S knows p only if S's belief is safe, where S's belief is safe iff (roughly) in nearby possible worlds in which S believes p, p is true. One widely held intuition is that one cannot know that one's lottery ticket will lose a fair lottery prior to an announcement of the winner, regardless of how probable it is that it will lose. Duncan Pritchard has claimed that a chief advantage of safety theory is that it can explain the lottery intuition without succumbing to skepticism. I argue that Pritchard is wrong. If a version of safety theory can explain the lottery intuition, it will also lead to skepticism. (Dylan Dodd, Erkenntnis, DOI 10.1007/s10670-011-9305-z.)
We study generalizations of Dodd parameters and establish their fine structural properties in Jensen extender models with λ-indexing. These properties are one of the key tools in various combinatorial constructions, such as constructions of square sequences and morasses.
Julian Dodd has argued that the type–token theory in musical ontology has a ‘default’ status because it can explain the repeatability and audibility of musical works without the need for philosophical reinterpretation. I present two challenges to Dodd's claims about audibility. First, I argue (a) that a type–token theorist who, like Dodd, adheres to Wolterstorff's doctrine of analogical predication must grant that musical works themselves are hearable only in an ‘analogical’ sense; and (b) that alternative musical ontologies are able to explain the latter just as well as the type–token theory. Second, I argue that Dodd cannot evade this objection by claiming that what matters in musical ontology is accounting for audibility ‘in a derivative sense’, since the latter also allows of explanation by a range of musical ontologies.
This paper asks whether we should accept a weakened version of the truthmaker principle: namely, the claim that truth supervenes on being, in which 'being' is understood as whether things are. I consider a number of positive answers to this question, including the following: that the truthmaker principle is a requirement of any plausible explanation of truth; that the principle must be accepted, if we are to do justice to the Wittgensteinian insight that the world is the totality of facts, not of things; and that the correctness of the principle is a necessary condition of a realist metaphysics. In my view, none of these attempts to motivate the truthmaker principle is satisfactory.
The main thesis of this paper is that John McDowell (in his Mind and World) tries to occupy a position that is not coherently statable; namely, that facts have objects and properties as constituents and are yet identical with true (Fregean) Thoughts. This position is contrasted with two other identity theories of truth: the robust theory, in which true propositions are identified with facts (which are understood to have objects and properties as constituents); and the modest theory, in which facts are identified with true Fregean Thoughts. I argue that the modest theory is to be preferred.
In “All Play and No Work,” Andrew Kania claims that standard form jazz involves no works, only performances. This article responds to Kania by defending one of the alternative ontological proposals that he rejects, namely, that jazz works are ontologically continuous with works of classical music. I call this alternative “the standard view,” and I argue that it is the default position in the ontology of standard form jazz. Kania has three objections to the standard view. The bulk of the article is devoted to explaining why none of these objections succeeds.
Timothy Williamson's epistemology leads to a fairly radical version of scepticism. According to him, all knowledge is evidence. It follows that if S knows p, the evidential probability for S that p is 1. I explain Williamson's infallibilist account of perceptual knowledge, contrasting it with Peter Klein's, and argue that Klein's account leads to a certain problem which Williamson's can avoid. Williamson can allow that perceptual knowledge is possible and that all knowledge is evidence, while at the same time avoiding Klein's problem. But while Williamson can allow that we know some things through experience, there are very many things he must say we cannot know. Given just how very many these are, he should be considered a sceptic.
According to the discovery model in the ontology of art, the facts concerning the ontological status of artworks are mind-independent and, hence, are facts about which the folk may be substantially ignorant or in error. In recent work Amie Thomasson has claimed that the most promising solution to the ‘qua problem’—a problem concerning how the reference of a referring-expression is fixed—requires us to give up the discovery model. I argue that this claim is false. Thomasson's solution to the qua problem—a hybrid descriptive/causal theory of reference-fixing—has a superior competitor, in the form of the account of reference-fixing suggested by Gareth Evans; and Evans's theory leaves the discovery model untouched.
We distinguish, among other things, between the agent of the context, the speaker of the agent's utterance, the mechanism the agent uses to produce her utterance, and the tokening of the sentence uttered. Armed with these distinctions, we tackle the ‘answer-machine’, ‘post-it note’ and other allegedly problematic cases, arguing that they can be handled without departing significantly from Kaplan's semantical framework for indexicals. In particular, we argue that these cases don't require adopting Stefano Predelli's intentionalism.
It has recently been suggested that the type/token theorist concerning musical works cannot come up with an adequate semantic theory of those sentences in which we purport to talk about such works. Specifically, it has been claimed that, since types are abstract entities, a type/token theorist can only account for the truth of sentences such as “The 1812 Overture is very loud” and “Bach's Two Part Invention in C has an F-sharp in its fourth measure” by adopting an untenable semantic claim: namely, that the predicates in such sentences, once applied to musical works, undergo a systematic shift in their meanings. This article is a sustained explanation of why our talk about musical works in fact provides no problem for the type/token theorist. First, we demonstrate that the aforementioned “meaning shift” approach to the sentences’ predicates is well motivated and very credible. Second, we explain how the type/token theorist can adopt the best available version of an alternative, generic quantificational approach to such sentences. Third, we establish that other semantic theories, presented as undermining the type/token theory by giving us a reason for adopting eliminativism about types, are much less theoretically virtuous than the two theories that a type/token theorist can freely adopt.
Peter Kivy has become convinced that it is impossible for pure, instrumental music to be profound. This is because he takes works of such music to be incapable of meeting what he claims to be two necessary conditions for artistic profundity: that the work denotes something profound, and that the work expresses profound propositions about its profound denotatum. The negative part of this paper argues as follows. Although works of pure, instrumental music do, indeed, fail to meet these conditions, the said conditions are not themselves necessary for artistic profundity. In fact, each is a misinterpretation of a distinct, genuinely necessary condition for a work’s being profound: respectively, that the work has a profound subject matter, and that the work handles its profound subject matter in such a way as to elicit a deeper understanding of it in the suitably situated and prepared appreciator. The positive part of the paper elaborates these genuinely necessary conditions for artistic profundity and then argues that, once they have been disentangled from Kivy’s misconstruals of them, it becomes evident that they can be met by pieces of pure, instrumental music.
According to the Imprecise Credence Framework (ICF), a rational believer's doxastic state should be modelled by a set of probability functions rather than a single probability function, namely, the set of probability functions allowed by the evidence (Joyce). Roger White has recently given an arresting argument against the ICF, which has garnered a number of responses. In this article, I attempt to cast doubt on his argument. First, I point out that it's not an argument against the ICF per se, but an argument for the Principle of Indifference. Second, I present an argument that's analogous to White's. I argue that if White's premises are true, the premises of this argument are too. But the premises of my argument entail something obviously false. Therefore, White's premises must not all be true.
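As a toy illustration of the framework the abstract describes, a credal set for a single proposition can be modelled as a set of prior probabilities, each updated pointwise by ordinary conditionalization (a minimal sketch in Python; the function names and the example numbers are my own, not drawn from the paper or from White's argument):

```python
def conditionalize(prior, likelihood_h, likelihood_not_h):
    """Bayes' theorem for a single prior probability of a hypothesis H."""
    numerator = prior * likelihood_h
    return numerator / (numerator + (1 - prior) * likelihood_not_h)

def update_credal_set(priors, likelihood_h, likelihood_not_h):
    """On the ICF, the doxastic state is a *set* of probability
    functions; updating applies conditionalization to each member."""
    return {conditionalize(p, likelihood_h, likelihood_not_h) for p in priors}

# Illustrative numbers: evidence leaves the prior for H open
# between 0.2 and 0.8, so the credal set contains several values.
credal_set = {0.2, 0.5, 0.8}
posterior_set = update_credal_set(credal_set, likelihood_h=0.9, likelihood_not_h=0.3)
```

The imprecision survives updating: the posterior state is still a set of values rather than a single number, which is the feature White's argument puts under pressure.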
Strawson (1994) and Peacocke (1992) introduced thought experiments that show that it seems intuitive that there is, in some way, an experiential character to mental events of understanding. Some (e.g., Siewert 1998, 2011; Pitt 2004) try to explain these intuitions by saying that just as we have, say, headache experiences and visual experiences of blueness, so too we have experiences of understanding. Others (e.g., Prinz 2006, 2011; Tye 1996) propose that these intuitions can be explained without positing experiences of understanding. Call this the debate between Realism and Anti-Realism about experiences of understanding. This paper aims to advance that debate in two ways. In the first half, I develop more precise characterizations of what Realists and Anti-Realists propose. In the second half, I distinguish the four most plausible versions of Anti-Realism and argue that Realism better explains the target intuition than any of them does.
If one flips an unbiased coin a million times, there are 2^1,000,000 possible heads/tails sequences, any one of which might be the sequence that obtains, and each of which is equally likely to obtain. So it seems (1) 'If I had tossed a fair coin one million times, it might have landed heads every time' is true. But as several authors have pointed out, (2) 'If I had tossed a fair coin a million times, it wouldn't have come up heads every time' will be counted as true in everyday contexts. And according to David Lewis' influential semantics for counterfactuals, (1) and (2) are contradictories. We have a puzzle. We must either (A) deny that (2) is true, (B) deny that (1) is true, or (C) deny that (1) and (2) are contradictories, thus rejecting Lewis' semantics. In this paper I discuss and criticize the proposals of David Lewis and, more recently, J. Robert G. Williams, which solve the puzzle by taking option (B). I argue that we should opt for either (A) or (C).
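The combinatorics behind the puzzle can be checked directly for a small number of tosses; the million-toss case is the same calculation at scale (a sketch, with an illustrative function name of my own):

```python
from fractions import Fraction

def all_heads_probability(n):
    """Probability that n tosses of a fair coin all land heads.

    There are 2**n equally likely heads/tails sequences and exactly
    one of them is all-heads, so the probability is 1 / 2**n: tiny
    for large n, but never zero -- which is why (1) seems true.
    """
    return Fraction(1, 2 ** n)

# For 10 tosses: 2**10 = 1024 equally likely sequences, so the
# all-heads sequence has probability 1/1024. For a million tosses
# the probability is 1 / 2**1_000_000 -- minuscule, but positive.
p10 = all_heads_probability(10)
```

The tension the paper exploits is exactly this: the all-heads sequence is no less probable than whichever sequence actually obtains, yet (2) is counted as true in everyday contexts.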
According to the traditional view of weakness of will, a weak-willed agent acts in a way inconsistent with what she judges to be best. Richard Holton has argued against this view, claiming that ‘the central cases of weakness of will are best characterized not as cases in which people act against their better judgment, but as cases in which they fail to act on their intentions’ (1999: 241). But Holton doesn’t think all failures to act on one’s prior intentions, or all revisings of intentions, are cases of weakness of will (WW). Rather, he thinks an intention-revision is a case of WW only when it occurs ‘in circumstances in which [one] should not have revised [the intention]’. Holton points out that according to the traditional view of WW, to call an agent ‘weak-willed’ is to make a descriptive claim about the agent (about whether an action in fact is inconsistent with what (s)he judges to be best). But according to Holton’s account, the question of whether the agent was weak-willed ‘will depend on which intentions [the agent] should have stuck with as a rational intender. That is a normative question’ (my emphasis) (241–3, 247–8).
Cartesian skepticism about epistemic justification (‘skepticism’) is the view that many of our beliefs about the external world – e.g., my current belief that I have hands – aren’t justified. I examine the two most influential arguments for skepticism – the Closure Argument and the Underdetermination Argument – from an evidentialist perspective. For both arguments it is clear which premise the anti-skeptic must deny. The Closure Argument, I argue, is the better argument in that its key premise is weaker than the Underdetermination Argument’s key premise. However, it’s also likely that the motivation for accepting both key premises is exactly the same. So there may be a sense in which both arguments provide exactly the same motivation for skepticism. I then argue that if I’m right about what the motivation for accepting the arguments’ key premises is, then neither argument succeeds in providing a good reason to accept skepticism. I conclude by explaining why I think epistemologists are right to expend a lot of time and effort on refuting these arguments, even if neither argument provides any motivation for skepticism.
In this paper I clarify, elaborate, and defend the conclusions reached in my ‘Musical Works as Eternal Types’ in the wake of objections raised by Robert Howell, R. A. Sharpe, and Saam Trivedi. In particular, I claim that the thesis that musical works are discovered rather than created by their composers is obligatory once we commit ourselves to thinking of works of music as types, and once we properly understand the ontological nature of types and properties. The central argument of the paper is ‘the argument from the eternal existence of properties’, its moral being that types are eternal entities because they inherit their existence conditions from their eternally existent property-associates. The two key premises in this argument—that properties exist eternally, and that a type exists just in case its property-associate exists—are motivated and then defended at length.
The ontological nature of works of music has been a particularly lively area of philosophical debate during the past few years. This paper serves to introduce the reader to some of the most fertile and interesting issues. Starting by distinguishing three questions – the categorial question, the individuation question, and the persistence question – the article goes on to focus on the first: the question of which ontological category musical works fall under. The paper ends by introducing, and briefly considering, meta-ontological questions in the ontology of music.
How can experience provide knowledge, or even justified belief, about the objective world outside our minds? This volume presents original essays by prominent contemporary epistemologists, who show how philosophical progress on foundational issues can improve our understanding of, and suggest a solution to, this famous sceptical question.
Are works of music types of performance or are they continuants? Types are unchanging entities that could not have been otherwise; continuants can undergo change through time and could have been different. Picking up on this distinction, Guy Rohrbaugh has recently argued that musical works are continuants rather than performance-types. This paper replies to his arguments and, in the course of so doing, elaborates and defends the conception of musical works as types of performance. I end the article by arguing that the conception of musical works as continuants is under-motivated and, ultimately, obscure.
The purpose of this study is to investigate whether the availability of financial bounties and anonymous reporting channels impacts individuals’ general intentions to report questionable acts, and whether the availability of financial bounties will prompt people to reveal their identities. The recent passage of the Dodd–Frank Wall Street Reform and Consumer Protection Act of 2010 creates a financial bounty for whistle-blowers. In addition, SOX requires companies to provide employees with an anonymous reporting channel option. The effect of these provisions on whistle-blowing is unclear. Our results indicate that a financial bounty has the potential to increase participants’ propensity to report questionable acts and their willingness to reveal their identities when reporting, but the availability of an anonymous reporting channel does not affect participants’ propensity to report questionable acts. These findings could help corporate management, government policy makers, and accounting researchers to assess the effectiveness of internal compliance programs and to determine whether financial bounties in the private sector could encourage whistle-blowing.
In a recent paper, Robert Stecker proposes the following test for whether a value possessed by an artwork is artistic or not: ‘Does one need to understand the work to appreciate its being valuable in that way? If so, it is an artistic value. If not, it is not.’ An important question here is what Stecker means by ‘appreciation’ in this context. Stecker himself says little about this, but I offer him two accounts of the nature of appreciation, both of which are suggested by remarks of his own. It turns out that Stecker’s proposed test is flawed on either understanding of ‘appreciation’.
Social learning is likely to include affective processes: it is necessary for newcomers to discover what value to attach to objects, persons, and events in a given social environment. This learning relies largely on the evaluation of others’ emotional expressions. This study has two objectives. Firstly, we compare two closely related concepts that are employed to describe the use of another person’s appraisal to make sense of a given situation: social appraisal and social referencing. We contend that social referencing constitutes a type of social appraisal. Secondly, we introduce the concept of affective social learning with the hope that it may help to discriminate the different ways in which emotions play a critical role in the processes of socialization.
Simplifying somewhat, sonicists believe that works of music are individuated purely in terms of how they sound. For them, exact sound-alikes are identical. Stephen Davies, in his ‘Musical Works and Orchestral Colour’ (BJA 48 (2008), pp. 363–375), took me to task for defending a version of sonicism. In this paper I seek to explain why Davies's objections miss their mark. In the course of the discussion, I make some methodological remarks about the ontology of music.
We propose a rational method for addressing an important question—who deserves to be an author of a scientific article? We review various contentious issues associated with this question and recommend that the scientific community should view authorship in terms of contributions and responsibilities, rather than credits. We propose a new paradigm that conceptually divides a scientific article into four basic elements: ideas, work, writing, and stewardship. We employ these four fundamental elements to modify the well-known International Committee of Medical Journal Editors (ICMJE) authorship guidelines. The modified ICMJE guidelines are then used as the basis to develop an approach to quantify individual contributions and responsibilities in multi-author articles. The outcome of the approach is an authorship matrix, which can be used to answer several nagging questions related to authorship.
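As an illustration of the kind of authorship matrix the article proposes, per-author contribution scores for the four elements can be tabulated and normalized so that each element's shares sum to one (a hedged sketch; the scoring scheme and the example numbers are invented here for illustration and are not taken from the article or the ICMJE guidelines):

```python
ELEMENTS = ("ideas", "work", "writing", "stewardship")

def authorship_matrix(raw_scores):
    """Normalize per-element contribution scores into shares.

    raw_scores: {author: {element: score}}. Returns the same shape,
    with each author's fraction of the total score for each element,
    so every element's column sums to 1 across authors.
    """
    totals = {e: sum(scores[e] for scores in raw_scores.values()) for e in ELEMENTS}
    return {
        author: {e: scores[e] / totals[e] for e in ELEMENTS}
        for author, scores in raw_scores.items()
    }

# Invented example: two authors of a hypothetical article.
matrix = authorship_matrix({
    "A": {"ideas": 3, "work": 1, "writing": 2, "stewardship": 1},
    "B": {"ideas": 1, "work": 3, "writing": 2, "stewardship": 1},
})
```

Reading the matrix row-wise gives each author's profile of contributions; reading it column-wise shows how responsibility for each element is divided.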