Psycholinguistic studies often look at the production of referring expressions in interactive settings, but so far few referring expression generation algorithms have been developed that are sensitive to earlier references in an interaction. Rather, such algorithms tend to rely on domain-dependent preferences for both content selection and linguistic realization. We present three experiments showing that humans may opt for dispreferred attributes and dispreferred modifier orderings when these were primed in a preceding interaction (without speakers being consciously aware of this). In addition, we show that speakers are more likely to produce overspecified references, including dispreferred attributes (although minimal descriptions with preferred attributes would suffice), when these were similarly primed.
This study investigates to what extent the amount of variation in a visual scene causes speakers to mention the attribute color in their definite target descriptions, focusing on scenes in which this attribute is not needed for identification of the target. The results of our three experiments show that speakers are more likely to redundantly include a color attribute when the scene variation is high as compared with when this variation is low (even if this leads to overspecified descriptions). We argue that these findings are problematic for existing algorithms that aim to automatically generate psychologically realistic target descriptions, such as the Incremental Algorithm, as these algorithms make use of a fixed preference order per domain and do not take visual scene variation into account.
In this paper I show that Proclus is an adherent of the Classical Model of Science as set out elsewhere in this issue (de Jong and Betti 2008), and that he adjusts certain conditions of the Model to his Neoplatonic epistemology and metaphysics. In order to show this, I develop a case study concerning philosophy of nature, which, despite its unstable subject matter, Proclus considers to be a science. To give this science a firm foundation, Proclus distills from Plato’s Timaeus the basic concepts Being and Becoming and a number of basic propositions, among them the quasi-definitions of the basic concepts. He subsequently explains the use of these quasi-definitions, which are actually epistemic guides, in such a way that he obtains a connection between a rational and an empirical approach to the natural world. A crucial task in establishing this connection is performed by the faculty of doxa and by geometrical conversion. The result is that Proclus secures a universal, necessary and known foundation for all of philosophy of nature.
One of the hardest questions to answer for a (Neo)platonist is to what extent and how the changing and unreliable world of sense perception can itself be an object of scientific knowledge. My dissertation is a study of the answer given to that question by the Neoplatonist Proclus (Athens, 411-485) in his Commentary on Plato’s Timaeus. I present a new explanation of Proclus’ concept of nature and show that philosophy of nature consists of several related subdisciplines matching the ontological stratification of nature. Moreover, I demonstrate that for Proclus philosophy of nature is a science, albeit a hypothetical one, which takes geometry as its methodological paradigm. I also offer an explanation of Proclus’ view of what is later called the mathematization of physics, i.e. the role of the substance of mathematics, as opposed to its method, in explaining the natural world. Finally, I discuss Proclus’ views of the discourse of philosophy of nature and its iconic character.
This volume collects Late Ancient, Byzantine and Medieval appropriations of Aristotle's Posterior Analytics, addressing the logic of inquiry, concept formation, the question whether metaphysics is a science, and the theory of demonstration.
Subject sensitive invariantism is the view that whether a subject knows depends on what is at stake for that subject: the truth-value of a knowledge-attribution is sensitive to the subject's practical interests. I argue that subject sensitive invariantism cannot accept a very plausible principle for memory to transmit knowledge. I argue, furthermore, that semantic contextualism and contrastivism can accept this plausible principle for memory to transmit knowledge. I conclude that semantic contextualism and contrastivism are in a better dialectical position than subject sensitive invariantism.
In order to explain such puzzling cases as the Bank Case and the Airport Case, semantic contextualists defend two theses. First, that the truth-conditions of knowledge sentences fluctuate in accordance with features of the conversational context. Second, that this fluctuation can be explained by the fact that 'knows' is an indexical. In this paper, I challenge both theses. In particular, I argue (i) that it isn't obvious that 'knows' is an indexical at all, and (ii) that contrastivism can do the same work as contextualism is supposed to do, without being linguistically implausible.
Amartya Sen argues that for the advancement of justice identification of ‘perfect’ justice is neither necessary nor sufficient. He replaces ‘perfect’ justice with comparative justice. Comparative justice limits itself to comparing social states with respect to degrees of justice. Sen’s central thesis is that identifying ‘perfect’ justice and comparing imperfect social states are ‘analytically disjoined’. This essay refutes Sen’s thesis by demonstrating that to be able to make adequate comparisons we need to identify and integrate criteria of comparison. This is precisely the aim of a theory of justice (such as John Rawls’s theory): identifying, integrating and ordering relevant principles of justice. The same integrated criteria that determine ‘perfect’ justice are needed to be able to adequately compare imperfect social states. Sen’s alternative approach, which is based on social choice theory, is incapable of avoiding contrary, indeterminate or incoherent directives where plural principles of justice conflict.
In this introduction to the special issue of Social Epistemology on epistemological contrastivism, I make some remarks on the history of contrastivism, describe three main versions of contrastivism, and offer a guide through the papers that compose this issue.
I reply to Martijn Blaauw's recent article about subject sensitive invariantism, in which he argues that SSI, unlike its contextualist and contrastivist competitors, cannot give a proper account of memorial knowledge. I argue that these theories are on a par when it comes to such an account.
Many philosophers are building a solid case in favour of the knowledge account of assertion (KAA). According to KAA, if one asserts that P one represents oneself as knowing that P. KAA has recently received support from linguistic data about prompting challenges, parenthetical positioning and predictions. In this article, I add another argument to this rapidly growing list: an argument from what I will call ‘reinforcing parenthesis’.
John Turri has recently provided two problem cases for the knowledge account of assertion (KAA) to argue for the express knowledge account of assertion (EKAA). We defend KAA by explaining away the intuitions about the problem cases and by showing that our explanation is theoretically superior to EKAA.
I air three kinds of problem to which Sinnott-Armstrong's epistemological contrastivism seems to be exposed: (a) the theory gives an implausible account of justification attributions; (b) the Pyrrhonism which results from its inability to identify relevant contrast classes bars us from epistemic responsibility; (c) contextualism does just as well as Pyrrhonism, despite Sinnott-Armstrong's arguments to the contrary.
In cases of imaginative contagion, imagining something has doxastic or doxastic-like consequences. In this reply to Tamar Szabó Gendler's article in this collection, I investigate what the philosophical consequences of these cases could be. I argue (i) that imaginative contagion has consequences for how we should understand the nature of imagination and (ii) that imaginative contagion has consequences for our understanding of what belief-forming mechanisms there are. Along the way, I make some remarks about what the consequences of the contagion cases are for the relation between knowledge and imagination.
This article discusses the possibility of a rationally justified choice between two options neither of which is better than the other while they are not equally good either (‘3NT’). Joseph Raz regards such options as incomparable and argues that reason cannot guide the choice between them. Ruth Chang, by contrast, tries to show that many cases of putative incomparability are instead cases of parity—a fourth value relation of comparability, in addition to the three standard value relations ‘better than’, ‘worse than’ and ‘equally good as’. It follows, she argues, that many choice situations in which rationally justified choice seems precluded are in fact situations within the reach of practical reason. This article has three aims: (1) it challenges Chang’s argument for the possibility of parity; (2) it demonstrates that, even if parity did exist, its problematic implications for practical reason would not differ from those of Raz’s incomparability; (3) it discusses the underlying cause of hard cases of comparison: the fact that none of the three standard value relations applies (‘3NT’). It will be shown that the problematic implications for the rational justification of the choice are due to 3NT itself, irrespective of whether 3NT is explained as incomparability or parity.
A central intuition many epistemologists seem to have is that knowledge is distinctively valuable. In his paper 'Radical Scepticism, Epistemic Luck and Epistemic Value', Duncan Pritchard rejects the virtue-theoretic explanation of this intuition. This explanation says that knowledge is distinctively valuable because it is a cognitive achievement. It is maintained, in the first place, that the arguments Pritchard musters against the thesis that knowledge is a cognitive achievement are unconvincing. It is argued, in the second place, that even if the arguments against the thesis that knowledge is a cognitive achievement were convincing, there is another explanation of the intuition that knowledge has final value available: the question-relative treatment of knowledge.
Computer simulations can be useful tools to support philosophers in validating their theories, especially when these theories concern phenomena showing nontrivial dynamics. Such theories are usually informal, whilst for computer simulation a formally described model is needed. In this paper, a methodology is proposed to gradually formalise philosophical theories in terms of logically formalised dynamic properties. One outcome of this process is an executable logic-based temporal specification, which within a dedicated software environment can be used as a simulation model to perform simulations. This specification provides a logical formalisation at the lowest aggregation level of the basic mechanisms underlying a process. In addition, dynamic properties at a higher aggregation level that may emerge from the mechanisms specified by the lower level properties, can be specified. Software tools are available to support specification, and to automatically check such higher level properties against the lower level properties and against generated simulation traces. As an illustration, three case studies are discussed showing successful applications of the approach to formalise and analyse, among others, Clark’s theory on extended mind, Damasio’s theory on core consciousness, and Dennett’s perspective on intertemporal decision making and altruism.
In this rich and impressive new book, Henry Somers-Hall gives a nuanced analysis of the philosophical relationship between G. W. F. Hegel and Gilles Deleuze. He convincingly shows that a serious study of Hegel provides an improved insight into Deleuze’s conception of pure difference as the transcendental condition of identity. Somers-Hall develops his argument in three steps. First, both Hegel and Deleuze formulate a critique of representation. Second, Hegel’s proposed alternative is as logically consistent as Deleuze’s. Third, Deleuze can account for evolution, whereas Hegel cannot.
I reply to comments by Gerry Hough, Peter Baumann and Martijn Blaauw on my book Moral Skepticisms. The main issues concern whether modest justifiedness is epistemic and how it is related to extreme justifiedness; how contrastivists can handle crazy contrast classes, indeterminacy and common language; whether Pyrrhonian scepticism leads to paralysis in decision-making or satisfies our desires to evaluate beliefs as justified or not; and how contextualists can respond to my arguments against the relevance of contrast classes.
Contextualism is a quite popular research program nowadays. In essence, the contextualist holds that the truth conditions of knowledge-attributing and knowledge-denying sentences vary in accordance with the context in which the sentences are uttered. This theory is positively motivated by its (alleged) capability of best explaining certain intuitions we have about knowledge attributions and knowledge denials. In this paper, I will argue that this positive motivation isn't as compelling as the contextualists think it to be. I will do this by constructing a so-called ‘warranted assertability maneuver’ (or WAM) against contextualism which shows that, with respect to knowledge-attributing and knowledge-denying sentences, the contextualist has mistaken a variance in warranted assertability conditions for a variance in truth conditions.
Computational modeling of the brain holds great promise as a bridge from brain to behavior. To fulfill this promise, however, it is not enough for models to be 'biologically plausible': models must be structurally accurate. Here, we analyze what this entails for so-called psychobiological models, models that address behavior as well as brain function in some detail. Structural accuracy may be supported by (1) a model's a priori plausibility, which comes from a reliance on evidence-based assumptions, (2) fitting existing data, and (3) the derivation of new predictions. All three sources of support require modelers to be explicit about the ontology of the model, and require the existence of data constraining the modeling. For situations in which such data are only sparsely available, we suggest a new approach. If several models are constructed that together form a hierarchy of models, higher-level models can be constrained by lower-level models, and low-level models can be constrained by behavioral features of the higher-level models. Modeling the same substrate at different levels of representation, as proposed here, thus has benefits that exceed the merits of each model in the hierarchy on its own.
A decade ago, Isham and Butterfield proposed a topos-theoretic approach to quantum mechanics, which has meanwhile been extended by Doering and Isham so as to provide a new mathematical foundation for all of physics. Last year, three of the present authors redeveloped and refined these ideas by combining the C*-algebraic approach to quantum theory with the so-called internal language of topos theory (see arXiv:0709.4364). The goal of the present paper is to illustrate our abstract setup through the concrete example of the C*-algebra M_n(C) of complex n x n matrices. This leads to an explicit expression for the pointfree quantum phase space and the associated logical structure and Gelfand transform of an n-level system. We also determine the pertinent non-probabilistic state-proposition pairing (or valuation) and give a very natural topos-theoretic reformulation of the Kochen-Specker Theorem. In our approach, the nondistributive lattice P(M_n(C)) of projections in M_n(C) (which forms the basis of the traditional quantum logic of Birkhoff and von Neumann) is replaced by a specific distributive lattice of functions from the poset of all unital commutative C*-subalgebras of M_n(C) to P(M_n(C)). The latter lattice is essentially the (pointfree) topology of the quantum phase space mentioned above, and as such defines a Heyting algebra. Each element of the lattice corresponds to a 'Bohrified' proposition, in the sense that to each classical context it associates a yes-no question pertinent to this context, rather than being a single projection as in standard quantum logic. Distributivity is recovered at the expense of the law of the excluded middle (Tertium Non Datur), whose demise is in our opinion to be welcomed, not just in intuitionistic logic in the spirit of Brouwer, but also in quantum logic in the spirit of von Neumann.