This article examines cognitive process models of human sentence comprehension based on the idea of informed search. These models are rational in the sense that they strive to find a good syntactic analysis quickly. Informed search derives a new account of garden pathing that handles traditional counterexamples. It supports a symbolic explanation for local coherence as well as an algorithmic account of entropy reduction. The models are expressed in a broad framework for theories of human sentence comprehension.
In “Double Vision: Two Questions about the Neo-Fregean Programme”, John MacFarlane raises two main questions: (1) Why is it so important to neo-Fregeans to treat expressions of the form ‘the number of Fs’ as a species of singular term? What would be lost, if anything, if they were analysed instead as a type of quantifier-phrase, as on Russell’s Theory of Definite Descriptions? And (2) granting, at least for the sake of argument, that Hume’s Principle may be used as a means of implicitly defining the number operator, what advantage, if any, does adopting this course possess over a direct stipulation of the Dedekind-Peano axioms? This paper attempts to answer both. In response to the first, we spell out the links between the recognition of numerical terms as vehicles of singular reference, the conception of numbers as possible objects of singular, or object-directed, thought, and the role of the acknowledgement of numbers as objects in the neo-Fregean attempt to justify the basic laws of arithmetic. In response to the second, we argue that the crucial issue concerns the capacity of either stipulation (of Hume’s Principle, or of the Dedekind-Peano axioms) to found knowledge of the principles involved, and that in this regard there are crucial differences which explain why the former stipulation can, but the latter cannot, play the required foundational role.
Despite there being deep lines of convergence between the philosophies of Alfred North Whitehead, C. S. Peirce, William James, John Dewey, and other classical American philosophers, it remains an open question whether Whitehead is a pragmatist, and conversation between pragmatists and Whitehead scholars has been limited. Indeed, it is difficult to find an anthology of classical American philosophy that includes Whitehead’s writings. These camps began separately, and so they remain. This volume questions the wisdom of that separation, exploring their connections, both historical and in application. The essays in this volume embody original and creative work by leading scholars that not only furthers the understanding of American philosophy, but seeks to advance it by working at the intersection of experience and reality to incite novel and creative thought. This exploration is long overdue. Specific questions that are addressed include: Is Whitehead a pragmatist? What contrasts and affinities exist between American pragmatism and Whitehead’s thought? What new questions, strategies, and critiques emerge by juxtaposing their distinct perspectives?
Anything worth regarding as logicism about number theory holds that its fundamental laws – in effect, the Dedekind-Peano axioms – may be known on the basis of logic and definitions alone. For Frege, the logic in question was that of the Begriffsschrift – effectively, full impredicative second-order logic – together with the resources for dealing with the putatively “logical objects” provided by Basic Law V of Grundgesetze. With this machinery in place, and with the course-of-values operator governed by Basic Law V counting as logical, it is possible for all the definitions involved in the logicist reconstruction of arithmetic and analysis to be fully explicit, abbreviative definitions. Had Frege’s project succeeded, he would therefore have been in a position – by his own lights – to regard the axioms of number theory simply as definitional abbreviations of certain theorems of his pure logic. Basic Law V, as every interested party knows, is inconsistent. But twentieth-century orthodoxy would have scorned its description as a law of logic in any case, purely on the grounds of its existential fecundity. Contemporary Neo-Fregeanism in the foundations of mathematics does not, in intention at least, pick any quarrel with the idea that pure logic should be ontologically austere. It does, however, maintain that the existence of the natural numbers and the real numbers as classically conceived, and thereby the truth of the traditional axioms of arithmetic and analysis, may still be known a priori on the basis of logic and definitions. For the purposes of this claim, logic is once again conceived as essentially the system of the Begriffsschrift. But Basic Law V is superseded by a variety of abstraction principles, of which Hume’s Principle is the best-known example, which we are regarded as free to lay down as true by way of determination of the meaning of the non-logical vocabulary that they contain.
Thus, the idea is, the Dedekind-Peano axioms, for example, may be known a priori to be true by virtue of their derivation in pure logic from a principle which may be regarded as stipulatively true, and whose very stipulation may be regarded as conferring content upon the sole item of non-logical vocabulary – the cardinality operator – which it contains, and thereby as conferring content upon Hume’s Principle itself.
_Wittgenstein’s Intentions_, first published in 1993, presents a series of essays dedicated to the great Wittgenstein exegete John Hunter. The problematic topics discussed are identified not only by Wittgenstein’s own philosophical writings, but also by contemporary scholarship: areas of ambiguity, perhaps even confusion, as well as issues which the father of analytic philosophy did not himself address. The difficulties involved in speaking cogently about religious belief, suspicion, consciousness, the nature of the will, the coincidence of our thoughts with reality, and transfinite numbers are all investigated, as well as a variety of other intriguing questions: Why can’t a baby pretend to smile? How do I know what I was going to say? _Wittgenstein’s Intentions_ is an invaluable resource for students of Wittgenstein as well as scholars, and opens up a wide horizon of philosophical questioning for those as yet unfamiliar with this style of reasoning.
My aim in this study is not to praise Fischer’s fine theory of moral responsibility, but to (try to) bury the “semi” in “semicompatibilism”. I think Fischer gives the Consequence Argument (CA) too much credit, and gives himself too little credit. In his book, The Metaphysics of Free Will, Fischer gave the CA as good a statement as it will ever get, and put his finger on what is wrong with it. Then he declared stalemate rather than victory. In my view, Fischer’s view amounts to sophisticated compatibilism. It would be nice to be able to call it by its right name. In The Metaphysics of Free Will, Fischer develops his own version of the Consequence Argument, which turns on two principles, one of which is the fixity of the past. FP: For any action Y, agent S, and time t, if it is true that if S were to do Y at t, some fact about the past relative to t would not have been a fact, then S cannot at t do Y. I argue that the equipment needed to reject FP (and thereby defend the most plausible version of compatibilism) is needed to deal with the problem of fatalism. In addition, I argue that the rejection of FP is compatible with Fischer’s approach to Frankfurt cases and with his account of transfer principles.