Ineffability, method, and ontology, by G. Bergmann.--The glory and the misery of Ludwig Wittgenstein, by G. Bergmann.--Stenius on the Tractatus, by G. Bergmann.--Naming and saying, by W. Sellars.--The ontology of Wittgenstein's Tractatus, by E. D. Klemke.--Material properties in the Tractatus, by H. Hochberg.--Wittgenstein's pantheism: a new light on the ontology of the Tractatus, by N. Garver.--Science and metaphysics: a Wittgensteinian interpretation, by H. Petrie.--Wittgenstein on private languages, by C. L. Hardin.--Wittgenstein on private language, by N. Garver.--Wittgenstein and private languages, by W. Todd.--The private-language argument, by H.-N. Castañeda.--Wittgenstein on privacy, by J. W. Cook.--"Forms of life" in Wittgenstein's Philosophical investigations, by J. F. M. Hunter.--Privacy and language, by M. S. Gram.--On language games and forms of life, by F. Zabeeh.--Wittgenstein on meaning and use, by J. F. M. Hunter.--Wittgenstein on phenomenalism, skepticism, and criteria, by A. Oldenquist.--Tractarian reflections on saying and showing, by D. W. Stampe.--Wittgenstein and logical necessity, by B. Stroud.--Negation and generality, by H. Hochberg.--Facts, possibilities, and essences in the Tractatus, by H. Hochberg.--Arithmetic and propositional form in Wittgenstein's Tractatus, by H. Hochberg.--Selected bibliography (p. 543-546).
This brief paperback is designed for symbolic/formal logic courses. It features the tree method proof system developed by Jeffrey. The new edition contains many more examples and exercises and is reorganized for greater accessibility.
The nature of reference, or the relation of a word to the object to which it refers, has been perhaps the dominant concern of twentieth-century analytic philosophy. Extremely influential arguments by Gottlob Frege around the turn of the century convinced the large majority of philosophers that the meaning of a word must be distinguished from its referent, the former only providing some kind of direction for reaching the latter. In the last twenty years, this Fregean orthodoxy has been vigorously challenged by those who argue that certain important kinds of words, at least, refer directly without need of an intermediate meaning or sense. The essays in this volume record how a long-term study of Frege has persuaded the author that Frege's pivotal distinction between sense and reference, and his attendant philosophical views about language and thought, are unsatisfactory. Frege's perspective, he argues, imposes a distinctive way of thinking about semantics, specifically about the centrality of cognitive significance puzzles for semantics. Freed from Frege's perspective, we will no longer find it natural to think about semantics in this way.
This book attempts to marry truth-conditional semantics with cognitive linguistics in the church of computational neuroscience. To this end, it examines the truth-conditional meanings of coordinators, quantifiers, and collective predicates as neurophysiological phenomena that are amenable to a neurocomputational analysis. Drawing inspiration from work on visual processing, and especially the simple/complex cell distinction in early vision (V1), we claim that a similar two-layer architecture is sufficient to learn the truth-conditional meanings of the logical coordinators and logical quantifiers. As a prerequisite, much discussion is given over to what a neurologically plausible representation of the meanings of these items would look like. We eventually settle on a representation in terms of correlation, so that, for instance, the semantic input to the universal operators (e.g. and, all) is represented as maximally correlated, while the semantic input to the universal negative operators (e.g. nor, no) is represented as maximally anticorrelated. On the basis of this representation, the hypothesis can be offered that the function of the logical operators is to extract an invariant feature from natural situations, that of degree of correlation between parts of the situation. This result sets up an elegant formal analogy to recent models of visual processing, which argue that the function of early vision is to reduce the redundancy inherent in natural images. Computational simulations are designed in which the logical operators are learned by associating their phonological form with some degree of correlation in the inputs, so that the overall function of the system is as a simple kind of pattern recognition. Several learning rules are assayed, especially those of the Hebbian sort, which are the ones with the most neurological support. Learning vector quantization (LVQ) is shown to be a perspicuous and efficient means of learning the patterns that are of interest.
We draw a formal parallelism between the initial, competitive layer of LVQ and the simple cell layer in V1, and between the final, linear layer of LVQ and the complex cell layer in V1, in that the initial layers are both selective, while the final layers both generalize. It is also shown how the representations argued for can be used to draw the traditionally-recognized inferences arising from coordination and quantification, and why the inference of subalternacy breaks down for collective predicates. Finally, the analogies between early vision and the logical operators allow us to advance the claim of cognitive linguistics that language is not processed by proprietary algorithms, but rather by algorithms that are general to the entire brain. Thus in the debate between objectivist and experiential metaphysics, this book falls squarely into the camp of the latter. Yet it does so by means of a rigorous formal, mathematical, and neurological exposition – in contradiction of the experiential claim that formal analysis has no place in the understanding of cognition. To make our own counter-claim as explicit as possible, we present a sketch of the LVQ structure in terms of mereotopology, in which the initial layer of the network performs topological operations, while the final layer performs mereological operations. The book is meant to be self-contained, in the sense that it does not assume any prior knowledge of any of the many areas that are touched upon. It therefore contains mini-summaries of: biological visual processing, especially the retinocortical and ventral ("what") parvocellular pathways; computational models of neural signaling, in particular the reduction of the Hodgkin-Huxley equations to the connectionist and integrate-and-fire neurons; Hebbian learning rules and the elaboration of learning vector quantization; the linguistic pathway in the left hemisphere; memory and the hippocampus; truth-conditional vs. image-schematic semantics; objectivist vs. experiential metaphysics; and mereotopology. All of the simulations are implemented in MATLAB, and the code is available from the book’s website. • The discovery of several algorithmic similarities between vision and semantics. • The support of all of this by means of simulations, and the packaging of all of this in a coherent theoretical framework.
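The correlational representation described above lends itself to a compact illustration. The following is a hypothetical toy sketch in Python/NumPy, not the book's own MATLAB code: class 0 stands in for the "universal" operators (and, all), whose two-part inputs are maximally correlated, while class 1 stands in for the "universal negative" operators (nor, no), whose inputs are maximally anticorrelated; an LVQ1 learner with two prototypes per class then separates the two patterns. All function and variable names here are my own invention.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample(n):
    """Toy 'situations': pairs whose two parts are correlated
    (class 0, cf. and/all) or anticorrelated (class 1, cf. nor/no)."""
    x = rng.uniform(-1, 1, n)
    corr = np.stack([x, x + rng.normal(0, 0.05, n)], axis=1)
    anti = np.stack([x, -x + rng.normal(0, 0.05, n)], axis=1)
    return np.vstack([corr, anti]), np.array([0] * n + [1] * n)

# Two prototypes per class: each correlation pattern occupies two
# opposite quadrants of the input plane, so one prototype is not enough.
protos0 = np.array([[0.7, 0.7], [-0.7, -0.7], [0.7, -0.7], [-0.7, 0.7]])
labels = np.array([0, 0, 1, 1])

def train_lvq1(X, y, P, labels, epochs=20, lr=0.05):
    """LVQ1: the winning (nearest) prototype moves toward a sample of
    its own class and away from a sample of the other class."""
    P = P.copy()
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            w = int(np.argmin(np.linalg.norm(P - xi, axis=1)))
            step = lr if labels[w] == yi else -lr
            P[w] += step * (xi - P[w])
    return P

def predict(P, labels, X):
    # Nearest-prototype classification (the competitive, "selective" layer).
    d = np.linalg.norm(X[:, None, :] - P[None, :, :], axis=2)
    return labels[np.argmin(d, axis=1)]

X, y = sample(200)
P = train_lvq1(X, y, protos0, labels)
acc = float((predict(P, labels, X) == y).mean())
```

The competitive prototype layer plays the role the book assigns to V1 simple cells (selectivity), while the read-out over prototype labels plays the generalizing role of complex cells; the only invariant the learner extracts is the degree of correlation between the two parts of the input.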
The material elements of writing have long been undervalued, and have been dismissed by recent historicising trends of criticism; but analysis of these elements - sound, signature, letters - can transform our understanding of literary texts. In this book Tom Cohen shows how, in an era of representational criticism and cultural studies, the role of close reading has been overlooked. Arguing that much recent criticism has been caught in potentially regressive models of representation, Professor Cohen undertakes to counter this by rethinking the 'materiality' of the text itself. Through a series of revealing new readings of the work of writers including Plato, Bakhtin, Poe, Whitman and Conrad, Professor Cohen exposes the limitations of new historicism and neo-pragmatism, and demonstrates how 'the materiality of language' operates to undo the representational models of meaning imposed by the literary canon.
This is a comprehensive study of the English word 'or', and the logical operators variously proposed to represent its meaning. Although there are indisputably disjunctive uses of or in English, it is a mistake to suppose that logical disjunction represents its core meaning. 'Or' is descended from the Anglo-Saxon word meaning 'second', a form which survives in such expressions as "every other day." Its disjunctive uses arise through metalinguistic applications of an intermediate adverbial meaning which is conjunctive rather than disjunctive in character. These conjunctive uses have puzzled philosophers and logicians, and have been discussed extensively under such headings as "free choice permission." This study examines the textbook myths that have clouded our understanding of how or and other "logical" vocabulary comes to have something approaching its logical meaning in natural languages. It considers the various historical conceptions of disjunction and its place in logic from the Stoics to the present day.
Knowledge and Lotteries is organized around an epistemological puzzle: in many cases, we seem consistently inclined to deny that we know a certain class of propositions, while crediting ourselves with knowledge of propositions that imply them. In its starkest form, the puzzle is this: we do not think we know that a given lottery ticket will be a loser, yet we normally count ourselves as knowing all sorts of ordinary things that entail that its holder will not suddenly acquire a large fortune. After providing a number of specific and general characterizations of the puzzle, Hawthorne carefully examines the competing merits of candidate solutions. In so doing, he explores a number of central questions concerning the nature and importance of knowledge, including the relationship of knowledge to assertion and practical reasoning, the status of epistemic closure principles, the merits of various brands of scepticism, the prospects for a contextualist account of knowledge, and the potential for other sorts of salience-sensitive accounts. Along the way, he offers a careful treatment of pertinent issues at the foundations of semantics. His book will be of interest to anyone working in the field of epistemology, as well as to philosophers of language.
Contemporary philosophy and theoretical psychology are dominated by an acceptance of content-externalism: the view that the contents of one's mental states are constitutively, as opposed to causally, dependent on facts about the external world. In the present work, it is shown that content-externalism involves a failure to distinguish between semantics and pre-semantics---between, on the one hand, the literal meanings of expressions and, on the other hand, the information that one must exploit in order to ascertain their literal meanings. It is further shown that, given the falsity of content-externalism, the falsity of the Computational Theory of Mind (CTM) follows. It is also shown that CTM involves a misunderstanding of terms such as "computation," "syntax," "algorithm," and "formal truth." Novel analyses of the concepts expressed by these terms are put forth. These analyses yield clear, intuition-friendly, and extensionally correct answers to the questions "what are propositions?", "what is it for a proposition to be true?", and "what are the logical and psychological differences between conceptual (propositional) and non-conceptual (non-propositional) content?" Naively taking literal meaning to be in lockstep with cognitive content, Burge, Salmon, Falvey, and other semantic externalists have wrongly taken Kripke's correct semantic views to justify drastic and otherwise contraindicated revisions of commonsense. (Salmon: What is non-existent exists; at a given time, one can rationally accept a proposition and its negation. Burge: Somebody who is having a thought may be psychologically indistinguishable from somebody who is thinking nothing. Falvey: Somebody who rightly believes himself to be thinking about water is psychologically indistinguishable from somebody who wrongly thinks himself to be doing so and who, indeed, isn't thinking about anything.)
Given a few truisms concerning the differences between thought-borne and sentence-borne information, the data is easily modeled without conceding any legitimacy to any one of these rationality-dismantling atrocities. (It thus turns out, ironically, that no one has done more to undermine Kripke's correct semantic points than Kripke's own followers!).
This book analyzes--in terms of branching--the pervasive reorganization of Latin syntactic and morphological structures: in the development from Latin to French, a shift can be observed from the archaic, left-branching structures (which Latin inherited from Proto-Indo-European) to modern right-branching equivalents. Brigitte Bauer presents a detailed analysis of this development based on the theoretical discussion and definition of "branching" and "head." Subsequently she relates the diachronic shift to psycholinguistic evidence, arguing that the difficulty of left-branching (LB) complex structures, as reflected in their painstaking and delayed acquisition, accounts for the extensive typological shift from left to right branching that took place in Latin/French and the other Indo-European languages.
Are propositions of law true or false? If so, what does it mean to say that propositions of law are true and false? This book takes up these questions in the context of the wider philosophical debate over realism and anti-realism. Despite surface differences, Patterson argues that the leading contemporary jurisprudential theories all embrace a flawed conception of the nature of truth in law. Instead of locating that in virtue of which propositions of law are true, Patterson argues that lawyers use forms of argument to show the truth of propositions of law. Additionally, Patterson argues that the realism/anti-realism debate in jurisprudence is part of a larger argument over the role of postmodernism in jurisprudence. For this, Patterson offers an analytic account of postmodernism and charts its implications for legal theory. This book will be of interest to those in legal theory, philosophy, social and political theory, and ethics.
Mental Spaces is the classic introduction to the study of mental spaces and conceptual projection, as revealed through the structure and use of language. It examines in detail the dynamic construction of connected domains as discourse unfolds. The discovery of mental space organization has modified our conception of language and thought: powerful and uniform accounts of superficially disparate phenomena have become available in the areas of reference, presupposition projection, counterfactual and analogical reasoning, metaphor and metonymy, and time and aspect in discourse. The present work lays the foundation for this research. It uncovers simple and general principles that lie behind the awesome complexity of everyday logic.
This book opens up a new route to the study of knowledge dynamics and the sociology of knowledge. The focus is on metaphors as powerful catalysts: the book dissects their role in the construction of theories of knowledge, and will therefore be of vital interest to social and cognitive scientists alike.
Split constructions are widespread in natural languages. The separation of the semantic restriction of a quantifier from that quantifier is a typical example of such a construction. This study addresses the problem that such discontinuous strings exhibit--namely, a number of locality constraints, including intervention effects. These are shown to follow from the interaction of a minimalist syntax with a semantics that directly assigns a model-theoretic interpretation to syntactic logical forms. The approach is shown to have wide empirical coverage and conceptual simplicity. The book will be of interest to scholars and advanced students of syntax and semantics.
Focusing on Truth explores the question of what truth is, balancing historical with issue-orientated discussion. The book offers a comprehensive survey of all the major theories of truth. Lawrence Johnson investigates a number of closely related matters of truth in his inquiry, such as: What sorts of things are true or false? What is attributed to them when they are said to be true or false? What do facts have to do with truth? What can we learn from previous theories? The book opens with an analysis of the coherence theory of truth and then the correspondence theory of truth, as developed by Moore, Russell and Wittgenstein. Through a study of the semantic conceptions of truth, the author reveals that an adequate theory of truth must take account of the pragmatics of person, purpose, and circumstance. A full understanding of facts and truth bearers is considered central to Johnson's criticism of the opposing truth theories of J. L. Austin and P. F. Strawson. Drawing on the merits of these theories and others, while identifying their deficiencies, Johnson presents a new account of truth, based on the correlation of referential foci and the use of linguistic conventions. This account is defended as being adequate to meet the legitimate demands made on a theory of truth. Johnson argues that the account leaves scope for statements of many different sorts to be true in their own widely varying ways, without the existence of a need to posit fundamentally different kinds of truth.
Many people find themselves dissatisfied with recent linguistic philosophy, and yet know that language has always mattered deeply to philosophy and must in some sense continue to do so. Ian Hacking considers here some dozen case studies in the history of philosophy to show the different ways in which language has been important, and the consequences for the development of the subject. There are chapters on, among others, Hobbes, Berkeley, Russell, Ayer, Wittgenstein, Chomsky, Feyerabend and Davidson. Dr Hacking ends by speculating about the directions in which philosophy and the study of language seem likely to go. The book will provide students with a stimulating, broad survey of problems in the theory of meaning and the development of philosophy, particularly in this century. The topics treated in the philosophy of language are among the central, current concerns of philosophers, and the historical framework makes it possible to introduce concretely and intelligibly all the main theoretical issues.
Meaning seems to shift from context to context; how do we know when someone says "grab a chair" that an ottoman or orange crate will do, but when someone says "let's buy a chair," they won't? In Plastic Glasses and Church Fathers, Kronenfeld offers a theory that explains both the usefulness of language's variability of reference and the mechanisms which enable us to understand each other in spite of the variability. Kronenfeld's theory, rooted in the tradition of ethnoscience (or cognitive anthropology), accomplishes three things. First, it distinguishes prototypic referents from extended referents. Second, it describes the various bases of semantic extensions. Finally, it details how we use the situational context of usage, the linguistic context of opposition and inclusion, and the conceptual context of knowledge about the world to interpret communicative events.
The analytic movement advertised its 'linguistic turn' as a radical break from the two-thousand-year-old substance tradition. But this is an illusion. On the fundamental level of ontology, there is enough reformulation and presupposition of traditional 'no entity without identity' themes to analogize Frege, Russell, Wittgenstein, and Quine to Aristotle as paradigmatic of modified realism. Thus the pace of ontology is glacial. Frege and Russell, not Wittgenstein and Quine, emerge as the true analytic progenitors of 'no entity without identity,' offering between them at least twenty-nine private language arguments and sixty-four 'no entity without identity' theories.
This contribution to Palgrave's 'Advances' series addresses a wide range of issues that have arisen in post-Gricean pragmatic theory, in chapters by distinguished authors. Among the specific topics covered are scalar implicatures, lexical semantics and pragmatics, indexicality, procedural meaning, and the semantics and pragmatics of negation. The volume includes both defences and critiques of Relevance Theory and of Neo-Gricean Pragmatics.
The author of the highly popular book Think, which Time magazine hailed as "the one book every smart person should read to understand, and even enjoy, the key questions of philosophy," Simon Blackburn is that rara avis--an eminent thinker who is able to explain philosophy to the general reader. Now Blackburn offers a tour de force exploration of what he calls "the most exciting and engaging issue in the whole of philosophy"--the age-old war over truth. The front lines of this war are well defined. On one side are those who believe in plain, unvarnished facts, rock-solid truths that can be found through reason and objectivity--that science leads to truth, for instance. Their opponents mock this idea. They see the dark forces of language, culture, power, gender, class, ideology and desire--all subverting our perceptions of the world, and clouding our judgement with false notions of absolute truth. Beginning with an early skirmish in the war--when Socrates confronted the sophists in ancient Athens--Blackburn offers a penetrating look at the longstanding battle these two groups have waged, examining the philosophical battles fought by Plato, Protagoras, William James, David Hume, Hans-Georg Gadamer, Jacques Derrida, Michel Foucault, Richard Rorty, and many others, with a particularly fascinating look at Nietzsche. Among the questions Blackburn considers are: is science mere opinion, can historians understand another historical period, and indeed can one culture ever truly understand another. Blackburn concludes that both sides have merit, and that neither has exclusive ownership of truth. What is important is that, whichever side we embrace, we should know where we stand and what is to be said for our opponents.
McCarthy develops a theory of radical interpretation--the project of characterizing from scratch the language and attitudes of an agent or population--and applies it to the problems of indeterminacy of interpretation first described by Quine. The major theme in McCarthy's study is that a relatively modest set of interpretive principles, properly applied, can serve to resolve the major indeterminacies of interpretation.