Ineffability, method, and ontology, by G. Bergmann.--The glory and the misery of Ludwig Wittgenstein, by G. Bergmann.--Stenius on the Tractatus, by G. Bergmann.--Naming and saying, by W. Sellars.--The ontology of Wittgenstein's Tractatus, by E. D. Klemke.--Material properties in the Tractatus, by H. Hochberg.--Wittgenstein's pantheism: a new light on the ontology of the Tractatus, by N. Garver.--Science and metaphysics: a Wittgensteinian interpretation, by H. Petrie.--Wittgenstein on private languages, by C. L. Hardin.--Wittgenstein on private language, by N. Garver.--Wittgenstein and private languages, by W. Todd.--The private-language argument, by H.-N. Castañeda.--Wittgenstein on privacy, by J. W. Cook.--"Forms of life" in Wittgenstein's Philosophical investigations, by J. F. M. Hunter.--Privacy and language, by M. S. Gram.--On language games and forms of life, by F. Zabeeh.--Wittgenstein on meaning and use, by J. F. M. Hunter.--Wittgenstein on phenomenalism, skepticism, and criteria, by A. Oldenquist.--Tractarian reflections on saying and showing, by D. W. Stampe.--Wittgenstein and logical necessity, by B. Stroud.--Negation and generality, by H. Hochberg.--Facts, possibilities, and essences in the Tractatus, by H. Hochberg.--Arithmetic and propositional form in Wittgenstein's Tractatus, by H. Hochberg.--Selected bibliography (p. 543-546).
Contemporary philosophy and theoretical psychology are dominated by an acceptance of content-externalism: the view that the contents of one's mental states are constitutively, as opposed to causally, dependent on facts about the external world. In the present work, it is shown that content-externalism involves a failure to distinguish between semantics and pre-semantics---between, on the one hand, the literal meanings of expressions and, on the other hand, the information that one must exploit in order to ascertain their literal meanings. It is further shown that, given the falsity of content-externalism, the falsity of the Computational Theory of Mind (CTM) follows. It is also shown that CTM involves a misunderstanding of terms such as "computation," "syntax," "algorithm," and "formal truth." Novel analyses of the concepts expressed by these terms are put forth. These analyses yield clear, intuition-friendly, and extensionally correct answers to the questions "what are propositions?", "what is it for a proposition to be true?", and "what are the logical and psychological differences between conceptual (propositional) and non-conceptual (non-propositional) content?" Naively taking literal meaning to be in lockstep with cognitive content, Burge, Salmon, Falvey, and other semantic externalists have wrongly taken Kripke's correct semantic views to justify drastic and otherwise contraindicated revisions of commonsense. (Salmon: What is non-existent exists; at a given time, one can rationally accept a proposition and its negation. Burge: Somebody who is having a thought may be psychologically indistinguishable from somebody who is thinking nothing. Falvey: Somebody who rightly believes himself to be thinking about water is psychologically indistinguishable from somebody who wrongly thinks himself to be doing so and who, indeed, isn't thinking about anything.)
Given a few truisms concerning the differences between thought-borne and sentence-borne information, the data is easily modeled without conceding any legitimacy to any one of these rationality-dismantling atrocities. (It thus turns out, ironically, that no one has done more to undermine Kripke's correct semantic points than Kripke's own followers!)
Focusing on Truth explores the question of what truth is, balancing historical with issue-oriented discussion. The book offers a comprehensive survey of all the major theories of truth. Lawrence Johnson investigates a number of closely related questions about truth in his inquiry, such as: What sorts of things are true or false? What is attributed to them when they are said to be true or false? What do facts have to do with truth? What can we learn from previous theories? The book opens with an analysis of the coherence theory of truth and then the correspondence theory of truth, as developed by Moore, Russell, and Wittgenstein. Through a study of the semantic conceptions of truth, the author reveals that an adequate theory of truth must take account of the pragmatics of person, purpose, and circumstance. A full understanding of facts and truth bearers is considered central to Johnson's criticism of the opposing truth theories of J. L. Austin and P. F. Strawson. Drawing on the merits of these theories and others, while identifying their deficiencies, Johnson presents a new account of truth, based on the correlation of referential foci and the use of linguistic conventions. This account is defended as being adequate to meet the legitimate demands made on a theory of truth. Johnson argues that the account leaves scope for statements of many different sorts to be true in their own widely varying ways, without needing to posit fundamentally different kinds of truth.
Knowledge and Lotteries is organized around an epistemological puzzle: in many cases, we seem consistently inclined to deny that we know a certain class of propositions, while crediting ourselves with knowledge of propositions that imply them. In its starkest form, the puzzle is this: we do not think we know that a given lottery ticket will be a loser, yet we normally count ourselves as knowing all sorts of ordinary things that entail that its holder will not suddenly acquire a large fortune. After providing a number of specific and general characterizations of the puzzle, Hawthorne carefully examines the competing merits of candidate solutions. In so doing, he explores a number of central questions concerning the nature and importance of knowledge, including the relationship of knowledge to assertion and practical reasoning, the status of epistemic closure principles, the merits of various brands of scepticism, the prospects for a contextualist account of knowledge, and the potential for other sorts of salience-sensitive accounts. Along the way, he offers a careful treatment of pertinent issues at the foundations of semantics. His book will be of interest to anyone working in the field of epistemology, as well as to philosophers of language.
The ability to produce and understand referring expressions is basic to human language use and human cognition. Reference comprises the ability to think of and represent objects (both real and imagined/fictional), to indicate to others which of these objects we are talking about, and to determine what others are talking about when they use a nominal expression. The articles in this volume are concerned with some of the central themes and challenges in research on reference within the cognitive sciences: philosophy (including philosophy of language and mind, logic, and formal semantics), theoretical and computational linguistics, and cognitive psychology. The papers address four basic questions: What is reference? What is the appropriate analysis of different referring forms, such as definite descriptions? How is reference resolved? And how do speakers/writers select appropriate referring forms, such as pronouns vs. full noun phrases, demonstrative vs. personal pronouns, and overt vs. null/zero pronominal forms? Some of the papers assume and build on existing theories, such as Centering Theory and the Givenness Hierarchy framework; others propose their own models of reference understanding or production. The essays examine reference from a number of disciplinary and interdisciplinary perspectives, informed by different research traditions and employing different methodologies. While the contributors to the volume were primarily trained in one of the four represented disciplines (computer science, linguistics, philosophy, and psychology) and use methodologies typical of that discipline, each of them bridges more than one discipline in their methodology and/or their approach.
This contribution to Palgrave's 'Advances' series addresses a wide range of issues that have arisen in post-Gricean pragmatic theory, in chapters by distinguished authors. Among the specific topics covered are scalar implicatures, lexical semantics and pragmatics, indexicality, procedural meaning, and the semantics and pragmatics of negation. The volume includes both defences and critiques of Relevance Theory and of Neo-Gricean Pragmatics.
The material elements of writing have long been undervalued, and have been dismissed by recent historicising trends of criticism; but analysis of these elements - sound, signature, letters - can transform our understanding of literary texts. In this book Tom Cohen shows how, in an era of representational criticism and cultural studies, the role of close reading has been overlooked. Arguing that much recent criticism has been caught in potentially regressive models of representation, Professor Cohen undertakes to counter this by rethinking the 'materiality' of the text itself. Through a series of revealing new readings of the work of writers including Plato, Bakhtin, Poe, Whitman and Conrad, Professor Cohen exposes the limitations of new historicism and neo-pragmatism, and demonstrates how 'the materiality of language' operates to undo the representational models of meaning imposed by the literary canon.
The analytic movement advertised its 'linguistic turn' as a radical break from the two-thousand-year-old substance tradition. But this is an illusion. On the fundamental level of ontology, there is enough reformulation and presupposition of traditional 'no entity without identity' themes to analogize Frege, Russell, Wittgenstein, and Quine to Aristotle as paradigmatic of modified realism. Thus the pace of ontology is glacial. Frege and Russell, not Wittgenstein and Quine, emerge as the true analytic progenitors of 'no entity without identity,' offering between them at least twenty-nine private language arguments and sixty-four 'no entity without identity' theories.
This book looks at the ways in which conditionals, an integral part of philosophy and logic, can be of practical use in computer programming. It analyzes the different types of conditionals, including their applications and potential problems. Other topics include defeasible logics, the Ramsey test, and a unified view of consequence relation and belief revision. Its implications will be of interest to researchers in logic, philosophy, and computer science, particularly artificial intelligence.
This book opens up a new route to the study of knowledge dynamics and the sociology of knowledge. The focus is on the role of metaphors as powerful catalysts; the book dissects their role in the construction of theories of knowledge, and will therefore be of vital interest to social and cognitive scientists alike.
This book analyzes--in terms of branching--the pervasive reorganization of Latin syntactic and morphological structures: in the development from Latin to French, a shift can be observed from the archaic, left-branching structures (which Latin inherited from Proto-Indo-European) to modern right-branching equivalents. Brigitte Bauer presents a detailed analysis of this development based on the theoretical discussion and definition of "branching" and "head." Subsequently she relates the diachronic shift to psycholinguistic evidence, arguing that the difficulty of left-branching (LB) complex structures, as reflected in their painstaking and delayed acquisition, accounts for the extensive typological shift from left to right branching that took place in Latin/French and the other Indo-European languages.
Are propositions of law true or false? If so, what does it mean to say that propositions of law are true or false? This book takes up these questions in the context of the wider philosophical debate over realism and anti-realism. Despite surface differences, Patterson argues that the leading contemporary jurisprudential theories all embrace a flawed conception of the nature of truth in law. Instead of locating that in virtue of which propositions of law are true, Patterson argues that lawyers use forms of argument to show the truth of propositions of law. Additionally, Patterson argues that the realism/anti-realism debate in jurisprudence is part of a larger argument over the role of postmodernism in jurisprudence. To this end, Patterson offers an analytic account of postmodernism and charts its implications for legal theory. This book will be of interest to those in legal theory, philosophy, social and political theory, and ethics.
Split constructions are widespread in natural languages. The separation of the semantic restriction of a quantifier from that quantifier is a typical example of such a construction. This study addresses the problem that such discontinuous strings exhibit--namely, a number of locality constraints, including intervention effects. These are shown to follow from the interaction of a minimalist syntax with a semantics that directly assigns a model-theoretic interpretation to syntactic logical forms. The approach is shown to have wide empirical coverage and conceptual simplicity. The book will be of interest to scholars and advanced students of syntax and semantics.
This book attempts to marry truth-conditional semantics with cognitive linguistics in the church of computational neuroscience. To this end, it examines the truth-conditional meanings of coordinators, quantifiers, and collective predicates as neurophysiological phenomena that are amenable to a neurocomputational analysis. Drawing inspiration from work on visual processing, and especially the simple/complex cell distinction in early vision (V1), we claim that a similar two-layer architecture is sufficient to learn the truth-conditional meanings of the logical coordinators and logical quantifiers. As a prerequisite, much discussion is given over to what a neurologically plausible representation of the meanings of these items would look like. We eventually settle on a representation in terms of correlation, so that, for instance, the semantic input to the universal operators (e.g. and, all) is represented as maximally correlated, while the semantic input to the universal negative operators (e.g. nor, no) is represented as maximally anticorrelated. On the basis of this representation, the hypothesis can be offered that the function of the logical operators is to extract an invariant feature from natural situations, that of degree of correlation between parts of the situation. This result sets up an elegant formal analogy to recent models of visual processing, which argue that the function of early vision is to reduce the redundancy inherent in natural images. Computational simulations are designed in which the logical operators are learned by associating their phonological form with some degree of correlation in the inputs, so that the overall function of the system is as a simple kind of pattern recognition. Several learning rules are assayed, especially those of the Hebbian sort, which are the ones with the most neurological support. Learning vector quantization (LVQ) is shown to be a perspicuous and efficient means of learning the patterns that are of interest.
We draw a formal parallelism between the initial, competitive layer of LVQ and the simple cell layer in V1, and between the final, linear layer of LVQ and the complex cell layer in V1, in that the initial layers are both selective, while the final layers both generalize. It is also shown how the representations argued for can be used to draw the traditionally recognized inferences arising from coordination and quantification, and why the inference of subalternacy breaks down for collective predicates. Finally, the analogies between early vision and the logical operators allow us to advance the claim of cognitive linguistics that language is not processed by proprietary algorithms, but rather by algorithms that are general to the entire brain. Thus in the debate between objectivist and experiential metaphysics, this book falls squarely into the camp of the latter. Yet it does so by means of a rigorous formal, mathematical, and neurological exposition, in contradiction of the experiential claim that formal analysis has no place in the understanding of cognition. To make our own counter-claim as explicit as possible, we present a sketch of the LVQ structure in terms of mereotopology, in which the initial layer of the network performs topological operations, while the final layer performs mereological operations. The book is meant to be self-contained, in the sense that it does not assume any prior knowledge of any of the many areas that are touched upon. It therefore contains mini-summaries of: biological visual processing, especially the retinocortical and ventral ('what') parvocellular pathways; computational models of neural signaling, in particular the reduction of the Hodgkin-Huxley equations to the connectionist and integrate-and-fire neurons; Hebbian learning rules and the elaboration of learning vector quantization; the linguistic pathway in the left hemisphere; memory and the hippocampus; truth-conditional vs. image-schematic semantics; and objectivist vs. experiential metaphysics and mereotopology. All of the simulations are implemented in MATLAB, and the code is available from the book's website. • The discovery of several algorithmic similarities between vision and semantics. • The support of these claims by means of simulations, and the packaging of all of this in a coherent theoretical framework.
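The LVQ learning scheme described in this abstract can be illustrated with a minimal LVQ1 sketch. This is our own toy illustration, not the book's MATLAB code: the function names, the two-element "truth-value pair" inputs, and the correlated/anticorrelated class labels are all assumptions made for the example.

```python
import numpy as np

def train_lvq(X, y, protos, proto_labels, lr=0.05, epochs=100):
    """LVQ1: attract the winning prototype toward a correctly
    classified input; repel it from a misclassified one."""
    P = protos.copy()
    for _ in range(epochs):
        for x, c in zip(X, y):
            # competitive ("selective") step: find the nearest prototype
            w = int(np.argmin(np.linalg.norm(P - x, axis=1)))
            step = lr * (x - P[w])
            P[w] += step if proto_labels[w] == c else -step
    return P

def classify(P, proto_labels, x):
    """Label of the nearest prototype."""
    return int(proto_labels[np.argmin(np.linalg.norm(P - x, axis=1))])

# Four input patterns: truth-value pairs for two conjuncts.
X = np.array([[1., 1.], [0., 0.], [1., 0.], [0., 1.]])
# Hypothetical labels: 0 = maximally correlated input ("and"/"all"-like),
# 1 = maximally anticorrelated input ("nor"/"no"-like).
y = np.array([0, 0, 1, 1])

# One prototype per training pattern, initialized near the data
# (a common LVQ initialization strategy).
rng = np.random.default_rng(0)
protos = X + 0.01 * rng.standard_normal(X.shape)
P = train_lvq(X, y, protos, y)

print(classify(P, y, np.array([0.9, 0.05])))  # a noisy anticorrelated pair
```

Note the two-layer shape the book's parallelism turns on: the distance/argmin competition plays the role of the selective initial layer, while reading off the winner's class label plays the role of the generalizing final layer.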
Semantic Analysis is a lively and clearly written introduction to the study of meaning in language and to the language-culture connection. Goddard covers traditional and contemporary issues and approaches with the relationship between semantics, conceptualization, and culture as a key theme. He also details a number of case studies that draw on a wide range of material from non-Indo-European languages, particularly Australian Aboriginal languages and Malay, on which the author is an authority.
McCarthy develops a theory of radical interpretation--the project of characterizing from scratch the language and attitudes of an agent or population--and applies it to the problems of indeterminacy of interpretation first described by Quine. The major theme in McCarthy's study is that a relatively modest set of interpretive principles, properly applied, can serve to resolve the major indeterminacies of interpretation.
The author of the highly popular book Think, which Time magazine hailed as "the one book every smart person should read to understand, and even enjoy, the key questions of philosophy," Simon Blackburn is that rara avis--an eminent thinker who is able to explain philosophy to the general reader. Now Blackburn offers a tour de force exploration of what he calls "the most exciting and engaging issue in the whole of philosophy"--the age-old war over truth. The front lines of this war are well defined. On one side are those who believe in plain, unvarnished facts, rock-solid truths that can be found through reason and objectivity--that science leads to truth, for instance. Their opponents mock this idea. They see the dark forces of language, culture, power, gender, class, ideology and desire--all subverting our perceptions of the world, and clouding our judgement with false notions of absolute truth. Beginning with an early skirmish in the war--when Socrates confronted the sophists in ancient Athens--Blackburn offers a penetrating look at the longstanding battle these two groups have waged, examining the philosophical battles fought by Plato, Protagoras, William James, David Hume, Hans-Georg Gadamer, Jacques Derrida, Michel Foucault, Richard Rorty, and many others, with a particularly fascinating look at Nietzsche. Among the questions Blackburn considers are: is science mere opinion, can historians understand another historical period, and indeed can one culture ever truly understand another. Blackburn concludes that both sides have merit, and that neither has exclusive ownership of truth. What is important is that, whichever side we embrace, we should know where we stand and what is to be said for our opponents.
A comprehensive introduction to the ways in which meaning is conveyed in language. Alan Cruse covers semantic matters, but also deals with topics that are usually considered to fall under pragmatics. A major aim is to highlight the richness and subtlety of meaning phenomena, rather than to expound any particular theory. Rich in examples and exercises, Meaning in Language provides an invaluable descriptive approach to this area of linguistics for undergraduates and postgraduates alike.
Meaning seems to shift from context to context; how do we know when someone says "grab a chair" that an ottoman or orange crate will do, but when someone says "let's buy a chair," they won't? In Plastic Glasses and Church Fathers, Kronenfeld offers a theory that explains both the usefulness of language's variability of reference and the mechanisms which enable us to understand each other in spite of the variability. Kronenfeld's theory, rooted in the tradition of ethnoscience (or cognitive anthropology), accomplishes three things. First, it distinguishes prototypic referents from extended referents. Second, it describes the various bases of semantic extensions. Finally, it details how we use the situational context of usage, the linguistic context of opposition and inclusion, and the conceptual context of knowledge about the world to interpret communicative events.
The concept of truth lies at the heart of philosophy; whether one approaches it from epistemology or metaphysics, from the philosophy of language or the philosophy of science or religion, one must come to terms with the nature of truth. In this brisk introduction, Frederick Schmitt covers all the most important historical and contemporary theories of truth. Along the way he also sheds considerable light on such closely related issues as realism and idealism, absolutism and relativism, and the nature of contemporary pragmatism. At a time when it is fashionable for scholars outside of philosophy to deny the possibility of truth, Schmitt's lucid, technically accurate survey offers the easiest way to understand what is really at stake in such denials. Truth: A Primer is a quick but accurate and philosophically sophisticated overview that will prove invaluable to philosophers and their students in a wide range of courses, in particular epistemology, metaphysics, and philosophy of language.
The nature of reference, or the relation of a word to the object to which it refers, has been perhaps the dominant concern of twentieth-century analytic philosophy. Extremely influential arguments by Gottlob Frege around the turn of the century convinced the large majority of philosophers that the meaning of a word must be distinguished from its referent, the former only providing some kind of direction for reaching the latter. In the last twenty years, this Fregean orthodoxy has been vigorously challenged by those who argue that certain important kinds of words, at least, refer directly without need of an intermediate meaning or sense. The essays in this volume record how a long-term study of Frege has persuaded the author that Frege's pivotal distinction between sense and reference, and his attendant philosophical views about language and thought, are unsatisfactory. Frege's perspective, he argues, imposes a distinctive way of thinking about semantics, specifically about the centrality of cognitive significance puzzles for semantics. Freed from Frege's perspective, we will no longer find it natural to think about semantics in this way.
Exponents and critics of semantic presupposition have almost invariably based their discussion on the ('Standard') definition of presupposition implied by Frege and Strawson. In this study Noel Burton-Roberts argues convincingly against this definition, which leads to a three-valued semantics. He presents a very simple semantic definition which is weaker, more general, and leads to a semantics more easily interpreted as two-valued with gaps. The author shows that a wide range of intuitive facts that eluded the Standard definition follow directly from this ('Revised') definition itself: facts about the presuppositions of compound sentences and modal sentences, about presuppositional conflict, and about differences in the logical status of simple sentences suffering from presupposition failure. The book includes a detailed argument that an ambiguity of natural language negation, generally assumed to be necessary to the defence of semantic presupposition, is neither possible nor necessary in a presuppositional semantics. Noel Burton-Roberts has made an authoritative contribution to a debate which has involved philosophers and linguists for many years. His command of the issues, his clarity of exposition and his theoretical insight may well serve to change the boundaries of that debate.
Investigates the possibility of constructing an interdisciplinary ontology to address such fundamental issues as guidelines for behavior and the validity and scope of knowledge from other than a limited perspective.