(i) Languages are indefinitely various along every dimension. (ii) Languages are essentially systems of habit/dispositions. (iii) Languages are learnt from experience via analogy and generalisation. (iv) There is no component of the speaker/hearer’s psychology that is specifically linguistic. (v) Syntactic relations are ones of surface immediate constituency. (vi) Linguistics is a descriptive/taxonomic science - there is nothing to explain.
Deflationism is perhaps the prevailing conception of truth within contemporary philosophy. The chief reason for this ascendancy, I think, is that deflationary theories present themselves as neutral between all disputes in epistemology and metaphysics. This offers deflationism a straightforward dialectical advantage over the more traditional theories that seek to explicate…
"It as little occurs to me to get involved in the philosophical quarrels and arguments of my times as to go down an alley and take part in a scuffle when I see the mob fighting there." — Arthur Schopenhauer, 1828–30, 'Adversaria', in Manuscript Remains, Vol. 3: Berlin Manuscripts (1818–1830). Oxford: Berg Publishers.
Among the many philosophers who hold that causal facts are to be explained in terms of—or, more ambitiously, shown to reduce to—facts about what happens, together with facts about the fundamental laws that govern what happens, the clear favorite is an approach that sees counterfactual dependence as the key to such explanation or reduction. The paradigm examples of causation, so advocates of this approach tell us, are examples in which events c and e—the cause and its effect—both occur, but: had c not occurred, e would not have occurred either. From this starting point ideas proliferate in a vast profusion. But the remarkable disparity among these ideas should not obscure their common foundation. Neither should the diversity of opinion about the prospects for a philosophical analysis of causation obscure their importance. For even those philosophers who see these prospects as dim—perhaps because they suffer post-Quinean queasiness at the thought of any analysis of any concept of interest—can often be heard to say such things as that causal relations among events are somehow “a matter of” the patterns of counterfactual dependence to be found in them. It was not always so. Thirty-odd years ago, so-called “regularity” analyses (so-called, presumably, because they traced back to Hume’s well-known analysis of causation as constant conjunction) ruled the day, with Mackie’s Cement of the Universe embodying a classic statement. But they fell on hard times, both because of internal problems—which we will review in due course—and because dramatic improvements in philosophical understanding of counterfactuals made possible the emergence of a serious and potent rival: a counterfactual analysis of causation resting on foundations firm enough to repel the kind of philosophical suspicion that had formerly warranted dismissal.
It is widely held that propositions are structured entities. In The Nature and Structure of Content (2007), Jeff King argues that the structure of propositions is none other than the syntactic structure deployed by the speaker/hearers who linguistically produce and consume the sentences that express the propositions. The present paper generalises from King’s position and claims that syntax provides the best in-principle account of propositional structure. It further seeks to show, however, that the account faces severe problems pertaining to the fine individuation of propositions that the account entails. The ‘fineness of cut’ problem has been raised by Collins (The Unity of Linguistic Meaning, 2007) and others. King (Philos Stud 163(3):763–781, 2013) responds to these complaints in ways this paper rebuts. Thus, the very idea of structured propositions is brought into doubt, for the best in-principle account of such structure appears to fail.
Collins, John Francis; Carroll, Sandra. In the April 2012 edition of The Australasian Catholic Record (ACR), John Duiker presented a useful overview and history of the Catholic Charismatic Renewal (CCR) titled 'Spreading the Culture of Pentecost in the Midst of Disenchantment.' According to Duiker, the CCR as an ecclesial movement 'has its origins in a retreat that was held at Duquesne University in Pittsburgh, Pennsylvania in the USA in February 1967.' Describing this event as a Pentecost experience, Duiker writes that the movement that was started by this event 'spread to other college campuses and continued to spread right across the world, and now exists in over 220 countries and has touched the lives of over 120 million Catholics.' Duiker's article draws on Charles Taylor's thesis that our post-Enlightenment Western culture has been emptied out of the idea of God's providence, leading to 'a diminishing of the necessity of grace and a fading of the sense of mystery.' Duiker then presents a case for CCR being recognised 'as an example for the re-enchantment of a post-Enlightenment secular world.'
Collins, John Francis. In October this year there are to be two events at the Vatican. Beginning on 7 October and going through to 28 October, bishops from all over the world are to gather at a Synod on 'New Evangelization for the Transmission of the Christian Faith.' On 11 October, midway through the Synod, the whole Church will mark the fiftieth anniversary of the opening of the Second Vatican Council. The bishops who are to gather this year at the Synod follow in the footsteps of the more than 2000 bishops who gathered at the Second Vatican Council. John XXIII opened the Second Vatican Council with the following words: 'Looked at one way there is the deposit of faith or the truths which are contained in our doctrine which we venerate, looked at another way there is the way by which the same (the deposit of faith) is enunciated both in its meaning and its spirit.' In a recent interview for Salt and Light Television, the inaugural head of the Pontifical Council for the Promotion of the New Evangelisation, Archbishop Salvatore Fisichella, noted that what Vatican II did for the Church is still present in our community. Later in the interview the Archbishop stated that the 'New Evangelisation is not a new work, it is a new mentality; a new language, a new enthusiasm for announcing the gospel.' There is continuity between both the spirit and letter of the Archbishop's words recorded in 2012 and the words of John XXIII in opening Vatican II. That is, as a Church, what we are seeking is new ways to announce the meaning and spirit of the deposit of faith, the truths contained in doctrine. What would later be called the new evangelisation permeated Vatican II.
Griffiths and Machery (2008) argue that innateness is a 'folk biological' notion, which, as such, has no useful reconstruction in contemporary biology. If this is so, not only is it wrong to identify the vernacular notion with the precise theoretical concept of canalization, but worse, it would appear that many of the putative scientific claims for particular competences and capacities being innate are simply misplaced. The present paper challenges the core substantive claim of Griffiths and Machery's position, namely, that innateness understood on canalization lines as environment-independent development (somehow and to some degree) is a confused, outmoded notion. It will be contended that the modality-independence of language offers a prima facie case against Griffiths and Machery's general position.
Do lexical items have internal structure that contributes to, or determines, the stable interpretation of their potential hosts? One argument in favour of the claim that lexical items are so structured is that certain putative verbs appear to be ‘impossible’, where the intended interpretation of them is apparently precluded by the character of their internal structure. The adequacy of such reasoning has recently been debated by Fodor and Lepore and Johnson, but to no apparent resolution. The present paper argues that such ‘impossible word arguments’ for internal lexical structure, although not apodictic, do constitute inferences to the best explanation for such structure. Alternative explanations for the ‘impossible words’ are considered and rejected.
The problem of the unity of the proposition is almost as old as philosophy itself, and was one of the central themes of early analytical philosophy, greatly exercising the minds of Frege, Russell, Wittgenstein, and Ramsey. The problem is how propositions or meanings can be simultaneously unities (single things) and complexes, made up of parts that are autonomous of the positions they happen to fill in any given proposition. The problem has been associated with numerous paradoxes and has motivated general theories of thought and meaning, but has eluded any consensual resolution; indeed, the problem is sometimes thought to be wholly erroneous, a result of atomistic assumptions we should reject. In short, the problem has been thought to be of merely historical interest. Collins argues that the problem is very real and poses a challenge to any theory of linguistic meaning. He seeks to resolve the problem by laying down some minimal desiderata on a solution and presenting a uniquely satisfying account. The first part of the book surveys and rejects extant 'solutions' and dismissals of the problem from (especially) Frege and Russell, and a host of more contemporary thinkers, including Davidson and Dummett. The book's second part offers a novel solution based upon the properties of a basic syntactic principle called 'Merge', which may be said to create objects inside objects, thus showing how unities can be both single things but also made up of proper parts. The solution is defended from both philosophical and linguistic perspectives. The overarching ambition of the book, therefore, is to strengthen the ties between current linguistics and contemporary philosophy of language in a way that is genuinely sensitive to the history of both fields.
I argue that the dispute between two leading theories of interpretation of legal texts, textual originalism and textual evolutionism, depends on the false presupposition that changes in the way a word is used necessarily require a change in the word’s meaning. Semantic externalism goes a long way towards reconciling these views by showing how a word’s semantic properties can be stable over time, even through vicissitudes of usage. I argue that temporal externalism can account for even more semantic stability, however. Temporal externalism is the theory that the content of an utterance at time t may be determined by developments in linguistic usage subsequent to t. If this semantic theory is correct, then the originalist and evolutionist positions effectively collapse. Originalism is correct in that the original meaning of the text is the meaning that is binding on jurists, but evolutionism is vindicated, as it is the current practices and standards that determine the meaning the text now has, and has always had. Objections to temporal externalism, and to its application to the interpretation of legal texts, are considered and addressed.
This article explores the contemporary politics of global violence through an examination of the particular challenges and possibilities facing Palestinians who seek to defend their communities against an ongoing settler-colonial project (Zionism) that is approaching a crisis point. As the colonial dynamic in Israel/Palestine returns to its most elemental level – land, trees, homes – it also continues to be a laboratory for new forms of accelerated violence whose global impact is hard to overestimate. In such a context, Palestinians and international solidarity activists find themselves confronting a quintessential 21st-century activist dilemma: how to craft a strategy of what Paul Virilio calls “popular defense” at a time when everyone seems to be implicated in the machinery of global violence? I argue that while this dilemma represents a formidable challenge for Palestinians, it also helps explain why the Palestinian struggle is increasingly able to build bridges with wider struggles for global justice, ecological sustainability, and indigenous rights.
It is commonly assumed that natural languages, construed as sets of sentences, contain denumerably many sentences. One argument for this claim is that the sentences of a language must be recursively enumerable by a grammar, if we are to understand how a speaker-hearer could exhibit unbounded competence in a language. The paper defends this reasoning by articulating and defending a principle that excludes the construction of a sentence non-denumerably many words long.
Borg (2009) surveys and rejects a number of arguments in favour of semantic internalism. This paper, in turn, surveys and rejects all of Borg's anti-internalist arguments. My chief moral is that, properly conceived, semantic internalism is a methodological doctrine that takes its lead from current practice in linguistics. The unifying theme of internalist arguments, therefore, is that linguistics neither targets nor presupposes externalia. To the extent that this claim is correct, we should be internalists about linguistic phenomena, including semantics.
Robert Hanna (Rationality and Logic. MIT Press, Cambridge, 2006) articulates and defends the thesis of logical cognitivism, the claim that human logical competence is grounded in a cognitive faculty (in Chomsky’s sense) that is not naturalistically explicable. This position is intended to steer us between the Scylla of logical Platonism and the Charybdis of logical naturalism (/psychologism). The paper argues that Hanna’s interpretation of Chomsky is mistaken. Read aright, Chomsky’s position offers a defensible version of naturalism, one Hanna may accept as far as his version of naturalism goes, although not one that supports the claim that cognitive science offers a place for logic that is somehow outside the natural, contingent order.
A range of positions persists concerning the proper interpretation of generative linguistics. The paper responds to recent work in this area that either weakly or strongly diverges from the non-contentful, internalist model presented in Collins (2008a). Against the sympathetic criticisms of Matthews (2008) and Smith (2008), it is argued that a crucial role for content in our understanding of linguistic theories remains obscure, although the discussion here will hopefully clarify the divergence between the parties as merely perspectival. Rey (2008) more strongly argues that the non-contentful model is prey to some classic complaints. The charges are rebutted. Finally, the position of Devitt (2008a, b) is considered. It is argued that his most recent presentation of his brand of realism fails to speak to the fundamental complaints levelled against it, especially as regards the putative role of conventions in the explanation of unvoiced syntax.
This note briefly responds to Devitt’s (2008) riposte to Collins’s (2008a) argument that linguistic realism prima facie fails to accommodate unvoiced elements within syntax. It is argued that such elements remain problematic. For it remains unclear how conventions might target the distribution of PRO and how they might explain hierarchical structure that is presupposed by such distribution and which is not witnessed in concrete strings.
The article takes up a range of issues concerning knowledge of language in response to recent work of Rey, Smith, Matthews and Devitt. I am broadly sympathetic with the direction of Rey, Smith, and Matthews. While all three are happy with the locution ‘knowledge of language’, in their different ways they all reject the apparent role for a substantive linguistic epistemology in linguistic explanation. I concur but raise some friendly concerns over even a deflationary notion of knowledge of language. Against Devitt I have more serious worries. The latter half of the paper seeks further clarification of Devitt’s realism and raises concerns over its ability to reflect the shape and content of linguistics.
Most content externalists concede that even if externalism is compatible with the thesis that one has authoritative self-knowledge of thought contents, it is incompatible with the stronger claim that one is always able to tell by introspection whether two of one’s thought tokens have the same, or different, content. If one lacks such authoritative discriminative self-knowledge of thought contents, it would seem that brute logical error – non-culpable logical error – is possible. Some philosophers, such as Paul Boghossian, have argued that this would present a big problem for externalism, forcing the externalist to overhaul our norms of rationality. I consider several externalist strategies to block this possibly unhappy epistemological consequence, but I argue that they all fail.
The paper considers our ordinary mentalistic discourse in relation to what we should expect from any genuine science of the mind. A meta-scientific eliminativism is commended and distinguished from the more familiar eliminativism of Skinner and the Churchlands. Meta-scientific eliminativism views folk psychology qua folksy as unsuited to offer insight into the structure of cognition, although it might otherwise be indispensable for our social commerce and self-understanding. This position flows from a general thesis that scientific advance is marked by an eschewal of folk understanding. The latter half of the paper argues that, contrary to the received view, Chomsky's review of Skinner offers not just an argument against Skinner's eliminativism, but, more centrally, one in favour of the second eliminativism.
Much of the best contemporary work in the philosophy of language and content makes appeal to the theories developed in generative syntax. In particular, there is a presumption that—at some level and in some way—the structures provided by syntactic theory mesh with or support our conception of content/linguistic meaning as grounded in our first-person understanding of our communicative speech acts. This paper will suggest that there is no such tight fit. Its claim will be that, if recent generative theories are on the right lines, syntactic structure provides both too much and too little to serve as the structural partner for content, at least as that notion is generally understood in philosophy. The paper will substantiate these claims by an assessment of the recent work of King, Stanley, and others.
My contribution takes up a set of methodological and philosophical issues in linguistics that have recently occupied the work of Devitt and Rey. Devitt construes the theories of generative linguistics as being about an external linguistic reality of utterances, inscriptions, etc.; that is, Devitt rejects the ‘psychologistic’ construal of linguistics. On Rey’s conception, linguistics concerns the mental contents of speaker/hearers; there are no external linguistic items at all. I reject both views. Against Devitt, I argue that the philosophical issues in linguistics should be framed in terms of the theories themselves, not pre-theoretical conceptions from either philosophy or commonsense as to what linguistics is about or what a language is. In this light, I suggest that Devitt’s key arguments (concerning parameter setting, psychological reality, and the role of intuitions) do not make sense of current linguistic inquiry and so do not offer an adequate philosophical basis for that work. To this extent, I agree with Rey. Our differences emerge over the putative role of content in linguistic inquiry and how the concept of computation ought to be understood. Following the lead of Chomsky’s recent philosophical remarks, I argue that a theory of the language faculty should be understood as an abstract specification of the function that pairs ‘sound’ with ‘meaning’ rather than as a specification of the content the mind represents. But doesn’t ‘computation’ presuppose ‘representation’? I argue for a negative answer, at least if ‘representation’ is read intentionally. A ‘representation’ can be construed as brain structure that, at the present stage of inquiry, can only be picked out via the abstract concepts of linguistic theory. We are entitled to posit such structures insofar as they earn their explanatory keep over the output of the faculty.
The linguistic function is a way of setting the boundary conditions on what the brain must be doing such that humans get to be competent speaker/hearers, although we do not therefore take the function to be a story of the causal spring of linguistic performance.
Prinz (Furnishing the Mind: Concepts and Their Perceptual Basis, MIT Press, 2002) presents a new species of concept empiricism, under which concepts are off-line long-term memory networks of representations that are ‘copies’ of perceptual representations – proxytypes. An apparent obstacle to any such empiricism is the prevailing nativism of generative linguistics. The paper critically assesses Prinz’s attempt to overcome this obstacle. The paper argues that, prima facie, proxytypes are as incapable of accounting for the structure of the linguistic mind as are the more traditional species of empiricism. This position is then confirmed by looking in detail at two suggestions (one derived from recent connectionist research) from Prinz of how certain aspects of syntactic structure might be accommodated by the proxytype theory. It is shown that the suggestions fail to come to terms with both the data and theory of contemporary linguistics.
Temporal externalism (TE) is the thesis (defended by Jackman (1999)) that the contents of some of an individual’s thoughts and utterances at time t may be determined by linguistic developments subsequent to t. TE has received little discussion so far, Brown (2000) and Stoneham (2002) being exceptions. I defend TE by arguing that it solves several related problems concerning the extension of natural kind terms in scientifically ignorant communities. Gary Ebbs (2000) argues that no theory can reconcile our ordinary, practical judgments of sameness of extension over time with the claim that linguistic usage determines word extensions. I argue that Ebbs shows at most that no theory other than TE can effect this reconciliation. Furthermore, while Ebbs’ argument undermines Jessica Brown’s solutions to two closely related problems about natural kind term extensions (Brown 1998), TE can solve both problems without difficulty. Some criticisms of TE are briefly addressed as well.
Jerry Fodor, among others, has maintained that Chomsky's language faculty hypothesis is an epistemological proposal, i.e. the faculty comprises propositional structures known (cognized) by the speaker/hearer. Fodor contrasts this notion of a faculty with an architectural (directly causally efficacious) notion of a module. The paper offers an independent characterisation of the language faculty as an abstractly specified nonpropositional structure of the mind/brain that mediates between sound and meaning—a function in intension that maps to a pair of structures that determine sound–meaning convergence. This conception will be elaborated and defended against a number of likely complaints deriving from Fodor's faculty/module distinction and other positions which seek to credit knowledge of language with an empirical or theoretical significance. A recent explicit argument from Fodor that Chomsky must share his conception will be diagnosed and the common appeal to implicit knowledge as a foundation for linguistic competence will be rejected.
In recent years, a number of philosophers have argued against a biological understanding of the innate in favor of a narrowly psychological notion. On the other hand, Ariew ('Innateness and canalization', Philosophy of Science 63 (1996): S19–S27; 'Innateness is canalization: in defense of a developmental account of innateness', in V. Hardcastle (ed.), Where Biology Meets Psychology: Philosophical Essays, Cambridge, MA: MIT Press, 1999, pp. 117–138) has developed a novel substantial account of innateness based on developmental biology: canalization. The governing thought of this paper is that the notion of the innate, as it re-emerged with the work of Chomsky, is a general notion that applies equally to all biological traits. On this basis, the paper recommends canalization as a promising candidate account of the notion of the innate.
Jerry Fodor argues that the massive modularity thesis – the claim that (human) cognition is wholly served by domain specific, autonomous computational devices, i.e., modules – is a priori incoherent, self-defeating. The thesis suffers from what Fodor dubs the input problem: the function of a given module (proprietarily understood) in a wholly modular system presupposes non-modular processes. It will be argued that massive modularity suffers from no such a priori problem. Fodor, however, also offers what he describes as a really real input problem (i.e., an empirical one). It will be suggested that this problem is real enough, but it does not selectively strike down massive modularity – it is a problem for everyone.
Paul Horwich (1998), following a number of others, proposes a schematic compositional format for the specification of the meanings of complex expressions. The format is schematic in the sense that it identifies grammatical schemata that do not presuppose any particular account of primitive word meanings: whatever the nature of meanings, the application of the schemata to them will serve to explain compositionality. This signals, for Horwich, that compositionality is a non-substantive constraint on theories of meaning. Drawing on a range of linguistic data, the present paper argues that while the bare idea of compositionality indeed does not presuppose any account of meaning, Horwich's format is empirically inadequate. The argument here goes back to Chomsky's early position on the descriptive inadequacy of rewrite grammars and the consequent need for transformations. It will also be seen that the data militates for a general claim that meaning relevant structure is projected from words rather than imposed on them schematically. Finally, it will be indicated how this reasoning from syntactic considerations is flush with a more traditional philosophical understanding of compositionality.