Many people have argued that the evolution of the human language faculty cannot be explained by Darwinian natural selection. Chomsky and Gould have suggested that language may have evolved as the by-product of selection for other abilities or as a consequence of as-yet unknown laws of growth and form. Others have argued that a biological specialization for grammar is incompatible with every tenet of Darwinian theory – that it shows no genetic variation, could not exist in any intermediate forms, confers no selective advantage, and would require more evolutionary time and genomic space than is available. We examine these arguments and show that they depend on inaccurate assumptions about biology or language or both. Evolutionary theory offers clear criteria for when a trait should be attributed to natural selection: complex design for some function, and the absence of alternative processes capable of explaining such complexity. Human language meets these criteria: Grammar is a complex mechanism tailored to the transmission of propositional structures through a serial interface. Autonomous and arbitrary grammatical phenomena have been offered as counterexamples to the position that language is an adaptation, but this reasoning is unsound: Communication protocols depend on arbitrary conventions that are adaptive as long as they are shared. Consequently, language acquisition in the child should systematically differ from language evolution in the species, and attempts to analogize them are misleading. Reviewing other arguments and data, we conclude that there is every reason to believe that a specialization for grammar evolved by a conventional neo-Darwinian process.
Knowledge is power. Knowledge about human psychology is increasingly being produced using natural language processing (NLP) and related techniques. The power that accompanies and harnesses this knowledge should be subject to ethical controls and oversight. In this chapter, we address the ethical pitfalls that are likely to be encountered in the context of such research. These pitfalls occur at various stages of the NLP pipeline, including data acquisition, enrichment, analysis, storage, and sharing. We also address secondary uses of the results and tools developed through psychometric NLP, such as profit-driven targeted advertising, political campaigns, and domestic and international psyops. Along the way, we reflect on potential ethical guidelines and considerations that may help researchers navigate these pitfalls.
It is common in contemporary metaphysics to distinguish two levels of ontology: the ontology of ordinary objects and the ontology of fundamental reality. This paper argues that natural language reflects not only the ontology of ordinary objects, but also a language-driven ontology, which is involved in the mass-count distinction and part-structure-sensitive semantic selection, as well as perhaps the light ontology of pleonastic entities. The paper recasts my older theory of situated part structures without situations, making use of a primitive notion of unity.
This paper gives an outline of natural language ontology as a subdiscipline of both linguistics and philosophy. It argues that part of the constructional ontology reflected in natural language is in significant respects on a par with syntax (on the generative view).
The paper presents a proof-theoretic semantics (PTS) for a fragment of natural language, providing an alternative to the traditional model-theoretic (Montagovian) semantics (MTS), whereby meanings are truth-conditions (in arbitrary models). Instead, meanings are taken as derivability-conditions in a dedicated natural-deduction (ND) proof-system. This semantics is effective (algorithmically decidable), adheres to the meaning-as-use paradigm, and does not suffer from several of the criticisms formulated by philosophers of language against MTS as a theory of meaning. In particular, Dummett's manifestation argument does not obtain, and assertions are always warranted, having grounds of assertion. The proof system is shown to satisfy Dummett's harmony property, justifying the ND rules as meaning conferring. The semantics is suitable for incorporation into computational linguistics grammars, formulated in type-logical grammar.
One of the great successes in the study of language has been the application of formal methods, including those of formal logic. Even so, this chapter argues against one way of accounting for this success, by arguing that the study of natural language semantics and of logical consequence relations are not the same. There is indeed a lot we can glean about logic from looking at our languages, and at our inferential practices, but the semantic properties of natural languages do not determine genuine logical consequence relations. We can get from natural language semantics to logical consequence, but only by a significant process of identification of logical constants, abstraction, and idealization. The chapter also discusses different approaches to the nature of logical consequence, and examines which allow logic and natural language to come closer together.
The aim of natural language ontology is to uncover the ontological categories and structures that are implicit in the use of natural language, that is, that a speaker accepts when using a language. This article aims to clarify what exactly the subject matter of natural language ontology is, what sorts of linguistic data it should take into account, how natural language ontology relates to other branches of metaphysics, in what ways natural language ontology is important, and what may be distinctive of the ontological categories and structures reflected in natural language.
In this paper we propose a way to deal with natural language inference by implementing Modern Type Theoretical Semantics in the proof assistant Coq. The paper is a first attempt to deal with NLI and natural language reasoning in general by using the proof assistant technology. Valid NLIs are treated as theorems and as such the adequacy of our account is tested by trying to prove them. We use Luo's Modern Type Theory with coercive subtyping as the formal language into which we translate natural language semantics, and we further implement these semantics in the Coq proof assistant. It is shown that the use of an MTT with an adequate subtyping mechanism can give us a number of promising results as regards NLI. Specifically, it is shown that a number of inference cases, i.e. quantifiers, adjectives, conjoined noun phrases and temporal reference among other things, can be successfully dealt with. It is then shown that even though Coq is an interactive and not an automated theorem prover, automation of all of the test examples is possible by introducing user-defined automated tactics. Lastly, the paper offers a number of innovative approaches to NL phenomena like adjectives, collective predication, comparatives and factive verbs among other things, contributing in this respect to the theoretical study of formal semantics using MTTs.
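The "NLI as theorem" idea can be displayed in a few lines. The following is an illustrative Lean sketch of my own, not the paper's Coq development: common nouns are modeled as predicates and coercive subtyping is approximated by an explicit inclusion axiom, which is the simplest way to show the pattern (in the paper's actual setting, common nouns are types and inclusion is handled by Coq's coercion mechanism).

```lean
-- Illustrative sketch only (assumed names, not the paper's code):
-- entities, lexical predicates, and an inclusion axiom standing in
-- for the subtyping judgment "Man ≤ Human".
axiom E : Type
axiom man   : E → Prop
axiom human : E → Prop
axiom run   : E → Prop
axiom man_sub_human : ∀ x, man x → human x

-- The inference "Some man runs ⊨ Some human runs", stated and
-- proved as a theorem: its provability certifies the NLI as valid.
theorem some_man_runs_entails_some_human_runs :
    (∃ x, man x ∧ run x) → (∃ x, human x ∧ run x) :=
  fun ⟨x, hman, hrun⟩ => ⟨x, man_sub_human x hman, hrun⟩
```

Proving the theorem is exactly the adequacy test the abstract describes: an invalid NLI would simply admit no proof.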
This paper elaborates distinctions between a core and a periphery in the ontological and the conceptual domain associated with natural language. The ontological core-periphery distinction is essential for natural language ontology and is the basis for the central thesis of my 2013 book Abstract Objects and the Semantics of Natural Language, namely that natural language permits reference to abstract objects in its periphery, but not its core.
This light piece reflects on analogies between two often disjoint streams of research: the logical semantics and pragmatics of natural language and dynamic logics of general information-driven agency. The two areas show significant overlap in themes and tools, and yet, the focus seems subtly different in each, defying a simple comparison. We discuss some unusual questions that emerge when the two are put side by side, without any pretense at covering the whole literature or at reaching definitive conclusions.
The paper sets out to offer an alternative to the function/argument approach to the most essential aspects of natural language meanings. That is, we question the assumption that semantic completeness (of, e.g., propositions) or incompleteness (of, e.g., predicates) exactly replicate the corresponding grammatical concepts (of, e.g., sentences and verbs, respectively). We argue that even if one gives up this assumption, it is still possible to keep the compositionality of the semantic interpretation of simple predicate/argument structures. In our opinion, compositionality presupposes that we are able to compare arbitrary meanings in terms of information content. This is why our proposal relies on an 'intrinsically' type free algebraic semantic theory. The basic entities in our models are neither individuals, nor eventualities, nor their properties, but 'pieces of evidence' for believing in the 'truth' or 'existence' or 'identity' of any kind of phenomenon. Our formal language contains a single binary non-associative constructor used for creating structured complex terms representing arbitrary phenomena. We give a finite Hilbert-style axiomatisation and a decision algorithm for the entailment problem of the suggested system.
This article contains the results of a theoretical analysis of the phenomenon of natural language understanding (NLU), as a methodological problem. The combination of structural-ontological and informational-psychological approaches provided an opportunity to describe the subject matter field of NLU as a composite function of the mind, which systemically combines the verbal and discursive structural layers. In particular, the idea of NLU is presented, on the one hand, as the relation between the discourse of a specific speech message and the meta-discourse of a language, in turn activated by need-motivational factors. On the other hand, it is conceptualized as a process with a specific structure of information metabolism, the study of which implies the necessity to differentiate the affective (emotional) and need-motivational influences on NLU, as well as to take into account their interaction. At the same time, the hypothesis that needs influence NLU according to a pattern similar to the Yerkes-Dodson law is argued, and the theoretical conclusion that emotions fulfill the function of the operator of the structural features of the information metabolism of NLU is substantiated. Thus, depending on the modality of emotions in the process of NLU, two scenarios for the implementation of information metabolism are distinguished: reductive and synthetic. An argument in favor of the conclusion about the productive and constitutive role of emotions in the process of NLU is also given.
Inner speech is a pervasive feature of our conscious mental lives. Yet its function and character remain an issue of philosophical debate. The present paper focuses on the relation between inner speech and natural language and on the cognitive functions that various contributors have ascribed to inner speech. In particular, it is argued that inner speech does not consist of bare, context-free internal presentations of sentential (or subsentential) content, but rather has an ineliminably perspectival element. The proposed model of inner speech, which characterizes inner speech as akin to the testimony of an inner interlocutor, accounts for this perspectival element and, it is argued, is explanatorily superior, insofar as it better explains, amongst other phenomena, the often condensed character of inner speech.
Sabdabrahma Siddhanta, popularized by Patanjali and Bhartruhari, will be scientifically analyzed. Sphota Vada, proposed and nurtured by the Sanskrit grammarians, will be interpreted from modern physics and communication engineering points of view. Insight about the theory of language and modes of language acquisition and communication available in the Brahma Kanda of Vakyapadeeyam will be translated into modern computational terms. A flowchart of language processing in humans will be given. A gross model of the human language acquisition, comprehension and communication process, forming the basis to develop software for relevant mind-machine modeling, will be presented. The implications of such a model for artificial intelligence and cognitive sciences will be discussed. The essentiality and necessity of a physics, communication engineering, biophysical and biochemical insight as both complementary and supplementary to using mathematical and computational methods in delineating the theory of Sanskrit language is put forward. That natural language comprehension is distinct and different from natural language processing is pointed out.
In this paper, I shall provide a novel response to the argument from context-sensitivity against truth-conditional semantics. It is often argued that the contextual influences on truth-conditions outstrip the resources of standard truth-conditional accounts, and so truth-conditional semantics rests on a mistake. The argument assumes that truth-conditional semantics is legitimate if and only if natural language sentences have truth-conditions. I shall argue that this assumption is mistaken. Truth-conditional analyses should be viewed as idealised approximations of the complexities of natural language meaning. From this perspective, disparity between the scientific model and its real-world target is to be expected. I elaborate on what such an approach to semantics would look like.
This paper gives a characterization of the ontology implicit in natural language and the entities it involves, situates natural language ontology within metaphysics, and responds to Chomsky's dismissal of externalist semantics.
The famous advaitic expressions "Brahma sat jagat mithya jivo brahma eva na apraha" and "Asti bhaati priyam namam roopamcheti amsa panchakam / AAdya trayam brahma roopam tato dwayam jagat roopam" will be analyzed through physics and electronics and interpreted. Four phases of mind, four modes of language acquisition and communication, and seven cognitive states of mind participating in human cognitive and language acquisition and communication processes will be identified and discussed. Implications and applications of such an identification and analysis for the fields of cognitive sciences, mind-machine modeling, the natural language comprehension field of artificial intelligence, and physiological psychology will be indicated. A comprehensive modern scientific understanding of human consciousness, mind and mental functions will be presented.
The Monist's call for papers for this issue ended: "if formalism is true, then it must be possible in principle to mechanize meaning in a conscious thinking and language-using machine; if intentionalism is true, no such project is intelligible". We use the Grelling-Nelson paradox to show that natural language is indefinitely extensible, which has two important consequences: it cannot be formalized, and model-theoretic semantics, standard for formal languages, is not suitable for it. We also point out that object-object mapping theories of semantics, the usual account for the possibility of non-intentional semantics, do not seem able to account for the indefinitely extensible productivity of natural language.
Successful biomedical data mining and information extraction require a complete picture of biological phenomena such as genes, biological processes, and diseases, as these exist on different levels of granularity. To realize this goal, several freely available heterogeneous databases as well as proprietary structured datasets have to be integrated into a single global customizable scheme. We will present a tool to integrate different biological data sources by mapping them to a proprietary biomedical ontology that has been developed for the purposes of making computers understand medical natural language.
Stephen Laurence and Eric Margolis have recently argued that certain kinds of regress arguments against the language of thought (LOT) hypothesis as an account of how we understand natural languages have been answered incorrectly or inadequately by supporters of LOT ('Regress arguments against the language of thought', Analysis, 57 (1), 60-6, J 97). They argue further that this does not undermine the LOT hypothesis, since the main sources of support for LOT are (or might be) independent of it providing an account of how we understand natural language. In my paper I seek to refute both these claims, and reinstate the putative explanation of natural language understanding as a necessarily central part of the support for LOT. The main argument exploits the fact that Laurence and Margolis give too little weight to the ideas (a) that LOT might be innate and (b) that for LOT supporters a semantic theory must apply to in-the-head tokens, not linguistic utterances.
In this paper, we investigate the possibility of translating a fragment of natural deduction system (NDS) for natural language semantics into modern type theory (MTT), originally suggested by Luo (2014). Our main goal will be to examine and translate the basic rules of NDS (namely, meta-rules, structural rules, identity rules, noun rules and rules for intersective and subsective adjectives) to MTT. Additionally, we will also consider some of their general features.
The semantic rules governing natural language quantifiers (e.g. "all," "some," "most") neither coincide with nor resemble the semantic rules governing the analogues of those expressions that occur in the artificial languages used by semanticists. Some semanticists, e.g. Peter Strawson, have put forth data-consistent hypotheses as to the identities of the semantic rules governing some natural-language quantifiers. But, despite their obvious merits, those hypotheses have been universally rejected. In this paper, it is shown that those hypotheses are indeed correct. Moreover, data-consistent hypotheses are put forth as to the identities of the semantic rules governing the words "most" and "many," the semantic rules governing which semanticists have thus far been unable to identify. The points made in this paper are anticipated in a paper, published in the same issue of the Journal of Pragmatics, by Andrzej Boguslawski.
We present Property Theory with Curry Typing (PTCT), an intensional first-order logic for natural language semantics. PTCT permits fine-grained specifications of meaning. It also supports polymorphic types and separation types. We develop an intensional number theory within PTCT in order to represent proportional generalized quantifiers like "most." We use the type system and our treatment of generalized quantifiers in natural language to construct a type-theoretic approach to pronominal anaphora that avoids some of the difficulties that undermine previous type-theoretic analyses of this phenomenon.
We describe a knowledge representation and inference formalism, based on an intensional propositional semantic network, in which variables are structured terms consisting of quantifier, type, and other information. This has three important consequences for natural language processing. First, this leads to an extended, more natural formalism whose use and representations are consistent with the use of variables in natural language in two ways: the structure of representations mirrors the structure of the language and allows for re-use phenomena such as pronouns and ellipsis. Second, the formalism allows the specification of description subsumption as a partial ordering on related concepts (variable nodes in a semantic network) that relates more general concepts to more specific instances of that concept, as is done in language. Finally, this structured variable representation simplifies the resolution of some representational difficulties with certain classes of natural language sentences, namely, donkey sentences and sentences involving branching quantifiers. The implementation of this formalism is called ANALOG (A NAtural LOGIC) and its utility for natural language processing tasks is illustrated.
If Frege's Begriffsschrift does represent the logical form of natural language, it either lacks a logical form itself, or its logical form is different from that of natural language. But Frege insists that his notation has a logical form. So the second disjunct holds. This suggests that Frege's notation will generate consequences different from those that can be derived with natural language, with its different logical form. For anyone looking for "a means of avoiding misunderstandings", as Frege does, this would be unpalatable, to say the least.
Modern weaponry is often too complex for unaided human operation, and is largely or totally controlled by computers. But modern software, particularly artificial intelligence software, exhibits such complexity and inscrutability that there are grave dangers associated with its use in non-benign applications. Recent efforts to make computer systems more accessible to military personnel through natural language processing systems, as proposed in the Strategic Computing Initiative of the Defense Advanced Research Projects Agency, increase rather than decrease the dangers of unpredictable behavior. Defense systems constitute, in fact, a paradigm case of the wrong kind of application for this technology. This cannot be expected to change, since the unpredictability stems from inherent properties of computer systems and of natural languages.
A formal, computational, semantically clean representation of natural language is presented. This representation captures the fact that logical inferences in natural language crucially depend on the semantic relation of entailment between sentential constituents such as determiner, noun, adjective, adverb, preposition, and verb phrases. The representation parallels natural language in that it accounts for human intuition about entailment of sentences, it preserves its structure, it reflects the semantics of different syntactic categories, it simulates conjunction, disjunction, and negation in natural language by computable operations with provable mathematical properties, and it allows one to represent coordination on different syntactic levels.
Models of category learning have been extensively studied in cognitive science and primarily tested on perceptual abstractions or artificial stimuli. In this paper, we focus on categories acquired from natural language stimuli, that is, words. We present a Bayesian model that, unlike previous work, learns both categories and their features in a single process. We model category induction as two interrelated subproblems: the acquisition of features that discriminate among categories, and the grouping of concepts into categories based on those features. Our model learns categories incrementally using particle filters, a sequential Monte Carlo method commonly used for approximate probabilistic inference that sequentially integrates newly observed data and can be viewed as a plausible mechanism for human learning. Experimental results show that our incremental learner obtains meaningful categories which yield a closer fit to behavioral data compared to related models while at the same time acquiring features which characterize the learned categories.
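The incremental particle-filter scheme can be sketched compactly. The following Python toy is my own minimal illustration under simplifying assumptions, not the authors' model: each particle holds a partition of the items seen so far, a CRP-style prior (concentration `ALPHA`) governs opening new categories, and a crude Jaccard feature-overlap score stands in for the model's learned feature likelihoods.

```python
import random

# Toy sketch (assumed design, not the paper's model): items are
# frozensets of feature strings; a particle is a partition (list of
# categories, each a list of items).

ALPHA = 1.0  # concentration parameter: willingness to open new categories

def likelihood(item, category):
    """Jaccard overlap between an item's features and a category's features."""
    feats = set().union(*category)
    if not feats:
        return 0.5  # uninformative score for featureless categories
    return max(len(item & feats) / len(item | feats), 1e-6)

def extend(partition, item):
    """Sample a category assignment for `item` within one particle's
    partition; return the extended partition and its incremental weight."""
    n = sum(len(c) for c in partition)
    weights = [len(c) / (n + ALPHA) * likelihood(item, c) for c in partition]
    weights.append(ALPHA / (n + ALPHA) * 0.5)  # option: open a new category
    total = sum(weights)
    r, acc, idx = random.random() * total, 0.0, len(weights) - 1
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            idx = i
            break
    new = [list(c) for c in partition]
    if idx == len(partition):
        new.append([item])      # start a fresh category
    else:
        new[idx].append(item)   # join an existing category
    return new, total

def particle_filter(items, n_particles=100):
    """Integrate items one at a time, resampling particles by weight."""
    particles = [[] for _ in range(n_particles)]
    for item in items:
        extended = [extend(p, item) for p in particles]
        particles = random.choices([p for p, _ in extended],
                                   weights=[w for _, w in extended],
                                   k=n_particles)
    return particles
```

Run on a handful of word-feature sets, the surviving particles tend toward partitions that group feature-overlapping items; the paper's model additionally learns which features discriminate the categories, which this sketch deliberately omits.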
This paper presents two projects concerned with the application of natural language processing technology for improving communication between Public Administration and citizens. The first project, GIST, is concerned with automatic multilingual generation of instructional texts for form-filling. The second project, TAMIC, aims at providing an interface for interactive access to information, centered on natural language processing and intended to be used by the clerk, but with the active participation of the citizen.
The article argues that cognitive linguistic theory may prove an alternative to the Montague paradigm for designing natural language understanding systems. Within this framework it describes a system which models language understanding as a dialogical process between user and computer. The system operates with natural language texts as input and represents language meaning as entity-relationship diagrams.
Logics of discrete time are, in Arthur Prior's words, "applicable in limited fields of discourse in which we are concerned with what happens in a sequence of discrete states," independent of "any serious metaphysical assumption that time is discrete." This insight is applied to natural language semantics, a widespread assumption in which is that time, like the real line, is dense. "Limited fields of discourse" are construed as finite sets of temporal propositions, inducing bounded notions of temporal granularity that can be refined to expand the discourse. The construal is developed in line with Prior's view of what is "metaphysically fundamental".
There has been a recent surge of work on deontic modality within philosophy of language. This work has put the deontic logic tradition in contact with natural language semantics, resulting in a significant increase in sophistication on both ends. This chapter surveys the main motivations, achievements, and prospects of this work.
Hauser defends the proposition that our languages of thought are public languages. One group of arguments points to the coincidence of clearly productive (novel, unbounded) cognitive competence with overt possession of recursive symbol systems. Another group relies on phenomenological experience. A third group cites practical and methodological considerations: Occam's razor and the "streetlight principle" (other things being equal, look under the lamp) that motivate looking for instantiations of outer languages in thought first.
Since the rise of modern natural science there has been deep tension between the conceptual and the natural. Wittgenstein's discussion of how we learn a sensation-language contains important resources that can help us relieve this tension. The key here, I propose, is to focus our attention on animal nature, conceived as partially re-enchanted. To see how nature, so conceived, helps us relieve the tension in question, it is crucial to gain a firm and detailed appreciation of how the primitive-instinctive, a central part of animal nature, actually serves the conceptual. I offer such an appreciation by closely examining §244 of the Philosophical Investigations and Peter Winch's discussion of it. The general aim is to bring out a certain kind of Wittgensteinian "naturalism", a "naturalism" that is fully alive to the rootedness of conceptuality in nature. A concomitant aim is to illustrate the truth of Wittgenstein's saying that in philosophy one often has to pay close attention to details.
This book pursues the question of how and whether natural language allows for reference to abstract objects in a fully systematic way. By making full use of contemporary linguistic semantics, it presents a much greater range of linguistic generalizations than has previously been taken into consideration in philosophical discussions, and it argues for an ontological picture that is very different from that generally taken for granted by philosophers and semanticists alike. Reference to abstract objects such as properties, numbers, propositions, and degrees is considerably more marginal than generally held.