This paper does not deal with the topic of ‘the generosity of artificial languages from an Asian or a comparative perspective’. Rather, it is concerned with a particular case taken from a development in the Western tradition, when in the wake of the rise of formal logic at the end of the nineteenth and the beginning of the twentieth century people in philosophy and later in linguistics started to use formal languages in the study of the semantics of natural languages. This undertaking rests on certain philosophical assumptions and instantiates a particular methodology, which we want to examine critically. However, that in itself is still too broad a topic for a single paper, so we will focus on a particular aspect, viz., the distinction between grammatical form and logical form and the crucial role it plays in how the relationship between natural languages and formal languages is understood in this tradition. We will uncover two basic assumptions that underlie the standard view on the distinction between grammatical form and logical form, and discuss how they have contributed to the shaping of a particular methodology and a particular view on the status of semantics as a discipline. But before we turn to the topic at hand, a few more words on the general nature of the investigation are in order. Its general aim is to find out how semantics constructs its object, i.e., we are interested in what semanticists view as the proper object of study, how they think this object can best be approached, and how they view the relations between their own undertaking and neighbouring disciplines that deal with related, or even the same kind of, phenomena, such as cognitive psychology, cognitive neuroscience, philosophy, and anthropology.
The background assumption is that, much like in other disciplines, semantics, too, does not have its object of investigation cut out for itself by nature, but constructs it in a complex process that involves empirical elements (‘facts’ being already too dangerous a term), philosophical assumptions, and borrowings…
This paper aims to argue for two related statements: first, that formal semantics should not be conceived of as interpreting natural language expressions in a single model (a very large one representing the world as a whole, or something like that) but as interpreting them in many different models (formal counterparts, say, of little fragments of reality); second, that accepting such a conception of formal semantics yields a better comprehension of the relation between semantics and pragmatics and of the role to be played by formal semantics in the general enterprise of understanding meaning. For this purpose, three kinds of arguments are given: first, empirical arguments showing that the many-models approach is the most straightforward and natural way of giving a formal counterpart to natural language sentences; second, logical arguments proving the logical impossibility of a single universal model; and third, theoretical arguments to the effect that such a conception of formal semantics fits in a natural and fruitful way with pragmatic theories and facts. In passing, this conception will be shown to cast some new light on the old problems raised by the liar and sorites paradoxes.
In the formal semantics based on modern type theories, common nouns are interpreted as types, rather than as predicates of entities as in Montague’s semantics. This brings about important advantages in linguistic interpretations but also leads to a limitation of expressive power, because there are fewer operations on types as compared with those on predicates. The theory of coercive subtyping adequately extends the modern type theories and, as shown in this paper, plays a very useful role in making type theories more expressive for formal semantics. It not only gives a satisfactory solution to the basic problem of ‘multiple categorisation’ caused by interpreting common nouns as types, but provides a powerful formal framework to model interesting linguistic phenomena such as copredication, whose formal treatment has been found difficult in a Montagovian setting. In particular, we show how to formally introduce dot-types in a type theory with coercive subtyping and study some type-theoretic constructs that provide useful representational tools for reference transfers and multiple word meanings in formal lexical semantics.
This unique book presents a comprehensive and rigorous treatment of the theory of computability which is introductory yet self-contained. It takes a novel approach by looking at the subject using computation models rather than a limitation orientation, and is the first book of its kind to include software. Accompanying software simulations of almost all computational models are available for use in conjunction with the text, and numerous examples are provided on disk in a user-friendly format. Its applications to computer science itself include interesting links to programming language theory, compiler design theory, and algorithm design. The software, numerous examples, and solutions make this book ideal for self-study by computer scientists and mathematicians alike.
This is a review of From Discourse to Logic: Introduction to Model-theoretic Semantics of Natural Language, Formal Logic and Discourse Representation Theory, by Hans Kamp and Uwe Reyle, published by Kluwer Academic Publishers in 1993.
Formal semantics is understood either as a formal analysis of semantical features of natural language or as model-theoretic semantics of formal(ized) languages. This paper focuses on the second understanding. The problem is how to identify the formal aspects of formal semantics, if we understand ‘formal’ as ‘independent of content’. This is done by showing that the form of semantical interpretation of a language L is given by its syntax and the parallelism of the signature of L and its interpretative structure SI. However, the content of interpretation, that is, the way of correlating L-expressions with extralinguistic items, depends on informal factors. The discussion shows that semantics is prior to syntax.
This is the second in a sequence of three essays which axiomatize and apply Edmund Husserl's dependence ontology of parts and wholes as a non-Diodorean, non-Kantian temporal semantics for first-order predicate modal languages. The Ontology of Intentionality I introduced enough of Husserl's dependence-ontology of parts and wholes to formulate his account of order as effected by relating moments of unity, and The Ontology of Intentionality II extends that axiomatic dependence-ontology far enough to enable its semantic application. Formalizing the compatibility [Vereinbarkeit] relation implicated in Husserl's notorious doctrine of impossible meanings, the essay introduces a compatibility restriction on relations to formulate Husserl's distinction between singular [einheitliche] and plural [mehrheitliche] objects, using plural relating moments to define first-order versions of Husserl's notions of relation complexes (i.e. Sachverhalte), abstracta of n-ary relation complexes, categorial relations, abstract eide as unifications of categorial relations, semantic domains as completions of abstract eide, and material regions as semantic domains which are compatibility upper bounds of categorial relations. These concepts will enable the formal dependence-ontological noetic semantics for two-valued, first-order modal languages introduced in the sequel Two-Valued Logics of Intentionality, the third essay in the sequence.
Conditional structures lie at the heart of the sciences, humanities, and everyday reasoning. It is hence not surprising that conditional logics – logics specifically designed to account for natural language conditionals – are an active and interdisciplinary area. The present book gives a formal and a philosophical account of indicative and counterfactual conditionals in terms of Chellas-Segerberg semantics. For that purpose a range of topics are discussed, such as Bennett’s arguments against truth-value-based semantics for indicative conditionals.
i) We show for each context-free language L that by considering each word of L as a structure in a natural way, one turns L into a finite union of classes which satisfy a finitary analog of the characteristic properties of complete universal first-order classes of structures equipped with elementary embeddings. We show this to hold for a much larger class of languages which we call free local languages. ii) We define local languages, a class of languages between free local and context-sensitive languages. Each local language L has a natural extension L∞ to infinite words, and we prove a series of "pumping lemmas", analogs for each local language L of the "uvxyz theorem" of context-free languages: they relate the existence of large words in L or L∞ to the existence of infinite "progressions" of words included in L, and they imply the decidability of various questions about L or L∞. iii) We show that the pumping lemmas of ii) are independent from strong axioms, ranging from Peano arithmetic to ZF + Mahlo cardinals. We hope that these results are useful for a model-theoretic approach to the theory of formal languages.
In this paper I will be concerned with the question of the extent to which semantics can be thought of as a purely formal exercise, which we can engage in in a way that is neutral with respect to how our formal system is to be interpreted. I will be arguing, to the contrary, that the features of the formal systems which we use to do semantics are closely linked, in several different ways, to the interpretation that we give to those formal systems. The occasion for this question, and the main example that I will use to illustrate my answer to it, is the close relationship between the formal systems employed in recent statements of apparently competing accounts of epistemic modals within the dynamic, expressivist, and relativist theoretical paradigms. The structure of the paper will be straightforward. In part 1, I will briefly introduce four theories of epistemic modals – one dynamic theory, two expressivist theories, and one relativist theory. Then in part 2 I’ll show that one expressivist theory is formally equivalent to the dynamic theory, that the other is formally equivalent to the relativist theory, and that the two expressivist theories are themselves essentially notational variants. I’ll use these facts to pose our central question: if these theories have so much formally in common, then doesn’t that suggest that we can separate the task of constructing a formal semantics from the task of deciding between competing interpretations of it? Finally, in part 3 I’ll answer that question in the negative. There are at least three reasons why formal semantics cannot be separated from questions of interpretation that are illustrated by the theories I introduce in part 1.
The Monist’s call for papers for this issue ended: “if formalism is true, then it must be possible in principle to mechanize meaning in a conscious thinking and language-using machine; if intentionalism is true, no such project is intelligible”. We use the Grelling-Nelson paradox to show that natural language is indefinitely extensible, which has two important consequences: it cannot be formalized, and model-theoretic semantics, standard for formal languages, is not suitable for it. We also point out that object-object mapping theories of semantics, the usual account for the possibility of non-intentional semantics, do not seem able to account for the indefinitely extensible productivity of natural language.
This accessible introduction to formal, and especially Montague, semantics within a linguistic framework presupposes no previous background in logic, but takes students step by step from simple predicate/argument structures and their interpretation to Montague's intensional logic.
Formal semantics is an approach to SEMANTICS1, the study of meaning, with roots in logic, the philosophy of language, and linguistics, and since the 1980s a core area of linguistic theory. Characteristics of formal semantics to be treated in this article include the following: formal semanticists treat meaning as mind-independent (though abstract), contrasting with the view of meanings as concepts “in the head” (see I-LANGUAGE AND E-LANGUAGE and MEANING EXTERNALISM AND INTERNALISM); formal semanticists distinguish semantics from knowledge of semantics (Lewis 1975, Cresswell 1978), which has consequences for the notion of semantic COMPETENCE. A central part of the meaning of a sentence on this approach is its TRUTH CONDITIONS, and most although not all formal semantics is model-theoretic, relating linguistic expressions to model-theoretically constructed semantic values cast in terms of truth, REFERENCE, and possible worlds. This sets formal semantics apart from approaches which view semantics as relating a sentence just to a representation on another linguistic “level” (LOGICAL FORM) or a representation in an innate LANGUAGE OF THOUGHT. The formal semanticist could accept such representations as an aspect of semantics but would insist on asking what the model-theoretic semantic interpretation of the given representation language is (Lewis 1970). Formal semantics is centrally concerned with COMPOSITIONALITY at the SYNTAX-SEMANTICS INTERFACE, how the meanings of larger constituents are built up from the meanings of their parts on the basis of their syntactic structure, and with the relation between compositional SENTENCE MEANING and meaning in discourse.
Like Spanish moss on a live oak tree, the scientific study of meaning in language has expanded in the last 100 years, and continues to expand steadily. In this essay I want to chart some central themes in that expansion, including their histories and their important figures. Our attention will be directed toward what is called 'formal semantics', which is the adaptation to natural language of analytical techniques from logic. The first, background, section of the paper will survey the changing attitudes of linguists toward semantics into the last third of the century. The second and third sections will examine current formal approaches to meaning. In the final section I will summarize some of the common assumptions of the approaches examined in the middle sections of the paper, sketch a few alternatives, and make some daring predictions.
With a few notable exceptions formal semantics, as it originated from the seminal work of Richard Montague, Donald Davidson, Max Cresswell, David Lewis and others in the late sixties and early seventies of the previous century, does not consider Wittgenstein as one of its ancestors. That honour is bestowed on Frege, Tarski, and Carnap. And so it has been in later developments. Most introductions to the subject will refer to Frege and Tarski (Carnap less frequently) — in addition to the pioneers just mentioned, of course — and discuss the main elements of their work that helped shape formal semantics in some detail. But Wittgenstein is conspicuously absent whenever the history of the subject is mentioned (usually briefly, if at all). Of course, if one thinks of Wittgenstein’s later work, this is obvious: nothing, it seems, could be more antithetic to what formal semantics aims for and to how it pursues those aims than the views on meaning and language that Wittgenstein expounds in, e.g., Philosophical Investigations, with its insistence on particularity and diversity, and its rejection of explanation and formal modelling. But what about his earlier work, the Tractatus? At first sight, that seems much more congenial, as it develops a conception of language and meaning that is both general and uniform, and explanatory…
The paper presents a formal explication of the early Wittgenstein's views on ontology, the syntax and semantics of an ideal logical language, and the propositional attitudes. It will be shown that Wittgenstein gave a language of thought analysis of propositional attitude ascriptions, and that his ontological views imply that such ascriptions are truth-functions of (and supervenient upon) elementary sentences. Finally, an axiomatization of a quantified doxastic modal logic corresponding to Tractarian semantics will be given.
When in 1980, at the Third Amsterdam Colloquium, Johan van Benthem read a paper with the title ‘Why is Semantics What?’, I was puzzled: Wasn’t it obvious what semantics is? Why did our concept of it stand in need of justification? Later, much later, I came to appreciate what Van Benthem was doing in this paper (and in some others). Questioning the ‘standard model’, the assumptions on which working semanticists silently agree, Van Benthem opened up a space of issues to be discussed, questions to be asked, routes to be explored, that had been hidden from view by the unreflective endorsement of just one possible, albeit fruitful, way of doing semantics. History, by the way, has proven him right on many points: the monolithic approach that dominated formal semantics of natural language in the seventies, and which relied heavily on Montague’s seminal papers, has given way to a multitude of different ways of tackling semantic issues, using different formal techniques. Some limitations, in particular the almost exclusive focus on sentences as the primary units of analysis, have been overcome. In another respect, however, I feel that the message of Van Benthem’s paper has not caught on sufficiently. He urges semanticists to take more interest in the properties of their tools, arguing that such questions are important if we are to come to a real, deep understanding of what semantics is. Such ‘meta-level’ considerations, although certainly less scarce than they used to be, are still not an everyday concern of the working semanticist.
This book describes the mathematical aspects of the semantics of programming languages. The main goals are to provide formal tools to assess the meaning of programming constructs in both a language-independent and a machine-independent way, and to prove properties about programs, such as whether they terminate, or whether their result is a solution of the problem they are supposed to solve. In order to achieve this the authors first present, in an elementary and unified way, the theory of certain topological spaces that have proved of use in the modelling of various families of typed lambda calculi considered as core programming languages and as meta-languages for denotational semantics. This theory is now known as Domain Theory, and was founded as a subject by Scott and Plotkin. One of the main concerns is to establish links between mathematical structures and more syntactic approaches to semantics, often referred to as operational semantics, which is also described. This dual approach has the double advantage of motivating computer scientists to do some mathematics and of interesting mathematicians in unfamiliar application areas from computer science.
This article presents a study of the semantics of clitic pronouns and clitic doubling in Spanish and related languages. Its main hypothesis is that the co-occurrence restrictions that are observed between the clitic element and its quantifier associate can be properly characterized within Generalized Quantifiers Theory. Clitics are treated as generalized quantifier functions which are restricted to a context set. In clitic doubling constructions, the context set is retrieved from the doubled NP-quantifier. Three main constraints are formulated that restrict this mechanism: the Principal Filter Constraint, the Presuppositionality Constraint, and the Context Dependence Constraint. The resulting interactions are studied in a variety of configurations with respect to generalized quantifiers of different properties, namely clitic doubling of existential, universal, and negative quantifiers, and doubling in questions.
In this paper we study the semantic data complexity of several controlled fragments of English designed for natural language front-ends to OWL (Web Ontology Language) and description logic ontology-based systems. Controlled languages are fragments of natural languages, obtained by restricting natural language syntax, vocabulary and semantics with the goal of eliminating ambiguity. Semantic complexity arises from the formal logic modelling of meaning in natural language and fragments thereof. It can be characterized as the computational complexity of the reasoning problems associated to their semantic representations. Data complexity (the complexity of answering a question over an ontology, stated in terms of the data items stored therein), in particular, provides a measure of the scalability of controlled languages to ontologies, since tractable data complexity implies scalability of data access. We present maximal tractable controlled languages and minimal intractable controlled languages.
We can distinguish different senses in which a formal language can be said to have been provided with an interpretation. We focus on two: (i) we provide a model (or structure) and a definition of satisfaction and truth in the standard way; (ii) we provide a translation into a natural language. We argue that the sentences of a formal language interpreted as in (i) do not have meaning. A formal language interpreted as in (i) models the way the truth of a sentence would be affected by two factors: the interpretation as in (ii) of the language, and a way the world might be. Viewing in this way the relation between interpreting a formal language as in (i) and as in (ii) allows us to justify the conceptual adequacy of the standard model-theoretic definitions of the properties of logical truth and logical consequence.
To ascertain that a formalization of the intuitive notion of a ‘concept’ is linguistically interesting, one has to check whether it allows one to get a grip on distinctions and notions from lexical semantics. Prime candidates are notions like ‘prototype’, ‘stereotypical attribute’, ‘essential attribute versus accidental attribute’, ‘intension versus extension’. We will argue that although the current paradigm of formal concept analysis as an application of lattice theory is not rich enough for an analysis of these notions, a lattice-theoretical approach to concepts is a suitable starting point for formalizing them.
My aim in this note is to address the question of how a context of utterance can figure within a formal, specifically truth-conditional, semantic theory. In particular, I want to explore whether a formal semantic theory could, or should, take the intentional states of a speaker to be relevant in determining the literal meaning of an uttered sentence. The answer I’m going to suggest, contrary to the position of many contemporary formal theorists, is negative. The structure of this note is then as follows: first, I’ll very briefly sketch three distinct forms of semantic theory. One, ‘strong formal semantics’, will be seen to be immediately problematic, leaving us with two other options: use-based theories and what I’ll term ‘moderate formal semantics’. If we opt for the latter position, the question arises of what kinds of appeals to a context of utterance are legitimate given a formal outlook. I’ll suggest that this question arises in two distinct ways and explore the moderate formal semanticist’s position in regard to both. However, the conclusion I will reach is that what is characteristic of formal semantics is that it makes only the most minimal semantic concessions to context.
Linguists often sharply distinguish the different modules that support linguistic competence, e.g., syntax, semantics, pragmatics. However, recent work has identified phenomena in syntax (polarity sensitivity) and pragmatics (implicatures) which seem to rely on semantic properties (monotonicity). We propose to investigate these phenomena and their connections as a window into the modularity of our linguistic knowledge. We conducted a series of experiments to gather the relevant syntactic, semantic and pragmatic judgments within a single paradigm. The comparison between these quantitative data leads us to four main results. (i) Our results support a departure from one element of the classical Gricean approach, thus helping to clarify and settle an empirical debate. This first outcome also confirms the soundness of the methodology, as the results align with standard contemporary accounts of scalar implicature (SI). (ii) We confirm that the formal semantic notion of monotonicity underlies negative polarity item (NPI) syntactic acceptability, but (iii) our results indicate that the notion needed is perceived monotonicity. We see results (ii) and (iii) as the main contribution of this study: (ii) provides an empirical interpretation and confirmation of one of the insights of the model-theoretic approach to semantics, while (iii) calls for an incremental, cognitive implementation of the current generalizations. (iv) Finally, our results do not indicate that the relationship between NPI acceptability and monotonicity is mediated by pragmatic features related to SIs: this tells against elegant attempts to unify polarity sensitivity and SIs (pioneered by Krifka and Chierchia). These results illustrate a new methodology for integrating theoretically rigorous work in formal semantics with an experimentally grounded, cognitively oriented view of linguistic competence.
A paraconsistent logic is a logic which allows non-trivial inconsistent theories. One of the oldest and best known approaches to the problem of designing useful paraconsistent logics is da Costa’s approach, which seeks to allow the use of classical logic whenever it is safe to do so, but behaves completely differently when contradictions are involved. da Costa’s approach has led to the family of Logics of Formal (In)consistency (LFIs). In this paper we provide non-deterministic semantics for a very large family of first-order LFIs (which includes da Costa’s original system)…
This paper discusses a number of methodological issues with mainstream formal semantics and then investigates whether Wittgenstein’s later work provides an alternative approach that is able to avoid these issues.
Taken at face value, a programming language is defined by a formal grammar. But, clearly, there is more to it. By themselves, the naked strings of the language do not determine when a program is correct relative to some specification. For this, the constructs of the language must be given some semantic content. Moreover, to be employed to generate physical computations, a programming language must have a physical implementation. How are we to conceptualize this complex package? Ontologically, what kind of thing is it? In this paper, we shall argue that an appropriate conceptualization is furnished by the notion of a technical artifact.
The doctrines of scientific realism have enjoyed a close and enduring, if not always harmonious, association with Tarski's semantic conception of truth and theories of formal semantics generally. From its inception Tarski's theory received unqualified support from some realists, like Karl Popper, who saw it as legitimizing the use of semantic notions in epistemology and the philosophy of science.
It is now a quarter of a century ago that Wolfgang Stegmüller wrote his monograph 'Das Wahrheitsproblem und die Idee der Semantik' (1957), which dealt with Tarski's and Carnap's foundational work in the field of semantics. While this book is about the definition of the basic semantical concepts in artificial formal languages, there is an article written a year earlier (1956) in which Stegmüller addresses himself specifically to the relation between logic and natural language. Here he gives a logical analysis of the standard structural expressions in language that are still of primary concern for current semantics: quantifiers, pronouns, articles, etc. The motives for such an analysis at that time were mainly philosophical: the aim was to expose the misconceptions and pitfalls of traditional philosophy arising from the disregard of various systematic semantic ambiguities in everyday language. Or as Stegmüller puts it: to gain clarity about them [i.e., some non-trivial cases of vagueness in everyday language] is of extraordinary importance, if only because ignorance of them can lead to the gravest philosophical aberrations, namely either the omission of legitimate questions or, what has happened far more often, the formulation of wrongly posed questions, in the face of which one then has only the choice of giving either no answers at all or only senseless ones. So logic was to regiment language. Sentences involving the copula and the above-mentioned structure words are assigned one or more unambiguous formal representations in an already interpreted formal language, usually the first-order predicate calculus. The natural language expressions thereby receive a precise meaning, since the semantics of the formal language has been specified in advance, as is always assumed. This procedure has, of course, always been common practice, witness the typical syntactical argot of the mathematicians.
This procedure carries, however, an obvious methodological presupposition: the idea that the logic of our choice, to which the formal representations belong, is in some sense an adequate framework for expressing our thoughts.
In this paper we discuss two approaches to the axiomatization of scientific theories in the context of the so-called semantic approach, according to which (roughly) a theory can be seen as a class of models. The two approaches are associated with Suppes' and with da Costa and Chuaqui's works, respectively. We argue that theories can be developed either in a way more akin to usual mathematical practice (Suppes), in an informal set-theoretical environment, writing the set-theoretical predicate in the language of set theory itself, or, more rigorously (da Costa and Chuaqui), by employing formal languages that help us in writing the postulates to define a class of structures. Both approaches are called internal, for we work within a mathematical framework, here taken to be first-order ZFC. We contrast these approaches with an external one, discussed here briefly. We argue that each approach has its strong and weak points, whose discussion is relevant for the philosophical foundations of science.
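The Suppes-style procedure of axiomatizing a theory by defining a set-theoretical predicate is standardly illustrated with the theory of groups; the sketch below is the familiar textbook example, not taken from the paper itself:

```latex
% A Suppes-style set-theoretical predicate, written informally in the
% language of set theory: A = <G, o> is a group iff
\begin{align*}
  &\text{(1)}\quad G \neq \emptyset \ \text{ and } \ \circ : G \times G \to G;\\
  &\text{(2)}\quad \forall x,y,z \in G:\ x \circ (y \circ z) = (x \circ y) \circ z;\\
  &\text{(3)}\quad \exists e \in G\ \forall x \in G:\ e \circ x = x \circ e = x;\\
  &\text{(4)}\quad \forall x \in G\ \exists y \in G:\ x \circ y = y \circ x = e.
\end{align*}
% The theory is then identified with the class of structures satisfying
% this predicate, i.e. with its class of models.
```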
The paper sets out to offer an alternative to the function/argument approach to the most essential aspects of natural language meanings. That is, we question the assumption that semantic completeness (of, e.g., propositions) or incompleteness (of, e.g., predicates) exactly replicates the corresponding grammatical concepts (of, e.g., sentences and verbs, respectively). We argue that even if one gives up this assumption, it is still possible to keep the compositionality of the semantic interpretation of simple predicate/argument structures. In our opinion, compositionality presupposes that we are able to compare arbitrary meanings in terms of information content. This is why our proposal relies on an 'intrinsically' type-free algebraic semantic theory. The basic entities in our models are neither individuals, nor eventualities, nor their properties, but 'pieces of evidence' for believing in the 'truth' or 'existence' or 'identity' of any kind of phenomenon. Our formal language contains a single binary non-associative constructor used for creating structured complex terms representing arbitrary phenomena. We give a finite Hilbert-style axiomatisation and a decision algorithm for the entailment problem of the suggested system.
A series of representations must be semantics-driven if the members of that series are to combine into a single thought. Where semantics is not operative, there is at most a series of disjoint representations that add up to nothing true or false, and therefore do not constitute a thought at all. There is necessarily a gulf between simulating thought, on the one hand, and actually thinking, on the other. A related point is that a popular doctrine, the so-called 'computational theory of mind' (CTM), is based on a confusion. CTM is the view that thought-processes consist in 'computations', where a computation is defined as a 'form-driven' operation on symbols. The expression 'form-driven operation' is ambiguous, and may refer either to syntax-driven operations or to morphology-driven operations. Syntax-driven operations presuppose the existence of operations that are driven by semantic and extra-semantic knowledge. So CTM is false if the terms 'computation' and 'form-driven operation' are taken to refer to syntax-driven operations. Thus, if CTM is to work, those expressions must be taken to refer to morphology-driven operations; and CTM then fails, given that an operation must be semantics-driven if it is to qualify as a thought. CTM therefore fails on every disambiguation of the expressions 'formal operation' and 'computation', and it is therefore false.
This project investigates the possibility of variation in the semantic component, a new and dynamic area of study in formal approaches to semantics. Its particular focus is the effect of language contact on variation. The semantic status of classifier languages of South Asia, which have been described as marginal instances of this language type, is used to illustrate the nature of the investigation. Data from a small representative sample of such languages will be collected. The semantic system of these languages, which have been in contact with languages without classifiers, will be compared with the semantic system of core classifier languages such as Chinese and Japanese. The study will enrich the theoretical base for analyses of classifier systems by introducing data from South Asian classifier languages, hitherto unknown in the semantic literature. It will shed light on current debates on the range of semantic variation permitted in natural language.
For decades Ryszard Wójcicki has been a highly influential scholar in the community of logicians and philosophers. Our aim is to outline and comment on some essential issues in logic, the methodology of science, and semantics, as seen from the perspective of Wójcicki's distinguished contributions to these areas of philosophical investigation.
To more efficiently cover a wide spectrum of conceptual modeling applications, such as computer-aided design, computer-aided manufacturing, and medical information systems, we envision multi-paradigm design environments which have the reasoning capability to support analyzing specifications for correctness. For such applications, information system designers employ conceptual models characterized by semantically rich specification languages. The problem of providing a comprehensive formal framework for such languages has not been adequately addressed. This paper investigates a formal system for this purpose called Event-Formula Logic (EFL). The analysis focuses in particular on characterizing the correctness of database updates. Its applicability is demonstrated from two perspectives: deriving preconditions that guarantee that a given update will not violate an integrity constraint, and determining alternative integrity maintenance rules for performing corrective actions when the application semantics requires this update interpretation. The work described in this paper represents an important step in the analysis of database update semantics through the use of algorithmic logic. As a result, it contributes towards providing a formal basis for semantically rich information system design environments.