In [HKL00] (henceforth HKL), Hamm, Kamp and van Lambalgen declare “there is no opposition between formal and cognitive semantics,” notwithstanding the realist/mentalist divide. That divide separates two sides Jackendoff has (in [Jac96], following Chomsky) labeled E(xternalized)-semantics, relating language to a reality independent of speakers, and I(nternalized)-semantics, revolving around mental representations and thought. Although formal semanticists have (following David Lewis) traditionally leaned towards E-semantics, it is reasonable to apply formal methods also to I-semantics. This point is made clear in HKL via two computational approaches to natural language semantics, Discourse Representation Theory (DRT, [KR93]) and the Event Calculus (EC) presented in [LH05]. In this short note, I wish to raise certain questions about EC that can be traced to the applicability of formal methods to E-semantics and I-semantics alike. These opposing orientations suggest different notions of time, event and representation.
Relations computed by finite-state transducers are applied to interpret temporal propositions in terms of strings representing finite contexts or situations. Carnap–Montague intensions mapping indices to extensions are reformulated as relations between strings that can serve as indices and extensions alike. Strings are related according to information content, temporal span and granularity, the bounds on which reflect the partiality of natural language statements. That partiality shapes not only strings-as-extensions (indicating what statements are about) but also strings-as-indices (underlying truth conditions).
Events employed in natural language semantics are characterized in terms of regular languages, each string in which can be regarded as a motion picture. The relevant finite automata then amount to movie cameras/projectors, or more formally, to finite Kripke structures with partial valuations. The usual regular constructs (concatenation, choice, etc.) are supplemented with superposition of strings/automata/languages, realized model-theoretically as conjunction.
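As a concrete illustration (a sketch of my own, not code from the paper), an event-as-string can be modelled as a list of snapshots, each snapshot a set of fluent labels, with superposition as componentwise union:

```python
def superpose(s, t):
    """Superpose two equal-length strings of snapshots (sets of fluents):
    the string analogue of conjunction, via componentwise union."""
    if len(s) != len(t):
        raise ValueError("superposition is defined on strings of equal length")
    return [a | b for a, b in zip(s, t)]

# Two three-frame "movies", each a partial record of the same stretch of time:
rain = [{"rain"}, {"rain"}, set()]
cold = [set(), {"cold"}, {"cold"}]
superpose(rain, cold)  # [{"rain"}, {"rain", "cold"}, {"cold"}]
```

Superposition lifts pointwise from strings to languages, which is what allows it to be treated as a regular construct alongside concatenation and choice.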
Finite-state methods are applied to the Russell-Wiener-Kamp notion of time (based on events) and developed into an account of interval relations and semi-intervals. Strings are formed and collected in regular languages and regular relations that are argued to embody temporal relations in their various underspecified guises. The regular relations include retractions that reduce computations by projecting strings down to an appropriate level of granularity, and notions of containment for partiality within and across such levels.
Finite-state methods are applied to the Russell-Wiener notion of time (based on events) and developed into an account of interval relations and temporal propositions. Strings are formed and collected in regular languages and regular relations that are argued to embody temporal relations in their various underspecified guises. The regular relations include retractions that reduce computations by projecting strings down to an appropriate level of granularity, and non-deterministic relations defining notions of partiality within and across such levels.
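A retraction of the kind mentioned can be sketched (under naming of my own, not the papers') as projection to a set of fluents of interest followed by block compression, which collapses stuttering snapshots into a coarser granularity:

```python
def project(s, fluents):
    """Keep only the fluents of interest in each snapshot."""
    return [box & fluents for box in s]

def compress(s):
    """Block compression: merge consecutive identical snapshots,
    moving the string to a coarser temporal granularity."""
    out = []
    for box in s:
        if not out or out[-1] != box:
            out.append(box)
    return out

def retract(s, fluents):
    """Project, then compress: a retraction down to the given fluents."""
    return compress(project(s, fluents))

fine_grained = [{"rain", "am"}, {"rain", "pm"}, {"pm"}]
retract(fine_grained, {"rain"})  # [{"rain"}, set()]
```

Both steps are computable by finite-state transducers, which is what keeps granularity shifts within the regular fold.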
Inertia is enshrined in Newton’s first law of motion: a body at rest or in uniform motion remains in that state unless a force is applied to it. Now, consider (1). (1) Pat stopped the car before it hit the tree. Can we conclude from (1) that the car struck the tree? Not without further information such as that supplied in (2). (2) But the bus behind kept going. A post-condition for Pat stopping the car is that the car be at rest. To satisfy a pre-condition for the car hitting the tree (namely, that the car not be at rest), inertia requires that some intervening force act on the car (as hinted, for example, by (2)). In the absence of such a force, (1) would appear to suggest that Pat prevented a collision between car and tree. Exactly what bit of physics are we importing into natural language interpretation here? Oversimplified, Newton’s first law of motion says: no change without force. Identifying force with cause, we come to the slogan no temporality without cause, capturing in a phrase the proposal from Steedman 2000 that…
Schubert’s proposal ([Sch00]) that sentences not only describe but also characterize situations is worked out in the context of Linear Temporal Logic (LTL), a well-known propositional logic with linear future operators (e.g. [HR04]). The resulting formalism LTL∗∗ illustrates an approach that diverges from Schubert’s FOL∗∗ in technical details but shares many (if not all) its motivations.
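For orientation, LTL's linear future operators can be evaluated over finite strings of snapshots like those above; the following finite-trace evaluator is a simplification of my own for exposition, not the LTL∗∗ of the paper:

```python
def holds(phi, s, i=0):
    """Evaluate an LTL formula phi at position i of a finite trace s
    (a list of sets of atoms).  Formulas are nested tuples; this
    finite-trace semantics is my own simplification."""
    op = phi[0]
    if op == "atom":
        return phi[1] in s[i]
    if op == "not":
        return not holds(phi[1], s, i)
    if op == "and":
        return holds(phi[1], s, i) and holds(phi[2], s, i)
    if op == "next":
        return i + 1 < len(s) and holds(phi[1], s, i + 1)
    if op == "until":
        return any(holds(phi[2], s, k) and
                   all(holds(phi[1], s, j) for j in range(i, k))
                   for k in range(i, len(s)))
    raise ValueError(f"unknown operator {op!r}")

trace = [{"p"}, {"p"}, {"q"}]
holds(("until", ("atom", "p"), ("atom", "q")), trace)  # True
```

Note that "next" fails at the final position, one of the choice points that separate finite-trace readings from standard LTL over infinite time.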
Judgments of acceptability: a basic choice point is whether the conjunction of two propositions, each separately acceptable, must itself be deemed acceptable. Concepts of acceptability closed under conjunction are analyzed within Keisler’s weak logic for generalized quantifiers, or more specifically, filter quantifiers. In a different direction, the notion of a filter is generalized so as to allow sets with probability non-infinitesimally below 1 to be acceptable.
String representations of events are applied to Robin Cooper’s proposal that propositions in natural language semantics are types of situations. Links with the higher types of proof-theoretic semantics are forged, deepening type-theoretic interpretations of Discourse Representation Structures to encompass event structures.
Intervals and the events that occur in them are encoded as strings, elaborating on a conception of events as “intervals cum description.” Notions of satisfaction in interval temporal logics are formulated in terms of strings, and the possibility of computing these via finite-state machines/transducers is investigated. This opens up temporal semantics to finite-state methods, with entailments that are decidable insofar as these can be reduced to inclusions between regular languages.
Notions of context for natural language interpretation are factored in terms of three processes: translation, entailment and attunement. The processes are linked by accessibility relations of the kind studied in many-dimensional modal logic, modulo complications from constraints between translation and entailment (violations in which may trigger re-attunement) and from refinement and underspecification.
‘The proper treatment of events’ is the title of a recent book [LH04] by M. van Lambalgen and F. Hamm, applying the event calculus from [Sha97] to natural language semantics. Some basic ideas behind that treatment are presented in a technically different form below, shaped by a concrete formulation of events as strings of sets of fluents ([Fer04]). These strings can be read as comic strips that are (I think) easy to grasp and work with, providing a friendly (if not altogether proper) approach to events.
The processing of sequences of (English) sentences is analyzed compositionally through transitions that merge sentences, rather than decomposing them. Transitions that are in a precise sense inertial are related to disjunctive and non-deterministic approaches to ambiguity. Modal interpretations are investigated, inducing various equivalences on sequences.
This course aims to assess the principle of compositionality (CP) and how it fits with recent developments in natural language interpretation, especially those that stress the role of context. We first try to lay down a suitable formal framework for CP, reviewing proposals by Montague, Janssen, Hendriks, Kracht and Hodges. Versions of CP of varying strength are formulated, and some recent results on the existence of compositional semantics and the (much debated) issue of the empirical import of CP discussed. Complementing CP is the notion of context which, under modern (e.g. "dynamic") conceptions, not only conditions interpretation but also is transformed during interpretation. The tension between CP and context is examined relative to problems of anaphora, presupposition, idioms and ambiguity. A somewhat unorthodox computational application of CP is suggested, emphasizing co-inductive aspects of interpretation that cut across the divide between model-theoretic and proof-theoretic approaches, and between procedural and declarative styles.
Conservativity in generalized quantifiers is linked to presupposition filtering, under a propositions-as-types analysis extended with dependent quantifiers. That analysis is underpinned by model-theoretically interpretable proofs which inhabit the propositions they prove, thereby providing objects for quantification and hooks for anaphora.
with the meaning function [[·]] appearing on both sides. (1) is commonly construed as a prescription for computing the meaning of a based on the parts of a and their mode of combination. As equality is symmetric, however, we can also read (1) from right to left, as a constraint on the meaning [[b]] of a term b that brings in the wider context where b may occur, in accordance with what Dag Westerståhl has recently described as “one version of Frege’s famous Context Principle”.
Anankastic conditionals are analyzed in terms of events conceived as sequences of snapshots – roughly, comics. Quantification is applied not to worlds (sets of which are customarily identified with propositions) but to strings that record observations of actions. The account generalizes to other types of conditionals, sidestepping certain well-known problems that beset possible worlds treatments, such as logical omniscience and irrelevance. A refinement for anankastic conditionals is considered, incorporating action relations.
The “surge in use of finite-state methods” () in computational linguistics has largely, if not completely, left semantics untouched. The present paper is directed towards correcting this situation. Techniques explained in () are applied to a fragment of temporal semantics through an approach we call finite-state temporality. This proceeds from the intuition of an event as “a series of snapshots” (; see also ), equating snapshots with symbols that collectively form our alphabet. A sequence of snapshots then becomes a string over that alphabet, evoking comic/film strips. Jackendoff has, among others, objected to conceptualizing events in terms of snapshots (). To counter these objections, we step up from events-as-strings to event-types-as-regular-languages ([5, 6]), recognizing the need for variable granularity. Beyond the introduction of disjunction implicit in the step from a single string up to a set of strings, we obtain a useful logic from the regular operations and a careful choice of the snapshots (constituting our alphabet).
Finite-state descriptions for temporal semantics are outlined through which to distinguish soft inferences reflecting manners of conceptualization from more robust semantic entailments defined over models. Just what descriptions are built (before being interpreted model-theoretically) and how they are grounded in models of reality explain (upon examination) why some inferences are soft.
Finite-state methods are applied to determine the consequences of events, represented as strings of sets of fluents. Developed to flesh out events used in natural language semantics, the approach supports reasoning about action in AI, including the frame problem and inertia. Representational and inferential aspects of the approach are explored, centering on conciseness of language, context update and constraint application with bias.
Reichenbach's event, reference and speech times are interpreted semantically by stringing and superposing sets of temporal formulae, structured within regular languages. Notions of continuation branches and of inertia, bound (in a precise sense) by reference time, are developed and applied to the progressive and the perfect.
The idea that temporal propositions are vague predicates is examined with attention to the nature of the objects over which the predicates range. These objects should not, it is argued, be identified once and for all with points or intervals in the real line (or any fixed linear order). Context has an important role to play not only in sidestepping the Sorites paradox (Gaifman 2002) but also in shaping temporal moments/extent (Landman 1991). The Russell-Wiener construction of time from events (Kamp 1979) is related to a notion of context given by a string of observations, the vagueness in which is brought out by grounding the observations in the real line. With this notion of context, the context dependency functions in Gaifman 2002 are adapted to interpret temporal propositions.
Temporal propositions are mapped to sets of strings that witness (in a precise sense) the propositions over discrete linear Kripke frames. The strings are collected into regular languages to ensure the decidability of entailments given by inclusions between languages. (Various notions of bounded entailment are shown to be expressible as language inclusions.) The languages unwind computations implicit in the logical (and temporal) connectives via a system of finite-state constraints adapted from finite-state morphology. Applications to Hybrid Logic and non-monotonic inertial reasoning are briefly considered.
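The reduction of entailment to language inclusion can be illustrated with a brute-force bounded check (an expository stand-in of my own for the automata-theoretic decision procedure the abstract alludes to), using regular expressions over a one-letter-per-snapshot encoding:

```python
import itertools
import re

def included(pat1, pat2, alphabet, max_len):
    """Bounded language inclusion: does every string over `alphabet` of
    length <= max_len matching pat1 also match pat2?  (A full decision
    procedure would instead compare finite automata.)"""
    for n in range(max_len + 1):
        for chars in itertools.product(alphabet, repeat=n):
            w = "".join(chars)
            if re.fullmatch(pat1, w) and not re.fullmatch(pat2, w):
                return False
    return True

# Encoding snapshots as letters: "p until q" as p*q, "eventually q" as .*q.
included("p*q", ".*q", "pq", 6)   # True:  until entails eventually
included(".*q", "p*q", "pq", 6)   # False: "qq" is a counterexample
```

Because regular-language inclusion is decidable, entailments expressible this way inherit decidability, which is the point of collecting the witnessing strings into regular languages.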
To address complications involving ambiguity, presupposition and implicature, three processes underlying natural language interpretation are isolated: translation, entailment and attunement. A meta-language integrating these processes is outlined, elaborating on a proof-theoretic approach to presupposition.
The Yale Shooting Problem introduced by Steve Hanks & Drew McDermott (1987) is a well-known test case of non-monotonic temporal reasoning. There is a sequence of situations. In the initial situation a gun is not loaded and the target is alive. In the next situation the gun is loaded. Eventually, a shot is fired, perhaps with fatal consequences. In this scenario there are two "fluents", alive and loaded, and two actions, load and shoot. Being loaded and being alive are inert propositions in the sense that if they are true at a given moment, they will be true at the next moment unless some action such as…
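A toy rendering of the scenario (a sketch of my own; the fluent and action names follow the text, the update rule is a bare-bones STRIPS-style one, and "wait" is added to make the inertia point visible):

```python
EFFECTS = {                 # action -> (fluents made true, fluents made false)
    "load":  ({"loaded"}, set()),
    "shoot": (set(), {"alive", "loaded"}),
    "wait":  (set(), set()),
}

def step(state, action):
    """Inertial update: every fluent persists unless the action says otherwise."""
    if action == "shoot" and "loaded" not in state:
        return set(state)   # firing an unloaded gun changes nothing
    add, remove = EFFECTS[action]
    return (set(state) - remove) | add

state = {"alive"}           # initially the gun is unloaded, the target alive
for action in ["load", "wait", "shoot"]:
    state = step(state, action)
# state == set(): loaded persists through "wait" by inertia, so the shot is fatal
```

The non-monotonic puzzle is precisely whether inertia should apply to loaded across the "wait", or to alive across the "shoot"; the simulation hard-codes the intended reading.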
This groundbreaking collection, the most thorough treatment of the philosophy of linguistics ever published, brings together philosophers, scientists and historians to map out both the foundational assumptions set during the second half of ...
Situations serving as partial worlds as well as events in natural language semantics are constructed from a type-theoretic interpretation of first-order formulae and (after a type reduction) temporal formulae. Limitations of the Russell-Wiener-Kamp derivation of time from events are discussed and overcome to give a more widely applicable account of temporal granularity. Finite situations are formulated as strings of observations, conceptualized to persist inertially (in the absence of forces).
A distinction is drawn between situations as indices required for semantically evaluating sentences and situations as denotations resulting from such evaluation. For atomic sentences, possible worlds may serve as indices, and events as denotations. The distinction is extended beyond atomic sentences according to formulae-as-types and applied to implicit quantifier domain restrictions, intensionality and conditionals.
The notion of inertia is explicated in terms of forces recorded in snapshots that are strung together to represent events. The role inertia worlds were conceived to serve in the semantics of the progressive is assumed by a branching construct that specifies what may follow, apart from what follows.
Events and situations are represented by strings of temporally ordered observations, on the basis of which the events and situations are recognized. Allen’s basic interval relations are derived from superposing strings that mark interval boundaries, and Kamp’s event structures are constructed as projective limits of strings. Observations are generalized to temporal propositions, leading to event-types that classify event-instances. Working with sets of strings built from temporal propositions, we obtain natural notions of bounded entailment from set inclusions. These inclusions are decidable if the sets are accepted by finite automata.
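The derivation of Allen relations from superposition can be sketched as follows (my own simplification: whole-interval markers rather than the boundary-marking symbols of the text, with superposed strings compressed and trimmed to their characteristic patterns):

```python
def string_of(name, start, end, horizon):
    """Snapshot string over instants 0..horizon-1 recording when an
    interval named `name` holds (at t with start <= t < end)."""
    return [{name} if start <= t < end else set() for t in range(horizon)]

def superpose(s, t):
    """Componentwise union of equal-length snapshot strings."""
    return [a | b for a, b in zip(s, t)]

def compress(s):
    """Merge consecutive identical snapshots."""
    out = []
    for box in s:
        if not out or out[-1] != box:
            out.append(box)
    return out

def unpad(s):
    """Trim empty snapshots at either edge."""
    while s and not s[0]:
        s = s[1:]
    while s and not s[-1]:
        s = s[:-1]
    return s

def pattern(e, f):
    return unpad(compress(superpose(e, f)))

pattern(string_of("e", 0, 3, 6), string_of("f", 2, 5, 6))
# [{'e'}, {'e', 'f'}, {'f'}]   -- Allen's "e overlaps f"
pattern(string_of("e", 0, 2, 6), string_of("f", 3, 5, 6))
# [{'e'}, set(), {'f'}]        -- Allen's "e before f"
```

Each of Allen's thirteen relations corresponds to exactly one such compressed, trimmed pattern, so the relation between two intervals can be read off the superposed string.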
Eventualities and worlds are analysed uniformly as schedules of certain descriptions of eventuality-types (reversing the reduction of eventuality-types to eventualities). The temporal interpretation of modals in Condoravdi 2002 is reformulated to bring out what it is about eventualities and worlds that is essential to the account. What is essential, it is claimed, can be recovered from schedules that may or may not include worlds.
Dynamic and proof-conditional approaches to discourse (exemplified by Discourse Representation Theory and Type-Theoretical Grammar, respectively) are related through translations and transitions labeled by first-order formulas with anaphoric twists. Type-theoretic contexts are defined relative to a signature and instantiated model-theoretically, subject to change.
A modal logic for translating a sequence of English sentences to a sequence of logical forms is presented, characterized by Kripke models with points formed from input/output sequences, and valuations determined by entailment relations. Previous approaches based (to one degree or another) on Quantified Dynamic Logic are embeddable within it. Applications to presupposition and ambiguity are described, and decision procedures and axiomatizations supplied.
Notions of disambiguation supporting a compositional interpretation of ambiguous expressions and reflecting intuitions about how sentences combine in discourse are investigated. Expressions are analyzed both inductively by breaking them apart, and co-inductively by embedding them within larger contexts.
are considered with a view toward analyzing operational semantics from the perspective of predicate logic. The notion of a bisimulation is employed in two distinct ways: (i) as an extensional notion of equivalence on programs (or processes) generalizing input/output equivalence (at a cost exceeding II' over certain transition predicates computable in log space), and (ii) as a tool for analyzing the dependence of transitions on data (which can be shown to be elementary or non-elementary, depending on the formulation of the transitions).