This introductory chapter summarises key findings of the twenty-two book chapters in terms of five propositions. These propositions, each building on relevant findings linked to forward-looking suggestions for research, policy and practice, reflect the architecture of the book, whose sections proceed from setting the stage to critical issues, followed by a section on methods and tools, to chapters that provide geographic perspectives, and finally to a section that identifies potential policy options. The propositions comprise (1) Risk management can be an effective entry point for aligning perspectives and debates, if framed comprehensively, coupled with climate justice considerations and linked to established risk management and adaptation practice; (2) Attribution science is advancing rapidly and is fundamental to informing actions to minimise, avert, and address losses and damages; (3) Climate change research, in addition to identifying physical/hard limits to adaptation, needs to more systematically examine soft limits to adaptation, for which we find some evidence across several geographies globally; (4) Climate risk insurance mechanisms can serve the prevention and cure aspects emphasised in the L&D debate but solidarity and accountability aspects need further attention, for which we find tentative indication in applications around the world; (5) Policy deliberations may need to overcome the perception that L&D constitutes a win-lose negotiation “game” by developing a more inclusive narrative that highlights collective ambition for tackling risks, mutual benefits and the role of transformation. [Open Access]
This paper embeds the core part of Discourse Representation Theory in the classical theory of types plus a few simple axioms that allow the theory to express key facts about variables and assignments on the object level of the logic. It is shown how the embedding can be used to combine core analyses of natural language phenomena in Discourse Representation Theory with analyses that can be obtained in Montague Semantics.
This book radically simplifies Montague Semantics and generalizes the theory by basing it on a partial higher order logic. The resulting theory is a synthesis of Montague Semantics and Situation Semantics. In the late sixties Richard Montague developed the revolutionary idea that we can understand the concept of meaning in ordinary languages much in the same way as we understand the semantics of logical languages. Unfortunately, however, he formalized his idea in an unnecessarily complex way - two outstanding researchers in the field even compared his work to a `Rube Goldberg machine.' Muskens' work does away with such unnecessary complexities, obtains a streamlined version of the theory, shows how partialising the theory automatically provides us with the most central concepts of Situation Semantics, and offers a simple logical treatment of propositional attitude verbs, perception verbs and proper names.
While the enormous influence of Martin Heidegger's thought in Japan and China is well documented, the influence on him from East-Asian sources is much less well known. This remarkable study shows that Heidegger drew some of the major themes of his philosophy--on occasion almost word for word--from German translations of Chinese Daoist and Zen Buddhist classics.
Sustainable development (SD) – that is, “Development that meets the needs of current generations without compromising the ability of future generations to meet their needs and aspirations” – can be pursued in many different ways. Stakeholder relations management (SRM) is one such way, through which corporations are confronted with economic, social, and environmental stakeholder claims. This paper lays the groundwork for an empirical analysis of the question of how far SD can be achieved through SRM. It describes the so-called SD–SRM perspective as a distinctive research approach and shows how it relates to the wider body of stakeholder theory. Next, the concept of SD is operationalized for the microeconomic level with reference to important documents. Based on the ensuing SD framework, it is shown how SD and SRM relate to each other, and how the two concepts relate to other popular concepts such as Corporate Sustainability and Corporate Social Responsibility. The paper concludes that the significance of societal guiding models such as SD and of management approaches like CSR is strongly dependent on their footing in society.
The term “vagueness” describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno's sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental findings and extends Alxatib and Pelletier's theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and its probabilistic pendant is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can then be explained as a quantum interference phenomenon.
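The Hopfield-network component mentioned in this abstract can be illustrated in miniature. The following is a generic sketch of Hebbian storage and asynchronous recall, not the authors' actual model for truth values; the function names and the example pattern are invented:

```python
def hopfield_train(patterns):
    # Hebbian learning: W[i][j] accumulates x_i * x_j over the stored
    # patterns of +1/-1 values; no self-connections (zero diagonal)
    n = len(patterns[0])
    W = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    W[i][j] += p[i] * p[j] / n
    return W

def hopfield_recall(W, state, sweeps=5):
    # asynchronous threshold updates; the state settles into a stored attractor
    s = list(state)
    for _ in range(sweeps):
        for i in range(len(s)):
            field = sum(W[i][j] * s[j] for j in range(len(s)))
            s[i] = 1 if field >= 0 else -1
    return s

stored = [1, -1, 1, -1, 1, -1]
W = hopfield_train([stored])
noisy = [-1, -1, 1, -1, 1, -1]        # one bit flipped
print(hopfield_recall(W, noisy))      # recovers the stored pattern
```

The attractor dynamics, rather than this toy retrieval task, is what lends itself to modelling graded truth-value judgments.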
Carl Schmitt is one of the most widely read and influential German thinkers of the twentieth century. His fundamental works on friend and enemy, legality and legitimacy, dictatorship, political theology and the concept of the political are read today with great interest by everyone from conservative Catholic theologians to radical political thinkers on the left. In his private life, however, Schmitt was haunted by the demons of his wild anti-Semitism, his self-destructive and compulsive sexuality and his deep-seated resentment against the complacency of bourgeois life. As a young man from a modest background, full of social envy, he succeeded in making his way to the top of the academic discipline of law in Germany through his exceptional intellectual prowess. And yet he never felt at home in the academic establishment and among those of high social standing. In his works, Schmitt unmasked the liberal Rechtsstaat as a constitutional façade and reflected on the legitimacy of dictatorship. When the Nazis seized power Schmitt was susceptible to their ideology. He broke with his Jewish friends, joined the Nazi Party in May 1933 and lent a helping hand to Hitler, thereby becoming deeply entangled with the regime. Schmitt was irrevocably compromised by his role as the ‘crown jurist’ of the Third Reich. But by 1936 he had already lost his influential position. After the war, he led a secluded life in his home town in the Sauerland and became a key background figure in the intellectual scene of postwar Germany. Reinhard Mehring’s outstanding biography is the most comprehensive work available on the life and work of Carl Schmitt. Based on thorough research and using new sources that were previously unavailable, Mehring portrays Schmitt as a Shakespearean figure at the centre of the German catastrophe.
Measurement instruments assessing multiple emotions during epistemic activities are largely lacking. We describe the construction and validation of the Epistemically-Related Emotion Scales, which measure surprise, curiosity, enjoyment, confusion, anxiety, frustration, and boredom occurring during epistemic cognitive activities. The instrument was tested in a multinational study of emotions during learning from conflicting texts. The findings document the reliability, internal validity, and external validity of the instrument. A seven-factor model best fit the data, suggesting that epistemically-related emotions should be conceptualised in terms of discrete emotion categories, and the scales showed metric invariance across the North American and German samples. Furthermore, emotion scores changed over time as a function of conflicting task information and related significantly to perceived task value and use of cognitive and metacognitive learning strategies.
In this paper it is shown how the DRT (Discourse Representation Theory) treatment of temporal anaphora can be formalized within a version of Montague Semantics that is based on classical type logic.
Non-therapeutic circumcision violates boys’ right to bodily integrity as well as to self-determination. There is neither any verifiable medical advantage connected with the intervention nor is it painless nor without significant risks. Possible negative consequences for the psychosexual development of circumcised boys (due to substantial loss of highly erogenous tissue) have not yet been sufficiently explored, but appear to ensue in a significant number of cases. According to standard legal criteria, these considerations would normally entail that the operation be deemed an ‘impermissible risk’—neither justifiable on grounds of parental rights nor of religious liberty: as with any other freedom right, these end where another person's body begins. Nevertheless, after a resounding decision by a Cologne district court that non-therapeutic circumcision constitutes bodily assault, the German legislature responded by enacting a new statute expressly designed to permit male circumcision even outside of medical settings. We first criticise the normative foundations upon which such a legal concession seems to rest, and then analyse two major flaws in the new German law which we consider emblematic of the difficulty that any legal attempt to protect medically irrelevant genital cutting is bound to face.
The paper shows how ideas that explain the sense of an expression as a method or algorithm for finding its reference, foreshadowed in Frege’s dictum that sense is the way in which a referent is given, can be formalized on the basis of the ideas in Thomason (1980). To this end, the function that sends propositions to truth values or sets of possible worlds in Thomason (1980) must be replaced by a relation and the meaning postulates governing the behaviour of this relation must be given in the form of a logic program. The resulting system not only throws light on the properties of sense and their relation to computation, but also shows circular behaviour if some ingredients of the Liar Paradox are added. The connection is natural, as algorithms can be inherently circular and the Liar is explained as expressing one of those. Many ideas in the present paper are closely related to those in Moschovakis (1994), but receive a considerably lighter formalization.
We explore the different meanings of “quantum uncertainty” contained in Heisenberg’s seminal paper from 1927, and also some of the precise definitions that were developed later. We recount the controversy about “Anschaulichkeit”, visualizability of the theory, which Heisenberg claims to resolve. Moreover, we consider Heisenberg’s programme of operational analysis of concepts, in which he sees himself as following Einstein. Heisenberg’s work is marked by the tensions between semiclassical arguments and the emerging modern quantum theory, between intuition and rigour, and between shaky arguments and overarching claims. Nevertheless, the main message can be taken into the new quantum theory, and can be brought into the form of general theorems. They come in two kinds, not distinguished by Heisenberg. These are, on one hand, constraints on preparations, like the usual textbook uncertainty relation, and, on the other, constraints on joint measurability, including trade-offs between accuracy and disturbance.
In his introductory paper to first-order logic in the Handbook of Mathematical Logic, Jon Barwise writes: “[T]he informal notion of provable used in mathematics is made precise by the formal notion provable in first-order logic. Following a suggestion of Martin Davis, we refer to this view as Hilbert’s Thesis.” This paper reviews the discussion of Hilbert’s Thesis in the literature. In addition to the question whether it is justifiable to use Hilbert’s name here, the arguments for this thesis are compared with those for Church’s Thesis concerning computability. This leads to the question whether one could provide an analogue for proofs of the concept of partial recursive function.
If war is an inevitable condition of human nature, as David Hume suggests, then what type of societies can best protect us from defeat and conquest? For David Hume, commerce decreases the relative cost of war and promotes technological military advances as well as martial spirit. Commerce therefore makes a country militarily stronger and better equipped to protect itself against attacks than any other kind of society. Hume does not assume that commerce would yield a peaceful world, nor that commercial societies would be militarily weak, as many contemporary scholars have argued. On the contrary, for him, military might is a beneficial consequence of commerce.
A logic is called higher order if it allows for quantification over higher order objects, such as functions of individuals, relations between individuals, functions of functions, relations between functions, etc. Higher order logic began with Frege, was formalized in Russell [46] and Whitehead and Russell [52] early in the previous century, and received its canonical formulation in Church [14]. While classical type theory has long since been overshadowed by set theory as a foundation of mathematics, recent decades have shown remarkable comebacks in the fields of mechanized reasoning (see, e.g., Benzmüller).
In this paper we define intensional models for the classical theory of types, thus arriving at an intensional type logic ITL. Intensional models generalize Henkin's general models and have a natural definition. As a class they do not validate the axiom of Extensionality. We give a cut-free sequent calculus for type theory and show completeness of this calculus with respect to the class of intensional models via a model existence theorem. After this we turn our attention to applications. Firstly, it is argued that, since ITL is truly intensional, it can be used to model ascriptions of propositional attitude without predicting logical omniscience. In order to illustrate this a small fragment of English is defined and provided with an ITL semantics. Secondly, it is shown that ITL models contain certain objects that can be identified with possible worlds. Essential elements of modal logic become available within classical type theory once the axiom of Extensionality is given up.
Taking the lead from orthodox quantum theory, I will introduce a handy generalization of the Boolean approach to propositions and questions: the orthoalgebraic framework. I will demonstrate that this formalism relates to a formal theory of questions (or ‘observables’ in the physicist’s jargon). This theory allows formulating attitude questions, which normally are non-commuting, i.e., the ordering of the questions affects the answer behavior of attitude questions. Further, it allows the expression of conditional questions such as “If Mary reads the book, will she recommend it to Peter?”, and thus gives the framework the semantic power of raising issues and being informative at the same time. In the case of commuting observables, there are close similarities between the orthoalgebraic approach to questions and the Jäger/Hulstijn approach to question semantics. However, there are also differences between the two approaches even in case of commuting observables. The main difference is that the Jäger/Hulstijn approach relates to a partition theory of questions whereas the orthoalgebraic approach relates to a ‘decorated’ partition theory (i.e. the elements of the partition are decorated by certain semantic values). Surprisingly, the orthoalgebraic approach is able to overcome most of the difficulties of the Jäger/Hulstijn approach. Furthermore, the general approach is suitable to describe the different types of (non-commutative) attitude questions as investigated in modern survey research. Concluding, I will suggest that an active dialogue between the traditional model-theoretic approaches to semantics and the orthoalgebraic paradigm is mandatory.
In this paper we consider the theory of predicate logics in which the principle of Bivalence or the principle of Non-Contradiction or both fail. Such logics are partial or paraconsistent or both. We consider sequent calculi for these logics and prove Model Existence. For L4, the most general logic under consideration, we also prove a version of the Craig-Lyndon Interpolation Theorem. The paper shows that many techniques used for classical predicate logic generalise to partial and paraconsistent logics once the right set-up is chosen. Our logic L4 has a semantics that also underlies Belnap’s [4] and is related to the logic of bilattices. L4 is in focus most of the time, but it is also shown how results obtained for L4 can be transferred to several variants.
We consider several puzzles of bounded rationality. These include the Allais and Ellsberg paradoxes, the disjunction effect, and related puzzles. We argue that the present account of quantum cognition—taking quantum probabilities rather than classical probabilities—can give a more systematic description of these puzzles than the alternative treatments in the traditional frameworks of bounded rationality. Unfortunately, the quantum probabilistic treatment does not always provide a deeper understanding and a true explanation of these puzzles. One reason is that quantum approaches introduce additional parameters which possibly can be fitted to empirical data but which do not necessarily explain them. Hence, the phenomenological research has to be augmented by responding to deeper foundational issues. In this article, we make the general distinction between foundational and phenomenological research programs, explaining the foundational issue of quantum cognition from the perspective of operational realism. This framework is motivated by assuming partial Boolean algebras. They are combined into a uniform system via a mechanism preventing the simultaneous realization of perspectives. Gleason’s theorem then automatically leads to a distinction between probabilities that are defined by pure states and probabilities arising from the statistical mixture of pure states. This formal distinction relates to the conceptual distinction between risk and ignorance. Another outcome identifies quantum aspects in dynamic macro-systems using the framework of symbolic dynamics. Finally, we discuss several ideas that are useful for justifying complementarity in cognitive systems.
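The contrast between classical mixtures and quantum probabilities invoked in this abstract can be made concrete with a toy calculation. The sketch below is my own illustration, not the authors' formalism, and the values are unnormalised toy numbers: a relative phase between amplitudes produces an interference term that violates the classical law of total probability, the signature pattern of the disjunction effect.

```python
import cmath
import math

def classical_mix(p_given_a, p_given_b, w=0.5):
    # law of total probability: P(act) = w*P(act|A) + (1-w)*P(act|B)
    return w * p_given_a + (1 - w) * p_given_b

def quantum_mix(p_given_a, p_given_b, phase, w=0.5):
    # amplitudes instead of probabilities; the relative phase contributes an
    # interference term 2*sqrt(...)*cos(phase) absent from the classical mix
    amp = (math.sqrt(w * p_given_a)
           + cmath.exp(1j * phase) * math.sqrt((1 - w) * p_given_b))
    return abs(amp) ** 2

# with a phase of pi the two paths cancel, although each conditional
# probability alone would suggest acting with probability 0.6
print(classical_mix(0.6, 0.6))           # 0.6
print(quantum_mix(0.6, 0.6, math.pi))    # ~0.0
```

The phase here is exactly the kind of extra fitted parameter the abstract flags as explanatorily problematic.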
In this paper we discuss a new perspective on the syntax-semantics interface. Semantics, in this new set-up, is not ‘read off’ from Logical Forms as in mainstream approaches to generative grammar. Nor is it assigned to syntactic proofs using a Curry-Howard correspondence as in versions of the Lambek Calculus, or read off from f-structures using Linear Logic as in Lexical-Functional Grammar (LFG, Kaplan & Bresnan [9]). All such approaches are based on the idea that syntactic objects (trees, proofs, f-structures) are somehow prior and that semantics must be parasitic on those syntactic objects. We challenge this idea and develop a grammar in which syntax and semantics are treated in a strictly parallel fashion. The grammar will have many ideas in common with the (converging) frameworks of categorial grammar and LFG, but its treatment of the syntax-semantics interface is radically different. Also, although the meaning component of the grammar is a version of Montague semantics and although there are obvious affinities between Montague’s conception of grammar and the work presented here, the grammar is not compositional, in the sense that composition of meaning need not follow surface structure.
The neurosciences not only challenge assumptions about the mind’s place in the natural world but also urge us to reconsider its role in the normative world. Based on mind-brain dualism, the law affords only one-sided protection: it systematically protects bodies and brains, but only fragmentarily minds and mental states. The fundamental question, in what ways people may legitimately change mental states of others, is largely unexplored in legal thinking. With novel technologies both to intervene in minds and to detect mental activity, the law should, we suggest, introduce stand-alone protection for the inner sphere of persons. We shall address some metaphysical questions concerning physical and mental harm and demonstrate gaps in current doctrines, especially in regard to manipulative interferences with decision-making processes. We then outline some reasons for the law to recognize a human right to mental liberty and propose elements of a novel criminal offence proscribing severe interventions into other minds.
The paper develops Lambda Grammars, a form of categorial grammar that, unlike other categorial formalisms, is non-directional. Linguistic signs are represented as sequences of lambda terms and are combined with the help of linear combinators.
This paper shows how the dynamic interpretation of natural language introduced in work by Hans Kamp and Irene Heim can be modeled in classical type logic. This provides a synthesis between Richard Montague's theory of natural language semantics and the work by Kamp and Heim.
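As a rough illustration of the style of dynamic interpretation meant here, the toy Python sketch below models a context as a set of variable assignments and sentence meanings as context-change potentials, in the spirit of Heim's context updates. The entities, predicates, and function names are invented for the example and are not drawn from the paper:

```python
# A context is a list of variable assignments (dicts); a meaning is a
# context-change potential: a function from contexts to contexts.
ENTITIES = ["john", "mary", "pedro"]

def exists(var):
    # "there is an x": extend every assignment with each possible value for x
    return lambda ctx: [dict(g, **{var: e}) for g in ctx for e in ENTITIES]

def pred(holds, var):
    # filter the context by a one-place predicate
    return lambda ctx: [g for g in ctx if holds(g[var])]

def seq(*updates):
    # dynamic conjunction: sequential composition of updates
    def update(ctx):
        for u in updates:
            ctx = u(ctx)
        return ctx
    return update

walks, talks = {"john", "mary"}, {"john"}
# "Someone walks. He talks." -- the pronoun picks up x from the context
discourse = seq(exists("x"),
                pred(lambda e: e in walks, "x"),
                pred(lambda e: e in talks, "x"))
print(discourse([{}]))  # [{'x': 'john'}]
```

The point of the synthesis described in the abstract is that updates of this dynamic kind can themselves be written as terms of classical type logic.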
Vector models of language are based on the contextual aspects of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language, the denotations of phrases, and their compositional properties. In the latter approach the denotation of a sentence determines its truth conditions and can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In this short paper, we develop a vector semantics for language based on the simply typed lambda calculus. Our semantics uses techniques familiar from the truth conditional tradition and is based on a form of dynamic interpretation inspired by Heim's context updates.
This paper introduces λ-grammar, a form of categorial grammar that has much in common with LFG. Like other forms of categorial grammar, λ-grammars are multi-dimensional and their components are combined in a strictly parallel fashion. Grammatical representations are combined with the help of linear combinators, closed pure λ-terms in which each abstractor binds exactly one variable. Mathematically this is equivalent to employing linear logic, in use in LFG for semantic composition, but the method seems more practicable.
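To make the component-wise combination idea concrete, here is a small Python illustration of applying a functor sign to an argument sign in parallel across a syntax and a semantics dimension. The lexical entries and names are invented for the example; actual λ-grammars state this with linear λ-terms rather than Python functions:

```python
# A sign is a tuple of components, e.g. (syntax, semantics); a functor
# sign combines with an argument sign component by component, each
# component being consumed exactly once (mirroring linearity).
def combine(functor, argument):
    return tuple(f(a) for f, a in zip(functor, argument))

# toy lexical signs: the syntax component builds a string, the
# semantics component builds a formula
every_cat = (lambda s: "every cat " + s,
             lambda p: "∀x(cat(x) → " + p("x") + ")")
sleeps = ("sleeps", lambda x: "sleep(" + x + ")")

syntax, semantics = combine(every_cat, sleeps)
print(syntax)     # every cat sleeps
print(semantics)  # ∀x(cat(x) → sleep(x))
```

Because the two dimensions are built up by the same application step, neither syntax nor semantics is prior: both are projections of one combinatorial derivation.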
There are two kinds of semantic theories of anaphora. Some, such as Heim’s File Change Semantics, Groenendijk and Stokhof’s Dynamic Predicate Logic, or Muskens’ Compositional DRT (CDRT), seem to require full coindexing of anaphora and their antecedents prior to interpretation. Others, such as Kamp’s Discourse Representation Theory (DRT), do not require this coindexing and seem to have an important advantage here. In this squib I will sketch a procedure that the first group of theories may help themselves to so that they can interleave interpretation and coindexing in DRT’s way.
This paper develops a relational---as opposed to a functional---theory of types. The theory is based on Hilbert and Bernays' eta operator plus the identity symbol, from which Church's lambda and the other usual operators are then defined. The logic is intended for use in the semantics of natural language.
The ‘syntax’ and ‘combinatorics’ of my title are what Curry (1961) referred to as phenogrammatics and tectogrammatics respectively. Tectogrammatics is concerned with the abstract combinatorial structure of the grammar and directly informs semantics, while phenogrammatics deals with concrete operations on syntactic data structures such as trees or strings. In a series of previous papers (Muskens, 2001a; Muskens, 2001b; Muskens, 2003) I have argued for an architecture of the grammar in which finite sequences of lambda terms are the basic data structures, pairs of terms ⟨syntax, semantics⟩ for example. These sequences then combine with the help of simple generalizations of the usual abstraction and application operations. This theory, which I call Lambda Grammars and which is closely related to the independently formulated theory of Abstract Categorial Grammars (de Groote, 2001; de Groote, 2002), in fact is an implementation of Curry’s ideas: the level of tectogrammar is encoded by the sequences of lambda-terms and their ways of combination, while the syntactic terms in those sequences constitute the phenogrammatical level. In de Groote’s formulation of the theory, tectogrammar is the level of abstract terms, while phenogrammar is the level of object terms.
We give a survey on truth theories for applicative theories. It comprises Frege structures, universes for Frege structures, and a theory of supervaluation. We present the proof-theoretic results for these theories and show their syntactical expressive power. In particular, we present as a novelty a syntactical interpretation of ID1 in an applicative truth theory based on supervaluation.
There is concern that the use of neuroenhancements to alter character traits undermines consumers' authenticity. But the meaning, scope and value of authenticity remain vague. However, the majority of contemporary autonomy accounts ground individual autonomy on a notion of authenticity. So if neuroenhancements diminish an agent's authenticity, they may undermine his autonomy. This paper clarifies the relation between autonomy, authenticity and possible threats by neuroenhancements. We present six neuroenhancement scenarios and analyse how autonomy accounts evaluate them. Some cases are considered differently by criminal courts; we demonstrate where academic autonomy theories and legal reasoning diverge and ascertain whether courts should reconsider their concept of autonomy. We argue that authenticity is not an appropriate condition for autonomy and that new enhancement technologies pose no unique threats to personal autonomy.
We present Logical Description Grammar (LDG), a model of grammar and the syntax-semantics interface based on descriptions in elementary logic. A description may simultaneously describe the syntactic structure and the semantics of a natural language expression, i.e., the describing logic talks about the trees and about the truth-conditions of the language described. Logical Description Grammars offer a natural way of dealing with underspecification in natural language syntax and semantics. If a logical description (up to isomorphism) has exactly one tree plus truth-conditions as a model, it completely specifies that grammatical object. More common is the situation, corresponding to underspecification, in which there is more than one model. A situation in which there are no models corresponds to an ungrammatical input.