This introductory chapter summarises key findings of the twenty-two book chapters in terms of five propositions. These propositions, each building on relevant findings linked to forward-looking suggestions for research, policy and practice, reflect the architecture of the book, whose sections proceed from setting the stage to critical issues, followed by a section on methods and tools, to chapters that provide geographic perspectives, and finally to a section that identifies potential policy options. The propositions comprise (1) Risk management can be an effective entry point for aligning perspectives and debates, if framed comprehensively, coupled with climate justice considerations and linked to established risk management and adaptation practice; (2) Attribution science is advancing rapidly and fundamental to informing actions to minimise, avert, and address losses and damages; (3) Climate change research, in addition to identifying physical/hard limits to adaptation, needs to more systematically examine soft limits to adaptation, for which we find some evidence across several geographies globally; (4) Climate risk insurance mechanisms can serve the prevention and cure aspects emphasised in the L&D debate but solidarity and accountability aspects need further attention, for which we find tentative indication in applications around the world; (5) Policy deliberations may need to overcome the perception that L&D constitutes a win-lose negotiation “game” by developing a more inclusive narrative that highlights collective ambition for tackling risks, mutual benefits and the role of transformation. [Open access].
In this programmatic paper we renew the well-known question “What is a proof?”. Starting from the challenge posed to the mathematical community by computer-assisted theorem provers, we discuss in the first part how experiences from examinations of proofs can help to sharpen the question. In the second part we take a look at the new challenge posed by “big proofs”.
The order of stages in a multistage game is often interpreted by looking at earlier stages as involving more long-term decisions. For the purpose of making this interpretation precise, the notion of a delay supergame of a bounded multistage game is introduced. A multistage game is bounded if the length of play has an upper bound. A delay supergame is played over many periods. Decisions on all stages are made simultaneously, but with different delays until they become effective. The earlier the stage, the longer the delay. A subgame perfect equilibrium of a bounded multistage game generates a subgame perfect equilibrium in every one of its delay supergames. This is the first main conclusion of the paper. A subgame perfect equilibrium set is a set of subgame perfect equilibria all of which yield the same payoffs, not only in the game as a whole, but also in each of its subgames. The second main conclusion concerns multistage games with a unique subgame perfect equilibrium set and their delay supergames which are bounded in the sense that the number of periods is finite. If a bounded multistage game has a unique subgame perfect equilibrium set, then the same is true for every one of its bounded delay supergames. Finally, the descriptive relevance of multistage game models and their subgame perfect equilibria is discussed in the light of the results obtained.
While the enormous influence of Martin Heidegger's thought in Japan and China is well documented, the influence on him from East Asian sources is much less well known. This remarkable study shows that Heidegger drew some of the major themes of his philosophy--on occasion almost word for word--from German translations of Chinese Daoist and Zen Buddhist classics.
Sustainable development (SD) – that is, “Development that meets the needs of current generations without compromising the ability of future generations to meet their needs and aspirations” – can be pursued in many different ways. Stakeholder relations management (SRM) is one such way, through which corporations are confronted with economic, social, and environmental stakeholder claims. This paper lays the groundwork for an empirical analysis of the question of how far SD can be achieved through SRM. It describes the so-called SD–SRM perspective as a distinctive research approach and shows how it relates to the wider body of stakeholder theory. Next, the concept of SD is operationalized for the microeconomic level with reference to important documents. Based on the ensuing SD framework, it is shown how SD and SRM relate to each other, and how the two concepts relate to other popular concepts such as Corporate Sustainability and Corporate Social Responsibility. The paper concludes that the significance of societal guiding models such as SD and of management approaches like CSR is strongly dependent on their footing in society.
Measurement instruments assessing multiple emotions during epistemic activities are largely lacking. We describe the construction and validation of the Epistemically-Related Emotion Scales, which measure surprise, curiosity, enjoyment, confusion, anxiety, frustration, and boredom occurring during epistemic cognitive activities. The instrument was tested in a multinational study of emotions during learning from conflicting texts. The findings document the reliability, internal validity, and external validity of the instrument. A seven-factor model best fit the data, suggesting that epistemically-related emotions should be conceptualised in terms of discrete emotion categories, and the scales showed metric invariance across the North American and German samples. Furthermore, emotion scores changed over time as a function of conflicting task information and related significantly to perceived task value and use of cognitive and metacognitive learning strategies.
This paper embeds the core part of Discourse Representation Theory in the classical theory of types plus a few simple axioms that allow the theory to express key facts about variables and assignments on the object level of the logic. It is shown how the embedding can be used to combine core analyses of natural language phenomena in Discourse Representation Theory with analyses that can be obtained in Montague Semantics.
This book radically simplifies Montague Semantics and generalizes the theory by basing it on a partial higher order logic. The resulting theory is a synthesis of Montague Semantics and Situation Semantics. In the late sixties Richard Montague developed the revolutionary idea that we can understand the concept of meaning in ordinary languages much in the same way as we understand the semantics of logical languages. Unfortunately, however, he formalized his idea in an unnecessarily complex way - two outstanding researchers in the field even compared his work to a `Rube Goldberg machine.' Muskens' work does away with such unnecessary complexities, obtains a streamlined version of the theory, shows how partialising the theory automatically provides us with the most central concepts of Situation Semantics, and offers a simple logical treatment of propositional attitude verbs, perception verbs and proper names.
We explore the different meanings of “quantum uncertainty” contained in Heisenberg’s seminal paper from 1927, and also some of the precise definitions that were developed later. We recount the controversy about “Anschaulichkeit”, visualizability of the theory, which Heisenberg claims to resolve. Moreover, we consider Heisenberg’s programme of operational analysis of concepts, in which he sees himself as following Einstein. Heisenberg’s work is marked by the tensions between semiclassical arguments and the emerging modern quantum theory, between intuition and rigour, and between shaky arguments and overarching claims. Nevertheless, the main message can be taken into the new quantum theory, and can be brought into the form of general theorems. They come in two kinds, not distinguished by Heisenberg. These are, on one hand, constraints on preparations, like the usual textbook uncertainty relation, and, on the other, constraints on joint measurability, including trade-offs between accuracy and disturbance.
The paper shows how ideas that explain the sense of an expression as a method or algorithm for finding its reference, foreshadowed in Frege’s dictum that sense is the way in which a referent is given, can be formalized on the basis of the ideas in Thomason (1980). To this end, the function that sends propositions to truth values or sets of possible worlds in Thomason (1980) must be replaced by a relation and the meaning postulates governing the behaviour of this relation must be given in the form of a logic program. The resulting system not only throws light on the properties of sense and their relation to computation, but also shows circular behaviour if some ingredients of the Liar Paradox are added. The connection is natural, as algorithms can be inherently circular and the Liar is explained as expressing one of those. Many ideas in the present paper are closely related to those in Moschovakis (1994), but receive a considerably lighter formalization.
In his introductory paper on first-order logic in the Handbook of Mathematical Logic, Jon Barwise writes: “[T]he informal notion of provable used in mathematics is made precise by the formal notion provable in first-order logic. Following a suggestion of Martin Davis, we refer to this view as Hilbert’s Thesis.” This paper reviews the discussion of Hilbert’s Thesis in the literature. In addition to the question whether it is justifiable to use Hilbert’s name here, the arguments for this thesis are compared with those for Church’s Thesis concerning computability. This leads to the question whether one could provide an analogue for proofs of the concept of partial recursive function.
In this paper we define intensional models for the classical theory of types, thus arriving at an intensional type logic ITL. Intensional models generalize Henkin's general models and have a natural definition. As a class they do not validate the axiom of Extensionality. We give a cut-free sequent calculus for type theory and show completeness of this calculus with respect to the class of intensional models via a model existence theorem. After this we turn our attention to applications. Firstly, it is argued that, since ITL is truly intensional, it can be used to model ascriptions of propositional attitude without predicting logical omniscience. In order to illustrate this a small fragment of English is defined and provided with an ITL semantics. Secondly, it is shown that ITL models contain certain objects that can be identified with possible worlds. Essential elements of modal logic become available within classical type theory once the axiom of Extensionality is given up.
This paper concerns Carnap's constitution of the quality classes (qual) by means of the method of quasianalysis, a method of constitution that Carnap employed in his work Der logische Aufbau der Welt. It is shown that Carnap's definition of qualities fails on technical as well as fundamental grounds. Furthermore, proposed improvements of Carnap's method, put forward by Brockhaus, Goodman, Moulines and Eberle, are examined and shown to be inadequate.
A logic is called higher order if it allows for quantification over higher order objects, such as functions of individuals, relations between individuals, functions of functions, relations between functions, etc. Higher order logic began with Frege, was formalized in Russell and in Whitehead and Russell early in the previous century, and received its canonical formulation in Church. While classical type theory has long since been overshadowed by set theory as a foundation of mathematics, recent decades have shown remarkable comebacks in the fields of mechanized reasoning (see, e.g., Benzmüller).
_Heidegger's Hidden Sources_ documents for the first time Heidegger's remarkable debt to East Asian philosophy. In this groundbreaking study, Reinhard May shows conclusively that Martin Heidegger borrowed some of the major ideas of his philosophy - on occasion almost word for word - from German translations of Chinese Daoist and Zen Buddhist classics. The discovery of this astonishing appropriation of non-Western sources will have important consequences for future interpretations of Heidegger's work. Moreover, it shows Heidegger as a pioneer of comparative philosophy and transcultural thinking.
In this paper we consider the theory of predicate logics in which the principle of Bivalence or the principle of Non-Contradiction or both fail. Such logics are partial or paraconsistent or both. We consider sequent calculi for these logics and prove Model Existence. For L4, the most general logic under consideration, we also prove a version of the Craig-Lyndon Interpolation Theorem. The paper shows that many techniques used for classical predicate logic generalise to partial and paraconsistent logics once the right set-up is chosen. Our logic L4 has a semantics that also underlies Belnap's four-valued logic and is related to the logic of bilattices. L4 is in focus most of the time, but it is also shown how results obtained for L4 can be transferred to several variants.
The term “vagueness” describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno's sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental findings and extends Alxatib and Pelletier's theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and its probabilistic pendant is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can then be explained as a quantum interference phenomenon.
In this paper we discuss a new perspective on the syntax-semantics interface. Semantics, in this new set-up, is not ‘read off’ from Logical Forms as in mainstream approaches to generative grammar. Nor is it assigned to syntactic proofs using a Curry-Howard correspondence as in versions of the Lambek Calculus, or read off from f-structures using Linear Logic as in Lexical-Functional Grammar (LFG, Kaplan & Bresnan). All such approaches are based on the idea that syntactic objects (trees, proofs, f-structures) are somehow prior and that semantics must be parasitic on those syntactic objects. We challenge this idea and develop a grammar in which syntax and semantics are treated in a strictly parallel fashion. The grammar will have many ideas in common with the (converging) frameworks of categorial grammar and LFG, but its treatment of the syntax-semantics interface is radically different. Also, although the meaning component of the grammar is a version of Montague semantics and although there are obvious affinities between Montague’s conception of grammar and the work presented here, the grammar is not compositional, in the sense that composition of meaning need not follow surface structure.
This paper shows how the dynamic interpretation of natural language introduced in work by Hans Kamp and Irene Heim can be modeled in classical type logic. This provides a synthesis between Richard Montague's theory of natural language semantics and the work by Kamp and Heim.
The paper develops Lambda Grammars, a form of categorial grammar that, unlike other categorial formalisms, is non-directional. Linguistic signs are represented as sequences of lambda terms and are combined with the help of linear combinators.
Vector models of language are based on the contextual aspects of words and how they co-occur in text. Truth conditional models focus on the logical aspects of language, the denotations of phrases, and their compositional properties. In the latter approach the denotation of a sentence determines its truth conditions and can be taken to be a truth value, a set of possible worlds, a context change potential, or similar. In this short paper, we develop a vector semantics for language based on the simply typed lambda calculus. Our semantics uses techniques familiar from the truth conditional tradition and is based on a form of dynamic interpretation inspired by Heim's context updates.
This paper introduces λ-grammar, a form of categorial grammar that has much in common with LFG. Like other forms of categorial grammar, λ-grammars are multi-dimensional and their components are combined in a strictly parallel fashion. Grammatical representations are combined with the help of linear combinators, closed pure λ-terms in which each abstractor binds exactly one variable. Mathematically this is equivalent to employing linear logic, in use in LFG for semantic composition, but the method seems more practicable.
There are two kinds of semantic theories of anaphora. Some, such as Heim’s File Change Semantics, Groenendijk and Stokhof’s Dynamic Predicate Logic, or Muskens’ Compositional DRT (CDRT), seem to require full coindexing of anaphora and their antecedents prior to interpretation. Others, such as Kamp’s Discourse Representation Theory (DRT), do not require this coindexing and seem to have an important advantage here. In this squib I will sketch a procedure that the first group of theories may help themselves to so that they can interleave interpretation and coindexing in DRT’s way.
This paper develops a relational---as opposed to a functional---theory of types. The theory is based on Hilbert and Bernays' eta operator plus the identity symbol, from which Church's lambda and the other usual operators are then defined. The logic is intended for use in the semantics of natural language.
If payoffs are tickets for binary lotteries, which involve only two money prizes, then rationality requires expected value maximization in tickets. This payoff scheme was increasingly used to induce risk neutrality in experiments. The experiment presented here involved lottery choice and evaluation tasks. One subject group was paid in binary lottery tickets, another directly in money. Significantly greater deviations from risk neutral behavior are observed with binary lottery payoffs. This discrepancy increases when subjects have easy access to the alternatives' expected values and mean absolute deviations. Behavioral regularities are observed at least as often as with direct money payoffs.
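The logic of the binary-lottery scheme can be sketched in a few lines: tickets enter the chance of winning the high prize linearly, so maximizing expected utility over the two money prizes reduces to maximizing expected tickets, whatever the subject's risk attitude over money. A toy illustration (the ticket numbers and lotteries below are invented for the example, not taken from the experiment):

```python
# Binary-lottery payoffs: a subject earns tickets, and each ticket
# raises the probability of winning the high prize of a single
# two-prize lottery. Winning probability is linear in tickets.

def win_probability(tickets, total=100):
    """Chance of the high money prize with `tickets` out of `total`."""
    return tickets / total

def expected_tickets(lottery):
    """lottery: list of (probability, ticket_payoff) pairs."""
    return sum(p * t for p, t in lottery)

safe = [(1.0, 45)]              # 45 tickets for sure
risky = [(0.5, 100), (0.5, 0)]  # 100 tickets or nothing

print(expected_tickets(safe))    # 45.0
print(expected_tickets(risky))   # 50.0
print(win_probability(45))       # 0.45
```

Since the win probability is linear in tickets, an expected-utility maximizer should prefer the risky option here (expected 50 tickets over 45), regardless of any curvature of utility in money; the experiment tests whether subjects actually behave this way.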
Carl Schmitt is one of the most widely read and influential German thinkers of the twentieth century. His fundamental works on friend and enemy, legality and legitimacy, dictatorship, political theology and the concept of the political are read today with great interest by everyone from conservative Catholic theologians to radical political thinkers on the left. In his private life, however, Schmitt was haunted by the demons of his wild anti-Semitism, his self-destructive and compulsive sexuality and his deep-seated resentment against the complacency of bourgeois life. As a young man from a modest background, full of social envy, he succeeded in making his way to the top of the academic discipline of law in Germany through his exceptional intellectual prowess. And yet he never felt at home in the academic establishment and among those of high social standing. In his works, Schmitt unmasked the liberal Rechtsstaat as a constitutional façade and reflected on the legitimacy of dictatorship. When the Nazis seized power Schmitt was susceptible to their ideology. He broke with his Jewish friends, joined the Nazi Party in May 1933 and lent a helping hand to Hitler, thereby becoming deeply entangled with the regime. Schmitt was irrevocably compromised by his role as the ‘crown jurist’ of the Third Reich. But by 1936 he had already lost his influential position. After the war, he led a secluded life in his home town in the Sauerland and became a key background figure in the intellectual scene of postwar Germany. Reinhard Mehring’s outstanding biography is the most comprehensive work available on the life and work of Carl Schmitt. Based on thorough research and using new sources that were previously unavailable, Mehring portrays Schmitt as a Shakespearean figure at the centre of the German catastrophe.
Force, Fate, and Freedom serves as an introduction to historical sociology, as well as a critical analysis of the belief in economic and political progress through social knowledge. Reinhard Bendix offers a development of the historicist approach to social change first championed by Max Weber, and presents an overview of the foundations of political authority in Japan, Russia, Germany, France, and England.
The ‘syntax’ and ‘combinatorics’ of my title are what Curry (1961) referred to as phenogrammatics and tectogrammatics respectively. Tectogrammatics is concerned with the abstract combinatorial structure of the grammar and directly informs semantics, while phenogrammatics deals with concrete operations on syntactic data structures such as trees or strings. In a series of previous papers (Muskens, 2001a; Muskens, 2001b; Muskens, 2003) I have argued for an architecture of the grammar in which finite sequences of lambda terms are the basic data structures, pairs of terms ⟨syntax, semantics⟩ for example. These sequences then combine with the help of simple generalizations of the usual abstraction and application operations. This theory, which I call Lambda Grammars and which is closely related to the independently formulated theory of Abstract Categorial Grammars (de Groote, 2001; de Groote, 2002), in fact is an implementation of Curry’s ideas: the level of tectogrammar is encoded by the sequences of lambda-terms and their ways of combination, while the syntactic terms in those sequences constitute the phenogrammatical level. In de Groote’s formulation of the theory, tectogrammar is the level of abstract terms, while phenogrammar is the level of object terms.
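The linearity condition on the combinators used in Lambda Grammars (each abstractor binds exactly one variable occurrence) is easy to check mechanically. A minimal sketch, with lambda terms encoded as nested tuples; the encoding and function names are illustrative choices, not the paper's own notation:

```python
# Terms: ('var', name) | ('lam', name, body) | ('app', fun, arg).
# A linear combinator is a closed pure lambda term in which each
# abstractor binds exactly one occurrence of its variable.

def occurrences(term, name):
    """Count free occurrences of `name` in `term`."""
    tag = term[0]
    if tag == 'var':
        return 1 if term[1] == name else 0
    if tag == 'lam':
        return 0 if term[1] == name else occurrences(term[2], name)
    return occurrences(term[1], name) + occurrences(term[2], name)

def is_linear(term):
    """Check that every abstractor binds exactly one occurrence."""
    tag = term[0]
    if tag == 'var':
        return True
    if tag == 'lam':
        _, name, body = term
        return occurrences(body, name) == 1 and is_linear(body)
    return is_linear(term[1]) and is_linear(term[2])

# B = \f.\g.\x. f (g x) is linear; W = \f.\x. f x x is not,
# since x is used twice under its abstractor.
B = ('lam', 'f', ('lam', 'g', ('lam', 'x',
     ('app', ('var', 'f'), ('app', ('var', 'g'), ('var', 'x'))))))
W = ('lam', 'f', ('lam', 'x',
     ('app', ('app', ('var', 'f'), ('var', 'x')), ('var', 'x'))))

print(is_linear(B))  # True
print(is_linear(W))  # False
```

Combining signs only with such linear combinators is what keeps the resource-sensitive flavour of the framework: no grammatical material is duplicated or discarded during composition.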
We present Logical Description Grammar (LDG), a model of grammar and the syntax-semantics interface based on descriptions in elementary logic. A description may simultaneously describe the syntactic structure and the semantics of a natural language expression, i.e., the describing logic talks about the trees and about the truth-conditions of the language described. Logical Description Grammars offer a natural way of dealing with underspecification in natural language syntax and semantics. If a logical description (up to isomorphism) has exactly one tree plus truth-conditions as a model, it completely specifies that grammatical object. More common is the situation, corresponding to underspecification, in which there is more than one model. A situation in which there are no models corresponds to an ungrammatical input.
Taking the lead from orthodox quantum theory, I will introduce a handy generalization of the Boolean approach to propositions and questions: the orthoalgebraic framework. I will demonstrate that this formalism relates to a formal theory of questions (or ‘observables’ in the physicist’s jargon). This theory allows formulating attitude questions, which normally are non-commuting, i.e., the ordering of the questions affects the answer behavior of attitude questions. Further, it allows the expression of conditional questions such as “If Mary reads the book, will she recommend it to Peter?”, and thus gives the framework the semantic power of raising issues and being informative at the same time. In the case of commuting observables, there are close similarities between the orthoalgebraic approach to questions and the Jäger/Hulstijn approach to question semantics. However, there are also differences between the two approaches even in case of commuting observables. The main difference is that the Jäger/Hulstijn approach relates to a partition theory of questions whereas the orthoalgebraic approach relates to a ‘decorated’ partition theory (i.e. the elements of the partition are decorated by certain semantic values). Surprisingly, the orthoalgebraic approach is able to overcome most of the difficulties of the Jäger/Hulstijn approach. Furthermore, the general approach is suitable to describe the different types of (non-commutative) attitude questions as investigated in modern survey research. Concluding, I will suggest that an active dialogue between the traditional model-theoretic approaches to semantics and the orthoalgebraic paradigm is mandatory.
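The order sensitivity of non-commuting questions can be illustrated with ordinary projection operators: asking question A and then B can yield a different "yes-yes" probability than asking B and then A. A toy sketch in the spirit of the orthoalgebraic framework, not the paper's own formalism; the state and projectors below are invented for illustration:

```python
# Two yes/no questions modeled as 2x2 projectors on a belief state.
# Question A projects onto the basis state (1, 0); question B projects
# onto the superposition (1/sqrt(2), 1/sqrt(2)).

def matvec(m, v):
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

def norm_sq(v):
    return v[0] ** 2 + v[1] ** 2

P_A = [[1.0, 0.0], [0.0, 0.0]]   # projector for question A
P_B = [[0.5, 0.5], [0.5, 0.5]]   # projector for question B

psi = [1.0, 0.0]                  # initial belief state

# Probability of answering "yes" to both questions, in each order:
p_ab = norm_sq(matvec(P_B, matvec(P_A, psi)))  # A first, then B
p_ba = norm_sq(matvec(P_A, matvec(P_B, psi)))  # B first, then A

print(p_ab)  # 0.5
print(p_ba)  # 0.25 -- the ordering of the questions changes the answer
```

Since P_A and P_B do not commute, the two orderings give different joint answer probabilities (0.5 versus 0.25 here), which is exactly the kind of question-order effect documented in survey research.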
Non-therapeutic circumcision violates boys’ right to bodily integrity as well as to self-determination. There is neither any verifiable medical advantage connected with the intervention nor is it painless nor without significant risks. Possible negative consequences for the psychosexual development of circumcised boys (due to substantial loss of highly erogenous tissue) have not yet been sufficiently explored, but appear to ensue in a significant number of cases. According to standard legal criteria, these considerations would normally entail that the operation be deemed an ‘impermissible risk’—neither justifiable on grounds of parental rights nor of religious liberty: as with any other freedom right, these end where another person's body begins. Nevertheless, after a resounding decision by a Cologne district court that non-therapeutic circumcision constitutes bodily assault, the German legislature responded by enacting a new statute expressly designed to permit male circumcision even outside of medical settings. We first criticise the normative foundations upon which such a legal concession seems to rest, and then analyse two major flaws in the new German law which we consider emblematic of the difficulty that any legal attempt to protect medically irrelevant genital cutting is bound to face.
We give a survey of truth theories for applicative theories. It comprises Frege structures, universes for Frege structures, and a theory of supervaluation. We present the proof-theoretic results for these theories and show their syntactical expressive power. In particular, we present as a novelty a syntactical interpretation of ID1 in an applicative truth theory based on supervaluation.