In this paper, I show that the availability of what some authors have called the weak reading and the strong reading of donkey sentences with relative clauses is systematically related to monotonicity properties of the determiner. The correlation is different from what has been observed in the literature in that it concerns not only right monotonicity, but also left monotonicity (persistence/antipersistence). I claim that the reading selected by a donkey sentence with a double monotone determiner is in fact the one that validates inference based on the left monotonicity of the determiner. This accounts for the lack of a strong reading in donkey sentences with MON determiners, which have been neglected in the literature. I consider the relevance of other natural forms of inference as well, but also suggest how monotonicity inference might play a central role in the actual process of interpretation. The formal theory is couched in dynamic predicate logic with generalized quantifiers.
Some formal properties of enriched systems of Lambek calculus with analogues of conjunction and disjunction are investigated. In particular, it is proved that the class of languages recognizable by the Lambek calculus with added intersective conjunction properly includes the class of finite intersections of context-free languages.
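The gap between the two classes can be illustrated by the textbook example of an intersection of two context-free languages that is not itself context-free (a generic illustration, not an example from the paper):

```python
import re

# Two context-free languages over {a, b, c}:
#   L1 = { a^n b^n c^m : n, m >= 0 }  (matching a's and b's)
#   L2 = { a^m b^n c^n : n, m >= 0 }  (matching b's and c's)
# Their intersection { a^n b^n c^n : n >= 0 } is the standard example of
# a non-context-free language, so the class of finite intersections of
# context-free languages properly extends the context-free languages.

def parse_abc(w):
    """Return (count of a's, b's, c's) if w matches a*b*c*, else None."""
    m = re.fullmatch(r"(a*)(b*)(c*)", w)
    return None if m is None else tuple(len(g) for g in m.groups())

def in_L1(w):
    p = parse_abc(w)
    return p is not None and p[0] == p[1]

def in_L2(w):
    p = parse_abc(w)
    return p is not None and p[1] == p[2]

def in_intersection(w):
    """Membership in L1 ∩ L2, i.e. in { a^n b^n c^n }."""
    return in_L1(w) and in_L2(w)

print(in_intersection("aabbcc"))  # True: a^2 b^2 c^2
print(in_intersection("aabbc"))   # False: c-count differs
```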
By formalizing Berry's paradox, Vopěnka, Chaitin, Boolos and others proved the incompleteness theorems without using the diagonal argument. In this paper, we shall examine these proofs closely and show their relationships. Firstly, we shall show that we can use the diagonal argument for proofs of the incompleteness theorems based on Berry's paradox. Then, we shall show that an extension of Boolos' proof can be considered as a special case of Chaitin's proof by defining a suitable Kolmogorov complexity. We shall also show that Vopěnka's proof can be reformulated in arithmetic by using the arithmetized completeness theorem.
This paper deals with intercultural aspects of privacy, particularly with regard to important differences between Japanese and Western views. This paper is based on our discussions with Rafael Capurro – a dialogue now represented by two separate but closely interrelated articles. The companion paper is broadly focused on the cultural and historical backgrounds of the concepts of privacy and individualism in “Western” worlds; our main theme focuses on different concepts of privacy in Japan and their sources in related aspects of Japanese culture. The interrelationship between our two papers is apparent in our taking up identical or similar topics in each paper. Reading our two papers in conjunction with each other will bring about deeper and broader insights into the diverse values and worldviews of Japan and Western cultures that underlie concepts of privacy that at a surface level appear to be similar.
We give a proof of Gödel's first incompleteness theorem based on Berry's paradox, and from it we also derive the second incompleteness theorem model-theoretically.
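The shape of a Berry-style incompleteness argument can be sketched schematically as follows (a generic reconstruction with assumed notation D and Len, not the paper's exact formalization):

```latex
% Write D(e,x) for ``the formula coded by e defines the number x''
% and Len(e) for the length of that formula. The Berry predicate
\[
B(k,x) \;:\equiv\;
\neg\exists e\,\bigl(\mathrm{Len}(e)<k \wedge D(e,x)\bigr)
\;\wedge\;
\forall y<x\;\exists e\,\bigl(\mathrm{Len}(e)<k \wedge D(e,y)\bigr)
\]
% says that x is the least number not definable by a formula of length
% below k. Since B(k,x) itself has length O(log k), choosing k large
% enough makes the least such number definable by a short formula after
% all; extracting this contradiction from provability assumptions yields
% a true but unprovable sentence.
```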
We define a liar-type paradox as a consistent proposition in propositional modal logic which is obtained by attaching boxes to several subformulas of an inconsistent proposition in classical propositional logic, and show that several famous paradoxes are liar-type. Then we show that we can generate a liar-type paradox from any inconsistent proposition in classical propositional logic and that undecidable sentences in arithmetic can be obtained from the existence of a liar-type paradox. We extend these results to predicate logic and discuss Yablo’s Paradox in this framework. Furthermore, we define explicit and implicit self-reference in paradoxes in the incompleteness phenomena.
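As a minimal illustration of the definition (my own example, not necessarily one used in the paper), boxing one subformula of the inconsistent liar biconditional yields a consistent modal formula:

```latex
\[
p \leftrightarrow \neg p \quad \text{(inconsistent in classical logic)}
\qquad\leadsto\qquad
p \leftrightarrow \neg\Box p \quad \text{(consistent in the modal logic } \mathbf{K}\text{)}
\]
% Consistency: in a Kripke model with a single world having no
% successors, \Box p holds vacuously and p is false, so both sides of
% p \leftrightarrow \neg\Box p are false and the biconditional is true.
% Reading \Box as provability, this is the G\"odel-sentence pattern.
```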
Within a weak subsystem of second-order arithmetic that is conservative over a weaker base system, we reformulate Kreisel's proof of the Second Incompleteness Theorem and Boolos' proof of the First Incompleteness Theorem.
This article examines the practice of "co-participant completion" in Japanese conversation, and explores what kinds of resources are mobilized to provide the opportunity to complete another participant's utterance-in-progress. It suggests the following observations as potential characteristics of Japanese co-participant completion: (i) Syntactically-defined two-part formats (e.g. [If X] + [then Y]) may not play as prominent a role as in English; (ii) The majority of cases of co-participant completion take the form of 'terminal item completion'; (iii) Locally emergent structures like 'contrast' and 'list' as well as 'unprojected' features of turn construction often play an important role in enhancing the opportunity for completing another participant's utterance-in-progress. The article then discusses the implications of these findings for the investigation of the mutual bearing of grammar and social interaction. In particular, the discussion focuses on what we can learn from the practice of co-participant completion about how projection of turn-shapes is accomplished in Japanese conversation.
Every language recognized by the Lambek calculus with brackets is context-free. This is shown by combining an observation by Jäger with an entirely straightforward adaptation of the method Pentus used for the original Lambek calculus. The case of the variant of the calculus allowing sequents with empty antecedents is slightly more complicated, requiring a restricted use of the multiplicative unit.
We systematically study several principles and give a principle which is weaker than the disjunctive Markov's principle MP∨. We also show that this principle is underivable and strictly weaker than MP∨ in certain extensions of the system EL of elementary analysis.
It is proved that for any k, the class of classical categorial grammars that assign at most k types to each symbol in the alphabet is learnable, in the Gold (1967) sense of identification in the limit from positive data. The proof crucially relies on the fact that the concept known as finite elasticity in the inductive inference literature is preserved under the inverse image of a finite-valued relation. The learning algorithm presented here incorporates Buszkowski and Penn's (1990) algorithm for determining categorial grammars from input consisting of functor-argument structures.
A minimal formula is a formula which is minimal in provable formulas with respect to the substitution relation. This paper shows the following: (1) A β-normal proof of a minimal formula of depth 2 is unique in NJ. (2) There exists a minimal formula of depth 3 whose βη-normal proof is not unique in NJ. (3) There exists a minimal formula of depth 3 whose βη-normal proof is not unique in NK.
The issue of justice after catastrophe is an enormous challenge to contemporary theories of distributive justice. In the past three decades, the controversy over distributive justice has centered on the ideal of equality. One of the intensely debated issues concerns what is often called the “equality of what” question, on which there are three primary views: welfarism, resourcism, and the capabilities approach. Another major point of dispute can be termed the “equality or another” question, over which three positions contend: egalitarianism, prioritarianism, and sufficientarianism. On these topics of distributive justice, authors are concerned with the current difference between the better-off and the worse-off or the present situation of the badly-off. By contrast, it is essential to take account of the past distribution of well-being as well as the present situation in order to explore questions of post-catastrophe justice. Without looking at the pre-disaster distribution of income, preference satisfaction, or basic capabilities among affected people, no present assessment of the damage caused by the disaster could be correct and no proposed remedy adequate. It is true that luck egalitarians assess the current distribution among people by referring to the decisions that each individual made. Yet they pay scant attention to each person's past situation. Therefore, we can legitimately say that most theorists of distributive justice, including luck egalitarians, have failed to give consideration to the past state of each person.

To fill this gap in the literature, the present article explores philosophical questions that arise when we take account of each person's past and present situations in discussing distributive justice regarding public compensation and assistance to survivors and families of victims of natural and industrial disasters.
In addressing these novel questions, I develop and refine various concepts, ideas, and arguments that have been presented in the study of distributive justice in normal settings. I tackle two tasks, the first of which is to explore the foundation and scope of luck egalitarianism. Despite the moral appeal it has in many cases, luck egalitarianism has attracted the so-called harshness objection. Some luck egalitarians attempt to avoid this objection in a pragmatic way by combining the luck egalitarian doctrine with the principle of basic needs satisfaction. However, they do not provide any systematic rationale for this combination. In contrast with such pragmatic responses, I seek to offer a principled argument for holding individuals responsible for their choices only when their basic needs are met, by invoking the ideas of respect for human voluntariness and rescue of human vulnerability. Based on this argument, I propose a form of responsibility-sensitive theory, which takes the pre-disaster distribution of well-being as a default position. The second task I take on is to refine sufficientarianism in the context of post-catastrophe justice. Luck egalitarianism with boundaries set by the basic needs principle seems to point toward sufficientarianism. But major proponents of this view embrace the welfarist assumption, a considerably high standard of well-being, and a controversial treatment of persons below the threshold, all of which seem problematic in the post-disaster situation. I try to construct a new version of sufficientarianism by replacing these features with more robust ones.
We investigate stationarity of types over models in simple theories. In particular, we show that in simple theories with finite SU-rank, any complete type over a model having Cantor-Bendixson rank is stationary.
In this collection on the Kyoto School of Philosophy, the author offers the reader Tanabe's religious philosophy, but also, and for the first time, his ...
I present a new syntactical method for proving the Interpolation Theorem for the implicational fragment of intuitionistic logic and its substructural subsystems. This method, like Prawitz’s, works on natural deductions rather than sequent derivations, and, unlike existing methods, always finds a ‘strongest’ interpolant under a certain restricted but reasonable notion of what counts as an ‘interpolant’.
This paper shows that the inhabitation problem in the lambda calculus with negation, product, polymorphic, and existential types is decidable, where the inhabitation problem asks whether there exists some term that belongs to a given type. In order to do that, this paper proves the decidability of provability in the logical system obtained from second-order natural deduction by removing implication and disjunction. This is proved by showing a quantifier elimination theorem and reducing the problem to provability in propositional logic. Magic formulas, which take the place of the quantifiers, are used for the quantifier elimination. As a byproduct, this paper also shows the second-order witness theorem, which states that a quantifier followed by negation can be replaced by a witness obtained only from the formula. As a corollary of the main results, this paper also shows Glivenko's theorem, Double Negation Shift, and conservativity for antecedent-empty sequents between the logical system and its classical version.
A simple and complete proof of strong normalization for first- and second-order intuitionistic natural deduction including disjunction, first-order existence and permutative conversions is given. The paper follows the Tait–Girard approach via computability predicates and saturated sets. Strong normalization is first established for a set of conversions of a new kind, then deduced for the standard conversions. Difficulties arising for disjunction are resolved using a new logic where disjunction is restricted to atomic formulas.
The article illustrates morphosyntactic characteristics of Thai, an isolating language, in contrast to the modern European languages. Thai is characterized as a topic-prominent language, where the voluntary–spontaneous contrast, rather than the transitive–intransitive one, plays a significant role in forming basic sentence constructions. By assuming non-hierarchical serial verb constructions as its basic sentence structures, the author claims that the modern hierarchical view of language structure is not appropriate for Thai. In Thai, verbs are serialized to denote not only successive actions or an action and its objective, but also a cause and its result, or an action and its evaluation. Furthermore, causative and passive constructions are analyzed as part of verb serializations which are structurally identical, but antiparallel to each other in the direction of affectedness.
We consider a set-theoretic version of mereology based on the inclusion relation ⊆ and analyze how well it might serve as a foundation of mathematics. After establishing the non-definability of ∈ from ⊆, we identify the natural axioms for ⊆-based mereology, which constitute a finitely axiomatizable, complete, decidable theory. Ultimately, for these reasons, we conclude that this form of set-theoretic mereology cannot by itself serve as a foundation of mathematics. Meanwhile, augmented forms of set-theoretic mereology, such as that obtained by adding the singleton operator, are foundationally robust.
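The non-definability of ∈ from ⊆ can be illustrated by a standard automorphism argument (a sketch along general lines; the paper's own proof may differ in detail):

```latex
% Let \pi be any permutation of the universe swapping \emptyset and
% \{\emptyset\}, and define the pointwise image map
\[
a(X) \;=\; \pi[X] \;=\; \{\, \pi(x) : x \in X \,\}.
\]
% Then a is an automorphism of (V, \subseteq), since X \subseteq Y iff
% a(X) \subseteq a(Y). But \emptyset \in \{\emptyset\}, while
% a(\emptyset) = \emptyset \notin \{\{\emptyset\}\} = a(\{\emptyset\}).
% So \in is not invariant under all \subseteq-automorphisms and hence
% is not definable from \subseteq alone.
```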
This paper points out an error in Parigot's proof of strong normalization of second-order classical natural deduction by the CPS-translation, discusses the erasing-continuation of the CPS-translation, and corrects that proof by using the notion of augmentations.
This paper presents a new algorithm to find an appropriate similarity under which we apply legal rules analogically. Since there may exist a lot of similarities between the premises of a rule and the case under inquiry, we have to select an appropriate similarity that is relevant to both the legal rule and a top goal of our legal reasoning. For this purpose, a new criterion to distinguish the appropriate similarities from the others is proposed and tested. The criterion is based on Goal-Dependent Abstraction (GDA), which selects a similarity such that an abstraction based on the similarity never loses the information necessary to prove the ground (purpose of legislation) of the legal rule. In order to cope with our huge space of similarities, our GDA algorithm uses some constraints to prune useless similarities.
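The pruning criterion might be sketched as follows. This is a heavily simplified, hypothetical reading of goal-dependent abstraction; the concept hierarchy, the function names, and the representation of similarities as fact/premise pairs are all my own invented illustration, not the paper's formalism:

```python
# Toy concept hierarchy: child -> parent (all names are invented).
HIERARCHY = {
    "taxi": "vehicle", "truck": "vehicle",
    "vehicle": "movable", "bicycle": "movable",
}

def ancestors(c):
    """All superconcepts of c, from most specific up, including c."""
    out = [c]
    while c in HIERARCHY:
        c = HIERARCHY[c]
        out.append(c)
    return out

def least_common_abstraction(a, b):
    """Most specific shared superconcept of a and b, or None."""
    for x in ancestors(a):
        if x in ancestors(b):
            return x
    return None

def appropriate_similarities(case_facts, rule_premises, goal_concepts):
    """A similarity pairs a case fact with a rule premise; it is kept
    only if abstracting both to their common superconcept does not lose
    the information needed for the goal (modeled here as the constraint
    that the abstraction is itself one of the goal-relevant concepts)."""
    result = []
    for f in case_facts:
        for p in rule_premises:
            lca = least_common_abstraction(f, p)
            # Prune: abstraction must not climb past the goal concepts.
            if lca is not None and lca in goal_concepts:
                result.append((f, p, lca))
    return result

print(appropriate_similarities(
    ["taxi", "bicycle"], ["truck"], goal_concepts={"vehicle"}))
# Keeps (taxi, truck, vehicle); (bicycle, truck) abstracts to
# "movable", which over-generalizes the goal and is pruned.
```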
This paper proves the strong normalization of classical natural deduction with disjunction and permutative conversions, by using CPS-translation and augmentations. Using them, this paper also proves the strong normalization of classical natural deduction with general elimination rules for implication and conjunction, and their permutative conversions. This paper also proves that natural deduction can be embedded into natural deduction with general elimination rules, strictly preserving proof normalization.
The video-recording of police interrogations of suspects has become widespread in criminal justice systems and is routinely regarded by legal professionals and lay people alike as a means of protecting the rights of suspects and reducing the likelihood of coerced or false confessions. This study, based on evidence from cases in Japan and experiments conducted in Tokyo, and reinforced by studies from elsewhere, finds that the way visual images of suspects and their narratives are depicted on film can, on the contrary, be misleading. Not only lay participants in trials, but also legal professionals, may be misled into accepting unreliable confessions. Indeed, the very power of visual images to convince viewers calls for great caution in their use. Possible solutions include the use of independent expert witnesses to evaluate the reliability of visual recordings and the restriction of taped evidence to audio tracks.