A chief aim of the science of consciousness is to discover general principles that determine exactly which states of phenomenal consciousness occur in exactly which conditions. In this paper I argue that making progress towards the discovery of such principles requires developing a new regimented language for describing phenomenal states. This language should allow us to describe phenomenal states in a way that is commensurable with our descriptions of physical states. I suggest one way of doing this. My approach extends and sharpens the language used in the scientific literature to describe phenomenal states. The end result is a representational language of consciousness without the metaphysical baggage of a representational theory of consciousness.
There is a long-standing debate over whether propositions, sentences, statements or utterances provide an answer to the question of what objects logical formulas stand for. Based on the traditional understanding of logic as a science of valid arguments, this question is first framed more exactly, making explicit that it calls not only for identifying some class of objects, but also for explaining their relationship to ordinary language utterances. It is then argued that there are strong arguments against the proposals commonly put forward in the debate. The core of the problem is that an informative account of the objects formulas stand for presupposes a theory of formalization; that is, a theory that explains what formulas may adequately substitute for an inference in proofs of validity. Although such theories are still subject to research, some consequences can be drawn from an analysis of the reasons why the common accounts featuring sentences, propositions or utterances fail. Theories of formalization cannot refer to utterances qua expressions of propositions; instead they may refer to sentences and rely on additional information about linguistic structure and pragmatic context.
Computation and formalization are not modalities of pure abstractive operations. The essay tries to revise the assumption of the constitutive nonsensuality of the formal. The argument is that formalization is a kind of linear spatialization, which has significant visual dimensions. Thus, a connection can be discovered between visualization by figurative graphism and formalization by symbolic calculations: Both use spatial relations not only to represent but also to operate on epistemic, nonspatial, nonvisual entities. Descartes was one of the pioneers of using this kind of two-dimensional spatiality as a cognitive instrument.
This article identifies problems with regard to providing criteria that regulate the matching of logical formulae and natural language. We then undertake to solve these problems by defining a necessary and sufficient criterion of adequate formalization. On the basis of this criterion we argue that logic should not be seen as an ars iudicandi capable of evaluating the validity or invalidity of informal arguments, but as an ars explicandi that renders transparent the formal structure of informal reasoning.
Proof, Logic and Formalization addresses the various problems associated with finding a philosophically satisfying account of mathematical proof. It brings together many of the most notable figures currently writing on this issue in an attempt to explain why it is that mathematical proof is given prominence over other forms of mathematical justification. The difficulties that arise in accounts of proof range from the rightful role of logical inference and formalization to questions concerning the place of experience in proof and the possibility of eliminating impredicative reasoning from proof. Students and lecturers of philosophy, philosophy of logic, and philosophy of mathematics will find this to be essential reading. A companion volume entitled Proof and Logic in Mathematics is also available from Routledge.
In this paper, we provide a logical formalization of the emotion triggering process and of its relationship with mental attitudes, as described in Ortony, Clore, and Collins’s theory. We argue that modal logics are particularly well suited to represent agents’ mental attitudes and to reason about them, and use a specific modal logic that we call Logic of Emotions in order to provide logical definitions of all but two of their 22 emotions. While these definitions may be subject to debate, we show that they allow us to reason about emotions and to draw interesting conclusions from the theory.
The advantages and disadvantages of formalization in philosophy are summarized. It is concluded that formalized philosophy is an endangered speciality that needs to be revitalized and to increase its interactions with non-formalized philosophy. The enigmatic style that is common in philosophical logic must give way to explicit discussions of the problematic relationship between formal models and the philosophical concepts and issues that motivated their development.
I attempt to describe the delicate role of the standard model of arithmetic for the syntax of formal systems. I try to assess whether the possible instability in the notion of finiteness deriving from the nonstandard interpretability of arithmetic affects the very notions of syntactic metatheory and of formal system. I maintain that the crucial point of the whole question lies in the evaluation of the phenomenon of formalization. The ideas of Skolem, Zermelo, Beth and Carnap (among others) on the problem are discussed. ‘A tries to explain to B the meaning of negation. Finally A gives up, saying: “You don’t understand what I mean, and I am not going to explain any longer,” to which B replies: “Yes, I see what you mean, and I am glad you are willing to continue your explanations”’. G. Mannoury, reported by E. W. Beth (Beth, 1963, 489).
The most difficult problem that Leśniewski came across in constructing his system of the foundations of mathematics was the problem of defining definitions, as he used to put it. He solved it to his satisfaction only when he had completed the formalization of his protothetic and ontology. By formalization of a deductive system one ought to understand in this context the statement, as precise and unambiguous as possible, of the conditions an expression has to satisfy if it is added to the system as a new thesis. Now, some protothetical theses, and some ontological ones, included in the respective systems, happen to be definitions. In the present essay I employ Leśniewski's method of terminological explanations for the purpose of formalizing Łukasiewicz's system of implicational calculus of propositions, which system, without having recourse to quantification, I first extended some time ago into a functionally complete system. This I achieved by allowing for a rule of implicational definitions, which enabled me to define any proposition-forming functor for any finite number of propositional arguments.
In the paper we build up the ontology of Leśniewski’s type for formalizing synthetic propositions. We claim that for these propositions an unconventional square of opposition holds, where a, i are contrary, a, o (resp. e, i) are contradictory, e, o are subcontrary, a, e (resp. i, o) are said to stand in the subalternation. Further, we construct a non-Archimedean extension of Boolean algebra and show that in this algebra just two squares of opposition are formalized: conventional and the square that we invented. As a result, we can claim that there are only two basic squares of opposition. All basic constructions of the paper (the new square of opposition, the formalization of synthetic propositions within ontology of Leśniewski’s type, the non-Archimedean explanation of square of opposition) are introduced for the first time.
I describe in this paper some of my efforts in developing formal theories of social processes. These include work on models of occupational mobility, on models to describe the emergence of expectations out of performance evaluations, and on the graph theory formulation of the Status Characteristics theory. Not all models have been equally significant in developing theory. However, the graph theory formulation has played a central role in the growth of the Expectation States program. It has been involved in the generalization of theories, the integration of theories, and in the construction of highly sensitive tests of theories that would be impossible without the inferential capacities of formalization.
The aim of this paper is to propose a criterion of finite detachment-substitutional formalization for normal modal systems. The criterion will comprise only those normal modal systems which are finitely axiomatizable by means of the substitution, detachment for material implication and Gödel rules.
This article applies two new criteria, desirability and faithfulness, to evaluate Peli et al.'s (1994) formalization of Hannan and Freeman's structural inertia argument (1984, 1989). We conclude that this formalization fails to meet these criteria. We argue that part of the rational reconstruction on which this formalization builds does not reflect the substantive argument well in translating the natural language theory into logic. We propose two alternative formalizations that meet both of these criteria. Moreover, both derive the inertia theorem from much weaker, and thus much less constraining, premises. While both new formalizations draw information only from the original statement of the inertia theory, they reflect two different interpretations of inertia accumulation. The two new formalizations are compatible with some recent theory extensions in organizational ecology. However, they lead to substantially different consequences when additional sociological considerations are added to their premise sets. The interplay between logical formalization and sociological content is highlighted using the example of Stinchcombe's (1965) liability-of-newness theorem. Even modest extensions of the proposed models lead to contrary implications about the age dependence in organizational mortality rates. Even "faithful" logical formalizations of arguments ordinarily involve implicit theory building.
After several decades during which formalization has flourished it now becomes possible to detect its shortcomings. A definition of formalization is given at the outset. It is next shown that the main justification of formalization as making explicit the form of a proof has serious difficulties. An important shortcoming is found in the fact that many validation procedures in logic and mathematics are not adequately represented deductively. Several such procedures relating to the validation of logical and mathematical sentences are examined. It is concluded that formalization is materially inadequate.
This paper discusses six formalization techniques, of varying strengths, for extending a formal system based on traditional mathematical logic. The purpose of these formalization techniques is to simulate the introduction of new syntactic constructs, along with associated semantics for them. We show that certain techniques (among the six) subsume others. To illustrate sharpness, we also consider a selection of constructs and show which techniques can and cannot be used to introduce them. The six studied techniques were selected on the basis of actual practice in logic and computing. They do not form an exhaustive list.
Artificial Intelligence (AI) has long dealt with the issue of finding a suitable formalization for commonsense reasoning. Defeasible argumentation has proven to be a successful approach in many respects, proving to be a confluence point for many alternative logical frameworks. Different formalisms have been developed, most of them sharing the common notions of argument and warrant. In defeasible argumentation, an argument is a tentative (defeasible) proof for reaching a conclusion. An argument is warranted when it ultimately prevails over other conflicting arguments. In this context, defeasible consequence relationships for modelling argument and warrant as well as their logical properties have gained particular attention. This article analyzes two non-monotonic inference operators Carg and Cwar intended for modelling argument construction and dialectical analysis (warrant), respectively. As a basis for such analysis we will use the LDSar framework, a unifying approach to computational models of argument using Labelled Deductive Systems (LDS). In the context of this logical framework, we show how labels can be used to represent arguments as well as argument trees, facilitating the definition and study of non-monotonic inference operators, whose associated logical properties are studied and contrasted. We contend that this analysis provides useful comparison criteria that can be extended and applied to other argumentation frameworks.
The problems we deal with concern reasoning about incomplete knowledge. Knowledge is understood as the ability of an ideal rational agent to make decisions about pieces of information. The formalisms we are particularly interested in are Moore's autoepistemic logic (AEL) and its variant, the logic of acceptance and rejection (AEL2). It is well-known that AEL may be seen as the nonmonotonic KD45 modal logic. The aim is to give an appropriate modal formalization for AEL2.
Whether human thinking can be formalized and whether machines can think in a human sense are questions that have been addressed by both Peirce and Searle. Peirce came to roughly the same conclusion as Searle, that the digital computer would not be able to perform human thinking or possess human understanding. However, his rationale and Searle's differ on several important points. Searle approaches the problem from the standpoint of traditional analytic philosophy, where the strict separation of syntax and semantics renders understanding impossible for a purely syntactical device. Peirce disagreed with that analysis, but argued that the computer would only be able to achieve algorithmic thinking, which he considered the simplest type. Although their approaches were radically dissimilar, their conclusions were not. I will compare and analyze the arguments of both Peirce and Searle on this issue, and outline some implications of their conclusions for the field of Artificial Intelligence.
According to a prevalent view among philosophers formal logic is the philosopher’s main tool to assess the validity of arguments, i.e. the philosopher’s ars iudicandi. By drawing on a famous dispute between Russell and Strawson over the validity of a certain kind of argument – of arguments whose premises feature definite descriptions – this paper casts doubt on the accuracy of the ars iudicandi conception. Rather than settling the question whether the contentious arguments are valid or not, Russell and Strawson, upon discussing the proper logical analysis of definite descriptions, merely contrast converse informal validity assessments rendered explicit by nonequivalent logical formalizations.
A critique is given of the attempt by Hettema and Kuipers to formalize the periodic table. In particular I dispute their notions of identifying a naïve periodic table with tables having a constant periodicity of eight elements and their views on the different conceptions of the atom by chemists and physicists. The views of Hettema and Kuipers on the reduction of the periodic system to atomic physics are also considered critically.
In this paper we analyze Strawson's notion of presupposition proposed in his book Introduction to Logical Theory. The Strawsonian notion of presupposition is dependent on the notion of logical entailment. We make use of the theory of logical consequence operation as a general framework to show that it is impossible to find a logical consequence operation which mirrors the philosophical intuitions behind Strawson's notion of presupposition. The aim of this paper is to present in detail the philosophical background of the formal analysis presented in the author's paper "Strawsonian presuppositions and logical entailment".
The presentation of the formal conception of noemata is the main aim of the article. In the first section, three informal approaches to noemata are discussed. The goal of this section is to specify the main controversies, and their sources, concerning different ways of understanding noemata. In the second section, basic assumptions determining the proposed way of understanding noemata are presented. The third section is devoted to the formal set-theoretic construction needed for the formal comprehension of noemata. In the fourth section, definitions of noemata and their various kinds, as well as definitions of other phenomenological notions, are formulated. In the last section, possibilities of further developing the proposed formal conception are indicated.
Three things are presented: How Hilbert changed the original construction postulates of his geometry into existential axioms; In what sense he formalized geometry; How elementary geometry is formalized to present day's standards.
It has been accepted since the early part of the century that there is no problem formalizing mathematics in standard formal systems of axiomatic set theory. Most people feel that they know as much as they ever want to know about how one can reduce natural numbers, integers, rationals, reals, and complex numbers to sets, and prove all of their basic properties. Furthermore, that this can continue through more and more complicated material, and that there is never a real problem.
The paper introduces and formally defines a functional concept of a measuring system, on this basis characterizing the measurement as an evaluation performed by means of a calibrated measuring system. The distinction between exact and uncertain measurement is formalized in terms of the properties of the traceability chain joining the measuring system to the primary standard. The consequence is drawn that uncertain measurements lose the property of relation-preservation, on which the very concept of measurement is founded according to the representational viewpoint. Finally, from the analysis of the inter-relations between calibration and measurement, the fundamental reasons for the claimed objectivity and intersubjectivity of measurement are highlighted, a valuable epistemological result to characterize measurement as a particular kind of evaluation.
Formal definitions of the following concepts of animal ecology are given: environment, niche, locality, local population, natural population, community, ecosystem. Five primitive (undefined) notions are used including "animal", "offspring" and "habitat", the latter in the sense of Charles Elton. The defining equations for the environment of one animal are first given, then niche (in the Elton sense) is formally defined in terms of the environment. The fifth primitive notion "habitat" is then introduced in order to define the remaining concepts.
The paper studies two formal schemes related to ω-completeness. Let S be a suitable formal theory containing primitive recursive arithmetic and let T be a formal extension of S. Denoted by (a), (b) and (c), respectively, are the following three propositions (where φ(x) is a formula with the only free variable x): (a) for any n, ⊢_T φ(n); (b) ⊢_T ∀x Pr_T(⌜φ(x)⌝); and (c) ⊢_T ∀x φ(x) (the notational conventions are those of Smoryński). The aim of this paper is to examine the meaning of the schemes which result from the formalizations, over the base theory S, of the implications (b) → (c) and (a) → (b), where φ ranges over all formulae. The analysis yields two results over S: 1. the schema corresponding to (b) → (c) is equivalent to ¬Cons_T, and 2. the schema corresponding to (a) → (b) is not consistent with 1-CON_T. The former result follows from a simple adaptation of the ω-incompleteness proof; the second is new and is based on a particular application of the diagonalization lemma.
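As a reading aid, the three propositions can be set out in display form. This is a reconstruction following Smoryński's conventions (φ(x) has x as its only free variable; the dot marks the numeralwise substituted variable inside the provability predicate):

```latex
\begin{align*}
\text{(a)}\quad & \vdash_T \varphi(\overline{n}) \quad \text{for every natural number } n,\\
\text{(b)}\quad & \vdash_T \forall x\, \mathrm{Pr}_T\!\left(\ulcorner \varphi(\dot{x}) \urcorner\right),\\
\text{(c)}\quad & \vdash_T \forall x\, \varphi(x).
\end{align*}
```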
Timing diagrams are popular in hardware design. They have been formalized for use in reasoning tasks, such as computer-aided verification. These efforts have largely treated timing diagrams as interfaces to established notations for which verification is decidable; this has restricted timing diagrams to expressing only regular language properties. This paper presents a timing diagram logic capable of expressing certain context-free and context-sensitive properties. It shows that verification is decidable for properties expressible in this logic. More specifically, it shows that containment of ω-regular languages generated by Büchi automata in timing diagram languages is decidable. The result relies on a correlation between timing diagram and reversal-bounded counter machine languages.
On September 6, 2004, using the Isabelle proof assistant, I verified the following statement: (%x. pi x * ln (real x) / (real x)) ----> 1 The system thereby confirmed that the prime number theorem is a consequence of the axioms of higher-order logic together with an axiom asserting the existence of an infinite set. All told, our number theory session, including the proof of the prime number theorem and supporting libraries, constitutes 673 pages of proof scripts, or roughly 30,000 lines. This count includes about 65 pages of elementary number theory that we had at the outset, developed by Larry Paulson and others; also about 50 pages devoted to a proof of the law of quadratic reciprocity and properties of Euler’s φ function, neither of which are used in the proof of the prime number theorem. The page count does not include the basic HOL library, or properties of the real numbers that we obtained from the HOL-Complex library.
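In conventional mathematical notation, the Isabelle statement above expresses the prime number theorem, with π(x) the number of primes not exceeding x:

```latex
\lim_{x \to \infty} \frac{\pi(x)\,\ln x}{x} = 1
```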
This paper explores some of the constructive dimensions and specifics of human theoretic cognition, combining perspectives from (Husserlian) genetic phenomenology and distributed cognition approaches. I further consult recent psychological research concerning spatial and numerical cognition. The focus is on the nexus between the theoretic development of abstract, idealized geometrical and mathematical notions of space and the development and effective use of environmental cognitive support systems. In my discussion, I show that the evolution of the theoretic cognition of space apparently follows two opposing, but in truth, intrinsically aligned trajectories. On the epistemic plane, which is the main focus of Husserl’s genetic phenomenological investigations, theoretic conceptions of space are progressively constituted by way of an idealizing emancipation of spatial cognition from the concrete, embodied intentionality underlying the human organism’s perception of space. As a result of this emancipation, it ultimately becomes possible for the human mind to theoretically conceive of and posit space as an ideal entity that is universally geometrical and mathematical. At the same time, by synthesizing a range of literature on spatial and mathematical cognition, I illustrate that for the theoretic mind to undertake precisely this emancipating process successfully, and further, for an ideal and objective notion of geometrical and mathematical space to first of all become fully scientifically operative, the cognitive support provided by a range of specific symbolic technologies is central. These include lettered diagrams, notation systems, and more generally, the technique of formalization and require for their functioning various cognitively efficacious types of embodiment.
Ultimately, this paper endeavors to understand the specific symbolic-technological dimensions that have been instrumental to major shifts in the development of idealized, scientific conceptions of space. The epistemic characteristics of these shifts have been previously discussed in genetic phenomenology, but without devoting sufficient attention to the constructive role of symbolic technologies. At the same time, this paper identifies some of the irreducible phenomenological and epistemic dimensions that characterize the functioning of the historically situated, embodied and distributed theoretic mind. (shrink)
On an ordinary view of the relation of philosophy of science to science, science serves only as a topic for philosophical reflection, reflection that proceeds by its own methods and according to its own standards. This ordinary view suggests a way of writing a global history of philosophy of science that finds substantially the same philosophical projects being pursued across widely divergent scientific eras. While not denying that this view is of some use regarding certain themes and particular time periods, this essay argues that much of the epistemology and philosophy of science in the early twentieth century in a variety of projects (neo-Kantianism, logical empiricism, pragmatism, phenomenology) looked to the then current context of the exact sciences, especially geometry and physics, not merely for its topics but also for its conceptual resources and technical tools. This suggests a more variable project of philosophy of science, a deeper connection between early twentieth-century philosophy of science and its contemporary science, and a more interesting and richer history of philosophy of science than is ordinarily offered.
N. G. de Bruijn, now professor emeritus of the Eindhoven University of Technology, was a pioneer in the field of interactive theorem proving. From 1967 to the end of the 1970s, his work on the Automath system introduced the architecture that is common to most of today’s proof assistants, and much of the basic technology. But de Bruijn was a mathematician first and foremost, as evidenced by the many mathematical notions and results that bear his name, among them de Bruijn sequences, de Bruijn graphs, the de Bruijn-Newman constant, and the de Bruijn-Erdős theorem.
The paper considers the legal tools that have been developed in German pharmaceutical regulation as a result of the precautionary attitude inaugurated by the Contergan decision (1970). These tools are (i) the notion of “well-founded suspicion”, which attenuates the requirements for safety intervention by relaxing the requirement of a proved causal connection between danger and source, and the introduction of (ii) the reversal of proof burden in liability norms. The paper focuses on the first and proposes seeing the precautionary principle as an instance of the requirement that one should maximise expected utility. In order to maximise expected utility certain probabilities are required and it is argued that objective Bayesianism offers the most plausible means to determine the optimal decision in cases where evidence supports diverging choices.
The reduction of the lambda calculus to the theory of combinators in [Schönfinkel, 1924] applies to positive implicational logic, i.e. to the typed lambda calculus, where the types are built up from atomic types by means of the operation A → B, to show that the lambda operator can be eliminated in favor of combinators K and S of each type A → (B → A) and (A → (B → C)) → ((A → B) → (A → C)), respectively. I will extend that result to the case in which the types are built up by means of the general function type ∀x : A.B(x) as well as the disjoint union type ∃x : A.B(x), essentially to the theory of [Howard, 1980]. To extend the treatment of → to ∀ we shall need a generalized form of the combinators K and S, and to deal with ∃ we will need to introduce a new form of the combinator S.
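The elimination of the lambda operator in favor of K and S can be illustrated concretely. Below is a minimal sketch in Python (not from the paper; the encoding as curried functions is illustrative only): the combinators are defined once, and the identity function is then obtained without any further lambda abstraction as S K K.

```python
# Schönfinkel's combinators as curried functions.

def K(x):
    # K : A -> (B -> A), the constant combinator: discards its second argument.
    return lambda y: x

def S(f):
    # S : (A -> (B -> C)) -> ((A -> B) -> (A -> C)),
    # the substitution combinator: applies f to x and to (g x).
    return lambda g: (lambda x: f(x)(g(x)))

# The identity combinator is definable lambda-free as I = S K K:
# S K K x = K x (K x) = x.
I = S(K)(K)

print(I(42))        # prints 42
print(K("a")("b"))  # prints a
```

The same bracket-abstraction idea underlies the general result: any typed lambda term can be compiled into an applicative combination of K and S alone.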
At first sight the title “Regimentation” seems to imply nothing more than a description in detail of the changes set forth above; but while in part it brings into view one side of these changes, and suggests their common tendency, it serves a further end. I use it here to express certain wider changes which are their concomitants. For as indicated some pages back, and as shown at length in The Principles of Sociology, in a chapter on “The Militant Type,” that graduated subordination which we see in an army characterizes a militant society at large more and more as militancy increases.
The Monist’s call for papers for this issue ended: “if formalism is true, then it must be possible in principle to mechanize meaning in a conscious thinking and language-using machine; if intentionalism is true, no such project is intelligible”. We use the Grelling-Nelson paradox to show that natural language is indefinitely extensible, which has two important consequences: it cannot be formalized, and model-theoretic semantics, standard for formal languages, is not suitable for it. We also point out that object-object mapping theories of semantics, the usual account for the possibility of non-intentional semantics, do not seem able to account for the indefinitely extensible productivity of natural language.
We propose a formal representation of objects, be they mathematical or empirical objects. The powerful framework inside which we represent them in a unique and coherent way is grounded, on the formal side, in a logical approach with a direct mathematical semantics in the well-established field of constructive topology, and, on the philosophical side, in a neo-Kantian perspective emphasizing the knowing subject’s role, which is constructive for the mathematical objects and constitutive for the empirical ones.
A series of representations must be semantics-driven if the members of that series are to combine into a single thought. Where semantics is not operative, there is at most a series of disjoint representations that add up to nothing true or false, and therefore do not constitute a thought at all. There is necessarily a gulf between simulating thought, on the one hand, and actually thinking, on the other. A related point is that a popular doctrine - the so-called 'computational theory of mind' (CTM) - is based on a confusion. CTM is the view that thought-processes consist in 'computations', where a computation is defined as a 'form-driven' operation on symbols. The expression 'form-driven operation' is ambiguous, and may refer either to syntax-driven operations or to morphology-driven operations. Syntax-driven operations presuppose the existence of operations that are driven by semantic and extra-semantic knowledge. So CTM is false if the terms 'computation' and 'form-driven operation' are taken to refer to syntax-driven operations. Thus, if CTM is to work, those expressions must be taken to refer to morphology-driven operations; and CTM therefore fails, given that an operation must be semantics-driven if it is to qualify as a thought. CTM therefore fails on every disambiguation of the expressions 'formal operation' and 'computation,' and it is therefore false.
It is often claimed that emotions are linked to formal objects. But what are formal objects? What roles do they play? According to some philosophers, formal objects are axiological properties which individuate emotions, make them intelligible and give their correctness conditions. In this paper, I evaluate these claims in order to answer the above questions. I first give reasons to doubt the thesis that formal objects individuate emotions. Second, I distinguish different ways in which emotions are intelligible and argue that philosophers are wrong in claiming that emotions only make sense when they are based on prior sources of axiological information. Third, I investigate how issues of intelligibility connect with the correctness conditions of emotions. I defend a theory according to which emotions do not respond to axiological information, but to non-axiological reasons. According to this theory, we can allocate fundamental roles to the formal objects of emotions while dispensing with the problematic features of other theories.
Much philosophy of logic is shaped, explicitly or implicitly, by the thought that logic is distinctively formal and abstracts from material content. The distinction between formal and material does not appear to coincide with the more familiar contrasts between a priori and empirical, necessary and contingent, analytic and synthetic—indeed, it is often invoked to explain these. Nor, it turns out, can it be explained by appeal to schematic inference patterns, syntactic rules, or grammar. What does it mean, then, to say that logic is distinctively formal?
According to the naive theory of vagueness, the vagueness of an expression consists in the existence of both positive and negative cases of application of the expression and in the non-existence of a sharp cut-off point between them. The sorites paradox shows the naive theory to be inconsistent in most logics proposed for a vague language. The paper explores the prospects of saving the naive theory by revising the logic in a novel way, placing principled restrictions on the transitivity of the consequence relation. A lattice-theoretical framework for a whole family of (zeroth-order) “tolerant logics” is proposed and developed. Particular care is devoted to the relation between the salient features of the formal apparatus and the informal logical and semantic notions they are supposed to model. A suitable non-transitive counterpart to classical logic is defined. Some of its properties are studied, and it is eventually shown how an appropriate regimentation of the naive theory of vagueness is consistent in such a logic.
To ascertain that a formalization of the intuitive notion of a ‘concept’ is linguistically interesting, one has to check whether it allows us to get a grip on distinctions and notions from lexical semantics. Prime candidates are notions like ‘prototype’, ‘stereotypical attribute’, ‘essential attribute versus accidental attribute’, ‘intension versus extension’. We will argue that although the current paradigm of formal concept analysis as an application of lattice theory is not rich enough for an analysis of these notions, a lattice-theoretical approach to concepts is a suitable starting point for formalizing them.
Formal ontology as it is presented in Husserl's Third Logical Investigation can be interpreted as a fundamental tool to describe objects in a formal sense. One of its main sources is presented: chapter five of Carl Stumpf's Über den psychologischen Ursprung der Raumvorstellung (1873); it is then described how Husserlian formal ontology is applied in the Fifth Logical Investigation. Finally, it is applied to dramatic structures, in the spirit of Roman Ingarden.