There is a long-standing debate whether propositions, sentences, statements or utterances provide an answer to the question of what objects logical formulas stand for. Based on the traditional understanding of logic as a science of valid arguments, this question is first framed more precisely, making explicit that it calls not only for identifying some class of objects, but also for explaining their relationship to ordinary language utterances. It is then argued that there are strong arguments against the proposals commonly put forward in the debate. The core of the problem is that an informative account of the objects formulas stand for presupposes a theory of formalization; that is, a theory that explains what formulas may adequately substitute for an inference in proofs of validity. Although such theories are still subject to research, some consequences can be drawn from an analysis of the reasons why the common accounts featuring sentences, propositions or utterances fail. Theories of formalization cannot refer to utterances qua expressions of propositions; instead they may refer to sentences and rely on additional information about linguistic structure and pragmatic context.
Computation and formalization are not modalities of pure abstractive operations. The essay tries to revise the assumption of the constitutive nonsensuality of the formal. The argument is that formalization is a kind of linear spatialization, which has significant visual dimensions. Thus, a connection can be discovered between visualization by figurative graphism and formalization by symbolic calculations: Both use spatial relations not only to represent but also to operate on epistemic, nonspatial, nonvisual entities. Descartes was one of the pioneers of using this kind of two-dimensional spatiality as a cognitive instrument.
Exact sciences are described as sciences whose theories are formalized. These are contrasted to inexact sciences, whose theories are not formalized. Formalization is described as a broader category than mathematization, involving any form/content distinction allowing forms, e.g., as represented in theoretical models, to be studied independently of the empirical content of a subject-matter domain. Exactness is a practice depending on the use of theories to control subject-matter domains and to align theoretical with empirical models and not merely a state of a science. Inexact biological sciences tolerate a degree of “mismatch” between theoretical and empirical models and concepts. Three illustrations from biological sciences are discussed in which formalization is achieved by various means: Mendelism, Weismannism, and Darwinism. Frege’s idea of a “conceptual notation” is used to further characterize the notion of a form/content distinction.
Three common strategies used by informal logicians are considered: (1) the appeal to standard cases, (2) the attempt to partially formalize so-called "informal fallacies," and (3) restatement of arguments in such a way as to make their logical character more perspicuous. All three strategies are found to be useful. Attention is drawn to several advantages of a "stock case" approach, a minimalist approach to formalization is recommended, and doubts are raised about the applicability, from a logical point of view, of a principle of charitable construal in the reconstruction of arguments.
The square of opposition and many other geometrical logical figures have increasingly proven to be applicable to different fields of knowledge. This paper seeks to show how Blanché generalizes the classical theory of oppositions of propositions and extends it to the structure of opposition of concepts. Furthermore, it considers how Blanché restructures the Apuleian square by transforming it into a hexagon. After presenting G. Kalinowski’s formalization of Blanché’s hexagonal theory, an illustration of its applicability to mathematics, to modal logic, and to the logic of norms is depicted. The paper concludes by criticizing Blanché’s claim that his logical hexagon can be considered the objective basis of the structure of the organisation of concepts, and the formal structure of thought in general. It is maintained that within the frame of diagrammatic reasoning Blanché’s hexagon keeps its privileged place as a “nice” and useful tool, but not necessarily as a norm of thought.
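The classical relations that Blanché's hexagon builds on can be checked mechanically. The following sketch (my own illustration, not from the paper) enumerates small models, assuming the traditional existential import of the subject term, and verifies the four relations of the Apuleian square: contradiction, contrariety, subcontrariety, and subalternation.

```python
from itertools import product

# Enumerate small models: a domain of 1 to 3 objects, each of which may or
# may not fall under S and under P. We assume a non-empty extension of S
# (traditional "existential import"), which subalternation requires.
def models(max_size=3):
    for n in range(1, max_size + 1):
        for ext in product([(False, False), (False, True),
                            (True, False), (True, True)], repeat=n):
            if any(s for s, p in ext):  # existential import: S non-empty
                yield ext

def A(m): return all(p for s, p in m if s)      # All S are P
def E(m): return all(not p for s, p in m if s)  # No S is P
def I(m): return any(p for s, p in m if s)      # Some S is P
def O(m): return any(not p for s, p in m if s)  # Some S is not P

ms = list(models())
# Contradictories: A/O and E/I always disagree.
assert all(A(m) != O(m) for m in ms)
assert all(E(m) != I(m) for m in ms)
# Contraries: A and E never true together (both may be false).
assert not any(A(m) and E(m) for m in ms)
# Subcontraries: I and O never false together.
assert not any(not I(m) and not O(m) for m in ms)
# Subalternation: A entails I, E entails O.
assert all(I(m) for m in ms if A(m))
assert all(O(m) for m in ms if E(m))
print("classical square verified on all small models")
```

Blanché's hexagon adds two derived vertices (the disjunction of A and E, and the conjunction of I and O) to this same structure; the verification style carries over directly.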
We present a formalization of first-order predicate calculus with equality which, unlike traditional systems with axiom schemata or substitution rules, is finitely axiomatized in the sense that each step in a formal proof admits only finitely many choices. This formalization is primarily based on Meredith's inference rule of condensed detachment. The usual primitive notions of free variable and proper substitution are absent, making it easy to verify proofs in a machine-oriented application. Completeness results are presented. As an example, Zermelo-Fraenkel set theory is shown to be finitely axiomatized under the formalization. The relationship with resolution-based theorem provers is briefly discussed. A closely related axiomatization of traditional predicate calculus is shown to be complete in a strong metamathematical sense.
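Condensed detachment combines modus ponens with most-general unification: from a major premise C(p, q) ("p implies q") and a minor premise m, rename variables apart, unify p with m, and conclude the resulting instance of q. The sketch below (my own term encoding and helper names, not the paper's) illustrates the rule on the axiom CpCqp.

```python
# Condensed detachment, sketched. Terms: variables are strings,
# compound terms are tuples ('C', antecedent, consequent).

def is_var(t):
    return isinstance(t, str)

def rename(t, suffix):
    # Append a suffix to every variable, keeping the premises' variables apart.
    return t + suffix if is_var(t) else (t[0],) + tuple(rename(a, suffix) for a in t[1:])

def walk(t, s):
    while is_var(t) and t in s:
        t = s[t]
    return t

def occurs(v, t, s):
    t = walk(t, s)
    if is_var(t):
        return v == t
    return any(occurs(v, a, s) for a in t[1:])

def unify(x, y, s):
    x, y = walk(x, s), walk(y, s)
    if x == y:
        return s
    if is_var(x):
        return None if occurs(x, y, s) else {**s, x: y}
    if is_var(y):
        return unify(y, x, s)
    if x[0] != y[0] or len(x) != len(y):
        return None
    for a, b in zip(x[1:], y[1:]):
        s = unify(a, b, s)
        if s is None:
            return None
    return s

def subst(t, s):
    t = walk(t, s)
    return t if is_var(t) else (t[0],) + tuple(subst(a, s) for a in t[1:])

def condensed_detach(major, minor):
    major, minor = rename(major, "1"), rename(minor, "2")
    if major[0] != 'C':
        return None
    s = unify(major[1], minor, {})
    return None if s is None else subst(major[2], s)

# Detaching the axiom CpCqp ("p -> (q -> p)") against itself yields
# the most general conclusion, an instance of its consequent.
K = ('C', 'p', ('C', 'q', 'p'))
print(condensed_detach(K, K))
```

Because the unifier is most general, each proof step is determined up to renaming by the choice of the two premises, which is what makes the finitely-axiomatized, machine-checkable presentation possible.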
This article identifies problems with regard to providing criteria that regulate the matching of logical formulae and natural language. We then set out to solve these problems by defining a necessary and sufficient criterion of adequate formalization. On the basis of this criterion we argue that logic should not be seen as an ars iudicandi capable of evaluating the validity or invalidity of informal arguments, but as an ars explicandi that renders transparent the formal structure of informal reasoning.
Proof, Logic and Formalization addresses the various problems associated with finding a philosophically satisfying account of mathematical proof. It brings together many of the most notable figures currently writing on this issue in an attempt to explain why it is that mathematical proof is given prominence over other forms of mathematical justification. The difficulties that arise in accounts of proof range from the rightful role of logical inference and formalization to questions concerning the place of experience in proof and the possibility of eliminating impredicative reasoning from proof. Students and lecturers of philosophy, philosophy of logic, and philosophy of mathematics will find this to be essential reading. A companion volume entitled Proof and Logic in Mathematics is also available from Routledge.
In this paper, we provide a logical formalization of the emotion triggering process and of its relationship with mental attitudes, as described in Ortony, Clore, and Collins’s theory. We argue that modal logics are particularly adapted to represent agents’ mental attitudes and to reason about them, and use a specific modal logic that we call Logic of Emotions in order to provide logical definitions of all but two of their 22 emotions. While these definitions may be subject to debate, we show that they allow us to reason about emotions and to draw interesting conclusions from the theory.
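The style of definition pursued here can be illustrated with a toy Kripke model. The following sketch is my own illustration, not the paper's actual Logic of Emotions: it defines hope_i(φ) as "agent i believes φ possible and desires φ", with hypothetical accessibility relations B (belief) and D (desire).

```python
# Toy multimodal Kripke evaluation. Worlds, valuation, and the accessibility
# relations below are illustrative assumptions, not taken from the paper.

worlds = {0, 1, 2}
val = {'win': {1}}                 # atom 'win' holds at world 1 only
B = {('i', 0, 0), ('i', 0, 1)}     # doxastic accessibility for agent i
D = {('i', 0, 1)}                  # "desire" accessibility for agent i

def holds(p, w):
    return w in val[p]

def poss_bel(agent, w, phi):
    # phi is true at some belief-accessible world: believed possible
    return any(phi(v) for (a, u, v) in B if a == agent and u == w)

def des(agent, w, phi):
    # phi is true at all desire-accessible worlds: desired
    return all(phi(v) for (a, u, v) in D if a == agent and u == w)

def hope(agent, w, phi):
    # hedged toy definition: hope = believed-possible + desired
    return poss_bel(agent, w, phi) and des(agent, w, phi)

print(hope('i', 0, lambda w: holds('win', w)))
```

Reasoning about emotions then reduces to model checking: whether an agent "hopes" follows mechanically from its belief and desire relations, which is the kind of inference the paper draws from its 20 formal definitions.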
The advantages and disadvantages of formalization in philosophy are summarized. It is concluded that formalized philosophy is an endangered speciality that needs to be revitalized and to increase its interactions with non-formalized philosophy. The enigmatic style that is common in philosophical logic must give way to explicit discussions of the problematic relationship between formal models and the philosophical concepts and issues that motivated their development.
I make an attempt at the description of the delicate role of the standard model of arithmetic for the syntax of formal systems. I try to assess whether the possible instability in the notion of finiteness deriving from the nonstandard interpretability of arithmetic affects the very notions of syntactic metatheory and of formal system. I maintain that the crucial point of the whole question lies in the evaluation of the phenomenon of formalization. The ideas of Skolem, Zermelo, Beth and Carnap (among others) on the problem are discussed. ‘A tries to explain to B the meaning of negation. Finally A gives up, saying: “You don’t understand what I mean, and I am not going to explain any longer,” to which B replies: “Yes, I see what you mean, and I am glad you are willing to continue your explanations”’. G. Mannoury, reported by E. W. Beth (Beth, 1963, 489).
In the paper we build up the ontology of Leśniewski’s type for formalizing synthetic propositions. We claim that for these propositions an unconventional square of opposition holds, where a, i are contrary, a, o (resp. e, i) are contradictory, e, o are subcontrary, and a, e (resp. i, o) are said to stand in the subalternation. Further, we construct a non-Archimedean extension of Boolean algebra and show that in this algebra just two squares of opposition are formalized: the conventional one and the square that we invented. As a result, we can claim that there are only two basic squares of opposition. All basic constructions of the paper (the new square of opposition, the formalization of synthetic propositions within ontology of Leśniewski’s type, the non-Archimedean explanation of the square of opposition) are introduced for the first time.
The most difficult problem that Leśniewski came across in constructing his system of the foundations of mathematics was the problem of defining definitions, as he used to put it. He solved it to his satisfaction only when he had completed the formalization of his protothetic and ontology. By formalization of a deductive system one ought to understand in this context the statement, as precise and unambiguous as possible, of the conditions an expression has to satisfy if it is added to the system as a new thesis. Now, some protothetical theses, and some ontological ones, included in the respective systems, happen to be definitions. In the present essay I employ Leśniewski's method of terminological explanations for the purpose of formalizing Łukasiewicz's system of implicational calculus of propositions, which system, without having recourse to quantification, I first extended some time ago into a functionally complete system. This I achieved by allowing for a rule of implicational definitions, which enabled me to define any proposition-forming functor for any finite number of propositional arguments.
I describe in this paper some of my efforts in developing formal theories of social processes. These include work on models of occupational mobility, on models to describe the emergence of expectations out of performance evaluations, and on the graph theory formulation of the Status Characteristics theory. Not all models have been equally significant in developing theory. However, the graph theory formulation has played a central role in the growth of the Expectation States program. It has been involved in the generalization of theories, the integration of theories, and in the construction of highly sensitive tests of theories that would be impossible without the inferential capacities of formalization.
The aim of this paper is to propose a criterion of finite detachment-substitutional formalization for normal modal systems. The criterion will comprise only those normal modal systems which are finitely axiomatizable by means of the substitution, detachment for material implication and Gödel rules.
After several decades during which formalization has flourished it now becomes possible to detect its shortcomings. A definition of formalization is given at the outset. It is next shown that the main justification of formalization as making explicit the form of a proof has serious difficulties. An important shortcoming is found in the fact that many validation procedures in logic and mathematics are not adequately represented deductively. Several such procedures relating to the validation of logical and mathematical sentences are examined. It is concluded that formalization is materially inadequate.
This article applies two new criteria, desirability and faithfulness, to evaluate Peli et al.'s (1994) formalization of Hannan and Freeman's structural inertia argument (1984, 1989). We conclude that this formalization fails to meet these criteria. We argue that part of the rational reconstruction on which this formalization builds does not reflect well the substantive argument in translating the natural language theory into logic. We propose two alternative formalizations that meet both of these criteria. Moreover, both derive the inertia theorem from much weaker, and thus less constraining, premises. While both new formalizations draw information only from the original statement of the inertia theory, they reflect two different interpretations of inertia accumulation. The two new formalizations are compatible with some recent theory extensions in organizational ecology. However, they lead to substantially different consequences when additional sociological considerations are added to their premise sets. The interplay between logical formalization and sociological content is highlighted using the example of Stinchcombe's (1965) liability-of-newness theorem. Even modest extensions of the proposed models lead to contrary implications about age dependence in organizational mortality rates. Even "faithful" logical formalizations of arguments ordinarily involve implicit theory building.
A Criticism misjudging itself too. On the Deficiency of Reflection in Formal Explications. The criticism formulated by L. B. Puntel concerning the theory of dialectic proposed by the author is rejected. Puntel's attempt at explicating predication by means of (second-order) predicate logic fails: it overlooks that predication is already presupposed by the very possibility of predicate logic, and thus belongs to the transcendental conditions of formal predicate logic, so that predication itself cannot be further explicated by means of such logic. What is in fact criticized by Puntel is something like an artefact of formalization. The unreflected application of formal logic here generates problems instead of solving them.
This paper discusses six formalization techniques, of varying strengths, for extending a formal system based on traditional mathematical logic. The purpose of these formalization techniques is to simulate the introduction of new syntactic constructs, along with associated semantics for them. We show that certain techniques (among the six) subsume others. To illustrate sharpness, we also consider a selection of constructs and show which techniques can and cannot be used to introduce them. The six studied techniques were selected on the basis of actual practice in logic and computing. They do not form an exhaustive list.
Artificial Intelligence (AI) has long dealt with the issue of finding a suitable formalization for commonsense reasoning. Defeasible argumentation has proven to be a successful approach in many respects, serving as a confluence point for many alternative logical frameworks. Different formalisms have been developed, most of them sharing the common notions of argument and warrant. In defeasible argumentation, an argument is a tentative (defeasible) proof for reaching a conclusion. An argument is warranted when it ultimately prevails over other conflicting arguments. In this context, defeasible consequence relationships for modelling argument and warrant as well as their logical properties have gained particular attention. This article analyzes two non-monotonic inference operators Carg and Cwar intended for modelling argument construction and dialectical analysis (warrant), respectively. As a basis for such analysis we will use the LDSar framework, a unifying approach to computational models of argument using Labelled Deductive Systems (LDS). In the context of this logical framework, we show how labels can be used to represent arguments as well as argument trees, facilitating the definition and study of non-monotonic inference operators, whose associated logical properties are studied and contrasted. We contend that this analysis provides useful comparison criteria that can be extended and applied to other argumentation frameworks.
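The notion of warrant ("ultimately prevailing over conflicting arguments") can be illustrated, in a generic way, with Dung-style grounded semantics, a common reference point for defeasible argumentation systems. This sketch is my own illustration, not the LDSar operators Carg/Cwar from the article; names are assumptions.

```python
# Grounded semantics as a toy model of warrant: an argument counts as
# warranted here when it lies in the least fixed point of the defense
# function F, i.e. when it is defended against every attacker.

def grounded(args, attacks):
    """args: iterable of argument names; attacks: set of (attacker, target)."""
    attackers = {a: {x for (x, y) in attacks if y == a} for a in args}
    ext = set()
    while True:
        # F(ext): arguments all of whose attackers are attacked by ext
        defended = {a for a in args
                    if all(any((d, b) in attacks for d in ext)
                           for b in attackers[a])}
        if defended == ext:
            return ext
        ext = defended

# Example: a attacks b, b attacks c. The unattacked a prevails, defeating b,
# which reinstates c; so a and c are warranted while b is not.
args = {'a', 'b', 'c'}
attacks = {('a', 'b'), ('b', 'c')}
print(sorted(grounded(args, attacks)))
```

The dialectical analysis the article formalizes via labelled argument trees refines this picture; the fixed-point computation above captures only its simplest, attack-graph core.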
The problems we deal with concern reasoning about incomplete knowledge. Knowledge is understood as the ability of an ideal rational agent to make decisions about pieces of information. The formalisms we are particularly interested in are Moore's autoepistemic logic (AEL) and its variant, the logic of acceptance and rejection (AEL2). It is well known that AEL may be seen as the nonmonotonic KD45 modal logic. The aim is to give an appropriate modal formalization for AEL2.
We emphasize the role of the choice of vocabulary in the formalization of a mathematical area and remark that this is a particular preoccupation of logicians. We use this framework to discuss Kennedy's notion of "formalism freeness" and the expression, through algebra, of the spatial content of the embedding theorem.
The paper offers a historical survey of the emergence of logical formalization in twentieth-century analytically oriented philosophy of religion. This development is taken to have passed through three main "stages": a pioneering stage in the late nineteenth and early twentieth centuries (led by Frege and Russell), a stage of crisis in the 1920s and early 1930s (occasioned by Wittgenstein, logical positivists such as Carnap, and neo-Thomists such as Maritain), and a stage of rehabilitation in the 1930s, 1940s, and 1950s (led by the Cracow Circle and Quine).
The article presents a formalization of Thomas Aquinas's proof for the indestructibility of the human soul. The author of the formalization—the first of its kind in the history of philosophy—is Father Joseph Maria Bocheński. The presentation involves no more than updating the logical symbolism used, and accompanies the logical formulae with ordinary-language paraphrases in order to ease the reader's understanding of the formulae. “The fundamental idea of the Thomist proof is of utmost simplicity: things which are destructible are destructible either per se or per accidens; but the human soul is destructible neither per se nor per accidens: therefore the human soul is not destructible”. Despite this simplicity, Bocheński had to devote considerable effort to the precision of the symbolic language, so that it would be maximally adequate to Thomas's discourse. Moreover, I have thought it necessary to provide an ample commentary on the traditional and contemporary semantical presuppositions of Aquinas's philosophical anthropology in light of Bocheński's interpretation thereof.
Problem: The article seeks to tackle three problems of Mitterer's non-dualistic philosophy. Firstly, the key term "description" remains not only rather unclear and rudimentary but also isolated from relevant neighboring terms and theories of other disciplines. Secondly, a logical reconstruction and formal model of non-dualism is still lacking. Thirdly, there are hardly any extensions of philosophical non-dualism to non-philosophical disciplines and fields. Findings: The three main findings of the article are based on the abovementioned problems. Firstly, the non-dualistic term "description" will be connected to the sociological and semiotic term "meaning" by emphasizing their semantic-pragmatic similarities. Moreover, a common, distinction-theoretic conceptualization of both terms will be proposed. Secondly, a non-dualistic formalization and logical reconstruction will be elaborated by deducing non-dualism from dualism using the operation of re-entry. Thirdly, the non-dualistic formalization will be applied to the classical semiotic triangle, resulting in the elaboration of a non-dualistic semiotic triangle. Benefits: The aforementioned findings have two possible benefits. Firstly, the compatibility between the terms "description" and "meaning" makes philosophical non-dualism connectable to social science approaches, especially to sociology and semiotics. This may be an important avenue for interdisciplinary cross-fertilization and co-operation. Secondly, the formalization and logical deduction may help to clarify and make explicit non-dualism's main arguments and implicit assumptions.
Formal topologies are today an established topic in the development of constructive mathematics. One of the main tools in formal topology is inductive generation, since it allows one to introduce inductive methods in topology. The problem of inductively generating formal topologies with a cover relation and a unary positivity predicate has been solved in [CSSV]. However, to deal with both open and closed subsets, a binary positivity predicate has to be considered. In this paper we will show how to adapt to this framework the method used to generate inductively formal topologies with a unary positivity predicate; the main problem that one has to face in such a new setting is that, as a consequence of the lack of a complete formalization, both the cover relation and the positivity predicate can have proper axioms.
Whether human thinking can be formalized and whether machines can think in a human sense are questions that have been addressed by both Peirce and Searle. Peirce came to roughly the same conclusion as Searle, that the digital computer would not be able to perform human thinking or possess human understanding. However, his rationale and Searle's differ on several important points. Searle approaches the problem from the standpoint of traditional analytic philosophy, where the strict separation of syntax and semantics renders understanding impossible for a purely syntactical device. Peirce disagreed with that analysis, but argued that the computer would only be able to achieve algorithmic thinking, which he considered the simplest type. Although their approaches were radically dissimilar, their conclusions were not. I will compare and analyze the arguments of both Peirce and Searle on this issue, and outline some implications of their conclusions for the field of Artificial Intelligence.
According to a view prevalent among philosophers, formal logic is the philosopher's main tool to assess the validity of arguments, i.e. the philosopher's ars iudicandi. By drawing on a famous dispute between Russell and Strawson over the validity of a certain kind of argument – arguments whose premises feature definite descriptions – this paper casts doubt on the accuracy of the ars iudicandi conception. Rather than settling the question whether the contentious arguments are valid or not, Russell and Strawson, upon discussing the proper logical analysis of definite descriptions, merely contrast opposing informal validity assessments rendered explicit by nonequivalent logical formalizations.
A critique is given of the attempt by Hettema and Kuipers to formalize the periodic table. In particular I dispute their notions of identifying a naïve periodic table with tables having a constant periodicity of eight elements and their views on the different conceptions of the atom by chemists and physicists. The views of Hettema and Kuipers on the reduction of the periodic system to atomic physics are also considered critically.
The presentation of the formal conception of noemata is the main aim of the article. In the first section, three informal approaches to noemata are discussed; the goal of this section is to specify the main controversies, and their sources, concerning the different ways of understanding noemata. In the second section, basic assumptions determining the proposed way of understanding noemata are presented. The third section is devoted to the formal set-theoretic construction needed for the formal comprehension of noemata. In the fourth section, definitions of noemata and their various kinds, as well as definitions of other phenomenological notions, are formulated. In the last section, possibilities of further developing the proposed formal conception are indicated.
In this paper we analyze Strawson's notion of presupposition as proposed in his book Introduction to Logical Theory. The Strawsonian notion of presupposition is dependent on the notion of logical entailment. We make use of the theory of logical consequence operations as a general framework to show that it is impossible to find a logical consequence operation which mirrors the philosophical intuitions behind Strawson's notion of presupposition. The aim of this paper is to present in detail the philosophical background of the formal analysis presented in the author's paper "Strawsonian presuppositions and logical entailment".
Three things are presented: how Hilbert changed the original construction postulates of his geometry into existential axioms; in what sense he formalized geometry; and how elementary geometry is formalized to present-day standards.
It has been accepted since the early part of the century that there is no problem formalizing mathematics in standard formal systems of axiomatic set theory. Most people feel that they know as much as they ever want to know about how one can reduce natural numbers, integers, rationals, reals, and complex numbers to sets, and prove all of their basic properties. It is further assumed that this reduction can continue through more and more complicated material, and that there is never a real problem.
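The first step of the reduction just mentioned is the standard von Neumann encoding of the naturals: 0 = ∅ and n+1 = n ∪ {n}. The sketch below models it with Python frozensets as a stand-in for hereditarily finite sets (my own illustrative encoding).

```python
# Von Neumann naturals: each number is the set of all smaller numbers.

def succ(n):
    return n | {n}  # n + 1 = n ∪ {n}

zero = frozenset()
one = succ(zero)            # {∅}
two = succ(one)             # {∅, {∅}}
three = succ(two)           # {∅, {∅}, {∅, {∅}}}

# The basic order-theoretic properties fall out of the encoding:
assert len(three) == 3                 # n has exactly n elements
assert zero in three and two in three  # m < n  iff  m ∈ n
assert two < three                     # m < n  iff  m ⊊ n (proper subset)
print("von Neumann naturals behave as expected")
```

From here the familiar chain proceeds: integers as pairs of naturals, rationals as pairs of integers, reals as Dedekind cuts or Cauchy sequences, each with its basic properties provable in axiomatic set theory.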
The paper introduces and formally defines a functional concept of a measuring system, on this basis characterizing the measurement as an evaluation performed by means of a calibrated measuring system. The distinction between exact and uncertain measurement is formalized in terms of the properties of the traceability chain joining the measuring system to the primary standard. The consequence is drawn that uncertain measurements lose the property of relation-preservation, on which the very concept of measurement is founded according to the representational viewpoint. Finally, from the analysis of the inter-relations between calibration and measurement the fundamental reasons of the claimed objectivity and intersubjectivity of measurement are highlighted, a valuable epistemological result to characterize measurement as a particular kind of evaluation.
Formal definitions of the following concepts of animal ecology are given: environment, niche, locality, local population, natural population, community, ecosystem. Five primitive (undefined) notions are used, including "animal", "offspring" and "habitat", the latter in the sense of Charles Elton. The defining equations for the environment of one animal are first given; then niche (in the Elton sense) is formally defined in terms of the environment. The fifth primitive notion, "habitat", is then introduced in order to define the remaining concepts.
The paper studies two formal schemes related to ω-completeness. Let S be a suitable formal theory containing primitive recursive arithmetic and let T be a formal extension of S. Denoted by (a), (b) and (c), respectively, are the following three propositions (where φ(x) is a formula with the only free variable x): (a) (for any n) (⊢_T φ(n)); (b) ⊢_T ∀x Pr_T(⌜φ(x)⌝); and (c) ⊢_T ∀x φ(x) (the notational conventions are those of Smoryński). The aim of this paper is to examine the meaning of the schemes which result from the formalizations, over the base theory S, of the implications (b) → (c) and (a) → (b), where φ ranges over all formulae. The analysis yields two results over S: 1. the schema corresponding to (b) → (c) is equivalent to ¬Cons_T, and 2. the schema corresponding to (a) → (b) is not consistent with 1-CON_T. The former result follows from a simple adaptation of the ω-incompleteness proof; the second is new and is based on a particular application of the diagonalization lemma.
Within a weak subsystem of second-order arithmetic , that is -conservative over , we reformulate Kreisel's proof of the Second Incompleteness Theorem and Boolos' proof of the First Incompleteness Theorem.
Timing diagrams are popular in hardware design. They have been formalized for use in reasoning tasks, such as computer-aided verification. These efforts have largely treated timing diagrams as interfaces to established notations for which verification is decidable; this has restricted timing diagrams to expressing only regular-language properties. This paper presents a timing diagram logic capable of expressing certain context-free and context-sensitive properties. It shows that verification is decidable for properties expressible in this logic. More specifically, it shows that containment of ω-regular languages generated by Büchi automata in timing diagram languages is decidable. The result relies on a correlation between timing diagram languages and the languages of reversal-bounded counter machines.
Two simple formal descriptions of the notion of God's omnipotence are proposed, inspired by the formalizations of C. Christian and E. Nieznański. Our first proposal is expressed in a modal sentential language with quantifiers. The second one is formulated in a first-order predicate language. In the frame of the second approach we admit the use of self-referential expressions. In effect we link our considerations with the so-called paradox of God's omnipotence and reconstruct some argumentation against the possibility of referring God's omnipotence to a lack of itself.
The text addresses one of the fundamental methodological questions that Heidegger posed in the Frühe Freiburger Vorlesungen (1919-1923), namely the problem of formal indication. Indeed, if life itself (Dasein) is an event of meaning closed in on itself, it is necessary to establish a point of view that expresses life conceptually without objectifying it. The great problem Heidegger confronts is that of finding a non-objectifying metalanguage. The concept of formal indication is what allows him to elaborate a discourse on the origin.
According to the standard opinions in the literature, blocking the unacceptable consequences of the notorious slingshot argument requires imposing constraints on the metaphysics of facts or on theories of definite descriptions (or class abstracts). This paper argues that both of these well-known strategies to rebut the slingshot overshoot the mark. The slingshot, first and foremost, raises the question as to the adequate logical formalization of statements about facts, i.e. of factual contexts. It will be shown that a rigorous application of Quine’s maxim of shallow analysis to formalizations of factual contexts paves the way for an account of formalizing such contexts which blocks the slingshot without ramifications for theories of facts or definite descriptions.