Explication is the conceptual cornerstone of Carnap’s approach to the methodology of scientific analysis. From a philosophical point of view, it gives rise to a number of questions that need to be addressed, but which do not seem to have been fully addressed by Carnap himself. This paper reconsiders Carnapian explication by comparing it to a different approach: the ‘formalisms as cognitive tools’ conception. The comparison allows us to discuss a number of aspects of the Carnapian methodology, as well as issues pertaining to formalization in general. We start by introducing Carnap’s conception of explication, arguing that there is a tension between his proposed criteria of fruitfulness and similarity; we also argue that his further desideratum of exactness is less crucial than might appear at first. We then bring in the general idea of formalisms as cognitive tools, mainly by discussing the reliability of so-called statistical prediction rules (SPRs), i.e. simple algorithms used to make predictions across a range of areas. SPRs allow for a concrete instantiation of Carnap’s fruitfulness desideratum, which is arguably the most important desideratum for him. Finally, we elaborate on what we call the ‘paradox of adequate formalization’, which for the Carnapian corresponds to the tension between similarity and fruitfulness. We conclude by noting that formalization is an inherently paradoxical enterprise in general, but one worth engaging in given the ‘cognitive boost’ it affords as a tool for discovery.
The purpose of this paper is twofold. First, it aims at introducing the ontological argument through the analysis of five historical developments: Anselm’s argument found in the second chapter of his Proslogion, Gaunilo’s criticism of it, Descartes’ version of the ontological argument found in his Meditations on First Philosophy, Leibniz’s contribution to the debate on the ontological argument and his demonstration of the possibility of God, and Kant’s famous criticisms against the (Cartesian) ontological argument. Second, it intends to critically examine the enterprise of formally analyzing philosophical arguments and, as such, contribute in a small degree to the debate on the role of formalization in philosophy. My focus will be mainly on the drawbacks and limitations of such an enterprise; as a guideline, I shall refer to a Carnapian, or Carnapian-like, theory of argument analysis.
Exact sciences are described as sciences whose theories are formalized. These are contrasted with inexact sciences, whose theories are not formalized. Formalization is described as a broader category than mathematization, involving any form/content distinction allowing forms, e.g., as represented in theoretical models, to be studied independently of the empirical content of a subject-matter domain. Exactness is a practice depending on the use of theories to control subject-matter domains and to align theoretical with empirical models, and not merely a state of a science. Inexact biological sciences tolerate a degree of “mismatch” between theoretical and empirical models and concepts. Three illustrations from biological sciences are discussed in which formalization is achieved by various means: Mendelism, Weismannism, and Darwinism. Frege’s idea of a “conceptual notation” is used to further characterize the notion of a form/content distinction.
This article identifies problems with regard to providing criteria that regulate the matching of logical formulae and natural language. We then undertake to solve these problems by defining a necessary and sufficient criterion of adequate formalization. On the basis of this criterion we argue that logic should not be seen as an ars iudicandi capable of evaluating the validity or invalidity of informal arguments, but as an ars explicandi that renders transparent the formal structure of informal reasoning.
Traditional logical reconstruction of arguments aims at assessing the validity of ordinary language arguments. It involves several tasks: extracting argumentations from texts, breaking up complex argumentations into individual arguments, framing arguments in standard form, as well as formalizing arguments and showing their validity with the help of a logical formalism. These tasks are guided by a multitude of partly antagonistic goals, they interact in various feedback loops, and they are intertwined with the development of theories of valid inference and adequate formalization. This paper explores how the method of reflective equilibrium can be used for modelling the complexity of such reconstructions and for justifying the various steps involved. The proposed approach is illustrated and tested in a detailed reconstruction of the beginning of Anselm’s De casu diaboli.
There is a long-standing debate whether propositions, sentences, statements or utterances provide an answer to the question of what objects logical formulas stand for. Based on the traditional understanding of logic as a science of valid arguments, this question is firstly framed more exactly, making explicit that it calls not only for identifying some class of objects, but also for explaining their relationship to ordinary language utterances. It is then argued that there are strong arguments against the proposals commonly put forward in the debate. The core of the problem is that an informative account of the objects formulas stand for presupposes a theory of formalization; that is, a theory that explains what formulas may adequately substitute for an inference in proofs of validity. Although such theories are still subject to research, some consequences can be drawn from an analysis of the reasons why the common accounts featuring sentences, propositions or utterances fail. Theories of formalization cannot refer to utterances qua expressions of propositions; instead they may refer to sentences and rely on additional information about linguistic structure and pragmatic context.
The article addresses two closely related questions: What are the criteria of adequacy of logical formalization of natural language arguments, and what gives logic the authority to decide which arguments are good and which are bad? Our point of departure is the criticism of the conception of logical formalization put forth, in a recent paper, by M. Baumgartner and T. Lampert. We argue that their account of formalization as a kind of semantic analysis brings about more problems than it solves. We also argue that the criteria of adequate formalization need not be based on truth conditions associated with logical formulas; in our view, they are better based on structural (inferential) grounds. We then put forward our own version of the criteria. The upshot of the discussion that follows is that the quest for an adequate formalization in a suitable logical language is best conceived of as the search for a Goodmanian reflective equilibrium.
This paper offers a conceptually novel contribution to the understanding of the distinctive governance challenges arising from the increasing reliance on formalized knowledge in the governance of research activities. It uses the current Australian research governance system as an example – a system which exhibits a comparatively strong degree of formalization as to its knowledge mechanisms. Combining theoretical reflections on the political-administrative and epistemic dimensions of processes of formalization with analyses of interview data gathered at Australian universities, it is suggested that such a strong reliance on formalized knowledge has rather ambivalent governance ramifications. On the one hand, it allows for a seemingly rational and efficient form of the control and coordination of research activities. Yet on the other hand, it also increases the risk that knowledge is used in governance contexts in superficial, unconsidered and ultimately unreasonable ways. It is further suggested that there are a range of indications that precisely such use elicits and reinforces a range of dysfunctional behaviors on the part of relevant individual and organizational actors in the public science system.
The square of opposition and many other geometrical logical figures have increasingly proven to be applicable to different fields of knowledge. This paper seeks to show how Blanché generalizes the classical theory of oppositions of propositions and extends it to the structure of opposition of concepts. Furthermore, it considers how Blanché restructures the Apuleian square by transforming it into a hexagon. After presenting G. Kalinowski’s formalization of Blanché’s hexagonal theory, its applicability to mathematics, to modal logic, and to the logic of norms is illustrated. The paper concludes by criticizing Blanché’s claim according to which his logical hexagon can be considered the objective basis of the structure of the organisation of concepts, and the formal structure of thought in general. It is maintained that within the framework of diagrammatic reasoning Blanché’s hexagon keeps its privileged place as a “nice” and useful tool, but not necessarily as a norm of thought.
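As an illustrative sketch of ours (not Kalinowski's formalization), the relations of the classical square, and the additional relations Blanché's hexagon obtains by adding U = A-or-E and Y = I-and-O, can be machine-checked over all small categorical models, assuming the traditional reading with existential import (a non-empty subject term):

```python
from itertools import combinations

def powerset(xs):
    return [frozenset(c) for r in range(len(xs) + 1)
            for c in combinations(xs, r)]

# Models: pairs of extensions (S, P) over a small domain, with S non-empty
# (the traditional existential import, assumed so the square holds).
domain = [0, 1, 2]
models = [(S, P) for S in powerset(domain) if S for P in powerset(domain)]

def A(S, P): return S <= P               # All S are P
def E(S, P): return not (S & P)          # No S is P
def I(S, P): return bool(S & P)          # Some S is P
def O(S, P): return not (S <= P)         # Some S is not P
def U(S, P): return A(S, P) or E(S, P)   # Blanché's U: A or E
def Y(S, P): return I(S, P) and O(S, P)  # Blanché's Y: I and O

# contrary: never both true; subcontrary: never both false;
# contradictory: always opposite values; subaltern: first implies second.
contrary      = lambda f, g: all(not (f(*m) and g(*m)) for m in models)
subcontrary   = lambda f, g: all(f(*m) or g(*m) for m in models)
contradictory = lambda f, g: all(f(*m) != g(*m) for m in models)
subaltern     = lambda f, g: all(g(*m) for m in models if f(*m))

# Classical square:
assert contrary(A, E) and subcontrary(I, O)
assert contradictory(A, O) and contradictory(E, I)
assert subaltern(A, I) and subaltern(E, O)
# Hexagon relations contributed by U and Y:
assert contradictory(U, Y) and contrary(A, Y) and subaltern(Y, I)
print("square and hexagon relations verified")
```

The check is brute-force over a three-element domain, which suffices here because all six statements depend only on the overlap pattern of the two extensions.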
This paper makes an independent start with formalizing the rules for the argumentation stage of critical discussions. It does not deal with the well-known code of conduct consisting of ten rules but with the system consisting of fifteen rules on which the code of conduct is based. The rules of this system are scrutinized and problems they raise are discussed. Then a formal dialectical system is defined that reflects most of the contents of these rules. The aim is to elucidate the way the rules work and to show how a formal approach can be useful to achieve this. It is also shown how the present method can be used to study the nature of circular argumentation. While the formalization generally follows the original rules for the argumentation stage of critical discussions closely, some modifications of the original protocol are also proposed.
Lampert and Baumgartner (2010) critically discuss accounts of adequate formalization focusing on my analysis in (Brun 2004). There, I investigated three types of criteria of adequacy (matching truth conditions or inferential role, corresponding syntactical surface and systematicity) and argued that they ultimately call for a procedure of formalization. Although Lampert and Baumgartner have a point about matching truth conditions, their arguments target a truncated version of my account. They ignore all aspects of systematicity which make their counter-example unconvincing.
This paper compares several models of formalization. It articulates criteria of correct formalization and identifies their problems. All of the discussed criteria are so-called “semantic” criteria, which refer to the interpretation of logical formulas. However, as will be shown, different versions of an implicitly applied or explicitly stated criterion of correctness depend on different understandings of “interpretation” in this context.
This paper explores the political import of Husserl’s critical discussion of the epistemic effects of the formalization of rational thinking. More specifically, it argues that this discussion is of direct relevance to make sense of the pervasive processes of ‘technization’, that is, of a mechanistic and superficial generation and use of knowledge, to be observed in current contexts of governance. Building upon Husserl’s understanding of formalization as a symbolic technique for abstraction in the thinking with and about numbers, I argue that processes of technization, while being necessary and legitimate procedures for the reduction of complexities, may also give rise to politically unresponsive and ultimately dysfunctional ‘economies of thinking.’ This paper is structured in three parts. In the first part I outline Husserl’s account of the formalization and technization of thought and knowledge. In the second part I make my case for the political import of this account, departing in this context from positions that (a) regard Husserl’s discussions of formalization and its effects as merely epistemological, or that (b) try to mobilize Husserl for a one-sided critique of instrumental reason. In the final part I address a major shortcoming of Husserl’s account, namely its neglect of the concrete and historically evolving technological infrastructures of processes of formalization/technization.
Computation and formalization are not modalities of pure abstractive operations. The essay tries to revise the assumption of the constitutive nonsensuality of the formal. The argument is that formalization is a kind of linear spatialization, which has significant visual dimensions. Thus, a connection can be discovered between visualization by figurative graphism and formalization by symbolic calculations: Both use spatial relations not only to represent but also to operate on epistemic, nonspatial, nonvisual entities. Descartes was one of the pioneers of using this kind of two-dimensional spatiality as a cognitive instrument.
Three common strategies used by informal logicians are considered: (1) the appeal to standard cases, (2) the attempt to partially formalize so-called "informal fallacies," and (3) restatement of arguments in such a way as to make their logical character more perspicuous. All three strategies are found to be useful. Attention is drawn to several advantages of a "stock case" approach, a minimalist approach to formalization is recommended, and doubts are raised about the applicability, from a logical point of view, of a principle of charitable construal in the reconstruction of arguments.
Structure and organization seem to be at the root of many of the questions raised about institutional behaviour; however, with respect to research on university capacity building, few studies have examined research organizational problems, particularly in developing countries. This study investigates academic reactions to the structure and organization of research at four leading Vietnamese universities. Through document analysis and semi-structured interviews with 55 participants, the study finds that the four case-study Vietnamese universities have accomplished a number of the more visible tasks of research management such as creating research and research management positions; deciding primary organizational units for research delivery; creating a research office; and creating research oversight committees. However, they seem to neglect the other, less visible tasks of organizing and structuring research such as developing rules for research integrity; developing a mechanism for evaluating the quality of research outcomes; equipping researchers and research managers with the necessary skills and knowledge; and deciding vertical and horizontal decentralization. The study concludes that even though research has been formally structured and organized, the management of research has not yet been professionalized. The key problem in organizing and structuring research is the lack of an effective system for research behaviour formalization. A more effective system for better formalizing research behaviours should be developed so that Vietnamese universities can integrate more successfully into global research.
Natural Formalization proposes a concrete way of expanding proof theory from the meta-mathematical investigation of formal theories to an examination of “the concept of the specifically mathematical proof.” Formal proofs play a role for this examination inasmuch as they reflect the essential structure and systematic construction of mathematical proofs. We emphasize three crucial features of our formal inference mechanism: (1) the underlying logical calculus is built for reasoning with gaps and for providing strategic directions, (2) the mathematical frame is a definitional extension of Zermelo–Fraenkel set theory and has a hierarchically organized structure of concepts and operations, and (3) the construction of formal proofs is deeply connected to the frame through rules for definitions and lemmas. To bring these general ideas to life, we examine, as a case study, proofs of the Cantor–Bernstein Theorem that do not appeal to the principle of choice. A thorough analysis of the multitude of “different” informal proofs seems to reduce them to exactly one. The natural formalization confirms that there is one proof, but that it comes in two variants due to Dedekind and Zermelo, respectively. In this way it enhances the conceptual understanding of the represented informal proofs. The formal, computational work is carried out with the proof search system AProS that serves as a proof assistant and implements the above inference mechanism; it can be fully inspected at (see link below).
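For orientation, the choice-free construction in Dedekind's variant can be sketched in its standard textbook form (the paper's own formalization is carried out in AProS and is not reproduced here):

```latex
% Given injections f : A -> B and g : B -> A, collect the iterated
% g.f-images of the part of A missed by g, and patch f with g^{-1}:
\[
  C \;=\; \bigcup_{n \ge 0} (g \circ f)^{\,n}\bigl(A \setminus g(B)\bigr),
  \qquad
  h(x) \;=\;
  \begin{cases}
    f(x)      & \text{if } x \in C,\\
    g^{-1}(x) & \text{if } x \notin C.
  \end{cases}
\]
```

Since $A \setminus g(B) \subseteq C$, every $x \notin C$ lies in $g(B)$, so $g^{-1}(x)$ is defined; one then checks that $h \colon A \to B$ is a bijection, without invoking any choice principle.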
The advantages and disadvantages of formalization in philosophy are summarized. It is concluded that formalized philosophy is an endangered speciality that needs to be revitalized and to increase its interactions with non-formalized philosophy. The enigmatic style that is common in philosophical logic must give way to explicit discussions of the problematic relationship between formal models and the philosophical concepts and issues that motivated their development.
In the paper we build up an ontology of Leśniewski’s type for formalizing synthetic propositions. We claim that for these propositions an unconventional square of opposition holds, where a, i are contrary, a, o (resp. e, i) are contradictory, e, o are subcontrary, and a, e (resp. i, o) stand in subalternation. Further, we construct a non-Archimedean extension of Boolean algebra and show that in this algebra just two squares of opposition are formalized: the conventional one and the square that we invented. As a result, we can claim that there are only two basic squares of opposition. All basic constructions of the paper (the new square of opposition, the formalization of synthetic propositions within an ontology of Leśniewski’s type, the non-Archimedean explanation of the square of opposition) are introduced for the first time.
I investigate Bocheński's first-order logic formalization of the argument for the incorruptibility of the human soul given by Aquinas in Summa Theologiae (Ia,75,6). I suggest a slightly different axiomatization that better reflects Aquinas' informal argument. Along the way, I also fix a mistake in Bocheński's derivation that the human soul is not corruptible per se.
I describe in this paper some of my efforts in developing formal theories of social processes. These include work on models of occupational mobility, on models to describe the emergence of expectations out of performance evaluations, and on the graph theory formulation of the Status Characteristics theory. Not all models have been equally significant in developing theory. However, the graph theory formulation has played a central role in the growth of the Expectation States program. It has been involved in the generalization of theories, the integration of theories, and in the construction of highly sensitive tests of theories that would be impossible without the inferential capacities of formalization.
Formal topologies are today an established topic in the development of constructive mathematics. One of the main tools in formal topology is inductive generation, since it allows one to introduce inductive methods in topology. The problem of inductively generating formal topologies with a cover relation and a unary positivity predicate has been solved in [CSSV]. However, to deal both with open and closed subsets, a binary positivity predicate has to be considered. In this paper we will show how to adapt to this framework the method used to generate inductively formal topologies with a unary positivity predicate; the main problem that one has to face in such a new setting is that, as a consequence of the lack of a complete formalization, both the cover relation and the positivity predicate can have proper axioms.
In this paper, we provide a logical formalization of the emotion triggering process and of its relationship with mental attitudes, as described in Ortony, Clore, and Collins’s theory. We argue that modal logics are particularly well suited to representing agents’ mental attitudes and to reasoning about them, and we use a specific modal logic that we call the Logic of Emotions in order to provide logical definitions of all but two of their 22 emotions. While these definitions may be subject to debate, we show that they allow us to reason about emotions and to draw interesting conclusions from the theory.
The paper offers a historical survey of the emergence of logical formalization in twentieth-century analytically oriented philosophy of religion. This development is taken to have passed through three main ‘stages’: a pioneering stage in the late nineteenth and early twentieth centuries (led by Frege and Russell), a stage of crisis in the 1920s and early 1930s (occasioned by Wittgenstein, logical positivists such as Carnap, and neo-Thomists such as Maritain), and a stage of rehabilitation in the 1930s, 1940s, and 1950s (led by the Cracow Circle and Quine).
This article discusses some of Chateaubriand’s views on the connections between the ideas of formalization and infinity, as presented in chapters 19 and 20 of Logical Forms. We basically agree with his criticisms of the standard construal of these connections, a view we named “formal proofs as ultimate provings”, but we suggest an alternative way of picturing that connection based on some ideas of the late Wittgenstein.
Any theory of information needs to comply with what we call the implementation, formalization, and representation constraints. These constraints are justified by basic considerations concerning scientific modelling and methodology. In the first part of this paper, we argue that the implementation and formalization constraints cannot be satisfied because the relation between Shannon information and Integrated Information Theory (IIT) must be clarified. In the second part of the paper, we focus on the representation constraint. We argue that IIT cannot succeed in satisfying this constraint for semantic contents without offering models for concepts, conceptual roles, and context sensitivity. In the final part of the paper, further complications are raised with respect to the distinction between consciousness and attention, which apply more specifically to the representation constraint. We conclude with some recommendations as to how IIT may succeed in solving these problems, highlighting the advantages and enormous potential of IIT as a scientific theory.
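As a minimal reminder of the Shannon-theoretic notion invoked here (our illustration, not IIT's integrated-information measure), the information content of a source is, in its simplest form, the entropy of its probability distribution:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), measured in bits.
    Zero-probability outcomes contribute nothing, by convention."""
    assert abs(sum(probs) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))              # fair coin: 1.0 bit
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # two fair coins: 2.0 bits
```

Shannon entropy quantifies uncertainty over outcomes without regard to their meaning, which is precisely why the representation constraint discussed above raises a further, semantic demand.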
This article summarizes the main ideas for formalizing categorial languages generated by classical categorial grammar originated by K. Ajdukiewicz [1935, 1960]. This formalization is presented in detail in the author's monographs in Polish "Teorie Języków Syntaktycznie Kategorialnych" ("Theories of Syntactically Categorial Languages"), PWN, Warszawa-Wrocław 1985 and in English "Theory of Language Syntax, Categorial Approach", Kluwer Academic Publishers, Boston-London-Dordrecht 1991.
I propose that a logical formalization of a natural language text may be regarded as adequate if the following three groups of beliefs can be integrated into a wide reflective equilibrium: our initial, spontaneous beliefs about the structure and logical quality of the text; our beliefs about its structure and logical quality as reflected in the proposed formalization; and our background beliefs about the original text’s author, his thought and other contextually relevant factors. Unlike a good part of the literature, I stress the indispensable role of initial beliefs in achieving such a wide reflective equilibrium. In the final sections I show that my approach does not succumb to undue subjectivism or the mere perpetuation of prejudice. The examples I use to illustrate my claims are chiefly taken from Anselm’s Proslogion 2–3 and the various attempts to formalize these texts.
The tools of logic are used properly or improperly relative to two interrelated purposes. Logic is both a symbolism for the expression of the formal structures of thought and an inference mechanism. Formalization in philosophical logic is justified to the extent that it contributes to our understanding of logical properties and the conceptual problems they may help to state, clarify, or resolve. This view of the value and limits of formalization in logic affords a pragmatic perspective that in principle should selectively support the development of particular formal systems, while excluding others as unjustified. Yet the pragmatic grounds are so liberal that virtually no exercise in formalization is ruled out as entirely useless. Logic is abused in another sense, when it is wrongly used, not by violating logical canons generally, but by presupposing that substantive metaphysical, normative, or scientific content can be derived from purely formal logical relations.
This paper deals with Tarski's first axiomatic presentations of the syntax of deductive systems. Andrzej Grzegorczyk's significant results, which laid the foundations for the formalization of metalogic, are touched upon briefly. The results relate to Tarski's theory of concatenation, also called the theory of strings, and to Tarski's ideas on the formalization of metamathematics. There is a short mention of the author's research in the field. The main part of the paper surveys research on the theory of deductive systems initiated by Tarski, in particular research on the axiomatization of the general notion of consequence operation, axiom systems for the theories of classical consequence and for some equivalent theories, and axiom systems for the theories of non-classical consequence. The paper takes into account the results of Jerzy Słupecki's research, as well as those of the author and of others belonging to his circle of scientific research. Particular study is made of his dual characterization of deductive systems, both as systems with regard to acceptance and as systems with regard to rejection. Comparison is made, therefore, with axiomatizations of the theories of rejection and dual consequence, and with the theory of the usual consequence operation.
I attempt to describe the delicate role of the standard model of arithmetic for the syntax of formal systems. I try to assess whether the possible instability in the notion of finiteness deriving from the nonstandard interpretability of arithmetic affects the very notions of syntactic metatheory and of formal system. I maintain that the crucial point of the whole question lies in the evaluation of the phenomenon of formalization. The ideas of Skolem, Zermelo, Beth and Carnap (among others) on the problem are discussed. ‘A tries to explain to B the meaning of negation. Finally A gives up, saying: “You don’t understand what I mean, and I am not going to explain any longer,” to which B replies: “Yes, I see what you mean, and I am glad you are willing to continue your explanations”’. G. Mannoury, reported by E. W. Beth (Beth, 1963, 489).
Putnam’s proof that time flow is incompatible with Relativity is underestimated, mostly due to Stein’s interpretation of the notion of reality in it as a two-term relation. This interpretation makes it vulnerable to easy criticism and makes various ways of escaping its conclusion possible. An alternative approach is proposed, resulting in a formalization that seems closer to Putnam’s intentions, in which reality is interpreted as a non-relational property. Although this makes the proof immune to all the standard strategies for blocking it, it reveals the proof's real weak point, which consists in assuming an overly strong interpretation of the principle of relativity.
Artificial Intelligence (AI) has long dealt with the issue of finding a suitable formalization for commonsense reasoning. Defeasible argumentation has proven to be a successful approach in many respects, serving as a confluence point for many alternative logical frameworks. Different formalisms have been developed, most of them sharing the common notions of argument and warrant. In defeasible argumentation, an argument is a tentative (defeasible) proof for reaching a conclusion. An argument is warranted when it ultimately prevails over other conflicting arguments. In this context, defeasible consequence relationships for modelling argument and warrant as well as their logical properties have gained particular attention. This article analyzes two non-monotonic inference operators Carg and Cwar intended for modelling argument construction and dialectical analysis (warrant), respectively. As a basis for such analysis we will use the LDSar framework, a unifying approach to computational models of argument using Labelled Deductive Systems (LDS). In the context of this logical framework, we show how labels can be used to represent arguments as well as argument trees, facilitating the definition and study of non-monotonic inference operators, whose associated logical properties are studied and contrasted. We contend that this analysis provides useful comparison criteria that can be extended and applied to other argumentation frameworks.
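The notion of warrant, an argument ultimately prevailing over its counterarguments, can be illustrated with a generic Dung-style sketch (our illustration, not the LDSar framework itself): the grounded extension of an abstract argumentation framework, computed by iterating the characteristic function from the empty set.

```python
def grounded_extension(arguments, attacks):
    """Compute the grounded extension of an abstract argumentation
    framework. An argument is acceptable w.r.t. a set S if S attacks
    every attacker of it; iterating this from the empty set yields the
    least fixed point, a standard skeptical notion of warrant."""
    def acceptable(arg, S):
        attackers = {a for (a, b) in attacks if b == arg}
        return all(any((c, a) in attacks for c in S) for a in attackers)

    S = set()
    while True:
        nxt = {a for a in arguments if acceptable(a, S)}
        if nxt == S:
            return S  # fixed point reached: the warranted arguments
        S = nxt

# a attacks b, b attacks c: a is unattacked, so warranted;
# b is defeated by a; c is defended by a and so also warranted.
args = {"a", "b", "c"}
atts = {("a", "b"), ("b", "c")}
print(sorted(grounded_extension(args, atts)))  # ['a', 'c']
```

The iteration is monotone on finite frameworks, so it always terminates; richer systems like LDSar add structure (labels, argument trees) on top of this shared skeleton.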
This article applies two new criteria, desirability and faithfulness, to evaluate Peli et al.'s (1994) formalization of Hannan and Freeman's structural inertia argument (1984, 1989). We conclude that this formalization fails to meet these criteria. We argue that part of the rational reconstruction on which this formalization builds does not reflect the substantive argument well in translating the natural language theory into logic. We propose two alternative formalizations that meet both of these criteria. Moreover, both derive the inertia theorem from much weaker, and so much less constraining, premises. While both new formalizations draw information only from the original statement of the inertia theory, they reflect two different interpretations of inertia accumulation. The two new formalizations are compatible with some recent theory extensions in organizational ecology. However, they lead to substantially different consequences when additional sociological considerations are added to their premise sets. The interplay between logical formalization and sociological content is highlighted using the example of Stinchcombe's (1965) liability-of-newness theorem. Even modest extensions of the proposed models lead to contrary implications about the age dependence in organizational mortality rates. Even "faithful" logical formalizations of arguments ordinarily involve implicit theory building.
Proof assistants are software-based tools that are used in the mechanization of proof construction and validation in mathematics and computer science, and also in certified program development. Different such tools are being increasingly used in order to accelerate and simplify proof checking, and the Coq proof assistant is one of the best known and most widely used in large-scale projects. Language and automata theory is a well-established area of mathematics, relevant to computer science foundations and information technology. In particular, context-free language theory is of fundamental importance in the analysis, design, and implementation of computer programming languages. This work describes a formalization effort, using the Coq proof assistant, of fundamental results of the classical theory of context-free grammars and languages. These include closure properties (union, concatenation, and Kleene star), grammar simplification (elimination of useless symbols, inaccessible symbols, empty rules, and unit rules), the existence of a Chomsky Normal Form for context-free grammars and the Pumping Lemma for context-free languages. The result is an important set of libraries covering the main results of context-free language theory, with more than 500 lemmas and theorems fully proved and checked. As it turns out, this is a comprehensive formalization of the classical context-free language theory in the Coq proof assistant and includes the formalization of the Pumping Lemma for context-free languages. The perspectives for the further development of this work are diverse and can be grouped into three different areas: inclusion of new devices and results, code extraction, and general enhancements of its libraries.
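One of the grammar simplifications mentioned, elimination of empty rules, begins by computing the nullable nonterminals, those that derive the empty string. A sketch of that first step (in Python rather than Coq, and independent of the libraries described):

```python
def nullable_symbols(rules):
    """Given context-free rules as (lhs, rhs) pairs, where rhs is a
    tuple of symbols and the empty tuple encodes an empty rule, return
    the set of nonterminals that derive the empty string. Terminals are
    never added, so a rhs containing one blocks nullability as it should."""
    nullable = set()
    changed = True
    while changed:  # iterate to a fixed point
        changed = False
        for lhs, rhs in rules:
            if lhs not in nullable and all(s in nullable for s in rhs):
                nullable.add(lhs)
                changed = True
    return nullable

# S -> A B | 'a';  A -> (empty);  B -> A A | 'b'
rules = [("S", ("A", "B")), ("S", ("a",)), ("A", ()),
         ("B", ("A", "A")), ("B", ("b",))]
print(sorted(nullable_symbols(rules)))  # ['A', 'B', 'S']
```

Once the nullable set is known, empty-rule elimination rewrites each rule by optionally dropping nullable symbols from its right-hand side; in the Coq development each such step comes with a proof that the generated language is preserved.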
We present a formalization of first-order predicate calculus with equality which, unlike traditional systems with axiom schemata or substitution rules, is finitely axiomatized in the sense that each step in a formal proof admits only finitely many choices. This formalization is primarily based on the inference rule of condensed detachment of Meredith. The usual primitive notions of free variable and proper substitution are absent, making it easy to verify proofs in a machine-oriented application. Completeness results are presented. The example of Zermelo-Fraenkel set theory is shown to be finitely axiomatized under the formalization. The relationship with resolution-based theorem provers is briefly discussed. A closely related axiomatization of traditional predicate calculus is shown to be complete in a strong metamathematical sense.
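As an illustration of the machinery involved, the following is a minimal Python sketch of condensed detachment: given a major premise C(p, q) (an implication) and a minor premise, it renames the minor premise's variables apart, computes a most general unifier of the antecedent p with the minor premise, and returns the instantiated consequent. The term encoding and function names here are our own, not Meredith's notation.

```python
# Minimal sketch of condensed detachment (hypothetical encoding, not
# Meredith's notation): terms are either variables (plain strings) or
# tuples ('C', antecedent, consequent) for material implication.

def subst(s, t):
    """Apply substitution s to term t, following binding chains."""
    if isinstance(t, str):
        return subst(s, s[t]) if t in s else t
    return (t[0],) + tuple(subst(s, a) for a in t[1:])

def unify(a, b):
    """Return a most general unifier of terms a and b, or None."""
    s = {}
    def walk(t):
        while isinstance(t, str) and t in s:
            t = s[t]
        return t
    def occurs(v, t):
        t = walk(t)
        return v == t if isinstance(t, str) else any(occurs(v, x) for x in t[1:])
    def go(a, b):
        a, b = walk(a), walk(b)
        if a == b:
            return True
        if isinstance(a, str):
            if occurs(a, b):
                return False
            s[a] = b
            return True
        if isinstance(b, str):
            return go(b, a)
        if a[0] != b[0] or len(a) != len(b):
            return False
        return all(go(x, y) for x, y in zip(a[1:], b[1:]))
    return s if go(a, b) else None

def rename(t, suffix="'"):
    """Rename all variables in t so major and minor premises share none."""
    if isinstance(t, str):
        return t + suffix
    return (t[0],) + tuple(rename(a, suffix) for a in t[1:])

def condensed_detachment(major, minor):
    """From C(p, q) and r, return sigma(q), where sigma unifies p and r."""
    assert major[0] == 'C'
    s = unify(major[1], rename(minor))
    return None if s is None else subst(s, major[2])

# Example: major = syllogism axiom (p→q)→((q→r)→(p→r)), minor = p→p;
# condensed detachment yields (p'→r)→(p'→r).
syll = ('C', ('C', 'p', 'q'), ('C', ('C', 'q', 'r'), ('C', 'p', 'r')))
ident = ('C', 'p', 'p')
print(condensed_detachment(syll, ident))
```

Because the conclusion is the most general result obtainable by substitution and detachment, a proof step need only name the two premises, which is what makes machine verification of such proofs straightforward.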
Łoś's formalization of intensional functions was made for the purpose of a many-valued interpretation of the belief operators within the scope of the classical logic system. The first aim of the paper is to present and discuss this rather little-known many-valued construction and its properties. The fact that the many-valuedness of Łoś systems is purely formal (their characteristic matrices are Boolean) calls for further consideration. Departing from intrinsic similarities of the tables for the epistemic operators to information functions, we show that Łoś structures may be rewritten as special knowledge representation systems. These systems use 0 and 1 as the only values and are called "epistemic". Their role for the theory of knowledge information systems may be compared to that of the functionally complete matrices in the class of all logical matrices for a given propositional language.
This essay examines the conjunction of French historical epistemology and Lacanian theory in postwar France. In particular, Lacan's account of scientific formalization is scrutinized insofar as it develops aspects of the prior epistemological research of Gaston Bachelard, whose innovative approach to the problem of the nature and limits of scientific knowledge proved so influential on the subsequent field of French structuralism. Lacan's reflections on formalization will be shown, in contrast to Bachelard, to place an emphasis on the constitutive and limiting role of language in its interaction with logical and scientific projects. In asking how Lacan's structural psychoanalysis extends and subverts the rationalist emphasis of French philosophy of science, I hope to provide a new optic through which to assess the role of formalization in critical theory today.
Problem: The article seeks to tackle three problems of Mitterer's non-dualistic philosophy. Firstly, the key term description remains not only rather unclear and rudimentary but also isolated from relevant neighboring terms and theories of other disciplines. Secondly, a logical reconstruction and formal model of non-dualism is still lacking. Thirdly, there are hardly any extensions of philosophical non-dualism to non-philosophical disciplines and fields. Findings: The three main findings of the article correspond to the abovementioned problems. Firstly, the non-dualistic term description will be connected to the sociological and semiotic term meaning by emphasizing their semantic-pragmatic similarities. Moreover, a common and distinction-theoretic conceptualization of both terms will be proposed. Secondly, a non-dualistic formalization and logical reconstruction will be elaborated by deducing non-dualism from dualism using the operation of re-entry. Thirdly, the non-dualistic formalization will be applied to the classical semiotic triangle, resulting in the elaboration of a non-dualistic semiotic triangle. Benefits: The aforementioned findings have two possible benefits. Firstly, the compatibility between the terms description and meaning makes philosophical non-dualism connectable to social science approaches, especially to sociology and semiotics. This may be an important avenue for interdisciplinary cross-fertilization and co-operation. Secondly, the formalization and logical deduction may help to clarify and make explicit non-dualism's main arguments and implicit assumptions.
The problems we deal with concern reasoning about incomplete knowledge. Knowledge is understood as the ability of an ideal rational agent to make decisions about pieces of information. The formalisms we are particularly interested in are Moore's autoepistemic logic (AEL) and its variant, the logic of acceptance and rejection (AEL2). It is well known that AEL may be seen as the nonmonotonic KD45 modal logic. The aim is to give an appropriate modal formalization for AEL2.
The most difficult problem that Leśniewski came across in constructing his system of the foundations of mathematics was the problem of defining definitions, as he used to put it. He solved it to his satisfaction only when he had completed the formalization of his protothetic and ontology. By formalization of a deductive system one ought to understand in this context the statement, as precise and unambiguous as possible, of the conditions an expression has to satisfy if it is added to the system as a new thesis. Now, some protothetical theses, and some ontological ones, included in the respective systems, happen to be definitions. In the present essay I employ Leśniewski's method of terminological explanations for the purpose of formalizing Łukasiewicz's system of implicational calculus of propositions, which system, without having recourse to quantification, I extended some time ago into a functionally complete system. This I achieved by allowing for a rule of implicational definitions, which enabled me to define any proposition-forming functor for any finite number of propositional arguments.
The aim of this paper is to propose a criterion of finite detachment-substitutional formalization for normal modal systems. The criterion will comprise only those normal modal systems which are finitely axiomatizable by means of the substitution, detachment for material implication and Gödel rules.
After several decades during which formalization has flourished it now becomes possible to detect its shortcomings. A definition of formalization is given at the outset. It is next shown that the main justification of formalization as making explicit the form of a proof has serious difficulties. An important shortcoming is found in the fact that many validation procedures in logic and mathematics are not adequately represented deductively. Several such procedures relating to the validation of logical and mathematical sentences are examined. It is concluded that formalization is materially inadequate.
This paper discusses six formalization techniques, of varying strengths, for extending a formal system based on traditional mathematical logic. The purpose of these formalization techniques is to simulate the introduction of new syntactic constructs, along with associated semantics for them. We show that certain techniques (among the six) subsume others. To illustrate sharpness, we also consider a selection of constructs and show which techniques can and cannot be used to introduce them. The six studied techniques were selected on the basis of actual practice in logic and computing. They do not form an exhaustive list.
An investigation of what might be called the logical formalization of the process of theory change due to anomalies is presented. By an anomaly, we mean an observed fact that falls within the explanatory scope of a theory but does not agree with the theory's predictions. A classical approach to restoring the explanatory power of a theory faced with an anomaly is to propose new, tentative auxiliary hypotheses which, along with part of the old set of auxiliary hypotheses, are able to resolve the anomaly. After drawing some conclusions about the structure of such a process, we propose a multi-modal and non-monotonic logical framework able to represent some key aspects of this important facet of the dynamics of scientific theories. Owing to the necessity of accommodating incompatible tentative hypotheses, this framework incorporates a weak form of paraconsistency. As a case study, we analyze the anomalous behaviour of the planet Uranus, which threatened Newtonian celestial mechanics for more than half a century and gave rise to the discovery of Neptune.