IΣ_n and BΣ_n are well-known fragments of first-order arithmetic with induction and collection for Σ_n formulas respectively; IΣ_n^0 and BΣ_n^0 are their second-order counterparts. RCA_0 is the well-known fragment of second-order arithmetic with recursive comprehension; WKL_0 is RCA_0 plus weak König's lemma. We first strengthen Harrington's conservation result by showing that WKL_0 + BΣ_n^0 is Π^1_1-conservative over RCA_0 + BΣ_n^0. Then we develop some model theory in WKL_0 and illustrate the use of formalized model theory by giving a relatively simple proof of the fact that IΣ_1 proves BΣ_{n+1} to be Π_{n+2}-conservative over IΣ_n. Finally, we present a proof-theoretic proof of the stronger fact that the Π_{n+2} conservation result is provable already in IΔ_0 + superexp. Thus IΣ_{n+1} proves 1-Con(BΣ_{n+1}) and IΔ_0 + superexp proves Con(IΣ_n) ↔ Con(BΣ_{n+1}).
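For readability, the three conservation claims stated above can be set out in display form (this is only a restatement of the abstract's results in standard notation, not additional content from the paper):

\[
\begin{aligned}
&(1)\quad \mathrm{WKL}_0 + B\Sigma^0_n \ \text{is}\ \Pi^1_1\text{-conservative over}\ \mathrm{RCA}_0 + B\Sigma^0_n;\\
&(2)\quad I\Sigma_1 \vdash \text{``}B\Sigma_{n+1}\ \text{is}\ \Pi_{n+2}\text{-conservative over}\ I\Sigma_n\text{''};\\
&(3)\quad I\Delta_0 + \mathrm{superexp} \vdash \mathrm{Con}(I\Sigma_n) \leftrightarrow \mathrm{Con}(B\Sigma_{n+1}).
\end{aligned}
\]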
The article presents a formalization of Thomas Aquinas' proof of the indestructibility of the human soul. The author of the formalization—the first of its kind in the history of philosophy—is Father Joseph Maria Bocheński. The presentation involves no more than updating the logical symbolism used and accompanies the logical formulae with ordinary-language paraphrases in order to ease the reader’s understanding of the formulae. “The fundamental idea of the Thomist proof is of utmost simplicity: things which are destructible are destructible either per se or per accidens; but the human soul is destructible neither per se nor per accidens: therefore the human soul is not destructible”. These are Bocheński’s words; realizing them required considerable effort to make the symbolic language precise and maximally adequate to Thomas’ discourse. Moreover, I have thought it necessary to provide an ample commentary on the traditional and contemporary semantical presuppositions of Aquinas' philosophical anthropology in light of Bocheński’s interpretation thereof.
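The quoted argument has a simple logical skeleton, which might be sketched as follows (my illustrative rendering only, not Bocheński's actual formalization; the predicate letters D, P, A and the constant s are hypothetical):

\[
\forall x\,\bigl(D(x)\rightarrow(P(x)\lor A(x))\bigr),\qquad \neg P(s)\land\neg A(s)\ \vdash\ \neg D(s)
\]

where D(x) reads "x is destructible", P(x) "x is destructible per se", A(x) "x is destructible per accidens", and s denotes the human soul.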
It has been accepted since the early part of the twentieth century that there is no problem formalizing mathematics in standard formal systems of axiomatic set theory. Most people feel that they know as much as they ever want to know about how one can reduce natural numbers, integers, rationals, reals, and complex numbers to sets, and prove all of their basic properties. Furthermore, they feel that this can continue through more and more complicated material, and that there is never a real problem.
This paper presents the behavioral interview model that we developed to formalize our hiring practices when we most recently needed to hire a new clinical ethicist to join our staff at the Center for Ethics at Washington Hospital Center.
This paper reviews Mark Addis's case that formalizing a notion of surveyability as a metamathematical predicate demarcating strict finitistic mathematics would require great effort for scant philosophical profit. It is then suggested how the notion of surveyability is useful in informal philosophizing about mathematics.
Employing the theory of Birkhoff polarities as a model of model theory yields an inductively defined dual structure which is a formalization of semantics and which allows for simple proofs of some new results for model theory.
It is often assumed that cognitive science is built upon folk psychology, and that challenges to folk psychology are therefore challenges to cognitive science itself. We argue that, in practice, cognitive science and folk psychology treat entirely non-overlapping domains: cognitive science considers aspects of mental life which do not depend on general knowledge, whereas folk psychology considers aspects of mental life which do depend on general knowledge. We back up our argument on theoretical grounds, and also illustrate the separation between cognitive scientific and folk psychological phenomena in a number of cognitive domains. We consider the methodological and theoretical significance of our arguments for cognitive science research.
Exact sciences are described as sciences whose theories are formalized. These are contrasted to inexact sciences, whose theories are not formalized. Formalization is described as a broader category than mathematization, involving any form/content distinction allowing forms, e.g., as represented in theoretical models, to be studied independently of the empirical content of a subject-matter domain. Exactness is a practice depending on the use of theories to control subject-matter domains and to align theoretical with empirical models and not merely a state of a science. Inexact biological sciences tolerate a degree of “mismatch” between theoretical and empirical models and concepts. Three illustrations from biological sciences are discussed in which formalization is achieved by various means: Mendelism, Weismannism, and Darwinism. Frege’s idea of a “conceptual notation” is used to further characterize the notion of a form/content distinction.
We propose a formal representation of objects, whether mathematical or empirical. The powerful framework inside which we represent them in a unique and coherent way is grounded, on the formal side, in a logical approach with a direct mathematical semantics in the well-established field of constructive topology, and, on the philosophical side, in a neo-Kantian perspective emphasizing the knowing subject’s role, which is constructive for the mathematical objects and constitutive for the empirical ones.
This paper argues that just as full beliefs can constitute knowledge, so can properties of your credence distribution. The resulting notion of probabilistic knowledge helps us give a natural account of knowledge ascriptions embedding language of subjective uncertainty, and a simple diagnosis of probabilistic analogs of Gettier cases. Just like propositional knowledge, probabilistic knowledge is factive, safe, and sensitive. And it helps us build knowledge-based norms of action without accepting implausible semantic assumptions or endorsing the claim that knowledge is interest-relative.
A series of representations must be semantics-driven if the members of that series are to combine into a single thought. Where semantics is not operative, there is at most a series of disjoint representations that add up to nothing true or false, and therefore do not constitute a thought at all. There is necessarily a gulf between simulating thought, on the one hand, and actually thinking, on the other. A related point is that a popular doctrine - the so-called 'computational theory of mind' (CTM) - is based on a confusion. CTM is the view that thought-processes consist in 'computations', where a computation is defined as a 'form-driven' operation on symbols. The expression 'form-driven operation' is ambiguous, and may refer either to syntax-driven operations or to morphology-driven operations. Syntax-driven operations presuppose the existence of operations that are driven by semantic and extra-semantic knowledge. So CTM is false if the terms 'computation' and 'form-driven operation' are taken to refer to syntax-driven operations. Thus, if CTM is to work, those expressions must be taken to refer to morphology-driven operations; and CTM therefore fails, given that an operation must be semantics-driven if it is to qualify as a thought. CTM therefore fails on every disambiguation of the expressions 'formal operation' and 'computation,' and it is therefore false.
Formal ontology as it is presented in Husserl's Third Logical Investigation can be interpreted as a fundamental tool to describe objects in a formal sense. One of its main sources is presented: chapter five of Carl Stumpf's Über den psychologischen Ursprung der Raumvorstellung (1873). It is then described how Husserlian Formal Ontology is applied in the Fifth Logical Investigation. Finally, it is applied to dramatic structures, in the spirit of Roman Ingarden.
This paper aims to argue for two related statements: first, that formal semantics should not be conceived of as interpreting natural language expressions in a single model (a very large one representing the world as a whole, or something like that) but as interpreting them in many different models (formal counterparts, say, of little fragments of reality); second, that accepting such a conception of formal semantics yields a better comprehension of the relation between semantics and pragmatics and of the role to be played by formal semantics in the general enterprise of understanding meaning. For this purpose, three kinds of arguments are given: firstly, empirical arguments showing that the many models approach is the most straightforward and natural way of giving a formal counterpart to natural language sentences. Secondly, logical arguments proving the logical impossibility of a single universal model. And thirdly, theoretical arguments to the effect that such a conception of formal semantics fits in a natural and fruitful way with pragmatic theories and facts. In passing, this conception will be shown to cast some new light on the old problems raised by liar and sorites paradoxes.
It is often claimed that emotions are linked to formal objects. But what are formal objects? What roles do they play? According to some philosophers, formal objects are axiological properties which individuate emotions, make them intelligible and give their correctness conditions. In this paper, I evaluate these claims in order to answer the above questions. I first give reasons to doubt the thesis that formal objects individuate emotions. Second, I distinguish different ways in which emotions are intelligible and argue that philosophers are wrong in claiming that emotions only make sense when they are based on prior sources of axiological information. Third, I investigate how issues of intelligibility connect with the correctness conditions of emotions. I defend a theory according to which emotions do not respond to axiological information, but to non-axiological reasons. According to this theory, we can allocate fundamental roles to the formal objects of emotions while dispensing with the problematic features of other theories.
Much philosophy of logic is shaped, explicitly or implicitly, by the thought that logic is distinctively formal and abstracts from material content. The distinction between formal and material does not appear to coincide with the more familiar contrasts between a priori and empirical, necessary and contingent, analytic and synthetic—indeed, it is often invoked to explain these. Nor, it turns out, can it be explained by appeal to schematic inference patterns, syntactic rules, or grammar. What does it mean, then, to say that logic is distinctively formal?
Whether human thinking can be formalized and whether machines can think in a human sense are questions that have been addressed by both Peirce and Searle. Peirce came to roughly the same conclusion as Searle, that the digital computer would not be able to perform human thinking or possess human understanding. However, his rationale and Searle's differ on several important points. Searle approaches the problem from the standpoint of traditional analytic philosophy, where the strict separation of syntax and semantics renders understanding impossible for a purely syntactical device. Peirce disagreed with that analysis, but argued that the computer would only be able to achieve algorithmic thinking, which he considered the simplest type. Although their approaches were radically dissimilar, their conclusions were not. I will compare and analyze the arguments of both Peirce and Searle on this issue, and outline some implications of their conclusions for the field of Artificial Intelligence.
Nelson Goodman’s new riddle of induction forcefully illustrates a challenge that must be confronted by any adequate theory of inductive inference: provide some basis for choosing among alternative hypotheses that fit past data but make divergent predictions. One response to this challenge is to distinguish among alternatives by means of some epistemically significant characteristic beyond fit with the data. Statistical learning theory takes this approach by showing how a concept similar to Popper’s notion of degrees of testability is linked to minimizing expected predictive error. In contrast, formal learning theory appeals to Ockham’s razor, which it justifies by reference to the goal of enhancing efficient convergence to the truth. In this essay, I show that, despite their differences, statistical and formal learning theory yield precisely the same result for a class of inductive problems that I call strongly VC ordered, of which Goodman’s riddle is just one example.
In this paper we propose an approach to vagueness characterised by two features. The first one is philosophical: we move along a Kantian path emphasizing the knowing subject’s conceptual apparatus. The second one is formal: to face vagueness, and our philosophical view on it, we propose to use topology and formal topology. We show that the Kantian and the topological features, joined together, allow for an atypical but promising way of considering vagueness.
This article explores the relation between the concept of symmetry and its formalisms. The standard view among philosophers and physicists is that symmetry is completely formalized by mathematical groups. For some mathematicians, however, the groupoid is a competing and more general formalism. An analysis of symmetry that justifies this extension has not been adequately spelled out. After a brief explication of how groups, equivalence, and symmetry classes are related, we show that, while it’s true in some instances that groups are too restrictive, there are other instances for which the standard extension to groupoids is too unrestrictive. The connection between groups and equivalence classes, when generalized to groupoids, suggests a middle ground between the two.
Truth is a fundamental objective of adjudicative processes; ideally, substantive as distinct from formal legal truth. But problems of evidence, for example, may frustrate the finding of substantive truth; other values may lead to exclusions of probative evidence, e.g., for the sake of fairness. So too jury nullification and jury equity. Limits of time, and definitiveness of decision, require allocation of burden of proof. Degree of truth-formality is variable within a system and across systems.
The issue of the relationship between formal and informal logic depends strongly on how one understands these two designations. While there is very little disagreement about the nature of formal logic, the same is not true regarding informal logic, which is understood in various (often incompatible) ways by various thinkers. After reviewing some of the more prominent conceptions of informal logic, I will present my own, defend it and then show how informal logic, so understood, is complementary to formal logic.
We consider the question: under what circumstances can the concept of adaptation be applied to groups, rather than individuals? Gardner and Grafen (2009, J. Evol. Biol. 22: 659–671) develop a novel approach to this question, building on Grafen's ‘formal Darwinism’ project, which defines adaptation in terms of links between evolutionary dynamics and optimization. They conclude that only clonal groups, and to a lesser extent groups in which reproductive competition is repressed, can be considered as adaptive units. We re-examine the conditions under which the selection–optimization links hold at the group level. We focus on an important distinction between two ways of understanding the links, which have different implications regarding group adaptationism. We show how the formal Darwinism approach can be reconciled with G.C. Williams’ famous analysis of group adaptation, and we consider the relationships between group adaptation, the Price equation approach to multi-level selection, and the alternative approach based on contextual analysis.
The problem of concept representation is relevant for many sub-fields of cognitive research, including psychology and philosophy, as well as artificial intelligence. In particular, in recent years it has received a great deal of attention within the field of knowledge representation, due to its relevance both for knowledge engineering and for ontology-based technologies. However, the notion of a concept itself turns out to be highly disputed and problematic. In our opinion, one of the causes of this state of affairs is that the notion of a concept is, to some extent, heterogeneous, and encompasses different cognitive phenomena. This results in a strain between conflicting requirements, such as compositionality, on the one hand, and the need to represent prototypical information on the other. In some ways artificial intelligence research shows traces of this situation. In this paper, we propose an analysis of this current state of affairs. Since it is our opinion that a mature methodology with which to approach knowledge representation and knowledge engineering should also take advantage of the empirical results of cognitive psychology concerning human abilities, we outline some proposals for concept representation in formal ontologies which take into account suggestions from psychological research. Our basic assumption is that knowledge representation systems whose design takes into account evidence from experimental psychology (and which are therefore more similar to the human way of organizing and processing information) may give better results in many applications (e.g. in the fields of information retrieval and the semantic web).
One of the most interesting and entertaining philosophical discussions of the last few decades is the discussion between Daniel Dennett and John Searle on the existence of intrinsic intentionality. Dennett denies the existence of phenomena with intrinsic intentionality. Searle, however, is convinced that some mental phenomena exhibit intrinsic intentionality. In my view, this discussion has been obscured by some serious misunderstandings with regard to the concept ‘intrinsic intentionality’. For instance, most philosophers fail to realize that it is possible that the intentionality of a phenomenon is partly intrinsic and partly observer relative. Moreover, many philosophers are mixing up the concepts ‘original intentionality’ and ‘intrinsic intentionality’. In fact, there is, in the philosophical literature, no strict and unambiguous definition of the concept ‘intrinsic intentionality’. In this article, I will try to remedy this. I will also try to give strict and unambiguous definitions of the concepts ‘observer relative intentionality’, ‘original intentionality’, and ‘derived intentionality’. These definitions will be used for an examination of the intentionality of formal mathematical systems. In conclusion, I will make a comparison between the (intrinsic) intentionality of formal mathematical systems on the one hand, and the (intrinsic) intentionality of human beings on the other hand.
As Vincent Hendricks remarks early on in this book, the formal and mainstream traditions of epistemic theorising have mostly evolved independently of each other. This initial impression is confirmed by a comparison of the main problems and methods practitioners in each tradition are concerned with. Mainstream epistemology engages in a dialectical game of proposing and challenging definitions of knowledge. Formal epistemologists proceed differently, as they design a wide variety of axiomatic and model-theoretic methods whose consequences they investigate independently of the need of giving counterexample-free definitions of knowledge. Or at least, this is a common way to explain where both disciplines stand in the larger landscape of epistemic theorising, and why interactions between them remain scarce. The main ambition of this book is to show that the distinction between formal and mainstream approaches should not preclude a fruitful interaction, and that it only takes the right outlook on their respective practices to disclose plenty of room for interaction.
There is a long-standing debate whether propositions, sentences, statements or utterances provide an answer to the question of what objects logical formulas stand for. Based on the traditional understanding of logic as a science of valid arguments, this question is firstly framed more exactly, making explicit that it calls not only for identifying some class of objects, but also for explaining their relationship to ordinary language utterances. It is then argued that there are strong arguments against the proposals commonly put forward in the debate. The core of the problem is that an informative account of the objects formulas stand for presupposes a theory of formalization; that is, a theory that explains what formulas may adequately substitute for an inference in proofs of validity. Although such theories are still subject to research, some consequences can be drawn from an analysis of the reasons why the common accounts featuring sentences, propositions or utterances fail. Theories of formalization cannot refer to utterances qua expressions of propositions; instead they may refer to sentences and rely on additional information about linguistic structure and pragmatic context.
Using Carnap's concept of explication, we propose a theory of concept formation in mathematics. This theory is then applied to the problem of how to understand the relation between the concepts of formal proof (deduction) and informal, mathematical proof.
The advantages and disadvantages of formalization in philosophy are summarized. It is concluded that formalized philosophy is an endangered speciality that needs to be revitalized and to increase its interactions with non-formalized philosophy. The enigmatic style that is common in philosophical logic must give way to explicit discussions of the problematic relationship between formal models and the philosophical concepts and issues that motivated their development.
An aging population is often taken to require a profound reorganization of the prevailing health care system. In particular, a more cost-effective care system is warranted and ICT-based home care is often considered a promising alternative. Modern health care devices admit a transfer of patients with rather complex care needs from institutions to the home care setting. With care recipients set up with health monitoring technologies at home, spouses and children are likely to become involved in the caring process and informal caregivers may have to assist kin-persons with advanced care needs by means of sophisticated technology. This paper investigates some of the ethical implications of a near-future shift from institutional care to technology-assisted home care and the subsequent impact on the care recipient and on formal and informal care providers.
The purpose of this study was to extend the knowledge about why procedural justice (PJ) has behavioral implications within organizations. Since prior studies show that PJ leads to legitimacy, the author suggests that, when formal regulations are unfairly implemented, they lose their validity or efficacy (becoming deactivated even if they are formally still in force). This "rule deactivation," in turn, leads to two proposed destructive work behaviors, namely, workplace deviance and decreased citizenship behaviors (OCBs). The results support this mediating role of PJD, thus suggesting that it forms part of the generative mechanism through which unfair procedures influence (un)ethical behavior within organizations. The author ends the article by discussing behavioral ethics and managerial implications as well as suggestions for future research.
Replies to Kevin de Laplante’s ‘Certainty and Domain-Independence in the Sciences of Complexity’ (de Laplante, 1999), defending the thesis of J. Franklin, ‘The formal sciences discover the philosophers’ stone’, Studies in History and Philosophy of Science, 25 (1994), 513-33, that the sciences of complexity can combine certain knowledge with direct applicability to reality.
In the paper we build up the ontology of Leśniewski’s type for formalizing synthetic propositions. We claim that for these propositions an unconventional square of opposition holds, where a, i are contrary, a, o (resp. e, i) are contradictory, e, o are subcontrary, and a, e (resp. i, o) are said to stand in the subalternation. Further, we construct a non-Archimedean extension of Boolean algebra and show that in this algebra just two squares of opposition are formalized: the conventional one and the square that we invented. As a result, we can claim that there are only two basic squares of opposition. All basic constructions of the paper (the new square of opposition, the formalization of synthetic propositions within ontology of Leśniewski’s type, the non-Archimedean explanation of the square of opposition) are introduced for the first time.
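For reference, the relations that the abstract assigns to its unconventional square can be set out explicitly (this is only a tabulation of the claims above, using the traditional a, e, i, o labels):

\[
\begin{array}{ll}
\text{contrary pairs:} & a,\,i\\
\text{contradictory pairs:} & a,\,o \quad\text{and}\quad e,\,i\\
\text{subcontrary pairs:} & e,\,o\\
\text{subalternation pairs:} & a,\,e \quad\text{and}\quad i,\,o
\end{array}
\]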
Three things are presented: how Hilbert changed the original construction postulates of his geometry into existential axioms; in what sense he formalized geometry; and how elementary geometry is formalized to present-day standards.
Theoretical concepts which may be applied in formalized contexts as well as in empirical research are relatively rare. The concept of ‘subjective information’ used by cybernetically oriented pedagogists (learning theorists) seems to be applicable in both ways. An analysis of its logical and empirical validity shows, however, that the basis of this concept is ambiguous. The concept is defined on the basis of statistical probability, but implicitly interpreted on the basis of inductive probability. Thus the measurement and also the strategic application of subjective information are doubtful. This concept does not seem to be useful as a paradigm for the solution of analogous problems.
The paper introduces and formally defines a functional concept of a measuring system, on this basis characterizing measurement as an evaluation performed by means of a calibrated measuring system. The distinction between exact and uncertain measurement is formalized in terms of the properties of the traceability chain joining the measuring system to the primary standard. The consequence is drawn that uncertain measurements lose the property of relation-preservation, on which the very concept of measurement is founded according to the representational viewpoint. Finally, from the analysis of the inter-relations between calibration and measurement, the fundamental reasons for the claimed objectivity and intersubjectivity of measurement are highlighted, a valuable epistemological result that characterizes measurement as a particular kind of evaluation.
The methodological nonreductionism of contemporary biology opens an interesting discussion on the level of ontology and the philosophy of nature. The theory of emergence (EM), and downward causation (DC) in particular, bring a new set of arguments challenging not only methodological, but also ontological and causal reductionism. This argumentation provides a crucial philosophical foundation for the science/theology dialogue. However, a closer examination shows that proponents of EM do not present a unified and consistent definition of DC. Moreover, they find it difficult to prove that higher-order properties can be causally significant without violating the causal laws that operate at lower physical levels. They also face the problem of circularity and incoherence in their explanation. In our article we show that these problems can be overcome only if DC is understood in terms of formal rather than physical (efficient) causality. This breakdown of causal monism in science opens a way to the retrieval of the fourfold Aristotelian notion of causality.