Changes in an upper level ontology have obvious consequences for the domain ontologies that use it at lower levels. It is therefore crucial to document the changes made between successive versions of ontologies of this kind. We describe and apply a method for tracking, explaining and measuring changes between successive versions of upper level ontologies such as the Basic Formal Ontology (BFO). The proposed change-tracking method extends earlier work on Realism-Based Ontology Versioning (RBOV) and Evolutionary Terminology Auditing (ETA). We describe here the application of this evaluation method to changes between BFO 1.0, BFO 1.1, and BFO 2.0. We discuss the issues raised by this application and describe the extensions which we added to the original evaluation schema in order to account for changes in an ontology of this type. Our results show that BFO has undergone eight types of changes that can be systematically explained by the extended evaluation schema. Finally, we discuss problematic cases, possible pitfalls and certain limits of our study that we propose to address in future work.
We propose a formal representation of objects, whether mathematical or empirical. The powerful framework within which we represent them in a unique and coherent way is grounded, on the formal side, in a logical approach with a direct mathematical semantics in the well-established field of constructive topology, and, on the philosophical side, in a neo-Kantian perspective emphasizing the knowing subject’s role, which is constructive for mathematical objects and constitutive for empirical ones.
It is often claimed that emotions are linked to formal objects. But what are formal objects? What roles do they play? According to some philosophers, formal objects are axiological properties which individuate emotions, make them intelligible and provide their correctness conditions. In this paper, I evaluate these claims in order to answer the above questions. I first give reasons to doubt the thesis that formal objects individuate emotions. Second, I distinguish different ways in which emotions are intelligible and argue that philosophers are wrong in claiming that emotions only make sense when they are based on prior sources of axiological information. Third, I investigate how issues of intelligibility connect with the correctness conditions of emotions. I defend a theory according to which emotions do not respond to axiological information, but to non-axiological reasons. According to this theory, we can allocate fundamental roles to the formal objects of emotions while dispensing with the problematic features of other theories.
In his argument for the possibility of knowledge of spatial objects, in the Transcendental Deduction of the B-version of the Critique of Pure Reason, Kant makes a crucial distinction between space as “form of intuition” and space as “formal intuition.” The traditional interpretation regards the distinction between the two notions as reflecting a distinction between indeterminate space and determinations of space by the understanding, respectively. By contrast, a recent influential reading has argued that the two notions can be fused into one and that space as such is first generated by the understanding through an act of synthesis of the imagination. Against this reading, this article argues that a key characteristic of space as a form of intuition is its nonconceptual unity, which defines the properties of space and is as such necessarily independent of determination by the understanding through the transcendental synthesis of the imagination. The conceptual unity that the understanding prescribes to the manifold in intuition, by means of the categories, defines the formal intuition. Furthermore, this article argues that it is the sui generis, nonconceptual unity of space, when taken as a unity for the understanding by means of conceptual determination, that first enables geometric knowledge and knowledge of spatially located particulars.
We consider the question: under what circumstances can the concept of adaptation be applied to groups, rather than individuals? Gardner and Grafen (2009, J. Evol. Biol. 22: 659–671) develop a novel approach to this question, building on Grafen's ‘formal Darwinism’ project, which defines adaptation in terms of links between evolutionary dynamics and optimization. They conclude that only clonal groups, and to a lesser extent groups in which reproductive competition is repressed, can be considered as adaptive units. We re-examine the conditions under which the selection–optimization links hold at the group level. We focus on an important distinction between two ways of understanding the links, which have different implications regarding group adaptationism. We show how the formal Darwinism approach can be reconciled with G.C. Williams’ famous analysis of group adaptation, and we consider the relationships between group adaptation, the Price equation approach to multi-level selection, and the alternative approach based on contextual analysis.
The problem of concept representation is relevant for many sub-fields of cognitive research, including psychology and philosophy, as well as artificial intelligence. In particular, in recent years it has received a great deal of attention within the field of knowledge representation, due to its relevance for both knowledge engineering and ontology-based technologies. However, the notion of a concept itself turns out to be highly disputed and problematic. In our opinion, one of the causes of this state of affairs is that the notion of a concept is, to some extent, heterogeneous, and encompasses different cognitive phenomena. This results in a strain between conflicting requirements, such as compositionality, on the one hand and the need to represent prototypical information on the other. In some ways artificial intelligence research shows traces of this situation. In this paper, we propose an analysis of this current state of affairs. Since it is our opinion that a mature methodology with which to approach knowledge representation and knowledge engineering should also take advantage of the empirical results of cognitive psychology concerning human abilities, we outline some proposals for concept representation in formal ontologies, which take into account suggestions from psychological research. Our basic assumption is that knowledge representation systems whose design takes into account evidence from experimental psychology may therefore give better results in many applications.
Numerous research groups are now utilizing Basic Formal Ontology as an upper-level framework to assist in the organization and integration of biomedical information. This paper provides elucidation of the three existing BFO subcategories of realizable entity, namely function, role, and disposition. It proposes one further subcategory of tendency, and considers the merits of recognizing two subcategories of function for domain ontologies, namely, artifactual and biological function. The motivation is to help advance the coherent ontological treatment of functions, roles, and dispositions, to help provide the potential for more detailed classification, and to shed light on BFO’s general make-up and use.
The broader context for the formal darwinism project established by two of the commentators, in terms of reconciling the Modern Synthesis with Darwinian arguments over design and in terms of links to other types of selection and design, is discussed and welcomed. Some overselling of the project is admitted, in particular over whether it claims to consider all organic design. One important fundamental question, raised in two commentaries, is flagged but not answered: whether design is rightly represented by an optimisation program; another, from one commentary: whether the coreplicon dissolves in the face of multi-generational imprinting. Calls for the project to be extended to design at levels above and below the individual are considered sympathetically, but judged impractical at the high level of abstraction of the project. All claims of substantive technical error are emphatically rejected. Close technical readings are welcomed that, among other things, represent the project as 'axiomatizing fitness'. The prospects for the project are set out in the light of this highly varied set of commentaries.
Formal epistemology is just what it sounds like: epistemology done with formal tools. Coinciding with the general rise in popularity of experimental philosophy, formal epistemologists have begun to apply experimental methods in their own work. In this entry, I survey some of the work at the intersection of formal and experimental epistemology. I show that experimental methods have unique roles to play when epistemology is done formally, and I highlight some ways in which results from formal epistemology have been used fruitfully to advance epistemically-relevant experimental work. The upshot of this brief, incomplete survey is that formal and experimental methods often constitute mutually informative means to epistemological ends.
In this paper, I introduce and defend a notion of analyticity for formal languages. I first uncover a crucial flaw in Timothy Williamson’s famous argument template against analyticity, when it is applied to sentences of formal mathematical languages. Williamson’s argument targets the popular idea that a necessary condition for analyticity is that whoever understands an analytic sentence assents to it. Williamson argues that for any given candidate analytic sentence, there can be people who understand that sentence and yet who fail to assent to it. I argue that, on the most natural understanding of the notion of assent when it is applied to sentences of formal mathematical languages, Williamson’s argument fails. Formal analyticity is the notion of analyticity that is based on this natural understanding of assent. I go on to develop the notion of formal analyticity and defend the claim that there are formally analytic sentences and rules of inference. I conclude by showing the potential payoffs of recognizing formal analyticity.
Aristotle's logical and metaphysical works contain elements of three distinct types of formal theory: an ontology, a theory of consequences, and a theory of reasoning. His formal ontology (unlike that of certain later thinkers) does not require all propositions of a given logical form to be true. His formal syllogistic (unlike medieval theories of consequences) was guided primarily by a conception of logic as a theory of reasoning; and his fragmentary theory of consequences exists merely as an adjunct to the syllogistic. When theories of consequences took centre stage in the Middle Ages, the original motivation for the theory of the syllogism was forgotten.
We present a philosophical motivation for the logics of formal inconsistency, a family of paraconsistent logics whose distinctive feature is that of having resources for expressing the notion of consistency within the object language. We shall defend the view according to which logics of formal inconsistency are theories of logical consequence of normative and epistemic character. This approach not only allows us to make inferences in the presence of contradictions, but offers a philosophically acceptable account of paraconsistency.
The notion of function is indispensable to our understanding of distinctions such as that between being broken and being in working order (for artifacts) and between being diseased and being healthy (for organisms). A clear account of the ontology of functions and functioning is thus an important desideratum for any top-level ontology intended for application to domains such as engineering or medicine. The benefit of using top-level ontologies in applied ontology can only be realized when each of the categories identified and defined by a top-level ontology is integrated with the others in a coherent fashion. Basic Formal Ontology (BFO) has from the beginning included function as one of its categories, exploiting a version of the etiological account of function that is framed at a level of generality sufficient to accommodate both biological and artifactual functions. This account has been subjected to a series of criticisms and refinements. We here articulate BFO’s account of function, provide some reasons for favoring it over competing views, and defend it against objections.
This paper presents a dialogue system called Lorenzen–Hamblin Natural Dialogue (LHND), in which participants can commit formal fallacies and have a method of both identifying and withdrawing formal fallacies. It therefore provides a tool for the dialectical evaluation of force of argument when players advance reasons which are deductively incorrect. The system is inspired by Hamblin’s formal dialectic and Lorenzen’s dialogical logic. It offers uniform protocols for Hamblin’s and Lorenzen’s dialogues and adds a protocol for embedding them. This unification required a reformulation of the original description of Lorenzen’s system to distinguish “between different stances that a person might take in the discussion”, as suggested by Hodges. The LHND system is compared to Walton and Krabbe’s Complex Persuasion Dialogue using an example of a dialogue.
The methodological nonreductionism of contemporary biology opens an interesting discussion on the level of ontology and the philosophy of nature. The theory of emergence (EM), and downward causation (DC) in particular, bring a new set of arguments challenging not only methodological, but also ontological and causal reductionism. This argumentation provides a crucial philosophical foundation for the science/theology dialogue. However, a closer examination shows that proponents of EM do not present a unified and consistent definition of DC. Moreover, they find it difficult to prove that higher-order properties can be causally significant without violating the causal laws that operate at lower physical levels. They also face the problem of circularity and incoherence in their explanation. In our article we show that these problems can be overcome only if DC is understood in terms of formal rather than physical (efficient) causality. This breakdown of causal monism in science opens a way to the retrieval of the fourfold Aristotelian notion of causality.
This position paper advocates combining formal epistemology and the new paradigm psychology of reasoning in the studies of conditionals and reasoning with uncertainty. The new paradigm psychology of reasoning is characterized by the use of probability theory as a rationality framework instead of classical logic, used by more traditional approaches to the psychology of reasoning. This paper presents a new interdisciplinary research program which involves both formal and experimental work. To illustrate the program, the paper discusses recent work on the paradoxes of the material conditional, nonmonotonic reasoning, and Adams’ Thesis. It also identifies the issue of updating on conditionals as an area which seems to call for a combined formal and empirical approach.
I argue that Grafen’s formal darwinism project could profitably incorporate a gene’s-eye view, as informed by the major transitions framework. In this, instead of the individual being assumed to maximise its inclusive fitness, genes are assumed to maximise their inclusive fitness. Maximisation of fitness at the individual level is not a straightforward concept because the major transitions framework shows that there are several kinds of biological individual. In addition, individuals have a definable fitness, exhibit individual-level adaptations and arise in a major transition, only to the extent that the inclusive-fitness interests of genes within them coincide. Therefore, as others have suggested, the fundamental level at which fitness is maximised is the gene level. Previous reconciliations of the concepts of gene-level fitness and individual-level fitness implicitly recognise this point. Adaptations always maximise the fitness of their causative genes, but may be simple or complex. Simple adaptations may be controlled by single genes and be maladaptive at higher levels, whereas complex adaptations are controlled by multiple genes and rely on those genes having coinciding fitness interests at a higher level, for a given trait.
The paper first introduces a cube of opposition that associates the traditional square of opposition with the dual square obtained by Piaget’s reciprocation. It is then pointed out that Blanché’s extension of the square-of-opposition structure into a conceptual hexagonal structure always relies on an abstract tripartition. Considering quadripartitions leads to organizing the 16 binary connectives into a regular tetrahedron. Lastly, the cube of opposition, once interpreted in modal terms, is shown to account for a recent generalization of formal concept analysis, where noticeable hexagons are also laid bare. This generalization of formal concept analysis is motivated by a parallel with bipolar possibility theory. The latter, albeit graded, is indeed based on four graded set functions that can be organized in a similar structure.
The purpose of this study was to extend knowledge about why procedural justice (PJ) has behavioral implications within organizations. Since prior studies show that PJ leads to legitimacy, the author suggests that, when formal regulations are unfairly implemented, they lose their validity or efficacy (becoming deactivated even if they are formally still in force). This "rule deactivation," in turn, leads to two proposed destructive work behaviors, namely, workplace deviance and decreased organizational citizenship behaviors (OCBs). The results support this mediating role of rule deactivation, thus suggesting that it forms part of the generative mechanism through which unfair procedures influence (un)ethical behavior within organizations. The author ends the article by discussing behavioral ethics and managerial implications as well as suggestions for future research.
In this paper we propose an approach to vagueness characterised by two features. The first one is philosophical: we move along a Kantian path emphasizing the knowing subject’s conceptual apparatus. The second one is formal: to face vagueness, and our philosophical view on it, we propose to use topology and formal topology. We show that the Kantian and the topological features joined together allow us an atypical, but promising, way of considering vagueness.
In the formal semantics based on modern type theories, common nouns are interpreted as types, rather than as predicates of entities as in Montague’s semantics. This brings about important advantages in linguistic interpretations but also leads to a limitation of expressive power because there are fewer operations on types as compared with those on predicates. The theory of coercive subtyping adequately extends the modern type theories and, as shown in this paper, plays a very useful role in making type theories more expressive for formal semantics. It not only gives a satisfactory solution to the basic problem of ‘multiple categorisation’ caused by interpreting common nouns as types, but also provides a powerful formal framework to model interesting linguistic phenomena such as copredication, whose formal treatment has been found difficult in a Montagovian setting. In particular, we show how to formally introduce dot-types in a type theory with coercive subtyping and study some type-theoretic constructs that provide useful representational tools for reference transfers and multiple word meanings in formal lexical semantics.
ABSTRACT: In its strongest unqualified form, the principle of wholistic reference is that in any given discourse, each proposition refers to the whole universe of that discourse, regardless of how limited the referents of its non-logical or content terms. According to this principle every proposition of number theory, even an equation such as "5 + 7 = 12", refers not only to the individual numbers that it happens to mention but to the whole universe of numbers. This principle, its history, and its relevance to some of Oswaldo Chateaubriand's work are discussed in my 2004 paper "The Principle of Wholistic Reference" in Essays on Chateaubriand's "Logical Forms". In Chateaubriand's réplica (reply), which is printed with my paper, he raised several important additional issues including the three I focus on in this tréplica (reply to his reply): truth-values, universes of discourse, and formal ontology. This paper is self-contained: it is not necessary to have read the above-mentioned works. The principle of wholistic reference (PWR) was first put forth by George Boole in 1847 when he espoused a monistic fixed-universe viewpoint similar to the one Frege and Russell espoused throughout their careers. Later, Boole elaborated PWR in 1854 from the pluralistic multiple-universes perspective.
Formal ontology as it is presented in Husserl's Third Logical Investigation can be interpreted as a fundamental tool for describing objects in a formal sense. One of its main sources is presented: chapter five of Carl Stumpf's Über den psychologischen Ursprung der Raumvorstellung (1873). It is then described how Husserlian formal ontology is applied in the Fifth Logical Investigation. Finally, it is applied to dramatic structures, in the spirit of Roman Ingarden.
In this paper we present a philosophical motivation for the logics of formal inconsistency, a family of paraconsistent logics whose distinctive feature is that of having resources for expressing the notion of consistency within the object language in such a way that consistency may be logically independent of non-contradiction. We defend the view according to which logics of formal inconsistency may be interpreted as theories of logical consequence of an epistemological character. We also argue that in order to philosophically justify paraconsistency there is no need to endorse dialetheism, the thesis that there are true contradictions. Furthermore, we show that mbC, a logic of formal inconsistency based on classical logic, may be enhanced in order to express the basic ideas of an intuitive interpretation of contradictions as conflicting evidence.
Understanding good design requires addressing the question of what units undergo natural selection, thereby becoming adapted. There is, therefore, a natural connection between the formal Darwinism project (which aims to connect population genetics with the evolution of design and fitness maximization) and levels of selection issues. We argue that the formal Darwinism project offers contradictory and confusing lines of thinking concerning level(s) of selection. The project favors multicellular organisms over both the lower (cell) and higher (social group) levels as the level of adaptation. Grafen offers four reasons for giving such special status to multicellular organisms: (1) they lack appreciable within-organism cell selection, (2) they have multiple features that appear contrived for the same purpose, (3) they possess a set of phenotypes, and (4) they leave offspring according to their phenotypes. We discuss why these rationales are not compelling and suggest that a more even-handed approach, in which multicellular organisms are not assumed to have special status, would be desirable for a project that aims to make progress on the foundations of evolutionary theory.
In the era of “big data,” science is increasingly information driven, and the potential for computers to store, manage, and integrate massive amounts of data has given rise to such new disciplinary fields as biomedical informatics. Applied ontology offers a strategy for the organization of scientific information in computer-tractable form, drawing on concepts not only from computer and information science but also from linguistics, logic, and philosophy. This book provides an introduction to the field of applied ontology that is of particular relevance to biomedicine, covering theoretical components of ontologies, best practices for ontology design, and examples of biomedical ontologies in use. After defining an ontology as a representation of the types of entities in a given domain, the book distinguishes between different kinds of ontologies and taxonomies, and shows how applied ontology draws on more traditional ideas from metaphysics. It presents the core features of the Basic Formal Ontology (BFO), now used by over one hundred ontology projects around the world, and offers examples of domain ontologies that utilize BFO. The book also describes Web Ontology Language (OWL), a common framework for Semantic Web technologies. Throughout, the book provides concrete recommendations for the design and construction of domain ontologies.
In this paper two systems of AGM-like Paraconsistent Belief Revision are overviewed, both defined over Logics of Formal Inconsistency (LFIs) due to the possibility of defining a formal consistency operator within these logics. The AGM° system is strongly based on this operator and internalizes the notion of formal consistency in the explicit constructions and postulates. Alternatively, the AGMp system uses the AGM-compliance of LFIs and thus assumes a wider notion of paraconsistency, not necessarily related to the notion of formal consistency.
Basic Formal Ontology was created in 2002 as an upper-level ontology to support the creation of consistent lower-level ontologies, initially in the subdomains of biomedical research, now also in other areas, including defense and security. BFO is currently undergoing revisions in preparation for the release of BFO version 2.0. We summarize some of the proposed revisions in what follows, focusing on BFO’s treatment of material entities, and specifically of the category object.
Since the crisis of Fordism, capitalism has been characterised by the ever more central role of knowledge and the rise of the cognitive dimensions of labour. This is not to say that the centrality of knowledge to capitalism is new per se. Rather, the question we must ask is to what extent we can speak of a new role for knowledge and, more importantly, its relationship with transformations in the capital/labour relation. From this perspective, the paper highlights the continuing validity of Marx's analysis of the knowledge/power relation in the development of the division of labour. More precisely, we are concerned with the theoretical and heuristic value of the concepts of formal subsumption, real subsumption and general intellect for any interpretation of the present change of the capital/labour relation in cognitive capitalism. In this way, we show the originality of the general intellect hypothesis as a sublation of real subsumption. Finally, the article summarises key contradictions and new forms of antagonism in cognitive capitalism.
An aging population is often taken to require a profound reorganization of the prevailing health care system. In particular, a more cost-effective care system is warranted, and ICT-based home care is often considered a promising alternative. Modern health care devices admit a transfer of patients with rather complex care needs from institutions to the home care setting. With care recipients set up with health monitoring technologies at home, spouses and children are likely to become involved in the caring process, and informal caregivers may have to assist kin-persons with advanced care needs by means of sophisticated technology. This paper investigates some of the ethical implications of a near-future shift from institutional care to technology-assisted home care and the subsequent impact on the care recipient and formal and informal care providers.
This paper investigates the effect of the countervailing forces within organizations of formal systems that direct employees toward ethical acts and informal systems that direct employees toward fraudulent behavior. We study the effect of these forces on deception, a key component of fraud. The results provide support for an interactive effect of these formal and informal systems. The effectiveness of formal systems is greater when there is a strong informal “push” to do wrong; conversely, in the absence of a strong push to do wrong, the strength of formal systems has little impact on fraudulent behavior. These results help to explain why the implementation of formal systems within organizations has been met with mixed results, and they identify when formal systems designed to promote ethical behavior will be most efficacious.
The central hypothesis of the collaboration between Language and Computing (L&C) and the Institute for Formal Ontology and Medical Information Science (IFOMIS) is that the methodology and conceptual rigor of a philosophically inspired formal ontology will greatly benefit software application ontologies. To this end LinKBase®, L&C’s ontology, which is designed to integrate and reason across various external databases simultaneously, has been submitted to the conceptual demands of IFOMIS’s Basic Formal Ontology (BFO). With this, we aim to move beyond the level of controlled vocabularies to yield an ontology with the ability to support reasoning applications.
I distinguish two types of reduction within the context of quantum-classical relations, which I designate “formal” and “empirical”. Formal reduction holds or fails to hold solely by virtue of the mathematical relationship between two theories; it is therefore a two-place, a priori relation between theories. Empirical reduction requires one theory to encompass the range of physical behaviors that are well-modeled in another theory; in a certain sense, it is a three-place, a posteriori relation connecting the theories and the domain of physical reality that both serve to describe. Focusing on the relationship between classical and quantum mechanics, I argue that while certain formal results concerning singular limits have been taken to preclude the possibility of reduction between these theories, such results at most provide support for the claim that singular limits block reduction in the formal sense; little if any reason has been given for thinking that they block reduction in the empirical sense. I then briefly outline a strategy for empirical reduction that is suggested by work on decoherence theory, arguing that this sort of account remains a fully viable route to the empirical reduction of classical to quantum mechanics and is unaffected by such singular limits.
Formal principles governing best practices in classification and definition have for too long been neglected in the construction of biomedical ontologies, in ways which have important negative consequences for data integration and ontology alignment. We argue that the use of such principles in ontology construction can serve as a valuable tool in error-detection and also in supporting reliable manual curation. We argue also that such principles are a prerequisite for the successful application of advanced data integration techniques such as ontology-based multi-database querying, automated ontology alignment and ontology-based text-mining. These theses are illustrated by means of a case study of the Gene Ontology, a project of increasing importance within the field of biomedical data integration.
Revised version of chapter in J. N. Mohanty and W. McKenna (eds.), Husserl’s Phenomenology: A Textbook, Lanham: University Press of America, 1989, 29–67. Logic for Husserl is a science of science, a science of what all sciences have in common in their modes of validation. Thus logic deals with universal laws relating to truth, to deduction, to verification and falsification, and with laws relating to theory as such, and to what makes for theoretical unity, both on the side of the propositions of a theory and on the side of the domain of objects to which these propositions refer. This essay presents a systematic overview of Husserl’s views on these matters as put forward in his Logical Investigations. It shows how Husserl’s theory of linguistic meanings as species of mental acts, his formal ontology of part, whole and dependence, his theory of meaning categories, and his theory of categorial intuition combine with his theory of science to form a single whole. Finally, it explores the ways in which Husserl’s ideas on these matters can be put to use in solving problems in the philosophy of language, logic and mathematics in a way which does justice to the role of mental activity in each of these domains while at the same time avoiding the pitfalls of psychologism.
From the beginning of the 20th century to the present day, it has rarely been doubted that whenever formal aesthetic methods meet their iconological counterparts, the two approaches appear to be mutually exclusive. In reality, though, an ahistorical concept is challenging a historical analysis of art. It is especially Susanne K. Langer’s long-overlooked system of analogies between perceptions of the world and of artistic creations, analogies dependent on feelings, that today allows a rapprochement of these positions. Krois’s insistence on a similar point supports this analysis. - I - To this day it has remained undisputed that formal-scientific and iconological methods seem fundamentally to exclude one another, since the former rest on ahistorical and the latter on historical foundations. Against this view, the present contribution aims to show how the research of Susanne K. Langer in particular, supplemented by that of John M. Krois, makes a rapprochement of the two positions possible.
In this paper we present a philosophical motivation for the logics of formal inconsistency, a family of paraconsistent logics whose distinctive feature is that of having resources for expressing the notion of consistency within the object language in such a way that consistency may be logically independent of non-contradiction. We defend the view according to which logics of formal inconsistency may be interpreted as theories of logical consequence of an epistemological character. We also argue that in order to philosophically justify paraconsistency there is no need to endorse dialetheism, the thesis that there are true contradictions. Furthermore, we argue that an intuitive reading of the bivalued semantics for the logic mbC, a logic of formal inconsistency based on classical logic, fits in well with the basic ideas of an intuitive interpretation of contradictions. On this interpretation, the acceptance of a pair of propositions A and ¬A does not mean that A is simultaneously true and false, but rather that there is conflicting evidence about the truth value of A.
Formal dialectic has its roots in ancient dialectic. We can trace this influence in Charles Hamblin’s book on fallacies, in which he introduced his first formal dialectical systems. Earlier, Paul Lorenzen proposed systems of dialogical logic, which were in fact formal dialectical systems avant la lettre, with roles similar to those of the Greek Questioner and Answerer. In order to make a comparison between ancient dialectic and contemporary formal dialectic, I shall formalize part of the Aristotelian procedure for Academic debates. The resulting system will be compared (1) with Van Eemeren and Grootendorst’s system of rules of Critical Discussion (the pragma-dialectical discussion procedure), which must, however, first itself be reconstructed as a formal dialectical system, and (2) with a Hamblin-type system, and (3) a Lorenzen-type system. When drawing comparisons, it will become clear that there is a line to be drawn from Aristotle to formal dialectic and pragma-dialectics, extending to contemporary computational models of argument.
A series of representations must be semantics-driven if the members of that series are to combine into a single thought: where semantics is not operative, there is at most a series of disjoint representations that add up to nothing true or false, and therefore do not constitute a thought at all. A consequence is that there is necessarily a gulf between simulating thought, on the one hand, and actually thinking, on the other. A related point is that a popular doctrine - the so-called 'computational theory of mind' (CTM) - is based on a confusion. CTM is the view that thought-processes consist in 'computations', where a computation is defined as a 'form-driven' operation on symbols. The expression 'form-driven operation' is ambiguous, as it may refer either to syntax-driven operations or to morphology-driven operations. Syntax-driven operations presuppose the existence of operations that are driven by semantic and extra-semantic knowledge. So CTM is false if the terms 'computation' and 'form-driven operation' are taken to refer to syntax-driven operations. Thus, if CTM is to work, those expressions must be taken to refer to morphology-driven operations. But an operation must be semantics-driven if it is to qualify as a thought, so CTM fails on that reading as well. CTM therefore fails on each possible disambiguation of the expressions 'formal operation' and 'computation,' and it is therefore false.
In his paper “Flaws of Formal Relationism”, Mahrad Almotahari argues against the sort of response to Frege's Puzzle I have defended elsewhere, which he dubs ‘Formal Relationism’. Almotahari argues that, because of its specifically formal character, this view is vulnerable to objections that cannot be raised against the otherwise similar Semantic Relationism due to Kit Fine. I argue in response that Formal Relationism has neither of the flaws Almotahari claims to identify.
In this paper, we discuss some formal properties of the model of bidirectional Optimality Theory that was developed in Blutner (2000). We investigate the conditions under which bidirectional optimization is a well-defined notion, and we give a conceptually simpler reformulation of Blutner's definition. In the second part of the paper, we show that bidirectional optimization can be modeled by means of finite state techniques. There we rely heavily on the related work of Frank and Satta (1998) about unidirectional optimization.
The issue of the relationship between formal and informal logic depends strongly on how one understands these two designations. While there is very little disagreement about the nature of formal logic, the same is not true regarding informal logic, which is understood in various (often incompatible) ways by various thinkers. After reviewing some of the more prominent conceptions of informal logic, I will present my own, defend it and then show how informal logic, so understood, is complementary to formal logic.
This paper aims to argue for two related claims: first, that formal semantics should not be conceived of as interpreting natural language expressions in a single model (a very large one representing the world as a whole, or something like that) but as interpreting them in many different models (formal counterparts, say, of little fragments of reality); second, that accepting such a conception of formal semantics yields a better comprehension of the relation between semantics and pragmatics and of the role to be played by formal semantics in the general enterprise of understanding meaning. For this purpose, three kinds of arguments are given: first, empirical arguments showing that the many-models approach is the most straightforward and natural way of giving a formal counterpart to natural language sentences; second, logical arguments proving the logical impossibility of a single universal model; and third, theoretical arguments to the effect that such a conception of formal semantics fits in a natural and fruitful way with pragmatic theories and facts. In passing, this conception will be shown to cast some new light on the old problems raised by the liar and sorites paradoxes.
Nelson Goodman’s new riddle of induction forcefully illustrates a challenge that must be confronted by any adequate theory of inductive inference: provide some basis for choosing among alternative hypotheses that fit past data but make divergent predictions. One response to this challenge is to distinguish among alternatives by means of some epistemically significant characteristic beyond fit with the data. Statistical learning theory takes this approach by showing how a concept similar to Popper’s notion of degrees of testability is linked to minimizing expected predictive error. In contrast, formal learning theory appeals to Ockham’s razor, which it justifies by reference to the goal of enhancing efficient convergence to the truth. In this essay, I show that, despite their differences, statistical and formal learning theory yield precisely the same result for a class of inductive problems that I call strongly VC ordered, of which Goodman’s riddle is just one example.
This paper addresses the theoretical notion of a game as it arises across scientific inquiries, exploring its uses as a technical and formal asset in logic and science versus an explanatory mechanism. While games comprise a widely used method in a broad intellectual realm (including, but not limited to, philosophy, logic, mathematics, cognitive science, artificial intelligence, computation, linguistics, physics, economics), each discipline advocates its own methodology and a unified understanding is lacking. In the first part of this paper, a number of game theories in formal studies are critically surveyed. In the second part, the doctrine of games as explanations for logic is assessed, and the relevance of a conceptual analysis of games to cognition discussed. It is suggested that the notion of evolution plays a part in the game-theoretic concept of meaning.
This is the preface to the special issue Formal Representations in Model-based Reasoning and Abduction, published in the Logic Jnl IGPL (2012) 20 (2): 367–369. doi: 10.1093/jigpal/jzq055. First published online: December 20, 2010.
Kathrin Koslicki argues that ordinary material objects like tables and motorcycles have formal proper parts that structure the material proper parts. Karen Bennett rejects a key premise in Koslicki's argument according to which the material ingredient out of which a complex material object is made is a proper part of that object. Koslicki defends this premise with a principle motivated by its power to explain three important phenomena of material composition. But these phenomena can be equally well explained by a weaker principle that does not support the questioned premise in Koslicki's argument, Bennett argues. I show that Bennett's weaker principle, together with an appropriate strengthening of a different premise in Koslicki's original argument, still yields a sound argument for the existence of formal parts.
I defend a conception of Logic as normative for the sort of activities in which inferences supervene, namely, reasoning and arguing. Toulmin’s criticism of formal logic will be our framework to shape the idea that in order to make sense of Logic as normative, we should conceive of it as a discipline devoted to the layout of arguments, understood as the representations of the semantic, truth-relevant properties of the inferences that we make in arguing and reasoning.