We propose a formal representation of objects, whether mathematical or empirical. The powerful framework inside which we represent them in a unique and coherent way is grounded, on the formal side, in a logical approach with a direct mathematical semantics in the well-established field of constructive topology, and, on the philosophical side, in a neo-Kantian perspective emphasizing the knowing subject’s role, which is constructive for mathematical objects and constitutive for empirical ones.
A series of representations must be semantics-driven if the members of that series are to combine into a single thought. Where semantics is not operative, there is at most a series of disjoint representations that add up to nothing true or false, and therefore do not constitute a thought at all. There is necessarily a gulf between simulating thought, on the one hand, and actually thinking, on the other. A related point is that a popular doctrine - the so-called 'computational theory of mind' (CTM) - is based on a confusion. CTM is the view that thought-processes consist in 'computations', where a computation is defined as a 'form-driven' operation on symbols. The expression 'form-driven operation' is ambiguous, and may refer either to syntax-driven operations or to morphology-driven operations. Syntax-driven operations presuppose the existence of operations that are driven by semantic and extra-semantic knowledge. So CTM is false if the terms 'computation' and 'form-driven operation' are taken to refer to syntax-driven operations. Thus, if CTM is to work, those expressions must be taken to refer to morphology-driven operations; and CTM therefore fails, given that an operation must be semantics-driven if it is to qualify as a thought. CTM therefore fails on every disambiguation of the expressions 'formal operation' and 'computation', and it is therefore false.
Formal ontology as it is presented in Husserl's Third Logical Investigation can be interpreted as a fundamental tool for describing objects in a formal sense. One of its main sources is presented: chapter five of Carl Stumpf's Über den psychologischen Ursprung der Raumvorstellung (1873). It is then described how Husserlian formal ontology is applied in the Fifth Logical Investigation. Finally, it is applied to dramatic structures, in the spirit of Roman Ingarden.
This paper aims to argue for two related statements: first, that formal semantics should not be conceived of as interpreting natural language expressions in a single model (a very large one representing the world as a whole, or something like that) but as interpreting them in many different models (formal counterparts, say, of little fragments of reality); second, that accepting such a conception of formal semantics yields a better comprehension of the relation between semantics and pragmatics and of the role to be played by formal semantics in the general enterprise of understanding meaning. For this purpose, three kinds of arguments are given: first, empirical arguments showing that the many-models approach is the most straightforward and natural way of giving a formal counterpart to natural language sentences; second, logical arguments proving the logical impossibility of a single universal model; and third, theoretical arguments to the effect that such a conception of formal semantics fits in a natural and fruitful way with pragmatic theories and facts. In passing, this conception will be shown to cast some new light on the old problems raised by the liar and sorites paradoxes.
It is often claimed that emotions are linked to formal objects. But what are formal objects? What roles do they play? According to some philosophers, formal objects are axiological properties which individuate emotions, make them intelligible, and give them their correctness conditions. In this paper, I evaluate these claims in order to answer the above questions. I first give reasons to doubt the thesis that formal objects individuate emotions. Second, I distinguish different ways in which emotions are intelligible and argue that philosophers are wrong to claim that emotions only make sense when they are based on prior sources of axiological information. Third, I investigate how issues of intelligibility connect with the correctness conditions of emotions. I defend a theory according to which emotions do not respond to axiological information, but to non-axiological reasons. According to this theory, we can allocate fundamental roles to the formal objects of emotions while dispensing with the problematic features of other theories.
Nelson Goodman’s new riddle of induction forcefully illustrates a challenge that must be confronted by any adequate theory of inductive inference: provide some basis for choosing among alternative hypotheses that fit past data but make divergent predictions. One response to this challenge is to distinguish among alternatives by means of some epistemically significant characteristic beyond fit with the data. Statistical learning theory takes this approach by showing how a concept similar to Popper’s notion of degrees of testability is linked to minimizing expected predictive error. In contrast, formal learning theory appeals to Ockham’s razor, which it justifies by reference to the goal of enhancing efficient convergence to the truth. In this essay, I show that, despite their differences, statistical and formal learning theory yield precisely the same result for a class of inductive problems that I call strongly VC ordered, of which Goodman’s riddle is just one example.
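The riddle's core structure — rival hypotheses that agree on all past observations yet diverge in their predictions — can be sketched in a few lines of Python. This is an illustrative toy, not the essay's formal apparatus; the cutoff time `T` and the schematic "grue" predicate are standard textbook renderings, not taken from the paper:

```python
# Two rival hypotheses about the colour of observed emeralds.
# "Grue" (schematically): green if observed before time T, blue otherwise.
T = 5  # the "present": observations 0..4 are the past data

def green_hypothesis(t):
    return "green"

def grue_hypothesis(t):
    return "green" if t < T else "blue"

past = range(T)
# Both hypotheses fit every past observation equally well...
assert all(green_hypothesis(t) == grue_hypothesis(t) for t in past)
# ...yet they make divergent predictions about the very next case.
print(green_hypothesis(T), grue_hypothesis(T))  # green blue
```

Any criterion for preferring one hypothesis must therefore appeal to something beyond fit with the data — which is exactly where the abstract locates testability and Ockham's razor.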
In this paper we propose an approach to vagueness characterised by two features. The first is philosophical: we move along a Kantian path emphasizing the knowing subject’s conceptual apparatus. The second is formal: to address vagueness, and our philosophical view of it, we propose to use topology and formal topology. We show that the Kantian and topological features, joined together, allow an atypical but promising way of considering vagueness.
I defend the thesis that Kantian analytic judgments are about objects (as opposed to concepts) against two challenges raised by recent scholars. First, can it accommodate cases like “A two-sided polygon is two-sided”, where no object really falls under the subject-concept as Kant sees it? Second, is it compatible with Kant’s view that analytic judgments make no claims about objects in the world and that we can know them to be true without going beyond the given concepts? I address these challenges in two steps. First, given Kant’s distinction of an object in general = x from an object of sensible intuition, I argue that analytic judgments are about objects in the former sense, no matter whether the purported objects can be given in our intuition. Second, using Kant’s method of representing certain logical relations of concepts with such figures as circles, I construct a model to show that analytic truths are truths about objects in general = x and yet can be determined solely by the intensional relation between the given concepts plus certain Kantian-logical laws. Analytic truths are thus shown to be formal in the Kantian sense that they do not presuppose the purported objects as givable in our intuition. This account of the formality of analytic truths captures Kant’s diagnosis of the Leibnizian illusion that we can make material claims about the world by analytically true judgments.
Truth is a fundamental objective of adjudicative processes; ideally, substantive as distinct from formal legal truth. But problems of evidence, for example, may frustrate the finding of substantive truth, and other values may lead to the exclusion of probative evidence, e.g., for the sake of fairness; so may jury nullification and jury equity. Limits of time, and the need for definitive decisions, require the allocation of burdens of proof. The degree of truth-formality is variable within a system and across systems.
The issue of the relationship between formal and informal logic depends strongly on how one understands these two designations. While there is very little disagreement about the nature of formal logic, the same is not true regarding informal logic, which is understood in various (often incompatible) ways by various thinkers. After reviewing some of the more prominent conceptions of informal logic, I will present my own, defend it and then show how informal logic, so understood, is complementary to formal logic.
We consider the question: under what circumstances can the concept of adaptation be applied to groups, rather than individuals? Gardner and Grafen (2009, J. Evol. Biol. 22: 659–671) develop a novel approach to this question, building on Grafen's ‘formal Darwinism’ project, which defines adaptation in terms of links between evolutionary dynamics and optimization. They conclude that only clonal groups, and to a lesser extent groups in which reproductive competition is repressed, can be considered as adaptive units. We re-examine the conditions under which the selection–optimization links hold at the group level. We focus on an important distinction between two ways of understanding the links, which have different implications regarding group adaptationism. We show how the formal Darwinism approach can be reconciled with G.C. Williams’ famous analysis of group adaptation, and we consider the relationships between group adaptation, the Price equation approach to multi-level selection, and the alternative approach based on contextual analysis.
The problem of concept representation is relevant for many sub-fields of cognitive research, including psychology and philosophy, as well as artificial intelligence. In particular, in recent years it has received a great deal of attention within the field of knowledge representation, due to its relevance for both knowledge engineering and ontology-based technologies. However, the notion of a concept itself turns out to be highly disputed and problematic. In our opinion, one of the causes of this state of affairs is that the notion of a concept is, to some extent, heterogeneous, and encompasses different cognitive phenomena. This results in a strain between conflicting requirements, such as compositionality on the one hand and the need to represent prototypical information on the other. In some ways artificial intelligence research shows traces of this situation. In this paper, we propose an analysis of this current state of affairs. Since it is our opinion that a mature methodology with which to approach knowledge representation and knowledge engineering should also take advantage of the empirical results of cognitive psychology concerning human abilities, we outline some proposals for concept representation in formal ontologies which take into account suggestions from psychological research. Our basic assumption is that knowledge representation systems whose design takes into account evidence from experimental psychology (and which are therefore more similar to the human way of organizing and processing information) may give better results in many applications (e.g. in the fields of information retrieval and the semantic web).
One of the most interesting and entertaining philosophical discussions of the last few decades is the discussion between Daniel Dennett and John Searle on the existence of intrinsic intentionality. Dennett denies the existence of phenomena with intrinsic intentionality. Searle, however, is convinced that some mental phenomena exhibit intrinsic intentionality. In my view, this discussion has been obscured by some serious misunderstandings with regard to the concept ‘intrinsic intentionality’. For instance, most philosophers fail to realize that it is possible that the intentionality of a phenomenon is partly intrinsic and partly observer-relative. Moreover, many philosophers mix up the concepts ‘original intentionality’ and ‘intrinsic intentionality’. In fact, there is, in the philosophical literature, no strict and unambiguous definition of the concept ‘intrinsic intentionality’. In this article, I will try to remedy this. I will also try to give strict and unambiguous definitions of the concepts ‘observer-relative intentionality’, ‘original intentionality’, and ‘derived intentionality’. These definitions will be used for an examination of the intentionality of formal mathematical systems. In conclusion, I will make a comparison between the (intrinsic) intentionality of formal mathematical systems on the one hand, and the (intrinsic) intentionality of human beings on the other.
As Vincent Hendricks remarks early on in this book, the formal and mainstream traditions of epistemic theorising have mostly evolved independently of each other. This initial impression is confirmed by a comparison of the main problems and methods practitioners in each tradition are concerned with. Mainstream epistemology engages in a dialectical game of proposing and challenging definitions of knowledge. Formal epistemologists proceed differently, as they design a wide variety of axiomatic and model-theoretic methods whose consequences they investigate independently of the need of giving counterexample-free definitions of knowledge. Or at least, this is a common way to explain where both disciplines stand in the larger landscape of epistemic theorising, and why interactions between them remain scarce. The main ambition of this book is to show that the distinction between formal and mainstream approaches should not preclude a fruitful interaction, and that it only takes the right outlook on their respective practices to disclose plenty of room for interaction.
Using Carnap's concept of explication, we propose a theory of concept formation in mathematics. This theory is then applied to the problem of how to understand the relation between the concepts of formal proof (deduction) and informal, mathematical proof.
An aging population is often taken to require a profound reorganization of the prevailing health care system. In particular, a more cost-effective care system is warranted, and ICT-based home care is often considered a promising alternative. Modern health care devices admit a transfer of patients with rather complex care needs from institutions to the home care setting. With care recipients set up with health monitoring technologies at home, spouses and children are likely to become involved in the caring process, and informal caregivers may have to assist kin-persons with advanced care needs by means of sophisticated technology. This paper investigates some of the ethical implications of a near-future shift from institutional care to technology-assisted home care and the subsequent impact on the care recipient and formal and informal care providers.
The purpose of this study was to extend the knowledge about why procedural justice (PJ) has behavioral implications within organizations. Since prior studies show that PJ leads to legitimacy, the author suggests that, when formal regulations are unfairly implemented, they lose their validity or efficacy (becoming deactivated even if they are formally still in force). This "rule deactivation," in turn, leads to two proposed destructive work behaviors, namely, workplace deviance and decreased organizational citizenship behaviors (OCBs). The results support this mediating role of rule deactivation, thus suggesting that it forms part of the generative mechanism through which unfair procedures influence (un)ethical behavior within organizations. The author ends the article by discussing behavioral ethics and managerial implications as well as suggestions for future research.
Replies to Kevin de Laplante’s ‘Certainty and Domain-Independence in the Sciences of Complexity’ (de Laplante, 1999), defending the thesis of J. Franklin, ‘The formal sciences discover the philosophers’ stone’, Studies in History and Philosophy of Science, 25 (1994), 513-33, that the sciences of complexity can combine certain knowledge with direct applicability to reality.
The methodological nonreductionism of contemporary biology opens an interesting discussion on the level of ontology and the philosophy of nature. The theory of emergence (EM), and downward causation (DC) in particular, bring a new set of arguments challenging not only methodological, but also ontological and causal reductionism. This argumentation provides a crucial philosophical foundation for the science/theology dialogue. However, a closer examination shows that proponents of EM do not present a unified and consistent definition of DC. Moreover, they find it difficult to prove that higher-order properties can be causally significant without violating the causal laws that operate at lower physical levels. They also face the problem of circularity and incoherence in their explanation. In our article we show that these problems can be overcome only if DC is understood in terms of formal rather than physical (efficient) causality. This breakdown of causal monism in science opens a way to the retrieval of the fourfold Aristotelian notion of causality.
The last fifty years have seen the creation of a number of new "formal" or "mathematical" sciences, or "sciences of complexity". Examples are operations research, theoretical computer science, information theory, descriptive statistics, mathematical ecology and control theory. Theorists of science have almost ignored them, despite the remarkable fact that (from the way the practitioners speak) they seem to have come upon the "philosophers' stone": a way of converting knowledge about the real world into certainty, merely by thinking.
Aristotle's logical and metaphysical works contain elements of three distinct types of formal theory: an ontology, a theory of consequences, and a theory of reasoning. His formal ontology (unlike that of certain later thinkers) does not require all propositions of a given logical form to be true. His formal syllogistic (unlike medieval theories of consequences) was guided primarily by a conception of logic as a theory of reasoning; and his fragmentary theory of consequences exists merely as an adjunct to the syllogistic. When theories of consequences took centre stage in the Middle Ages, the original motivation for the theory of the syllogism was forgotten.
This article reviews a number of different areas in the foundations of formal learning theory. After outlining the general framework for formal models of learning, the Bayesian approach to learning is summarized. This leads to a discussion of Solomonoff's Universal Prior Distribution for Bayesian learning. Gold's model of identification in the limit is also outlined. We next discuss a number of aspects of learning theory raised in contributed papers, related to both computational and representational complexity. The article concludes with a description of how semi-supervised learning can be applied to the study of cognitive learning models. Throughout this overview, the specific points raised by our contributing authors are connected to the models and methods under review.
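Gold's model of identification in the limit, mentioned above, admits a minimal concrete sketch. This is a standard textbook example under common assumptions, not code from the article: for the class of finite languages, the learner that conjectures exactly the strings it has seen identifies every target in the limit from any complete presentation (text):

```python
# Identification in the limit (in the spirit of Gold 1967), sketched for
# FINITE languages: once every string of the target has appeared in the
# presentation, the "conjecture what you have seen" learner never changes
# its guess again, and that guess is correct.
def learner(seen):
    return frozenset(seen)

target = frozenset({"a", "ab", "abb"})
text = ["a", "ab", "a", "abb", "ab", "a", "abb"]  # one complete presentation

seen = []
conjectures = []
for s in text:
    seen.append(s)
    conjectures.append(learner(seen))

# The learner converges: from the first point where all of the target has
# been presented, every subsequent conjecture equals the target language.
print(conjectures[-1] == target)  # True
```

The interest of Gold's framework lies in which classes of languages admit such a convergent learner at all; this sketch only shows the success criterion, not the negative results.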
This position paper advocates combining formal epistemology and the new paradigm psychology of reasoning in the studies of conditionals and reasoning with uncertainty. The new paradigm psychology of reasoning is characterized by the use of probability theory as a rationality framework, instead of the classical logic used by more traditional approaches to the psychology of reasoning. This paper presents a new interdisciplinary research program which involves both formal and experimental work. To illustrate the program, the paper discusses recent work on the paradoxes of the material conditional, nonmonotonic reasoning, and Adams’ Thesis. It also identifies the issue of updating on conditionals as an area which seems to call for a combined formal and empirical approach.
In this paper, we discuss some formal properties of the model of bidirectional Optimality Theory that was developed in Blutner (2000). We investigate the conditions under which bidirectional optimization is a well-defined notion, and we give a conceptually simpler reformulation of Blutner's definition. In the second part of the paper, we show that bidirectional optimization can be modeled by means of finite state techniques. There we rely heavily on the related work of Frank and Satta (1998) about unidirectional optimization.
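The notion of bidirectional optimization can be made concrete with a toy sketch of the strong version of Blutner-style bidirectionality. The forms, meanings, and cost values below are invented for illustration and are not drawn from the paper:

```python
# Strong bidirectional optimization: a form-meaning pair is optimal iff
# no competitor with the same meaning is strictly cheaper (speaker
# perspective) and no competitor with the same form is strictly cheaper
# (hearer perspective). Costs are toy markedness values.
cost = {
    ("short_form", "stereotypical_meaning"): 1,
    ("short_form", "marked_meaning"):        2,
    ("long_form",  "stereotypical_meaning"): 2,
    ("long_form",  "marked_meaning"):        3,
}

def optimal(pair):
    f, m = pair
    cheaper_form = any(c < cost[pair]
                       for (f2, m2), c in cost.items() if m2 == m)
    cheaper_meaning = any(c < cost[pair]
                          for (f2, m2), c in cost.items() if f2 == f)
    return not cheaper_form and not cheaper_meaning

print([p for p in cost if optimal(p)])
# Only ('short_form', 'stereotypical_meaning') survives under the strong
# definition; Blutner's weak (super-optimal) version would additionally
# license the marked-form/marked-meaning pair (division of pragmatic labor).
```

The well-definedness questions the paper investigates arise precisely because the weak, recursive variant of this definition refers to itself in ranking competitors.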
This paper addresses the theoretical notion of a game as it arises across scientific inquiries, exploring its uses as a technical and formal asset in logic and science versus an explanatory mechanism. While games comprise a widely used method in a broad intellectual realm (including, but not limited to, philosophy, logic, mathematics, cognitive science, artificial intelligence, computation, linguistics, physics, economics), each discipline advocates its own methodology and a unified understanding is lacking. In the first part of this paper, a number of game theories in formal studies are critically surveyed. In the second part, the doctrine of games as explanations for logic is assessed, and the relevance of a conceptual analysis of games to cognition discussed. It is suggested that the notion of evolution plays a part in the game-theoretic concept of meaning.
Formal dialectic has its roots in ancient dialectic. We can trace this influence in Charles Hamblin’s book on fallacies, in which he introduced his first formal dialectical systems. Earlier, Paul Lorenzen proposed systems of dialogical logic, which were in fact formal dialectical systems avant la lettre, with roles similar to those of the Greek Questioner and Answerer. In order to make a comparison between ancient dialectic and contemporary formal dialectic, I shall formalize part of the Aristotelian procedure for Academic debates. The resulting system will be compared (1) with Van Eemeren and Grootendorst’s system of rules of Critical Discussion (the pragma-dialectical discussion procedure), which must, however, first itself be reconstructed as a formal dialectical system, (2) with a Hamblin-type system, and (3) with a Lorenzen-type system. When drawing comparisons, it will become clear that there is a line to be drawn from Aristotle to formal dialectic and pragma-dialectics, extending to contemporary computational models of argument.
Since its inception in the work on fallacies of Charles Hamblin, formal dialectic has been the object of an unparalleled level of optimism concerning the potential of its analytical contribution to fallacy inquiry. This optimism has taken the form of a rapid proliferation of formal dialectical studies of arguments in general and fallacious arguments in particular, under the auspices of theorists such as Jim Mackenzie, John Woods, and Douglas Walton, to name but a few. Notwithstanding the interest in, and the hopes for, a formal dialectical analysis of the fallacies, such an analysis, I will demonstrate subsequently, leads to much unintelligibility in fallacy inquiry. The context of my argument will be the philosophical views of Hilary Putnam, particularly Putnam's claim that when we theorise in relation to rationality, the unintelligibility of the conception of rationality to emerge from this theoretical process can be traced to the circumscription of rationality within this theoretical process. I charge formal dialectic with effecting a similar circumscription of argumentative rationality, a circumscription that, I will claim, is generative of unintelligibility in formal dialectical analyses of the fallacies. In this case, the context for my claims will be the formal dialectical analyses of Walton and Batten, Rescher, Hamblin, Mackenzie and Hintikka, primarily in relation to the petitio principii fallacy. My conclusion examines a number of the reasons, both historical and conceptual, which have made it seem that it is possible to fully circumscribe the notion of argumentative rationality.
This paper introduces current acoustic theories relating to the phenomenology of sound as a framework for interrogating concepts relating to the ecologies of acoustic and landscape phenomena in a Japanese stroll garden. By applying the technique of Formal Concept Analysis, a partially ordered lattice of garden objects and attributes is visualized as a means to investigate the relationship between elements of the taxonomy.
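Formal Concept Analysis itself is straightforward to sketch: given a binary object-attribute context, the formal concepts (closed extent-intent pairs) form exactly the partially ordered lattice the paper visualizes. The garden objects and attributes below are invented stand-ins for illustration, not the paper's actual taxonomy:

```python
from itertools import combinations

# A toy formal context: garden objects and their attributes
# (hypothetical data; the paper's own object/attribute taxonomy differs).
context = {
    "lantern":   {"stone", "visual"},
    "waterfall": {"water", "acoustic"},
    "basin":     {"stone", "water", "acoustic"},
    "gravel":    {"stone", "visual"},
}
objects = set(context)
attributes = set().union(*context.values())

def intent(objs):
    """Attributes shared by every object in objs."""
    return set.intersection(*(context[o] for o in objs)) if objs else set(attributes)

def extent(attrs):
    """Objects possessing every attribute in attrs."""
    return {o for o in objects if attrs <= context[o]}

# A formal concept is a pair (A, B) with extent(B) == A and intent(A) == B.
# Closing every subset of objects enumerates all concepts (fine for toy sizes).
concepts = set()
for r in range(len(objects) + 1):
    for objs in combinations(sorted(objects), r):
        a = extent(intent(set(objs)))  # close the object set
        concepts.add((frozenset(a), frozenset(intent(a))))

for ext, inten in sorted(concepts, key=lambda c: (len(c[0]), sorted(c[0]))):
    print(sorted(ext), "<->", sorted(inten))
```

Ordering these concepts by extent inclusion yields the lattice; dedicated tools use faster algorithms (e.g. NextClosure) than this brute-force closure, but the definition is the same.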
We provide a syntax and a derivation system for a formal language of mathematics called Weak Type Theory (WTT). We give the metatheory of WTT and a number of illustrative examples. WTT is a refinement of de Bruijn's Mathematical Vernacular (MV) and hence is faithful to the mathematician's language, yet is formal and avoids ambiguities.
This paper aims to clarify and resolve issues surrounding the so-called formal linking problem in interpreting the Internally Headed Relative Clause construction in Korean and Japanese, a problem that has been identified in recent E-type pronominal treatments of the construction (e.g., Hoshi, K. (1995). Structural and interpretive aspects of head-internal and head-external relative clauses. PhD dissertation, University of Rochester; Shimoyama, J. (2001). Wh-constructions in Japanese. PhD dissertation, University of Massachusetts at Amherst). In the literature, this problem refers to the difficulty of capturing the delimited semantic variability of the E-type pronoun present in the embedding clause of the construction. I show that the E-type pronoun at issue is subject to a different licensing condition from a typical E-type pronoun and therefore presents a different linking problem. More specifically, it requires that the embedded clause describe a state of its antecedent and that its descriptive content be supplied by a salient property represented in the logical form of the embedded clause. I propose an event-based semantic analysis that derives the effects of this novel generalization by establishing a binding relation between the event structure of the embedded clause and the denotation of the E-type pronoun.
In the formal semantics based on modern type theories, common nouns are interpreted as types, rather than as predicates of entities as in Montague’s semantics. This brings about important advantages in linguistic interpretations but also leads to a limitation of expressive power, because there are fewer operations on types as compared with those on predicates. The theory of coercive subtyping adequately extends the modern type theories and, as shown in this paper, plays a very useful role in making type theories more expressive for formal semantics. It not only gives a satisfactory solution to the basic problem of ‘multiple categorisation’ caused by interpreting common nouns as types, but provides a powerful formal framework to model interesting linguistic phenomena such as copredication, whose formal treatment has been found difficult in a Montagovian setting. In particular, we show how to formally introduce dot-types in a type theory with coercive subtyping and study some type-theoretic constructs that provide useful representational tools for reference transfers and multiple word meanings in formal lexical semantics.
This paper proposes a reformulation of the treatment of boundaries, fiat parts and aggregates of entities in Basic Formal Ontology. These are currently treated as mutually exclusive, which is inadequate for biological representation since some entities may simultaneously be fiat parts, boundaries and/or aggregates. We introduce functions which map entities to their boundaries, fiat parts or aggregations. We make use of time, space and spacetime projection functions which, along the way, allow us to develop a simple temporal theory.
The text addresses one of the fundamental methodological questions that Heidegger posed in the Frühe Freiburger Vorlesungen (1919-1923), namely, the problem of formal indication. Indeed, if life itself (Dasein) is an event of meaning closed in on itself, it is necessary to establish a point of view that expresses life conceptually without objectifying it. The great problem Heidegger confronts is that of finding a non-objectifying metalanguage. The concept of formal indication is what allows him to elaborate a discourse on the origin.
I defend a conception of Logic as normative for the sort of activities in which inferences supervene, namely, reasoning and arguing. Toulmin’s criticism of formal logic will be our framework to shape the idea that in order to make sense of Logic as normative, we should conceive it as a discipline devoted to the layout of arguments, understood as the representations of the semantic, truth-relevant, properties of the inferences that we make in arguing and reasoning.
The paper first introduces a cube of opposition that associates the traditional square of opposition with the dual square obtained by Piaget’s reciprocation. It is then pointed out that Blanché’s extension of the square-of-opposition structure into a conceptual hexagonal structure always relies on an abstract tripartition. Considering quadripartitions leads to organizing the 16 binary connectives into a regular tetrahedron. Lastly, the cube of opposition, once interpreted in modal terms, is shown to account for a recent generalization of formal concept analysis, where noticeable hexagons are also laid bare. This generalization of formal concept analysis is motivated by a parallel with bipolar possibility theory. The latter, albeit graded, is indeed based on four graded set functions that can be organized in a similar structure.
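The combinatorics behind the "16 binary connectives" and their contradictory pairs is easy to make concrete. This is only an illustrative sketch of the counting; the paper's tetrahedron and cube constructions organize much more structure than this:

```python
from itertools import product

# A binary connective is a truth function of two arguments; encode each as
# a 4-bit tuple of outputs over the inputs (F,F), (F,T), (T,F), (T,T).
inputs = list(product([False, True], repeat=2))
connectives = [tuple(bool(n >> i & 1) for i in range(4)) for n in range(16)]

def contradictory(c):
    # The contradictory (negation) of a connective flips every output.
    return tuple(not v for v in c)

# Negation is an involution with no fixed point, so the 16 connectives
# fall into 8 contradictory pairs -- the diagonals found in squares
# (and their higher-dimensional analogues) of opposition.
pairs = {frozenset({c, contradictory(c)}) for c in connectives}
print(len(connectives), len(pairs))  # 16 8
```

A quadripartition in the paper's sense carves these 16 functions by how many of the four input rows they make true; the code above only exhibits the underlying enumeration and the contradictory pairing.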
After critiquing the arguments against using formal logic to teach critical thinking, this paper argues that for theoretical, practical, and empirical reasons, instruction in the fundamentals of formal logic is essential for critical thinking, and so should be included in every class that purports to teach critical thinking.
Systems of formal dialectics articulate methods of conflict resolution. To this end they provide norms to regulate verbal exchanges between the Proponent of a thesis and an Opponent. These regulated exchanges constitute what are known as formal discussions. One may ask what moves, if any, in formal discussions correspond to arguing for or against the thesis. It is claimed that certain moves of the Proponent's are properly designated as arguing for the thesis, and that certain moves of the Opponent's purport to criticize the tenability or the relevance of the reasons advanced. Thus the usefulness of formal dialectic systems as models for reasonable argument is vindicated. It is then proposed to make these systems more realistic by incorporating in them a norm of Creative Reasoning that removes the severe restrictions to which the Proponent's arguing was hitherto subject. As a consequence, a certain type of irrelevant reason is no longer automatically excluded. Therefore, it is proposed to extend the Opponent's rights to exert relevance criticism. The new dialectic systems are shown to be strategically equivalent to the original ones. Finally, it is stressed that the Opponent's criticism should not be designated as arguing against the thesis. The Opponent criticizes, but does not argue.
This article brings together the results of our systematization of the physical magnitudes and mathematical expressions of the formal apparatuses of classical, relativistic and quantum mechanics, revealing their common underlying structure. The systematization has been carried out by arranging in tables some of the best-known physical magnitudes and mathematical expressions according to their different degrees of derivation, thereby opening the way to a criterion of fundamentality for the laws, principles, postulates and equations of each formal apparatus.
The wider topic to which the content of this paper belongs is the relationship between formal logic and real argumentation. Of particular interest in this connection are substantive arguments constructed by philosophers who were equally reputed as authorities in logical theory. The author tentatively indicates a number of characteristics likely to be encountered in such arguments. The discussion then centers, by way of specification, on a remarkable piece of argument quoted in Cicero's dialogue On Divination and ascribed to Stoic thinkers. The Stoics' formal theory of inference is summarily referred to in this context, with special emphasis on their basic deductive schemata ('indemonstrables'), some of which are recognizable as links in the overall structure of the quoted argument. The main lines of Cicero's criticism of the Stoic argument are then commented upon, with emphasis on his implied view of the requirements of a good argument. Toward the end of the paper, a few considerations are added on the changes in the prevailing style of argumentation conspicuous in the three famous Roman Stoics.
The conventional approach to developing expert systems views the domain of application as being "formally defined". This view often leads to practical problems when expert systems are built using this approach. This paper examines the implications and problems of the formal approach to expert system design and proposes an alternative approach based on the concept of semi-formal domains. This approach, which draws on work in socio-technical information systems, provides guidelines that can be used for the design of successful expert systems.
Recently, a new problem has arisen for an Anscombean conception of intentional action. The claim is that the Anscombean’s emphasis on the formally causal character of practical knowledge precludes distinguishing between an aim and a merely foreseen side effect. I propose a solution to this problem: the difference between aim and side effect should be understood in terms of the familiar Anscombean distinction between acting intentionally and the intention with which one acts. I also argue that this solution has advantages over an alternative that has already been endorsed in the literature: it is a better fit for the Anscombean theory, and it naturally accommodates intuitions about the moral significance of aiming vs. merely foreseeing.
The term ‘formal ontology’ was first used by the philosopher Edmund Husserl in his Logical Investigations to signify the study of those formal structures and relations – above all relations of part and whole – which are exemplified in the subject-matters of the different material sciences. We follow Husserl in presenting the basic concepts of formal ontology as falling into three groups: the theory of part and whole, the theory of dependence, and the theory of boundary, continuity and contact. These basic concepts are presented in relation to the problem of providing an account of the formal ontology of the mesoscopic realm of everyday experience, and specifically of providing an account of the concept of individual substance.
Process modeling is ubiquitous in business and industry. While a great deal of effort has been devoted to the formal and philosophical investigation of processes, surprisingly little research connects this work to real-world process modeling. The purpose of this paper is to begin making such a connection. To do so, we first develop a simple mathematical model of activities and their instances, a simple language for describing these entities, a semantics for the latter in terms of the former, and a set of axioms for that semantics, based upon the model theory for the NIST Process Specification Language (PSL). On the basis of this foundation, we then develop a general notion of a process model, and an account of what it is for such a model to be realized by a collection of events.
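The core distinction in this abstract, between a process model built from activity types and its realization by a collection of event occurrences, can be sketched in a few lines of code. The following is an illustrative toy, not PSL itself: the class names, the linear-order constraint, and the `realizes` check are assumptions made purely for the example.

```python
# Toy sketch of activities vs. their instances (occurrences), loosely
# inspired by the PSL distinction; names and constraints are illustrative.
from dataclasses import dataclass


@dataclass(frozen=True)
class Activity:
    """An activity type, e.g. 'mix' or 'bake'."""
    name: str


@dataclass(frozen=True)
class Occurrence:
    """A dated instance of an activity: an event in a run."""
    activity: Activity
    begin: float
    end: float


def realizes(occurrences, ordered_activities):
    """Check whether a run of occurrences realizes a process model,
    here taken (for simplicity) to be a linear sequence of activity
    types, each instantiated once, in order, without overlap."""
    if len(occurrences) != len(ordered_activities):
        return False
    run = sorted(occurrences, key=lambda o: o.begin)
    for occ, act in zip(run, ordered_activities):
        if occ.activity != act or occ.begin > occ.end:
            return False
    # Consecutive occurrences must not overlap in time.
    return all(a.end <= b.begin for a, b in zip(run, run[1:]))
```

On this toy semantics, a model is realized just in case each activity type is instantiated, in order, by a non-overlapping occurrence; richer models (branching, concurrency, subactivities) would need a correspondingly richer check.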
While the classical account of the linear continuum takes it to be a totality of points, which are its ultimate parts, Aristotle conceives of it as continuous and infinitely divisible, without ultimate parts. A formal account of this conception can be given employing a theory of quantification for nonatomic domains and a theory of region-based topology.
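Region-based topologies of the kind the abstract mentions are often axiomatized via a primitive binary connection relation over regions. The following Whitehead/Clarke-style axioms are one standard illustration, not necessarily the exact theory used in the paper:

```latex
% Connection relation C over regions; parthood P defined from it.
\begin{align*}
&\forall x\; C(x,x) && \text{(every region is connected to itself)}\\
&\forall x\,\forall y\;\bigl(C(x,y)\rightarrow C(y,x)\bigr) && \text{(connection is symmetric)}\\
&P(x,y) \;:\Leftrightarrow\; \forall z\;\bigl(C(z,x)\rightarrow C(z,y)\bigr) && \text{(parthood defined via connection)}\\
&\forall x\,\exists y\;\bigl(P(y,x)\wedge y\neq x\bigr) && \text{(atomlessness)}
\end{align*}
```

The last axiom captures the Aristotelian requirement that the continuum has no ultimate parts: every region has a proper part, so points never appear as parts at all.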
One of the tasks of ontology in information science is to support the classification of entities according to their kinds and qualities. We hold that to realize this task as far as entities such as material objects are concerned we need to distinguish four kinds of entities: substance particulars, quality particulars, substance universals, and quality universals. These form, so to speak, an ontological square. We present a formal theory of classification based on this idea, including both a semantics for the theory and a provably sound axiomatization.
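The ontological square is standardly articulated with two cross-cutting relations linking its four corners. The following first-order rendering is a common illustration; the predicate names are assumptions for exposition and not necessarily the paper's own vocabulary:

```latex
% Four categories, two relations; predicate names are illustrative.
\begin{align*}
&\mathrm{Inst}(s, K) && \text{a substance particular $s$ instantiates a substance universal (kind) $K$}\\
&\mathrm{Inst}(q, A) && \text{a quality particular $q$ instantiates a quality universal (attribute) $A$}\\
&\mathrm{Inh}(q, s)  && \text{a quality particular inheres in a substance particular}\\
&\forall q\;\bigl(\mathrm{QP}(q)\rightarrow \exists! s\,(\mathrm{SP}(s)\wedge \mathrm{Inh}(q,s))\bigr) && \text{(each quality particular inheres in exactly one substance)}
\end{align*}
```

Classification then proceeds along both dimensions at once: a particular is classified by the universals it instantiates, and a substance is further characterized by the attributes whose instances inhere in it.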
This unique book presents a comprehensive and rigorous treatment of the theory of computability which is introductory yet self-contained. It takes a novel approach by looking at the subject using computation models rather than a limitation orientation, and is the first book of its kind to include software. Accompanying software simulations of almost all computational models are available for use in conjunction with the text, and numerous examples are provided on disk in a user-friendly format. Its applications to computer science itself include interesting links to programming language theory, compiler design theory, and algorithm design. The software, numerous examples, and solutions make this book ideal for self-study by computer scientists and mathematicians alike.
During the 1990s, the Government of Peru began to aggressively privatize agriculture. The government stopped loaning money to farmers' cooperatives and closed the government rice-buying company. The government even rented out most of its research stations, and many senior scientists lost their jobs. As part of this trend, the government eliminated its seed certification agency. Instead, private seed certification committees were set up with USAID funding and technical advice from a US university. The committees were supposed to become self-financing (by certifying seed grown by small seed producers), and each committee was supposed to encourage the development of a group of small seed-producing firms, clustered around the seed certification agency. The amazing thing is that many of the seed committees actually accomplished these goals. The agronomists who staffed the committees stayed at their jobs, even after US funding ended, even though the committees' income was (at best) modest, and occasionally under the threat of violence from the extreme left. Some seed certification committees failed and others did not. Some of the problems with Peruvian agricultural liberalization can be seen in the seed programs of maize, rice, potatoes, and beans. For example, the government abandoned most research, yet could not resist creating certain distortions in the seed market (e.g., buying large amounts of seed and distributing it for political ends).