Adaptive logics typically pertain to reasoning procedures for which there is no positive test. In earlier work, we presented a tableau method for two inconsistency-adaptive logics. In the present paper, we describe these methods and present several ways to increase their efficiency. This culminates in a dynamic marking procedure that indicates which branches have to be extended first, and thus guides one towards a decision — the conclusion follows or does not follow — in a very economical way.
We present tableau systems and sequent calculi for the intuitionistic analogues IK, ID, IT, IKB, IKDB, IB, IK4, IKD4, IS4, IKB4, IK5, IKD5, IK45, IKD45 and IS5 of the normal classical modal logics. We provide soundness and completeness theorems with respect to the models of intuitionistic logic enriched by a modal accessibility relation, as proposed by G. Fischer Servi. We then show the disjunction property for IK, ID, IT, IKB, IKDB, IB, IK4, IKD4, IS4, IKB4, IK5, IK45 and IS5. We also investigate the relationship of these logics with some other intuitionistic modal logics proposed in the literature.
The propositional fragment L1 of Leśniewski's ontology is the smallest class (of formulas) containing, besides all the instances of tautology, the formulas of the forms ε(a, b) → ε(a, a), ε(a, b) ∧ ε(b, c) → ε(a, c), and ε(a, b) ∧ ε(b, c) → ε(b, a), being closed under detachment. The purpose of this paper is to furnish another, more constructive proof than that given earlier by one of us for: Theorem: A is provable in L1 iff TA is a thesis of first-order predicate logic with equality, where T is a translation of the formulas of L1 into those of first-order predicate logic with equality such that T(ε(a, b)) = Fb(ιx Fax) (Russellian-type definite description), T(A → B) = TA → TB, T(¬A) = ¬TA, etc.
A concept language with role intersection and number restriction is defined and its modal equivalent is provided. The main reasoning tasks of satisfiability and subsumption checking are formulated in terms of modal logic and an algorithm for their solution is provided. An axiomatization for a restricted graded modal language with intersection of modalities (the modal counterpart of the concept language we examine) is given and used in the proposed algorithm.
We present an extension of the mosaic method aimed at capturing many-dimensional modal logics. As a proof-of-concept, we define the method for logics arising from the combination of linear tense operators with an “orthogonal” S5-like modality. We show that the existence of a model for a given set of formulas is equivalent to the existence of a suitable set of partial models, called mosaics, and apply the technique not only in obtaining a proof of decidability and a proof of completeness for the corresponding Hilbert-style axiomatization, but also in the development of a mosaic-based tableau system. We further consider extensions for dealing with the case when interactions between the two dimensions exist, thus covering a wide class of bundled Ockhamist branching-time logics, and present for them some partial results, such as a non-analytic version of the tableau system.
ABSTRACT: For the Stoics, a syllogism is a formally valid argument; the primary function of their syllogistic is to establish such formal validity. Stoic syllogistic is a system of formal logic that relies on two types of argumental rules: (i) 5 rules (the accounts of the indemonstrables) which determine whether any given argument is an indemonstrable argument, i.e. an elementary syllogism the validity of which is not in need of further demonstration; (ii) one unary and three binary argumental rules which establish the formal validity of non-indemonstrable arguments by analysing them in one or more steps into one or more indemonstrable arguments (cut-type rules and antilogism). The function of these rules is to reduce given non-indemonstrable arguments to indemonstrable syllogisms. Moreover, the Stoic method of deduction differs from standard modern ones in that the direction is reversed (similar to tableau methods). The Stoic system may hence be called an argumental reductive system of deduction. In this paper, a reconstruction of this system of logic is presented, and similarities to relevance logic are pointed out.
ABSTRACT In this paper we define two logics, KLn and BLn, and present tableau-based decision procedures for both. KLn is a temporal logic of knowledge. Thus, in addition to the usual connectives of linear discrete temporal logic, it contains a set of unary modal connectives for representing the knowledge possessed by agents. The logic BLn is somewhat similar; it is a temporal logic that contains connectives for representing the beliefs of agents. In addition to a complete formal definition of the two logics and their decision procedures, the paper includes a brief review of their applications in AI and mainstream computer science, correctness proofs for the decision procedures, a number of worked examples illustrating the decision procedures, and some pointers to further work.
The ancient Greek method of analysis has a rational reconstruction in the form of the tableau method of logical proof. This reconstruction shows that the format of analysis was largely determined by the requirement that proofs could be formulated by reference to geometrical figures. In problematic analysis, it has to be assumed not only that the theorem to be proved is true, but also that it is known. This means using epistemic logic, where instantiations of variables are typically allowed only with respect to known objects. This requirement explains the preoccupation of Greek geometers with questions as to which geometrical objects are 'given', that is, known or 'data', as in the title of Euclid's eponymous book. In problematic analysis, constructions had to rely on objects that are known only hypothetically. This seems strange unless one relies on a robust idea of 'unknown' objects in the same sense as the unknowns of algebra. The Greeks did not have such a concept, which made their grasp of the analytic method shaky.
We give sound and complete tableau and sequent calculi for the propositional normal modal logics S4.04, K4B and G0 (these logics are the smallest normal modal logics containing K and the schemata A A, A A and A ( A); A A and AA; A A and ((A A) A) A resp.) with the following properties: the calculi for S4.04 and G0 are cut-free and have the interpolation property, the calculus for K4B contains a restricted version of the cut-rule, the so-called analytical cut-rule. In addition we show that G0 is not compact (and therefore not canonical), and we prove with the tableau method that G0 is characterised by the class of all finite, (transitive) trees of degenerate or simple clusters of worlds; therefore G0 is decidable and also characterised by the class of all frames for G0.
A proof method for automation of reasoning in a paraconsistent logic, the calculus C1* of da Costa, is presented. The method is analytical, using a specially designed tableau system. Actually two tableau systems were created. The first, with a small number of rules in order to be mathematically convenient, is used to prove the soundness and the completeness of the method. The other, which is equivalent to the former, is a system of derived rules designed to enhance computational efficiency. A prototype based on this second system was effectively implemented.
We define a tableau calculus for the logic of only knowing and knowing at most ON, which is an extension of Levesque's logic of only knowing O. The method is based on the possible-world semantics of the logic ON, and can be considered as an extension of known tableau calculi for modal logic K45. From the technical viewpoint, the main features of such an extension are the explicit representation of "unreachable" worlds in the tableau, and an additional branch closure condition implementing the property that each world must be either reachable or unreachable. The calculus allows for establishing the computational complexity of reasoning about only knowing and knowing at most. Moreover, we prove that the method matches the worst-case complexity lower bound of the satisfiability problem for both ON and O. With respect to the earlier work in which the tableau calculus was originally presented, in this paper we both provide a formal proof of soundness and completeness of the calculus, and prove the complexity results for the logic ON.
The paper is devoted to an approach to analytic tableaux for propositional logic that can be successfully extended to other logics. The distinguishing features of the presented approach are: (i) a precise set-theoretical description of the tableau method; (ii) a notion of tableau consequence relation is defined without the help of a notion of tableau; in our universe of discourse the basic notion is a branch; (iii) we define a tableau as a finite set of some chosen branches which is enough to check; hence, in our approach a tableau is only a way of choosing a minimal set of closed branches; (iv) the choice of tableau can be arbitrary, meaning that if one tableau starting with some set of premisses is closed in the defined sense, then every branch in the power set of the set of formulas that starts with the same set is closed.
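The branch-centred view sketched in this abstract can be illustrated with a minimal, generic propositional tableau for formulas in negation normal form. This is a textbook sketch, not the set-theoretical apparatus of the paper itself; the tuple encoding and function names are our own illustrative choices.

```python
# Formulas in negation normal form, encoded as nested tuples:
#   ("lit", "p", True)  -> the literal p
#   ("lit", "p", False) -> the literal not-p
#   ("and", A, B) and ("or", A, B) -> conjunction and disjunction

def expand(branch):
    """Fully expand a branch; return the list of resulting branches."""
    for i, f in enumerate(branch):
        if f[0] == "and":
            rest = branch[:i] + branch[i + 1:]
            return expand(rest + [f[1], f[2]])          # both conjuncts, same branch
        if f[0] == "or":
            rest = branch[:i] + branch[i + 1:]
            return expand(rest + [f[1]]) + expand(rest + [f[2]])  # branch splits
    return [branch]          # only literals left: branch is fully expanded

def closed(branch):
    """A branch is closed iff it contains a literal and its negation."""
    lits = {(name, sign) for (_, name, sign) in branch}
    return any((name, not sign) in lits for (name, sign) in lits)

def satisfiable(formula):
    """A formula is satisfiable iff some fully expanded branch stays open."""
    return any(not closed(b) for b in expand([formula]))

p, not_p, q = ("lit", "p", True), ("lit", "p", False), ("lit", "q", True)
print(satisfiable(("and", p, not_p)))             # False: every branch closes
print(satisfiable(("or", ("and", p, q), not_p)))  # True: an open branch remains
```

On this view the tableau itself never needs to be built as a tree: `expand` manipulates branches directly, and closure is a property of individual branches, which matches the paper's point that the branch is the basic notion.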
The modern notion of the axiomatic method developed as a part of the conceptualization of mathematics starting in the nineteenth century. The basic idea of the method is the capture of a class of structures as the models of an axiomatic system. The mathematical study of such classes of structures is not exhausted by the derivation of theorems from the axioms but includes normally the metatheory of the axiom system. This conception of axiomatization satisfies the crucial requirement that the derivation of theorems from axioms does not produce new information in the usual sense of the term called depth information. It can produce new information in a different sense of information called surface information. It is argued in this paper that the derivation should be based on a model-theoretical relation of logical consequence rather than derivability by means of mechanical (recursive) rules. Likewise completeness must be understood by reference to a model-theoretical consequence relation. A correctly understood notion of axiomatization does not apply to purely logical theories. In the latter the only relevant kind of axiomatization amounts to recursive enumeration of logical truths. First-order “axiomatic” set theories are not genuine axiomatizations. The main reason is that their models are structures of particulars, not of sets. Axiomatization cannot usually be motivated epistemologically, but it is related to the idea of explanation.
This article presents an interview method which enables us to bring a person, who may not even have been trained, to become aware of his or her subjective experience, and describe it with great precision. It is focused on the difficulties of becoming aware of one’s subjective experience and describing it, and on the processes used by this interview technique to overcome each of these difficulties. The article ends with a discussion of the criteria governing the validity of the descriptions obtained, and then with a brief review of the functions of these descriptions.
In this paper, I argue that the method of cases (namely, the method of using intuitive judgments elicited by intuition pumps as evidence for and/or against philosophical theories) is not a reliable method of generating evidence for and/or against philosophical theories. In other words, the method of cases is unlikely to generate accurate judgments more often than not. This is so because, if perception and intuition are analogous in epistemically relevant respects, then using intuition pumps to elicit intuitive judgments is like using illusions to elicit perceptual judgments. In both cases, judgments are made under bad epistemic circumstances.
Kant maintains that his Critique of Pure Reason follows a “synthetic method” which he distinguishes from the analytic method of the Prolegomena by saying that the Critique “rests on no other science” and “takes nothing as given except reason itself”. The paper presents an account of the synthetic method of the Critique, showing how it is related to Kant’s conception of the Critique as the “science of an a priori judging reason”. Moreover, the author suggests, understanding its synthetic method sheds light on the structure of the Transcendental Deduction, and its function in the work as a whole.
Wittgenstein’s interpreters agree that method plays a central role in his philosophy. This would be no surprise if we have in mind the Tractarian dictum: “philosophy is not a body of doctrine but an activity” (4.112). After 1929, Wittgenstein’s method evolved further. In its final form, articulated in Philosophical Investigations, it was formulated as different kinds of therapies of specific philosophical problems that torment our life (§§ 133, 255, 593). In this paper we follow the changes in Wittgenstein’s thinking in four subsequent phases and in three dimensions: (i) in logic and ontology; (ii) in method proper; (iii) in style.
In this paper I discuss a set of problems concerning the method of cases as it is used in applied ethics and in the metaphysical debate about personal identity. These problems stem from research in social psychology concerning our access to the data with which the method operates. I argue that the issues facing ethics are more worrying than those facing metaphysics.
Progress in the last few decades in what is widely known as “Chaos Theory” has plainly advanced understanding in the several sciences it has been applied to. But the manner in which such progress has been achieved raises important questions about scientific method and, indeed, about the very objectives and character of science. In this presentation, I hope to engage my audience in a discussion of several of these important new topics.
This paper offers an evolutionary account of chronic pain. Chronic pain is a maladaptive by-product of pain mechanisms and neural plasticity, both of which are highly adaptive. This account shows how evolutionary psychology can be integrated with Flanagan's natural method, and in a way that avoids the usual charges of Panglossian adaptationism and an uncritical commitment to a modular picture of the mind. Evolutionary psychology is most promising when it adopts a bottom-up research strategy that focuses on basic affective and motivational systems (as opposed to higher cognitive functions) that are phylogenetically deep.
In this paper I examine a controversy ongoing within current Deweyan philosophy of education scholarship regarding the proper role and scope of science in Dewey's concept of inquiry. The side I take is nuanced. It is one that is sensitive to the importance that Dewey attaches to science as the best method of solving problems, while also sensitive to those statements in Dewey that counter a wholesale reductivism of inquiry to scientific method. I utilize Dewey's statements regarding the place accorded to inquiry in aesthetic experiences as characteristic of his method, as best conceived.
This paper reconsiders the relation between Kantian transcendental reflection (including transcendental idealism) and 20th century philosophy of science. As has been pointed out by Michael Friedman and others, the notion of a "relativized a priori" played a central role in Rudolf Carnap's, Hans Reichenbach's and other logical empiricists' thought. Thus, even though the logical empiricists dispensed with Kantian synthetic a priori judgments, they did maintain a crucial Kantian doctrine, viz., a distinction between the (transcendental) level of establishing norms for empirical inquiry and the (empirical) level of norm-governed inquiry itself. Even though Thomas Kuhn's theory of scientific revolutions is often taken to be diametrically opposed to the received view of science inherited from logical empiricism, a version of this basically Kantian distinction is preserved in Kuhn's thought. In this respect, as Friedman has argued, Kuhn is closer to Carnap's theory of linguistic frameworks than, say, W.V. Quine's holistic naturalism. Kuhn, indeed, might be described as a "new Kant" in post-empiricist philosophy of science. This article examines, first, the relativization of the Kantian a priori in Reichenbach's work, arguing that while Reichenbach (after having given up his original Kantianism) criticized "transcendentalism", he nevertheless retained, in a reinterpreted form, a Kantian-like transcendental method, claiming that the task of philosophy (of science) is to discover and analyze the presuppositions underlying the applicability of conceptual systems. Then, some reflections on Kuhn's views on realism are offered, and it is suggested that Kuhn (as well as some other influential contributors to the realism debate, such as Hilary Putnam) can be reinterpreted as a (relativized, naturalized) Kantian transcendental idealist.
Given the central importance of Kuhnian themes in contemporary philosophy of science, it is no exaggeration to claim that Kantian transcendental inquiry into the constitutive principles of empirical knowledge, and even transcendental idealism (as the framework for such inquiry), still have a crucial role to play in this field and deserve further scrutiny.
This essay focuses on the extent to which the methods of analytic philosophy can be useful to feminist philosophers. I pose nine general questions feminist philosophers might ask to determine the suitability of a philosophical method. Examples include: Do its typical ways of formulating problems or issues encourage the inclusion of a wide variety of women's points of view? Are its central concepts gender-biased, not merely in their origin, but in very deep, continuing ways? Does it facilitate uncovering roles that gender, politics, power, and social context play in philosophy as well as in other facets of life?
The project method became a famous teaching method when William Heard Kilpatrick published his article ‘Project Method’ in 1918. The key idea in Kilpatrick's project method is to try to explain how pupils learn things when they work in projects toward different common objects. The same idea of pupils learning by work or action in an environment with objects also belongs to John Dewey's problem-solving method. Are Kilpatrick's project method and Dewey's problem-solving method the same thing? The aim of this article is to analyze and prove that Kilpatrick's project method differs radically from Dewey's problem-solving method.
The argument diagramming method developed by Monroe C. Beardsley in his (1950) book Practical Logic, which has since become the gold standard for diagramming arguments in informal logic, makes it possible to map the relation between premises and conclusions of a chain of reasoning in relatively complex ways. The method has since been adapted and developed in a number of directions by many contemporary informal logicians and argumentation theorists. It has proved useful in practical applications and especially pedagogically in teaching basic logic and critical reasoning skills at all levels of scientific education. I propose in this essay to build on Beardsley diagramming techniques to refine and supplement their structural tools for visualizing logical relationships in a number of categories not originally accommodated by Beardsley diagramming, including circular reasoning, reductio ad absurdum arguments, and efforts to dispute and contradict arguments, with applications and analysis.
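A diagram of the kind this abstract describes is, at bottom, a directed graph from premises to conclusions. The following sketch uses our own minimal encoding (class and method names are illustrative, not Beardsley's notation) to show how such a support structure can be represented and queried.

```python
# A Beardsley-style argument diagram as a directed graph:
# add_support(p, c) records that statement p is offered in support of c.
from collections import defaultdict

class ArgumentDiagram:
    def __init__(self):
        self.supports = defaultdict(list)   # conclusion -> list of premises

    def add_support(self, premise, conclusion):
        self.supports[conclusion].append(premise)

    def premises_for(self, conclusion):
        return self.supports[conclusion]

    def ultimate_conclusions(self):
        """Statements supported by others but supporting nothing further."""
        premises = {p for ps in self.supports.values() for p in ps}
        return [c for c in self.supports if c not in premises]

d = ArgumentDiagram()
d.add_support("1. The streets are wet", "3. It rained last night")
d.add_support("2. The grass is wet", "3. It rained last night")
d.add_support("3. It rained last night", "4. The match was cancelled")
print(d.ultimate_conclusions())   # ['4. The match was cancelled']
```

Extensions of the kind the essay proposes, such as flagging circular reasoning, would amount to cycle detection in this graph; marking a reductio would require labelling edges rather than merely recording them.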
Both Plato and Kant devote much attention and care to deliberating about their method of philosophizing. And, interestingly, both seek to expand and explain their view of philosophical method by one selfsame strategy: explaining the contrast between rational procedure in mathematics and in philosophy. Plato and Kant agree on a fundamental point of philosophical method that is at odds with the mathematico-demonstrative methodology of philosophy found in Spinoza and present in Christian Wolff. Both reject the axiomatic approach with its insistence on fundamental truths postulated from the outset. Both alike insist that philosophizing—unlike mathematics—is an exercise in theorizing where the questions of basicness and foundations come into view only after the inquiry has gone on for a long, long time—and certainly not at its start.
The companion piece to this article, “Situating Moral Justification,” challenges the idea that moral epistemology's mission is to establish a single, all-purpose reasoning strategy for moral justification because no reasoning practice can be expected to deliver authoritative moral conclusions in all social contexts. The present article argues that rethinking the mission of moral epistemology requires rethinking its method as well. Philosophers cannot learn which reasoning practices are suitable to use in particular contexts exclusively by exploring logical relations among concepts. Instead, in order to understand which reasoning practices are capable of justifying moral claims in different types of contexts, we need to study empirically the relationships between reasoning practices and the contexts in which they are used. The article proposes that philosophers investigate case studies of real-world moral disputes in which people lack shared cultural assumptions and/or are unequal in social power. It motivates and explains the proposed case study method and illustrates the philosophical value of this method through a case study.
In this article, I propose that illness is philosophically revealing and can be used to explore human experience. I suggest that illness is a limit case of embodied experience. By pushing embodied experience to its limit, illness sheds light on normal experience, revealing its ordinary and thus overlooked structure. Illness produces a distancing effect, which allows us to observe normal human behavior and cognition via their pathological counterpart. I suggest that these characteristics warrant granting illness a philosophical role that has not been articulated. Illness can be used as a philosophical tool for the study of normally tacit aspects of human existence. I argue that illness itself can be integral to philosophical method, insofar as it facilitates a distancing from everyday practices. This method relies on pathological or limit cases to illuminate normally overlooked aspects of human perception and action. I offer Merleau-Ponty’s analysis of the case of Schneider as an example of this method.
This paper contributes to the principled construction of tableau-based decision procedures for hybrid logic with global, difference, and converse modalities. We also consider reflexive and transitive relations. For converse-free formulas we present a terminating control that does not rely on the usual chain-based blocking scheme. Our tableau systems are based on a new model existence theorem.
The origins of field guides and other plant identification manuals have been poorly understood until now because little attention has been paid to 18th century botanical identification guides. Identification manuals came to have the format we continue to use today when botanical instructors in post-Revolutionary France combined identification keys (step-wise analyses focusing on distinctions between plants) with the "natural method" (clustering of similar plants, allowing for identification by gestalt) and alphabetical indexes. Botanical works featuring multiple but linked techniques to enable plant identification became very popular in France by the first decade of the 19th century. British botanists, however, continued to use Linnaeus's sexual system almost exclusively for another two decades. Their reluctance to use other methods or systems of classification can be attributed to a culture suspicious of innovation, anti-French sentiment and the association of all things Linnaean with English national pride, fostered in particular by the President of the Linnean Society of London, Sir James Edward Smith. The British aversion to using multiple plant identification technologies in one text also helps explain why it took so long for English botanists to adopt the natural method, even after several Englishmen had tried to introduce it to their country. Historians of ornithology emphasize that the popularity of ornithological guides in the 19th and 20th centuries stems from their illustrations, illustrations made possible by printing technologies that improved illustration quality and reduced costs. Though illustrations are the most obvious features of late 19th century and 20th century guides, the organizational principles that make them functional as identification devices come from techniques developed in botanical works in the 18th century.
Because imagination constitutes an indispensable tool of phenomenology, e.g., in understanding another author’s description, in eidetic reduction, etc., the practicability of phenomenological method and its claim to objectivity ought to be reconsidered with regard to its dependence on imagination. Auditory imagery serves to illustrate problems involved in grasping and analyzing imaginative contents – loudness in this case. Similar to phonetic segmentation and classification, phenomenologists segment and classify mental acts and contents. Just as phoneticians rely on experts’ evaluations of notations to reach valid results, phenomenologists may try to develop similar agreement procedures to escape the ‘subjectivism’ of their solitary first-person approach.
To find exact traveling wave solutions to nonlinear evolution equations, we propose a method combining symmetry properties with trial polynomial solutions to nonlinear ordinary differential equations. By the method, we obtain some exact traveling wave solutions to the Burgers-KdV equations and a kind of reaction-diffusion equations with high order nonlinear terms. As a result, we prove that the Burgers-KdV equation has no real solution in the form a₀ + a₁ tan ξ + a₂ tan² ξ, which indicates that some types of the solutions to the Burgers-KdV equation are very limited, that is, there exists no new solution to the Burgers-KdV equation if the degree of the corresponding polynomial increases. For the second equation, we obtain some new solutions. In particular, some interesting structures in those solutions may imply some physical meanings. Finally, we discuss some classifications of the reaction-diffusion equations which can be solved by the trial equation method.
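The trial-solution idea can be illustrated on the plain Burgers equation u_t + u·u_x = ν·u_xx, a simpler relative of the Burgers-KdV equation studied in the paper (the constants below are our own illustrative choices, not taken from the paper). Substituting the traveling-wave ansatz u = f(ξ), ξ = x − ct, and integrating once reduces the PDE to −c·f + f²/2 − ν·f′ = const; the well-known kink ansatz f(ξ) = c − 2νk·tanh(kξ) satisfies it, which the sketch below checks numerically.

```python
import math

def residual(xi, c=2.0, nu=1.0, k=0.5):
    """Left-hand side -c*f + f**2/2 - nu*f' of the once-integrated
    traveling-wave ODE for Burgers' equation, evaluated at xi.
    For the kink ansatz f = c - 2*nu*k*tanh(k*xi) it should be constant."""
    t = math.tanh(k * xi)
    f = c - 2.0 * nu * k * t
    fp = -2.0 * nu * k**2 * (1.0 - t**2)   # exact derivative of the ansatz
    return -c * f + f**2 / 2.0 - nu * fp

values = [residual(xi) for xi in (-5.0, -1.0, 0.0, 1.0, 5.0)]
spread = max(values) - min(values)
print(spread < 1e-12)   # True: the ansatz solves the integrated ODE
```

For these constants the expression is identically −c²/2 + ν²k², so the spread across sample points is zero up to floating-point rounding; a trial function that failed the equation would make the residual vary with ξ, which is exactly how trial-solution methods reject candidate forms such as the tan-polynomial ruled out in the paper.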
More and more organisations formulate a code of conduct in order to stimulate responsible behaviour among their members. Much time and energy is usually spent fixing the content of the code but many organisations get stuck in the challenge of implementing and maintaining the code. The code then turns into nothing else than the notorious "paper in the drawer", without achieving its aims. The challenge of implementation is to utilize the dynamics which have emerged from the formulation of the code. This will support a continuous process of reflection on the central values and standards contained in the code. This paper presents an assessment method, based on the EFQM model, which intends to support this implementation process.
This paper introduces the likelihood method for decision under uncertainty. The method allows the quantitative determination of subjective beliefs or decision weights without invoking additional separability conditions, and generalizes the Savage–de Finetti betting method. It is applied to a number of popular models for decision under uncertainty. In each case, preference foundations result from the requirement that no inconsistencies are to be revealed by the version of the likelihood method appropriate for the model considered. A unified treatment of subjective decision weights results for most of the decision models popular today. Savage’s derivation of subjective expected utility can now be generalized and simplified. In addition to the intuitive and empirical contributions of the likelihood method, we provide a number of technical contributions: We generalize Savage’s nonatomicity condition (“P6”) and his assumption of (σ-)algebras of events, while fully maintaining his flexibility regarding the outcome set. Derivations of Choquet expected utility and probabilistic sophistication are generalized and simplified similarly. The likelihood method also reveals a common intuition underlying many other conditions for uncertainty, such as definitions of ambiguity aversion and pessimism.
The Socratic method has a long history in teaching philosophy and mathematics, marked by such names as Karl Weierstraß, Leonard Nelson and Gustav Heckmann. Its basic idea is to encourage the participants of a learning group (of pupils, students, or practitioners) to work on a conceptual, ethical or psychological problem by their own collective intellectual effort, without a textual basis and without substantial help from the teacher whose part it is mainly to enforce the rigid procedural rules designed to ensure a fruitful, diversified, open and consensus-oriented thought process. Several features of the Socratic procedure, especially in the canonical form given to it by Heckmann, are highly attractive for the teaching of medical ethics in small groups: the strategy of starting from relevant singular individual experiences, interpreting and cautiously generalizing them in a process of inter-subjective confrontation and confirmation, the duty of non-directivity on the part of the teacher in regard to the contents of the discussion, the necessity, on the part of the participants, to make explicit both their own thinking and the way they understand the thought of others, the strict separation of content level and meta level discussion and, not least, the wise use made of the emotional and motivational resources developing in the group process. Experience shows, however, that the canonical form of the Socratic group suffers from a number of drawbacks which may be overcome by loosening the rigidity of some of the rules. These concern mainly the injunction against substantial interventions on the part of the teacher and the insistence on consensus formation rooted in Leonard Nelson's Neo-Kantian Apriorism.
The main claim of this paper is that the method outlined and used in Aristotle’s Ethics is an appropriate and credible one to use in bioethics. Here “appropriate” means that the method is capable of establishing claims and developing concepts in bioethics and “credible” that the method has some plausibility, it is not open to obvious and immediate objection. It begins by suggesting why this claim matters and then gives a brief outline of Aristotle’s method. The main argument is made in three stages. First, it is argued that Aristotelian method is credible because it compares favourably with alternatives. In this section it is shown that Aristotelian method is not vulnerable to criticisms that are made both of methods that give a primary place to moral theory (such as utilitarianism) and those that eschew moral theory (such as casuistry and social science approaches). As such, it compares favourably with these other approaches that are vulnerable to at least some of these criticisms. Second, the appropriateness of Aristotelian method is indicated through outlining how it would deal with a particular case. Finally, it is argued that the success of Aristotle’s philosophy is suggestive of both the credibility and appropriateness of his method.
Springer link: http://www.springer.com/philosophy/logic+and+philosophy+of+language/book/978-94-007-6090-5 This volume examines the limitations of mathematical logic and proposes a new approach to logic intended to overcome them. To this end, the book compares mathematical logic with earlier views of logic, both in the ancient and in the modern age, including those of Plato, Aristotle, Bacon, Descartes, Leibniz, and Kant. From the comparison it is apparent that a basic limitation of mathematical logic is that it narrows down the scope of logic, confining it to the study of deduction, without providing tools for discovering anything new. As a result, mathematical logic has had little impact on scientific practice. Therefore, this volume proposes a view of logic according to which logic is intended, first of all, to provide rules of discovery, that is, non-deductive rules for finding hypotheses to solve problems. This is essential if logic is to play any relevant role in mathematics, science and even philosophy. To comply with this view of logic, this volume formulates several rules of discovery, such as induction, analogy, generalization, specialization, metaphor, metonymy, definition, and diagrams. A logic based on such rules is basically a logic of discovery, and involves a new view of the relation of logic to evolution, language, reason, method and knowledge, particularly mathematical knowledge. It also involves a new view of the relation of philosophy to knowledge. This book puts forward such new views, trying to open again many doors that the founding fathers of mathematical logic had closed.
This article reflects on Truth and Method, the seminal work of Hans-Georg Gadamer. The main argument developed here justifies why the work has become a classic in the philosophical literature. Further arguments survey the thematic aspects that make up the book and the importance that Truth and Method grants to humanism as a horizon from which the status of the humanities and humanistic knowledge is justified. The article also presents an accessible approach to the main categories of Gadamer's hermeneutics and is an open invitation to re-read this classic.
This article considers the reception of British cytogeneticist C.D. Darlington's controversial 1932 book, Recent Advances in Cytology. Darlington's cytogenetic work, and the manner in which he made it relevant to evolutionary biology, marked an abrupt shift in the status and role of cytology in the life sciences. By focusing on Darlington's scientific method -- a stark departure from anti-theoretical, empirical reasoning to a theoretical and speculative approach based on deduction from genetic first principles -- the article characterises the relationships defining the "disciplinary landscape" of the life sciences of the time, namely those between cytology, genetics, and evolutionary theory.
In this paper, I ponder the question of whether Socrates follows a method of investigation — the method of hypothesis — which he advocates in Plato's Phaedo. The evidence in the dialogue suggests that he does not follow the method, which raises additional questions: If he fails to do so, why does he articulate the method? Does his statement of method affect his actions or is it mainly forgotten? Although Socrates is a fictional character, his actions in the Phaedo suggest questions about the function of espoused methods in actual situations.
We demonstrate an automated, multi-level method to segment white matter brain lesions and apply it to lupus. The method makes use of local morphometric features based on multiple MR sequences, including T1-weighted, T2-weighted, and Fluid-Attenuated Inversion Recovery (FLAIR). After preprocessing, including co-registration, brain extraction, bias correction, and intensity standardization, 49 features are calculated for each brain voxel based on local morphometry. At each level of segmentation, a supervised classifier takes advantage of a different subset of the features to conservatively segment lesion voxels, passing on more difficult voxels to the next classifier. This multi-level approach allows for a fast lesion classification method with tunable trade-offs between sensitivity and specificity, producing accuracy comparable to that of a human rater.
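The cascade idea, in which each level labels only the voxels it is confident about and passes harder ones on to the next classifier, can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function name, the confidence threshold, and the stand-in scorers are all hypothetical, with the scorers taking the place of supervised classifiers trained on different subsets of the 49 morphometric features.

```python
import numpy as np

def cascade_segment(features, scorers, confident=0.9):
    """Multi-level cascade segmentation sketch.

    Each scorer maps a (k, d) feature array to lesion probabilities in [0, 1].
    A level commits only to voxels scored confidently lesion or confidently
    normal; everything else is deferred to the next level.
    """
    n = features.shape[0]
    labels = np.full(n, -1)        # -1 = not yet decided
    pending = np.arange(n)         # indices of undecided voxels
    for score in scorers:
        if pending.size == 0:
            break
        p = score(features[pending])
        lesion = p >= confident            # confidently lesion
        normal = p <= 1.0 - confident      # confidently normal
        labels[pending[lesion]] = 1
        labels[pending[normal]] = 0
        pending = pending[~(lesion | normal)]
    labels[pending] = 0            # conservative fallback: call the rest normal
    return labels
```

Defaulting undecided voxels to non-lesion mirrors the conservative segmentation at each level; raising or lowering `confident` is one way to realise the sensitivity/specificity trade-off mentioned above.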
The problem of method is the problem of knowledge itself. As is well known, method (méthodos) also means way (odós). Philosophy is at the same time a form of knowledge and a technique of argumentation. In this second meaning, philosophy is mainly an art of both the logical and the rhetorical word. Consequently a profane voice reveals itself in the signs of philosophic knowledge, which on the one hand establishes the "logistic" (logistiké) soul, and on the other the totalizing view of truth. The philosophic voice thus requires purification of word and writing from any compromise with the sensible, and identification of truth with the ultrasensible panoramic view. This process is reflected in philosophic writing, which is the origin and the foundation of today's world of science and technology. Scientific and technical knowledge is therefore concentrated on the question of method and loses any reference to knowledge as a way, as well as any sensible relationship with things. Hence the present-day crisis of the meaning of knowledge. Deep understanding of such a crisis requires, first of all, recognition of the fact that the act of thinking is wider than the art of the logistic and rhetorical word established by philosophy. In the second place, it requires that a hermeneutic writing of the event of truth be reached, a writing more original than the traditional logical transcription of voice.
The paper concerns approval voting, under which the voter casts a ballot by casting one vote for each of k candidates, where k = 1, 2, …, m − 1 and there are m candidates. I assume (following Brams and Fishburn) that each of the voter's 2^m − 2 strategies is equally likely to be chosen. Election-outcome types include: the m-way tie; (m − 1)-way ties with the runner-up trailing by 1, 2, …, m votes; (m − 2)-way ties, and so on. The frequency distribution of outcome types varies only with m and n and is necessary to the calculation of the expected utilities of successive ballots cast, in the same election, by a voter under a variant of approval voting. This variant allows the voter to cast several complete ballots provided that he pays the respective prices, which could reasonably be based on the expected utilities. The paper describes a shortcut method of calculating the distribution of outcome types when m = 4 and n rises to levels that make straightforward calculation computationally infeasible. The shortcut involves combining an outcome type, instead of each member of that type, with each of the 14 strategies available to the incremental voter. In going from n − 1 to n, for n ≥ 3, the number of outcome types increases by a factor of (n + 3)/n, whereas the number of combinations of strategies increases by a factor of 14.
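For small m and n, the frequency distribution of outcome types can be obtained by the straightforward enumeration that the shortcut is designed to avoid. The sketch below is illustrative only, not the paper's method: it assumes the Brams-Fishburn setup described above, with each voter independently casting one of the 2^m − 2 admissible ballots, and it represents an outcome type as the vote tally sorted in descending order.

```python
from itertools import combinations, product
from collections import Counter

def outcome_type_distribution(m, n):
    """Brute-force frequency distribution of election-outcome types.

    Each of n voters independently casts one of the 2**m - 2 approval
    ballots (every nonempty proper subset of the m candidates).  An
    outcome type is the tally of votes sorted in descending order, so
    e.g. (2, 2, 0) is a two-way tie with the third candidate trailing.
    """
    ballots = [c for k in range(1, m) for c in combinations(range(m), k)]
    assert len(ballots) == 2**m - 2   # admissible strategies per voter
    dist = Counter()
    for profile in product(ballots, repeat=n):   # all (2**m - 2)**n profiles
        tally = [0] * m
        for ballot in profile:
            for cand in ballot:
                tally[cand] += 1
        dist[tuple(sorted(tally, reverse=True))] += 1
    return dist
```

Since the number of profiles grows as 14^n when m = 4, this enumeration quickly becomes infeasible, which is precisely the factor-of-14 blow-up that motivates the shortcut of combining each outcome type, rather than each of its members, with the incremental voter's strategies.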
I argue that the most significant contribution and legacy of Gordon Kaufman's work rests in his theological method. I limit my discussion to his methodological starting point, his concept of human nature, as he develops it in his book, In Face of Mystery. I show the relevance of this starting point for cultural and theological criticism by arguing two points: first, that this starting point embraces religious and cultural pluralism at its center, providing a framework for intercultural and interreligious discussion and cooperation, and second, that Kaufman's interpretation of religion that emerges out of this starting point embodies pragmatic criteria for evaluating and reconstructing alternative cultural and religious worldviews, so that they may function more adequately within the changing contexts of life.
The advance of science and human knowledge is impeded by misunderstandings of various statistics, insufficient reporting of findings, and the use of numerous standardized and non-standardized presentations of essentially identical information. Communication with journalists and the public is hindered by the failure to present statistics that are easy for non-scientists to interpret, as well as by use of the word "significant," which in scientific English does not carry the meaning of "important" or "large." This article promotes a new standard method for reporting two-group and two-variable statistics that can enhance the presentation of relevant information, increase understanding of findings, and replace the current presentations of two-group ANOVA, t-tests, correlations, chi-squares, and z-tests of proportions. A brief call to highly restrict the publication of risk ratios, odds ratios, and relative increases in risk percentages is also made, since these statistics appear to provide no useful scientific information regarding the magnitude of findings.
Drawing on the intellectual tradition of the leading comparative political science scholar, Giovanni Sartori, the contributors examine the theoretical and methodological basis of: Concept Analysis, Comparative Political Analysis and Qualitative Methods.