The intention of the present work is to explore the means-ends relation, showing that in John Dewey's work it appears as a coherent and consistent path and horizon for his philosophical and educational proposal. The means-ends relation guides the investigation, leading us to an understanding of knowledge as an evaluative and transformative practice. This epistemological conception lays the foundations for rethinking education and democracy, and provides sufficient elements for overcoming the dichotomies of thought and emotion, facts and values, and society and individual. The method employed is inquiry. Education and democracy are the spaces in which the means-ends relation, whether in its natural or adaptive phase, in its moral or critical phase, or in its social or public phase, finds unity and continuity. In an education of, by, and for democracy, intelligence and activity are mutually interrelated.
John Dewey’s philosophy has the notion of experience as its backbone. The present paper regards experience both as a method for doing philosophy and as a commitment grounding Dewey's entire philosophical project. Experience allows cognition and affection to be connected in a natural way. Intelligent inquiry and artistic creation-appreciation are two modes of guiding the progress of experience toward deeper levels of meaning out of its own resources. The ideal way of taking part in experience, which is, it is argued, intelligent and aesthetic, finds its strongest expression in the ideal of democracy. For this reason, Dewey's philosophy of experience works as a way of interpreting democratic ideals.
A good deal of contemporary moral nonconsequentialism assumes that agents have perfect knowledge about the various features and consequences of their options. This assumption is unrealistic. More often than not, moral agents can only assess with a certain degree of probability the factual circumstances that are morally relevant for their decision making. My aim in this essay is to discuss the problem of moral decisions under risk from the point of view of nonconsequentialism. Basically, I analyze how objective moral principles can be transformed into subjective, decisional prescriptions, and argue that the standard nonconsequentialist approach to moral decision making, which focuses on probability thresholds, is wrong. In accordance with the fundamental postulates of nonconsequentialism, I seek to solve the problem of risk in moral choice by proposing a theory about the marginal moral value of various options. Actions can vary along various dimensions, and each of these dimensions can offer a different moral value function. Nonconsequentialist marginalism can level the playing field with consequentialism. Whereas consequentialism can simply borrow the notion of expected utility from economics, nonconsequentialism must introduce the notion of expectational obligation to formulate a general principle of moral choice under risk. I finally suggest that further empirical work is needed to delineate the shape of various moral value functions that are critical for applying the general principle of moral decision making under risk to well-known cases.
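For orientation, the expected-utility principle that consequentialism borrows from economics is standardly written as below; the expectational-obligation analogue placed next to it is only an illustrative guess at the shape such a principle might take, not the author's formulation:

\[ EU(a) = \sum_{s \in S} P(s)\, u\big(o(a,s)\big), \qquad EO(a) = \sum_{s \in S} P(s) \sum_{d \in D} v_d(a,s), \]

where S is the set of possible states, o(a,s) the outcome of performing a in s, u a utility function, and, in the hypothetical analogue, v_d the moral value function attached to dimension d of the action.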
How should one accept a conditional? F. P. Ramsey proposed the following test (Ramsey 1990): (RT) 'If A, then B' must be accepted with respect to the current epistemic state iff the minimal hypothetical change of it needed to accept A also requires accepting B.
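In the notation of the belief-revision literature, (RT) is standardly rendered as follows, where K is the current belief set, * a revision operator, and A > B the conditional 'If A, then B' (a standard formulation, given here for reference):

\[ A > B \in K \iff B \in K * A. \]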
The paper studies first-order extensions of classical systems of modal logic (see Chellas, 1980, part III). We focus on the role of the Barcan formulas. It is shown that these formulas correspond to fundamental properties of neighborhood frames. The results have interesting applications in epistemic logic. In particular, we suggest that the proposed models can be used to study monadic operators of probability (Kyburg, 1990) and likelihood (Halpern-Rabin, 1987).
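For reference, the Barcan formula and its converse, whose neighborhood-frame correspondents the paper establishes, are standardly written:

\[ \text{(BF)}\;\; \forall x\, \Box \varphi \rightarrow \Box \forall x\, \varphi, \qquad \text{(CBF)}\;\; \Box \forall x\, \varphi \rightarrow \forall x\, \Box \varphi. \]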
The didactics of astronomy is a relatively young field compared with that of other sciences. Historical issues have most often been part of the teaching of astronomy, although this often does not stem from a specific didactics. The teaching of astronomy is often subsumed under that of physics. One can easily assume that, from an educational standpoint, astronomy requires the same mathematical or physical strategies. This approach may be adequate in many cases but cannot stand as a general principle for the teaching of astronomy. This chapter offers, in its first part, a brief overview of the status of astronomy education research and of the role of the history and philosophy of science (HPS) in astronomy education. In its second part, it attempts to illustrate possible ways to structure the teaching of astronomy around its historical development so as to pursue a quality education and contextualized learning.
A Computable Universe is a collection of papers discussing computation in nature and the nature of computation, a compilation of the views of the pioneers in the contemporary area of intellectual inquiry focused on computational and informational theories of the world. This volume is the definitive source of informational/computational views of the world, and of cutting-edge models of the universe, both digital and quantum, discussed from a philosophical perspective as well as in the greatest technical detail. The book discusses the foundations of computation in relation to nature. It focuses on two main questions: What is computation? How does nature compute? The contributors are world-renowned experts who have helped shape a cutting-edge computational understanding of the universe. They discuss computation in the world from a variety of perspectives, ranging from foundational concepts to pragmatic models to ontological conceptions and their philosophical implications. The volume provides a state-of-the-art collection of technical papers and non-technical essays representing a field that takes information and computation to be key to understanding and explaining the basic structure underpinning physical reality. It also includes a new edition of Konrad Zuse's "Calculating Space", and a transcription of a panel discussion on the topic, featuring worldwide experts (including a Nobel laureate) in quantum mechanics, physics, cognition, computation and algorithmic complexity.
How should one accept a conditional? F. P. Ramsey proposed the following test: 'If A, then B' must be accepted with respect to the current epistemic state iff the minimal hypothetical change of it needed to accept A also requires accepting B. In this article we propose a formulation of this test which, unlike some of its predecessors, is compatible with our best theory of belief revision, the so-called AGM theory. The new test, which, we claim, encodes some of the crucial insights defended by F. P. Ramsey, is used to study the conditionals epistemically validated by the AGM postulates. Our notion of (positive) validity (PV) is compared with the notion of negative validity (NV) used by Gärdenfors. It is observed that the notions of PV and NV will in general differ, and that when these differences arise it is the notion of PV that is preferable. Finally, we compare our formulation of the Ramsey test with a previous formulation offered by Gärdenfors. We show that any attempt to interpret the latter as delivering acceptance conditions for Ramsey's conditionals is doomed to failure.
The article focuses on representing different forms of non-adjunctive inference as sub-Kripkean systems of classical modal logic, where the inference from □A and □B to □(A ∧ B) fails. In particular, we prove a completeness result showing that the modal system that Schotch and Jennings derive from a form of non-adjunctive inference in (Schotch and Jennings, 1980) is a classical system strictly stronger than EMN and weaker than K (following the notation for classical modalities presented in Chellas, 1980). The unified semantical characterization in terms of neighborhoods permits comparisons between different forms of non-adjunctive inference. For example, we show that the non-adjunctive logic proposed in (Schotch and Jennings, 1980) is not adequate in general for representing the logic of high probability operators. An alternative interpretation of the forcing relation of Schotch and Jennings is derived from the proposed unified semantics and utilized in order to propose a more fine-grained measure of epistemic coherence than the one presented in (Schotch and Jennings, 1980). Finally we propose a syntactic translation of the purely implicative part of Jaśkowski's system D₂ into a classical system preserving all the theorems (and non-theorems) explicitly mentioned in (Jaśkowski, 1969). The translation method can be used in order to develop epistemic semantics for a larger class of non-adjunctive (discursive) logics than the ones historically investigated by Jaśkowski.
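The principle that fails here is the agglomeration axiom C of classical modal logic,

\[ \text{(C)}\;\; (\Box A \wedge \Box B) \rightarrow \Box (A \wedge B), \]

and a standard illustration (ours, not the article's) of why high-probability operators invalidate it: read □A as P(A) ≥ 0.6; with independent events of probability P(A) = P(B) = 0.7 we get P(A ∧ B) = 0.49, so □A and □B hold while □(A ∧ B) fails.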
Contemporary political philosophers discuss the idea of freedom in terms of two distinctions: Berlin's famous distinction between negative and positive liberty, and Skinner and Pettit's divide between liberal and republican liberty. In this essay I proceed to recast the debate by showing that there are two strands in liberalism, Hobbesian and Lockean, and that the latter inherited its conception of civil liberty from republican thought. I also argue that the contemporary debate on freedom lacks a perspicuous account of the various conceptions of freedom, mainly because it leaves aside the classic contrast between natural liberty and civil liberty. Once we consider both the negative/positive distinction and the natural/civil one, we can classify all conceptions of freedom within four basic irreducible categories. In light of the resulting framework I show that there are two distinct conceptions of republican liberty, natural and civil, and that the former is coupled with an ideal of individual self-control.
We have now provided a simple overall theoretical account of the structure of perceptual experience proto-philosophically examined in Part I. The next task is to find the proper logical machinery to formulate those accounts rigorously.
One of the main applications of the logic of theory change is to the epistemic analysis of conditionals via the so-called Ramsey test. In the first part of the present note this test is studied in the “limiting case” where the theory being revised is inconsistent, and it is shown that this case manifests an intrinsic incompatibility between the Ramsey test and the AGM postulate of “success”. The paper then analyses the use of the postulate of success, and a weakening of it, in generating axioms of conditional logic via the test, and it is shown that for certain purposes both success and weak success are quite superfluous. This suggests the proposal of abandoning both success and weak success entirely, thus permitting retention of the postulate of “preservation” discarded by Gärdenfors.
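For reference, the two AGM postulates in play can be stated as follows, where K is the belief set under revision and * the revision operator (standard formulations; the paper's exact rendering of weak success may differ):

\[ \text{(Success)}\;\; A \in K * A, \qquad \text{(Preservation)}\;\; \neg A \notin K \;\Rightarrow\; K \subseteq K * A. \]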
The anti-Humean proposal of construing desire as belief about what would be good must be abandoned on pain of triviality. Our central result shows that if an agent's belief-desire state is represented by Jeffrey's expected value theory enriched with the Desire as Belief Thesis (DAB), then, provided that three pairwise inconsistent propositions receive non-zero probability, the agent must view with indifference any proposition whose probability is greater than zero. Unlike previous results against DAB, our Opinionation or Indifference Theorem is a purely synchronic one that depends in no way on the properties of Jeffrey conditionalization.
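In the form familiar from Lewis's discussion (a standard formalization, offered here for orientation rather than as the authors' exact statement), DAB says that Jeffrey desirability V must track credence P in a goodness proposition:

\[ \text{(DAB)}\;\; V(A) = P(\mathring{A}) \quad \text{for every proposition } A, \]

where Å is the proposition that A is good. The theorem reported above then says that, given three pairwise inconsistent propositions of positive probability, V must be constant across all propositions of positive probability.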
Therapy for metaphysics -- Concepts, rules, and the spirit of recognition -- Meaning and meanings -- Reference and presence -- Truth and correspondence -- Emancipating theology.
Pain is a biological and subjective phenomenon. Clear understanding of its features is essential. Wierzbicka’s analysis accomplishes this. This comment discusses the relevance of her approach for the study of early evolution of medicine. The comment has six parts: (a) Wierzbicka’s theory and method; (b) its application to pain; (c) relevance of pain for the study of ethnomedicine, the cultural understanding of sickness and healing; (d) significance of natural semantic metalanguage (NSM) for understanding the evolution of human thought and behavior; (e) relevance of NSM for studying biological and cultural evolution of early medicine; and (f) summary and conclusion.
An important trend in contemporary epistemology centers on elaborating an old idea of pragmatist pedigree: theory selection (and in general the process of changing view and fixing beliefs) presupposes epistemic values. This article focuses on analyzing the case where epistemic values are indeterminate or where the sources of valuation are multiple (epistemic values like coherence and simplicity need not order options in compatible ways). According to the theory that thus arises, epistemic alternatives need not be fully ordered by an underlying notion of information-value, and therefore the usual economic techniques of optimization cannot be applied in order to compute optimal contractions. But in cases of this sort it is still rational to maximize, i.e. to deem an option choosable when it is not known to be worse than any other. We present here basic results about a notion of liberal contraction based on maximizing quasi-orderings. This requires the previous solution of some open problems in the theory of rational choice functions, namely a full characterization of choice functions rationalizable in terms of maximization of quasi-transitive relations. We conclude by discussing the problem of what is the adequate feasible set for calculating maximizing solutions for contraction problems, and by considering the epistemological roots of some counterexamples against the most fundamental axioms on choice functions (like α). While the first part of the paper shows how economic insights can be used to improve our understanding of the principles of belief formation and change, this final section reverses this strategy by showing the utility of epistemological insights and techniques for providing invariance conditions capable of regulating the applicability of the pure principles of choice.
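The optimizing/maximizing contrast, and the condition α mentioned at the end, can be stated as follows (standard definitions from the rational-choice literature, given for reference):

\[ \mathrm{opt}(S, \succeq) = \{ x \in S : x \succeq y \text{ for all } y \in S \}, \qquad \max(S, \succ) = \{ x \in S : \text{there is no } y \in S \text{ with } y \succ x \}, \]

so that when ≽ is incomplete the optimal set may be empty while the maximal set is not; and condition α requires that if x ∈ C(T) and x ∈ S ⊆ T, then x ∈ C(S).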
For Kant, the purposive unity of things, according to which the order of the world is regarded as if it had arisen from the intention of a rational supreme being, is only the highest formal unity of our faculty of cognition. The presupposition of an intelligence as the cause of the world-whole is, however, merely a heuristic principle for investigating the particular laws of nature. Within the element of the subject-object distinction, the infinity that purposiveness implicitly carries for Hegel cannot be grasped. Only in the logical space of rationality, understood as the identity of determinateness and being, can the true determination of teleology come to expression. This determination is that the world, as a systematic whole conceived on the analogy of living beings, is no longer represented in terms of the unreflected independence of its various components; rather, the purely ideal character of these components first comes to light in the logical space of comprehension.
In this paper, I propose a new nonconceptual reading of the B-Deduction. As Hanna correctly remarks (2011: 405), the word “cognition” has in both editions of the first Critique a wide sense, meaning nonconceptual cognition, and a narrow meaning, in Kant’s own words “an objective perception”. To be sure, Kant assumes the first meaning to account for why the Deduction is unavoidable. And if we take this meaning as a premise of the B-Deduction, then there is a gap in the argument, since the categories are certainly not conditions for nonconceptual cognition. Still, I believe it is not this wide meaning but rather the narrow one that figures in any premise of the B-Deduction. Thus, on the reading I am proposing, categories are not conditions for representing something, or even conditions for representing something objectively. Instead, they are conditions for the recognition that what we represent through the senses exists mind-independently. In the first step of the B-Deduction, this cognition in the narrow sense takes the form of the propositional thinking that the nonconceptually represented object of the sensible intuition exists objectively. In contrast, in the second step of the B-Deduction, this cognition in the narrow sense takes the form of the apprehension of what our human senses represent nonconceptually as existing objectively.
This is a presentation about joint work between Hector Zenil and Jean-Paul Delahaye. Zenil presents Experimental Algorithmic Theory as Algorithmic Information Theory and NKS, put together in a mixer. Algorithmic Complexity Theory defines the algorithmic complexity k(s) as the length of the shortest program that produces s. But since finding this short program is in general an undecidable question, the only way to approach k(s) is to use compression algorithms. He shows how to use the Compress function in Mathematica to give an idea of the compressibility of various sequences. However, the idea of applying a compression algorithm breaks down for very short sequences. This is true not only for the Compress function, but also for any other compression algorithm. Zenil's approach is to construct a metric of algorithmic complexity for short sequences from scratch. He defines the algorithmic probability of a sequence as the probability that an arbitrary program produces it. The basic idea is to run a whole class of computational devices, such as Turing machines or cellular automata, and compute the distributions of the sequences they generate. Zenil presents a comparison of frequency distributions of sequences generated by 2-state 3-color Turing machines and 2-color radius-1 cellular automata. He also compared these distributions to distributions found in data from the real world, and found not only that there is correlation across different systems, but also that the distributions are rather stable, and that the difference between the distributions in abstract systems and real-world data can be attributed to noise. In his paper Zenil elaborates on the nature of the noise he has encountered. Zenil conjectures that the correlation distances between different systems decrease with a larger number of steps, and converge in the infinite limit case.
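The compression-based estimate described above, and the way it breaks down on short sequences, can be illustrated with a small sketch (our illustration, using Python's zlib rather than Mathematica's Compress; the sample inputs are arbitrary):

```python
import os
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed length over original length; an upper-bound proxy
    for algorithmic complexity, in the spirit described in the talk."""
    return len(zlib.compress(data, 9)) / len(data)

# A long repetitive sequence compresses well (ratio far below 1)...
print(compression_ratio(b"01" * 500))

# ...incompressible random bytes do not (ratio near or above 1)...
print(compression_ratio(os.urandom(1000)))

# ...but on very short inputs the compressor's fixed header overhead
# dominates, so the ratio is uninformative; this is the breakdown that
# motivates the frequency-distribution approach for short sequences:
print(compression_ratio(b"0101"))  # > 1 despite obvious structure
```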