Entries 1-50 of 73
  1. Pieter Adriaans (2010). A Critical Analysis of Floridi’s Theory of Semantic Information. Knowledge, Technology and Policy 23 (1-2):41-56.
    In various publications over the past years, Floridi has developed a theory of semantic information as well-formed, meaningful, and truthful data. This theory is more or less orthogonal to the standard entropy-based notions of information known from physics, information theory, and computer science that all define the amount of information in a certain system as a scalar value without any direct semantic implication. In this context the question arises what the exact relation between these various conceptions of information is and (...)
  2. Rudolf Arnheim (1959). Information Theory: An Introductory Note. Journal of Aesthetics and Art Criticism 17 (4):501-503.
  3. Massimiliano Badino (forthcoming). Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy. Isonomia.
  4. Massimiliano Badino (2004). An Application of Information Theory to the Problem of the Scientific Experiment. Synthese 140 (3):355 - 389.
    There are two basic approaches to the problem of induction: the empirical one, which deems that the possibility of induction depends on how the world was made (and how it works) and the logical one, which considers the formation (and function) of language. The first is closer to being useful for induction, while the second is more rigorous and clearer. The purpose of this paper is to create an empirical approach to induction that contains the same formal exactitude as the logical approach. This requires: (a) that (...)
  5. Yehoshua Bar-Hillel (1955). An Examination of Information Theory. Philosophy of Science 22 (2):86-105.
  6. Joseph E. Brenner (2010). A Logic of Ethical Information. Knowledge, Technology and Policy 23 (1-2):109-133.
    The work of Luciano Floridi lies at the interface of philosophy, information science and technology, and ethics, an intersection whose existence and significance he was one of the first to establish. His closely related concepts of a philosophy of information (PI), informational structural realism, information logic (IL), and information ethics (IE) provide a new ontological perspective from which moral concerns can be addressed, especially but not limited to those arising in connection with the new information and communication technologies. In this (...)
  7. Soren Brier (2001). Ecosemiotics and Cybersemiotics. Sign Systems Studies 29 (1):107-119.
    The article develops a suggestion of how cybersemiotics is pertinent to ecosemiotics. Cybersemiotics uses Luhmann's triadic view of autopoietic systems (biological, psychological, and socio-communicative autopoiesis) and adopts his approach to communication within a biosemiotic framework. The following levels of exosemiosis and signification can be identified under the consideration of nonintentional signs, cybernetics, and information theory: (1) the socio-communicative level of self-conscious signification and language games. (2) the instinctual and species specific level of sign stimuli signifying through innate release response mechanism (...)
  8. Léon Brillouin (1956/2004). Science and Information Theory. Dover Publications.
    A classic source for understanding the connections between information theory and physics, this text was written by one of the giants of 20th-century physics and is appropriate for upper-level undergraduates and graduate students. Topics include the principles of coding, coding problems and solutions, the analysis of signals, a summary of thermodynamics, thermal agitation and Brownian motion, and thermal noise in an electric circuit. A discussion of the negentropy principle of information introduces the author's renowned examination of Maxwell's demon. Concluding chapters (...)
  9. Julio A. Camargo (2008). Revisiting the Relation Between Species Diversity and Information Theory. Acta Biotheoretica 56 (4).
    The Shannon information function (H) has been extensively used in ecology as a statistic of species diversity. Yet, the use of the Shannon diversity index has also been criticized, mainly because of its ambiguous ecological interpretation and because of its relatively great sensitivity to the relative abundances of species in the community. In my opinion, the major shortcoming of the traditional perspective (on the possible relation of species diversity with information theory) is that species' need for an external receiver (the scientist (...)
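    A minimal sketch of the statistic at issue, assuming hypothetical abundance counts (the figures below are illustrative, not taken from the paper):

        import math

        def shannon_diversity(counts):
            """Shannon diversity H = -sum(p_i * ln p_i) over relative abundances p_i."""
            total = sum(counts)
            return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

        abundances = [55, 25, 15, 5]        # individuals per species (hypothetical)
        H = shannon_diversity(abundances)   # ~1.11 nats; uneven abundances lower H
        H_max = math.log(len(abundances))   # ~1.39 nats, reached at equal abundances
        print(H, H / H_max)                 # H and the evenness ratio H / H_max

    The sensitivity to relative abundances that the paper criticizes is visible directly: replacing the counts with [25, 25, 25, 25] drives H to its maximum even though the number of species is unchanged.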
  10. G. J. Chaitin, How to Run Algorithmic Information Theory on a Computer.
    Hi everybody! It's a great pleasure for me to be back here at the new, improved Santa Fe Institute in this spectacular location. I guess this is my fourth visit and it's always very stimulating, so I'm always very happy to visit you guys. I'd like to tell you what I've been up to lately. First of all, let me say what algorithmic information theory is good for, before telling you about the new version of it I've got.
  11. C. T. K. Chari (1963). Time Reversal, Information Theory, and "World-Geometry". Journal of Philosophy 60 (20):579-583.
  12. Rob Clifton (2002). The Subtleties of Entanglement and its Role in Quantum Information Theory. Proceedings of the Philosophy of Science Association 2002 (3):S150-S167.
    My aim in this paper is a modest one. I do not have any particular thesis to advance about the nature of entanglement, nor can I claim novelty for any of the material I shall discuss. My aim is simply to raise some questions about entanglement that spring naturally from certain developments in quantum information theory and are, I believe, worthy of serious consideration by philosophers of science. The main topics I discuss are different manifestations of quantum nonlocality, entanglement-assisted communication, (...)
  13. Timothy Colburn & Gary Shute (2010). Abstraction, Law, and Freedom in Computer Science. Metaphilosophy 41 (3):345-364.
    Laws of computer science are prescriptive in nature but can have descriptive analogs in the physical sciences. Here, we describe a law of conservation of information in network programming, and various laws of computational motion (invariants) for programming in general, along with their pedagogical utility. Invariants specify constraints on objects in abstract computational worlds, so we describe language and data abstraction employed by software developers and compare them to Floridi's concept of levels of abstraction. We also consider Floridi's structural (...)
  14. John Collier, Information Theory as a General Language for Functional Systems.
    Function refers to a broad family of concepts of varying abstractness and range of application, from a many-one mathematical relation of great generality to, for example, highly specialized roles of designed elements in complex machines such as degaussing in a television set, or contributory processes to control mechanisms in complex metabolic pathways, such as the inhibitory function of the appropriate part of the lac-operon on the production of lactase through its action on the genome in the absence of lactose. We (...)
  15. Jeff Coulter (1995). The Informed Neuron: Issues in the Use of Information Theory in the Behavioral Sciences. [REVIEW] Minds and Machines 5 (4):583-96.
    The concept of “information” is virtually ubiquitous in contemporary cognitive science. It is claimed to be “processed” (in cognitivist theories of perception and comprehension), “stored” (in cognitivist theories of memory and recognition), and otherwise manipulated and transformed by the human central nervous system. Fred Dretske's extensive philosophical defense of a theory of informational content (“semantic” information) based upon the Shannon-Weaver formal theory of information is subjected to critical scrutiny. A major difficulty is identified in Dretske's equivocations in the use of (...)
  16. C. D'Antoni & P. Scanzano (1980). An Application of Information Theory: Longitudinal Measurability Bounds in Classical and Quantum Physics. [REVIEW] Foundations of Physics 10 (11-12):875-885.
    We examine the problem of the existence (in classical and/or quantum physics) of longitudinal limitations of measurability, defined as limitations preventing the measurement of a given quantity with arbitrarily high accuracy. We consider a measuring device as a generalized communication system, which enables us to use methods of information theory. As a direct consequence of the Shannon theorem on channel capacity, we obtain an inequality which limits the accuracy of a measurement in terms of the average power necessary to transmit (...)
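    The excerpt does not reproduce the paper's own inequality, but the channel-capacity theorem it invokes is the standard Shannon-Hartley result: a channel of bandwidth B with signal power S and noise power N supports a rate of at most

        C = B \log_2\!\left(1 + \frac{S}{N}\right)

    so the information \Delta I that a measurement of duration T can deliver is bounded by \Delta I \le C\,T, which is the sense in which achievable accuracy is tied to the average power available for transmission.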
  17. Edmund J. Dehnert (1967). The Theory of Games, Information Theory, and Value Criteria. Journal of Value Inquiry 1 (2):124-131.
  18. Kevin C. Desouza & Tobin Hensgen (2002). On "Information" in Organizations: An Emergent Information Theory and Semiotic Framework. Emergence 4 (3):95-114.
    doi: 10.1207/S15327000EM0403-07.
  19. Joseph T. Devlin, Matt H. Davis, Stuart A. McLelland & Richard P. Russell (2000). Efficiency, Information Theory, and Neural Representations. Behavioral and Brain Sciences 23 (4):475-476.
    We contend that if efficiency and reliability are important factors in neural information processing then distributed, not localist, representations are “evolution's best bet.” We note that distributed codes are the most efficient method for representing information, and that this efficiency minimizes metabolic costs, providing adaptive advantage to an organism.
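    The efficiency claim can be made concrete with a toy count (an illustration only, not the authors' model): a localist code dedicates one unit per represented item, while a distributed binary code distinguishes items by patterns of activity across units.

        import math

        N = 1_000_000                                 # items to represent
        localist_units = N                            # one dedicated unit per item
        distributed_units = math.ceil(math.log2(N))   # 20 units give 2**20 > N patterns
        print(localist_units, distributed_units)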
  20. Gordana Dodig Crnkovic & Susan Stuart (eds.) (2007). Computation, Information, Cognition: The Nexus and the Liminal. Cambridge Scholars Press.
    Written by world-leading experts, this book draws together a number of important strands in contemporary approaches to the philosophical and scientific questions that emerge when dealing with the issues of computing, information, cognition and the conceptual issues that arise at their intersections. It discovers and develops the connections at the borders and in the interstices of disciplines and debates. This volume presents a range of essays that deal with the currently vigorous concerns of the philosophy of information, ontology creation and (...)
  21. Reinhard Eckhorn (1997). Support for Grouping-by-Synchronization, the Context-Field, and its Mechanisms, but Doubt in the Use of Information Theory by the Cortex. Behavioral and Brain Sciences 20 (4):686-687.
    Our work supports synchronization for binding within Phillips & Singer's “contextual field” (CF) as well as the type of its lateral interaction they propose. Both firmly agree with our “association field” (AF) and its modulatory influences (Eckhorn et al. 1990). However, the CF connections seem to produce anticorrelation among assemblies representing unrelated structures, whereas experimental evidence indicates decoupling. Finally, it is unclear how the cortex can have access to the logistic function used in the “coherent infomax” approach.
  22. Adam Elga, Algorithmic Information Theory: The Basics.
    Turing machine: an idealized computing device attached to a tape, each square of which is capable of holding a symbol. We write a program (a finite binary string) on the tape and start the machine. If the machine halts with string o written at a designated place on the tape, then o is the output of that program.
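    The construction Elga sketches is the standard definition of Kolmogorov complexity: for a universal machine U, the complexity of a string o is the length of the shortest program that makes U halt with o at the designated place,

        K_U(o) = \min\{\, |p| : U(p) = o \,\}

    where |p| is the length in bits of the binary program p.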
  23. David Ellerman, A Short Note on the Logico-Conceptual Foundations of Information Theory in Partition Logic.
    A new logic of partitions has been developed that is dual to ordinary logic when the latter is interpreted as the logic of subsets of a fixed universe rather than the logic of propositions. For a finite universe, the logic of subsets gave rise to finite probability theory by assigning to each subset its relative size as a probability. The analogous construction for the dual logic of partitions gives rise to a notion of logical entropy that is precisely related to (...)
  24. David Ellerman (2009). Counting Distinctions: On the Conceptual Foundations of Shannon's Information Theory. Synthese 168 (1):119 - 149.
    Categorical logic has shown that modern logic is essentially the logic of subsets (or “subobjects”). In “subset logic,” predicates are modeled as subsets of a universe and a predicate applies to an individual if the individual is in the subset. Partitions are dual to subsets so there is a dual logic of partitions where a “distinction” [an ordered pair of distinct elements (u, u′) from the universe U] is dual to an “element”. A predicate modeled by a partition π on (...)
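    Where the excerpt breaks off, the construction it is heading toward can be stated compactly (in the notation of Ellerman's papers): for a partition \pi = \{B_1, \ldots, B_n\} of a finite universe U with block probabilities p_i = |B_i|/|U|, the logical entropy is the normalized count of distinctions,

        h(\pi) = \frac{|\mathrm{dit}(\pi)|}{|U \times U|} = \sum_{i \neq j} p_i p_j = 1 - \sum_i p_i^2

    in contrast with the Shannon entropy H(\pi) = -\sum_i p_i \log_2 p_i of the same block probabilities.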
  25. Wesley Elsberry & Jeffrey Shallit (2011). Information Theory, Evolutionary Computation, and Dembski's "Complex Specified Information". Synthese 178 (2):237 - 270.
    Intelligent design advocate William Dembski has introduced a measure of information called "complex specified information", or CSI. He claims that CSI is a reliable marker of design by intelligent agents. He puts forth a "Law of Conservation of Information" which states that chance and natural laws are incapable of generating CSI. In particular, CSI cannot be generated by evolutionary computation. Dembski asserts that CSI is present in intelligent causes and in the flagellum of Escherichia coli, and concludes that neither have (...)
  26. Don Fallis (2011). Floridi on Disinformation. Etica & Politica / Ethics & Politics (2):201-214.
  27. Amiel Feinstein (1958). Foundations of Information Theory. New York: McGraw-Hill.
  28. Philippe Gagnon (2013). "Que reste-t-il de la théologie à l'âge électronique ? Valeur et cybernétique axiologique chez Raymond Ruyer" [What is left of Theology in the Electronic Age? Value and Axiological Cybernetics in Raymond Ruyer]. In Chromatikon IX: Annales de la philosophie en procès — Yearbook of Philosophy in Process, M. Weber & V. Berne (Eds.). 93-120.
    This is the outline: Introduction — The question of cybernetics and information — A "thinking of the milieu" — Cybernetics and homology — A theory of learning — Information seen from the other side — Field and unitary domain — The thesis of the "other I's" — The passage through axiology — True feedback — Ruyer's ontology — The rustling of being itself.
  29. Philippe Gagnon (2010). “What We Have Learnt From Systems Theory About the Things That Nature’s Understanding Achieves”. In Dirk Evers, Antje Jackelén & Taede Smedes (eds.), How do we Know? Understanding in Science and Theology. Forum Scientiarum.
    The problem of knowledge has been centred around the study of the content of our consciousness, seeing the world through internal representation, without any satisfactory account of the operations of nature that would be a pre-condition for our own performances in terms of concept efficiency in organizing action externally. If we want to better understand where and how meaning fits in nature, we have to find the proper way to decipher its organization, and account for the fact that we have (...)
  30. Philippe Gagnon (2002). La Théologie de la Nature Et la Science à l'Ère de L'Information. Cerf.
    The history of the relationship between Christian theology and the natural sciences has been conditioned by the initial decision of the masters of the "first scientific revolution" to disregard any necessary explanatory premiss to account for the constituting organization and the framing of naturally occurring entities. Not paying any attention to hierarchical control, they ended up disseminating a vision and understanding in which it was no longer possible for a theology of nature to send questions in the direction of the experimental (...)
  31. Attila Grandpierre (2006). A Review Of: "Information Theory, Evolution and the Origin of Life as a Digital Message: How Life Resembles a Computer". [REVIEW] World Futures 62 (5):401-403.
  32. Peter D. Grünwald & Paul M. B. Vitányi (2003). Kolmogorov Complexity and Information Theory. With an Interpretation in Terms of Questions and Answers. Journal of Logic, Language and Information 12 (4):497-529.
    We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information and Kolmogorov ("algorithmic") mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate distortion theory, which quantifies useful (in a certain sense) information. We use the communication of information as our guiding motif, (...)
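    One precise point of contact between the two theories, stated here as a reminder of the standard result rather than as a quotation from the paper: for a computable probability distribution P, the expected Kolmogorov complexity of an outcome equals the Shannon entropy up to an additive term depending only on the complexity of P itself,

        H(P) \le \sum_x P(x)\, K(x) \le H(P) + K(P) + O(1).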
  33. Amit Hagar (2003). A Philosopher Looks at Quantum Information Theory. Philosophy of Science 70 (4):752-775.
    Recent suggestions to supply quantum mechanics (QM) with realistic foundations by reformulating it in light of quantum information theory (QIT) are examined and are found wanting by pointing to a basic conceptual problem that QIT itself ignores, namely, the measurement problem. Since one cannot ignore the measurement problem and at the same time pretend to be a realist, as they stand, the suggestions to reformulate QM in light of QIT are nothing but instrumentalism in disguise.
  34. Joseph F. Hanna (1969). Explanation, Prediction, Description, and Information Theory. Synthese 20 (3):308 - 334.
    The distinction between explanation and prediction has received much attention in recent literature, but the equally important distinction between explanation and description (or between prediction and description) remains blurred. This latter distinction is particularly important in the social sciences, where probabilistic models (or theories) often play dual roles as explanatory and descriptive devices. The distinction between explanation (or prediction) and description is explicated in the present paper in terms of information theory. The explanatory (or predictive) power of a probabilistic model (...)
  35. William F. Harms (1998). The Use of Information Theory in Epistemology. Philosophy of Science 65 (3):472-501.
    Information theory offers a measure of "mutual information" which provides an appropriate measure of tracking efficiency for the naturalistic epistemologist. The statistical entropy on which it is based is arguably the best way of characterizing the uncertainty associated with the behavior of a system, and it is ontologically neutral. Though not appropriate for the naturalization of meaning, mutual information can serve as a measure of epistemic success independent of semantic maps and payoff structures. While not containing payoffs as terms, mutual (...)
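    A minimal sketch of the proposed measure, computed from a hypothetical joint distribution over world states and states of a tracking system (the numbers are illustrative only):

        import math

        def mutual_information(joint):
            """I(X;Y) = sum over x,y of p(x,y) * log2(p(x,y) / (p(x) * p(y)))."""
            px = [sum(row) for row in joint]
            py = [sum(col) for col in zip(*joint)]
            return sum(p * math.log2(p / (px[i] * py[j]))
                       for i, row in enumerate(joint)
                       for j, p in enumerate(row) if p > 0)

        # Rows: world states; columns: states of the tracking system.
        joint = [[0.4, 0.1],
                 [0.1, 0.4]]
        print(mutual_information(joint))  # ~0.278 bits: imperfect but real tracking

    A perfectly tracking system (joint = [[0.5, 0.0], [0.0, 0.5]]) scores the full 1 bit, and a statistically independent one scores 0 bits, which is what lets the measure gauge epistemic success without reference to semantic maps or payoffs.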
  36. Stevan Harnad (2011). Lunch Uncertain [Review Of: Floridi, Luciano (2011) The Philosophy of Information (Oxford)]. [REVIEW] Times Literary Supplement 5664 (22-23).
    The usual way to try to ground knowing according to contemporary theory of knowledge is: We know something if (1) it’s true, (2) we believe it, and (3) we believe it for the “right” reasons. Floridi proposes a better way. His grounding is based partly on probability theory, and partly on a question/answer network of verbal and behavioural interactions evolving in time. This is rather like modeling the data-exchange between a data-seeker who needs to know which button to press on (...)
  37. E. H. Hutten (1970). Symmetry Physics and Information Theory. Diogenes 18 (72):1-21.
  38. Marcus Hutter (2010). A Complete Theory of Everything (Will Be Subjective). Algorithms 3 (4):329-350.
    Increasingly encompassing models have been suggested for our world. Theories range from generally accepted to increasingly speculative to apparently bogus. The progression of theories from ego- to geo- to helio-centric models to universe and multiverse theories and beyond was accompanied by a dramatic increase in the sizes of the postulated worlds, with humans being expelled from their center to ever more remote and random locations. Rather than leading to a true theory of everything, this trend faces a turning point after (...)
  39. Aleksandr Yakovlevich Khinchin (1957). Mathematical Foundations of Information Theory. New York: Dover Publications.
    Comprehensive, rigorous introduction to work of Shannon, McMillan, Feinstein and Khinchin. Translated by R. A. Silverman and M. D. Friedman.
  40. Catherine Legg (2013). Peirce, Meaning, and the Semantic Web. Semiotica 2013 (193):119-143.
    This paper seeks an explanation for the challenges faced by Semantic Web developers in achieving their vision, compared to the staggering near-instantaneous success of the World Wide Web. To this end it contrasts two broad philosophical understandings of meaning and argues that the choice between them carries real consequences for how developers attempt to engineer the Semantic Web. The first is René Descartes' 'private', static account of meaning (arguably dominant for the last 400 years in Western thought) which understands the (...)
  41. Donald Mender (2009). Toward a Post-Technological Information Theory. In James Phillips (ed.), Philosophical Perspectives on Technology and Psychiatry. Oxford University Press.
  42. Leonard B. Meyer (1957). Meaning in Music and Information Theory. Journal of Aesthetics and Art Criticism 15 (4):412-424.
  43. Abraham A. Moles (1966). Information Theory and Esthetic Perception. Urbana: University of Illinois Press.
  44. Wayne C. Myrvold (2010). From Physics to Information Theory and Back. In Alisa Bokulich & Gregg Jaeger (eds.), Philosophy of Quantum Information and Entanglement. Cambridge University Press. 181--207.
    Quantum information theory has given rise to a renewed interest in, and a new perspective on, the old issue of understanding the ways in which quantum mechanics differs from classical mechanics. The task of distinguishing between quantum and classical theory is facilitated by neutral frameworks that embrace both classical and quantum theory. In this paper, I discuss two approaches to this endeavour, the algebraic approach, and the convex set approach, with an eye to the strengths of each, and the relations (...)
  45. David Paul Pace (1988). As Dreams Are Made On: The Probable Worlds of a New Human Mind as Presaged in Quantum Physics, Information Theory, Modal Philosophy, and Literary Myth. Libra Publishers.
  46. Derek Partridge (1981). Information Theory and Redundancy. Philosophy of Science 48 (2):308-316.
    This paper argues that Information Theoretic Redundancy (ITR) is fundamentally a composite concept that has been continually misinterpreted since the very inception of Information Theory. We view ITR as compounded of true redundancy and partial redundancy. This demarcation of true redundancy illustrates a limiting case phenomenon: the underlying metric (number of alternatives) differs only by degree but the properties of this concept differ in kind from those of partial redundancy. Several other studies are instanced which also imply the composite nature (...)
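    For reference, the received definition the paper takes apart measures redundancy against the maximum entropy of a source with n alternatives (the standard Shannon form, not the paper's refinement):

        R = 1 - \frac{H}{H_{\max}}, \qquad H_{\max} = \log_2 n

    The paper's contention is that this single ratio runs together what it distinguishes as true redundancy and partial redundancy.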
  47. Steve Petersen, Minimum Message Length as a Truth-Conducive Simplicity Measure.
    Given at the 2007 Formal Epistemology Workshop at Carnegie Mellon, June 2nd. Good compression must track higher vs lower probability of inputs, and this is one way to approach how simplicity tracks truth.
  48. Steve Petersen, Simplicity Tracks Truth Because Compression Tracks Probability.
    The simplicity of a theory seems closely related to how well the theory summarizes individual data points. Think, for example, of classic curve-fitting. It is easy to get perfect data-fit with a "theory" that simply lists each point of data, but such a theory is maximally unsimple (for the data-fit). The simple theory suggests instead that there is one underlying curve that summarizes this data, and we usually prefer such a theory even at some expense in data-fit. In general, it (...)
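    The link the paper trades on is the standard source-coding fact (a reminder, not a quotation): an optimal prefix code for a distribution p assigns each outcome x a codeword of length

        \ell(x) = \lceil -\log_2 p(x) \rceil

    and Kraft's inequality \sum_x 2^{-\ell(x)} \le 1 guarantees that such a code exists. Shorter descriptions therefore correspond exactly to higher-probability data, which is the sense in which compression tracks probability and, on this view, simplicity tracks truth.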
  49. Jessica Pfeifer (2006). The Use of Information Theory in Biology: Lessons From Social Insects. Biological Theory 1 (3):317-330.
  50. Giuseppe Primiero (2013). Offline and Online Data: On Upgrading Functional Information to Knowledge. Philosophical Studies 164 (2):371-392.
    This paper addresses the problem of upgrading functional information to knowledge. Functional information is defined as syntactically well-formed, meaningful and collectively opaque data. Its use in the formal epistemology of information theories is crucial to solve the debate on the veridical nature of information, and it represents the companion notion to standard strongly semantic information, defined as well-formed, meaningful and true data. The formal framework, on which the definitions are based, uses a contextual version of the verificationist principle of truth (...)