About this topic
Summary: The main focus of this category is Shannon's mathematical theory of information and its broader philosophical uses. This includes, in the first place, cybernetics, signalling theories, the sender-receiver communication model, and Kolmogorov complexity. More general uses of information theory that overlap with other domains of the philosophy of information may also belong to this category. Examples include different philosophical conceptions of information (semantic conceptions, semiotic approaches), as well as applications in specific areas of theoretical philosophy such as the philosophy of science and the philosophy of mind.
  1. Pieter Adriaans, Information. Stanford Encyclopedia of Philosophy.
  2. Pieter Adriaans (2010). A Critical Analysis of Floridi’s Theory of Semantic Information. Knowledge, Technology & Policy 23 (1-2):41-56.
In various publications over the past years, Floridi has developed a theory of semantic information as well-formed, meaningful, and truthful data. This theory is more or less orthogonal to the standard entropy-based notions of information known from physics, information theory, and computer science, which all define the amount of information in a certain system as a scalar value without any direct semantic implication. In this context the question arises what the exact relation between these various conceptions of information is and (...)
  3. Rudolf Arnheim (1959). Information Theory: An Introductory Note. Journal of Aesthetics and Art Criticism 17 (4):501-503.
  4. Murat Aydede & Guven Guzeldere (2005). Concepts, Introspection, and Phenomenal Consciousness: An Information-Theoretical Approach. Noûs 39 (2):197-255.
This essay is a sustained attempt to bring new light to some of the perennial problems in philosophy of mind surrounding phenomenal consciousness and introspection through developing an account of sensory and phenomenal concepts. Building on the information-theoretic framework of Dretske (1981), we present an informational psychosemantics as it applies to what we call sensory concepts, concepts that apply, roughly, to so-called secondary qualities of objects. We show that these concepts have a special informational character and semantic structure that closely (...)
  5. Massimiliano Badino (forthcoming). Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy. Isonomía. Revista de Teoría y Filosofía Del Derecho.
The Kolmogorov-Sinai entropy is a fairly exotic mathematical concept which has recently aroused some interest on the philosophers' part. The most salient trait of this concept is that it works as a junction between such diverse fields as statistical mechanics, information theory and algorithm theory. In this paper I argue that, in order to understand this very special feature of the Kolmogorov-Sinai entropy, it is essential to reconstruct its genealogy. Somewhat surprisingly, this story takes us as far back as the beginning of (...)
  6. Massimiliano Badino (2004). An Application of Information Theory to the Problem of the Scientific Experiment. Synthese 140 (3):355 - 389.
There are two basic approaches to the problem of induction: the empirical one, which deems that the possibility of induction depends on how the world was made (and how it works), and the logical one, which considers the formation (and function) of language. The first is closer to being useful for induction, while the second is more rigorous and clearer. The purpose of this paper is to create an empirical approach to induction that contains the same formal exactitude as the logical approach. This requires: (a) that (...)
  7. David Balduzzi, Falsification and Future Performance.
We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. We show that these capacity measures count the number of hypotheses about a dataset that a learning algorithm falsifies when it finds the classifier in its repertoire minimizing empirical risk. It then follows that the future performance of predictors on unseen data is controlled in part by how many hypotheses the learner falsifies. As a corollary we show that empirical VC-entropy quantifies the message (...)
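As a rough illustration of one of the capacity measures this paper reformulates, the sketch below (my own, not the authors' construction) estimates the empirical Rademacher complexity of a small finite hypothesis class on a fixed sample; the class size, sample size, and hypothesis labels are invented placeholders.
```python
# Hypothetical illustration: empirical Rademacher complexity of a finite class,
# R_hat = E_sigma[ max_h (1/n) * sum_i sigma_i * h(x_i) ].
import numpy as np

rng = np.random.default_rng(1)
n, n_hyp, n_draws = 50, 20, 2000          # sample size, class size, sigma draws (assumed values)
H = rng.choice([-1, 1], size=(n_hyp, n))  # each row: one hypothesis' labels on the fixed sample

sigma = rng.choice([-1, 1], size=(n_draws, n))   # random sign ("coin flip") vectors
corr = sigma @ H.T / n                           # correlation of each hypothesis with each sigma
rademacher = corr.max(axis=1).mean()             # best correlation, averaged over sigma draws
print(f"empirical Rademacher complexity ~ {rademacher:.3f}")
```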
  8. David Balduzzi, Information, Learning and Falsification.
    There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it [1]. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled out [2]. The third, statistical learning (...)
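A minimal numeric sketch of the second notion mentioned in the abstract above, in standard textbook form rather than anything specific to this paper: the Shannon information of observing an event of probability p is -log2(p) bits, i.e. the logarithm of how many equiprobable alternatives it rules out, and the entropy of an ensemble is the expected surprisal.
```python
import math

def surprisal(p):
    """Shannon information (in bits) of observing an event with probability p."""
    return -math.log2(p)

def entropy(probs):
    """Expected surprisal of an ensemble with the given probabilities."""
    return sum(p * surprisal(p) for p in probs if p > 0)

print(surprisal(1/8))                       # 3.0 bits: rules out 7 of 8 equiprobable alternatives
print(entropy([1/8] * 8))                   # 3.0 bits for the uniform ensemble
print(entropy([0.5, 0.25, 0.125, 0.125]))   # 1.75 bits for a skewed ensemble
```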
  9. Yehoshua Bar-Hillel (1955). An Examination of Information Theory. Philosophy of Science 22 (2):86-105.
  10. George Barmpalias (forthcoming). Algorithmic Randomness and Measures of Complexity. Association for Symbolic Logic: The Bulletin of Symbolic Logic.
    We survey recent advances on the interface between computability theory and algorithmic randomness, with special attention on measures of relative complexity. We focus on (weak) reducibilities that measure (a) the initial segment complexity of reals and (b) the power of reals to compress strings, when they are used as oracles. The results are put into context and several connections are made with various central issues in modern algorithmic randomness and computability.
  11. Verónica Becher & Santiago Figueira (2005). Kolmogorov Complexity for Possibly Infinite Computations. Journal of Logic, Language and Information 14 (2):133-148.
In this paper we study the Kolmogorov complexity for non-effective computations, that is, either halting or non-halting computations on Turing machines. This complexity function is defined as the length of the shortest input that produces a desired output via a possibly non-halting computation. Clearly this function gives a lower bound on the classical Kolmogorov complexity. In particular, if the machine is allowed to overwrite its output, this complexity coincides with the classical Kolmogorov complexity for halting computations relative to the first (...)
  12. Soren Brier (2001). Ecosemiotics and Cybersemiotics. Sign Systems Studies 29 (1):107-119.
    The article develops a suggestion of how cybersemiotics is pertinent to ecosemiotics. Cybersemiotics uses Luhmann's triadic view of autopoietic systems (biological, psychological, and socio-communicative autopoiesis) and adopts his approach to communication within a biosemiotic framework. The following levels of exosemiosis and signification can be identified under the consideration of nonintentional signs, cybernetics, and information theory: (1) the socio-communicative level of self-conscious signification and language games. (2) the instinctual and species specific level of sign stimuli signifying through innate release response mechanism (...)
  13. Léon Brillouin (1956/2004). Science and Information Theory. Dover Publications.
    A classic source for understanding the connections between information theory and physics, this text was written by one of the giants of 20th-century physics and is appropriate for upper-level undergraduates and graduate students. Topics include the principles of coding, coding problems and solutions, the analysis of signals, a summary of thermodynamics, thermal agitation and Brownian motion, and thermal noise in an electric circuit. A discussion of the negentropy principle of information introduces the author's renowned examination of Maxwell's demon. Concluding chapters (...)
  14. Julio A. Camargo (2008). Revisiting the Relation Between Species Diversity and Information Theory. Acta Biotheoretica 56 (4):275-283.
The Shannon information function (H) has been extensively used in ecology as a statistic of species diversity. Yet the use of the Shannon diversity index has also been criticized, mainly because of its ambiguous ecological interpretation and because of its relatively great sensitivity to the relative abundances of species in the community. In my opinion, the major shortcoming of the traditional perspective (on the possible relation of species diversity with information theory) is that species need for an external receiver (the scientist (...)
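For reference, a short sketch of the Shannon diversity statistic the abstract discusses, in its standard ecological form H = -sum_i p_i ln(p_i) with p_i the relative abundance of species i (the abundance vectors are invented examples, not data from the paper); it illustrates the sensitivity to relative abundances mentioned above.
```python
import math

def shannon_diversity(abundances):
    """Shannon diversity H = -sum_i p_i * ln(p_i), with p_i the relative abundances."""
    total = sum(abundances)
    ps = [a / total for a in abundances if a > 0]
    return -sum(p * math.log(p) for p in ps)

print(shannon_diversity([25, 25, 25, 25]))  # ~1.386 (= ln 4): four equally abundant species
print(shannon_diversity([97, 1, 1, 1]))     # ~0.168: same richness, highly uneven community
```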
  15. G. J. Chaitin (1996). A New Version of Algorithmic Information Theory. Complexity 1 (4):55-59.
  16. G. J. Chaitin, How to Run Algorithmic Information Theory on a Computer.
    Hi everybody! It's a great pleasure for me to be back here at the new, improved Santa Fe Institute in this spectacular location. I guess this is my fourth visit and it's always very stimulating, so I'm always very happy to visit you guys. I'd like to tell you what I've been up to lately. First of all, let me say what algorithmic information theory is good for, before telling you about the new version of it I've got.
17. Gregory J. Chaitin (1996). How to Run Algorithmic Information Theory on a Computer: Studying the Limits of Mathematical Reasoning. Complexity 2 (1):15-21.
  18. C. T. K. Chari (1963). Time Reversal, Information Theory, and "World-Geometry". Journal of Philosophy 60 (20):579-583.
  19. Rob Clifton (2002). The Subtleties of Entanglement and its Role in Quantum Information Theory. Proceedings of the Philosophy of Science Association 2002 (3):S150-S167.
    My aim in this paper is a modest one. I do not have any particular thesis to advance about the nature of entanglement, nor can I claim novelty for any of the material I shall discuss. My aim is simply to raise some questions about entanglement that spring naturally from certain developments in quantum information theory and are, I believe, worthy of serious consideration by philosophers of science. The main topics I discuss are different manifestations of quantum nonlocality, entanglement-assisted communication, (...)
  20. Timothy Colburn & Gary Shute (2010). Abstraction, Law, and Freedom in Computer Science. Metaphilosophy 41 (3):345-364.
    Abstract: Laws of computer science are prescriptive in nature but can have descriptive analogs in the physical sciences. Here, we describe a law of conservation of information in network programming, and various laws of computational motion (invariants) for programming in general, along with their pedagogical utility. Invariants specify constraints on objects in abstract computational worlds, so we describe language and data abstraction employed by software developers and compare them to Floridi's concept of levels of abstraction. We also consider Floridi's structural (...)
  21. John Collier, Information Theory as a General Language for Functional Systems.
    Function refers to a broad family of concepts of varying abstractness and range of application, from a many-one mathematical relation of great generality to, for example, highly specialized roles of designed elements in complex machines such as degaussing in a television set, or contributory processes to control mechanisms in complex metabolic pathways, such as the inhibitory function of the appropriate part of the lac-operon on the production of lactase through its action on the genome in the absence of lactose. We (...)
  22. Jeff Coulter (1995). The Informed Neuron: Issues in the Use of Information Theory in the Behavioral Sciences. [REVIEW] Minds and Machines 5 (4):583-96.
    The concept of “information” is virtually ubiquitous in contemporary cognitive science. It is claimed to be “processed” (in cognitivist theories of perception and comprehension), “stored” (in cognitivist theories of memory and recognition), and otherwise manipulated and transformed by the human central nervous system. Fred Dretske's extensive philosophical defense of a theory of informational content (“semantic” information) based upon the Shannon-Weaver formal theory of information is subjected to critical scrutiny. A major difficulty is identified in Dretske's equivocations in the use of (...)
  23. Gordana Dodig Crnkovic, Empirical Modeling and Information Semantics. Mind & Society.
  24. Michael E. Cuffaro (2014). Review of: Christopher G. Timpson, Quantum Information Theory and the Foundations of Quantum Mechanics. [REVIEW] Philosophy of Science 81 (4):681-684.
  25. C. D'Antoni & P. Scanzano (1980). An Application of Information Theory: Longitudinal Measurability Bounds in Classical and Quantum Physics. [REVIEW] Foundations of Physics 10 (11-12):875-885.
    We examine the problem of the existence (in classical and/or quantum physics) of longitudinal limitations of measurability, defined as limitations preventing the measurement of a given quantity with arbitrarily high accuracy. We consider a measuring device as a generalized communication system, which enables us to use methods of information theory. As a direct consequence of the Shannon theorem on channel capacity, we obtain an inequality which limits the accuracy of a measurement in terms of the average power necessary to transmit (...)
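For orientation only, and not the authors' inequality: the measurability bound in this paper is derived from Shannon's channel-capacity theorem, whose band-limited Gaussian form C = B log2(1 + P/(N0 B)) ties the achievable information rate to the average signal power P. The numbers in the sketch below are arbitrary.
```python
import math

def capacity_bits_per_s(bandwidth_hz, signal_power_w, noise_psd_w_per_hz):
    """Shannon-Hartley capacity of a band-limited additive-Gaussian-noise channel."""
    snr = signal_power_w / (noise_psd_w_per_hz * bandwidth_hz)
    return bandwidth_hz * math.log2(1 + snr)

# Arbitrary example values: 1 kHz bandwidth, 1 mW average power, 1 nW/Hz noise density.
print(capacity_bits_per_s(1e3, 1e-3, 1e-9))   # ~9970 bits/s
```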
  26. Antoine Danchin (2009). Information of the Chassis and Information of the Program in Synthetic Cells. Systems and Synthetic Biology 3:125-134.
Synthetic biology aims at reconstructing life to put to the test the limits of our understanding. It is based on premises similar to those which permitted the invention of computers, where a machine, which reproduces over time, runs a program, which replicates. The underlying heuristic explored here is that an authentic category of reality, information, must be coupled with the standard categories of matter, energy, space and time to account for what life is. The use of this still elusive category permits us (...)
  27. Edmund J. Dehnert (1967). The Theory of Games, Information Theory, and Value Criteria. Journal of Value Inquiry 1 (2):124-131.
  28. Kevin C. Desouza & Tobin Hensgen (2002). On "Information" in Organizations: An Emergent Information Theory and Semiotic Framework. Emergence: Complexity and Organization 4 (3):95-114.
  29. Sean Devine (2006). The Application of Algorithmic Information Theory to Noisy Patterned Strings. Complexity 12 (2):52-58.
Although algorithmic information theory provides a measure of the information content of a string of characters, problems of noise and noncomputability emerge. However, if the pattern in a noisy string is recognized by reference to a set of similar strings, this article shows that a compressed algorithmic description of a noisy string is possible and illustrates this with some simple examples. The article also shows that algorithmic information theory can quantify the information in complex organized systems where pattern is nested within pattern.
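As a crude illustration of the underlying idea, though not the article's reference-set construction: a general-purpose compressor gives an upper bound on a string's algorithmic information content, and a patterned string, a noisy copy of it, and a fully random string of the same length compress very differently.
```python
import random
import zlib

random.seed(0)
n = 4000
patterned = ("AB" * (n // 2)).encode()                     # highly regular string
noisy = bytes(b if random.random() > 0.05 else random.randint(65, 90)
              for b in patterned)                          # ~5% of symbols corrupted
random_bytes = bytes(random.randint(65, 90) for _ in range(n))

for name, s in [("patterned", patterned), ("noisy", noisy), ("random", random_bytes)]:
    print(name, len(zlib.compress(s, 9)), "bytes after compression")
```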
  30. Joseph T. Devlin, Matt H. Davis, Stuart A. McLelland & Richard P. Russell (2000). Efficiency, Information Theory, and Neural Representations. Behavioral and Brain Sciences 23 (4):475-476.
    We contend that if efficiency and reliability are important factors in neural information processing then distributed, not localist, representations are “evolution's best bet.” We note that distributed codes are the most efficient method for representing information, and that this efficiency minimizes metabolic costs, providing adaptive advantage to an organism.
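A back-of-the-envelope counting illustration of the efficiency claim (my own, not the commentary's argument): with n binary units, a localist (one-hot) code distinguishes only n items, about log2(n)/n bits per unit, whereas a distributed binary code can in principle distinguish 2^n items, i.e. one bit per unit.
```python
import math

for n in (8, 64, 512):
    localist_bits_per_unit = math.log2(n) / n   # one-hot: n units represent only n items
    distributed_bits_per_unit = 1.0             # dense binary code: n units represent 2**n items
    print(f"n={n:3d}  localist={localist_bits_per_unit:.4f} bits/unit  distributed={distributed_bits_per_unit:.1f} bit/unit")
```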
  31. Gordana Dodig Crnkovic & Susan Stuart (eds.) (2007). Computation, Information, Cognition: The Nexus and the Liminal. Cambridge Scholars Press.
    Written by world-leading experts, this book draws together a number of important strands in contemporary approaches to the philosophical and scientific questions that emerge when dealing with the issues of computing, information, cognition and the conceptual issues that arise at their intersections. It discovers and develops the connections at the borders and in the interstices of disciplines and debates. This volume presents a range of essays that deal with the currently vigorous concerns of the philosophy of information, ontology creation and (...)
  32. Alan L. Durham, Copyright and Information Theory: Toward an Alternative Model of Authorship.
    Both literary scholars and students of copyright law have challenged the romantic model of authorship, a model emphasizing individual genius and creation ex nihilo. Authorship, they argue, is actually a collaborative effort. Authors assemble their works from the fragments of their cultural environment, transforming as much as creating. Copyright law, however, still champions the rights of authors and it requires a coherent theory of what authorship is. An alternative to the romantic model of authorship can be found in information theory, (...)
  33. Reinhard Eckhorn (1997). Support for Grouping-by-Synchronization, the Context-Field, and its Mechanisms, but Doubt in the Use of Information Theory by the Cortex. Behavioral and Brain Sciences 20 (4):686-687.
    Our work supports synchronization for binding within Phillips & Singer's “contextual field” (CF) as well as the type of its lateral interaction they propose. Both firmly agree with our “association field” (AF) and its modulatory influences (Eckhorn et al. 1990). However, the CF connections seem to produce anticorrelation among assemblies representing unrelated structures, whereas experimental evidence indicates decoupling. Finally, it is unclear how the cortex can have access to the logistic function used in the “coherent infomax” approach.
  34. Adam Elga, Algorithmic Information Theory: The Basics.
Turing machine: an idealized computing device attached to a tape, each square of which is capable of holding a symbol. We write a program (a finite binary string) on the tape and start the machine. If the machine halts with string o written at a designated place on the tape, then o is the output of the program.
  35. David Ellerman, A Short Note on the Logico-Conceptual Foundations of Information Theory in Partition Logic.
    A new logic of partitions has been developed that is dual to ordinary logic when the latter is interpreted as the logic of subsets of a fixed universe rather than the logic of propositions. For a finite universe, the logic of subsets gave rise to finite probability theory by assigning to each subset its relative size as a probability. The analogous construction for the dual logic of partitions gives rise to a notion of logical entropy that is precisely related to (...)
  36. David Ellerman (2013). An Introduction to Logical Entropy and its Relation to Shannon Entropy. International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean logic (...)
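A minimal sketch of the two quantities this paper relates, written from their standard definitions rather than taken from the paper: for block probabilities p_i of a partition, the logical entropy is h = 1 - sum_i p_i^2 (the probability that two independent draws yield a "distinction", i.e. land in distinct blocks), while the Shannon entropy is H = -sum_i p_i log2(p_i).
```python
import math

def logical_entropy(probs):
    """Probability that two independent draws fall in distinct blocks: 1 - sum p_i^2."""
    return 1 - sum(p * p for p in probs)

def shannon_entropy(probs):
    """Shannon entropy in bits: -sum p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

blocks = [0.5, 0.25, 0.25]       # block probabilities of an example partition
print(logical_entropy(blocks))   # 0.625
print(shannon_entropy(blocks))   # 1.5
```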
  37. David Ellerman (2009). Counting Distinctions: On the Conceptual Foundations of Shannon's Information Theory. Synthese 168 (1):119 - 149.
    Categorical logic has shown that modern logic is essentially the logic of subsets (or “subobjects”). In “subset logic,” predicates are modeled as subsets of a universe and a predicate applies to an individual if the individual is in the subset. Partitions are dual to subsets so there is a dual logic of partitions where a “distinction” [an ordered pair of distinct elements (u, u′) from the universe U] is dual to an “element”. A predicate modeled by a partition π on (...)
  38. Wesley Elsberry & Jeffrey Shallit (2011). Information Theory, Evolutionary Computation, and Dembski's "Complex Specified Information". Synthese 178 (2):237 - 270.
    Intelligent design advocate William Dembski has introduced a measure of information called "complex specified information", or CSI. He claims that CSI is a reliable marker of design by intelligent agents. He puts forth a "Law of Conservation of Information" which states that chance and natural laws are incapable of generating CSI. In particular, CSI cannot be generated by evolutionary computation. Dembski asserts that CSI is present in intelligent causes and in the flagellum of Escherichia coli, and concludes that neither have (...)
  39. Amiel Feinstein (1958). Foundations of Information Theory. New York: McGraw-Hill.
  40. Jordi Fortuny & Bernat Corominas-Murtra (2013). On the Origin of Ambiguity in Efficient Communication. Journal of Logic, Language and Information 22 (3):249-267.
    This article studies the emergence of ambiguity in communication through the concept of logical irreversibility and within the framework of Shannon’s information theory. This leads us to a precise and general expression of the intuition behind Zipf’s vocabulary balance in terms of a symmetry equation between the complexities of the coding and the decoding processes that imposes an unavoidable amount of logical uncertainty in natural communication. Accordingly, the emergence of irreversible computations is required if the complexities of the coding and (...)
  41. B. Roy Frieden (1999). F-Information, a Unitless Variant of Fisher Information. Foundations of Physics 29 (10):1521-1541.
A new information matrix [F] with elements F_mn = ⟨(y_m − a_m)(y_n − a_n)(∂ ln p(y|a)/∂a_m)(∂ ln p(y|a)/∂a_n)⟩ is analyzed. The PDF p(y|a) is the usual likelihood law. [F] differs from the Fisher information matrix by the presence of the first two factors in the given expectation. These factors make F_mn unitless, in contrast with the Fisher information. This lack (...)
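A quick Monte Carlo check of the unitless claim for a one-parameter Gaussian likelihood (my own illustration, not from the paper): with p(y|a) = N(a, sigma^2), the Fisher information E[(∂ ln p/∂a)^2] = 1/sigma^2 carries units of 1/y^2, while the scalar analogue F = E[(y − a)^2 (∂ ln p/∂a)^2] comes out as the dimensionless constant 3 regardless of sigma.
```python
import numpy as np

rng = np.random.default_rng(0)
a, sigma = 2.0, 0.5                          # arbitrary true parameter and noise scale
y = rng.normal(a, sigma, size=1_000_000)     # samples from the likelihood p(y|a)

score = (y - a) / sigma**2                   # d/da ln N(y; a, sigma^2)
fisher = np.mean(score**2)                   # ~1/sigma^2 = 4.0, units of 1/y^2
f_scalar = np.mean((y - a)**2 * score**2)    # ~3.0, unitless, independent of sigma

print(f"Fisher information ~ {fisher:.2f},  F-information ~ {f_scalar:.2f}")
```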
  42. Roman Frigg, Chaos and Randomness: An Equivalence Proof of a Generalized Version of the Shannon Entropy and the Kolmogorov–Sinai Entropy for Hamiltonian Dynamical Systems.
Chaos is often explained in terms of random behaviour; and having positive Kolmogorov–Sinai entropy (KSE) is taken to be indicative of randomness. Although seemingly plausible, the association of positive KSE with random behaviour needs justification, since the definition of the KSE does not make reference to any notion that is connected to randomness. A common way of justifying this use of the KSE is to draw parallels between the KSE and Shannon's information theoretic entropy. However, as it stands this no (...)
  43. Philippe Gagnon (2015). New Arguments for 'Intelligent Design'? Review Article on William A. Dembski, Being as Communion: A Metaphysics of Information. [REVIEW] ESSSAT News and Reviews 25 (1):17-24.
    Critical notice assessing the use of information theory in the attempt to build a design inference, and to re-establish some aspects of the program of natural theology, as carried out in this third major monograph devoted to the subject of intelligent design theory by mathematician and philosopher William A. Dembski, after The Design Inference (1998) and No Free Lunch (2002).
  44. Philippe Gagnon (2013). "Que reste-t-il de la théologie à l'âge électronique ? Valeur et cybernétique axiologique chez Raymond Ruyer" [What is left of Theology in the Electronic Age? Value and Axiological Cybernetics in Raymond Ruyer]. In Chromatikon Ix: Annales de la Philosophie En Procès — Yearbook of Philosophy in Process, M. Weber & V. Berne. 93-120.
This is the outline: Introduction; The question of cybernetics and information; A "thinking of the milieu"; Cybernetics and homology; A theory of learning; Information seen from the other side; Field and unitary domain; The thesis of the "other I's"; The passage through axiology; True feedback; Ruyer's ontology; The murmuring of being itself.
  45. Philippe Gagnon (2010). “What We Have Learnt From Systems Theory About the Things That Nature’s Understanding Achieves”. In Dirk Evers, Antje Jackelén & Taede Smedes (eds.), How do we Know? Understanding in Science and Theology. Forum Scientiarum
    The problem of knowledge has been centred around the study of the content of our consciousness, seeing the world through internal representation, without any satisfactory account of the operations of nature that would be a pre-condition for our own performances in terms of concept efficiency in organizing action externally. If we want to better understand where and how meaning fits in nature, we have to find the proper way to decipher its organization, and account for the fact that we have (...)
  46. Philippe Gagnon (2002). La Théologie de la Nature Et la Science À l'Ère de L'Information. Cerf.
The history of the relationship between Christian theology and the natural sciences has been conditioned by the initial decision of the masters of the "first scientific revolution" to disregard any necessary explanatory premiss to account for the constituting organization and the framing of naturally occurring entities. Not paying any attention to hierarchical control, they ended up disseminating a vision and understanding in which it was no longer possible for a theology of nature to send questions in the direction of the experimental (...)
  47. Attila Grandpierre (2006). A Review of: "Information Theory, Evolution and the Origin of Life as a Digital Message: How Life Resembles a Computer". [REVIEW] World Futures 62 (5):401-403.
  48. Peter D. Grünwald & Paul M. B. Vitányi (2003). Kolmogorov Complexity and Information Theory. With an Interpretation in Terms of Questions and Answers. Journal of Logic, Language and Information 12 (4):497-529.
We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information and Kolmogorov ("algorithmic") mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate distortion theory, which quantifies useful (in a certain sense) information. We use the communication of information as our guiding motif, (...)
  49. Amit Hagar (2003). A Philosopher Looks at Quantum Information Theory. Philosophy of Science 70 (4):752-775.
    Recent suggestions to supply quantum mechanics (QM) with realistic foundations by reformulating it in light of quantum information theory (QIT) are examined and are found wanting by pointing to a basic conceptual problem that QIT itself ignores, namely, the measurement problem. Since one cannot ignore the measurement problem and at the same time pretend to be a realist, as they stand, the suggestions to reformulate QM in light of QIT are nothing but instrumentalism in disguise.
  50. Joseph F. Hanna (1969). Explanation, Prediction, Description, and Information Theory. Synthese 20 (3):308 - 334.
    The distinction between explanation and prediction has received much attention in recent literature, but the equally important distinction between explanation and description (or between prediction and description) remains blurred. This latter distinction is particularly important in the social sciences, where probabilistic models (or theories) often play dual roles as explanatory and descriptive devices. The distinction between explanation (or prediction) and description is explicated in the present paper in terms of information theory. The explanatory (or predictive) power of a probabilistic model (...)