About this topic
Summary: The main focus of this category is Shannon's mathematical theory of information and its broader philosophical uses. This includes, in the first place, cybernetics, signalling theories, the sender-receiver communication model, and Kolmogorov complexity. More general uses of information theory that overlap with other domains of the philosophy of information may also belong to this category. Examples include different philosophical conceptions of information (semantic conceptions, semiotic approaches), as well as applications in specific domains of theoretical philosophy, such as the philosophy of science and the philosophy of mind.
Entries 1–50 of 125
  1. Pieter Adriaans, Information. Stanford Encyclopedia of Philosophy.
  2. Pieter Adriaans (2010). A Critical Analysis of Floridi’s Theory of Semantic Information. Knowledge, Technology & Policy 23 (1-2):41-56.
    In various publications over the past years, Floridi has developed a theory of semantic information as well-formed, meaningful, and truthful data. This theory is more or less orthogonal to the standard entropy-based notions of information known from physics, information theory, and computer science, which all define the amount of information in a certain system as a scalar value without any direct semantic implication. In this context the question arises what the exact relation between these various conceptions of information is and (...)
  3. Rudolf Arnheim (1959). Information Theory: An Introductory Note. Journal of Aesthetics and Art Criticism 17 (4):501-503.
  4. Robert Artigiani (1997). Interaction, Information and Meaning. World Futures 50 (1):703-714.
  5. Murat Aydede & Guven Guzeldere (2005). Concepts, Introspection, and Phenomenal Consciousness: An Information-Theoretical Approach. Noûs 39 (2):197-255.
    This essay is a sustained attempt to bring new light to some of the perennial problems in philosophy of mind surrounding phenomenal consciousness and introspection through developing an account of sensory and phenomenal concepts. Building on the information-theoretic framework of Dretske (1981), we present an informational psychosemantics as it applies to what we call sensory concepts, concepts that apply, roughly, to so-called secondary qualities of objects. We show that these concepts have a special informational character and semantic structure that closely (...)
  6. Massimiliano Badino (forthcoming). Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy. Isonomía. Revista de Teoría y Filosofía Del Derecho.
    The Kolmogorov-Sinai entropy is a fairly exotic mathematical concept which has recently aroused some interest on the philosophers' part. The most salient trait of this concept is its working as a junction between such diverse ambits as statistical mechanics, information theory and algorithm theory. In this paper I argue that, in order to understand this very special feature of the Kolmogorov-Sinai entropy, it is essential to reconstruct its genealogy. Somewhat surprisingly, this story takes us as far back as the beginning of (...)
  7. Massimiliano Badino (2004). An Application of Information Theory to the Problem of the Scientific Experiment. Synthese 140 (3):355 - 389.
    There are two basic approaches to the problem of induction: the empirical one, which deems that the possibility of induction depends on how the world was made (and how it works), and the logical one, which considers the formation (and function) of language. The first is closer to being useful for induction, while the second is more rigorous and clearer. The purpose of this paper is to create an empirical approach to induction that contains the same formal exactitude as the logical approach. This requires: (a) that (...)
  8. David Balduzzi, Information, Learning and Falsification.
    There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it [1]. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled out [2]. The third, statistical learning (...)
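    The contrast between the first two approaches can be made concrete. In the sketch below (my illustration, not the paper's), Shannon information is computed from an event's probability within an ensemble, while zlib compression serves as a crude, computable stand-in for Kolmogorov complexity, which is itself uncomputable:

```python
import math
import os
import zlib

def surprisal_bits(p: float) -> float:
    """Shannon information (in bits) of observing an event of probability p."""
    return -math.log2(p)

def compressed_length(s: bytes) -> int:
    """zlib-compressed length: a rough, computable upper-bound stand-in for
    Kolmogorov complexity, which is itself uncomputable."""
    return len(zlib.compress(s, 9))

# Shannon: information is relative to an ensemble of ruled-out alternatives.
print(surprisal_bits(1 / 2))    # 1.0 bit: a fair coin toss
print(surprisal_bits(1 / 64))   # 6.0 bits: one event among 64 equiprobable ones

# Algorithmic: information is a property of the individual string.
print(compressed_length(b"ab" * 500))       # small: the string has a short description
print(compressed_length(os.urandom(1000)))  # near 1000: no exploitable pattern
```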
  9. David Balduzzi, Falsification and Future Performance.
    We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. We show these capacity measures count the number of hypotheses about a dataset that a learning algorithm falsifies when it finds the classifier in its repertoire minimizing empirical risk. It then follows that the future performance of predictors on unseen data is controlled in part by how many hypotheses the learner falsifies. As a corollary we show that empirical VC-entropy quantifies the message (...)
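    To illustrate the counting idea in the abstract with a deliberately simple hypothesis class (one-dimensional thresholds, my assumption rather than the paper's setup): on n points such a class realizes only n + 1 of the 2^n possible labelings, its empirical VC-entropy is log2(n + 1), and every unrealizable labeling is a falsified hypothesis.

```python
import math
from itertools import product

def threshold_labelings(xs):
    """All labelings of sorted points xs realizable by thresholds [x > t]."""
    cuts = [xs[0] - 1] + [(a + b) / 2 for a, b in zip(xs, xs[1:])] + [xs[-1] + 1]
    return {tuple(int(x > t) for x in xs) for t in cuts}

xs = sorted([0.1, 0.5, 0.9, 1.3])
realized = threshold_labelings(xs)
all_labelings = set(product([0, 1], repeat=len(xs)))

print(len(realized))                       # 5 = n + 1 realizable labelings
print(math.log2(len(realized)))            # empirical VC-entropy ~ 2.32 bits
print(len(all_labelings) - len(realized))  # 11 labelings falsified by the class
```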
  10. Yehoshua Bar-Hillel (1955). An Examination of Information Theory. Philosophy of Science 22 (2):86-105.
  11. George Barmpalias (forthcoming). Algorithmic Randomness and Measures of Complexity. Association for Symbolic Logic: The Bulletin of Symbolic Logic.
    We survey recent advances on the interface between computability theory and algorithmic randomness, with special attention on measures of relative complexity. We focus on (weak) reducibilities that measure (a) the initial segment complexity of reals and (b) the power of reals to compress strings, when they are used as oracles. The results are put into context and several connections are made with various central issues in modern algorithmic randomness and computability.
  12. Gérard Battail (2009). Applying Semiotics and Information Theory to Biology: A Critical Comparison. [REVIEW] Biosemiotics 2 (3):303-320.
    Since the beginning of the twentieth century, it has become increasingly evident that information, besides matter and energy, is a major actor in the life processes. Moreover, communication of information has been recognized as differentiating living things from inanimate ones, hence as specific to the life processes. Therefore the sciences of matter and energy, chemistry and physics, do not suffice to deal with life processes. Biology should also rely on sciences of information. A majority of biologists, however, did not change their (...)
  13. Verónica Becher & Santiago Figueira (2005). Kolmogorov Complexity for Possibly Infinite Computations. Journal of Logic, Language and Information 14 (2):133-148.
    In this paper we study the Kolmogorov complexity for non-effective computations, that is, either halting or non-halting computations on Turing machines. This complexity function is defined as the length of the shortest input that produces a desired output via a possibly non-halting computation. Clearly this function gives a lower bound on the classical Kolmogorov complexity. In particular, if the machine is allowed to overwrite its output, this complexity coincides with the classical Kolmogorov complexity for halting computations relative to the first (...)
  14. Y. Ben-Aryeh, A. Mann & B. C. Sanders (1999). Empirical State Determination of Entangled Two-Level Systems and Its Relation to Information Theory. Foundations of Physics 29 (12):1963-1975.
    Theoretical methods for empirical state determination of entangled two-level systems are analyzed in relation to information theory. We show that hidden variable theories would lead to a Shannon index of correlation between the entangled subsystems which is larger than that predicted by quantum mechanics. Canonical representations which have maximal correlations are treated by the use of Schmidt and Hilbert-Schmidt decomposition of the entangled states, including especially the Bohm singlet state and the GHZ entangled states. We show that quantum mechanics does (...)
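    As a companion to the abstract's Shannon index of correlation, here is a toy computation (my framing, not the authors' analysis) of the mutual information between the two sides of a Bohm singlet state; the joint outcome probabilities P(++) = P(--) = (1 - cos θ)/4 and P(+-) = P(-+) = (1 + cos θ)/4 are the standard quantum-mechanical predictions for analyzers separated by angle θ.

```python
import math

def H(probs):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def singlet_mutual_information(theta: float) -> float:
    """I(A;B) between spin outcomes on a singlet, analyzers theta apart."""
    c = math.cos(theta)
    joint = [(1 - c) / 4, (1 - c) / 4, (1 + c) / 4, (1 + c) / 4]  # ++, --, +-, -+
    return 2 * H([0.5, 0.5]) - H(joint)  # I = H(A) + H(B) - H(A,B)

print(singlet_mutual_information(0.0))          # 1.0 bit: perfect anticorrelation
print(singlet_mutual_information(math.pi / 2))  # 0.0 bits: independent outcomes
```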
  15. Carl T. Bergstrom & Martin Rosvall (2011). The Transmission Sense of Information. Biology and Philosophy 26 (2):159-176.
    Biologists rely heavily on the language of information, coding, and transmission that is commonplace in the field of information theory developed by Claude Shannon, but there is open debate about whether such language is anything more than facile metaphor. Philosophers of biology have argued that when biologists talk about information in genes and in evolution, they are not talking about the sort of information that Shannon’s theory addresses. First, philosophers have suggested that Shannon’s theory is only useful for developing a (...)
  16. Soren Brier (2001). Ecosemiotics and Cybersemiotics. Sign Systems Studies 29 (1):107-119.
    The article develops a suggestion of how cybersemiotics is pertinent to ecosemiotics. Cybersemiotics uses Luhmann's triadic view of autopoietic systems (biological, psychological, and socio-communicative autopoiesis) and adopts his approach to communication within a biosemiotic framework. The following levels of exosemiosis and signification can be identified under the consideration of nonintentional signs, cybernetics, and information theory: (1) the socio-communicative level of self-conscious signification and language games. (2) the instinctual and species specific level of sign stimuli signifying through innate release response mechanism (...)
  17. Léon Brillouin (1956). Science and Information Theory. Dover Publications.
    A classic source for understanding the connections between information theory and physics, this text was written by one of the giants of 20th-century physics and is appropriate for upper-level undergraduates and graduate students. Topics include the principles of coding, coding problems and solutions, the analysis of signals, a summary of thermodynamics, thermal agitation and Brownian motion, and thermal noise in an electric circuit. A discussion of the negentropy principle of information introduces the author's renowned examination of Maxwell's demon. Concluding chapters (...)
  18. G. C. (2003). On a Supposed Conceptual Inadequacy of the Shannon Information in Quantum Mechanics. Studies in History and Philosophy of Science Part B 34 (3):441-468.
    Recently, Brukner and Zeilinger (Phys. Rev. Lett. 83(17) (1999) 3354) have claimed that the Shannon information is not well defined as a measure of information in quantum mechanics, adducing arguments that seek to show that it is inextricably tied to classical notions of measurement. It is shown here that these arguments do not succeed: the Shannon information does not have problematic ties to classical concepts. In a further argument, Brukner and Zeilinger compare the Shannon information unfavourably to their preferred information (...)
  19. Julio A. Camargo (2008). Revisiting the Relation Between Species Diversity and Information Theory. Acta Biotheoretica 56 (4):275-283.
    The Shannon information function (H) has been extensively used in ecology as a statistic of species diversity. Yet, the use of Shannon diversity index has also been criticized, mainly because of its ambiguous ecological interpretation and because of its relatively great sensitivity to the relative abundances of species in the community. In my opinion, the major shortcoming of the traditional perspective (on the possible relation of species diversity with information theory) is that species need for an external receiver (the scientist (...)
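    The Shannon diversity statistic H discussed in the abstract is simple to compute from relative abundances; a minimal sketch with made-up species counts:

```python
import math

def shannon_diversity(counts):
    """H = -sum(p_i * ln p_i) over the species' relative abundances p_i."""
    total = sum(counts)
    return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

even   = [25, 25, 25, 25]   # four equally abundant species
skewed = [97, 1, 1, 1]      # one strongly dominant species

print(shannon_diversity(even))    # ln(4) ~ 1.386: the maximum for four species
print(shannon_diversity(skewed))  # ~ 0.168: H is sensitive to relative abundances
```

    The gap between the two values is the sensitivity to relative abundances that the abstract identifies as a common criticism of the index.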
  20. G. J. Chaitin (1996). A New Version of Algorithmic Information Theory. Complexity 1 (4):55-59.
  21. G. J. Chaitin, How to Run Algorithmic Information Theory on a Computer.
    Hi everybody! It's a great pleasure for me to be back here at the new, improved Santa Fe Institute in this spectacular location. I guess this is my fourth visit and it's always very stimulating, so I'm always very happy to visit you guys. I'd like to tell you what I've been up to lately. First of all, let me say what algorithmic information theory is good for, before telling you about the new version of it I've got.
  22. Gregory J. Chaitin (1996). How to Run Algorithmic Information Theory on a Computer:Studying the Limits of Mathematical Reasoning. Complexity 2 (1):15-21.
  23. C. T. K. Chari (1963). Time Reversal, Information Theory, and "World-Geometry". Journal of Philosophy 60 (20):579-583.
  24. Min Chen & Luciano Floridi (2013). An Analysis of Information Visualisation. Synthese 190 (16):3421-3438.
    Philosophers have relied on visual metaphors to analyse ideas and explain their theories at least since Plato. Descartes is famous for his system of axes, and Wittgenstein for his first design of truth table diagrams. Today, visualisation is a form of ‘computer-aided seeing’ information in data. Hence, information is the fundamental ‘currency’ exchanged through a visualisation pipeline. In this article, we examine the types of information that may occur at different stages of a general visualization pipeline. We do so from (...)
  25. Rob Clifton (2002). The Subtleties of Entanglement and its Role in Quantum Information Theory. Proceedings of the Philosophy of Science Association 2002 (3):S150-S167.
    My aim in this paper is a modest one. I do not have any particular thesis to advance about the nature of entanglement, nor can I claim novelty for any of the material I shall discuss. My aim is simply to raise some questions about entanglement that spring naturally from certain developments in quantum information theory and are, I believe, worthy of serious consideration by philosophers of science. The main topics I discuss are different manifestations of quantum nonlocality, entanglement-assisted communication, (...)
  26. Timothy Colburn & Gary Shute (2010). Abstraction, Law, and Freedom in Computer Science. Metaphilosophy 41 (3):345-364.
    Laws of computer science are prescriptive in nature but can have descriptive analogs in the physical sciences. Here, we describe a law of conservation of information in network programming, and various laws of computational motion (invariants) for programming in general, along with their pedagogical utility. Invariants specify constraints on objects in abstract computational worlds, so we describe language and data abstraction employed by software developers and compare them to Floridi's concept of levels of abstraction. We also consider Floridi's structural (...)
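    The "laws of computational motion (invariants)" mentioned in the abstract can be illustrated with a textbook loop invariant (my example, not one from the paper): a property that holds before and after every iteration, functioning as a conserved quantity of the computation.

```python
def integer_division(a: int, b: int):
    """Quotient and remainder by repeated subtraction. The invariant
    a == q * b + r holds at every step, like a conservation law."""
    assert a >= 0 and b > 0
    q, r = 0, a
    while r >= b:
        assert a == q * b + r   # invariant before the step
        q, r = q + 1, r - b
        assert a == q * b + r   # ... and after it
    return q, r

print(integer_division(17, 5))  # (3, 2): 17 == 3 * 5 + 2 held throughout
```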
  27. John Collier, Information Theory as a General Language for Functional Systems.
    Function refers to a broad family of concepts of varying abstractness and range of application, from a many-one mathematical relation of great generality to, for example, highly specialized roles of designed elements in complex machines such as degaussing in a television set, or contributory processes to control mechanisms in complex metabolic pathways, such as the inhibitory function of the appropriate part of the lac-operon on the production of lactase through its action on the genome in the absence of lactose. We (...)
  28. Jeff Coulter (1995). The Informed Neuron: Issues in the Use of Information Theory in the Behavioral Sciences. [REVIEW] Minds and Machines 5 (4):583-96.
    The concept of “information” is virtually ubiquitous in contemporary cognitive science. It is claimed to be “processed” (in cognitivist theories of perception and comprehension), “stored” (in cognitivist theories of memory and recognition), and otherwise manipulated and transformed by the human central nervous system. Fred Dretske's extensive philosophical defense of a theory of informational content (“semantic” information) based upon the Shannon-Weaver formal theory of information is subjected to critical scrutiny. A major difficulty is identified in Dretske's equivocations in the use of (...)
  29. Gordana Dodig Crnkovic, Empirical Modeling and Information Semantics. Mind & Society.
  30. Michael E. Cuffaro (2014). Review Of: Christopher G. Timpson, Quantum Information Theory and the Foundations of Quantum Mechanics. [REVIEW] Philosophy of Science 81 (4):681-684.
  31. C. D'Antoni & P. Scanzano (1980). An Application of Information Theory: Longitudinal Measurability Bounds in Classical and Quantum Physics. [REVIEW] Foundations of Physics 10 (11-12):875-885.
    We examine the problem of the existence (in classical and/or quantum physics) of longitudinal limitations of measurability, defined as limitations preventing the measurement of a given quantity with arbitrarily high accuracy. We consider a measuring device as a generalized communication system, which enables us to use methods of information theory. As a direct consequence of the Shannon theorem on channel capacity, we obtain an inequality which limits the accuracy of a measurement in terms of the average power necessary to transmit (...)
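    The inequality described in the abstract rests on Shannon's channel-capacity theorem. A toy calculation (not the authors' derivation; the bandwidth and signal-to-noise values are invented) showing how finite capacity bounds the number of distinguishable measurement outcomes:

```python
import math

def channel_capacity(bandwidth_hz: float, snr: float) -> float:
    """Shannon-Hartley capacity in bits per second: C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr)

B, snr = 1_000.0, 100.0          # invented illustrative values
C = channel_capacity(B, snr)
print(C)                         # ~ 6658 bits/s

# If the device reports `readings_per_s` measurements through this channel,
# each reading can distinguish at most 2**(C / readings_per_s) levels,
# which caps the attainable measurement accuracy.
readings_per_s = 1_000.0
print(2 ** (C / readings_per_s))  # ~ 101 distinguishable levels per reading
```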
  32. Antoine Danchin (2009). Information of the Chassis and Information of the Program in Synthetic Cells. Systems and Synthetic Biology 3:125-134.
    Synthetic biology aims at reconstructing life to put to the test the limits of our understanding. It is based on premises similar to those which permitted the invention of computers, where a machine, which reproduces over time, runs a program, which replicates. The underlying heuristics explored here is that an authentic category of reality, information, must be coupled with the standard categories, matter, energy, space and time to account for what life is. The use of this still elusive category permits us (...)
  33. Antoine Danchin (2008). Les organismes vivants comme pièges à information [Living Organisms as Information Traps]. Ludus Vitalis 16 (30):211-212.
    Life can be defined as combining two entities that rest on completely different physico-chemical properties and on a particular way of handling information. The cell, first, is a « machine », that combines elements which are quite similar (although in a fairly fuzzy way) to those involved in a man-made factory. The machine combines two processes. First, it requires explicit compartmentalisation, including scaffolding structures similar to that of the châssis of engineered machines. In addition, cells define clearly an inside, the (...)
  34. Edmund J. Dehnert (1967). The Theory of Games, Information Theory, and Value Criteria. Journal of Value Inquiry 1 (2):124-131.
  35. Kevin C. Desouza & Tobin Hensgen (2002). On "Information" in Organizations: An Emergent Information Theory and Semiotic Framework. Emergence: Complexity and Organization 4 (3):95-114. doi: 10.1207/S15327000EM0403-07.
  36. Sean Devine (2006). The Application of Algorithmic Information Theory to Noisy Patterned Strings. Complexity 12 (2):52-58.
    Although algorithmic information theory provides a measure of the information content of a string of characters, problems of noise and noncomputability emerge. However, if the pattern in a noisy string is recognized by reference to a set of similar strings, this article shows that a compressed algorithmic description of a noisy string is possible, and illustrates this with some simple examples. The article also shows that algorithmic information theory can quantify the information in complex organized systems where pattern is nested within pattern.
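    The abstract's idea, that a noisy string still admits a short algorithmic description once a pattern is recognized, is essentially a two-part code: name the pattern, then say where the noise is. A rough sketch under my own encoding assumptions (the 20-bit pattern cost is invented):

```python
import math

def two_part_length(n: int, pattern_bits: int, errors: int) -> float:
    """Description length of a noisy n-bit patterned string: the cost of
    naming the pattern plus log2(C(n, k)) bits to locate the k deviations."""
    return pattern_bits + math.log2(math.comb(n, errors))

n, pattern_bits = 1024, 20   # invented cost for, e.g., "repeat '01' 512 times"
for k in (0, 8, 256):
    print(k, round(two_part_length(n, pattern_bits, k), 1))
# k = 0:   20.0 bits  -- a clean pattern compresses enormously
# k = 8:   ~85 bits   -- light noise barely lengthens the description
# k = 256: ~846 bits  -- heavy noise: the description length approaches n
```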
  37. Joseph T. Devlin, Matt H. Davis, Stuart A. McLelland & Richard P. Russell (2000). Efficiency, Information Theory, and Neural Representations. Behavioral and Brain Sciences 23 (4):475-476.
    We contend that if efficiency and reliability are important factors in neural information processing then distributed, not localist, representations are “evolution's best bet.” We note that distributed codes are the most efficient method for representing information, and that this efficiency minimizes metabolic costs, providing adaptive advantage to an organism.
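    The efficiency claim has a simple counting core: a localist code dedicates one unit per item, while a distributed binary code needs only about log2 as many units. A minimal sketch with illustrative numbers (not the commentary's data):

```python
import math

def units_needed(n_items: int):
    """Units required to represent n distinct items under each coding scheme."""
    localist = n_items                           # one dedicated unit per item
    distributed = math.ceil(math.log2(n_items))  # a binary pattern across units
    return localist, distributed

for n in (8, 1_000, 1_000_000):
    print(n, units_needed(n))
# 8 -> (8, 3); 1000 -> (1000, 10); 1000000 -> (1000000, 20):
# distributed codes scale logarithmically, which is the efficiency
# (and hence metabolic-cost) side of the argument.
```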
  38. Gordana Dodig Crnkovic & Susan Stuart (eds.) (2007). Computation, Information, Cognition: The Nexus and the Liminal. Cambridge Scholars Press.
    Written by world-leading experts, this book draws together a number of important strands in contemporary approaches to the philosophical and scientific questions that emerge when dealing with the issues of computing, information, cognition and the conceptual issues that arise at their intersections. It discovers and develops the connections at the borders and in the interstices of disciplines and debates. This volume presents a range of essays that deal with the currently vigorous concerns of the philosophy of information, ontology creation and (...)
  39. Alan L. Durham, Copyright and Information Theory: Toward an Alternative Model of Authorship.
    Both literary scholars and students of copyright law have challenged the romantic model of authorship, a model emphasizing individual genius and creation ex nihilo. Authorship, they argue, is actually a collaborative effort. Authors assemble their works from the fragments of their cultural environment, transforming as much as creating. Copyright law, however, still champions the rights of authors and it requires a coherent theory of what authorship is. An alternative to the romantic model of authorship can be found in information theory, (...)
  40. A. Duwell (2003). Quantum Information Does Not Exist. Studies in History and Philosophy of Science Part B 34 (3):479-499.
    Some physicists seem to believe that quantum information theory requires a new concept of information (Jozsa, 1998, Quantum information and its properties. In: Hoi-Kwong Lo, S. Popescu, T. Spiller (Eds.), Introduction to Quantum Computation and Information, World Scientific, Singapore, (pp. 49-75); Deutsch & Hayden, 1999, Information flow in entangled quantum subsystems, preprint quant-ph/9906007). I will argue that no new concept is necessary. Shannon's concept of information is sufficient for quantum information theory. Properties that are cited to contrast quantum information and (...)
  41. Armond Duwell (2008). Quantum Information Does Exist. Studies in History and Philosophy of Science Part B 39 (1):195-216.
    Some physicists seem to believe that quantum information theory requires a new concept of information (Jozsa, 1998, Quantum information and its properties. In: Hoi-Kwong Lo, S. Popescu, T. Spiller (Eds.), Introduction to Quantum Computation and Information, World Scientific, Singapore; Deutsch & Hayden, 1999, Information flow in entangled quantum subsystems, preprint quant-ph/9906007). I will argue that no new concept is necessary. Shannon's concept of information is sufficient for quantum information theory. Properties that are cited to contrast quantum information and classical information actually point to differences in our ability to manipulate, access, and transfer information (...)
  42. Reinhard Eckhorn (1997). Support for Grouping-by-Synchronization, the Context-Field, and its Mechanisms, but Doubt in the Use of Information Theory by the Cortex. Behavioral and Brain Sciences 20 (4):686-687.
    Our work supports synchronization for binding within Phillips & Singer's “contextual field” (CF) as well as the type of its lateral interaction they propose. Both firmly agree with our “association field” (AF) and its modulatory influences (Eckhorn et al. 1990). However, the CF connections seem to produce anticorrelation among assemblies representing unrelated structures, whereas experimental evidence indicates decoupling. Finally, it is unclear how the cortex can have access to the logistic function used in the “coherent infomax” approach.
  43. Adam Elga, Algorithmic Information Theory: The Basics.
    Turing machine: an idealized computing device attached to a tape, each square of which is capable of holding a symbol. We write a program (a finite binary string) on the tape, and start the machine. If the machine halts with string o written at a designated place on the tape (...)
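    The definition sketched in the abstract, information content as the length of the shortest program producing a string, can be made tangible with a toy description language standing in for a universal Turing machine (an illustrative simplification: true Kolmogorov complexity is defined relative to a universal machine and is uncomputable):

```python
def toy_complexity(s: str) -> int:
    """Length of the shortest description of s in a toy language whose only
    programs are 'LIT:<s>' (spell it out) and 'REP:<pattern>:<count>'."""
    best = len("LIT:") + len(s)              # fallback: the literal program
    for plen in range(1, len(s) // 2 + 1):
        count, rem = divmod(len(s), plen)
        if rem == 0 and s[:plen] * count == s:
            best = min(best, len(f"REP:{s[:plen]}:{count}"))
    return best

print(toy_complexity("ab" * 100))    # 10: the short program "REP:ab:100" wins
print(toy_complexity("qzjxkvwmbf"))  # 14: no repetition, so "LIT:..." is best
```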
  44. David Ellerman, On Classical and Quantum Logical Entropy.
    The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of (...)
  45. David Ellerman (2013). An Introduction to Logical Entropy and its Relation to Shannon Entropy. International Journal of Semantic Computing 7 (2):121-145.
    The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set--just as the usual logical notion of probability based on the Boolean logic (...)
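    The abstract's central quantities are easy to compute directly: the logical entropy of a partition is the normalized count of its distinctions, equivalently 1 - Σ p_B², while the Shannon entropy of the same partition is -Σ p_B log2 p_B. A short sketch with an example partition of my own choosing:

```python
import math
from itertools import product

def logical_entropy(partition, universe):
    """h(pi) = |distinctions| / |U x U|: the fraction of ordered pairs
    (u, u') whose elements fall in different blocks of the partition."""
    block_of = {u: i for i, block in enumerate(partition) for u in block}
    dits = sum(1 for u, v in product(universe, repeat=2)
               if block_of[u] != block_of[v])
    return dits / len(universe) ** 2

def shannon_entropy(partition, universe):
    """H(pi) = -sum p_B log2 p_B over the block probabilities."""
    ps = [len(b) / len(universe) for b in partition]
    return -sum(p * math.log2(p) for p in ps if p > 0)

U = list(range(6))
pi = [[0, 1, 2], [3, 4], [5]]                       # blocks of sizes 3, 2, 1
print(logical_entropy(pi, U))                       # 22/36 ~ 0.611
print(1 - sum((len(b) / len(U)) ** 2 for b in pi))  # same value via 1 - sum p^2
print(shannon_entropy(pi, U))                       # ~ 1.459 bits
```

    The agreement of the first two printed values is the point of the abstract: logical entropy is precisely a normalized counting measure on distinctions. This also illustrates entries 44, 46, and 47 below.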
  46. David Ellerman (2009). A Short Note on the Logico-Conceptual Foundations of Information Theory in Partition Logic. The Reasoner 3 (7):4-5.
    A new logic of partitions has been developed that is dual to ordinary logic when the latter is interpreted as the logic of subsets of a fixed universe rather than the logic of propositions. For a finite universe, the logic of subsets gave rise to finite probability theory by assigning to each subset its relative size as a probability. The analogous construction for the dual logic of partitions gives rise to a notion of logical entropy that is precisely related to (...)
  47. David Ellerman (2009). Counting Distinctions: On the Conceptual Foundations of Shannon's Information Theory. Synthese 168 (1):119 - 149.
    Categorical logic has shown that modern logic is essentially the logic of subsets (or “subobjects”). In “subset logic,” predicates are modeled as subsets of a universe and a predicate applies to an individual if the individual is in the subset. Partitions are dual to subsets so there is a dual logic of partitions where a “distinction” [an ordered pair of distinct elements (u, u′) from the universe U] is dual to an “element”. A predicate modeled by a partition π on (...)
  48. Wesley Elsberry & Jeffrey Shallit (2011). Information Theory, Evolutionary Computation, and Dembski's "Complex Specified Information". Synthese 178 (2):237 - 270.
    Intelligent design advocate William Dembski has introduced a measure of information called "complex specified information", or CSI. He claims that CSI is a reliable marker of design by intelligent agents. He puts forth a "Law of Conservation of Information" which states that chance and natural laws are incapable of generating CSI. In particular, CSI cannot be generated by evolutionary computation. Dembski asserts that CSI is present in intelligent causes and in the flagellum of Escherichia coli, and concludes that neither have (...)
  49. Amiel Feinstein (1958). Foundations of Information Theory. New York: McGraw-Hill.
  50. Jordi Fortuny & Bernat Corominas-Murtra (2013). On the Origin of Ambiguity in Efficient Communication. Journal of Logic, Language and Information 22 (3):249-267.
    This article studies the emergence of ambiguity in communication through the concept of logical irreversibility and within the framework of Shannon’s information theory. This leads us to a precise and general expression of the intuition behind Zipf’s vocabulary balance in terms of a symmetry equation between the complexities of the coding and the decoding processes that imposes an unavoidable amount of logical uncertainty in natural communication. Accordingly, the emergence of irreversible computations is required if the complexities of the coding and (...)