Search results for 'Information Theory'

1000+ found
  1. Léon Brillouin (1956/2004). Science and Information Theory. Dover Publications.
    A classic source for understanding the connections between information theory and physics, this text was written by one of the giants of 20th-century physics and is appropriate for upper-level undergraduates and graduate students. Topics include the principles of coding, coding problems and solutions, the analysis of signals, a summary of thermodynamics, thermal agitation and Brownian motion, and thermal noise in an electric circuit. A discussion of the negentropy principle of information introduces the author's renowned examination of Maxwell's (...)
  2. David Ellerman (2009). Counting Distinctions: On the Conceptual Foundations of Shannon's Information Theory. Synthese 168 (1):119-149.
    Categorical logic has shown that modern logic is essentially the logic of subsets (or “subobjects”). In “subset logic,” predicates are modeled as subsets of a universe and a predicate applies to an individual if the individual is in the subset. Partitions are dual to subsets so there is a dual logic of partitions where a “distinction” [an ordered pair of distinct elements (u, u′) from the universe U] is dual to an “element”. A predicate modeled by a partition π on (...)
  3. Peter D. Grünwald & Paul M. B. Vitányi (2003). Kolmogorov Complexity and Information Theory. With an Interpretation in Terms of Questions and Answers. Journal of Logic, Language and Information 12 (4):497-529.
    We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information and Kolmogorov (“algorithmic”) mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate distortion theory, which quantifies useful (in a certain sense) information. We use the communication of (...) as our guiding motif, and we explain how it relates to sequential question-answer sessions.
  4. Jae-Weon Lee (2011). Quantum Mechanics Emerges From Information Theory Applied to Causal Horizons. Foundations of Physics 41 (4):744-753.
    It is suggested that quantum mechanics is not fundamental but emerges from classical information theory applied to causal horizons. The path integral quantization and quantum randomness can be derived by considering information loss of fields or particles crossing Rindler horizons for accelerating observers. This implies that information is one of the fundamental roots of all physical phenomena. The connection between this theory and Verlinde’s entropic gravity theory is also investigated.
  5. Gérard Battail (2013). Biology Needs Information Theory. Biosemiotics 6 (1):77-103.
    Communication is an important feature of the living world that mainstream biology fails to adequately deal with. Applying two main disciplines can be contemplated to fill in this gap: semiotics and information theory. Semiotics is a philosophical discipline mainly concerned with meaning; applying it to life already originated in biosemiotics. Information theory is a mathematical discipline coming from engineering which has literal communication as purpose. Biosemiotics and information theory are thus concerned with distinct and (...)
  6. Sean Devine (2014). An Algorithmic Information Theory Challenge to Intelligent Design. Zygon 49 (1):42-65.
    William Dembski claims to have established a decision process to determine when highly unlikely events observed in the natural world are due to Intelligent Design. This article argues that, as no implementable randomness test is superior to a universal Martin-Löf test, this test should be used to replace Dembski's decision process. Furthermore, Dembski's decision process is flawed, as natural explanations are eliminated before chance. Dembski also introduces a fourth law of thermodynamics, his “law of conservation of information,” to argue (...)
  7. Gérard Battail (2009). Applying Semiotics and Information Theory to Biology: A Critical Comparison. [REVIEW] Biosemiotics 2 (3):303-320.
    Since the beginning of the 20th century, it became increasingly evident that information, besides matter and energy, is a major actor in the life processes. Moreover, communication of information has been recognized as differentiating living things from inanimate ones, hence as specific to the life processes. Therefore the sciences of matter and energy, chemistry and physics, do not suffice to deal with life processes. Biology should also rely on sciences of information. A majority of biologists, however, did (...)
  8. Wesley Elsberry & Jeffrey Shallit (2011). Information Theory, Evolutionary Computation, and Dembski's "Complex Specified Information". Synthese 178 (2):237-270.
    Intelligent design advocate William Dembski has introduced a measure of information called "complex specified information", or CSI. He claims that CSI is a reliable marker of design by intelligent agents. He puts forth a "Law of Conservation of Information" which states that chance and natural laws are incapable of generating CSI. In particular, CSI cannot be generated by evolutionary computation. Dembski asserts that CSI is present in intelligent causes and in the flagellum of Escherichia coli, and concludes (...)
  9. Christoph Jäger (2004). Skepticism, Information, and Closure: Dretske's Theory of Knowledge. Erkenntnis 61 (2-3):187-201.
    According to Fred Dretske's externalist theory of knowledge, a subject knows that p if and only if she believes that p and this belief is caused or causally sustained by the information that p. Another famous feature of Dretske's epistemology is his denial that knowledge is closed under known logical entailment. I argue that, given Dretske's construal of information, he is in fact committed to the view that both information and knowledge are closed under known entailment. (...)
  10. Aleksandr I͡Akovlevich Khinchin (1957). Mathematical Foundations of Information Theory. New York: Dover Publications.
    Comprehensive, rigorous introduction to the work of Shannon, McMillan, Feinstein and Khinchin. Translated by R. A. Silverman and M. D. Friedman.
  11. D. E. Berlyne (1957). Conflict and Information-Theory Variables as Determinants of Human Perceptual Curiosity. Journal of Experimental Psychology 53 (6):399.
  12. Lawrence William Rosenfield (1971). Aristotle and Information Theory. The Hague: Mouton.
  13. Murray Aborn & Herbert Rubenstein (1952). Information Theory and Immediate Recall. Journal of Experimental Psychology 44 (4):260.
  14. Sean Devine (2006). The Application of Algorithmic Information Theory to Noisy Patterned Strings. Complexity 12 (2):52-58.
  15. Amiel Feinstein (1958). Foundations of Information Theory. New York: McGraw-Hill.
  16. Laura Felline (2010). Structural Explanation From Special Relativity to Quantum Information Theory. In M. D'Agostino, G. Giorello & F. Laudisa (eds.), SILFS New Essays in Logic and Philosophy of Science. College Publications.
  17. Abraham A. Moles (1966). Information Theory and Esthetic Perception. Urbana: University of Illinois Press.
  18. Douglas L. Nelson (1969). Information Theory and Stimulus Encoding in Free and Serial Recall: Ordinal Position of Formal Similarity. Journal of Experimental Psychology 80 (3p1):537.
  19. Douglas L. Nelson & Frank A. Rowe (1969). Information Theory and Stimulus Encoding in Paired-Associate Acquisition: Ordinal Position of Formal Similarity. Journal of Experimental Psychology 79 (2p1):342.
  20. Arie M. Oostlander & Hans De Swart (1966). Search-Discrimination Time and the Applicability of Information Theory. Journal of Experimental Psychology 72 (3):423.
  21. Luciano Floridi (2012). Semantic Information and the Network Theory of Account. Synthese 184 (3):431-454.
    The article addresses the problem of how semantic information can be upgraded to knowledge. The introductory section explains the technical terminology and the relevant background. Section 2 argues that, for semantic information to be upgraded to knowledge, it is necessary and sufficient to be embedded in a network of questions and answers that correctly accounts for it. Section 3 shows that an information flow network of type A fulfils such a requirement, by warranting that the erotetic deficit, (...)
  22. Luciano Floridi (2004). Outline of a Theory of Strongly Semantic Information. Minds and Machines 14 (2):197-221.
    This paper outlines a quantitative theory of strongly semantic information (TSSI) based on truth-values rather than probability distributions. The main hypothesis supported in the paper is that the classic quantitative theory of weakly semantic information (TWSI), based on probability distributions, assumes that truth-values supervene on factual semantic information, yet this principle is too weak and generates a well-known semantic paradox, whereas TSSI, according to which factual semantic information encapsulates truth, can avoid the paradox and (...)
  23. George Medley III (2013). The Inspiration of God and Wolfhart Pannenberg's “Field Theory of Information”. Zygon 48 (1):93-106.
    This paper will examine the implications of an extended “field theory of information,” suggested by Wolfhart Pannenberg, specifically in the Christian understanding of creation. The paper argues that the Holy Spirit created the world as field, a concept from physics, and the creation is directed by the logos utilizing information. Taking into account more recent developments of information theory, the essay further suggests that present creation has a causal impact upon the information utilized in (...)
  24. Luís Santos-Pinto (2009). Asymmetries in Information Processing in a Decision Theory Framework. Theory and Decision 66 (4):317-343.
    Research in psychology suggests that some individuals are more sensitive to positive than to negative information while others are more sensitive to negative rather than positive information. I take these cognitive positive–negative asymmetries in information processing to a Bayesian decision-theory model and explore its consequences in terms of decisions and payoffs. I show that in monotone decision problems economic agents with more positive-responsive information structures are always better off, ex ante, when they face problems where (...)
  25. Anja Matschuck (2011). Non-Local Correlations in Therapeutic Settings? A Qualitative Study on the Basis of Weak Quantum Theory and the Model of Pragmatic Information. Axiomathes 21 (2):249-261.
    Weak Quantum Theory (WQT) and the Model of Pragmatic Information (MPI) are two psychophysical concepts developed on the basis of quantum physics. The present study contributes to their empirical examination. The issue of the study is whether WQT and MPI can not only explain ‘psi’-phenomena theoretically but also prove to be consistent with the empirical phenomenology of extrasensory perception (ESP). From the main statements of both models, 33 deductions for psychic readings are derived. Psychic readings are defined as (...)
  26. Ingetraut Dahlberg (2008). The Information Coding Classification (ICC): A Modern, Theory-Based Fully-Faceted, Universal System of Knowledge Fields. [REVIEW] Axiomathes 18 (2):161-176.
    An introduction to the structure, contents and specifications (especially the Systematifier) of the Information Coding Classification, developed in the seventies and used in many ways by the author and a few others following its publication in 1982. Its theoretical basis is explained, consisting of (1) the Integrative Level Theory, following an evolutionary approach of ontical areas, and integrating also on each level the aspects contained in the sequence of the levels, (2) the distinction between categories of form and (...)
  27. Rob Clifton, Jeffrey Bub & Hans Halvorson (2003). Characterizing Quantum Theory in Terms of Information-Theoretic Constraints. Foundations of Physics 33 (11):1561-1591.
    We show that three fundamental information-theoretic constraints -- the impossibility of superluminal information transfer between two physical systems by performing measurements on one of them, the impossibility of broadcasting the information contained in an unknown physical state, and the impossibility of unconditionally secure bit commitment -- suffice to entail that the observables and state space of a physical theory are quantum-mechanical. We demonstrate the converse derivation in part, and consider the implications of alternative answers to a (...)
  28. Orlin Vakarelov (2013). From Interface to Correspondence: Recovering Classical Representations in a Pragmatic Theory of Semantic Information. [REVIEW] Minds and Machines (3):1-25.
    One major fault line in foundational theories of cognition is between the so-called “representational” and “non-representational” theories. Is it possible to formulate an intermediate approach for a foundational theory of cognition by defining a conception of representation that may bridge the fault line? Such an account of representation, as well as an account of correspondence semantics, is offered here. The account extends previously developed agent-based pragmatic theories of semantic information, where meaning of an information state is defined (...)
  29. Bernd Carsten Stahl, Neil F. Doherty, Mark Shaw & Helge Janicke (2014). Critical Theory as an Approach to the Ethics of Information Security. Science and Engineering Ethics 20 (3):675-699.
    Information security can be of high moral value. It can equally be used for immoral purposes and have undesirable consequences. In this paper we suggest that critical theory can facilitate a better understanding of possible ethical issues and can provide support when finding ways of addressing them. The paper argues that critical theory has intrinsic links to ethics and that it is possible to identify concepts frequently used in critical theory to pinpoint ethical concerns. Using the (...)
  30. Jeff Coulter (1995). The Informed Neuron: Issues in the Use of Information Theory in the Behavioral Sciences. [REVIEW] Minds and Machines 5 (4):583-96.
    The concept of “information” is virtually ubiquitous in contemporary cognitive science. It is claimed to be “processed” (in cognitivist theories of perception and comprehension), “stored” (in cognitivist theories of memory and recognition), and otherwise manipulated and transformed by the human central nervous system. Fred Dretske's extensive philosophical defense of a theory of informational content (“semantic” information) based upon the Shannon-Weaver formal theory of information is subjected to critical scrutiny. A major difficulty is identified in Dretske's (...)
  31. Wayne C. Myrvold (2010). From Physics to Information Theory and Back. In Alisa Bokulich & Gregg Jaeger (eds.), Philosophy of Quantum Information and Entanglement. Cambridge University Press. 181-207.
    Quantum information theory has given rise to a renewed interest in, and a new perspective on, the old issue of understanding the ways in which quantum mechanics differs from classical mechanics. The task of distinguishing between quantum and classical theory is facilitated by neutral frameworks that embrace both classical and quantum theory. In this paper, I discuss two approaches to this endeavour, the algebraic approach, and the convex set approach, with an eye to the strengths of (...)
  32. Patrick Dillon (2004). Trajectories and Tensions in the Theory of Information and Communication Technology in Education. British Journal of Educational Studies 52 (2):138-150.
    For largely historical reasons, information and communication technology in education has been heavily influenced by a form of constructivism based on the transmission and transformation of information. This approach has implications for both learning and teaching in the field. The assumptions underlying the approach are explored and a critique offered. Although the transmission approach is entrenched in procedures and pedagogies, it is increasingly challenged by an action-theoretical form of constructivism. In this 'ecology of ideas', the value of the (...)
  33. Rob Clifton (2002). The Subtleties of Entanglement and its Role in Quantum Information Theory. Proceedings of the Philosophy of Science Association 2002 (3):S150-S167.
    My aim in this paper is a modest one. I do not have any particular thesis to advance about the nature of entanglement, nor can I claim novelty for any of the material I shall discuss. My aim is simply to raise some questions about entanglement that spring naturally from certain developments in quantum information theory and are, I believe, worthy of serious consideration by philosophers of science. The main topics I discuss are different manifestations of quantum nonlocality, (...)
  34. Joseph F. Hanna (1969). Explanation, Prediction, Description, and Information Theory. Synthese 20 (3):308-334.
    The distinction between explanation and prediction has received much attention in recent literature, but the equally important distinction between explanation and description (or between prediction and description) remains blurred. This latter distinction is particularly important in the social sciences, where probabilistic models (or theories) often play dual roles as explanatory and descriptive devices. The distinction between explanation (or prediction) and description is explicated in the present paper in terms of information theory. The explanatory (or predictive) power of a (...)
  35. William F. Harms (1998). The Use of Information Theory in Epistemology. Philosophy of Science 65 (3):472-501.
    Information theory offers a measure of "mutual information" which provides an appropriate measure of tracking efficiency for the naturalistic epistemologist. The statistical entropy on which it is based is arguably the best way of characterizing the uncertainty associated with the behavior of a system, and it is ontologically neutral. Though not appropriate for the naturalization of meaning, mutual information can serve as a measure of epistemic success independent of semantic maps and payoff structures. While not containing (...)
  36. Y. Ben-Aryeh, A. Mann & B. C. Sanders (1999). Empirical State Determination of Entangled Two-Level Systems and Its Relation to Information Theory. Foundations of Physics 29 (12):1963-1975.
    Theoretical methods for empirical state determination of entangled two-level systems are analyzed in relation to information theory. We show that hidden variable theories would lead to a Shannon index of correlation between the entangled subsystems which is larger than that predicted by quantum mechanics. Canonical representations which have maximal correlations are treated by the use of Schmidt and Hilbert-Schmidt decomposition of the entangled states, including especially the Bohm singlet state and the GHZ entangled states. We show that quantum (...)
  37. Amit Hagar (2003). A Philosopher Looks at Quantum Information Theory. Philosophy of Science 70 (4):752-775.
    Recent suggestions to supply quantum mechanics (QM) with realistic foundations by reformulating it in light of quantum information theory (QIT) are examined and are found wanting by pointing to a basic conceptual problem that QIT itself ignores, namely, the measurement problem. Since one cannot ignore the measurement problem and at the same time pretend to be a realist, as they stand, the suggestions to reformulate QM in light of QIT are nothing but instrumentalism in disguise.
  38. Attila Grandpierre (2006). A Review of "Information Theory, Evolution and the Origin of Life as a Digital Message: How Life Resembles a Computer". [REVIEW] World Futures 62 (5):401-403.
  39. Christopher Gordon Timpson (2013). Quantum Information Theory & the Foundations of Quantum Mechanics. Oxford University Press.
    Christopher G. Timpson provides the first full-length philosophical treatment of quantum information theory and the questions it raises for our understanding of the quantum world. He argues for an ontologically deflationary account of the nature of quantum information, which is grounded in a revisionary analysis of the concepts of information.
  40. Sebastian Sequoiah-Grayson (2012). Giovanni Sommaruga (Ed.): Formal Theories of Information: From Shannon to Semantic Information Theory and General Concepts of Information. [REVIEW] Minds and Machines 22 (1):35-40.
    Book review. DOI: 10.1007/s11023-011-9250-2.
  41. Giuseppe Primiero (2011). Giovanni Sommaruga (Ed.): Formal Theories of Information: From Shannon to Semantic Information Theory and General Concepts of Information. [REVIEW] Minds and Machines 21 (1):119-122.
    Book review. DOI: 10.1007/s11023-011-9228-0.
  42. Julio A. Camargo (2008). Revisiting the Relation Between Species Diversity and Information Theory. Acta Biotheoretica 56 (4).
    The Shannon information function (H) has been extensively used in ecology as a statistic of species diversity. Yet, the use of Shannon diversity index has also been criticized, mainly because of its ambiguous ecological interpretation and because of its relatively great sensitivity to the relative abundances of species in the community. In my opinion, the major shortcoming of the traditional perspective (on the possible relation of species diversity with information theory) is that species need for an external (...)
  43. Panu Raatikainen (2000). Algorithmic Information Theory and Undecidability. Synthese 123 (2):217-225.
    Algorithmic information theory, or the theory of Kolmogorov complexity, has become an extraordinarily popular theory, and this is no doubt due, in some part, to the fame of Chaitin’s incompleteness results arising from this field. Actually, there are two rather different results by Chaitin: the earlier one concerns the finite limit of the provability of complexity (see Chaitin, 1974a, 1974b, 1975a); and the later is related to random reals and the halting probability (see Chaitin, 1986, 1987a, (...)
  44. G. J. Chaitin. How to Run Algorithmic Information Theory on a Computer.
    Hi everybody! It's a great pleasure for me to be back here at the new, improved Santa Fe Institute in this spectacular location. I guess this is my fourth visit and it's always very stimulating, so I'm always very happy to visit you guys. I'd like to tell you what I've been up to lately. First of all, let me say what algorithmic information theory is good for, before telling you about the new version of it I've got.
  45. Jacob T. Schwartz. A Note on Monte Carlo Primality Tests and Algorithmic Information Theory.
    (...) conclusions are only probably correct. On the other hand, algorithmic information theory provides a precise mathematical definition of the notion of random or patternless sequence. In this paper we shall describe conditions under which if the sequence of coin tosses in the Solovay–Strassen and Miller–Rabin algorithms is replaced by a sequence of heads and tails that is of maximal algorithmic information content, i.e., has maximal algorithmic randomness, then one obtains an error-free test for primality. These results (...)
  46. Michael Cerullo (2011). Integrated Information Theory: A Promising but Ultimately Incomplete Theory of Consciousness. Journal of Consciousness Studies 18 (11-12):11-12.
    Tononi has proposed a fundamental theory of consciousness he terms Integrated Information Theory (IIT). IIT purports to explain the quantity of conscious experience by linking it with integrated information: information shared by the system as a whole and quantified by adopting a modified version of Shannon's definition of information. Since the fundamental aspect of IIT is information, the theory allows for the multiple realizability of consciousness. While there are several concepts within IIT (...)
  47. C. D'Antoni & P. Scanzano (1980). An Application of Information Theory: Longitudinal Measurability Bounds in Classical and Quantum Physics. [REVIEW] Foundations of Physics 10 (11-12):875-885.
    We examine the problem of the existence (in classical and/or quantum physics) of longitudinal limitations of measurability, defined as limitations preventing the measurement of a given quantity with arbitrarily high accuracy. We consider a measuring device as a generalized communication system, which enables us to use methods of information theory. As a direct consequence of the Shannon theorem on channel capacity, we obtain an inequality which limits the accuracy of a measurement in terms of the average power necessary (...)
  48. Derek Partridge (1981). Information Theory and Redundancy. Philosophy of Science 48 (2):308-316.
    This paper argues that Information Theoretic Redundancy (ITR) is fundamentally a composite concept that has been continually misinterpreted since the very inception of Information Theory. We view ITR as compounded of true redundancy and partial redundancy. This demarcation of true redundancy illustrates a limiting case phenomenon: the underlying metric (number of alternatives) differs only by degree but the properties of this concept differ in kind from those of partial redundancy. Several other studies are instanced which also imply (...)
  49. Michiel van Lambalgen (1989). Algorithmic Information Theory. Journal of Symbolic Logic 54 (4):1389-1400.
    We present a critical discussion of the claim (most forcefully propounded by Chaitin) that algorithmic information theory sheds new light on Gödel's first incompleteness theorem.
  50. Kevin C. Desouza & Tobin Hensgen (2002). On "Information" in Organizations: An Emergent Information Theory and Semiotic Framework. Emergence 4 (3):95-114.
    DOI: 10.1207/S15327000EM0403-07.
Showing 1-50 of 1000+ results