About this topic
Summary: The main focus of this category is Shannon's mathematical theory of information and its broader philosophical uses. This includes, in the first place, cybernetics, signalling theories, the sender-receiver communication model, and Kolmogorov complexity. More general uses of information theory that overlap with other domains of the philosophy of information may also belong to this category. Examples include different philosophical conceptions of information (semantic conceptions, semiotic approaches), as well as applications in specific domains of theoretical philosophy such as the philosophy of science and the philosophy of mind.
1 — 50 / 205
  1. added 2020-04-27
    El breve “Discurso del método” de Claude Shannon.Juan Ramón Álvarez - 2018 - Principia: An International Journal of Epistemology 22 (3):393-410.
    The following study takes as its starting point the lecture entitled “Creative Thinking”, delivered by Claude Shannon in 1952 at Bell Laboratories. The paper offers an interpretive and critical account of the necessary conditions, as well as the desirable procedures, that scientific and technological invention must satisfy, within the frame of the so-called scientist’s spontaneous philosophy.
  2. added 2020-04-13
    Channels’ Confirmation and Predictions’ Confirmation: From the Medical Test to the Raven Paradox.Chenguang Lu - 2020 - Entropy 22 (4):384.
    After long arguments between positivism and falsificationism, the verification of universal hypotheses was replaced with the confirmation of uncertain major premises. Unfortunately, Hempel proposed the Raven Paradox. Then, Carnap used the increment of logical probability as the confirmation measure. So far, many confirmation measures have been proposed. Among them, measure F, proposed by Kemeny and Oppenheim, possesses the symmetries and asymmetries proposed by Eells and Fitelson, the monotonicity proposed by Greco et al., and the normalizing property suggested by many researchers. Based on the (...)
  3. added 2020-04-02
    A Generalized Information Theory [广义信息论].Chenguang Lu (ed.) - 1993 - Hefei: Science and Tech. University Press.
    This book reviews the history of information and entropy theory and introduces Shannon's information theory and its limitations. It proposes a generalized communication model and generalized information measures for semantic information, sensory information, and measurement-signal information; discusses information criteria and optimization theories for prediction and detection; and proposes the error-limited information rate function and the quality-preserving information rate function, improved forms of the classical rate-distortion function, together with a corresponding theory of communication data compression. It presents applications of the new information measures in weather forecasting, image communication, and other fields; analyzes the relationship between information entropy and the entropy of statistical physics; and extends the new communication model and information measures to the field of control, so that information measures can be used to evaluate control performance and communication optimization methods can be applied to control optimization. Related questions in economics, biology, aesthetics, and philosophy are also discussed. The new theory carries forward and deepens K. R. Popper's evolutionary view of science and the Marxist thesis that practice is the test of truth. The book is intended for researchers and students in communication, prediction, detection, pattern recognition and artificial intelligence, dialectics of nature, and philosophy; it may also serve as a reference for scholars in linguistics, control, economics, and statistical physics.
  4. added 2020-03-29
    A review on a peer review.Andrej Poleev - 2016 - Enzymes 14.
    The peer review is an opportunity to perform an unlawful censorship which ensures that no apostate notion ever gets published in mainstream journals. Such peer-review censorship is also an opportunity to steal any content and to claim afterward the priority of the first publication. And last but not least, the peer review is an academic tool to promote the mainstream pseudoscience.
  5. added 2020-03-25
    Semantic Information G Theory and Logical Bayesian Inference for Machine Learning.Chenguang Lu - 2019 - Information 10 (8):261.
    An important problem in machine learning is that when the number of labels n > 2, it is very difficult to construct and optimize a group of learning functions, and we wish the optimized learning functions to remain useful when the prior distribution P(x) (where x is an instance) changes. To resolve this problem, the semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms together form a systematic solution. A semantic channel in the G theory consists (...)
  6. added 2020-03-10
    Logical Information Theory: New Logical Foundations for Information Theory.David Ellerman - 2017 - Logic Journal of the IGPL 25 (5):806-835.
    There is a new theory of information based on logic. The definition of Shannon entropy as well as the notions on joint, conditional, and mutual entropy as defined by Shannon can all be derived by a uniform transformation from the corresponding formulas of logical information theory. Information is first defined in terms of sets of distinctions without using any probability measure. When a probability measure is introduced, the logical entropies are simply the values of the probability measure on the sets (...)
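Ellerman's logical entropy sits alongside Shannon entropy as a measure on the same distributions: for p = (p_1, ..., p_n), Shannon entropy is H(p) = -Σ p_i log2 p_i, while logical entropy is h(p) = 1 - Σ p_i², the probability that two independent draws from p land in distinct blocks. A minimal sketch of the two measures (the function names are mine, not the paper's):

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum p_i log2 p_i, in bits."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def logical_entropy(p):
    """Logical entropy h(p) = 1 - sum p_i^2: the probability that two
    independent draws from p yield distinct outcomes."""
    return 1 - sum(pi * pi for pi in p)

uniform = [0.25] * 4
print(shannon_entropy(uniform))  # 2.0 bits
print(logical_entropy(uniform))  # 0.75
```

Both measures are maximized by the uniform distribution and vanish on a point mass, which is one way to see the "uniform transformation" between the two formalisms that the abstract mentions.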
  7. added 2020-03-10
    GlosariumBITRI: Interdisciplinary Elucidation of Concepts, Metaphors, Theories and Problems Concerning INFORMATION.José María Díaz-Nafría, Mario Pérez Montoro & Francisco Salto - 2015 - La Libertad: Universidad Estatal Península de Santa Elena.
    The BITrum glossary, planned as one of the first activities in the development of the BITrum project, essentially aims at serving as a tool for the clarification of concepts, theories and problems concerning information. Intending to embrace the most relevant points of view with respect to information, it is interdisciplinarily developed by a board of experts coming from a wide variety of scientific fields. The glossarium BITri kindly invites the scientific community to make contributions of any kind aimed at clarifying (...)
  8. added 2020-03-10
    What is Really Information.Francisco Salto - 2009 - Triple-C 7 (2):125-398.
  9. added 2020-02-11
    The Mathematical Theory of Communication.Arthur W. Burks - 1951 - Philosophical Review 60 (3):398-400.
  10. added 2019-10-23
    A Contingency Interpretation of Information Theory as a Bridge Between God’s Immanence and Transcendence.Philippe Gagnon - 2020 - In Michael Fuller, Dirk Evers, Anne L. C. Runehov, Knut-Willy Sæther & Bernard Michollet (eds.), Issues in Science and Theology: Nature – and Beyond. Cham: Springer. pp. 169-185.
    This paper investigates the degree to which information theory, and the derived uses that make it work as a metaphor of our age, can be helpful in thinking about God’s immanence and transcendence. We ask when it is possible to say that a consciousness has to be behind the information we encounter. If God is to be thought of as a communicator of information, we need to ask whether a communication system has to pre-exist the divine and impose itself (...)
  11. added 2019-07-22
    Real Patterns and Indispensability.Abel Suñé & Manolo Martínez - manuscript
    While scientific inquiry crucially relies on the extraction of patterns from data, we still have a very imperfect understanding of the metaphysics of patterns—and, in particular, of what it is that makes a pattern real. In this paper we derive a criterion of real-patternhood from the notion of conditional Kolmogorov complexity. The resulting account belongs in the philosophical tradition, initiated by Dennett, that links real-patternhood to data compressibility, but is simpler and formally more perspicuous than other proposals defended heretofore in (...)
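Kolmogorov complexity itself is uncomputable, but compressed length under a real compressor is the standard computable proxy, and conditional complexity K(x|y) is commonly approximated as C(y + x) - C(y). A rough sketch of this compressibility reading of real-patternhood (using zlib as the compressor is my choice for illustration, not the authors' construction):

```python
import os
import zlib

def c(data: bytes) -> int:
    """Length of zlib-compressed data: a crude, computable upper bound
    on the Kolmogorov complexity K(data)."""
    return len(zlib.compress(data, 9))

def conditional_c(x: bytes, y: bytes) -> int:
    """Rough proxy for conditional complexity K(x|y): the extra
    compressed length x costs once y is already in hand."""
    return c(y + x) - c(y)

pattern = b"ab" * 500      # highly regular: a candidate "real pattern"
noise = os.urandom(1000)   # incompressible with overwhelming probability

print(c(pattern))                       # small: the regularity is exploitable
print(c(noise))                         # near 1000: no pattern to exploit
print(conditional_c(pattern, pattern))  # tiny: given y = x, x adds almost nothing
```

On this reading, a pattern is "real" to the extent that citing it compresses the data, and conditional complexity measures how much one pattern already accounts for another.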
  12. added 2019-06-06
    Algorithmic Compression of Empirical Data: Reply to Twardy, Gardner, and Dowe.James Mcallister - 2005 - Studies in History and Philosophy of Science Part A 36 (2):403-410.
    This discussion note responds to objections by Twardy, Gardner, and Dowe to my earlier claim that empirical data sets are algorithmically incompressible. Twardy, Gardner, and Dowe hold that many empirical data sets are compressible by Minimum Message Length technique and offer this as evidence that these data sets are algorithmically compressible. I reply that the compression achieved by Minimum Message Length technique is different from algorithmic compression. I conclude that Twardy, Gardner, and Dowe fail to establish that empirical data sets (...)
  13. added 2019-06-06
    Algorithmic Randomness in Empirical Data.James W. McAllister - 2003 - Studies in History and Philosophy of Science Part A 34 (3):633-646.
    According to a traditional view, scientific laws and theories constitute algorithmic compressions of empirical data sets collected from observations and measurements. This article defends the thesis that, to the contrary, empirical data sets are algorithmically incompressible. The reason is that individual data points are determined partly by perturbations, or causal factors that cannot be reduced to any pattern. If empirical data sets are incompressible, then they exhibit maximal algorithmic complexity, maximal entropy and zero redundancy. They are therefore maximally efficient carriers (...)
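McAllister's incompressibility thesis can be illustrated with a toy experiment: a perfectly law-like series compresses drastically, while adding small random perturbations to each data point largely destroys that compressibility. A sketch under my own toy assumptions (zlib is only a crude stand-in for ideal algorithmic compression):

```python
import random
import zlib

random.seed(0)  # reproducible "measurement noise"

def compressed_size(values):
    """zlib length of a byte series: a computable stand-in for
    algorithmic complexity."""
    return len(zlib.compress(bytes(values), 9))

# A law-like signal: a perfectly patterned repeating ramp.
law = [i % 32 for i in range(2048)]

# "Empirical data" in McAllister's sense: the same signal corrupted by
# a small random perturbation at each data point.
data = [(v + random.randint(-2, 2)) % 256 for v in law]

print(compressed_size(law))   # tiny: the law compresses the whole series
print(compressed_size(data))  # far larger: perturbations resist compression
```

The residual noise puts a floor of roughly log2(5) bits per sample under the compressed size here, which is the toy analogue of the claim that perturbations make the full data set incompressible even when a law generates its systematic part.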
  14. added 2019-06-05
    Information and Inaccuracy.William Roche & Tomoji Shogenji - 2018 - British Journal for the Philosophy of Science 69 (2):577-604.
    This article proposes a new interpretation of mutual information (MI). We examine three extant interpretations of MI: by reduction in doubt, by reduction in uncertainty, and by divergence. We argue that the first two are inconsistent with the epistemic value of information (EVI) assumed in many applications of MI: the greater the amount of information we acquire, the better our epistemic position, other things being equal. The third interpretation is consistent with EVI, but it is faced with the problem of (...)
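For reference, the mutual information between two discrete variables is I(X;Y) = Σ_{x,y} p(x,y) log2[p(x,y) / (p(x)p(y))]. A small self-contained computation over joint distributions given as tables (the example tables are mine):

```python
import math

def mutual_information(joint):
    """I(X;Y) = sum over x,y of p(x,y) * log2( p(x,y) / (p(x)p(y)) ),
    for a joint distribution given as a row-major table."""
    px = [sum(row) for row in joint]            # marginal of X (rows)
    py = [sum(col) for col in zip(*joint)]      # marginal of Y (columns)
    return sum(
        pxy * math.log2(pxy / (px[i] * py[j]))
        for i, row in enumerate(joint)
        for j, pxy in enumerate(row)
        if pxy > 0
    )

independent = [[0.25, 0.25], [0.25, 0.25]]  # X, Y independent fair bits
correlated  = [[0.5, 0.0], [0.0, 0.5]]      # X, Y perfectly correlated

print(mutual_information(independent))  # 0.0 bits
print(mutual_information(correlated))   # 1.0 bit
```

MI is symmetric and non-negative, which is part of why the competing interpretations discussed in the article (doubt reduction, uncertainty reduction, divergence) can all latch onto the same formula.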
  15. added 2019-03-27
    Does Semantic Information Need to Be Truthful?Björn Lundgren - 2019 - Synthese 196 (7):2885-2906.
    The concept of information has well-known difficulties. Among the many issues that have been discussed is the alethic nature of a semantic conception of information. Floridi (2004, 2005, 2007, 2011) argued that semantic information must be truthful. In this article, arguments will be presented in favor of an alethically neutral conception of semantic information, and it will be shown that such a conception can withstand Floridi’s (...)
  16. added 2019-03-27
    Defining Information Security.Björn Lundgren & Niklas Möller - 2019 - Science and Engineering Ethics 25 (2):419-441.
    This article proposes a new definition of information security, the ‘Appropriate Access’ definition. Apart from providing the basic criteria for a definition—correct demarcation and meaning concerning the state of security—it also aims at being a definition suitable for any information security perspective. As such, it bridges the conceptual divide between so-called ‘soft issues’ of information security and more technical issues. Because of this it is also suitable for various analytical purposes, such as analysing possible security breaches, or for studying conflicting (...)
  17. added 2019-02-19
    Angeletics and Epistemology, Angeletics as Epistemology: A Comparison Between Capurro’s Angeletics and Goldman’s Social Epistemology.Pak-Hang Wong - 2011 - In Messages and Messengers – Angeletics as an Approach to the Phenomenology of Communication / Von Boten und Botschaften – Die Angeletik als Weg zur Phänomenologie der Kommunikation.
    Nearly a decade ago, Rafael Capurro began gradually shifting his attention towards the ideas of message and messenger. In lieu of ‘information’, he proposes and develops a new direction of research he calls Angeletics, which aims to examine the nature of message and messenger, both of which are inherently social. Coincidentally, at about the same time, we witnessed the rise of social epistemology in Anglo-American analytic philosophy. This coincidence is interesting, because both Capurro’s Angeletics and social epistemology indicated a (...)
  18. added 2019-01-20
    A Scientific Metaphysical Naturalisation of Information.Bruce Long - 2018 - Dissertation, University of Sydney
    The objective of this thesis is to present a naturalised metaphysics of information, or to naturalise information, by way of deploying a scientific metaphysics according to which contingency is privileged and a priori conceptual analysis is excluded (or at least greatly diminished) in favour of contingent and defeasible metaphysics. The ontology of information is established according to the premises and mandate of the scientific metaphysics by inference to the best explanation, and in accordance with the idea that the primacy of physics (...)
  19. added 2019-01-13
    Information Before Information Theory: The Politics of Data Beyond the Perspective of Communication.Colin Koopman - forthcoming - New Media and Society.
    Scholarship on the politics of new media widely assumes that communication functions as a sufficient conceptual paradigm for critically assessing new media politics. This article argues that communication-centric analyses fail to engage the politics of information itself, limiting information only to its consequences for communication, and neglecting information as it reaches into our selves, lives, and actions beyond the confines of communication. Furthering recent new media historiography on the “information theory” of Shannon and Wiener, the article reveals both the primacy (...)
  20. added 2018-11-19
    Semantic Information Measure with Two Types of Probability for Falsification and Confirmation.Chenguang Lu - manuscript
    Logical Probability (LP) is strictly distinguished from Statistical Probability (SP). To measure semantic information or confirm hypotheses, we need to use the sampling distribution (a conditional SP function) to test or confirm the fuzzy truth function (a conditional LP function). The proposed Semantic Information Measure (SIM) is compatible with Shannon’s information theory and Fisher’s likelihood method. It can ensure that the smaller the LP of a predicate and the larger the truth value of the proposition, the more information there is. So the (...)
  21. added 2018-09-23
    What an Entangled Web We Weave: An Information-Centric Approach to Time-Evolving Socio-Technical Systems.Markus Luczak-Roesch, Kieron O’Hara, Jesse David Dinneen & Ramine Tinati - 2018 - Minds and Machines 28 (4):709-733.
    A new layer of complexity, constituted of networks of information token recurrence, has been identified in socio-technical systems such as the Wikipedia online community and the Zooniverse citizen science platform. The identification of this complexity reveals that our current understanding of the actual structure of those systems, and consequently the structure of the entire World Wide Web, is incomplete, which raises novel questions for data science research but also from the perspective of social epistemology. Here we establish the principled foundations (...)
  22. added 2018-07-31
    Information-Not-Thing: Further Problems with and Alternatives to the Belief That Information is Physical.Jesse David Dinneen & Christian Brauner - 2017 - Proceedings of 2017 CAIS-ACSI Conference.
    In this short paper, we show that a popular view in information science, information-as-thing, fails to account for a common example of information that seems physical. We then demonstrate how the distinction between types and tokens, recently used to analyse Shannon information, can account for this same example by viewing information as abstract, and we discuss existing definitions of information that are consistent with this approach.
  23. added 2018-07-31
    From Coincidence to Purposeful Flow? Properties of Transcendental Information Cascades.Markus Luczak-Roesch, Ramine Tinati, Max van Kleek & Nigel Shadbolt - 2015 - In International Conference on Advances in Social Networks Analysis and Mining (ASONAM) 2015.
    In this paper, we investigate a method for constructing cascades of information co-occurrence, which is suitable to trace emergent structures in information in scenarios where rich contextual features are unavailable. Our method relies only on the temporal order of content-sharing activities, and intrinsic properties of the shared content itself. We apply this method to analyse information dissemination patterns across the active online citizen science project Planet Hunters, a part of the Zooniverse platform. Our results lend insight into both structural and (...)
  24. added 2018-07-30
    When Resources Collide: Towards a Theory of Coincidence in Information Spaces.Markus Luczak-Roesch, Ramine Tinati & Nigel Shadbolt - 2015 - In WWW '15 Companion Proceedings of the 24th International Conference on World Wide Web. Florence, Metropolitan City of Florence, Italy: pp. 1137-1142.
    This paper is an attempt to lay out foundations for a general theory of coincidence in information spaces such as the World Wide Web, expanding on existing work on bursty structures in document streams and information cascades. We elaborate on the hypothesis that every resource that is published in an information space enters a temporary interaction with another resource once a unique explicit or implicit reference between the two is found. This thought is motivated by Erwin Schrödinger's notion of entanglement (...)
  25. added 2018-03-02
    An Informational Theory of Counterfactuals.Danilo Dantas - 2018 - Acta Analytica 33 (4):525-538.
    Backtracking counterfactuals are problem cases for the standard, similarity-based theories of counterfactuals (e.g., Lewis's). These theories usually need to employ extra assumptions to deal with those cases. Hiddleston (2005) proposes a causal theory of counterfactuals that, supposedly, deals well with backtracking. The main advantage of the causal theory is that it provides a unified account of backtracking and non-backtracking counterfactuals. In this paper, I present a backtracking counterfactual that is a problem case for Hiddleston’s account. Then I propose an (...)
  26. added 2018-02-12
    Toward an Algorithmic Metaphysics.Steve Petersen - 2013 - In David Dowe (ed.), Algorithmic Probability and Friends: Bayesian Prediction and Artificial Intelligence. Springer. pp. 306-317.
    There are writers in both metaphysics and algorithmic information theory (AIT) who seem to think that the latter could provide a formal theory of the former. This paper is intended as a step in that direction. It demonstrates how AIT might be used to define basic metaphysical notions such as *object* and *property* for a simple, idealized world. The extent to which these definitions capture intuitions about the metaphysics of the simple world, times the extent to which we think the (...)
  27. added 2018-01-21
    Information-Theoretic Philosophy of Mind.Jason Winning & William Bechtel - 2016 - In Luciano Floridi (ed.), The Routledge Handbook of Philosophy of Information. London and New York: Routledge. pp. 347-360.
  28. added 2018-01-18
    The Semantics Latent in Shannon Information.Alistair M. C. Isaac - 2019 - British Journal for the Philosophy of Science 70 (1):103-125.
    The lore is that standard information theory provides an analysis of information quantity, but not of information content. I argue this lore is incorrect, and there is an adequate informational semantics latent in standard theory. The roots of this notion of content can be traced to the secret parallel development of an information theory equivalent to Shannon’s by Turing at Bletchley Park, and it has been suggested independently in recent work by Skyrms and Bullinaria and Levy. This paper explicitly articulates (...)
  29. added 2018-01-18
    Intervening on the Causal Exclusion Problem for Integrated Information Theory.Matthew Baxendale & Garrett Mindt - 2018 - Minds and Machines 28 (2):331-351.
    In this paper, we examine the causal framework within which the integrated information theory (IIT) of consciousness makes its claims. We argue that, in its current formulation, IIT is threatened by the causal exclusion problem. Some proponents of IIT have attempted to thwart the causal exclusion problem by arguing that IIT has the resources to demonstrate genuine causal emergence at macro scales. In contrast, we argue that their proposed solution to the problem is damagingly circular as a result of inter-defining information and (...)
  30. added 2018-01-18
    Some Evidence Concerning the Genesis of Shannon’s Information Theory.Samuel W. Thomsen - 2009 - Studies in History and Philosophy of Science Part A 40 (1):81-91.
    A typescript by Claude Shannon, ‘Theorems on statistical sequences’, is examined to shed light on the development of information theory. In particular, it appears that Shannon was still working out the mathematical details of his theory in the spring of 1948, just before he published ‘A mathematical theory of communication’. This is contrasted with evidence from a declassified cryptography report that Shannon’s theory was intuitively worked out in its essentials by the time he filed the report in 1945. Previous interviews (...)
  31. added 2017-12-06
    A Quantitative-Informational Approach to Logical Consequence.Marcos Antonio Alves & Ítala M. Loffredo D'Otaviano - 2015 - In Jean-Yves Beziau (ed.), The Road to Universal Logic (Studies in Universal Logic). Switzerland: Springer International Publishing. pp. 105-24.
    In this work, we propose a definition of logical consequence based on the relation between the quantity of information present in a particular set of formulae and a particular formula. As a starting point, we use Shannon's quantitative notion of information, founded on the concepts of logarithmic function and probability value. We first consider some of the basic elements of an axiomatic probability theory, and then construct a probabilistic semantics for languages of classical propositional logic. We define the quantity of (...)
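The quantitative idea can be sketched directly: assign each formula the uniform probability of its models, take -log2 of that probability as its quantity of information, and observe that a valid consequence adds no information to its premises. A toy implementation over a two-atom propositional language (encoding formulas as Python predicates over valuations is my simplification, not the authors' semantics):

```python
import math
from itertools import product

ATOMS = ["p", "q"]

def models(formula):
    """Valuations (dicts atom -> bool) satisfying a formula given as a
    predicate over a valuation, under the uniform measure on valuations."""
    valuations = [dict(zip(ATOMS, bits))
                  for bits in product([False, True], repeat=len(ATOMS))]
    return [v for v in valuations if formula(v)]

def info(formula):
    """Quantity of information: -log2 of the formula's probability."""
    p = len(models(formula)) / (2 ** len(ATOMS))
    return -math.log2(p) if p > 0 else math.inf

p_only  = lambda v: v["p"]
p_and_q = lambda v: v["p"] and v["q"]

print(info(p_only))   # 1.0 bit: rules out half the valuations
print(info(p_and_q))  # 2.0 bits: rules out three quarters
# A logical consequence of the premises carries no extra information:
print(info(lambda v: p_and_q(v) and p_only(v)) == info(p_and_q))  # True
```

The stronger a formula (the fewer its models), the more information it carries, and conjoining a premise set with one of its consequences leaves the quantity of information unchanged, which is the intuition behind an information-based definition of consequence.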
  32. added 2017-10-28
    Secure Communication in the Twin Paradox.Juan Carlos Garcia-Escartin & Pedro Chamorro-Posada - 2015 - Foundations of Physics 45 (11):1433-1453.
    The amount of information that can be transmitted through a noisy channel is affected by relativistic effects. Under the presence of a fixed noise at the receiver, there appears an asymmetry between “slowly aging” and “fast aging” observers which can be used to have private information transmission. We discuss some models for users inside gravitational wells and in the twin paradox scenario.
  33. added 2017-09-26
    Information, Cosmology and Time.C. T. K. Chari - 1963 - Dialectica 17 (4):368-380.
  34. added 2017-06-20
    Patterns, Information, and Causation.Holly Andersen - 2017 - Journal of Philosophy 114 (11):592-622.
    This paper articulates an account of causation as a collection of information-theoretic relationships between patterns instantiated in the causal nexus. I draw on Dennett’s account of real patterns to characterize potential causal relata as patterns with specific identification criteria and noise tolerance levels, and actual causal relata as those patterns instantiated at some spatiotemporal location in the rich causal nexus as originally developed by Salmon. I develop a representation framework using phase space to precisely characterize causal relata, including their degree (...)
  35. added 2017-05-06
    A Simplicity Criterion for Physical Computation.Tyler Millhouse - 2019 - British Journal for the Philosophy of Science 70 (1):153-178.
    The aim of this paper is to offer a formal criterion for physical computation that allows us to objectively distinguish between competing computational interpretations of a physical system. The criterion construes a computational interpretation as an ordered pair of functions mapping (1) states of a physical system to states of an abstract machine, and (2) inputs to this machine to interventions in this physical system. This interpretation must ensure that counterfactuals true of the abstract machine have appropriate counterparts which are (...)
  36. added 2017-03-03
    Information Flow in the Brain: Ordered Sequences of Metastable States.Andrew A. Fingelkurts & Alexander A. Fingelkurts - 2017 - Information 8 (1):22.
    In this brief overview paper, we analyse information flow in the brain. Although Shannon’s information concept, in its pure algebraic form, has made a number of valuable contributions to neuroscience, information dynamics within the brain is not fully captured by its classical description. These additional dynamics consist of self-organisation, interplay of stability/instability, timing of sequential processing, coordination of multiple sequential streams, circular causality between bottom-up and top-down operations, and information creation. Importantly, all of these processes are dynamic, hierarchically nested and (...)
  37. added 2017-02-13
    Prediction, Complexity, and Randomness.Giuseppe Trautteur - 1973 - In Radu J. Bogdan & Ilkka Niiniluoto (eds.), Logic, Language, and Probability. Boston: D. Reidel Pub. Co.. pp. 124--128.
  38. added 2017-02-12
    Reviewed Work(s): Some Theorems on the Algorithmic Approach to Probability Theory and Information Theory (1971 Dissertation Directed by A. N. Kolmogorov), Annals of Pure and Applied Logic, Vol. 162, by L. A. Levin. [REVIEW] Jan Reimann - Association for Symbolic Logic: The Bulletin of Symbolic Logic.
    Review by Jan Reimann. The Bulletin of Symbolic Logic 19 (3): 397-399, September 2013.
  39. added 2017-02-12
    Completeness, Compactness, Effective Dimensions.Stephen Binns - 2013 - Mathematical Logic Quarterly 59 (3):206-218.
  40. added 2017-02-12
    Kolmogorov Complexity and Noncomputability.George Davie - 2002 - Mathematical Logic Quarterly 48 (4):574-581.
    We use a method suggested by Kolmogorov complexity to examine some relations between Kolmogorov complexity and noncomputability. In particular we show that the method consistently gives us more information than conventional ways of demonstrating noncomputability. Also, many sets which are awkward to embed into the halting problem are easily shown noncomputable. We also prove a gap theorem for outputting consecutive integers and find, for a given length n, a statement of length n with maximal proof length.
  41. added 2017-02-12
    Recursive Events in Random Sequences.George Davie - 2001 - Archive for Mathematical Logic 40 (8):629-638.
    Let ω be a Kolmogorov–Chaitin random sequence, with ω_{1:n} denoting the first n digits of ω. Let P be a recursive predicate defined on all finite binary strings such that the Lebesgue measure of the set {ω | ∃n P(ω_{1:n})} is a computable real α. Roughly, P holds with computable probability for a random infinite sequence. Then there is an algorithm which, on input indices for any such P and α, finds an n such that P holds within the first (...)
  42. added 2017-02-11
    Kolmogorov Complexity Estimates for Detection of Viruses in Biologically Inspired Security Systems: A Comparison with Traditional Approaches.Sanjay Goel & Stephen F. Bush - 2003 - Complexity 9 (2):54-73.
  43. added 2017-02-11
    Complexity and Information by Joseph Traub and A. G. Werschulz.Edward W. Packel - 1999 - Complexity 4 (5):39-40.
  44. added 2017-02-10
    The Relationship Between Task Complexity and Information Search: The Role of Self-Efficacy.J. Hu, B. A. Huhmann & M. R. Hyman - 2007 - Psychology and Marketing 24 (3):253--270.
  45. added 2017-02-10
    A Note on Deterministic and Algorithmic Behavior.Pavel Materna - 1974 - Theory and Decision 4 (3-4):369-371.
  47. added 2017-02-01
    Information Functions with Applications.Krzysztof Szymanek - 1990 - Studia Logica 49 (3):387 - 400.
    In the first place, we present the definition and fundamental properties of information functions — functions which establish a correspondence between sets of formulas and the information contained in them. The intuitions for the notion of information stem from the conception of Bar-Hillel and Carnap in [3]. In § 2 we will briefly show how those notions can be applied to the logic of theory change. In § 3 we will use them for proving two theorems about the lattices of (...)
  48. added 2017-01-30
    Prefix and Plain Kolmogorov Complexity Characterizations of 2-Randomness: Simple Proofs.Bruno Bauwens - 2015 - Archive for Mathematical Logic 54 (5-6):615-629.
  49. added 2017-01-30
    Algorithmic Randomness Over General Spaces.Kenshi Miyabe - 2014 - Mathematical Logic Quarterly 60 (3):184-204.
  50. added 2017-01-28
    Aristotle and Information Theory: A Comparison of the Influence of Causal Assumptions on Two Theories of Communication.Lawrence William Rosenfield - 1971 - Mouton.