About this topic
Summary: The main focus of this category is Shannon's mathematical theory of information and its broader philosophical uses. This includes, in the first place, cybernetics, signalling theories, the sender-receiver communication model, and Kolmogorov complexity. More general uses of information theory that overlap with other domains of the philosophy of information may also belong to this category. Examples include different philosophical conceptions of information (semantic conceptions, semiotic approaches), as well as applications in specific domains of theoretical philosophy, such as the philosophy of science and the philosophy of mind.
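The quantitative core of Shannon's theory, which many of the entries below engage with, is the entropy of a probability distribution. As a minimal illustration (not drawn from any of the listed works), entropy measures the average information, in bits, carried by an outcome:

```python
from math import log2

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.

    Terms with p == 0 are skipped, following the convention
    that 0 * log2(0) = 0.
    """
    return -sum(p * log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: one bit per toss.
print(shannon_entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(shannon_entropy([0.9, 0.1]))   # ≈ 0.47

# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))        # 0.0
```

Kolmogorov complexity, also central to this category, takes a different route to the same intuition: it measures the information in an individual object by the length of the shortest program that produces it, rather than by averaging over a distribution.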
Related categories

195 found
1 — 50 / 195
  1. added 2019-01-20
    A Scientific Metaphysical Naturalisation of Information.Bruce Long - 2018 - Dissertation,
    The objective of this thesis is to present a naturalised metaphysics of information, or to naturalise information, by way of deploying a scientific metaphysics according to which contingency is privileged and a priori conceptual analysis is excluded (or at least greatly diminished) in favour of contingent and defeasible metaphysics. The ontology of information is established according to the premises and mandate of the scientific metaphysics by inference to the best explanation, and in accordance with the idea that the primacy of physics (...)
  2. added 2019-01-13
    Information Before Information Theory: The Politics of Data Beyond the Perspective of Communication.Colin Koopman - forthcoming - New Media and Society.
    Scholarship on the politics of new media widely assumes that communication functions as a sufficient conceptual paradigm for critically assessing new media politics. This article argues that communication-centric analyses fail to engage the politics of information itself, limiting information only to its consequences for communication, and neglecting information as it reaches into our selves, lives, and actions beyond the confines of communication. Furthering recent new media historiography on the “information theory” of Shannon and Wiener, the article reveals both the primacy (...)
  3. added 2019-01-08
    Argument Williamsona przeciwko KK-tezie.Grzegorz Lisowski - 2017 - Diametros 52:81-95.
    The KK-principle can be defined as follows: “For any subject x : if x knows that p, then she is always in a position to know that she knows that p ”. This principle has been widely accepted in the history of philosophy. However, in contemporary epistemology it is considered controversial and regarded as an important part of the debate concerning the nature of knowledge. One of the arguments against the KK-principle has been presented by Timothy Williamson and it involves (...)
  4. added 2018-11-19
    Semantic Information Measure with Two Types of Probability for Falsification and Confirmation.Lu Chenguang - manuscript
    Logical Probability (LP) is strictly distinguished from Statistical Probability (SP). To measure semantic information or confirm hypotheses, we need to use sampling distribution (conditional SP function) to test or confirm fuzzy truth function (conditional LP function). The Semantic Information Measure (SIM) proposed is compatible with Shannon’s information theory and Fisher’s likelihood method. It can ensure that the less the LP of a predicate is and the larger the true value of the proposition is, the more information there is. So the (...)
  5. added 2018-09-23
    What an Entangled Web We Weave: An Information-Centric Approach to Time-Evolving Socio-Technical Systems.Markus Luczak-Roesch, Kieron O’Hara, Jesse David Dinneen & Ramine Tinati - 2018 - Minds and Machines 28 (4):709-733.
    A new layer of complexity, constituted of networks of information token recurrence, has been identified in socio-technical systems such as the Wikipedia online community and the Zooniverse citizen science platform. The identification of this complexity reveals that our current understanding of the actual structure of those systems, and consequently the structure of the entire World Wide Web, is incomplete, which raises novel questions for data science research but also from the perspective of social epistemology. Here we establish the principled foundations (...)
  6. added 2018-07-31
    Information-Not-Thing: Further Problems with and Alternatives to the Belief That Information is Physical.Jesse David Dinneen & Christian Brauner - 2017 - Proceedings of 2017 CAIS-ACSI Conference.
    In this short paper, we show that a popular view in information science, information-as-thing, fails to account for a common example of information that seems physical. We then demonstrate how the distinction between types and tokens, recently used to analyse Shannon information, can account for this same example by viewing information as abstract, and discuss existing definitions of information that are consistent with this approach.
  7. added 2018-07-31
    From Coincidence to Purposeful Flow? Properties of Transcendental Information Cascades.Markus Luczak-Roesch, Ramine Tinati, Max van Kleek & Nigel Shadbolt - 2015 - In International Conference on Advances in Social Networks Analysis and Mining (ASONAM) 2015.
    In this paper, we investigate a method for constructing cascades of information co-occurrence, which is suitable to trace emergent structures in information in scenarios where rich contextual features are unavailable. Our method relies only on the temporal order of content-sharing activities, and intrinsic properties of the shared content itself. We apply this method to analyse information dissemination patterns across the active online citizen science project Planet Hunters, a part of the Zooniverse platform. Our results lend insight into both structural and (...)
  8. added 2018-07-30
    When Resources Collide: Towards a Theory of Coincidence in Information Spaces.Markus Luczak-Roesch, Ramine Tinati & Nigel Shadbolt - 2015 - In WWW '15 Companion Proceedings of the 24th International Conference on World Wide Web. Florence, Metropolitan City of Florence, Italy: pp. 1137-1142.
    This paper is an attempt to lay out foundations for a general theory of coincidence in information spaces such as the World Wide Web, expanding on existing work on bursty structures in document streams and information cascades. We elaborate on the hypothesis that every resource that is published in an information space enters a temporary interaction with another resource once a unique explicit or implicit reference between the two is found. This thought is motivated by Erwin Schrödinger's notion of entanglement (...)
  9. added 2018-03-02
    An Informational Theory of Counterfactuals.Danilo Fraga Dantas - 2018 - Acta Analytica 33 (4):525-538.
    Backtracking counterfactuals are problem cases for the standard, similarity-based theories of counterfactuals, e.g., Lewis's. These theories usually need to employ extra assumptions to deal with those cases. Hiddleston (2005, pp. 632–657) proposes a causal theory of counterfactuals that, supposedly, deals well with backtracking. The main advantage of the causal theory is that it provides a unified account for backtracking and non-backtracking counterfactuals. In this paper, I present a backtracking counterfactual that is a problem case for Hiddleston's account. Then I propose an (...)
  10. added 2018-02-12
    Toward an Algorithmic Metaphysics.Steve Petersen - 2013 - In David Dowe (ed.), Algorithmic Probability and Friends: Bayesian Prediction and Artificial Intelligence. Springer. pp. 306-317.
    There are writers in both metaphysics and algorithmic information theory (AIT) who seem to think that the latter could provide a formal theory of the former. This paper is intended as a step in that direction. It demonstrates how AIT might be used to define basic metaphysical notions such as *object* and *property* for a simple, idealized world. The extent to which these definitions capture intuitions about the metaphysics of the simple world, times the extent to which we think the (...)
  11. added 2018-01-21
    Information-Theoretic Philosophy of Mind.Jason Winning & William Bechtel - 2016 - In Luciano Floridi (ed.), The Routledge Handbook of Philosophy of Information. London and New York: Routledge. pp. 347-360.
  12. added 2018-01-18
    The Semantics Latent in Shannon Information.Alistair M. C. Isaac - forthcoming - British Journal for the Philosophy of Science:axx029.
    The lore is that standard information theory provides an analysis of information quantity, but not of information content. I argue this lore is incorrect, and there is an adequate informational semantics latent in standard theory. The roots of this notion of content can be traced to the secret parallel development of an information theory equivalent to Shannon’s by Turing at Bletchley Park, and it has been suggested independently in recent work by Skyrms and Bullinaria and Levy. This paper explicitly articulates (...)
  13. added 2018-01-18
    Intervening on the Causal Exclusion Problem for Integrated Information Theory.Matthew Baxendale & Garrett Mindt - 2018 - Minds and Machines 28 (2):331-351.
    In this paper, we examine the causal framework within which integrated information theory of consciousness makes it claims. We argue that, in its current formulation, IIT is threatened by the causal exclusion problem. Some proponents of IIT have attempted to thwart the causal exclusion problem by arguing that IIT has the resources to demonstrate genuine causal emergence at macro scales. In contrast, we argue that their proposed solution to the problem is damagingly circular as a result of inter-defining information and (...)
  14. added 2018-01-18
    Some Evidence Concerning the Genesis of Shannon’s Information Theory.Samuel W. Thomsen - 2009 - Studies in History and Philosophy of Science Part A 40 (1):81-91.
    A typescript by Claude Shannon, ‘Theorems on statistical sequences’, is examined to shed light on the development of information theory. In particular, it appears that Shannon was still working out the mathematical details of his theory in the spring of 1948, just before he published ‘A mathematical theory of communication’. This is contrasted with evidence from a declassified cryptography report that Shannon’s theory was intuitively worked out in its essentials by the time he filed the report in 1945. Previous interviews (...)
  15. added 2017-12-06
    A Quantitative-Informational Approach to Logical Consequence.Marcos Antonio Alves & Ítala M. Loffredo D'Otaviano - 2015 - In Jean-Yves Beziau (ed.), The Road to Universal Logic (Studies in Universal Logic). Switzerland: Springer International Publishing. pp. 105-24.
    In this work, we propose a definition of logical consequence based on the relation between the quantity of information present in a particular set of formulae and a particular formula. As a starting point, we use Shannon's quantitative notion of information, founded on the concepts of logarithmic function and probability value. We first consider some of the basic elements of an axiomatic probability theory, and then construct a probabilistic semantics for languages of classical propositional logic. We define the quantity of (...)
  16. added 2017-10-28
    Secure Communication in the Twin Paradox.Juan Carlos Garcia-Escartin & Pedro Chamorro-Posada - 2015 - Foundations of Physics 45 (11):1433-1453.
    The amount of information that can be transmitted through a noisy channel is affected by relativistic effects. Under the presence of a fixed noise at the receiver, there appears an asymmetry between “slowly aging” and “fast aging” observers which can be used to have private information transmission. We discuss some models for users inside gravitational wells and in the twin paradox scenario.
  17. added 2017-09-26
    Information, Cosmology and Time.C. T. K. Chari - 1963 - Dialectica 17 (4):368-380.
  18. added 2017-06-20
    Patterns, Information, and Causation.Holly Andersen - 2017 - Journal of Philosophy 114 (11):592-622.
    This paper articulates an account of causation as a collection of information-theoretic relationships between patterns instantiated in the causal nexus. I draw on Dennett’s account of real patterns to characterize potential causal relata as patterns with specific identification criteria and noise tolerance levels, and actual causal relata as those patterns instantiated at some spatiotemporal location in the rich causal nexus as originally developed by Salmon. I develop a representation framework using phase space to precisely characterize causal relata, including their degree (...)
  19. added 2017-05-06
    A Simplicity Criterion for Physical Computation.Tyler Millhouse - forthcoming - British Journal for the Philosophy of Science:axx046.
    The aim of this paper is to offer a formal criterion for physical computation that allows us to objectively distinguish between competing computational interpretations of a physical system. The criterion construes a computational interpretation as an ordered pair of functions mapping (1) states of a physical system to states of an abstract machine, and (2) inputs to this machine to interventions in this physical system. This interpretation must ensure that counterfactuals true of the abstract machine have appropriate counterparts which are (...)
  20. added 2017-03-03
    Information Flow in the Brain: Ordered Sequences of Metastable States.Andrew A. Fingelkurts & Alexander A. Fingelkurts - 2017 - Information 8 (1):22.
    In this brief overview paper, we analyse information flow in the brain. Although Shannon’s information concept, in its pure algebraic form, has made a number of valuable contributions to neuroscience, information dynamics within the brain is not fully captured by its classical description. These additional dynamics consist of self-organisation, interplay of stability/instability, timing of sequential processing, coordination of multiple sequential streams, circular causality between bottom-up and top-down operations, and information creation. Importantly, all of these processes are dynamic, hierarchically nested and (...)
  21. added 2017-03-02
    Information and Inaccuracy.William Roche & Tomoji Shogenji - 2018 - British Journal for the Philosophy of Science 69 (2):577-604.
    This article proposes a new interpretation of mutual information (MI). We examine three extant interpretations of MI: by reduction in doubt, by reduction in uncertainty, and by divergence. We argue that the first two are inconsistent with the epistemic value of information (EVI) assumed in many applications of MI: the greater the amount of information we acquire, the better our epistemic position, other things being equal. The third interpretation is consistent with EVI, but it is faced with the problem of (...)
  22. added 2017-02-13
    Prediction, Complexity, and Randomness.Giuseppe Trautteur - 1973 - In Radu J. Bogdan & Ilkka Niiniluoto (eds.), Logic, Language, and Probability. Boston: D. Reidel Pub. Co.. pp. 124--128.
  23. added 2017-02-12
    Reviewed Work(s): Some Theorems on the Algorithmic Approach to Probability Theory and Information Theory (1971 Dissertation Directed by A. N. Kolmogorov). Annals of Pure and Applied Logic, Vol. 162 by L. A. Levin. [REVIEW]Jan Reimann - forthcoming - Association for Symbolic Logic: The Bulletin of Symbolic Logic.
    Review by: Jan Reimann The Bulletin of Symbolic Logic, Volume 19, Issue 3, Page 397-399, September 2013.
  24. added 2017-02-12
    Completeness, Compactness, Effective Dimensions.Stephen Binns - 2013 - Mathematical Logic Quarterly 59 (3):206-218.
  25. added 2017-02-12
    Kolmogorov Complexity and Noncomputability.George Davie - 2002 - Mathematical Logic Quarterly 48 (4):574-581.
    We use a method suggested by Kolmogorov complexity to examine some relations between Kolmogorov complexity and noncomputability. In particular we show that the method consistently gives us more information than conventional ways of demonstrating noncomputability. Also, many sets which are awkward to embed into the halting problem are easily shown noncomputable. We also prove a gap-theorem for outputting consecutive integers and find, for a given length n, a statement of length n with maximal proof length.
  26. added 2017-02-12
    Recursive Events in Random Sequences.George Davie - 2001 - Archive for Mathematical Logic 40 (8):629-638.
    Let ω be a Kolmogorov–Chaitin random sequence with ω1: n denoting the first n digits of ω. Let P be a recursive predicate defined on all finite binary strings such that the Lebesgue measure of the set {ω|∃nP(ω1: n )} is a computable real α. Roughly, P holds with computable probability for a random infinite sequence. Then there is an algorithm which on input indices for any such P and α finds an n such that P holds within the first (...)
  27. added 2017-02-11
    Kolmogorov Complexity Estimates for Detection of Viruses in Biologically Inspired Security Systems: A Comparison with Traditional Approaches.Sanjay Goel & Stephen F. Bush - 2003 - Complexity 9 (2):54-73.
  28. added 2017-02-11
    Complexity and Information by Joseph Traub and A. G. Werschulz.Edward W. Packel - 1999 - Complexity 4 (5):39-40.
  29. added 2017-02-10
    The Relationship Between Task Complexity and Information Search: The Role of Self-Efficacy.J. Hu, B. A. Huhmann & M. R. Hyman - 2007 - Psychology and Marketing 24 (3):253--270.
  30. added 2017-02-10
    A Note on Deterministic and Algorithmic Behavior.Pavel Materna - 1974 - Theory and Decision 4 (3-4):369-371.
  31. added 2017-02-02
    Algorithmic Randomness in Empirical Data.W. J. - 2003 - Studies in History and Philosophy of Science Part A 34 (3):633-646.
    According to a traditional view, scientific laws and theories constitute algorithmic compressions of empirical data sets collected from observations and measurements. This article defends the thesis that, to the contrary, empirical data sets are algorithmically incompressible. The reason is that individual data points are determined partly by perturbations, or causal factors that cannot be reduced to any pattern. If empirical data sets are incompressible, then they exhibit maximal algorithmic complexity, maximal entropy and zero redundancy. They are therefore maximally efficient carriers (...)
  32. added 2017-02-01
    Information Functions with Applications.Krzysztof Szymanek - 1990 - Studia Logica 49 (3):387 - 400.
    In the first place, we present the definition and fundamental properties of information functions — functions which establish a correspondence between sets of formulas and the information contained in them. The intuitions for the notion of information stem from the conception of Bar-Hillel and Carnap in [3]. In § 2 we will briefly show how those notions can be applied to the logic of theory change. In § 3 we will use them for proving two theorems about the lattices of (...)
  33. added 2017-01-30
    Prefix and Plain Kolmogorov Complexity Characterizations of 2-Randomness: Simple Proofs.Bruno Bauwens - 2015 - Archive for Mathematical Logic 54 (5-6):615-629.
  34. added 2017-01-30
    Algorithmic Randomness Over General Spaces.Kenshi Miyabe - 2014 - Mathematical Logic Quarterly 60 (3):184-204.
  35. added 2017-01-28
    Aristotle and Information Theory a Comparison of the Influence of Causal Assumptions on Two Theories of Communication.Lawrence William Rosenfield - 1971 - Mouton.
  36. added 2017-01-26
    Reviewed Work(s): Some Theorems on the Algorithmic Approach to Probability Theory and Information Theory (1971 Dissertation Directed by A. N. Kolmogorov). Annals of Pure and Applied Logic, Vol. 162 by L. A. Levin. [REVIEW]Review by: Jan Reimann - 2013 - Bulletin of Symbolic Logic 19 (3):397-399.
  37. added 2017-01-26
    Quantum Observer, Information Theory and Kolmogorov Complexity.Alexei Grinbaum - 2013 - In Hanne Andersen, Dennis Dieks, Wenceslao González, Thomas Uebel & Gregory Wheeler (eds.), New Challenges to Philosophy of Science. Springer Verlag. pp. 59--72.
  38. added 2017-01-26
    230 Shannon Speed.Comandanta Esther - 2009 - In Mark Goodale (ed.), Human Rights: An Anthropological Reader. Wiley-Blackwell. pp. 10--229.
  39. added 2017-01-26
    Sensory Coding and Information Transmission.John Hertz & Stefano Panzeri - 2002 - In M. Arbib (ed.), The Handbook of Brain Theory and Neural Networks. MIT Press. pp. 1023--1026.
  40. added 2017-01-26
    The Program-Substitution in Algorithmic Logic and Algorithmic Logic with Non-Deterministic Programs.Andrzej Biela - 1984 - Bulletin of the Section of Logic 13 (2):69-72.
    This note presents a point of view upon the notions of program substitution, which are the tools for proving properties of programs of algorithmic logics [5], [3], being sufficiently strong and universal to comprise almost all previously introduced theories of programming, and the so-called extended algorithmic logic [1], [2] and algorithmic logic with non-deterministic programs [4]. It appears that the mentioned substitution rule allows us to examine more deeply algorithmic properties of terms, formulas and programs. Besides the problem of Post-completeness and (...)
  41. added 2017-01-25
    Referee’s Report on Leonid Levin’s Dissertation “Some Theorems on the Algorithmic Approach to Probability Theory and Information Theory”. [REVIEW]N. A. Shanin - 2010 - Annals of Pure and Applied Logic 162 (3):236.
  42. added 2017-01-25
    Some Theorems on the Algorithmic Approach to Probability Theory and Information Theory:(1971 Dissertation Directed by AN Kolmogorov).Leonid A. Levin - 2010 - Annals of Pure and Applied Logic 162 (3):224-235.
  43. added 2017-01-25
    Tous Shannoniens?Claude Baltz - 2007 - Hermes 48:87.
    For some twenty years now, C. E. Shannon's work seems to have fallen into relative neglect within the disciplinary field known in France as "Sciences de l'information et de la communication". This article tries to grasp the reasons for this, after recalling the earlier success of that work. It argues for an epistemological rereading of Shannon's famous scheme for measuring information. In this way the "number of bits", a term nearly incomprehensible from the standpoint of the human sciences, can be given (...)
  44. added 2017-01-25
    On Partial Randomness.Cristian S. Calude, Ludwig Staiger & Sebastiaan A. Terwijn - 2006 - Annals of Pure and Applied Logic 138 (1):20-30.
    If x = x₁x₂⋯ is a random sequence, then the sequence 0x₁0x₂⋯ is clearly not random; however, it seems to be "about half random". L. Staiger [Kolmogorov complexity and Hausdorff dimension, Inform. and Comput. 103 159–194 and A tight upper bound on Kolmogorov complexity and uniformly optimal prediction, Theory Comput. Syst. 31 215–229] and K. Tadaki [A generalisation of Chaitin's halting probability Ω and halting self-similar sets, Hokkaido Math. J. 31 219–253] have studied the degree of randomness of sequences or reals by measuring their "degree (...)
  45. added 2017-01-25
    Information Measures, Effective Complexity, and Total Information.Murray Gell-Mann & Seth Lloyd - 1996 - Complexity 2 (1):44-52.
  46. added 2017-01-24
    Algorithmic Randomness, Reverse Mathematics, and the Dominated Convergence Theorem.Jeremy Avigad, Edward T. Dean & Jason Rute - 2012 - Annals of Pure and Applied Logic 163 (12):1854-1864.
    We analyze the pointwise convergence of a sequence of computable elements of L1 in terms of algorithmic randomness. We consider two ways of expressing the dominated convergence theorem and show that, over the base theory RCA0, each is equivalent to the assertion that every Gδ subset of Cantor space with positive measure has an element. This last statement is, in turn, equivalent to weak weak Königʼs lemma relativized to the Turing jump of any set. It is also equivalent to the (...)
  47. added 2017-01-24
    Image Characterization and Classification by Physical Complexity.Hector Zenil, Jean‐Paul Delahaye & Cédric Gaucherel - 2012 - Complexity 17 (3):26-42.
  48. added 2017-01-24
    Kolmogorov Complexity and Characteristic Constants of Formal Theories of Arithmetic.Shingo Ibuka, Makoto Kikuchi & Hirotaka Kikyo - 2011 - Mathematical Logic Quarterly 57 (5):470-473.
    We investigate two constants cT and rT, introduced by Chaitin and Raatikainen respectively, defined for each recursively axiomatizable consistent theory T and universal Turing machine used to determine Kolmogorov complexity. Raatikainen argued that cT does not represent the complexity of T and found that for two theories S and T, one can always find a universal Turing machine such that [formula omitted]. We prove the following are equivalent: [formula omitted] for some universal Turing machine, [formula omitted] for some universal Turing (...)
  49. added 2017-01-24
    Kolmogorov Complexity and Set Theoretical Representations of Integers.Marie Ferbus-Zanda & Serge Grigorieff - 2006 - Mathematical Logic Quarterly 52 (4):375-403.
    We reconsider some classical natural semantics of integers in the perspective of Kolmogorov complexity. To each such semantics one can attach a simple representation of integers that we suitably effectivize in order to develop an associated Kolmogorov theory. Such effectivizations are particular instances of a general notion of “self-enumerated system” that we introduce in this paper. Our main result asserts that, with such effectivizations, Kolmogorov theory allows to quantitatively distinguish the underlying semantics. We characterize the families obtained by such effectivizations (...)
  50. added 2017-01-24
    Effective Fractal Dimensions.Jack H. Lutz - 2005 - Mathematical Logic Quarterly 51 (1):62-72.
    Classical fractal dimensions have recently been effectivized by characterizing them in terms of real-valued functions called gales, and imposing computability and complexity constraints on these gales. This paper surveys these developments and their applications in algorithmic information theory and computational complexity theory.