About this topic
Summary: The main focus of this category is Shannon's mathematical theory of information and its broader philosophical uses. This includes, in the first place, cybernetics, signalling theories, the sender-receiver communication model, and Kolmogorov complexity. More general uses of information theory that overlap with other domains of the philosophy of information may also belong to this category. Examples include different philosophical conceptions of information (semantic conceptions, semiotic approaches), as well as applications in specific domains of theoretical philosophy such as the philosophy of science and the philosophy of mind.
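Since Shannon's quantitative measure anchors most of the works listed below, a minimal sketch may help fix ideas. This is an illustrative Python snippet, not drawn from any listed work; the distributions are invented:

    from math import log2

    def shannon_entropy(probs):
        """Shannon entropy in bits: H = -sum(p * log2(p)) over a distribution."""
        return -sum(p * log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))  # a fair coin carries 1.0 bit per toss
    print(shannon_entropy([0.9, 0.1]))  # a biased coin carries only ~0.47 bits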
191 found (1–50 shown)
  1. Information.Pieter Adriaans - 2012 - Stanford Encyclopedia of Philosophy.
  2. A Critical Analysis of Floridi’s Theory of Semantic Information.Pieter Adriaans - 2010 - Knowledge, Technology & Policy 23 (1-2):41-56.
    In various publications over the past years, Floridi has developed a theory of semantic information as well-formed, meaningful, and truthful data. This theory is more or less orthogonal to the standard entropy-based notions of information known from physics, information theory, and computer science, which all define the amount of information in a certain system as a scalar value without any direct semantic implication. In this context the question arises what the exact relation between these various conceptions of information is and (...)
    7 citations
  3. A Quantitative-Informational Approach to Logical Consequence.Marcos Antonio Alves & Ítala M. Loffredo D'Otaviano - 2015 - In Jean-Yves Beziau (ed.), The Road to Universal Logic (Studies in Universal Logic). Switzerland: Springer International Publishing. pp. 105-24.
    In this work, we propose a definition of logical consequence based on the relation between the quantity of information present in a particular set of formulae and a particular formula. As a starting point, we use Shannon's quantitative notion of information, founded on the concepts of logarithmic function and probability value. We first consider some of the basic elements of an axiomatic probability theory, and then construct a probabilistic semantics for languages of classical propositional logic. We define the quantity of (...)
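    As a rough illustration of the kind of construction this abstract describes (a sketch, not the authors' formal system), one can take the probability of a formula to be the fraction of truth-value assignments satisfying it and measure its information as -log2 of that probability, so that logically stronger formulas carry more information:

        from itertools import product
        from math import log2

        def info(formula, variables):
            """Bits of information of a propositional formula under a uniform
            distribution over truth-value assignments (an illustrative choice)."""
            rows = list(product([False, True], repeat=len(variables)))
            sat = sum(formula(dict(zip(variables, row))) for row in rows)
            p = sat / len(rows)
            return -log2(p) if p > 0 else float("inf")

        print(info(lambda v: v["p"] or v["q"], ["p", "q"]))   # 3/4 satisfiable -> ~0.415 bits
        print(info(lambda v: v["p"] and v["q"], ["p", "q"]))  # 1/4 satisfiable -> 2.0 bits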
  4. Patterns, Information, and Causation.Holly Andersen - 2017 - Journal of Philosophy 114 (11):592-622.
    This paper articulates an account of causation as a collection of information-theoretic relationships between patterns instantiated in the causal nexus. I draw on Dennett’s account of real patterns to characterize potential causal relata as patterns with specific identification criteria and noise tolerance levels, and actual causal relata as those patterns instantiated at some spatiotemporal location in the rich causal nexus as originally developed by Salmon. I develop a representation framework using phase space to precisely characterize causal relata, including their degree (...)
  5. Information Theory: An Introductory Note.Rudolf Arnheim - 1959 - Journal of Aesthetics and Art Criticism 17 (4):501-503.
  6. Interaction, Information and Meaning.Robert Artigiani - 1997 - World Futures 50 (1):703-714.
  7. Algorithmic Randomness, Reverse Mathematics, and the Dominated Convergence Theorem.Jeremy Avigad, Edward T. Dean & Jason Rute - 2012 - Annals of Pure and Applied Logic 163 (12):1854-1864.
    We analyze the pointwise convergence of a sequence of computable elements of L1 in terms of algorithmic randomness. We consider two ways of expressing the dominated convergence theorem and show that, over the base theory RCA0, each is equivalent to the assertion that every Gδ subset of Cantor space with positive measure has an element. This last statement is, in turn, equivalent to weak weak Königʼs lemma relativized to the Turing jump of any set. It is also equivalent to the (...)
  8. Concepts, Introspection, and Phenomenal Consciousness: An Information-Theoretical Approach.Murat Aydede & Guven Guzeldere - 2005 - Noûs 39 (2):197-255.
    This essay is a sustained attempt to bring new light to some of the perennial problems in philosophy of mind surrounding phenomenal consciousness and introspection through developing an account of sensory and phenomenal concepts. Building on the information-theoretic framework of Dretske (1981), we present an informational psychosemantics as it applies to what we call sensory concepts, concepts that apply, roughly, to so-called secondary qualities of objects. We show that these concepts have a special informational character and semantic structure that closely (...)
    15 citations
  9. Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy.Massimiliano Badino - forthcoming - Isonomía. Revista de Teoría y Filosofía Del Derecho.
    The Kolmogorov-Sinai entropy is a fairly exotic mathematical concept which has recently aroused some interest on the philosophers’ part. The most salient trait of this concept is its working as a junction between such diverse ambits as statistical mechanics, information theory and algorithm theory. In this paper I argue that, in order to understand this very special feature of the Kolmogorov-Sinai entropy, it is essential to reconstruct its genealogy. Somewhat surprisingly, this story takes us as far back as the beginning of (...)
  10. An Application of Information Theory to the Problem of the Scientific Experiment.Massimiliano Badino - 2004 - Synthese 140 (3):355 - 389.
    There are two basic approaches to the problem of induction: the empirical one, which deems that the possibility of induction depends on how the world was made (and how it works), and the logical one, which considers the formation (and function) of language. The first is closer to being useful for induction, while the second is more rigorous and clearer. The purpose of this paper is to create an empirical approach to induction that contains the same formal exactitude as the logical approach. This requires: (a) that (...)
  11. Falsification and Future Performance.David Balduzzi - manuscript
    We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. We show these capacity measures count the number of hypotheses about a dataset that a learning algorithm falsifies when it finds the classifier in its repertoire minimizing empirical risk. It then follows that the future performance of predictors on unseen data is controlled in part by how many hypotheses the learner falsifies. As a corollary we show that empirical VC-entropy quantifies the message (...)
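    For readers unfamiliar with the second capacity measure named above, empirical Rademacher complexity for a small finite hypothesis class is easy to estimate by Monte Carlo: R̂ = E_σ[max_h (1/n) Σ_i σ_i h(x_i)]. The sketch below is illustrative only; the sample points and threshold classifiers are invented:

        import random

        def empirical_rademacher(predictions, trials=5000, seed=0):
            """predictions: one tuple of +/-1 outputs per hypothesis on the n sample points."""
            rng = random.Random(seed)
            n = len(predictions[0])
            total = 0.0
            for _ in range(trials):
                sigma = [rng.choice((-1, 1)) for _ in range(n)]
                total += max(sum(s * p for s, p in zip(sigma, h)) / n for h in predictions)
            return total / trials

        xs = [0.1, 0.3, 0.5, 0.7, 0.9]
        hypotheses = [tuple(1 if x > t else -1 for x in xs)
                      for t in (0.0, 0.2, 0.4, 0.6, 0.8, 1.0)]
        print(empirical_rademacher(hypotheses))  # richer classes score higher: they fit random signs better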
  12. Information, Learning and Falsification.David Balduzzi - manuscript
    There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it [1]. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled out [2]. The third, statistical learning (...)
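    The contrast between the first two approaches listed above can be made concrete. Kolmogorov complexity is uncomputable, but any real compressor gives an upper bound on it for a single string, whereas Shannon's measure attaches to events drawn from an ensemble. A sketch, with zlib standing in for an arbitrary compressor and the example strings invented:

        import os
        import zlib
        from math import log2

        def k_upper_bound(s: bytes) -> int:
            """Compressed length: a crude upper bound (up to a constant) on K(s)."""
            return len(zlib.compress(s, 9))

        print(k_upper_bound(b"ab" * 500))       # highly regular: compresses drastically
        print(k_upper_bound(os.urandom(1000)))  # incompressible: stays near 1000 bytes

        # Shannon information, by contrast, needs an ensemble: observing one of
        # N equiprobable alternatives rules out N - 1 of them, yielding log2(N) bits.
        print(log2(8))  # 3.0 bits for an 8-way choice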
  13. Tous Shannoniens?Claude Baltz - 2007 - Hermes 48:87.
    For some twenty years, C. E. Shannon's work seems to have fallen into relative neglect within the disciplinary field known in France as « Sciences de l'information et de la communication ». This article tries to grasp the reasons for this, after recalling its earlier success. It argues for an epistemological rereading of Shannon's famous schema for the measurement of information. In this way the « number of bits », a term nearly incomprehensible from the standpoint of the human sciences, can be given (...)
  14. An Examination of Information Theory.Yehoshua Bar-Hillel - 1955 - Philosophy of Science 22 (2):86-105.
  15. Algorithmic Randomness and Measures of Complexity.George Barmpalias - forthcoming - Association for Symbolic Logic: The Bulletin of Symbolic Logic.
    We survey recent advances on the interface between computability theory and algorithmic randomness, with special attention to measures of relative complexity. We focus on (weak) reducibilities that measure (a) the initial segment complexity of reals and (b) the power of reals to compress strings, when they are used as oracles. The results are put into context and several connections are made with various central issues in modern algorithmic randomness and computability.
    1 citation
  16. Biology Needs Information Theory.Gérard Battail - 2013 - Biosemiotics 6 (1):77-103.
    Communication is an important feature of the living world that mainstream biology fails to adequately deal with. Two main disciplines can be contemplated to fill this gap: semiotics and information theory. Semiotics is a philosophical discipline mainly concerned with meaning; its application to life has already given rise to biosemiotics. Information theory is a mathematical discipline, coming from engineering, which has literal communication as its purpose. Biosemiotics and information theory are thus concerned with distinct and complementary possible meanings of the word (...)
    4 citations
  17. Applying Semiotics and Information Theory to Biology: A Critical Comparison. [REVIEW]Gérard Battail - 2009 - Biosemiotics 2 (3):303-320.
    Since the beginning of the twentieth century, it has become increasingly evident that information, besides matter and energy, is a major actor in the life processes. Moreover, communication of information has been recognized as differentiating living things from inanimate ones, hence as specific to the life processes. Therefore the sciences of matter and energy, chemistry and physics, do not suffice to deal with life processes. Biology should also rely on sciences of information. A majority of biologists, however, did not change their (...)
    4 citations
  18. Prefix and Plain Kolmogorov Complexity Characterizations of 2-Randomness: Simple Proofs.Bruno Bauwens - 2015 - Archive for Mathematical Logic 54 (5-6):615-629.
  19. Intervening on the Causal Exclusion Problem for Integrated Information Theory.Matthew Baxendale & Garrett Mindt - 2018 - Minds and Machines 28 (2):331-351.
    In this paper, we examine the causal framework within which the integrated information theory (IIT) of consciousness makes its claims. We argue that, in its current formulation, IIT is threatened by the causal exclusion problem. Some proponents of IIT have attempted to thwart the causal exclusion problem by arguing that IIT has the resources to demonstrate genuine causal emergence at macro scales. In contrast, we argue that their proposed solution to the problem is damagingly circular as a result of inter-defining information and (...)
  20. Kolmogorov Complexity for Possibly Infinite Computations.Verónica Becher & Santiago Figueira - 2005 - Journal of Logic, Language and Information 14 (2):133-148.
    In this paper we study the Kolmogorov complexity for non-effective computations, that is, either halting or non-halting computations on Turing machines. This complexity function is defined as the length of the shortest input that produces a desired output via a possibly non-halting computation. Clearly this function gives a lower bound on the classical Kolmogorov complexity. In particular, if the machine is allowed to overwrite its output, this complexity coincides with the classical Kolmogorov complexity for halting computations relative to the first (...)
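    The definition in this abstract, complexity as the length of the shortest input producing a desired output, can be mimicked for a toy machine small enough to search exhaustively. The run-length interpreter U below is an invented stand-in for a universal machine, so the resulting numbers are machine-relative:

        from itertools import product

        def U(program: str) -> str:
            """Toy machine: interpret 'a5b' as 'aaaaab'; digits repeat the preceding letter."""
            out, i = "", 0
            while i < len(program):
                ch, i = program[i], i + 1
                reps = ""
                while i < len(program) and program[i].isdigit():
                    reps, i = reps + program[i], i + 1
                out += ch * (int(reps) if reps else 1)
            return out

        def toy_complexity(target: str, alphabet="ab0123456789", max_len=5):
            """Length of the shortest program making U output target, by brute force."""
            for n in range(1, max_len + 1):
                for prog in product(alphabet, repeat=n):
                    if U("".join(prog)) == target:
                        return n
            return None

        print(toy_complexity("aaaaab"))  # 3: the program 'a5b' is shorter than its output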
  21. Empirical State Determination of Entangled Two-Level Systems and Its Relation to Information Theory.Y. Ben-Aryeh, A. Mann & B. C. Sanders - 1999 - Foundations of Physics 29 (12):1963-1975.
    Theoretical methods for empirical state determination of entangled two-level systems are analyzed in relation to information theory. We show that hidden variable theories would lead to a Shannon index of correlation between the entangled subsystems which is larger than that predicted by quantum mechanics. Canonical representations which have maximal correlations are treated by the use of Schmidt and Hilbert-Schmidt decomposition of the entangled states, including especially the Bohm singlet state and the GHZ entangled states. We show that quantum mechanics does (...)
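    Classically, the "Shannon index of correlation" invoked above is the mutual information I(X;Y) = H(X) + H(Y) - H(X,Y); the quantum case replaces these with entropies of density matrices, but the classical computation is a useful reference point. The joint distributions below are invented:

        from math import log2

        def H(probs):
            """Shannon entropy in bits of a collection of probabilities."""
            return -sum(p * log2(p) for p in probs if p > 0)

        def mutual_information(joint):
            """joint: dict mapping (x, y) outcome pairs to probabilities summing to 1."""
            px, py = {}, {}
            for (x, y), p in joint.items():
                px[x] = px.get(x, 0.0) + p
                py[y] = py.get(y, 0.0) + p
            return H(px.values()) + H(py.values()) - H(joint.values())

        print(mutual_information({(0, 0): 0.5, (1, 1): 0.5}))  # perfectly correlated: 1.0 bit
        print(mutual_information({(x, y): 0.25 for x in (0, 1) for y in (0, 1)}))  # independent: 0.0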
  22. The Transmission Sense of Information.Carl T. Bergstrom & Martin Rosvall - 2011 - Biology and Philosophy 26 (2):159-176.
    Biologists rely heavily on the language of information, coding, and transmission that is commonplace in the field of information theory developed by Claude Shannon, but there is open debate about whether such language is anything more than facile metaphor. Philosophers of biology have argued that when biologists talk about information in genes and in evolution, they are not talking about the sort of information that Shannon’s theory addresses. First, philosophers have suggested that Shannon’s theory is only useful for developing a (...)
    25 citations
  23. The Program-Substitution in Algorithmic Logic and Algorithmic Logic with Non-Deterministic Programs.Andrzej Biela - 1984 - Bulletin of the Section of Logic 13 (2):69-72.
    This note presents a point of view on the notions of program-substitution, which are tools for proving properties of programs of algorithmic logics [5], [3] (sufficiently strong and universal to comprise almost all previously introduced theories of programming), of the so-called extended algorithmic logic [1], [2], and of algorithmic logic with non-deterministic programs [4]. It appears that the mentioned substitution rule allows us to examine more deeply algorithmic properties of terms, formulas and programs. Besides the problem of Post-completeness and (...)
  24. Completeness, Compactness, Effective Dimensions.Stephen Binns - 2013 - Mathematical Logic Quarterly 59 (3):206-218.
  25. Ecosemiotics and Cybersemiotics.Soren Brier - 2001 - Sign Systems Studies 29 (1):107-119.
    The article develops a suggestion of how cybersemiotics is pertinent to ecosemiotics. Cybersemiotics uses Luhmann's triadic view of autopoietic systems (biological, psychological, and socio-communicative autopoiesis) and adopts his approach to communication within a biosemiotic framework. The following levels of exosemiosis and signification can be identified under the consideration of nonintentional signs, cybernetics, and information theory: (1) the socio-communicative level of self-conscious signification and language games; (2) the instinctual and species-specific level of sign stimuli signifying through innate release response mechanism (...)
  26. Science and Information Theory.Léon Brillouin - 1956 - Dover Publications.
    A classic source for understanding the connections between information theory and physics, this text was written by one of the giants of 20th-century physics and is appropriate for upper-level undergraduates and graduate students. Topics include the principles of coding, coding problems and solutions, the analysis of signals, a summary of thermodynamics, thermal agitation and Brownian motion, and thermal noise in an electric circuit. A discussion of the negentropy principle of information introduces the author's renowned examination of Maxwell's demon. Concluding chapters (...)
    79 citations
  27. On a Supposed Conceptual Inadequacy of the Shannon Information in Quantum Mechanics.G. C. - 2003 - Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics 34 (3):441-468.
    Recently, Brukner and Zeilinger (Phys. Rev. Lett. 83(17) (1999) 3354) have claimed that the Shannon information is not well defined as a measure of information in quantum mechanics, adducing arguments that seek to show that it is inextricably tied to classical notions of measurement. It is shown here that these arguments do not succeed: the Shannon information does not have problematic ties to classical concepts. In a further argument, Brukner and Zeilinger compare the Shannon information unfavourably to their preferred information (...)
  28. The Deluge of Spurious Correlations in Big Data.Cristian S. Calude & Giuseppe Longo - 2017 - Foundations of Science 22 (3):595-612.
    Very large databases are a major opportunity for science and data analytics is a remarkable new field of investigation in computer science. The effectiveness of these tools is used to support a “philosophy” against the scientific method as developed throughout history. According to this view, computer-discovered correlations should replace understanding and guide prediction and action. Consequently, there will be no need to give scientific meaning to phenomena, by proposing, say, causal relations, since regularities in very large databases are enough: “with (...)
    3 citations
  29. On Partial Randomness.Cristian S. Calude, Ludwig Staiger & Sebastiaan A. Terwijn - 2006 - Annals of Pure and Applied Logic 138 (1):20-30.
    If x = x1x2…xn… is a random sequence, then the sequence y = 0x10x2…0xn… is clearly not random; however, y seems to be “about half random”. L. Staiger [Kolmogorov complexity and Hausdorff dimension, Inform. and Comput. 103 159–194 and A tight upper bound on Kolmogorov complexity and uniformly optimal prediction, Theory Comput. Syst. 31 215–229] and K. Tadaki [A generalisation of Chaitin’s halting probability Ω and halting self-similar sets, Hokkaido Math. J. 31 219–253] have studied the degree of randomness of sequences or reals by measuring their “degree (...)
    6 citations
  30. Revisiting the Relation Between Species Diversity and Information Theory.Julio A. Camargo - 2008 - Acta Biotheoretica 56 (4):275-283.
    The Shannon information function (H) has been extensively used in ecology as a statistic of species diversity. Yet, the use of the Shannon diversity index has also been criticized, mainly because of its ambiguous ecological interpretation and because of its relatively great sensitivity to the relative abundances of species in the community. In my opinion, the major shortcoming of the traditional perspective (on the possible relation of species diversity with information theory) is that species need an external receiver (the scientist (...)
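    For concreteness, the diversity statistic H discussed above is the Shannon function applied to relative abundances, H = -Σ p_i ln p_i. The abundance counts below are invented; the comparison illustrates the sensitivity to relative abundances that the abstract mentions:

        from math import log

        def shannon_diversity(counts):
            """H = -sum(p * ln p) over relative abundances p = count / total."""
            total = sum(counts)
            return -sum((c / total) * log(c / total) for c in counts if c > 0)

        print(shannon_diversity([25, 25, 25, 25]))  # ~1.386 = ln(4), the maximum for four species
        print(shannon_diversity([85, 5, 5, 5]))     # ~0.59: one dominant species lowers H sharply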
  31. A New Version of Algorithmic Information Theory.G. J. Chaitin - 1996 - Complexity 1 (4):55-59.
  32. How to Run Algorithmic Information Theory on a Computer.G. J. Chaitin - unknown
    Hi everybody! It's a great pleasure for me to be back here at the new, improved Santa Fe Institute in this spectacular location. I guess this is my fourth visit and it's always very stimulating, so I'm always very happy to visit you guys. I'd like to tell you what I've been up to lately. First of all, let me say what algorithmic information theory is good for, before telling you about the new version of it I've got.
  33. How to Run Algorithmic Information Theory on a Computer: Studying the Limits of Mathematical Reasoning.Gregory J. Chaitin - 1996 - Complexity 2 (1):15-21.
  34. Information, Randomness & Incompleteness: Papers on Algorithmic Information Theory.Gregory J. Chaitin - 1990 - World Scientific: Singapore.
    1 citation
  35. Information, Cosmology and Time.C. T. K. Chari - 1963 - Dialectica 17 (4):368-380.
  36. Time Reversal, Information Theory, and "World-Geometry".C. T. K. Chari - 1963 - Journal of Philosophy 60 (20):579-583.
  37. An Analysis of Information Visualisation.Min Chen & Luciano Floridi - 2013 - Synthese 190 (16):3421-3438.
    Philosophers have relied on visual metaphors to analyse ideas and explain their theories at least since Plato. Descartes is famous for his system of axes, and Wittgenstein for his first design of truth table diagrams. Today, visualisation is a form of ‘computer-aided seeing’ information in data. Hence, information is the fundamental ‘currency’ exchanged through a visualisation pipeline. In this article, we examine the types of information that may occur at different stages of a general visualisation pipeline. We do so from (...)
  38. The Subtleties of Entanglement and its Role in Quantum Information Theory.Rob Clifton - 2001 - Proceedings of the Philosophy of Science Association 2002 (3):S150-S167.
    My aim in this paper is a modest one. I do not have any particular thesis to advance about the nature of entanglement, nor can I claim novelty for any of the material I shall discuss. My aim is simply to raise some questions about entanglement that spring naturally from certain developments in quantum information theory and are, I believe, worthy of serious consideration by philosophers of science. The main topics I discuss are different manifestations of quantum nonlocality, entanglement-assisted communication, (...)
    2 citations
  39. Abstraction, Law, and Freedom in Computer Science.Timothy Colburn & Gary Shute - 2010 - Metaphilosophy 41 (3):345-364.
    Laws of computer science are prescriptive in nature but can have descriptive analogs in the physical sciences. Here, we describe a law of conservation of information in network programming, and various laws of computational motion (invariants) for programming in general, along with their pedagogical utility. Invariants specify constraints on objects in abstract computational worlds, so we describe language and data abstraction employed by software developers and compare them to Floridi's concept of levels of abstraction. We also consider Floridi's structural (...)
    1 citation
  40. Information Theory as a General Language for Functional Systems.John Collier - unknown
    Function refers to a broad family of concepts of varying abstractness and range of application, from a many-one mathematical relation of great generality to, for example, highly specialized roles of designed elements in complex machines such as degaussing in a television set, or contributory processes to control mechanisms in complex metabolic pathways, such as the inhibitory function of the appropriate part of the lac-operon on the production of lactase through its action on the genome in the absence of lactose. We (...)
  41. The Informed Neuron: Issues in the Use of Information Theory in the Behavioral Sciences. [REVIEW]Jeff Coulter - 1995 - Minds and Machines 5 (4):583-96.
    The concept of “information” is virtually ubiquitous in contemporary cognitive science. It is claimed to be “processed” (in cognitivist theories of perception and comprehension), “stored” (in cognitivist theories of memory and recognition), and otherwise manipulated and transformed by the human central nervous system. Fred Dretske's extensive philosophical defense of a theory of informational content (“semantic” information) based upon the Shannon-Weaver formal theory of information is subjected to critical scrutiny. A major difficulty is identified in Dretske's equivocations in the use of (...)
    3 citations
  42. Empirical Modeling and Information Semantics.Gordana Dodig Crnkovic - 2008 - Mind & Society.
  43. Review Of: Christopher G. Timpson, Quantum Information Theory and the Foundations of Quantum Mechanics. [REVIEW]Michael E. Cuffaro - 2014 - Philosophy of Science 81 (4):681-684.
  44. An Application of Information Theory: Longitudinal Measurability Bounds in Classical and Quantum Physics. [REVIEW]C. D'Antoni & P. Scanzano - 1980 - Foundations of Physics 10 (11-12):875-885.
    We examine the problem of the existence (in classical and/or quantum physics) of longitudinal limitations of measurability, defined as limitations preventing the measurement of a given quantity with arbitrarily high accuracy. We consider a measuring device as a generalized communication system, which enables us to use methods of information theory. As a direct consequence of the Shannon theorem on channel capacity, we obtain an inequality which limits the accuracy of a measurement in terms of the average power necessary to transmit (...)
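    The "Shannon theorem on channel capacity" that this abstract builds on is, in its best-known continuous form, the Shannon-Hartley bound C = B log2(1 + S/N): treating the measuring device as a communication channel caps the rate at which a readout can deliver information. A sketch with invented figures:

        from math import log2

        def channel_capacity(bandwidth_hz, signal_power, noise_power):
            """Shannon-Hartley capacity in bits per second."""
            return bandwidth_hz * log2(1 + signal_power / noise_power)

        # A 1 kHz channel at a 100:1 signal-to-noise ratio carries at most ~6.7 kbit/s,
        # so no measurement reported over it can convey information faster than that.
        print(channel_capacity(1_000, 100.0, 1.0))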
  45. Information of the Chassis and Information of the Program in Synthetic Cells.Antoine Danchin - 2009 - Systems and Synthetic Biology 3:125-134.
    Synthetic biology aims at reconstructing life to put to the test the limits of our understanding. It is based on premises similar to those which permitted the invention of computers, where a machine, which reproduces over time, runs a program, which replicates. The underlying heuristics explored here is that an authentic category of reality, information, must be coupled with the standard categories, matter, energy, space and time, to account for what life is. The use of this still elusive category permits us (...)
  46. Les organismes vivants comme pièges à information.Antoine Danchin - 2008 - Ludus Vitalis 16 (30):211-212.
    Life can be defined as combining two entities that rest on completely different physico-chemical properties and on a particular way of handling information. The cell, first, is a « machine » that combines elements which are quite similar (although in a fairly fuzzy way) to those involved in a man-made factory. The machine combines two processes. First, it requires explicit compartmentalisation, including scaffolding structures similar to the châssis of engineered machines. In addition, cells clearly define an inside, the (...)
    1 citation
  47. An Informational Theory of Counterfactuals.Danilo Fraga Dantas - 2018 - Acta Analytica 33 (4):525-538.
    Backtracking counterfactuals are problem cases for the standard, similarity-based theories of counterfactuals (e.g., Lewis's). These theories usually need to employ extra assumptions to deal with those cases. Hiddleston (2005, 632–657) proposes a causal theory of counterfactuals that, supposedly, deals well with backtracking. The main advantage of the causal theory is that it provides a unified account for backtracking and non-backtracking counterfactuals. In this paper, I present a backtracking counterfactual that is a problem case for Hiddleston's account. Then I propose an (...)
  48. Kolmogorov Complexity and Noncomputability.George Davie - 2002 - Mathematical Logic Quarterly 48 (4):574-581.
    We use a method suggested by Kolmogorov complexity to examine some relations between Kolmogorov complexity and noncomputability. In particular we show that the method consistently gives us more information than conventional ways of demonstrating noncomputability. Also, many sets which are awkward to embed into the halting problem are easily shown noncomputable. We also prove a gap-theorem for outputting consecutive integers and find, for a given length n, a statement of length n with maximal proof length.
  49. Recursive Events in Random Sequences.George Davie - 2001 - Archive for Mathematical Logic 40 (8):629-638.
    Let ω be a Kolmogorov–Chaitin random sequence, with ω1:n denoting the first n digits of ω. Let P be a recursive predicate defined on all finite binary strings such that the Lebesgue measure of the set {ω | ∃n P(ω1:n)} is a computable real α. Roughly, P holds with computable probability for a random infinite sequence. Then there is an algorithm which, on input indices for any such P and α, finds an n such that P holds within the first (...)
  50. The Theory of Games, Information Theory, and Value Criteria.Edmund J. Dehnert - 1967 - Journal of Value Inquiry 1 (2):124-131.