About this topic
Summary The main focus of this category is Shannon's mathematical theory of information and its broader philosophical uses. This includes, in the first place, cybernetics, signalling theories, the sender-receiver communication model, and Kolmogorov complexity. More general uses of information theory that overlap with other domains of the philosophy of information may also belong to this category. Examples include different philosophical conceptions of information (semantic conceptions, semiotic approaches), as well as applications in specific areas of theoretical philosophy, such as the philosophy of science and the philosophy of mind.
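As a concrete anchor for the entries below, Shannon's central quantity can be computed in a few lines. A minimal sketch in Python (the function name and the example distributions are illustrative, not drawn from any particular entry):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries one full bit of information per toss;
# a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```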
Related categories

231 found
1 — 50 / 231
  1. Falsification and Future Performance.David Balduzzi - manuscript
    We information-theoretically reformulate two measures of capacity from statistical learning theory: empirical VC-entropy and empirical Rademacher complexity. We show these capacity measures count the number of hypotheses about a dataset that a learning algorithm falsifies when it finds the classifier in its repertoire minimizing empirical risk. It then follows that the future performance of predictors on unseen data is controlled in part by how many hypotheses the learner falsifies. As a corollary we show that empirical VC-entropy quantifies the message (...)
  2. On Classical and Quantum Logical Entropy.David Ellerman - manuscript
    The notion of a partition on a set is mathematically dual to the notion of a subset of a set, so there is a logic of partitions dual to Boole's logic of subsets (Boolean logic is usually mis-specified as "propositional" logic). The notion of an element of a subset has as its dual the notion of a distinction of a partition (a pair of elements in different blocks). Boole developed finite logical probability as the normalized counting measure on elements of (...)
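The duality Ellerman describes can be made concrete: the logical entropy of a partition is the normalized count of its distinctions, i.e., ordered pairs of elements lying in different blocks. A minimal sketch, with an illustrative four-element universe:

```python
from itertools import product

def logical_entropy(partition, universe):
    """Logical entropy h(pi): the normalized counting measure on
    'distinctions' -- ordered pairs (u, v) in different blocks."""
    block_of = {u: i for i, block in enumerate(partition) for u in block}
    dits = sum(1 for u, v in product(universe, universe)
               if block_of[u] != block_of[v])
    return dits / len(universe) ** 2

U = ['a', 'b', 'c', 'd']
print(logical_entropy([['a', 'b'], ['c', 'd']], U))       # 0.5
print(logical_entropy([['a'], ['b'], ['c'], ['d']], U))   # 0.75 = 1 - 4*(1/4)^2
```

The second value equals 1 − Σ pᵢ² for the uniform four-block case, the quantity the logical-entropy literature works with.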
  3. A Semantic Information Formula Compatible with Shannon and Popper's Theories.Chenguang Lu - manuscript
    Semantic information conveyed by daily language has been researched for many years; yet, we still need a practical formula to measure the information of a simple sentence or prediction, such as “There will be heavy rain tomorrow”. For practical purposes, this paper introduces a new formula, the Semantic Information Formula (SIF), which is based on L. A. Zadeh’s fuzzy set theory and P. Z. Wang’s random set falling shadow theory. It carries forward C. E. Shannon and K. Popper’s thought. The fuzzy set’s (...)
  4. Information, Learning and Falsification.David Balduzzi - 2011
    There are (at least) three approaches to quantifying information. The first, algorithmic information or Kolmogorov complexity, takes events as strings and, given a universal Turing machine, quantifies the information content of a string as the length of the shortest program producing it [1]. The second, Shannon information, takes events as belonging to ensembles and quantifies the information resulting from observing the given event in terms of the number of alternate events that have been ruled out [2]. The third, statistical learning (...)
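The Shannon approach Balduzzi summarizes — information as the ruling out of alternatives — reduces, for a single event of probability p, to the surprisal −log₂ p. A one-function sketch (the example ensemble is illustrative):

```python
import math

def surprisal(p: float) -> float:
    """Shannon information (in bits) from observing an event of probability p."""
    return -math.log2(p)

# In an ensemble of 8 equiprobable events, observing one rules out the
# 7 alternatives and yields log2(8) = 3 bits.
print(surprisal(1 / 8))  # 3.0
```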
  5. Minimum Message Length as a Truth-Conducive Simplicity Measure.Steve Petersen - manuscript
    Given at the Formal Epistemology Workshop at Carnegie Mellon, June 2nd, 2007. Good compression must track higher versus lower probability of inputs, and this is one way to approach how simplicity tracks truth.
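The link between compression and probability that Petersen appeals to is the standard source-coding fact that an optimal prefix code assigns an input of probability p roughly −log₂ p bits. A sketch (rounding up, as a Shannon code does; the example probabilities are illustrative):

```python
import math

def shannon_code_length(p: float) -> int:
    """Bits an idealized optimal prefix code assigns to an input of
    probability p: ceil(-log2 p). More probable inputs get shorter codes."""
    return math.ceil(-math.log2(p))

# Higher-probability inputs compress shorter -- the sense in which
# good compression "tracks" probability.
for p in (0.5, 0.25, 0.01):
    print(p, shannon_code_length(p))
```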
  6. Information, Meaning and Physics: The Intellectual Transformation of the English School of Information Theory During 1946-1956.Javier Anta - forthcoming - Science in Context.
    In this comparative historical analysis, we will analyze the intellectual tendency that emerged between 1946 and 1956 to take advantage of the popularity of communication theory to develop a kind of informational epistemology of statistical mechanics. We will argue that this tendency results from a historical confluence in the early 1950s of certain theoretical claims of the so-called English School of Information Theory, championed by authors such as Gabor (1956) or MacKay (1969), and the search to extend the profound success (...)
  7. Bridging Conceptual Gaps: The Kolmogorov-Sinai Entropy.Massimiliano Badino - forthcoming - Isonomía. Revista de Teoría y Filosofía Del Derecho.
    The Kolmogorov-Sinai entropy is a fairly exotic mathematical concept which has recently aroused some interest on the philosophers’ part. The most salient trait of this concept is its working as a junction between such diverse ambits as statistical mechanics, information theory and algorithm theory. In this paper I argue that, in order to understand this very special feature of the Kolmogorov-Sinai entropy, it is essential to reconstruct its genealogy. Somewhat surprisingly, this story takes us as far back as the beginning of (...)
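For readers coming to the entry cold, a standard modern statement of the Kolmogorov-Sinai entropy (not taken from Badino's paper) for a measure-preserving map T is:

```latex
% KS entropy: supremum over finite measurable partitions P of the
% asymptotic Shannon entropy rate of the refined partitions.
h_{\mathrm{KS}}(T) \;=\; \sup_{\mathcal{P}}\ \lim_{n \to \infty}
\frac{1}{n}\, H\!\left( \bigvee_{i=0}^{n-1} T^{-i}\mathcal{P} \right),
\qquad
H(\mathcal{Q}) \;=\; -\sum_{Q \in \mathcal{Q}} \mu(Q)\,\log \mu(Q).
```

The inner H is Shannon's entropy of a partition, which is exactly the junction with information theory the abstract points to.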
  8. Algorithmic Randomness and Measures of Complexity.George Barmpalias - forthcoming - Association for Symbolic Logic: The Bulletin of Symbolic Logic.
    We survey recent advances on the interface between computability theory and algorithmic randomness, with special attention on measures of relative complexity. We focus on (weak) reducibilities that measure (a) the initial segment complexity of reals and (b) the power of reals to compress strings, when they are used as oracles. The results are put into context and several connections are made with various central issues in modern algorithmic randomness and computability.
  9. Strengthening Weak Emergence.Nora Berenstain - forthcoming - Erkenntnis:1-18.
    Bedau's influential (1997) account analyzes weak emergence in terms of the non-derivability of a system’s macrostates from its microstates except by simulation. I offer an improved version of Bedau’s account of weak emergence in light of insights from information theory. Non-derivability alone does not guarantee that a system’s macrostates are weakly emergent. Rather, it is non-derivability plus the algorithmic compressibility of the system’s macrostates that makes them weakly emergent. I argue that the resulting information-theoretic picture provides a metaphysical account of (...)
  10. Algorithm, Information.A. N. Kolmogorov - forthcoming - Complexity.
  11. Information Before Information Theory: The Politics of Data Beyond the Perspective of Communication.Colin Koopman - forthcoming - New Media and Society.
    Scholarship on the politics of new media widely assumes that communication functions as a sufficient conceptual paradigm for critically assessing new media politics. This article argues that communication-centric analyses fail to engage the politics of information itself, limiting information only to its consequences for communication, and neglecting information as it reaches into our selves, lives, and actions beyond the confines of communication. Furthering recent new media historiography on the “information theory” of Shannon and Wiener, the article reveals both the primacy (...)
  12. Information Theory and Logical Analysis in the Tractatus Logico-Philosophicus.Felipe Oliveira Araújo Lopes - forthcoming - Philosophia:1-37.
    The present article proposes an information-theoretic interpretation of logical analysis applied to natural language in the Tractatus Logico-Philosophicus. Natural language is characterized by descriptive definitions in order to compress information according to empirical regularities. However, notations fitted to empirical patterns do not explicitly reflect the logical structure of language that enables it to represent those very patterns. I argue that logical analysis is the process of obtaining incompressible and uniformly distributed codes, best fitted to express the possible combinations of facts instead (...)
  13. Objects and Processes: Two Notions for Understanding Biological Information.Agustín Mercado-Reyes, Pablo Padilla Longoria & Alfonso Arroyo-Santos - forthcoming - Journal of Theoretical Biology.
    In spite of being ubiquitous in life sciences, the concept of information is harshly criticized. Uses of the concept other than those derived from Shannon's theory are denounced as pernicious metaphors. We perform a computational experiment to explore whether Shannon's information is adequate to describe the uses of said concept in commonplace scientific practice. Our results show that semantic sequences do not have unique complexity values different from the value of meaningless sequences. This result suggests that quantitative theoretical frameworks do (...)
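A toy version of such a computational experiment can be run with an off-the-shelf compressor. Note that the gap it detects reflects statistical redundancy, not meaning — consonant with the entry's negative result about semantic sequences. The strings and the use of zlib are illustrative assumptions, not the authors' setup:

```python
import random
import string
import zlib

def compressed_size(s: str) -> int:
    """zlib-compressed length: a crude, computable stand-in for complexity."""
    return len(zlib.compress(s.encode(), 9))

random.seed(0)  # reproducible "meaningless" sequence
meaningful = "the cat sat on the mat and the dog sat on the log " * 4
meaningless = ''.join(random.choice(string.ascii_lowercase + ' ')
                      for _ in range(len(meaningful)))

# The redundant English compresses far better than the random string of
# equal length -- but only because of repetition, not semantics.
print(compressed_size(meaningful), compressed_size(meaningless))
```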
  14. Cybernetics, operations research and information theory at the Ulm School of Design and its influence on Latin America.David Oswald - forthcoming - AI and Society:1-13.
    The Chilean Cybersyn project, an attempt to manage a nation’s economy by cybernetic methods, has attracted growing interest in recent years. The project’s design lead and several team members were alumni of the Ulm School of Design—an institution that has been labelled “Bauhaus successor” and today is famous for a no-arts and method-led design approach with strong societal aspirations. The school also influenced the emerging design discipline in Latin America during the 1960s and 70s. This article reviews topics (...)
  15. Reviewed Work(s): Some Theorems on the Algorithmic Approach to Probability Theory and Information Theory (1971 Dissertation Directed by A. N. Kolmogorov), by L. A. Levin. Annals of Pure and Applied Logic, Vol. 162. [REVIEW]Jan Reimann - forthcoming - Association for Symbolic Logic: The Bulletin of Symbolic Logic.
    Review by Jan Reimann. The Bulletin of Symbolic Logic, Volume 19, Issue 3, pp. 397–399, September 2013.
  16. Real Patterns and Indispensability.Abel Suñé & Manolo Martínez - forthcoming - Synthese 198 (5):4315-4330.
    While scientific inquiry crucially relies on the extraction of patterns from data, we still have a far from perfect understanding of the metaphysics of patterns—and, in particular, of what makes a pattern real. In this paper we derive a criterion of real-patternhood from the notion of conditional Kolmogorov complexity. The resulting account belongs to the philosophical tradition, initiated by Dennett (1991), that links real-patternhood to data compressibility, but is simpler and formally more perspicuous than other proposals previously defended in (...)
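Conditional Kolmogorov complexity is uncomputable, but the compressibility idea behind the entry can be illustrated with a crude computable proxy: compare how much extra description a string needs once another string is given. zlib and the example strings are illustrative assumptions, not the authors' construction:

```python
import zlib

def C(s: bytes) -> int:
    """Compressed length: a computable upper bound standing in for K(s)."""
    return len(zlib.compress(s, 9))

def C_cond(x: bytes, y: bytes) -> int:
    """Rough proxy for conditional complexity K(x | y): the extra
    description length x needs once y is already available."""
    return max(C(y + x) - C(y), 0)

x = b"0123456789" * 50            # a highly patterned string
print(C_cond(x, x))               # conditioning on x itself: nearly free
print(C_cond(x, b"unrelated"))    # an unrelated string barely helps
```

On the compressibility account of real patterns, a pattern is "in" the data to the extent that citing it drives this conditional description length down.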
  17. The Integrated Information Theory of Agency.Hugh Desmond & Philippe Huneman - 2022 - Behavioral and Brain Sciences 45:e45.
    We propose that measures of information integration can be more straightforwardly interpreted as measures of agency rather than of consciousness. This may be useful to the goals of consciousness research, given how agency and consciousness are “duals” in many (although not all) respects.
  18. Consciousness and Complexity: Neurobiological Naturalism and Integrated Information Theory.Francesco Ellia & Robert Chis-Ciure - 2022 - Consciousness and Cognition 100:103281.
    In this paper, we take a meta-theoretical stance and aim to compare and assess two conceptual frameworks that endeavor to explain phenomenal experience. In particular, we compare Feinberg & Mallatt’s Neurobiological Naturalism (NN) and Tononi’s and colleagues' Integrated Information Theory (IIT), given that the former pointed out some similarities between the two theories (Feinberg & Mallatt 2016c-d). To probe their similarity, we first give a general introduction to both frameworks. Next, we expound a ground plan for carrying out our analysis. (...)
  19. Filled/Non-Filled Pairs: An Empirical Challenge to the Integrated Information Theory of Consciousness.Amber R. Hopkins & Kelvin J. McQueen - 2022 - Consciousness and Cognition 97:103245.
    Perceptual filling-in for vision is the insertion of visual properties (e.g., color, contour, luminance, or motion) into one’s visual field, when those properties have no corresponding retinal input. This paper introduces and provides preliminary empirical support for filled/non-filled pairs, pairs of images that appear identical, yet differ by amount of filling-in. It is argued that such image pairs are important to the experimental testing of theories of consciousness. We review recent experimental research and conclude that filling-in involves brain activity with (...)
  20. The Integrated Information Theory of Consciousness: Unmasked and Identified.Bjorn Merker, Kenneth Williford & David Rudrauf - 2022 - Behavioral and Brain Sciences 45.
    In our response to a truly diverse set of commentaries, we first summarize the principal topical themes around which they cluster, then address two “outlier” positions. Next, we address ways in which commentaries by non-integrated information theory authors engage with the specifics of our IIT critique, turning finally to the four commentaries by IIT authors.
  21. Integrated Information Theory as Testing Ground for Causation: Why Nested Hylomorphism Overcomes Physicalism and Panpsychism.Javier Sánchez-Cañizares - 2022 - Journal of Consciousness Studies 29 (1-2):56-78.
    Integrated Information Theory (IIT) stands out as one of the most promising approaches to scientifically understand the emergence of consciousness. Even if it borrows from the phenomenology of consciousness to derive its axiomatic formulation, IIT does not initially adhere to any particular ontological position. However, its founder leans towards Panpsychism. More recently, Owen has studied the pros and cons of different ontologies as a metaphysical basis for IIT, defending a hylomorphic stance where en-grounding, en-forming relations gain the upper hand in (...)
  22. Make Information in Science Meaningful Again.Javier Anta - 2021 - Logos and Episteme: An International Journal of Epistemology (3):263-286.
    Although the everyday notion of information has clear semantic properties, the all-pervasive technical concept of Shannon information is usually considered as a non-semantic concept. In this paper I show how this concept was implicitly ‘semantized’ in the early 1950s by many authors, such as Rothstein or Brillouin, in order to explain the knowledge dynamics underlying certain scientific practices such as measurement. On the other hand, I argue that the main attempts in the literature to develop a quantitative measure of semantic (...)
  23. Can Informational Thermal Physics Explain the Approach to Equilibrium?Javier Anta - 2021 - Synthese 199 (1-2):4015–4038.
    In this paper I will defend the incapacity of the informational frameworks in thermal physics, mainly those that historically and conceptually derive from the work of Brillouin (1962) and Jaynes (1957a), to robustly explain the approach of certain gaseous systems to their state of thermal equilibrium from the dynamics of their molecular components. I will further argue that, since their various interpretative, conceptual and technical-formal resources (e.g. epistemic interpretations of probabilities and entropy measures, identification of thermal entropy as Shannon information, (...)
  24. Integrated Information Theory, Intrinsicality, and Overlapping Conscious Systems.James C. Blackmon - 2021 - Journal of Consciousness Studies 28 (11-12):31-53.
  25. New Foundations for Information Theory: Logical Entropy and Shannon Entropy.David Ellerman - 2021 - Springer Verlag.
    This monograph offers a new foundation for information theory that is based on the notion of information-as-distinctions, being directly measured by logical entropy, and on the re-quantification as Shannon entropy, which is the fundamental concept for the theory of coding and communications. Information is based on distinctions, differences, distinguishability, and diversity. Information sets are defined that express the distinctions made by a partition, e.g., the inverse-image of a random variable so they represent the pre-probability notion of information. Then logical entropy (...)
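The two quantifications the monograph relates can be computed side by side: logical entropy 1 − Σ pᵢ² (the probability that two independent draws from p are distinct) versus Shannon entropy −Σ pᵢ log₂ pᵢ. A minimal sketch with an illustrative distribution:

```python
import math

def logical_entropy(probs):
    """h(p) = 1 - sum_i p_i^2: probability that two independent draws
    from p fall in different blocks (are 'distinct')."""
    return 1 - sum(p * p for p in probs)

def shannon_entropy(probs):
    """H(p) = -sum_i p_i * log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

p = [0.5, 0.25, 0.25]
print(logical_entropy(p))  # 0.625
print(shannon_entropy(p))  # 1.5
```

Both are built on the same probabilities; they differ in how the distinctions are quantified, which is the re-quantification the abstract describes.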
  26. Note on the Complexities of Simple Things Such as a Timeline. On the Notions Text, E-Text, Hypertext, and Origins of Machine Translation.Niels Ole Finnemann - 2021 - In Frode Hegland (ed.), The Future of Text, vol. 2. Wimbledon: Liquid Text. pp. 149-156.
    The composition of a timeline depends on purpose, perspective, and scale – and on the very understanding of the word, the phenomenon referred to, and whether the focus is the idea or concept, an instance of an idea or a phenomenon, a process, or an event, and so forth. The main function of timelines is to provide an overview of a long history; it is a kind of mnemotechnic device or a particular kind of Knowledge Organization System (KOS). The (...)
  27. Critique of the Integrated Information Theory of Consciousness: Or, the Relevance of Ontological Information.A. Peuhu - 2021 - Journal of Consciousness Studies 28 (5-6):58-78.
  28. Nature's Operating System.Ilexa Yardley - 2021 - Https://Medium.Com/the-Circular-Theory.
  29. The Integrated Information Theory Facing the Hard Problem of Consciousness.Wael Basille - 2020 - Dissertation, Sorbonne Université
    The Integrated Information Theory (IIT) formulated for the first time in 2004 by the neuroscientist Giulio Tononi, is a theoretical framework aiming to scientifically explain phenomenal consciousness. The IIT is presented in the first part of this work. Broadly speaking, integrated information is an abstract quantitative measure of the causal power a system has on itself. The main claim of IIT is the identity between informational structures and experience. The nature of this identity will be the subject of the second (...)
  30. Two Informational Theories of Memory: A Case From Memory-Conjunction Errors.Danilo Fraga Dantas - 2020 - Disputatio 12 (59):395-431.
    The causal and simulation theories are often presented as very distinct views about declarative memory, their major difference lying in the causal condition. The causal theory states that remembering involves an accurate representation causally connected to an earlier experience. In the simulation theory, remembering involves an accurate representation generated by a reliable memory process. I investigate how to construe detailed versions of these theories that correctly classify memory errors as misremembering or confabulation. Neither causalists nor simulationists have paid attention to (...)
  31. A Contingency Interpretation of Information Theory as a Bridge Between God’s Immanence and Transcendence.Philippe Gagnon - 2020 - In Michael Fuller, Dirk Evers, Anne L. C. Runehov, Knut-Willy Sæther & Bernard Michollet (eds.), Issues in Science and Theology: Nature – and Beyond. Cham: Springer. pp. 169-185.
    This paper investigates the degree to which information theory, and the derived uses that make it work as a metaphor of our age, can be helpful in thinking about God’s immanence and transcendence. We ask when it is possible to say that a consciousness has to be behind the information we encounter. If God is to be thought about as a communicator of information, we need to ask whether a communication system has to pre-exist the divine and impose itself (...)
  32. Channels’ Confirmation and Predictions’ Confirmation: From the Medical Test to the Raven Paradox.Chenguang Lu - 2020 - Entropy 22 (4):384.
    After long arguments between positivism and falsificationism, the verification of universal hypotheses was replaced with the confirmation of uncertain major premises. Unfortunately, Hempel proposed the Raven Paradox. Then, Carnap used the increment of logical probability as the confirmation measure. So far, many confirmation measures have been proposed. Measure F, proposed by Kemeny and Oppenheim, possesses the symmetries and asymmetries proposed by Eells and Fitelson, the monotonicity proposed by Greco et al., and the normalizing property suggested by many researchers. Based on the (...)
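Measure F, as standardly stated for Kemeny and Oppenheim's proposal, is F = (P(E|H) − P(E|¬H)) / (P(E|H) + P(E|¬H)), ranging over [−1, 1]. A sketch with illustrative medical-test numbers (not taken from Lu's paper):

```python
def measure_F(p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Kemeny-Oppenheim confirmation measure
    F = (P(E|H) - P(E|~H)) / (P(E|H) + P(E|~H)), in [-1, 1]."""
    return ((p_e_given_h - p_e_given_not_h) /
            (p_e_given_h + p_e_given_not_h))

# A positive result from a test with high sensitivity (0.95) and a low
# false-positive rate (0.05) strongly confirms the hypothesis.
print(measure_F(0.95, 0.05))  # 0.9 (up to float rounding)
```

F = 0 marks evidential irrelevance (P(E|H) = P(E|¬H)); the sign tracks whether E confirms or disconfirms H.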
  33. The P–T Probability Framework for Semantic Communication, Falsification, Confirmation, and Bayesian Reasoning.Chenguang Lu - 2020 - Philosophies 5 (25):25-0.
    Many researchers want to unify probability and logic by defining logical probability or probabilistic logic reasonably. This paper tries to unify statistics and logic so that we can use both statistical probability and logical probability at the same time. For this purpose, this paper proposes the P–T probability framework, which is assembled with Shannon’s statistical probability framework for communication, Kolmogorov’s probability axioms for logical probability, and Zadeh’s membership functions used as truth functions. Two kinds of probabilities are connected by an (...)
  34. Conscious Matter and Matters of Conscience.Matthew Owen - 2020 - Philosophia Christi 22 (1):145-156.
    In recent decades consciousness science has become a prominent field of research. This essay analyzes the most recent book by a leading pioneer in the scientific study of consciousness. In The Feeling of Life Itself Christof Koch presents the integrated information theory and applies it to multiple pressing topics in consciousness studies. This essay considers the philosophical basis of the theory and Koch’s application of it from neurobiology to animal ethics.
  35. The Elementary Basis of Language [Элементарная основа языка].Andrej Poleev - 2020 - Enzymes 18.
    The Russian language has come a long way in its development, in the course of which its alphabet, its conceptual and semantic content, and its culture of speech were refined. In the 20th century Russian became, and remains, the most developed language of modernity, and in this capacity it serves as a kind of standard for evaluating other languages.
  36. Two Kinds of Information Processing in Cognition.Mark Sprevak - 2020 - Review of Philosophy and Psychology 11 (3):591-611.
    What is the relationship between information and representation? Dating back at least to Dretske, an influential answer has been that information is a rung on a ladder that gets one to representation. Representation is information, or representation is information plus some other ingredient. In this paper, I argue that this approach oversimplifies the relationship between information and representation. If one takes current probabilistic models of cognition seriously, information is connected to representation in a new way. It enters as a property (...)
  37. Does Semantic Information Need to Be Truthful?Björn Lundgren - 2019 - Synthese 196 (7):2885-2906.
    The concept of information has well-known difficulties. Among the many issues that have been discussed is the alethic nature of a semantic conception of information. Floridi (2004; 2005; 2007; 2011) argued that semantic information must be truthful. In this article, arguments will be presented in favor of an alethically neutral conception of semantic information, and it will be shown that such a conception can withstand Floridi’s (...)
  38. Defining Information Security.Björn Lundgren & Niklas Möller - 2019 - Science and Engineering Ethics 25 (2):419-441.
    This article proposes a new definition of information security, the ‘Appropriate Access’ definition. Apart from providing the basic criteria for a definition—correct demarcation and meaning concerning the state of security—it also aims at being a definition suitable for any information security perspective. As such, it bridges the conceptual divide between so-called ‘soft issues’ of information security and more technical issues. Because of this it is also suitable for various analytical purposes, such as analysing possible security breaches, or for studying conflicting (...)
  39. The Semantics Latent in Shannon Information.Alistair M. C. Isaac - 2019 - British Journal for the Philosophy of Science 70 (1):103-125.
    The lore is that standard information theory provides an analysis of information quantity, but not of information content. I argue this lore is incorrect, and there is an adequate informational semantics latent in standard theory. The roots of this notion of content can be traced to the secret parallel development of an information theory equivalent to Shannon’s by Turing at Bletchley Park, and it has been suggested independently in recent work by Skyrms and Bullinaria and Levy. This paper explicitly articulates (...)
  40. Semantic Information G Theory and Logical Bayesian Inference for Machine Learning.Chenguang Lu - 2019 - Information 10 (8):261.
    An important problem with machine learning is that when the number of labels n > 2, it is very difficult to construct and optimize a group of learning functions, and we wish that optimized learning functions remain useful when the prior distribution P(x) (where x is an instance) is changed. To resolve this problem, the semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms together form a systematic solution. A semantic channel in the G theory consists (...)
  41. A Simplicity Criterion for Physical Computation.Tyler Millhouse - 2019 - British Journal for the Philosophy of Science 70 (1):153-178.
    The aim of this paper is to offer a formal criterion for physical computation that allows us to objectively distinguish between competing computational interpretations of a physical system. The criterion construes a computational interpretation as an ordered pair of functions mapping (1) states of a physical system to states of an abstract machine, and (2) inputs to this machine to interventions in this physical system. This interpretation must ensure that counterfactuals true of the abstract machine have appropriate counterparts which are (...)
  42. El breve “Discurso del método” de Claude Shannon [Claude Shannon's Brief “Discourse on Method”].Juan Ramón Álvarez - 2018 - Principia: An International Journal of Epistemology 22 (3):393-410.
    This study takes as its point of departure the lecture entitled “Creative Thinking” that Claude Shannon delivered in 1952 at Bell Laboratories. The paper offers an interpretive and critical account of the necessary conditions, as well as the desirable procedures, that scientific and technological invention must satisfy, within the frame of the so-called scientist’s spontaneous philosophy.
  43. Intervening on the Causal Exclusion Problem for Integrated Information Theory.Matthew Baxendale & Garrett Mindt - 2018 - Minds and Machines 28 (2):331-351.
    In this paper, we examine the causal framework within which the integrated information theory of consciousness makes its claims. We argue that, in its current formulation, IIT is threatened by the causal exclusion problem. Some proponents of IIT have attempted to thwart the causal exclusion problem by arguing that IIT has the resources to demonstrate genuine causal emergence at macro scales. In contrast, we argue that their proposed solution to the problem is damagingly circular as a result of inter-defining information and (...)
  44. An Informational Theory of Counterfactuals.Danilo Dantas - 2018 - Acta Analytica 33 (4):525-538.
    Backtracking counterfactuals are problem cases for the standard, similarity-based theories of counterfactuals, e.g., Lewis’s. These theories usually need to employ extra assumptions to deal with those cases. Hiddleston (2005) proposes a causal theory of counterfactuals that, supposedly, deals well with backtracking. The main advantage of the causal theory is that it provides a unified account of backtracking and non-backtracking counterfactuals. In this paper, I present a backtracking counterfactual that is a problem case for Hiddleston’s account. Then I propose an (...)
  45. E-Text.Niels Finnemann - 2018 - Oxford Research Encyclopedia - Literature.
    Electronic text can be defined on two different, though interconnected, levels. On the one hand, electronic text can be defined by taking the notion of “text” or “printed text” as the point of departure. On the other hand, electronic text can be defined by taking the digital format as the point of departure, where everything is represented in the binary alphabet. While the notion of text in most cases lends itself to being independent of medium and embodiment, it is also (...)
  46. A Scientific Metaphysical Naturalisation of Information.Bruce Long - 2018 - Dissertation, University of Sydney
    The objective of this thesis is to present a naturalised metaphysics of information, or to naturalise information, by way of deploying a scientific metaphysics according to which contingency is privileged and a priori conceptual analysis is excluded (or at least greatly diminished) in favour of contingent and defeasible metaphysics. The ontology of information is established according to the premises and mandate of the scientific metaphysics by inference to the best explanation, and in accordance with the idea that the primacy of physics (...)
  47. Information and Inaccuracy.William Roche & Tomoji Shogenji - 2018 - British Journal for the Philosophy of Science 69 (2):577-604.
    This article proposes a new interpretation of mutual information (MI). We examine three extant interpretations of MI: by reduction in doubt, by reduction in uncertainty, and by divergence. We argue that the first two are inconsistent with the epistemic value of information (EVI) assumed in many applications of MI: the greater the amount of information we acquire, the better our epistemic position, other things being equal. The third interpretation is consistent with EVI, but it is faced with the problem of (...)
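    The quantity at issue in the entry above can be made concrete with a minimal sketch of mutual information estimated from empirical (plug-in) frequencies; the function and example data here are purely illustrative and are not drawn from the paper:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Empirical mutual information I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    pxy = Counter(pairs)                 # joint counts
    px = Counter(x for x, _ in pairs)    # marginal counts for X
    py = Counter(y for _, y in pairs)    # marginal counts for Y
    mi = 0.0
    for (x, y), c in pxy.items():
        # p(x,y) * log2( p(x,y) / (p(x) p(y)) ), with counts normalized by n
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

# Perfectly correlated binary variables: I(X;Y) = H(X) = 1 bit
print(mutual_information([(0, 0), (1, 1)] * 50))                  # → 1.0
# Independent uniform binary variables: I(X;Y) = 0 bits
print(mutual_information([(0, 0), (0, 1), (1, 0), (1, 1)] * 25))  # → 0.0
```

    On any of the three interpretations discussed, the first case registers maximal informational dependence and the second none at all; the interpretations differ over what that number is taken to mean epistemically.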
  48. Universal Prediction.Tom F. Sterkenburg - 2018 - Dissertation, University of Groningen
    In this thesis I investigate the theoretical possibility of a universal method of prediction. A prediction method is universal if it is always able to learn from data: if it is always able to extrapolate given data about past observations to maximally successful predictions about future observations. The context of this investigation is the broader philosophical question into the possibility of a formal specification of inductive or scientific reasoning, a question that also relates to modern-day speculation about a fully automatized (...)
  49. Patterns, Information, and Causation.Holly Andersen - 2017 - Journal of Philosophy 114 (11):592-622.
    This paper articulates an account of causation as a collection of information-theoretic relationships between patterns instantiated in the causal nexus. I draw on Dennett’s account of real patterns to characterize potential causal relata as patterns with specific identification criteria and noise tolerance levels, and actual causal relata as those patterns instantiated at some spatiotemporal location in the rich causal nexus as originally developed by Salmon. I develop a representation framework using phase space to precisely characterize causal relata, including their degree (...)
  50. The Deluge of Spurious Correlations in Big Data.Cristian S. Calude & Giuseppe Longo - 2017 - Foundations of Science 22 (3):595-612.
    Very large databases are a major opportunity for science and data analytics is a remarkable new field of investigation in computer science. The effectiveness of these tools is used to support a “philosophy” against the scientific method as developed throughout history. According to this view, computer-discovered correlations should replace understanding and guide prediction and action. Consequently, there will be no need to give scientific meaning to phenomena, by proposing, say, causal relations, since regularities in very large databases are enough: “with (...)