We offer a novel theory of information that differs from traditional accounts in two respects: (i) it explains information in terms of counterfactuals rather than conditional probabilities, and (ii) it does not make essential reference to doxastic states of subjects, and consequently allows for the sort of objective, reductive explanations of various notions in epistemology and philosophy of mind that many have wanted from an account of information.
Cohen and Meskin (2006) recently offered a counterfactual theory of information to replace the standard probabilistic theory of information. They claim that the counterfactual theory fares better than the standard account on three grounds: first, it provides a better framework for explaining information flow properties; second, it requires a less expensive ontology; and third, because it does not refer to doxastic states of the information-receiving organism, it provides an objective basis. In this paper, I show that none of these is really an advantage. Moreover, the counterfactual theory fails to satisfy one of the basic properties of information flow, namely the Conjunction principle. Thus, I conclude, there is no reason to give up the standard probabilistic theory for the counterfactual theory of information.
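For orientation, the counterfactual condition at issue and the Conjunction principle the criticism turns on are standardly glossed along the following lines; this is a sketch in our own notation, not a quotation from either paper, and Info(s, p) is our abbreviation for "signal s carries the information that p".

% Sketch of the counterfactual account of information carriage and of the
% Conjunction principle of information flow (standard glosses, hedged).
\[
  \text{$x$'s being $F$ carries the information that $y$ is $G$}
  \quad\text{iff}\quad
  \text{had $y$ not been $G$, $x$ would not have been $F$ (non-vacuously).}
\]
\[
  \textbf{Conjunction:}\qquad
  \mathit{Info}(s, p) \wedge \mathit{Info}(s, q)
  \;\Rightarrow\;
  \mathit{Info}(s, p \wedge q)
\]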
Backtracking counterfactuals are problem cases for the standard, similarity-based theories of counterfactuals, e.g., Lewis's. These theories usually need to employ extra assumptions to deal with those cases. Hiddleston (2005, 632–657) proposes a causal theory of counterfactuals that, supposedly, deals well with backtracking. The main advantage of the causal theory is that it provides a unified account for backtracking and non-backtracking counterfactuals. In this paper, I present a backtracking counterfactual that is a problem case for Hiddleston's account. Then I propose an informational theory of counterfactuals, which deals well with this problem case while maintaining the main advantage of Hiddleston's account. In addition, the informational theory offers a general theory of backtracking that provides clues for the semantics and epistemology of counterfactuals. I propose that backtracking is reasonable when the state of affairs expressed in the antecedent of a counterfactual transmits less information about an event in the past than the actual state of affairs does.
This book presents an attempt to develop a theory of knowledge and a philosophy of mind using ideas derived from the mathematical theory of communication developed by Claude Shannon. Information is seen as an objective commodity defined by the dependency relations between distinct events. Knowledge is then analyzed as information-caused belief. Perception is the delivery of information in analog form for conceptual utilization by cognitive mechanisms. The final chapters attempt to develop a theory of meaning by viewing meaning as a certain kind of information-carrying role.
We argue, in the spirit of some of Jean-Yves Jaffray's work, that explicitly incorporating the information, however imprecise, available to the decision maker is relevant, feasible, and fruitful. In particular, we show that it can lead us to know whether the decision maker has wrong beliefs and whether it matters or not, that it makes it possible to better model and analyze how the decision maker takes into account new information, even when this information is not an event, and finally that it is crucial when attempting to identify and measure the decision maker's attitude toward imprecise information.
A framework for pragmatic analysis is proposed which treats discourse as a game, with context as a scoreboard organized around the questions under discussion by the interlocutors. The framework is intended to be coordinated with a dynamic compositional semantics. Accordingly, the context of utterance is modeled as a tuple of different types of information, and the questions therein (modeled, as is usual in formal semantics, as alternative sets of propositions) constrain the felicitous flow of discourse. A requirement of Relevance is satisfied by an utterance (whether an assertion, a question, or a suggestion) iff it addresses the question under discussion. Finally, it is argued that the prosodic focus of an utterance canonically serves to reflect the question under discussion (at least in English), placing additional constraints on felicity in context.
We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information, and Kolmogorov ("algorithmic") mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate-distortion theory, which quantifies useful (in a certain sense) information. We use the communication of information as our guiding motif, and we explain how it relates to sequential question-answer sessions.
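Since the four basic notions being compared are named above, their textbook definitions may be a useful reference point; this is a sketch, and the survey's own notation and conventions may differ.

% Textbook definitions of the four notions named in the abstract above.
\[
  H(X) = -\sum_{x} p(x)\,\log p(x)
  \qquad\text{(Shannon entropy)}
\]
\[
  I(X;Y) = H(X) - H(X \mid Y)
  \qquad\text{(Shannon mutual information)}
\]
\[
  K(x) = \min\{\,\ell(p) : U(p) = x\,\}
  \qquad\text{(Kolmogorov complexity, w.r.t.\ a universal machine $U$)}
\]
\[
  I(x : y) = K(y) - K(y \mid x^{*})
  \qquad\text{(algorithmic mutual information, up to $O(1)$ terms)}
\]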
This reissue of D. A. Gillies’ highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises’ views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises’ definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axiomatic approaches. This of course raises the problem of how the abstract calculus of probability should be connected with the ‘actual world of experiments’. It is suggested that this link should be established, not by a definition of probability, but by an application of Popper’s concept of falsifiability. In addition to formulating his own interesting theory, Dr Gillies gives a detailed criticism of the generally accepted Neyman–Pearson theory of testing, as well as of alternative philosophical approaches to probability theory. The reissue will be of interest both to philosophers with no previous knowledge of probability theory and to mathematicians interested in the foundations of probability theory and statistics.
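For readers without the background, the limiting-frequency definition that Gillies rejects is standardly written as follows; this is the textbook form, not Gillies’ or von Mises’ own wording.

% Limiting-frequency definition of probability (textbook form): within a
% collective, P(A) is the limit of the relative frequency of attribute A.
\[
  P(A) \;=\; \lim_{n \to \infty} \frac{m(A, n)}{n},
\]
% where m(A, n) is the number of occurrences of A among the first n members
% of the collective.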
Information security can be of high moral value. It can equally be used for immoral purposes and have undesirable consequences. In this paper we suggest that critical theory can facilitate a better understanding of possible ethical issues and can provide support when finding ways of addressing them. The paper argues that critical theory has intrinsic links to ethics and that it is possible to identify concepts frequently used in critical theory to pinpoint ethical concerns. Using the example of UK electronic medical records, the paper demonstrates that a critical lens can highlight issues that traditional ethical theories tend to overlook. These are often linked to collective issues such as social and organisational structures, which philosophical ethics with its typical focus on the individual does not tend to emphasise. The paper suggests that this insight can help in developing ways of researching and innovating responsibly in the area of information security.
We present an approach to combining three areas of research which we claim are all based on information theory: knowledge representation in Artificial Intelligence and Cognitive Science using prototypes, plans, or schemata; formal semantics in natural language, especially the semantics of the 'if-then' conditional construct; and the logic of subjunctive conditionals first developed using a possible worlds semantics by Stalnaker and Lewis. The basic premise of the paper is that both schema-based inference and the semantics of conditionals are based on Dretske's notion of information flow and Barwise and Perry's notion of a constraint in situation semantics. That is, the connection between the antecedent and consequent of a conditional 'if A were the case then B would be the case' is an informational relation holding with respect to a pragmatically determined utterance situation. The bridge between AI and conditional logic is that a prototype or planning schema represents a situation type, and the background assumptions underlying the application of a schema in a situation correspond to channel conditions on the flow of information. Adapting the work of Stalnaker and Lewis, the semantics of conditionals is modeled by a refinement ordering on situations: a conditional 'if A then B' holds with respect to a situation if all the minimal refinements of the situation that support A also support B. We present new logics of situations, information flow, and subjunctive conditionals based on three-valued partial logic that formalizes our approach, and conclude with a discussion of the resulting theory of conditionals, including the "paradoxes" of conditional implication, the difference between truth conditions and assertability conditions for subjunctive conditionals, and the relationship between subjunctive and indicative conditionals.
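The truth condition described above can be written schematically as follows, with A and B as placeholder sentences and $\sqsubseteq$ as the refinement ordering on situations; this is our rendering, and the paper's own notation may differ.

% Schematic rendering of the refinement-ordering truth condition stated in
% the abstract: A > B holds at s iff every minimal A-supporting refinement
% of s also supports B.
\[
  s \models (A > B)
  \quad\text{iff}\quad
  \forall s' \in \min_{\sqsubseteq}\{\, s'' \sqsupseteq s : s'' \models A \,\}:\; s' \models B.
\]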
We examine the problem of the existence (in classical and/or quantum physics) of longitudinal limitations of measurability, defined as limitations preventing the measurement of a given quantity with arbitrarily high accuracy. We consider a measuring device as a generalized communication system, which enables us to use methods of information theory. As a direct consequence of the Shannon theorem on channel capacity, we obtain an inequality which limits the accuracy of a measurement in terms of the average power necessary to transmit the information content of the measurement itself. This inequality holds in a classical as well as in a quantum framework.
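The abstract does not reproduce the inequality itself, but the Shannon result it draws on has a familiar textbook statement, given here only as a reminder of the quantities involved; the paper's own derivation may use a different form.

% Shannon-Hartley capacity of a band-limited additive white Gaussian noise
% channel: C is capacity in bits per second, B bandwidth, S/N the
% signal-to-noise ratio.
\[
  C \;=\; B \,\log_{2}\!\left(1 + \frac{S}{N}\right).
\]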
Theories of statistical testing may be seen as attempts to provide systematic means for evaluating scientific conjectures on the basis of incomplete or inaccurate observational data. The Neyman–Pearson Theory of Testing (NPT) has purported to provide an objective means for testing statistical hypotheses corresponding to scientific claims. Despite their widespread use in science, methods of NPT have themselves been accused of failing to be objective; and the purported objectivity of scientific claims based upon NPT has been called into question. The purpose of this paper is first to clarify this question by examining the conceptions of (I) the function served by NPT in science, and (II) the requirements of an objective theory of statistics upon which attacks on NPT's objectivity are based. Our grounds for rejecting these conceptions suggest altered conceptions of (I) and (II) that might avoid such attacks. Second, we propose a reformulation of NPT, denoted by NPT*, based on these altered conceptions, and argue that it provides an objective theory of statistics. The crux of our argument is that by being able to objectively control error frequencies, NPT* is able to objectively evaluate what has or has not been learned from the result of a statistical test.
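What "objectively controlling error frequencies" amounts to can be illustrated with the standard Neyman–Pearson recipe; this is a textbook sketch, not the authors' reformulation NPT*.

% Textbook Neyman-Pearson recipe: reject H0 when the likelihood ratio
% exceeds a threshold k chosen in advance so that the type I error
% frequency equals alpha, whatever the data turn out to be.
\[
  \text{reject } H_{0}
  \iff
  \frac{L(x \mid H_{1})}{L(x \mid H_{0})} > k,
  \qquad
  k \text{ chosen so that }
  P\!\left(\text{reject } H_{0} \,\middle|\, H_{0}\right) = \alpha.
\]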
A theory of objective, single-case chances is presented and defended. The theory states that the chance of an event E is its epistemic probability, given maximal knowledge of the possible causes of E. This theory is uniquely successful in entailing all the known properties of chance, but involves heavy metaphysical commitment. It requires an objective rationality that determines proper degrees of belief in some contexts.
__Theory of Objective Mind__ is the first book of the important German social philosopher Hans Freyer to appear in English. The work of the neo-Hegelian Freyer, especially the much admired __Theory of Objective Mind__, had a notable influence on German thinkers to follow and on America's two greatest social theorists, Talcott Parsons and Edward Shils. Freyer took what remained valid in G. W. F. Hegel's work and drew upon the subsequent insights of the early work of Edmund Husserl in an effort to understand the nature of culture by clarifying methodologically the process of Verstehen, the relation between life and objectivated form, and the formation of the historical world as described by Wilhelm Dilthey and especially Georg Simmel. __Theory of Objective Mind__ remains a thought-provoking source of insight into the nature of human cognition and action, and necessarily of culture itself. Indeed, its pressing relevance for social philosophy today is clear from its analysis of nationality as a form of objective mind. No less relevant are its valuable analyses of creativity, tradition, and revolution as philosophical problems. For all those who seek to understand culture not just historically or sociologically but above all philosophically, __Theory of Objective Mind__ is indispensable.
Punctuation has so far attracted attention within the linguistics community mostly from a syntactic perspective. In this paper, we give a preliminary account of the information-based aspects of punctuation, drawing our points from assorted, naturally occurring sentences. We present our formal models of these sentences and the semantic contributions of punctuation marks. Our formalism is a simplified analogue of an extension, due to Nicholas Asher, of Discourse Representation Theory.
There are two basic approaches to the problem of induction: the empirical one, which deems that the possibility of induction depends on how the world was made (and how it works), and the logical one, which considers the formation (and function) of language. The first is closer to being useful for induction, while the second is more rigorous and clearer. The purpose of this paper is to create an empirical approach to induction that contains the same formal exactitude as the logical approach. This requires: (a) that the empirical conditions for the induction are enunciated and (b) that the most important results already obtained from inductive logic are again demonstrated to be valid. Here we will be dealing only with induction by elimination, namely the analysis of the experimental confutation of a theory. The result will be a rule of refutation that takes into consideration all of the empirical aspects of the experiment and has each of the asymptotic properties which inductive logic has shown to be characteristic of induction.
The effectiveness of complaint handling and service recovery policies in customer retention has been the focus of both scholars and service organizations. In the past decade, Justice Theory has provided the basis of the dominant theoretical framework for complaint management and service recovery. However, it does not explicitly address unfair trade practices, which constitute an ethical issue. Favorable outcomes in complaint handling may not be able to restore the reputation of a company or redress the potential harm perceived by consumers. Using face-to-face interviews, this study applies Fairness Theory to explore the psychological responses of consumers in the post-complaint phase, particularly in ethical judgment. The findings suggest that an unfavorable outcome in the post-complaint stage leads to counterfactual thinking by the consumer about the consumer's state of well-being. The complaint must be due to the discretionary actions of the service provider, whose accountability is assessed. Those harmful actions are then judged against an ethical standard. Explanations can reduce blame, and their effectiveness is moderated by outcome favorability but not ethical judgment. Favorable outcome, captured by "Would Perception," has only limited influence on Perceived Potential Harm (PPH), which is an important determinant of ethical judgment. This study makes both theoretical and practical contributions. It is the first study to validate Fairness Theory empirically and apply it to complaint handling as a complement to Justice Theory in the information and communication technology (ICT) service context. The study indicates that customers may condemn a service provider because of PPH even though the outcome is favorable. Unfair trade practices are what make customers hate ICT service providers.
Is the Copenhagen interpretation really a subjective one? What is the special role that observations play in quantum theory? Is there really something peculiar about the projection postulate? Why does the Copenhagenist treat probabilities as properties of individual systems? Is there a measurement problem, and if so, can it in principle be solved within the framework of quantum theory? We offer a conceptual treatment of the Copenhagen interpretation of quantum mechanics in which these questions are answered and contrast it with another interpretation which we call Popperism. The interpretations differ in their choice of a state description, from which it follows that they will offer a different account of the relation of measurements to states and different ways of describing quantum theory. But the two descriptions are equally consistent and physically completely equivalent.
A particular research program on mental imagery is defended against certain sweeping methodological criticisms that have been advanced against it. The central claim is that the approach taken in the program is an appropriate response to the problem of doing empirical research in a theoretical vacuum, and that when it is viewed in this perspective, the criticisms are not merely unfounded, they are inappropriate. The argument for this claim is developed by first describing the program and then analyzing the methodological rationale behind it.
A theory of anaesthesia is presented. It consists of four hypotheses: (1) The occurrence of states of consciousness causally depends on the formation of transient higher-order, self-referential mental representations. The occurrence of such states is identical with the appearance of conscious phenomena. Loss of consciousness will occur if and only if the brain's representational activity falls below a critical threshold. (2) Mental representations are instantiated by neural cell assemblies. (3) The formation of assemblies involves the activation of the NMDA receptor channel complex. The activation state of this receptor determines the rate at which assemblies are generated. (4) General anaesthetics have a common operative mechanism: they directly or indirectly affect the function of the NMDA system.
The article addresses the problem of how semantic information can be upgraded to knowledge. The introductory section explains the technical terminology and the relevant background. Section 2 argues that, for semantic information to be upgraded to knowledge, it is necessary and sufficient for it to be embedded in a network of questions and answers that correctly accounts for it. Section 3 shows that an information flow network of type A fulfils such a requirement, by warranting that the erotetic deficit, characterising the target semantic information t by default, is correctly satisfied by the information flow of correct answers provided by an informational source s. Section 4 illustrates some of the major advantages of such a Network Theory of Account (NTA) and clears the ground of a few potential difficulties. Section 5 clarifies why NTA and an informational analysis of knowledge, according to which knowledge is accounted semantic information, are not subject to Gettier-type counterexamples. A concluding section briefly summarises the results obtained.
I argue for a soft compatibilist theory of free will, i.e., one on which free will is compatible with both determinism and indeterminism, directly opposite hard incompatibilism, which holds free will to be incompatible with both determinism and indeterminism. My intuitions in this book are primarily based on an analysis of meditation, but my arguments are highly syncretic, deriving from many fields, including behaviorism, psychology, conditioning and deconditioning theory, philosophy of language, philosophy of mind, simulation theory, etc. I offer a causal/functional analysis of meta-mental control, or 'metacausality', cashed out in counterfactual terms, to solve what I call the easy problem of free will.
This paper will examine the implications of an extended "field theory of information," suggested by Wolfhart Pannenberg, specifically in the Christian understanding of creation. The paper argues that the Holy Spirit created the world as field, a concept from physics, and that the creation is directed by the logos utilizing information. Taking into account more recent developments of information theory, the essay further suggests that present creation has a causal impact upon the information utilized in creation. In order to adequately address Pannenberg's hypothesis that the logos utilizes information at creation, the essay will also include an introductory examination of Pannenberg's Christology, which shifts from a strict "from below" Christology to a more open "third way" of doing Christology beyond "above" and "below." The essay concludes with a brief section relating the implications of an extended "field theory of information" to creative inspiration, as well as parallels with human inspiration.
One major fault line in foundational theories of cognition is between the so-called "representational" and "non-representational" theories. Is it possible to formulate an intermediate approach for a foundational theory of cognition by defining a conception of representation that may bridge the fault line? Such an account of representation, as well as an account of correspondence semantics, is offered here. The account extends previously developed agent-based pragmatic theories of semantic information, where the meaning of an information state is defined by its interface role, to a theory that accommodates a notion of representation and correspondence semantics. It is argued that the account can be used to develop an intermediate approach to cognition, by showing that the major sources of tension between "representational" and "non-representational" theories may be eased.
Philosophers have relied on visual metaphors to analyse ideas and explain their theories at least since Plato. Descartes is famous for his system of axes, and Wittgenstein for his first design of truth table diagrams. Today, visualisation is a form of ‘computer-aided seeing’ of information in data. Hence, information is the fundamental ‘currency’ exchanged through a visualisation pipeline. In this article, we examine the types of information that may occur at different stages of a general visualisation pipeline. We do so from a quantitative and a qualitative perspective. The quantitative analysis is developed on the basis of Shannon’s information theory. The qualitative analysis is developed on the basis of Floridi’s taxonomy in the philosophy of information. We then discuss in detail how the condition of the ‘data processing inequality’ can be broken in a visualisation pipeline. This theoretical finding underlines the usefulness and importance of visualisation in dealing with the increasing problem of data deluge. We show that the subject of visualisation should be studied using both qualitative and quantitative approaches, preferably in an interdisciplinary synergy between information theory and the philosophy of information.
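The ‘data processing inequality’ invoked above has a standard statement, given here to make the claim about ‘breaking’ it in a visualisation pipeline easier to assess; this is the textbook form, and the article's own formulation may differ.

% Data processing inequality (textbook form): if X -> Y -> Z is a Markov
% chain (Z depends on X only through Y), processing cannot increase the
% information about X.
\[
  X \to Y \to Z
  \;\;\Longrightarrow\;\;
  I(X; Z) \;\le\; I(X; Y).
\]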
In this paper we review the general framework of operational probabilistic theories, along with the six axioms from which quantum theory can be derived. We argue that the OPT framework, along with a relaxed version of five of the axioms, defines a general information theory. We close the paper with considerations about the role of the observer in an OPT, and the interpretation of the von Neumann postulate and the Schrödinger-cat paradox.