A classic source for understanding the connections between information theory and physics, this text was written by one of the giants of 20th-century physics and is appropriate for upper-level undergraduates and graduate students. Topics include the principles of coding, coding problems and solutions, the analysis of signals, a summary of thermodynamics, thermal agitation and Brownian motion, and thermal noise in an electric circuit. A discussion of the negentropy principle of information introduces the author's renowned examination of Maxwell's demon. Concluding chapters explore the associations between information theory, the uncertainty principle, and physical limits of observation, in addition to problems related to computing, organizing information, and inevitable errors. 1962 ed. 81 figures. 14 tables.
We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information and Kolmogorov ("algorithmic") mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate distortion theory, which quantifies useful (in a certain sense) information. We use the communication of information as our guiding motif, and we explain how it relates to sequential question-answer sessions.
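The Shannon-side notions the abstract relates can be made concrete with a short computation. This is a generic illustration, not taken from the paper; the function names and the toy joint distribution are invented for the example:

```python
import math

def entropy(probs):
    """Shannon entropy H in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), with the joint distribution
    given as a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p
        py[y] = py.get(y, 0.0) + p
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

# Two perfectly correlated fair bits share exactly one bit of information.
joint = {(0, 0): 0.5, (1, 1): 0.5}
print(entropy([0.5, 0.5]))        # 1.0
print(mutual_information(joint))  # 1.0
```

Kolmogorov ("algorithmic") mutual information, by contrast, is defined for individual strings rather than for distributions, and Kolmogorov complexity is uncomputable; this is among the fundamental differences the paper discusses.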
Communication is an important feature of the living world that mainstream biology fails to deal with adequately. Two main disciplines can be applied to fill this gap: semiotics and information theory. Semiotics is a philosophical discipline mainly concerned with meaning; its application to life has already given rise to biosemiotics. Information theory is a mathematical discipline, coming from engineering, whose purpose is literal communication. Biosemiotics and information theory are thus concerned with distinct and complementary meanings of the word 'communication'. Since literal communication must be secured before semantics can be communicated, information theory is a necessary prerequisite to biosemiotics. Moreover, heredity is a purely literal communication process of capital importance, fully relevant to literal communication and hence to information theory. A short introduction to discrete information theory is proposed, centred on the concept of redundancy and its use in making sequences resilient to errors. Information theory has been an extremely active and fruitful domain of research and the motor of the tremendous progress of communication engineering in recent decades. Its possible connections with semantics and linguistics are briefly considered. Its applications to biology are suggested, especially as regards the error-correcting codes which are mandatory for securing the conservation of genomes. Biology needs information theory, so biologists and communication engineers should closely collaborate.
It is suggested that quantum mechanics is not fundamental but emerges from classical information theory applied to causal horizons. The path integral quantization and quantum randomness can be derived by considering information loss of fields or particles crossing Rindler horizons for accelerating observers. This implies that information is one of the fundamental roots of all physical phenomena. The connection between this theory and Verlinde's entropic gravity theory is also investigated.
Since the beginning of the 20th century, it became increasingly evident that information, besides matter and energy, is a major actor in life processes. Moreover, communication of information has been recognized as differentiating living things from inanimate ones, hence as specific to life processes. Therefore the sciences of matter and energy, chemistry and physics, do not suffice to deal with life processes. Biology should also rely on sciences of information. A majority of biologists, however, did not change their minds and continued to describe life in terms of chemistry and physics. They merely borrowed some vocabulary from the information sciences. The first science of information available for biological applications, semiotics, appeared at the end of the 19th century. It is a qualitative and descriptive science which stemmed from the efforts of linguists and philosophers to understand human language and is thus mainly concerned with semantics. Applying semiotics to biology resulted in today's Biosemiotics. Independently, an explosive expansion of communication engineering began in the second half of the 20th century. Besides tremendous progress in hardware technology, it was made possible by the onset of a science of literal communication: Information Theory (Shannon, Bell Syst Tech J 27:379–457, 623–656, 1948). Literal communication consists of faithfully transporting a message from one place to another, or from one instant to another. Because the meaning of a message does not matter for its transportation, information theory ignores semantics. This restriction enables information to be defined as a measurable quantity on which a mathematical theory of communication is founded. Although it lacked means of implementation at its beginning, information theory later became very successful for designing communication means.
Modern ones, like mobile phones, can be thought of as experimentally proving the relevance and accuracy of information theory, since their design and operation rely heavily on it. Information theory is plainly relevant to biological functions which involve literal communication, especially heredity. This paper is intended to compare the two approaches. It shows that, besides obvious differences, they have some points in common: for instance, the quantitative measurement of information obeys Peirce's triadic paradigm. They can also mutually enlighten each other. Using information theory, which is closer to the basic communication mechanisms, may appear as a preliminary step prior to more elaborate investigations. Criticizing genetics from outside, information theory furthermore reveals that the ability of the template-replication paradigm to faithfully conserve genomes is but a prejudice. Heredity actually demands error-correcting means, which impose severe constraints on the living world and must be recognized as biological facts.
Intelligent design advocate William Dembski has introduced a measure of information called "complex specified information", or CSI. He claims that CSI is a reliable marker of design by intelligent agents. He puts forth a "Law of Conservation of Information" which states that chance and natural laws are incapable of generating CSI. In particular, CSI cannot be generated by evolutionary computation. Dembski asserts that CSI is present in intelligent causes and in the flagellum of Escherichia coli, and concludes that neither has a natural explanation. In this paper, we examine Dembski's claims, point out significant errors in his reasoning, and conclude that there is no reason to accept his assertions.
The article addresses the problem of how semantic information can be upgraded to knowledge. The introductory section explains the technical terminology and the relevant background. Section 2 argues that, for semantic information to be upgraded to knowledge, it is necessary and sufficient to be embedded in a network of questions and answers that correctly accounts for it. Section 3 shows that an information flow network of type A fulfils such a requirement, by warranting that the erotetic deficit, characterising the target semantic information t by default, is correctly satisfied by the information flow of correct answers provided by an informational source s. Section 4 illustrates some of the major advantages of such a Network Theory of Account (NTA) and clears the ground of a few potential difficulties. Section 5 clarifies why NTA and an informational analysis of knowledge, according to which knowledge is accounted semantic information, is not subject to Gettier-type counterexamples. A concluding section briefly summarises the results obtained.
This paper outlines a quantitative theory of strongly semantic information (TSSI) based on truth-values rather than probability distributions. The main hypothesis supported in the paper is that the classic quantitative theory of weakly semantic information (TWSI), based on probability distributions, assumes that truth-values supervene on factual semantic information, yet this principle is too weak and generates a well-known semantic paradox, whereas TSSI, according to which factual semantic information encapsulates truth, can avoid the paradox and is more in line with the standard conception of what generally counts as semantic information. After a brief introduction, section two outlines the semantic paradox implied by TWSI, analysing it in terms of an initial conflict between two requisites of a quantitative theory of semantic information. In section three, three criteria of semantic information equivalence are used to provide a taxonomy of quantitative approaches to semantic information and introduce TSSI. In section four, some further desiderata that should be fulfilled by a quantitative TSSI are explained. From section five to section seven, TSSI is developed on the basis of a calculus of truth-values and semantic discrepancy with respect to a given situation. In section eight, it is shown how TSSI succeeds in solving the paradox. Section nine summarises the main results of the paper and indicates some future developments.
This paper will examine the implications of an extended "field theory of information," suggested by Wolfhart Pannenberg, specifically in the Christian understanding of creation. The paper argues that the Holy Spirit created the world as field, a concept from physics, and the creation is directed by the logos utilizing information. Taking into account more recent developments of information theory, the essay further suggests that present creation has a causal impact upon the information utilized in creation. In order to adequately address Pannenberg's hypothesis that the logos utilizes information at creation, the essay will also include an introductory examination of Pannenberg's Christology, which shifts from a strict "from below" Christology to a more open "third way" of doing Christology beyond "above" and "below." The essay concludes with a brief section relating the implications of an extended "field theory of information" to creative inspiration, as well as parallels with human inspiration.
Research in psychology suggests that some individuals are more sensitive to positive than to negative information while others are more sensitive to negative rather than positive information. I take these cognitive positive–negative asymmetries in information processing to a Bayesian decision-theory model and explore its consequences in terms of decisions and payoffs. I show that in monotone decision problems economic agents with more positive-responsive information structures are always better off, ex ante, when they face problems where payoffs are relatively more sensitive to the action chosen when the state of nature is favorable.
Weak Quantum Theory (WQT) and the Model of Pragmatic Information (MPI) are two psychophysical concepts developed on the basis of quantum physics. The present study contributes to their empirical examination. The issue of the study is whether WQT and MPI can not only explain 'psi' phenomena theoretically but also prove to be consistent with the empirical phenomenology of extrasensory perception (ESP). From the main statements of both models, 33 deductions for psychic readings are derived. Psychic readings are defined as settings in which psychics support or counsel clients by using information not mediated through the five senses. A qualitative approach is chosen to explore how the psychics experience extrasensory perceptions. Eight psychics are interviewed using a semi-structured method. The reports are examined regarding deductive and inductive aspects, using a multi-level structured content analysis. The vast majority of the deductions are clearly confirmed by the reports. Even though the study must be seen as an explorative attempt with many aspects still to be specified, WQT and MPI prove to be coherent and helpful concepts for explaining ESP in psychic readings.
An introduction to the structure, contents and specifications (especially the Systematifier) of the Information Coding Classification, developed in the seventies and used in many ways by the author and a few others following its publication in 1982. Its theoretical basis is explained as consisting in (1) the Integrative Level Theory, following an evolutionary approach to ontical areas, and integrating also on each level the aspects contained in the sequence of the levels, (2) the distinction between categories of form and categories of being, (3) the application of a feature of Systems Theory (namely the element position plan) and (4) the inclusion of a concept theory, distinguishing four kinds of relationships, originated by the kinds of characteristics (which are the elements of concepts to be derived from the statements on the properties of referents of concepts). Its special Subject Groups on each of its nine levels are outlined and the combinatory facilities at certain positions of the Systematifier are shown. Further elaboration and use have been suggested, be it only as a switching language between the six existing universal classification systems at present in use internationally.
We show that three fundamental information-theoretic constraints -- the impossibility of superluminal information transfer between two physical systems by performing measurements on one of them, the impossibility of broadcasting the information contained in an unknown physical state, and the impossibility of unconditionally secure bit commitment -- suffice to entail that the observables and state space of a physical theory are quantum-mechanical. We demonstrate the converse derivation in part, and consider the implications of alternative answers to a remaining open question about nonlocality and bit commitment.
One major fault line in foundational theories of cognition is between the so-called "representational" and "non-representational" theories. Is it possible to formulate an intermediate approach for a foundational theory of cognition by defining a conception of representation that may bridge the fault line? Such an account of representation, as well as an account of correspondence semantics, is offered here. The account extends previously developed agent-based pragmatic theories of semantic information, where the meaning of an information state is defined by its interface role, to a theory that accommodates a notion of representation and correspondence semantics. It is argued that the account can be used to develop an intermediate approach to cognition, by showing that the major sources of tension between "representational" and "non-representational" theories may be eased.
Information security can be of high moral value. It can equally be used for immoral purposes and have undesirable consequences. In this paper we suggest that critical theory can facilitate a better understanding of possible ethical issues and can provide support when finding ways of addressing them. The paper argues that critical theory has intrinsic links to ethics and that it is possible to identify concepts frequently used in critical theory to pinpoint ethical concerns. Using the example of UK electronic medical records, the paper demonstrates that a critical lens can highlight issues that traditional ethical theories tend to overlook. These are often linked to collective issues such as social and organisational structures, which philosophical ethics with its typical focus on the individual does not tend to emphasise. The paper suggests that this insight can help in developing ways of researching and innovating responsibly in the area of information security.
For largely historical reasons, information and communication technology in education has been heavily influenced by a form of constructivism based on the transmission and transformation of information. This approach has implications for both learning and teaching in the field. The assumptions underlying the approach are explored and a critique offered. Although the transmission approach is entrenched in procedures and pedagogies, it is increasingly challenged by an action-theoretical form of constructivism. In this 'ecology of ideas', the value of the two theoretical stances might be judged in terms of their practical utility and the contributions they make to understanding ICT.
The concept of “information” is virtually ubiquitous in contemporary cognitive science. It is claimed to be “processed” (in cognitivist theories of perception and comprehension), “stored” (in cognitivist theories of memory and recognition), and otherwise manipulated and transformed by the human central nervous system. Fred Dretske's extensive philosophical defense of a theory of informational content (“semantic” information) based upon the Shannon-Weaver formal theory of information is subjected to critical scrutiny. A major difficulty is identified in Dretske's equivocations in the use of the concept of a “signal” bearing informational content. Gibson's alternative conception of information (construed as analog by Dretske), while avoiding many of the problems located in the conventional use of “signal”, raises different but equally serious questions. It is proposed that, taken literally, the human CNS does not extract or process information at all; rather, whatever “information” is construed as locatable in the CNS is information only for an observer-theorist and only for certain purposes.
Three of the major issues in information ethics – intellectual property, speech regulation, and privacy – concern the morality of restricting people's access to certain information. Consequently, policies in these areas have a significant impact on the amount and types of knowledge that people acquire. As a result, epistemic considerations are critical to the ethics of information policy decisions (cf. Mill, 1978). The fact that information ethics is a part of the philosophy of information highlights this important connection with epistemology. In this paper, I illustrate how a value-theoretic approach to epistemology can help to clarify these major issues in information ethics. However, I also identify several open questions about epistemic values that need to be answered before we will be able to evaluate the epistemic consequences of many information policies.
Quantum information theory has given rise to a renewed interest in, and a new perspective on, the old issue of understanding the ways in which quantum mechanics differs from classical mechanics. The task of distinguishing between quantum and classical theory is facilitated by neutral frameworks that embrace both classical and quantum theory. In this paper, I discuss two approaches to this endeavour, the algebraic approach and the convex set approach, with an eye to the strengths of each, and the relations between the two. I end with a discussion of one particular model, the toy theory devised by Rob Spekkens, which, with minor modifications, fits neatly within the convex sets framework, and which displays in an elegant manner some of the similarities and differences between classical and quantum theories. The conclusion suggested by this investigation is that Schrödinger was right to find the essential difference between classical and quantum theory in their handling of composite systems, though Schrödinger's contention that it is entanglement that is the distinctive feature of quantum mechanics needs to be modified.
The distinction between explanation and prediction has received much attention in recent literature, but the equally important distinction between explanation and description (or between prediction and description) remains blurred. This latter distinction is particularly important in the social sciences, where probabilistic models (or theories) often play dual roles as explanatory and descriptive devices. The distinction between explanation (or prediction) and description is explicated in the present paper in terms of information theory. The explanatory (or predictive) power of a probabilistic model is identified with information taken from (or transmitted by) the environment (e.g., the independent, experimentally manipulated variables), while the descriptive power of a model reflects additional information taken from (or transmitted by) the data. Although information is usually transmitted by the data in the process of estimating parameters, it turns out that the number of free parameters is not a reliable index of transmitted information. Thus, the common practice of treating parameters as degrees-of-freedom in testing probabilistic models is questionable. Finally, this information-theoretic analysis of explanation, prediction, and description suggests ways of resolving some recent controversies surrounding the pragmatic aspects of explanation and the so-called structural identity thesis.
Information theory offers a measure of "mutual information" which provides an appropriate measure of tracking efficiency for the naturalistic epistemologist. The statistical entropy on which it is based is arguably the best way of characterizing the uncertainty associated with the behavior of a system, and it is ontologically neutral. Though not appropriate for the naturalization of meaning, mutual information can serve as a measure of epistemic success independent of semantic maps and payoff structures. While not containing payoffs as terms, mutual information places both upper and lower bounds on payoffs. This constitutes a non-trivial relationship to utility.
My aim in this paper is a modest one. I do not have any particular thesis to advance about the nature of entanglement, nor can I claim novelty for any of the material I shall discuss. My aim is simply to raise some questions about entanglement that spring naturally from certain developments in quantum information theory and are, I believe, worthy of serious consideration by philosophers of science. The main topics I discuss are different manifestations of quantum nonlocality, entanglement-assisted communication, and entanglement thermodynamics.
This thesis is a contribution to the debate on the implications of quantum information theory for the foundations of quantum mechanics. In Part I, the logical and conceptual status of various notions of information is assessed. It is emphasized that the everyday notion of information is to be firmly distinguished from the technical notions arising in information theory; however, it is maintained that in both settings 'information' functions as an abstract noun, hence does not refer to a particular or substance (the worth of this point is illustrated in application to quantum teleportation). The claim that 'Information is Physical' is assessed and argued to face a destructive dilemma. Accordingly, the slogan may not be understood as an ontological claim, but at best, as a methodological one. The reflections of Brukner and Zeilinger (2001) and Deutsch and Hayden (2000) on the nature of information in quantum mechanics are critically assessed and some results presented on the characterization of entanglement in the Deutsch-Hayden formalism. Some philosophical aspects of quantum computation are discussed and general morals drawn concerning the nature of quantum information theory. In Part II, following some preliminary remarks, two particular information-theoretic approaches to the foundations of quantum mechanics are assessed in detail. It is argued that Zeilinger's (1999) Foundational Principle is unsuccessful as a foundational principle for quantum mechanics. The information-theoretic characterization theorem of Clifton, Bub and Halvorson (2003) is assessed more favourably, but the generality of the approach is questioned and it is argued that the implications of the theorem for the traditional foundational problems in quantum mechanics remain obscure.
Recent suggestions to supply quantum mechanics (QM) with realistic foundations by reformulating it in light of quantum information theory (QIT) are examined and are found wanting by pointing to a basic conceptual problem that QIT itself ignores, namely, the measurement problem. Since one cannot ignore the measurement problem and at the same time pretend to be a realist, as they stand, the suggestions to reformulate QM in light of QIT are nothing but instrumentalism in disguise.
A remarkable theorem by Clifton, Bub and Halvorson (2003) (CBH) characterizes quantum theory in terms of information-theoretic principles. According to Bub (2004, 2005) the philosophical significance of the theorem is that quantum theory should be regarded as a "principle" theory about (quantum) information rather than a "constructive" theory about the dynamics of quantum systems. Here we criticize Bub's principle approach, arguing that if the mathematical formalism of quantum mechanics remains intact then there is no escape route from solving the measurement problem by constructive theories. We further propose a (Wigner-type) thought experiment that we argue demonstrates that quantum mechanics on the information-theoretic approach is incomplete.
Categorical logic has shown that modern logic is essentially the logic of subsets (or "subobjects"). In "subset logic," predicates are modeled as subsets of a universe and a predicate applies to an individual if the individual is in the subset. Partitions are dual to subsets, so there is a dual logic of partitions where a "distinction" [an ordered pair of distinct elements (u, u′) from the universe U] is dual to an "element". A predicate modeled by a partition π on U would apply to a distinction if the pair of elements was distinguished by the partition π, i.e., if u and u′ were in different blocks of π. Subset logic leads to finite probability theory by taking the (Laplacian) probability as the normalized size of each subset-event of a finite universe. The analogous step in the logic of partitions is to assign to a partition the number of distinctions made by the partition, normalized by the total number |U|² of ordered pairs from the finite universe. That yields a notion of "logical entropy" for partitions and a "logical information theory." The logical theory directly counts the (normalized) number of distinctions in a partition, while Shannon's theory gives the average number of binary partitions needed to make those same distinctions. Thus the logical theory is seen as providing a conceptual underpinning for Shannon's theory based on the logical notion of "distinctions".
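The normalized count of distinctions can be computed directly. A minimal sketch (the universe, partition, and function name are invented for the example, not taken from the paper):

```python
from fractions import Fraction

def logical_entropy(partition, universe_size):
    """Logical entropy h(pi): the number of distinctions (ordered pairs
    of elements split into different blocks) divided by |U|**2, which
    equals 1 - sum((|B|/|U|)**2) over the blocks B."""
    n = Fraction(universe_size)
    return 1 - sum((Fraction(len(block)) / n) ** 2 for block in partition)

# U = {0, 1, 2, 3} split into two blocks: 8 of the 16 ordered pairs
# are distinguished, so h = 1/2.
print(logical_entropy([{0, 1}, {2, 3}], 4))  # 1/2
```

For the same block probabilities (1/2, 1/2), Shannon entropy is 1 bit: a single binary partition suffices to make those distinctions, matching the abstract's contrast between directly counting distinctions and counting the binary partitions needed to make them.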
The Shannon information function (H) has been extensively used in ecology as a statistic of species diversity. Yet the use of the Shannon diversity index has also been criticized, mainly because of its ambiguous ecological interpretation and its relatively great sensitivity to the relative abundances of species in the community. In my opinion, the major shortcoming of the traditional perspective (on the possible relation of species diversity to information theory) is that species need an external receiver (the scientist or ecologist) in order to exist and transmit information. Because organisms are self-catalyzed replicating structures that can transmit genotypic information to offspring, it should be evident that any single species has two possible states or alternatives: to be or not to be. In other words, species have no need for an external receiver since they are their own receivers. Therefore, the amount of biological information (at the species scale) in a community with only one species would be 1 species, and not 0 bits as in the traditional perspective. Moreover, species diversity appears to be a monotonically increasing function of 2^H (or S) when all species are equally probable (S being species richness), and not a function of H as in the traditional perspective. To avoid the noted shortcoming, we could use 2^H (instead of H) for calculating species diversity and species evenness (= 2^H/S). However, owing to the relatively great sensitivity of H to the relative abundances of species in the community, the value of species dominance (= 1 − 2^H/S) is unreasonably high when differences between dominant and subordinate species are considerable, thereby lowering the value of species evenness and diversity. This unsatisfactory behaviour is even more evident for Simpson's index and related algorithms.
I propose the use of other statistics for a better analysis of community structure, their relationships being: species evenness + species dominance = 1; species diversity × species uniformity = 1; and species diversity = species richness × species evenness.
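Under the 2^H proposal above, the statistics and their identities can be sketched as follows (the function name and sample abundances are invented for the example):

```python
import math

def diversity_stats(abundances):
    """Community statistics built on 2**H rather than H."""
    total = sum(abundances)
    p = [a / total for a in abundances if a > 0]
    H = -sum(x * math.log2(x) for x in p)   # Shannon information (bits)
    S = len(p)                              # species richness
    diversity = 2 ** H                      # effective number of species
    evenness = diversity / S
    return {"richness": S, "diversity": diversity,
            "evenness": evenness, "dominance": 1 - evenness}

# Four equally abundant species: H = 2 bits, diversity = 2**2 = S = 4,
# evenness = 1, dominance = 0.
print(diversity_stats([25, 25, 25, 25]))
```

By construction, evenness + dominance = 1 and diversity = richness × evenness. For a strongly skewed community such as [97, 1, 1, 1] the dominance value rises above 0.7, illustrating the abundance sensitivity the abstract criticizes.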
Giovanni Sommaruga (ed.): Formal Theories of Information: From Shannon to Semantic Information Theory and General Concepts of Information. Content type: Journal Article. Pages: 35–40. DOI: 10.1007/s11023-011-9250-2. Author: Sebastian Sequoiah-Grayson, Department of Theoretical Philosophy, Faculty of Philosophy, University of Groningen, Groningen, The Netherlands. Journal: Minds and Machines. Online ISSN: 1572-8641. Print ISSN: 0924-6495. Volume 22, Number 1.
Giovanni Sommaruga (ed.): Formal Theories of Information: From Shannon to Semantic Information Theory and General Concepts of Information. Content type: Journal Article. Pages: 119–122. DOI: 10.1007/s11023-011-9228-0. Author: Giuseppe Primiero, Centre for Logic and Philosophy of Science, University of Ghent, Blandijnberg 2, Ghent, 9000 Belgium. Journal: Minds and Machines. Online ISSN: 1572-8641. Print ISSN: 0924-6495. Volume 21, Number 1.
…conclusions are only probably correct. On the other hand, algorithmic information theory provides a precise mathematical definition of the notion of a random or patternless sequence. In this paper we shall describe conditions under which, if the sequence of coin tosses in the Solovay–Strassen and Miller–Rabin algorithms is replaced by a sequence of heads and tails that is of maximal algorithmic information content, i.e., has maximal algorithmic randomness, then one obtains an error-free test for primality. These results are only of theoretical interest, since it is a manifestation of the Gödel incompleteness phenomenon that it is impossible to "certify" a sequence to be random by means of a proof, even though most sequences have this property. Thus by using certified random sequences one can in principle, but not in practice, convert probabilistic tests for primality into deterministic ones.
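For context, here is a minimal sketch of the standard Miller–Rabin test the abstract refers to, with pseudo-random witnesses standing in for the "coin tosses" (this is the textbook randomized version, not the authors' modification using algorithmically random sequences):

```python
import random

def miller_rabin(n, rounds=20, rng=random):
    """Probabilistic primality test. A 'composite' verdict is always
    correct; a 'probably prime' verdict errs with probability at most
    (1/4)**rounds for composite n."""
    if n < 2:
        return False
    if n in (2, 3):
        return True
    if n % 2 == 0:
        return False
    # Write n - 1 = d * 2**s with d odd.
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = rng.randrange(2, n - 1)       # the "coin tosses"
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False                  # a is a witness: n is composite
    return True                           # probably prime

print(miller_rabin(101))  # True
print(miller_rabin(9))    # False
```

Replacing `rng` with a source of maximal algorithmic information content is what, under the paper's conditions, turns the "probably prime" verdict into a certainty.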
Algorithmic information theory, or the theory of Kolmogorov complexity, has become an extraordinarily popular theory, and this is no doubt due, in some part, to the fame of Chaitin's incompleteness results arising from this field. Actually, there are two rather different results by Chaitin: the earlier one concerns the finite limit of the provability of complexity (see Chaitin, 1974a, 1974b, 1975a); the later is related to random reals and the halting probability (see Chaitin, 1986, 1987a, 1987b, 1988, 1989).
This paper argues that Information Theoretic Redundancy (ITR) is fundamentally a composite concept that has been continually misinterpreted since the very inception of Information Theory. We view ITR as compounded of true redundancy and partial redundancy. This demarcation of true redundancy illustrates a limiting-case phenomenon: the underlying metric (number of alternatives) differs only in degree, but the properties of this concept differ in kind from those of partial redundancy. Several other studies are instanced which also imply the composite nature of ITR. We thus provide broadly based but particular support for earlier generalized suggestions that it is the underlying calculus of Information Theory, rather than the ill-named concepts themselves, that provides something of a unitary language for the description of phenomena.
Hi everybody! It's a great pleasure for me to be back here at the new, improved Santa Fe Institute in this spectacular location. I guess this is my fourth visit and it's always very stimulating, so I'm always very happy to visit you guys. I'd like to tell you what I've been up to lately. First of all, let me say what algorithmic information theory is good for, before telling you about the new version of it I've got.
We examine the problem of the existence (in classical and/or quantum physics) of longitudinal limitations of measurability, defined as limitations preventing the measurement of a given quantity with arbitrarily high accuracy. We consider a measuring device as a generalized communication system, which enables us to use methods of information theory. As a direct consequence of the Shannon theorem on channel capacity, we obtain an inequality which limits the accuracy of a measurement in terms of the average power necessary to transmit the information content of the measurement itself. This inequality holds in a classical as well as in a quantum framework.
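The Shannon theorem invoked above bounds the rate of reliable transmission over a band-limited channel by C = B log2(1 + S/N). As a small numerical illustration (the function name is ours; the specific inequality derived in the paper is not reproduced here):

```python
import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity in bits per second:
    C = B * log2(1 + S/N), with S/N as a linear power ratio."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Example: a 3 kHz channel at 30 dB SNR (linear ratio 1000)
# carries roughly 30 kbit/s, no matter how the signal is coded.
capacity = shannon_capacity(3000.0, 1000.0)
```

Any scheme that tries to convey more measurement information per second than this capacity must either spend more power (raising S/N) or accept errors, which is the lever the abstract's accuracy-versus-power inequality turns on.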
Theoretical methods for empirical state determination of entangled two-level systems are analyzed in relation to information theory. We show that hidden variable theories would lead to a Shannon index of correlation between the entangled subsystems which is larger than that predicted by quantum mechanics. Canonical representations which have maximal correlations are treated by the use of Schmidt and Hilbert-Schmidt decomposition of the entangled states, including especially the Bohm singlet state and the GHZ entangled states. We show that quantum mechanics does not violate locality, but does violate realism.
The notion of information has figured prominently in much modern evolutionary theorizing. But while theorists usually concede the importance of distinguishing between our ordinary use of this notion and its special acceptation in information theory, some biological theorizing requires "information" to serve a double duty. Lorenz's ethological theorizing is a case in point, and this paper challenges its conceptual underpinnings. Special attention is paid to Lorenz's contention that adaptation to an environment is akin to representation, and it is urged that many purportedly (phylogenetically) adapted behaviors might well instead be better interpreted as non-information-laden (ontogenetically) adaptive behaviors.
Tononi has proposed a fundamental theory of consciousness he terms Integrated Information Theory (IIT). IIT purports to explain the quantity of conscious experience by linking it with integrated information: information shared by the system as a whole and quantified by adopting a modified version of Shannon's definition of information. Since the fundamental aspect of IIT is information, the theory allows for the multiple realizability of consciousness. While there are several concepts within IIT that need further theoretical development, the main failings of the theory are an absence of a link between conscious experience and awareness, and the use of Shannon's limited, data-based definition of information. These limitations prevent the theory from satisfying Chalmers' principles of structural coherence and organizational invariance, which any functionalist theory should obey. It is not clear whether IIT can be repaired without simply reducing the theory to a general statement of computational functionalism.
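The "Shannon definition of information" that this abstract criticizes as data-based is the entropy of a probability distribution, H = -Σ p_i log2 p_i. As a minimal sketch (the function name is ours; IIT's integrated-information measure Φ is a substantially more elaborate quantity built on top of such entropies):

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum(p * log2(p)) in bits,
    over a discrete distribution given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin carries 1 bit; a certain outcome carries 0 bits.
h_coin = shannon_entropy([0.5, 0.5])
h_sure = shannon_entropy([1.0])
```

The criticism in the abstract is that this quantity measures statistical surprise in data, and so a theory grounded in it inherits that limitation when it tries to account for experience.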