Informed consent is a central topic in contemporary biomedical ethics. Yet attempts to set defensible and feasible standards for consenting have led to persistent difficulties. In Rethinking Informed Consent in Bioethics, Neil Manson and Onora O'Neill set debates about informed consent in medicine and research in a fresh light. They show why informed consent cannot be fully specific or fully explicit, and why more specific consent is not always ethically better. They argue that consent needs distinctive communicative transactions, by which other obligations, prohibitions, and rights can be waived or set aside in controlled and specific ways. Their book offers a coherent, wide-ranging and practical account of the role of consent in biomedicine which will be valuable to readers working in a range of areas in bioethics, medicine and law.
Over the last few decades, multiple studies have examined the understanding of participants in clinical research. They show variable and often poor understanding of key elements of disclosure, such as expected risks and the experimental nature of treatments. Did the participants in these studies give valid consent? According to the standard view of informed consent they did not. The standard view holds that the recipient of consent has a duty to disclose certain information to the profferer of consent because valid consent requires that information to be understood. The contents of the understanding and disclosure requirements are therefore conceptually linked. In this paper, we argue that the standard view is mistaken. The disclosure and understanding requirements have distinct grounds tied to two different ways in which a token of consent can be rendered invalid. Analysis of these grounds allows us to derive the contents of the two requirements. It also implies that it is sometimes permissible to enroll willing participants who have not understood everything that they ought to be told about their clinical trials.
Informational theories of semantic content have recently been gaining prominence in the debate on the notion of mental representation. In this paper we examine new-wave informational theories which have a special focus on cognitive science. In particular, we argue that these theories face four important difficulties: they do not fully solve the problem of error, fall prey to the wrong distality attribution problem, have serious difficulties accounting for ambiguous and redundant representations, and fail to deliver a metasemantic theory of representation. Furthermore, we argue that these difficulties derive from their exclusive reliance on the notion of information, so we suggest that pure informational accounts should be complemented with functional approaches.
This report reviews what quantum physics and information theory have to tell us about the age-old question, How come existence? No escape is evident from four conclusions: (1) The world cannot be a giant machine, ruled by any preestablished continuum physical law. (2) There is no such thing at the microscopic level as space or time or spacetime continuum. (3) The familiar probability function or functional, and wave equation or functional wave equation, of standard quantum theory provide mere continuum idealizations and by reason of this circumstance conceal the information-theoretic source from which they derive. (4) No element in the description of physics shows itself as closer to primordial than the elementary quantum phenomenon, that is, the elementary device-intermediated act of posing a yes-no physical question and eliciting an answer or, in brief, the elementary act of observer-participancy. Otherwise stated, every physical quantity, every it, derives its ultimate significance from bits, binary yes-or-no indications, a conclusion which we epitomize in the phrase, it from bit.
This paper traces the application of information theory to philosophical problems of mind and meaning from the earliest days of the creation of the mathematical theory of communication. The use of information theory to understand purposive behavior, learning, pattern recognition, and more marked the beginning of the naturalization of mind and meaning. From the inception of information theory, Wiener, Turing, and others began trying to show how to make a mind from informational and computational materials. Over the last 50 years, many philosophers saw different aspects of the naturalization of the mind, though few saw at once all of the pieces of the puzzle that we now know. Starting with Norbert Wiener himself, philosophers and information theorists used concepts from information theory to understand cognition. This paper provides a window on the historical sequence of contributions made to the overall project of naturalizing the mind by philosophers from Shannon, Wiener, and MacKay, to Dennett, Sayre, Dretske, Fodor, and Perry, among others. At some time between 1928 and 1948, American engineers and mathematicians began to talk about `Theory of Information' and `Information Theory,' understanding by these terms approximately and vaguely a theory for which Hartley's `amount of information' is a basic concept. I have been unable to find out when and by whom these names were first used. Hartley himself does not use them nor does he employ the term `Theory of Transmission of Information,' from which the two other shorter terms presumably were derived. It seems that Norbert Wiener and Claude Shannon were using them in the Mid-Forties.
Information is a frequently used concept in many fields of investigation. Yet this concept is still not well understood when it is applied, for instance, to consciousness and its informational structure. This paper follows the concept of information from a philosophical to a physics perspective, showing in particular how the concept can be extended to matter in general and to the living in particular: as a result of the intimate interaction between matter and information, the human body appears as a bipolar informed-matter structure. It is then shown how the concept can be applied to consciousness, and an informational model of consciousness as an informational system of the human body is presented. Based on the anatomic architecture of the organism and on the inference of the specific information concepts, it is shown that the informational system of the human body can be described by seven informational subsystems, which are reflected in consciousness as corresponding cognitive centers. These results can explain the main properties of consciousness, both the cognitive and the extra-cognitive properties of the mind, such as those observed during near-death experiences and other similar phenomena. Moreover, the results of this modeling are compared with existing empirical concepts and models of the energetic architecture of the organism, showing their relevance for the understanding of consciousness.
Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both – although others disagree vehemently. Yet different cognitive scientists use ‘computation’ and ‘information processing’ to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism and connectionism on the other. We defend the relevance to cognitive science of both computation, in a generic sense that we fully articulate for the first time, and information processing, in three important senses of the term. Our account advances some foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates’ empirical aspects.
Searching for information is critical in many situations. In medicine, for instance, careful choice of a diagnostic test can help narrow down the range of plausible diseases that the patient might have. In a probabilistic framework, test selection is often modeled by assuming that people’s goal is to reduce uncertainty about possible states of the world. In cognitive science, psychology, and medical decision making, Shannon entropy is the most prominent and most widely used model to formalize probabilistic uncertainty and the reduction thereof. However, a variety of alternative entropy metrics are popular in the social and the natural sciences, computer science, and philosophy of science. Particular entropy measures have been predominant in particular research areas, and it is often an open issue whether these divergences emerge from different theoretical and practical goals or are merely due to historical accident. Cutting across disciplinary boundaries, we show that several entropy and entropy reduction measures arise as special cases in a unified formalism, the Sharma-Mittal framework. Using mathematical results, computer simulations, and analyses of published behavioral data, we discuss four key questions: How do various entropy models relate to each other? What insights can be obtained by considering diverse entropy models within a unified framework? What is the psychological plausibility of different entropy models? What new questions and insights for research on human information acquisition follow? Our work provides several new pathways for theoretical and empirical research, reconciling apparently conflicting approaches and empirical findings within a comprehensive and unified information-theoretic formalism.
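The unifying role of the Sharma-Mittal family described in this abstract can be sketched numerically. The following is a minimal illustration, not the authors' implementation: the parameterization with an order r and a degree t follows one common convention in the literature, and the function names are ours.

```python
import math

def sharma_mittal_entropy(p, r, t):
    """Sharma-Mittal entropy of order r and degree t (both != 1):
    H_{r,t}(p) = ((sum_i p_i**r) ** ((1 - t) / (1 - r)) - 1) / (1 - t)
    """
    s = sum(pi**r for pi in p if pi > 0)  # zero-probability outcomes contribute nothing
    return (s ** ((1 - t) / (1 - r)) - 1) / (1 - t)

def shannon_entropy(p):
    """Shannon entropy in nats: H(p) = -sum_i p_i * ln(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]
# In the joint limit r, t -> 1, Sharma-Mittal entropy recovers Shannon entropy;
# setting r = t gives Tsallis entropy, and t -> 1 alone gives Renyi entropy.
print(shannon_entropy(p))                          # ~1.0397 nats
print(sharma_mittal_entropy(p, r=1.001, t=1.001))  # numerically close to the Shannon value
```

Evaluating the generalized measure near r = t = 1 and comparing it with the Shannon value is one quick way to see how the special cases mentioned in the abstract sit inside the single two-parameter formalism.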
Informed decision-making (IDM) is considered an important ethical and legal requirement for population-based screening. Governments offering such screening have a duty to enable invitees to make informed decisions regarding participation. Various views exist on how to define and measure IDM in different screening programmes. In this paper we first address the question of which components should be part of IDM in the context of cancer screening. Departing from two diverging interpretations of the value of autonomy—as a right and as an ideal—we describe how this value is operationalized in the practice of informed consent in medicine and translate this to IDM in population-based cancer screening. Next, we specify the components of IDM, namely voluntariness and the requirements of disclosure and understanding. We argue that whereas disclosure should contain all information considered relevant in order to enable authentic IDM, understanding of basic information is sufficient for a valid IDM. In the second part of the paper we apply the capability approach in order to argue for the responsibility of the government to warrant equal and real opportunities for invitees for IDM. We argue that additional conditions beyond mere provision of information are needed in order to do so.
According to the Veridicality Thesis, information requires truth. On this view, smoke carries information about there being a fire only if there is a fire, the proposition that the earth has two moons carries information about the earth having two moons only if the earth has two moons, and so on. We reject this Veridicality Thesis. We argue that the main notions of information used in cognitive science and computer science allow A to have information about the obtaining of p even when p is false.
The semantic concept of information is one of the most important, and one of the most problematical concepts in biology. I suggest a broad definition of biological information: a source becomes an informational input when an interpreting receiver can react to the form of the source (and variations in this form) in a functional manner. The definition accommodates information stemming from environmental cues as well as from evolved signals, and calls for a comparison between information‐transmission in different types of inheritance systems—the genetic, the epigenetic, the behavioral, and the cultural‐symbolic. This comparative perspective highlights the different ways in which information is acquired and transmitted, and the role that such information plays in heredity and evolution. Focusing on the special properties of the transfer of information, which are very different from those associated with the transfer of materials or energy, also helps to uncover interesting evolutionary effects and suggests better explanations for some aspects of the evolution of communication.
Information-theoretic approaches to formal logic analyse the "common intuitive" concept of propositional implication (or argumental validity) in terms of information content of propositions and sets of propositions: one given proposition implies a second if the former contains all of the information contained by the latter; an argument is valid if the conclusion contains no information beyond that of the premise-set. This paper locates information-theoretic approaches historically, philosophically and pragmatically. Advantages and disadvantages are identified by examining such approaches in themselves and by contrasting them with standard transformation-theoretic approaches. Transformation-theoretic approaches analyse validity (and thus implication) in terms of transformations that map one argument onto another: a given argument is valid if no transformation carries it onto an argument with all true premises and false conclusion. Model-theoretic, set-theoretic, and substitution-theoretic approaches, which dominate current literature, can be construed as transformation-theoretic, as can the so-called possible-worlds approaches. Ontic and epistemic presuppositions of both types of approaches are considered. Attention is given to the question of whether our historically cumulative experience applying logic is better explained from a purely information-theoretic perspective or from a purely transformation-theoretic perspective or whether apparent conflicts between the two types of approaches need to be reconciled in order to forge a new type of approach that recognizes their basic complementarity.
Semantic information is usually supposed to satisfy the veridicality thesis: p qualifies as semantic information only if p is true. However, what it means for semantic information to be true is often left implicit, with correspondentist interpretations representing the most popular, default option. The article develops an alternative approach, namely a correctness theory of truth (CTT) for semantic information. This is meant as a contribution not only to the philosophy of information but also to the philosophical debate on the nature of truth. After the introduction, in Sect. 2, semantic information is shown to be translatable into propositional semantic information (i). In Sect. 3, i is polarised into a query (Q) and a result (R), qualified by a specific context, a level of abstraction and a purpose. This polarization is normalised in Sect. 4, where [Q + R] is transformed into a Boolean question and its relative yes/no answer [Q + A]. This completes the reduction of the truth of i to the correctness of A. In Sects. 5 and 6, it is argued that (1) A is the correct answer to Q if and only if (2) A correctly saturates Q by verifying and validating it (in the computer science’s sense of “verification” and “validation”); that (2) is the case if and only if (3) [Q + A] generates an adequate model (m) of the relevant system (s) identified by Q; that (3) is the case if and only if (4) m is a proxy of s (in the computer science’s sense of “proxy”) and (5) proximal access to m commutes with the distal access to s (in the category theory’s sense of “commutation”); and that (5) is the case if and only if (6) reading/writing (accessing, in the computer science’s technical sense of the term) m enables one to read/write (access) s. Sect. 7 provides some further clarifications about CTT, in the light of semantic paradoxes. Sect. 8 draws a general conclusion about the nature of CTT as a theory for systems designers, not just systems users. In the course of the article all technical expressions from computer science are explained.
Gaining information can be modelled as a narrowing of epistemic space. Intuitively, becoming informed that such-and-such is the case rules out certain scenarios or would-be possibilities. Chalmers’s account of epistemic space treats it as a space of a priori possibility and so has trouble in dealing with the information which we intuitively feel can be gained from logical inference. I propose a more inclusive notion of epistemic space, based on Priest’s notion of open worlds, which contains only those epistemic scenarios which are not obviously impossible. Whether something is obvious is not always a determinate matter and so the resulting picture is of an epistemic space with fuzzy boundaries.
There are many different notions of information in logic, epistemology, psychology, biology and cognitive science, which are employed differently in each discipline, often with little overlap. Since our interest here is in biological processes and organisms, we develop a taxonomy of functional information that extends the standard cue/signal distinction. Three main claims are advanced. First, this new taxonomy can be useful in describing learning and communication. Second, it avoids some problems that the natural/non-natural information distinction faces. Third, functional information is produced through exploration and stabilisation processes.
This paper focuses on Information Warfare—the warfare characterised by the use of information and communication technologies. This is a fast growing phenomenon, which poses a number of issues ranging from the military use of such technologies to its political and ethical implications. The paper presents a conceptual analysis of this phenomenon with the goal of investigating its nature. Such an analysis is deemed to be necessary in order to lay the groundwork for future investigations into this topic, addressing the ethical problems engendered by this kind of warfare. The conceptual analysis is developed in three parts. First, it delineates the relation between Information Warfare and the Information revolution. It then focuses attention on the effects that the diffusion of this phenomenon has on the concept of war. On the basis of this analysis, a definition of Information Warfare is provided: it is a phenomenon that is not necessarily sanguinary and violent, and that is transversal with respect to the environment in which it is waged, the way it is waged, and the ontological and social status of its agents. The paper concludes by taking into consideration the Just War Theory and the problems arising from its application to the case of Information Warfare.
This paper articulates an account of causation as a collection of information-theoretic relationships between patterns instantiated in the causal nexus. I draw on Dennett’s account of real patterns to characterize potential causal relata as patterns with specific identification criteria and noise tolerance levels, and actual causal relata as those patterns instantiated at some spatiotemporal location in the rich causal nexus as originally developed by Salmon. I develop a representation framework using phase space to precisely characterize causal relata, including their degree of counterfactual robustness, causal profiles, causal connectivity, and privileged grain size. By doing so, I show how the philosophical notion of causation can be rendered in a format that is amenable for direct application of mathematical techniques from information theory such that the resulting informational measures are causal informational measures. This account provides a metaphysics of causation that supports interventionist semantics and causal modeling and discovery techniques.
In this paper I will defend the incapacity of the informational frameworks in thermal physics, mainly those that historically and conceptually derive from the work of Brillouin (1962) and Jaynes (1957a), to robustly explain the approach of certain gaseous systems to their state of thermal equilibrium from the dynamics of their molecular components. I will further argue that, since their various interpretative, conceptual and technical-formal resources (e.g. epistemic interpretations of probabilities and entropy measures, identification of thermal entropy as Shannon information, and so on) are shown to be somehow incoherent, inconsistent or inaccurate, these informational proposals need to 'epistemically parasitize' the manifold of theoretical resources of Boltzmann's and Gibbs' statistical mechanics, respectively, in order to properly account for the equilibration process of an ideal gas from its microscopic properties. Finally, our conclusion leads us to adopt a sort of constructive skepticism regarding the explanatory value of the main informationalist trends in statistical thermophysics.
Maynard Smith notes that he provides a natural history and not a philosophical analysis of the use of concepts of information in contemporary biology. Just a natural history, however rich, would do little to resolve the ongoing controversy about the role of these concepts in biology. None of the disputants deny that the biological use of these concepts is pervasive. The dispute is about whether these concepts—and the framework in which they are embedded—continue to be of explanatory value in contemporary biology. Fortunately, Maynard Smith does much more than provide a natural history: his contribution is also a sustained attempt to justify many of the uses of information in biology.
The essential difficulty about Computer Ethics' (CE) philosophical status is a methodological problem: standard ethical theories cannot easily be adapted to deal with CE-problems, which appear to strain their conceptual resources, and CE requires a conceptual foundation as an ethical theory. Information Ethics (IE), the philosophical foundational counterpart of CE, can be seen as a particular case of environmental ethics or ethics of the infosphere. What is good for an information entity and the infosphere in general? This is the ethical question asked by IE. The answer is provided by a minimalist theory of deserts: IE argues that there is something more elementary and fundamental than life and pain, namely being, understood as information, and entropy, and that any information entity is to be recognised as the centre of a minimal moral claim, which deserves recognition and should help to regulate the implementation of any information process involving it. IE can provide a valuable perspective from which to approach, with insight and adequate discernment, not only moral problems in CE, but also the whole range of conceptual and moral phenomena that form the ethical discourse.
Machine generated contents note: 1. Introduction: does information matter?; Paul Davies and Niels Henrik Gregersen; Part I. History: 2. From matter to materialism ... and (almost) back Ernan McMullin; 3. Unsolved dilemmas: the concept of matter in the history of philosophy and in contemporary physics Philip Clayton; Part II. Physics: 4. Universe from bit Paul Davies; 5. The computational universe Seth Lloyd; 6. Minds and values in the quantum universe Henry Pierce Stapp; Part III. Biology: 7. The concept of information in biology John Maynard Smith; 8. Levels of information: Shannon-Boltzmann-Darwin Terrence W. Deacon; 9. Information and communication in living matter Bernd-Olaf Küppers; 10. Semiotic freedom: an emerging force Jesper Hoffmeyer; 11. Care on earth: generating informed concern Holmes Rolston; Part IV. Philosophy and Theology: 12. The sciences of complexity - a new theological resource? Arthur Peacocke; 13. God as the ultimate informational principle Keith Ward; 14. Information, theology and the universe John F. Haught; 15. God, matter, and information: towards a Stoicizing Logos christology Niels Henrik Gregersen; 16. What is the 'spiritual body'? Michael Welker; Index.
The principle of 'information causality' can be used to derive an upper bound---known as the 'Tsirelson bound'---on the strength of quantum mechanical correlations, and has been conjectured to be a foundational principle of nature. In this paper, however, I argue that the principle has not to date been sufficiently motivated to play this role; the motivations that have so far been given are either unsatisfactorily vague or else amount to little more than an appeal to intuition. I then consider how one might begin to successfully motivate the principle. I argue that a compelling way of so doing is to understand it as a generalisation of Einstein's principle of the mutually independent existence---the 'being-thus'---of spatially distant things, interpreted as a special methodological principle. More specifically: I describe an argument, due to Demopoulos, to the effect that the quantum-mechanical no-signalling condition can be viewed as a generalisation, appropriate to an irreducibly statistical theory such as quantum mechanics, of the Einsteinian principle. And I then argue that a compelling way to motivate information causality is to in turn consider it as a further generalisation of the Einsteinian principle that is appropriate to a theory of communication. I nevertheless describe important obstacles that must yet be overcome if the project of establishing information causality as a foundational principle of nature is to succeed.
Informed consent, decision-making styles and the role of patient-physician relationships are imperative aspects of clinical medicine worldwide. We present the case of a 74-year-old woman afflicted with advanced liver cancer whose attending physician, per request of the family, did not inform her of her true diagnosis. In our analysis, we explore the differences in informed-consent styles between patients who hold an "independent" and "interdependent" construal of the self and then highlight the possible implications maintained by this position in the context of international clinical ethics. Finally, we discuss the need to reassess informed-consent styles suitable to the needs of each patient regardless of whether he or she resides in the United States or in Japan.
This paper presents Integrated Information Theory (IIT) 4.0. IIT aims to account for the properties of experience in physical (operational) terms. It identifies the essential properties of experience (axioms), infers the necessary and sufficient properties that its substrate must satisfy (postulates), and expresses them in mathematical terms. In principle, the postulates can be applied to any system of units in a state to determine whether it is conscious, to what degree, and in what way. IIT offers a parsimonious explanation of empirical evidence, makes testable predictions, and permits inferences and extrapolations. IIT 4.0 incorporates several developments of the past ten years, including a more accurate translation of axioms into postulates and mathematical expressions, the introduction of a unique measure of intrinsic information that is consistent with the postulates, and an explicit assessment of causal relations. By fully unfolding a system's irreducible cause-effect power, the distinctions and relations specified by a substrate can account for the quality of experience.
We argue for the addition of trauma informed awareness, training, and skill in clinical ethics consultation by proposing a novel framework for Trauma Informed Ethics Consultation (TIEC). This approach expands on the American Society for Bioethics and Humanities (ASBH) framework for, and key insights from feminist approaches to, ethics consultation, and the literature on trauma informed care (TIC). TIEC keeps ethics consultation in line with the provision of TIC in other clinical settings. Most crucially, TIEC (like TIC) is systematically sensitive to culture, history, difference, power, social exclusion, oppression, and marginalization. By engaging a neonatal intensive care ethics consult example, we define our TIEC approach and illustrate its application. Through TIEC we argue that it is the role of ethics consultants not only to hold open moral spaces, but to furnish them in morally habitable ways for all stakeholders involved in the ethics consultation process, including patients, surrogates, and practitioners.
Broadening participation in early science, technology, engineering and math learning outside of school is important for families experiencing poverty. We evaluated variations of the Teaching Together STEM pre-kindergarten program for increasing parent involvement in STEM learning. This informal STEM, family engagement program was offered in 20 schools where 92% of students received free/reduced lunch. The core treatment included a series of family education workshops, text messages, and family museum passes. The workshops were delivered at school sites by museum outreach educators. We randomly assigned schools to business-as-usual control or one of three additive treatment groups. Using an additive treatment design, we provided the core program in Treatment A, we added take-home STEM materials in Treatment B, and added materials + parent monetary rewards in Treatment C. The primary outcome was parent involvement in STEM. There were no significant impacts of any treatment on parent involvement; however, the groups that added take-home materials had larger effect sizes on parent involvement at posttest and later, kindergarten follow-up. Adding parent monetary rewards only produced short-term improvements in parent involvement that faded at follow-up. We discuss implications for other community-sponsored family engagement programs focused on informal STEM learning, including considering characteristics of families who were more versus less likely to attend. These null findings suggest that alternatives to in-person family education workshops should be considered when parents are experiencing poverty and have competing demands on their time.
Luciano Floridi has impressively applied the concept of information to problems in semantics and epistemology, among other areas. In this essay, I briefly review two areas where I think one may usefully raise questions about some of Floridi's conclusions. One area is in the project to naturalize semantics and Floridi's use of the derived versus nonderived notion of semantic content. The other area is in the logic of information and knowledge and whether knowledge based on information necessarily supports closure, in every instance. I suggest that it does not and, thereby, raise a challenge to Floridi's logic of being informed.
Background The rise in genomic and biobanking research worldwide has led to the development of different informed consent models for use in such research. This study analyses consent documents used by investigators in the H3Africa (Human Heredity and Health in Africa) Consortium. Methods A qualitative method for text analysis was used to analyse consent documents used in the collection of samples and data in H3Africa projects. Thematic domains included type of consent model, explanations of genetics/genomics, data sharing and feedback of test results. Results Informed consent documents for 13 of the 19 H3Africa projects were analysed. Seven projects used broad consent, five projects used tiered consent and one used specific consent. Genetics was mostly explained in terms of inherited characteristics, heredity and health, genes and disease causation, or disease susceptibility. Only one project made provisions for the feedback of individual genetic results. Conclusion H3Africa research makes use of three consent models—specific, tiered and broad consent. We outlined different strategies used by H3Africa investigators to explain concepts in genomics to potential research participants. To further ensure that the decision to participate in genomic research is informed and meaningful, we recommend that innovative approaches to the informed consent process be developed, preferably in consultation with research participants, research ethics committees and researchers in Africa.
Scholars and policy makers often refer to the “information society”. And yet, it is more accurate to speak of societies in the plural, each different, some of which may qualify as information societies at different levels of maturity. Through an exploration of the concepts of expectations, education and innovation, this paper examines what it means for an information society to be more or less mature than others, and the impact of this on the ongoing digital revolution.
Maynard Smith is right that one of the most striking features of contemporary biology is the ever-increasing prominence of the concept of information, along with related concepts like representation, programming, and coding. Maynard Smith is also right that this is surely a phenomenon which philosophers of science should examine closely. We should try to understand exactly what sorts of theoretical commitment are made when biological systems are described in these terms, and what connection there is between semantic descriptions in biology and in other domains.
In this conceptual paper, the traditional conceptualization of sustainable entrepreneurship is challenged because of a fundamental tension between processes involved in sustainable development and processes involved in entrepreneurship: the concept of sustainable business models contains a paradox, because sustainability involves the reduction of information asymmetries, whereas entrepreneurship involves enhanced and secured levels of information asymmetries. We therefore propose a new and integrated theory of sustainable entrepreneurship that overcomes this paradox. The basic argument is that environmental problems have to be conceptualized as wicked problems or sustainability-related ecosystem failures. Because all actors involved in the entrepreneurial process are characterized by their epistemic insufficiency regarding the solving of these problems, the role of information in the sustainable entrepreneurial process changes. On the one hand, the reduction of information asymmetries primarily aims to enable actors to become critical of sustainable entrepreneurs’ actual business models. On the other hand, the epistemic insufficiency of sustainable entrepreneurs guarantees that information asymmetries remain as a source of new sustainable business opportunities. Three further characteristics of sustainable entrepreneurs are distinguished: sustainability and entrepreneurship-related risk-taking; sustainability and entrepreneurship-related self-efficacy; and the development of satisficing and open-ended solutions, together with multiple stakeholders.
Quantum Information Theory and the Foundations of Quantum Mechanics is a conceptual analysis of one of the most prominent and exciting new areas of physics, providing the first full-length philosophical treatment of quantum information theory and the questions it raises for our understanding of the quantum world. Beginning from a careful, revisionary analysis of the concepts of information in the everyday and classical information-theory settings, Christopher G. Timpson argues for an ontologically deflationary account of the nature of quantum information. Against what many have supposed, quantum information can be clearly defined (it is not a primitive or vague notion) but it is not part of the material contents of the world. Timpson's account sheds light on the nature of nonlocality and information flow in the presence of entanglement and, in particular, dissolves puzzles surrounding the remarkable process of quantum teleportation. In addition it permits a clear view of what the ontological and methodological lessons provided by quantum information theory are; lessons which bear on the gripping question of what role a concept like information has to play in fundamental physics. Topics discussed include the slogan 'Information is Physical', the prospects for an informational immaterialism (the view that information rather than matter might fundamentally constitute the world), and the status of the Church-Turing hypothesis in light of quantum computation. With a clear grasp of the concept of information in hand, Timpson turns his attention to the pressing question of whether advances in quantum information theory pave the way for the resolution of the traditional conceptual problems of quantum mechanics: the deep problems which loom over measurement, nonlocality and the general nature of quantum ontology.
He marks out a number of common pitfalls to be avoided before analysing in detail some concrete proposals, including the radical quantum Bayesian programme of Caves, Fuchs, and Schack. One central moral which is drawn is that, for all the interest that the quantum information-inspired approaches hold, no cheap resolutions to the traditional problems of quantum mechanics are to be had.
Christopher G. Timpson provides the first full-length philosophical treatment of quantum information theory and the questions it raises for our understanding of the quantum world. He argues for an ontologically deflationary account of the nature of quantum information, which is grounded in a revisionary analysis of the concepts of information.
Informal logic is a method of argument analysis which is complementary to that of formal logic, providing for the pragmatic treatment of features of argumentation which cannot be reduced to logical form. The central claim of this paper is that a more nuanced understanding of mathematical proof and discovery may be achieved by paying attention to the aspects of mathematical argumentation which can be captured by informal, rather than formal, logic. Two accounts of argumentation are considered: the pioneering work of Stephen Toulmin [The uses of argument, Cambridge University Press, 1958] and the more recent studies of Douglas Walton [e.g. The new dialectic: Conversational contexts of argument, University of Toronto Press, 1998]. The focus of both of these approaches has largely been restricted to natural language argumentation. However, Walton’s method in particular provides a fruitful analysis of mathematical proof. He offers a contextual account of argumentational strategies, distinguishing a variety of different types of dialogue in which arguments may occur. This analysis represents many fallacious or otherwise illicit arguments as strategies which are admissible in some types of dialogue being deployed in contexts in which they are inadmissible. I argue that mathematical proofs are deployed in a greater variety of types of dialogue than has commonly been assumed. I proceed to show that many of the important philosophical and pedagogical problems of mathematical proof arise from a failure to make explicit the type of dialogue in which the proof is introduced.
The dominant approach in privacy theory defines information privacy as some form of control over personal information. In this essay, I argue that the control approach is mistaken, but for different reasons than those offered by its other critics. I claim that information privacy involves the drawing of epistemic boundaries: boundaries between what others should and shouldn’t know about us. While controlling what information others have about us is one strategy we use to draw such boundaries, it is not the only one. We conceal information about ourselves and we reveal it. And since the meaning of information is not self-evident, we also work to shape how others contextualize and interpret the information about us that they have. Information privacy is thus about more than controlling information; it involves the constant work of producing and managing public identities, what I call “social self-authorship.” In the second part of the essay, I argue that thinking about information privacy in terms of social self-authorship helps us see ways in which information technology threatens privacy that the control approach misses. Namely, information technology makes social self-authorship invisible and unnecessary, by making it difficult for us to know when others are forming impressions about us, and by providing them with tools for making assumptions about who we are which obviate the need for our involvement in the process.
Trials with highly unfavourable risk–benefit ratios for participants, like HIV cure trials, raise questions about the quality of the consent of research participants. Why, it may be asked, would a person with HIV who is doing well on antiretroviral therapy be willing to jeopardise his health by enrolling in such a trial? We distinguish three concerns: first, how information is communicated to potential participants; second, participants’ motivations for enrolling in potentially high risk research with no prospect of direct benefit; and third, participants’ understanding of the details of the trials in which they enrol. We argue that the communication concern is relevant to the validity of informed consent and the quality of decision making; that the motivation concern does not identify a genuine problem with either the validity of consent or the quality of decision making; and that the understanding concern may not be relevant to the validity of consent but is relevant to the quality of decision making. In doing so, we derive guidance points for researchers recruiting and enrolling participants into their HIV cure trials, as well as for the research ethics committees reviewing proposed studies.
This article proposes a new interpretation of mutual information (MI). We examine three extant interpretations of MI: by reduction in doubt, by reduction in uncertainty, and by divergence. We argue that the first two are inconsistent with the epistemic value of information (EVI) assumed in many applications of MI: the greater the amount of information we acquire, the better our epistemic position, other things being equal. The third interpretation is consistent with EVI, but it faces the problem of measure sensitivity and fails to justify the use of MI in giving definitive answers to questions of information. We propose a fourth interpretation of MI by reduction in expected inaccuracy, where inaccuracy is measured by a strictly proper monotonic scoring rule. It is shown that the answers to questions of information given by MI are definitive whenever this interpretation is appropriate, and that it is appropriate in a wide range of applications with epistemic implications. 1 Introduction; 2 Formal Analyses of the Three Interpretations; 2.1 Reduction in doubt; 2.2 Reduction in uncertainty; 2.3 Divergence; 3 Inconsistency with Epistemic Value of Information; 4 Problem of Measure Sensitivity; 5 Reduction in Expected Inaccuracy; 6 Resolution of the Problem of Measure Sensitivity; 6.1 Alternative measures of inaccuracy; 6.2 Resolution by strict propriety; 6.3 Range of applications; 7 Global Scoring Rules; 8 Conclusion.
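On the logarithmic scoring rule (one strictly proper rule), the expected-inaccuracy reading has a concrete arithmetic instance: the expected log-loss inaccuracy about X equals the entropy H(X), so the expected reduction in inaccuracy after observing Y is H(X) - H(X|Y), which coincides with MI(X;Y). A minimal sketch with an illustrative joint distribution (the numbers and identifiers are my own, not from the article):

```python
import math

# Illustrative joint distribution of two binary variables X and Y
# (numbers are made up, not from the article).
joint = {(0, 0): 0.4, (0, 1): 0.1, (1, 0): 0.1, (1, 1): 0.4}

def marginal(joint, axis):
    # Sum out the other variable.
    m = {}
    for xy, p in joint.items():
        m[xy[axis]] = m.get(xy[axis], 0.0) + p
    return m

def entropy(dist):
    # Shannon entropy in bits; under the log scoring rule this is also
    # the expected inaccuracy of announcing the distribution itself.
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

px, py = marginal(joint, 0), marginal(joint, 1)

# Mutual information from its standard definition.
mi = sum(p * math.log2(p / (px[x] * py[y]))
         for (x, y), p in joint.items() if p > 0)

# Expected reduction in log-loss inaccuracy about X after observing Y:
# H(X) - H(X|Y), with H(X|Y) = H(X,Y) - H(Y).
reduction = entropy(px) - (entropy(joint) - entropy(py))

assert abs(mi - reduction) < 1e-12  # the two quantities coincide
```

The coincidence holds for any joint distribution, which is what makes the log rule a natural test case for the expected-inaccuracy interpretation.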
In this paper, I present an informational approach to the nature of personal identity. In “Plato and the problem of the chariot”, I use Plato’s famous metaphor of the chariot to introduce a specific problem regarding the nature of the self as an informational multiagent system: what keeps the self together as a whole and coherent unity? In “Egology and its two branches” and “Egology as synchronic individualisation”, I outline two branches of the theory of the self: one concerning the individualisation of the self as an entity, the other concerning the identification of such an entity. I argue that both presuppose an informational approach, defend the view that the individualisation of the self is logically prior to its identification, and suggest that such individualisation can be provided in informational terms. Hence, in “A reconciling hypothesis: the three membranes model”, I offer an informational individualisation of the self, based on a tripartite model, which can help to solve the problem of the chariot. Once this model of the self is outlined, in “ICTs as technologies of the self” I use it to show how ICTs may be interpreted as technologies of the self. In “The logic of realisation”, I introduce the concept of “realisation” (Aristotle’s anagnorisis) and support the rather Spinozian view according to which, from the perspective of informational structural realism, selves are the final stage in the development of informational structures. The final section, “Conclusion: from the egology to the ecology of the self”, briefly concludes the article with a reference to the purposeful shaping of the self, in a shift from egology to ecology.
Informed consent in medical practice is essential and a global standard that should be sought whenever doctors interact with patients. Its intensity varies depending on the invasiveness and risks associated with the anticipated treatment. To our knowledge, there has not been any systematic review of consent practices to document best practices and identify areas that need improvement in our setting. The objective of this study was to evaluate the informed consent practices of surgeons at university teaching hospitals in a low-resource setting.
A framework for pragmatic analysis is proposed which treats discourse as a game, with context as a scoreboard organized around the questions under discussion by the interlocutors. The framework is intended to be coordinated with a dynamic compositional semantics. Accordingly, the context of utterance is modeled as a tuple of different types of information, and the questions therein, modeled (as is usual in formal semantics) as alternative sets of propositions, constrain the felicitous flow of discourse. A requirement of Relevance is satisfied by an utterance (whether an assertion, a question or a suggestion) iff it addresses the question under discussion. Finally, it is argued that the prosodic focus of an utterance canonically serves to reflect the question under discussion (at least in English), placing additional constraints on felicity in context.
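The scoreboard idea above can be sketched in code: questions are modeled as alternative sets of propositions, and an assertion counts as Relevant iff it settles at least one alternative. This is a deliberately simplified toy; the world names, the relevance test, and all identifiers are illustrative assumptions, not the paper's formal definitions:

```python
# Propositions are modeled as frozensets of possible worlds; a question
# is a set of alternative propositions; the scoreboard keeps a stack of
# questions under discussion (QUD).

worlds = frozenset({"w1", "w2", "w3", "w4"})

# Question "Who called?" as an alternative set: {Ann called, Bob called}.
ann_called = frozenset({"w1", "w2"})
bob_called = frozenset({"w2", "w3"})
qud_stack = [{ann_called, bob_called}]

def is_relevant(assertion, qud):
    # An assertion is relevant iff it settles some alternative: either
    # it entails the alternative (a complete answer to that alternative)
    # or it rules the alternative out entirely (a partial answer).
    return any(assertion <= alt or assertion.isdisjoint(alt)
               for alt in qud)

assert is_relevant(ann_called, qud_stack[-1])           # answers the QUD
assert is_relevant(frozenset({"w4"}), qud_stack[-1])    # rules both out
assert not is_relevant(frozenset({"w1", "w3"}), qud_stack[-1])  # settles nothing
```

The last assertion overlaps both alternatives without deciding either, so it fails the Relevance requirement relative to this QUD.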
Applied a theory of information integration to decision making with probabilistic events. 10 undergraduates judged the subjective worth of duplex bets that included independent gain and lose components. The worth of each component was assumed to be the product of a subjective weight that reflected the probability of winning or losing, and the subjective worth of the money to be won or lost. The total worth of the bet was the sum of the worths of the 2 components. Thus, each judgment required multiplying and adding operations. The multiplying model worked quite well in 4 experimental conditions. The adding model showed more serious discrepancies, though these were small in magnitude. The theory of functional measurement was applied to scale the subjective values of the probability and money stimuli. Subjective and objective values were nonlinearly related both for probability and for money.
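The multiplying-and-adding model described above can be sketched as follows; the particular weight and value functions are hypothetical nonlinear placeholders, not the fitted functions from the study:

```python
def weight(p):
    # Subjective weight of an objective probability
    # (hypothetical nonlinear form, for illustration only).
    return p ** 0.8

def value(amount):
    # Subjective worth of money, sign-preserving so losses count
    # negatively (hypothetical nonlinear form).
    sign = 1 if amount >= 0 else -1
    return sign * (abs(amount) ** 0.9)

def duplex_bet_worth(p_win, win_amount, p_lose, lose_amount):
    # Each component's worth is weight x value (multiplying operation);
    # the bet's total worth is the sum of the two components (adding).
    return (weight(p_win) * value(win_amount)
            + weight(p_lose) * value(-abs(lose_amount)))

# A bet with a 0.5 chance to win $10 and a 0.2 chance to lose $5.
w = duplex_bet_worth(0.5, 10, 0.2, 5)
```

Because the gain and lose components enter additively, the model predicts that changing one component's probability leaves the other component's contribution untouched, which is what the functional-measurement analysis exploits.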
Intelligent design advocate William Dembski has introduced a measure of information called "complex specified information", or CSI. He claims that CSI is a reliable marker of design by intelligent agents. He puts forth a "Law of Conservation of Information" which states that chance and natural laws are incapable of generating CSI. In particular, CSI cannot be generated by evolutionary computation. Dembski asserts that CSI is present in intelligent causes and in the flagellum of Escherichia coli, and concludes that neither has a natural explanation. In this paper, we examine Dembski's claims, point out significant errors in his reasoning, and conclude that there is no reason to accept his assertions.
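A standard illustration in this debate is Dawkins's "weasel" program, which critics of Dembski's conservation claim often invoke: chance variation plus cumulative selection reaches a pre-specified target string. The sketch below is my own rendering of that familiar demonstration, not the authors' analysis:

```python
import random

TARGET = "METHINKS IT IS LIKE A WEASEL"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ "

def mutate(parent, rate=0.05):
    # Each character mutates independently with the given rate (chance).
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in parent)

def score(s):
    # Number of positions matching the target (the "specification").
    return sum(a == b for a, b in zip(s, TARGET))

random.seed(0)  # deterministic run for reproducibility
current = "".join(random.choice(ALPHABET) for _ in TARGET)
generations = 0
while current != TARGET:
    # Selection: keep the fittest of the parent and 100 mutant offspring.
    offspring = [mutate(current) for _ in range(100)]
    current = max([current] + offspring, key=score)
    generations += 1
```

Retaining the parent makes the fitness score non-decreasing, so the loop reaches the target in a modest number of generations rather than the astronomical time a pure random search would need.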
Within the relevant semantics and pragmatics literature the terms “presupposition” and “conventional implicature” are used in a variety of different, but frequently overlapping, ways. The overlaps are perhaps not surprising, given that the two categories of conveyed meaning share the property of remaining constant in the scope of other operators, a property usefully characterized as projectivity. One of my purposes in this paper will be to try to clarify these different usages. In addition, we explore two further properties which are shared by some of these projective contents: strong contextual felicity and neutralizability. The idea is to try to explain all three properties by taking into account information packaging.