The term 'information' has become ubiquitous. In fact, the notion of information is arguably amongst the most important of our 'Information Age'. But just what exactly is information? This is a question without a straightforward answer, particularly as information is a polysemantic concept, applied to a wide range of phenomena across many disciplines. Some of the most common formulations developed in the 20th century include: Fisher information, Shannon information, Kolmogorov complexity, quantum information, information as a state of an agent, and semantic information. Some notions of information are associated with central concerns of philosophy and have been used in various ways. Dealings with information from within philosophy include work on conceptions and analyses of information, the application of information to philosophical topics such as epistemology and logic, and information ethics. Further to this, conceptions of information within other disciplines such as biology and physics can be and have been of interest within philosophy. Two conceptions of information of particular interest to philosophers are semantic information and environmental information, which can be seen as roughly correlating to the Gricean notions of non-natural and natural meaning respectively.
The Phrenological Argument, also known as God's Parapraxis, is a provocative new philosophical argument that showcases God's Mental State of Inadequacy in his creation of the Human Mind. Lawsin argues that if God has a perfect mind flawlessly capable of knowing things in advance, then he should have anticipated ahead of time the repercussions of creating a human mind that cannot fathom Truth and Reality. But since he created a mind incapable of knowing what is real and what is true, then God carries an erroneous lapse somewhere in his memory. This mental lapse offers solid evidence that God is not, after all, all-knowing.
The basic form of the argument is: 1. God created the mind; 2. The mind cannot detect truth and reality; 3. Therefore, God will never be known.
A modified version: 1. If God created the mind for men to know him; 2. But God failed to foresee his parapraxis; 3. Then God is not all-knowing.
A follow-up variation: 1. If God is real; 2. But the mind can't detect reality; 3. Therefore, the mind can't detect God.
Furthermore, man created the idea of god. Like all ideas, the idea of god is nothing but an assumption, a guesswork, a supposition. For example, Zero and One are abstract concepts. They don't exist in the natural world. They are not naturally inherent. They only exist in the mind by assumption. Mathematically, both words are called numerals by definition. When Zero is represented with the symbol 0 and One with 1, technically, the words become numbers by association. Numbers are the assumed physical representations of abstract numerals. By symbolic representation, both digits now exist outside the mind - the physical world, the world outside of ourselves, the inherent world that existed independently long before the mind. However, although 0 and 1 are created by definition, association, representation, and assumption, are these numbers now solid objects or physical materials? If I write 0 and 1 on a paper, are they materially or physically real now? Can we consider the written numbers as proof of their existence? Can the numbers be proven as solid evidence prescribed by the scientific method? How can we validate the paper's evidence to be true, false, valid, or real? Do they really exist, or are the numbers still abstract, imaginary, or imagined? If they are still abstract, then all ideas man has thought of or conceptualized are assumptions, guesswork, nothing but mere suppositions (The Lawsin Conjecture/Codexation Dilemma).
Semantic information conveyed by daily language has been researched for many years; yet we still need a practical formula to measure the information of a simple sentence or prediction, such as "There will be heavy rain tomorrow". For practical purposes, this paper introduces a new formula, the Semantic Information Formula (SIF), which is based on L. A. Zadeh's fuzzy set theory and P. Z. Wang's random set falling shadow theory. It carries forward C. E. Shannon's and K. Popper's thought. The fuzzy set's probability defined by Zadeh is treated as the logical probability sought by Popper, and the membership grade is treated as the truth-value of a proposition and also as the posterior logical probability. The classical relative information formula (Information = log(Posterior probability / Prior probability)) is revised into the SIF by replacing the posterior probability with the membership grade and the prior probability with the fuzzy set's probability. The SIF can be explained as "Information = Testing severity - Relative square deviation" and hence can be used as Popper's information criterion to test scientific theories or propositions. The information measure defined by the SIF also means the saved codeword length, as does the classical information measure. This paper introduces the set-Bayes formula, which establishes the relationship between statistical probability and logical probability, derives the Fuzzy Information Criterion (FIC) for the optimization of the semantic channel, and discusses applications of the SIF and FIC in areas such as linguistic communication, prediction, estimation, testing, GPS, translation, and fuzzy reasoning. In particular, through a detailed example of reasoning, it is proved that we can improve the semantic channel with proper fuzziness to increase average semantic information toward its upper limit: the Shannon mutual information.
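A compact way to display the revision just described, in notation introduced here only for illustration (the paper's own symbols may differ): writing T(θ_j) for the fuzzy set's probability (the prior logical probability) and T(θ_j | x) for the membership grade (the posterior logical probability), the classical formula and its SIF revision read

$$ I_{\text{classical}} \;=\; \log\frac{P(\theta_j \mid x)}{P(\theta_j)} \qquad\Longrightarrow\qquad I_{\text{SIF}}(x;\theta_j) \;=\; \log\frac{T(\theta_j \mid x)}{T(\theta_j)} . $$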
Humans are cognitive entities. Our ongoing interactions with the environment are threaded with the creation and use of meaningful information. Animal life is also populated with meaningful information related to survival constraints. Information managed by artificial agents can also be considered as having meanings, as derived from the designer. Such a perspective brings us to propose an evolutionary approach to cognition based on meaningful information management. We use a systemic tool, the Meaning Generator System (MGS), and apply it consecutively to animals, humans and artificial agents [1, 2]. The MGS receives information from its environment and compares it with its constraint. The generated meaning is the connection existing between the received information and the constraint. It triggers an action aimed at satisfying the constraint. The action modifies the environment and the generated meaning. Meaning generation links agents to their environments. The MGS is a system: a set of elements linked by a set of relations. Any system submitted to a constraint and capable of receiving information can lead to an MGS. Animals, humans and robots are agents containing MGSs dealing with different constraints. Similar MGSs carrying different constraints will generate different meanings. Cognition is system dependent. Contrary to approaches to meaning generation based on psychology or linguistics, the MGS approach is not based on the human mind. We want to avoid the circularity of taking the human mind as a starting point. Free will and self-consciousness participate in the management of human meanings. They do not exist for animals or robots. Staying alive is a constraint that we share with animals. Robots ignore that constraint. We first use the MGS for animals with "stay alive" and "group life" constraints. The analysis of meaning and cognition in animals is, however, limited by our incomplete understanding of the nature of life (the question of final causes). Extending the analysis of meaning generation and cognition to humans is complex and has some real limitations, as the nature of the human mind is a mystery for today's science and philosophy. The nature of our feelings, free will and self-consciousness is unknown. Approaches to identifying human constraints are however possible, and the MGS can highlight some openings [3, 4]. Modeling meaning management in artificial agents is rather straightforward with the MGS. We, the designers, know the agents and the constraints. The derived nature of constraints, meaning and cognition is however to be highlighted. We define a meaningful representation of an item for an agent as the network of meanings relative to the item for the agent, together with the action scenarios involving the item. Such meaningful representations embed the agents in their environments and are far from GOFAI-type representations. Cognition, meanings and representations exist by and for the agents. We finish by summarizing the points presented here and highlighting possible continuations. [1] "Information and Meaning" [2] "Introduction to a systemic theory of meaning" [3] "Computation on Information, Meaning and Representations. An Evolutionary Approach" [4] "Proposal for a shared evolutionary nature of language and consciousness".
Meanings are present everywhere in our environment and within ourselves. But these meanings do not exist by themselves. They are associated with information and have to be created, to be generated, by agents. The Meaning Generator System (MGS) has been developed with a systems approach to model meaning generation in agents from an evolutionary perspective. The agents can be natural or artificial. The MGS generates meaningful information (a meaning) when it receives information that has a connection with an internal constraint to which the agent is submitted. The generated meaning is to be used by the agent to implement actions aimed at satisfying the constraint. We propose here to highlight some characteristics of the MGS that could be related to items of the philosophy of information.
Information and meaning exist around us and within ourselves, and the same information can correspond to different meanings. This is true for humans and animals, and is becoming true for robots. We propose here an overview of this subject using a systemic tool related to meaning generation that has already been published (C. Menant, Entropy 2003). The Meaning Generator System (MGS) is a system submitted to a constraint that generates meaningful information when it receives incident information that has a relation with the constraint. The content of the meaningful information is made explicit, and its function is to trigger an action that will be used to satisfy the constraint of the system. The MGS has been introduced in the case of basic life submitted to a "stay alive" constraint. We propose here to see how the usage of the MGS can be extended to more complex living systems, to humans and to robots, by introducing new types of constraints and integrating the MGS into higher-level systems. The application of the MGS to humans is partly based on a scenario relative to the evolution of body self-awareness toward self-consciousness that has already been presented (C. Menant, Biosemiotics 2003, and TSC 2004). The application of the MGS to robots is based on the definition of the MGS applied to robot functionality, taking into account the origins of the constraints. We conclude with a summary of this overview and with themes that can be linked to this systemic approach to meaning generation.
Information and meaning are present everywhere around us and within ourselves. Specific fields of study have addressed the link between information and meaning: semiotics, phenomenology, analytic philosophy, and psychology. Yet no general coverage is available for the notion of meaning. We propose to fill this gap with a systemic approach to meaning generation.
N. Wiener's negative definition of information is well known: it states what information is not. According to this definition, it is neither matter nor energy. But what is it? It is shown how one can follow the lead of dialectical logic as expounded by G.W.F. Hegel in his main work -- "The Science of Logic" -- to answer this and some related questions.
This is a brief letter outlining speculative ideas for semantic web reasoning about information assurance. Much work has been done on the development of semantic web applications for reasoning about information assurance. A significant portion of this work is focused upon semantic web ontologies and reasoning about security policies and the underlying implementation of those policies. While numerous semantic web-based security policy ontologies and reasoners exist, both academically and commercially, I will briefly focus on ideas related to solutions to the problem of managing semantic web rules using algorithmic information theory.
Purpose: Self-documentation is an increasingly common phenomenon, but it is not yet well understood. This paper provides a philosophical framework for analyzing examples of self-documentation on the dimensions of ontology, epistemology and ethics. Design/methodology/approach: The framework addresses these three major areas of philosophic thought by operationalizing insights from philosophy, chiefly the work of Martin Heidegger. Heidegger's concepts of authenticity and fallenness inform the poles of each dimension of the framework. Findings: Ontologically, self-documentation may manifest as document (authentic) or data (fallen); epistemologically, as understanding (authentic) or idle curiosity (fallen); and ethically, as self-care (authentic) or diversion (fallen). These dimensions are presented separately but are understood to be intermingled. Originality/value: This unified framework offers a lens for examining and comparing cases of self-documentation and self-documents. No such framework has previously been articulated, but given the ubiquity and growing importance of self-documentation, it is needed.
The new logic of partitions is dual to the usual Boolean logic of subsets (usually presented only in the special case of the logic of propositions) in the sense that partitions and subsets are category-theoretic duals. The new information measure of logical entropy is the normalized quantitative version of partitions. The new approach to interpreting quantum mechanics (QM) shows that the mathematics (not the physics) of QM is the linearized Hilbert space version of the mathematics of partitions. Or, putting it the other way around, the math of partitions is a skeletal version of the math of QM. The key concepts throughout this progression from logic, to logical information, to quantum theory are distinctions versus indistinctions, definiteness versus indefiniteness, or distinguishability versus indistinguishability. The distinctions of a partition are the ordered pairs of elements from the underlying set that are in different blocks of the partition, and logical entropy is defined (initially) as the normalized number of distinctions. The cognate notions of definiteness and distinguishability run throughout the math of QM, e.g., in the key non-classical notion of superposition (= ontic indefiniteness) and in the Feynman rules for adding amplitudes (indistinguishable alternatives) versus adding probabilities (distinguishable alternatives).
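As a concrete illustration of the definition just given, here is a minimal sketch (ours, not the paper's) that computes the logical entropy of a partition as the normalized number of distinctions, i.e., the fraction of ordered pairs of underlying-set elements lying in different blocks; the function name and example partition are chosen for illustration only.

```python
def logical_entropy(partition):
    """Logical entropy of a partition given as a list of blocks.

    Counts the distinctions (ordered pairs of elements from the underlying
    set that lie in different blocks) and normalizes by n^2, which equals
    1 minus the sum of squared block probabilities.
    """
    n = sum(len(block) for block in partition)
    same_block_pairs = sum(len(block) ** 2 for block in partition)
    distinctions = n ** 2 - same_block_pairs
    return distinctions / n ** 2

# Partition of {1, 2, 3, 4} into two blocks: 8 of the 16 ordered pairs are distinctions.
print(logical_entropy([[1, 2], [3, 4]]))  # 0.5
```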
This book concerns the nature and the norms of inquiry. It tackles not only philosophical issues regarding what inquiry is, but also issues regarding how it should and should not be executed. Roughly put, inquiry is the activity of searching for the true answers to questions of interest to us. But what is the difference between empirical and armchair inquiry? And what are the right and the wrong ways to inquire? Under what conditions should one start inquiring? Which questions are such that one should not inquire into them? The book offers answers to these questions. It argues that competent armchair inquiry only makes explicit what was already implicit: the inquirer already had the answer to her question before inquiring into it, though this was not explicit to her. It also argues that we should avoid inquiring into questions whose answers are unknowable to us, in the instrumental sense of 'should', and that different modes of inquiry are called for depending on which type of information is available to the subject. These answers are rigorously argued for, and they stem from a unified framework for modeling the activity of inquiry.
A novel account of semantic information is proposed. The gist is that structural correspondence, analysed in terms of similarity, underlies an important kind of semantic information. In contrast to extant accounts of semantic information, it does not rely on correlation, covariation, causation, natural laws, or logical inference. Instead, it relies on structural similarity, defined in terms of correspondence between classifications of tokens into types. This account elucidates many existing uses of the notion of information, for example, in the context of scientific models and structural representations in cognitive science. It is poised to open a new research program concerned with various kinds of semantic information, its functions, and its measurement.
The explanation for everything in Nature, and for everything in human history, future and past, is the conservation of a circle, proven by the circular-linear relationship between the solstice and the equinox.
Conservation of the Circle is the only dynamic in Nature and, therefore, the simple solution to the complex problem called 'reality' (also known as 'identity'): financial, technological, and, therefore, psychological.
The research observed, in parallel and comparatively, a surveillance state's use of communication and cyber networks with satellite applications for power-political and realpolitik purposes, in contrast to cybernetics driven by outer space security and legitimate scientific purposes. The research adopted a psychoanalytic and psychosocial method of observation for the organizational behaviors of the surveillance state, and a theoretical physics, astrochemical, and cosmological feedback method for the contrast group of cybernetics. Military sociology and multilateral movements were adopted in the diagnostic studies and research on cybersecurity, and cross-channeling in communications was detected during the research. The paper addresses several key technical points of security and privacy breach, from personal devices to ontological networks and satellite applications, notably telecommunication service providers and carriers with differentiated spectrum. The paper discusses key moral and ethical risks posed by maladaptations in commercial devices that can corrupt democracy in subtle ways but on a mass scale. The research adopted an analytical linguistics approach, with linguistic history, to unjailing from the artificial-intelligence-empowered pancomputationalism of the heterogeneous dictatorial semantic network, and the astronomical and cosmological research in information theory implies that noncomputable processes are the only defense strategy against the new technology-driven pancomputationalism developments.
The article describes the two-year experiment following the finalization of the author's dissertation. The thesis of the dissertation was hidden in the last chapter with analytical linguistics. This was done because of the fascist development of the Chinese Communist regime with neo-Nazi characteristics. Since numerous prior warnings of the political downshifts and coup d'état in China were willfully ignored by the university, the linguistic innovations in the dissertation found a balance between multilateralism and outer space (security). The experiments were conducted with a combination of physical unit analysis and thermonuclear dynamics analysis. The experiment process is described in terms of gravitation as gravitational singularity and Bekenstein-Penrose singularity. The detailed research process is elaborated in the article concerning the sociopolitical interactions the research involved.
Zero and one is one and two is two and three. Explaining mind, math, and music. The genesis of 'information.' Tokenization in general. Tokenization as the basis for mind, math, and music (information in general).
In this comparative historical analysis, we will analyze the intellectual tendency that emerged between 1946 and 1956 to take advantage of the popularity of communication theory to develop a kind of informational epistemology of statistical mechanics. We will argue that this tendency results from a historical confluence in the early 1950s of certain theoretical claims of the so-called English School of Information Theory, championed by authors such as Gabor (1956) or MacKay (1969), and from the attempt to extend the profound success of Shannon's ([1948] 1993) technical theory of sign transmission to the field of statistical thermal physics. As a paradigmatic example of this tendency, we will evaluate the intellectual work of Léon Brillouin (1956), who, in the mid-fifties, developed an information-theoretical approach to statistical mechanical physics based on a concept of information linked to the knowledge of the observer.
Although the everyday notion of information has clear semantic properties, the all-pervasive technical concept of Shannon information is usually considered a non-semantic concept. In this paper I show how this concept was implicitly 'semantized' in the early 1950s by many authors, such as Rothstein or Brillouin, in order to explain the knowledge dynamics underlying certain scientific practices such as measurement. On the other hand, I argue that the main attempts in the literature to develop a quantitative measure of semantic information to clarify science and scientific measurements, such as those of Carnap and Bar-Hillel, or Dretske, will not successfully achieve this philosophical aim, for several reasons. Finally, I defend the use of a qualitative notion of semantic information within the information-theoretical framework of MacKay to assess the informational dynamics underlying scientific practices, particularly measurements in statistical mechanics.
Biological and cognitive sciences rely heavily on the idea of information transmitted between natural events or processes. This paper critically assesses some current philosophical views of natural information and defends a view of natural information as Nomic and Factive. Dretske offered a Factive view of information, and recent work on the topic has tended to reject this aspect of his view in favor of a non-Factive, probabilistic approach. This paper argues that the reasoning behind this move to non-Factivity is flawed and mixes up different problems with Dretske's original view. I show that one of these problems, strictness, has to do with Nomicity rather than Factivity. The other problem, reference class ambiguity, is not solvable just in terms of a theory of natural information. On the Dretske-inspired view I defend, natural information is Factive and Nomic but is insufficient to determine the cognitive or biological content of a natural process. In sum, I present an examination of what natural information is and what role it can play in our understanding of living and thinking things.
This paper analyzes the relation between mind and body from the informational perspective, an ancient philosophical issue defined as a problem that has still not received an adequate solution. By introducing and using the concept of information, it is shown that this concept includes two facets, one referring to common communication and another referring to hidden, structuring, matter-related information, effectively acting in the human body and in living systems, which determines the dynamic interchange of information between specific structures of the organism by electric/electronic/chemical agents and genetic/epigenetic processes. It is shown that the maintenance of the body, which permanently and obligatorily depends on external material resources (food, air, water) needed to provide both the basic structuring/restructuring material and energy, determines the necessary existence of an info-managing system administrating the internal mechano-chemical/physical processes. As a natural consequence, such a system should organize and assure its own survival through an effective informational operability to detect external food resources, to select the appropriate information of interest, and to decide as a function of circumstances. One important component of such an informational system is memory, allowing the system to draw on reference informational data for analysis and comparison and on the binary selection between good and bad possible decisions. The memory therefore receives and stores signals from external reality and from the body itself, referring to emotional reactions, digestion status, creation, and inherited predilections, within specific info-neural communication circuits between the brain and the body's execution/sensitive organs, the human body appearing as an integrated info-matter self-managed dynamic system. The specific body components memorize information with different degrees of info-integration: short/long-term integration, emotive/action reaction, info-abilities, culminating with integration into the chromosomal structures by epigenetic processes. The newly acquired information is transgenerationally transmissible and is manifested as new traits, showing the adaptation capability of humans and the close relation between mind and body. Analyzing the results of such a mind-body informational model in comparison with the earlier archaic, Greek, and Occidental philosophies, which represent only partial aspects of this relation, it is shown that this informational model, elaborated in terms of information on the basis of scientific reasons and arguments, constitutes a general, realistic, and coherent model of the mind-body relation, able to integrate and/or explain most of the others.
This article provides a comprehensive conceptual analysis of information. It begins with a folk notion that information is a tripartite phenomenon: information is something carried by signals about something for some use. This suggests that information has three main aspects: structural, referential, and normative. I analyze the individually necessary and jointly sufficient conditions for defining these aspects of information and consider formal theories relating to each aspect as well. The analysis reveals that structural, referential, and normative aspects of information are hierarchically nested and that the normative depends on the referential, which in turn depends on the structural.
In this commentary, I propose a framework for thinking about data quality in the context of scientific research. I start by analyzing conceptualizations of quality as a property of information, evidence and data and reviewing research in the philosophy of information, the philosophy of science and the philosophy of biomedicine. I identify a push for purpose dependency as one of the main results of this review. On this basis, I present a contextual approach to data quality in scientific research, whereby the quality of a dataset is dependent on the context of use of the dataset as much as the dataset itself. I exemplify the approach by discussing current critiques and debates of scientific quality, thus showcasing how data quality can be approached contextually.
Information is a central notion for the cognitive sciences and neurosciences, but there is no agreement on what it means for a cognitive system to acquire information about its surroundings. In this paper, we examine three influential views on information: the one at play in ecological psychology, which is sometimes called information for action; the notion of information as covariance as developed by some enactivists; and the idea of information as minimization of uncertainty as presented by Shannon. Our main thesis is that information for action can be construed as covariant information, and that learning to perceive covariant information is a matter of minimizing uncertainty through skilled performance. We argue that the agent's cognitive system conveys information for acting in an environment by minimizing uncertainty about how to achieve her intended goals in that environment. We conclude by reviewing empirical findings that support our view and by showing how direct learning, seen as an instance of ecological rationality at work, is how mere possibilities for action are turned into embodied know-how. Finally, we indicate the affinity between direct learning and sense-making activity.
In his recent book, Daniel Dennett defends a novel account of semantic information in terms of design worth getting (Dennett, 2017). While this is an interesting proposal in itself, my purpose in this commentary is to challenge several of Dennett's claims. First, he argues that semantic information can be transferred without encoding and storing it. Second, this lack of encoding is what makes semantic information unmeasurable. However, the argument for both these claims, presented by Dennett as an intuition pump, is invalid.
A principle according to which any scientific theory can be mathematized is investigated. That theory is presupposed to be a consistent text, which can be exhaustively and constructively represented by a certain mathematical structure. As thus used, the term "theory" includes all hypotheses, whether as yet unconfirmed or already rejected. The investigation of the sketch of a possible proof of the principle demonstrates that it should rather be accepted as a metamathematical axiom about the relation of mathematics and reality. Its investigation needs philosophical means. Husserl's phenomenology is what is used, and then the conception of "bracketing reality" is modelled to generalize Peano arithmetic in its relation to set theory in the foundations of mathematics. The obtained model is equivalent to the generalization of Peano arithmetic by means of replacing the axiom of induction with that of transfinite induction. A comparison with Mach's doctrine is used to reveal the fundamental and philosophical reductionism of Husserl's phenomenology, leading to a kind of Pythagoreanism in the final analysis. Accepting or rejecting the principle, two kinds of mathematics appear, differing from each other by their relation to reality. Accepting the principle, mathematics has to include reality within itself in a kind of Pythagoreanism. These two kinds are called in the paper, correspondingly, Hilbert mathematics and Gödel mathematics. The sketch of the proof of the principle demonstrates that the generalization of Peano arithmetic as above can be interpreted as a model of Hilbert mathematics in Gödel mathematics, therefore showing that the former is not less consistent than the latter, and that the principle is an independent axiom. An information interpretation of Hilbert mathematics is involved; it is a kind of ontology of information. Thus the problem of which of the two mathematics is more relevant to our being is discussed. An information interpretation of the Schrödinger equation is involved to illustrate the above problem.
What is the relationship between information and representation? Dating back at least to Dretske (1981), an influential answer has been that information is a rung on a ladder that gets one to representation. Representation is information, or representation is information plus some other ingredient. In this paper, I argue that this approach oversimplifies the relationship between information and representation. If one takes current probabilistic models of cognition seriously, information is connected to representation in a new way. It enters as a property of the represented content as well as a property of the vehicles that carry that content. This offers a new, conceptually and logically distinct way in which information and representation are intertwined in cognition.
The central question in the biological sciences for the past 100 years has concerned an understanding of how living systems differ from other general physical phenomena and what makes these systems unique. With new developments in the fields of nonequilibrium thermodynamics, systems theory, chaos, and information theory over the past few decades, there has been growing interest in finally answering the question first posed by Erwin Schrödinger in the 1940s concerning the true scientific nature of living systems. Similarly, there is also increasing interest within the biological community in a more holistic and non-reductionist methodology. The approach followed in this book builds on a foundation of information theory and semiotics while integrating basic thermodynamic considerations and systems theory to form a singular unifying concept that is proposed to be the essential process of living systems. However, the premise presented is much more than simply the exposition of a new hypothesis. This book describes the logical progression of thought incorporating a diverse array of established scientific ideas that were used in the conceptualization of a dynamic mathematical framework that can be employed as a novel analytic means for the study of living systems and their fundamental processes.
The concept of information has well-known difficulties. Among the many issues that have been discussed is the alethic nature of a semantic conception of information. Floridi (2004, pp. 197–222; Philos Phenomenol Res 70:351–370, 2005; EUJAP 3:31–41, 2007; The Philosophy of Information, Oxford University Press, Oxford, 2011) argued that semantic information must be truthful. In this article, arguments will be presented in favor of an alethically neutral conception of semantic information, and it will be shown that such a conception can withstand Floridi's criticism. In particular, it is argued that an alethically neutral conception of semantic information can manage the so-called Bar-Hillel-Carnap paradox, according to which contradictions have maximum informational content. This issue, as well as some of Floridi's other arguments, is resolved by disentangling the property of being information from the property of being informative. The essay's final conclusion is that although semantic information is alethically neutral, a veridical conception of semantic information can, and should, be retained as a subconcept of semantic information, as it is essential for the analysis of informativity, which, unlike the property of being information, depends on truth.
We drive our lives permanently by YES/NO decisions, and even if we most often no longer distinguish the elementary intermediary steps of such decisions, they form stereotyped chains that, once triggered, run unconsciously, facilitating our daily activities. We actually lead our lives by conscious decisions, each such decision establishing our future trajectory. The YES/NO dipole is the elemental evaluation and decision unit in informational transmission/reception equipment and lines and in computers. Based on a binary probabilistic system, it is defined as a unit of information (bit). We operate therefore as an informational system, and we actually live in a bipolar universe, which is fundamentally informational. Indeed, the laws of nature and its equilibrium or steady-state conditions are based on bipolar units with opposite characteristics, such as action/reaction, attraction/rejection, gravity/anti-gravity, matter/antimatter, and entropy/anti-entropy, to enumerate just a few examples. As part of this bipolar universe, we are also bipolar entities connected to information and matter. Starting from the informational features of the human being, seven informational components are identified, forming the informational system of the human body, distinguished by their different functions and reflected at the conscious level through the centers Iknow (the memory, including the whole life experience), Iwant (decision center), Ilove (emotions), Iam (body status), Icreate (informational genetic transmitter), Icreated (genetic generator inherited from parents), and Ibelieve, which is the gateway to the antientropic component, favorable to maintaining the structure and functioning of life. Taking into account the characteristics of these centers, the life cycle is discussed and suitable conclusions are deduced concerning an optimal, active lifestyle that would contribute to a successful life, aging, and destiny.
The "Digital Era Framework" is a reference framework for the digital information age. Targeted at science and practice alike, the concept offers a comprehensive approach for the analysis and assessment of phenomena of digital change, digitalization and digital transformation. The "Digital Era Framework" is based on an integrated approach to examine digital change, in so far as the original state, the occuring change and the final state can be represented in a uniform scheme. The framework emphasises information as the central (...) point of consideration. Therefore it is based on two factors as main moments of order that define the structure of the scheme: the form of manifestation of information on the one hand and the type of application of information on the other. This creates a two-dimensional frame of reference that allows for a differentiated classifcation of phenomena, objects and events. The model is particularly suitable for studying entire systems and processes. By positioning their individual elements within the two-dimensional frame, it can be shown how the structure and interaction pattern of theses systemes and processes are shaped and how this configuration may change as a result of digitalization. Offering a theoretically well-founded systematization, the framework provides a solid basis for analyses, explanations and forecasts. At the same time, it serves as a basis for the development and evaluation of strategies. The presentation of the "Digital Era Framework" is also aimed at contributing to the development of an adequate technical terminology by proposing a well-founded nomenclature based on the reference framework. (shrink)
The "Digital Era Framework" is a frame of reference for the digital information age. Aimed at science and practice alike, the concept offers a comprehensive approach to classifying and analyzing phenomena of digital change, digitalization and digital transformation. The "Digital Era Framework" rests on an integrated approach to examining change, in so far as the original state, the change and the final state can be represented in a uniform scheme. The frame of reference places information at the center of consideration and is based on two ordering moments: the form of manifestation of information on the one hand and the type of application of information on the other. This spans a two-dimensional frame of reference that permits a differentiated classification of phenomena, objects and events. Using this ordering model, systems and processes in particular can be examined, showing how their structure, interaction patterns and configuration change through digitalization. Through its grounded systematization, the frame of reference provides a basis for analyses, explanations and forecasts. At the same time, it serves as a basis for the development and evaluation of strategies. The presentation of the "Digital Era Framework" is also aimed at contributing to the development of an adequate technical terminology by proposing a well-founded nomenclature built on the frame of reference.
An important problem in machine learning is that when the number of labels n > 2, it is very difficult to construct and optimize a group of learning functions, and we wish optimized learning functions to remain useful when the prior distribution P(x) (where x is an instance) changes. To resolve this problem, the semantic information G theory, Logical Bayesian Inference (LBI), and a group of Channel Matching (CM) algorithms together form a systematic solution. A semantic channel in the G theory consists of a group of truth functions or membership functions. In comparison with the likelihood functions, Bayesian posteriors, and logistic functions used by popular methods, membership functions can be more conveniently used as learning functions without the above problem. In Logical Bayesian Inference (LBI), every label's learning is independent. For multilabel learning, we can directly obtain a group of optimized membership functions from a big enough sample with labels, without preparing different samples for different labels. A group of Channel Matching (CM) algorithms is developed for machine learning. For the Maximum Mutual Information (MMI) classification of three classes with Gaussian distributions on a two-dimensional feature space, 2-3 iterations can make the mutual information between the three classes and three labels surpass 99% of the MMI for most initial partitions. For mixture models, the Expectation-Maximization (EM) algorithm is improved and becomes the CM-EM algorithm, which can outperform the EM algorithm when mixture ratios are imbalanced or local convergence exists. The CM iteration algorithm needs to be combined with neural networks for MMI classifications on high-dimensional feature spaces. LBI needs further studies for the unification of statistics and logic.
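For readers unfamiliar with the objective mentioned above, the following minimal sketch (ours, not the paper's code) computes the Shannon mutual information between classes and labels from a joint distribution; in MMI classification this is the quantity that the classifier's partition is tuned to maximize, while the CM iteration itself is not reproduced here. The function name and the example joint distribution are illustrative assumptions.

```python
import numpy as np

def mutual_information_bits(joint):
    """Mutual information I(C; Y) in bits for a joint distribution P(c, y),
    given as a 2-D array of nonnegative weights (normalized internally)."""
    p = np.asarray(joint, dtype=float)
    p = p / p.sum()
    pc = p.sum(axis=1, keepdims=True)  # marginal P(c)
    py = p.sum(axis=0, keepdims=True)  # marginal P(y)
    mask = p > 0                        # skip zero cells to avoid log(0)
    return float(np.sum(p[mask] * np.log2(p[mask] / (pc * py)[mask])))

# Example: three classes vs. three labels, nearly diagonal (labels track classes well).
joint = [[0.30, 0.02, 0.01],
         [0.01, 0.32, 0.02],
         [0.01, 0.01, 0.30]]
print(mutual_information_bits(joint))
```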
Information is widely perceived as essential to the study of communication and representation; still, theorists working on these topics often take themselves not to be centrally concerned with "Shannon information", as it is often put, but with some other, sometimes called "semantic" or "nonnatural", kind of information. This perception is wrong. Shannon's theory of information is the only one we need. I intend to make good on this last assertion by canvassing a fully (Shannon) informational answer to the metasemantic question of what makes something a representation, for a certain important family of cases. This answer and the accompanying theory, which represents a significant departure from the broadly Dretskean philosophical mainstream, will show how a number of threads in the literature on naturalistic metasemantics, aimed at describing the purportedly non-informational ingredients in representation, actually belong in the same coherent, purely information-theoretic picture.
The evaluation of universities from different perspectives is important for their scientific development. Analyzing the scientific papers of a university with a bibliometric approach is one main evaluative approach. The aim of this study was to conduct a bibliometric analysis and visualization of papers published by Hamadan University of Medical Sciences (HUMS), Iran, during 1992-2018. This study used bibliometric and visualization techniques. The Scopus database was used for data collection. 3753 papers were retrieved by applying an Affiliation Search in the Scopus advanced search section. Excel and VOSviewer software packages were used for data analysis and bibliometric indicator extraction. An increasing trend was seen in the number of HUMS's published papers and received citations. The highest rate of collaboration at the national level was with Tehran University of Medical Sciences. Internationally, HUMS's researchers had the highest collaboration with authors from the United States, the United Kingdom and Switzerland, respectively. All highly cited papers were published in high-level Q1 journals. Term clustering demonstrated four main clusters: epidemiological studies, laboratory studies, pharmacological studies, and microbiological studies. The results of this study can be beneficial to the policy-makers of this university. In addition, researchers and bibliometricians can use this study as a pattern for studying and visualizing the bibliometric indicators of other universities and research institutions.
Electronic text can be defined on two different, though interconnected, levels. On the one hand, electronic text can be defined by taking the notion of "text" or "printed text" as the point of departure. On the other hand, electronic text can be defined by taking the digital format as the point of departure, where everything is represented in the binary alphabet. While the notion of text in most cases lends itself to being independent of medium and embodiment, it is also often tacitly assumed that it is in fact modeled on the print medium, instead of, for instance, on handwritten text or speech. In the late 20th century, the notion of "text" was subjected to increasing criticism, as can be seen in the question that has been raised in literary text theory about whether "there is a text in this class." At the same time, the notion was expanded by including extralinguistic sign modalities (images, videos). A basic question, therefore, is whether electronic text should be included in the enlarged notion that text is a new digital sign modality added to the repertoire of modalities or whether it should be included as a sign modality that is both an independent modality and a container that can hold other modalities. In the first case, the notion of electronic text would be paradigmatically formed around the e-book, which was conceived as a digital copy of a printed book but is now a deliberately closed work. Even closed works in digital form will need some sort of interface and hypertextual navigation that together constitute a particular kind of paratext needed for accessing any sort of digital material. In the second case, the electronic text is defined by the representation of content and (some parts of the) processing rules as binary sequences manifested in the binary alphabet. This wider notion would include, for instance, all sorts of scanning results, whether of the outer cosmos or the interior of our bodies, and of digital traces of other processes in between (machine readings included). Since other alphabets, such as the genetic alphabet, and all sorts of images may also be represented in the binary alphabet, such materials will also belong to the textual universe within this definition. A more intriguing implication is that born-digital materials may also include scripts and interactive features as intrinsic parts of the text. The two notions define the text on different levels: one is centered on the Latin alphabet, the other on the binary alphabet, and both definitions include hypertext, interactivity, and multimodality as constituent parameters. In the first case, hypertext is included as a navigational, paratextual device; whereas in the second case, hypertext is also incorporated in the narrative within an otherwise closed work or as a constituent element of the textual universe of the web, where it serves the ongoing production of (possibly scripted) connections and disconnections between blocks of textual content. Since the early decades of the 21st century still represent only the very early stages of the globally distributed universe of web texts, this is also a history of the gradual unfolding of the dimensions of these three constituencies: hypertext, interactivity, and multimodality.
The result is a still-expanding repertoire of genres, including some that are emerging via path dependency, some via remediation, and some as new genres that are unique to networked digital media, including "social media texts" and a growing variety of narrative and discursive multiple-source systems.
Short descriptor (Électre, 2019): A study of one of the main axes of reflection of the French philosopher of science and nature Raymond Ruyer (1902-1987). Relying on the discoveries about embryogenesis, and also with the use of information theory, Ruyer proposed an interpretation of the main unifying concepts of mechanistic cybernetics.
This book offers an in-depth study of one of the main axes in the reflection of the French philosopher of science and nature Raymond Ruyer (1902-1987): cybernetics. In a text summarising his own development, Ruyer stated about the philosophical critique of information theory that it "is what can give the most long-lasting hope of getting to something like a new theology." After propounding a structuralist philosophy, and distinguishing between form and structure, to then modify it in the light of discoveries in embryogenesis, Ruyer offered a re-evaluation of the unifying concepts of mechanistic cybernetics. Thinking about the latter and about information theory, in particular about the origin of information, he defended the idea that this cybernetics was in reality an inverted reading of the real one, which would allow us to read in experience itself traces of the morphogenetic power, apprehended as the axiological field. On several transversal points, the development of forms in biology and genetics, the stochastic genesis of order, the identification of information with either psychological and mental or physical reality, behaviour, and the access to meaning, this work expounds the main ideas of Ruyer while situating them in the context of the breadth of others' contributions. It ends by determining what is theological and what is axiological in this project for a metaphysics which, although unfinished, is nevertheless the most impressive effort made in France in the last century.
– Available on i6doc dot com (ISBN 978-2-930517-56-8; pdf 978-2-930517-57-5).
This paper offers a history of the concept of social engineering in cybersecurity and argues that while the term began its life in the study of politics, and only later gained usage within the domain of cybersecurity, these are applications of the same fundamental ideas: epistemic asymmetry, technocratic dominance, and teleological replacement. The paper further argues that the term's usages in both areas remain conceptually and semantically interrelated. Moreover, ignorance of this interrelation continues to handicap our ability to identify and rebuff social engineering attacks in cyberspace. The paper's conceptual history begins in the nineteenth century in the writings of the economists John Gray and Thorstein Veblen. An analysis of scholarly articles shows the concept's proliferation throughout the early to mid-twentieth century within the social sciences and beyond. The paper then traces the concept's migration into cybersecurity through the 1960s-1980s, utilizing both scholarly publications and memoir accounts, including interviews with then-active participants in the hacker community. Finally, it reveals a conceptual array of contemporary connotations through an analysis of 134 definitions of the term found in academic articles written about cybersecurity from 1990 to 2017.
The objective of this thesis is to present a naturalised metaphysics of information, or to naturalise information, by way of deploying a scientific metaphysics according to which contingency is privileged and a priori conceptual analysis is excluded (or at least greatly diminished) in favour of contingent and defeasible metaphysics. The ontology of information is established according to the premises and mandate of the scientific metaphysics by inference to the best explanation, and in accordance with the idea that the primacy of physics constraint accommodates defeasibility of theorising in physics. This metametaphysical approach is used to establish a field ontology as a basis for an informational structural realism. This in turn, in combination with information theory and specifically mathematical and algorithmic theories of information, becomes the foundation of what will be called a source ontology, according to which the world is the totality of information sources. Information sources are to be understood as causally induced configurations of structure that are, or else reduce to and/or supervene upon, bounded (including distributed and non-contiguous) regions of the heterogeneous quantum field (all quantum fields combined) and fluctuating vacuum, all in accordance with the above-mentioned quantum field-ontic informational structural realism (FOSIR). Arguments are presented for realism, physicalism, and reductionism about information on the basis of the stated contingent scientific metaphysics. In terms of philosophical argumentation, realism about information is argued for primarily by way of an indispensability argument that defers to the practice of scientists and regards concepts of information as just as indispensable in their theories as contingent representations of structure. Physicalism and reductionism about information are adduced by way of the identity thesis that identifies the substance of the structure of ontic structural realism as identical to selections of structure existing in re to combined heterogeneous quantum fields, and to the total heterogeneous quantum field comprised of all such fields. Adjunctly, an informational statement of physicalism is arrived at, and a theory of semantic information is proposed, according to which information is intrinsically semantic and alethically neutral.
I will analyse Floridi's rejection of digital ontologies and his positive proposal for an informational structural realism (ISR). I intend to show that ISR is still fundamentally a digital ontology, albeit with some metaphysical commitments different from those that Floridi rejects. I will argue that even though Floridi deploys the method of levels of abstraction adapted from computer science, and has established a Kantian transcendentalist conception of both information and structure, ISR still reduces to a discretised binary, and therefore digital, ontology. The digital ontologies that Floridi rejects are John Wheeler's "It from Bit" conception and computational metaphysics. They are rejected predominantly on the basis that they rely upon a false dichotomy between digital discrete and continuous metaphysics. ISR involves a Kantian transcendentalist conception of de re relations that is intended to avoid this false dichotomy. However, I will argue that the binary, discrete, digital component of digital ontology is retained in ISR, and therefore ISR is still a digital ontology, since its conception of information reduces to binary discrete de re relations. As such, ISR comes down on one side of the rejected ontic dichotomy of digital metaphysics, and so an informational metaphysics that is not a digital ontology is still a promissory note in the philosophy of information.
This doctoral thesis consists of five research papers that address four tangential topics, all of which are relevant to the challenges we are facing in our socio-technical society: information, security, privacy, and anonymity. All topics are approached with similar methods, i.e. with a concern for conceptual and definitional issues. In Paper I, concerning the concept of information and a semantic conception thereof, it is argued that the veridicality thesis is false. In Paper II, concerning information security, it is argued that the current leading definitions suffer from counter-examples and lack an appropriate conceptual sense. Based on this criticism, a new kind of definition is proposed and defended. In Paper III, concerning control definitions of privacy, it is argued that any sensible control definition of privacy must properly recognize the context as part of the defining criteria. In Paper IV, concerning the concept of privacy, it is argued that privacy is a normative concept and that it is constituted by our social relations. Finally, in Paper V, concerning anonymity, it is argued that the threat from deanonymization technology goes beyond harm to anonymity. It is argued that a person who is never deanonymized can still be harmed, and that what is at stake is the ability to be anonymous.
This paper develops and refines the suggestion that logical systems are conceptual artefacts that are the outcome of a design-process by exploring how a constructionist epistemology and meta-philosophy can be integrated within the philosophy of logic.
We are in the information age. Nowadays, information is a highly valuable commodity. Its mastery and handling have high economic, political, and social value. However, we still know little about it. What is information? How do we store it, retrieve it and manipulate it? Does everyone have, or should everyone have, equal access to information? What is the relationship between information and knowledge? How can both influence and be influenced by action? Can they be modeled? Can such models contribute to their understanding? Can information and knowledge consolidate or destroy friendships? What are the most efficient means to produce, store, organize and retrieve knowledge and information? Such questions permeate the content of this book, which brings together Brazilian and foreign authors and seeks to contribute to improving the understanding of each of the concepts involved and the relationships between them. This work is the result of EIICA IX, the International Meeting on Information, Knowledge and Action, held between December 2nd and 4th, 2015, at the Faculty of Philosophy and Sciences of UNESP/Marília.