Since the cognitive revolution, it’s become commonplace that cognition involves both computation and information processing. Is this one claim or two? Is computation the same as information processing? The two terms are often used interchangeably, but this usage masks important differences. In this paper, we distinguish information processing from computation and examine some of their mutual relations, shedding light on the role each can play in a theory of cognition. We recommend that theorists of cognition be explicit and careful in choosing notions of computation and information and connecting them together. Much confusion can be avoided by doing so. Keywords: computation, information processing, computationalism, computational theory of mind, cognitivism. (shrink)
Bynum (Putting information first: Luciano Floridi and the philosophy of information. NY: Wiley-Blackwell, 2010) identifies Floridi’s focus in the philosophy of information (PI) on entities both as data structures and as information objects. One suggestion for examining the association between the former and the latter stems from Floridi’s Herbert A. Simon Lecture in Computing and Philosophy given at Carnegie Mellon University in 2001, open problems in the PI: the transduction or transception, and how we gain knowledge about the world as a complex, living, information environment. This paper addresses PI across a model of interoperating levels: perception (P)—intuition (N)—computation (C)—information (I), as factored by cognitive continuity (1), temporality (2), and constitution (3). How might we begin to characterize our experience of an abstract information object across such a matrix? Chudnoff’s rationalist distinctions between perception and intuition serve as a first rung of the ladder. Turing’s brief references to the utility of intuition, in an allied, rationalist-Cartesian sense, provide the next step up to computation. Floridi provides the final link from computation to information. (shrink)
Computation and information processing are among the most fundamental notions in cognitive science. They are also among the most imprecisely discussed. Many cognitive scientists take it for granted that cognition involves computation, information processing, or both – although others disagree vehemently. Yet different cognitive scientists use ‘computation’ and ‘information processing’ to mean different things, sometimes without realizing that they do. In addition, computation and information processing are surrounded by several myths; first and foremost, that they are the same thing. In this paper, we address this unsatisfactory state of affairs by presenting a general and theory-neutral account of computation and information processing. We also apply our framework by analyzing the relations between computation and information processing on one hand and classicism and connectionism on the other. We defend the relevance to cognitive science of both computation, in a generic sense that we fully articulate for the first time, and information processing, in three important senses of the term. Our account advances some foundational debates in cognitive science by untangling some of their conceptual knots in a theory-neutral way. By leveling the playing field, we pave the way for the future resolution of the debates’ empirical aspects. (shrink)
It is common in cognitive science to equate computation (and in particular digital computation) with information processing. Yet, it is hard to find a comprehensive explicit account of concrete digital computation in information processing terms. An information processing account seems like a natural candidate to explain digital computation. But when ‘information’ comes under scrutiny, this account becomes a less obvious candidate. Four interpretations of information are examined here as the basis for an information processing account of digital computation, namely Shannon information, algorithmic information, factual information and instructional information. I argue that any plausible account of concrete computation has to be capable of explaining at least the three key algorithmic notions of input, output and procedures. Whilst algorithmic information fares better than Shannon information, the most plausible candidate for an information processing account is instructional information. (shrink)
The book presents investigations into the world of info-computational nature, in which information constitutes the structure, while computational process amounts to its change. Information and computation are inextricably bound: There is no computation without informational structure, and there is no information without computational process. Those two complementary ideas are used to build a conceptual net, which according to Novalis is a theoretical way of capturing reality. We apprehend the reality within a framework known as natural computationalism, the view that the whole universe can be understood as a computational system at many different levels - from the quantum-mechanical world to biological organisms, including intelligent minds and their societies. Questions about the nature of information and computation and their unified view are addressed, along with the application of the info-computational approach to knowledge generation. (shrink)
Intelligent design advocate William Dembski has introduced a measure of information called "complex specified information", or CSI. He claims that CSI is a reliable marker of design by intelligent agents. He puts forth a "Law of Conservation of Information" which states that chance and natural laws are incapable of generating CSI. In particular, CSI cannot be generated by evolutionary computation. Dembski asserts that CSI is present in intelligent causes and in the flagellum of Escherichia coli, and concludes that neither has a natural explanation. In this paper, we examine Dembski's claims, point out significant errors in his reasoning, and conclude that there is no reason to accept his assertions. (shrink)
The book focuses on relations between information and computation. Information is a basic structure of the world, while computation is a process of the dynamic change of information. In order for anything to exist for an individual, the individual must get information on it, either by means of perception or by re-organization of the existing information into new patterns and networks in the brain. With the advent of the World Wide Web and the prospect of the semantic web, the ways of information supply for individuals, networks of humans and machines and for humanity as a whole are becoming strategically important in a number of ways. Information becomes pivotal for communication, research, education systems, government, businesses and basic functioning of everyday life. At the same time, information may be understood only if we understand its dynamics - time changes of informational structure, that is, we should understand information processing and its primary form - computation. As there is no information without (physical) representation, the dynamics of information is implemented on different levels of granularity by different physical processes, including the level of computation performed by computing machines. There are many open problems of the nature of information and computation, as well as their relationships. How exactly is information dynamics implemented in computational systems, machines as well as living organisms? Are computers processing only data or information and knowledge as well? How does information processing relate to knowledge management and sciences, especially to the science of information itself? What do we know of computational processes in machines and living organisms, and how are these processes related? What can we learn from natural computational processes that can be useful for information systems and knowledge management? These and similar problems related to information and computation are treated in the book. (shrink)
Written by world-leading experts, this book draws together a number of important strands in contemporary approaches to the philosophical and scientific questions that emerge when dealing with the issues of computing, information, cognition and the conceptual issues that arise at their intersections. It discovers and develops the connections at the borders and in the interstices of disciplines and debates. This volume presents a range of essays that deal with the currently vigorous concerns of the philosophy of information, ontology creation and control, bioinformation and biosemiotics, computational and post-computation approaches to the philosophy of cognitive science, computational linguistics, ethics, and education. http://www.amazon.ca/Computation-Information-Cognition-Gordana-Dodig-Crnkovic/dp/1847180906. (shrink)
What is nontrivial digital computation? It is the processing of discrete data through discrete state transitions in accordance with finite instructional information. The motivation for our account is that many previous attempts to answer this question are inadequate, and also that this account accords with the common intuition that digital computation is a type of information processing. We use the notion of reachability in a graph to defend this characterization in memory-based systems and underscore the importance of instructional information for digital computation. We argue that our account evaluates positively against adequacy criteria for accounts of computation. (shrink)
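The ingredients the abstract names — discrete states, transitions driven by finite instructional information, and reachability in a graph — can be illustrated with a small sketch. This is not the paper's own formalism, only a toy under assumed names: the instruction table and state labels are invented for the example, and reachability is checked with an ordinary breadth-first search.

```python
from collections import deque

def reachable(transitions, start, goal):
    """Return True if `goal` can be reached from `start` via the
    discrete transitions (state, symbol) -> next_state."""
    seen = {start}
    queue = deque([start])
    while queue:
        state = queue.popleft()
        if state == goal:
            return True
        for (src, _symbol), nxt in transitions.items():
            if src == state and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# A toy instruction table: the "finite instructional information"
# that governs the discrete state transitions.
table = {
    ("q0", "0"): "q1",
    ("q0", "1"): "q0",
    ("q1", "0"): "q2",
    ("q1", "1"): "q0",
}

print(reachable(table, "q0", "q2"))  # True: q0 -> q1 -> q2
print(reachable(table, "q2", "q0"))  # False: q2 has no outgoing transitions
```

On this picture, asking whether a memory-based system can compute a given output is asking whether the corresponding state is reachable in the transition graph.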
Computers today are not merely calculation tools - they are directly (inter)acting in the physical world, which itself may be conceived of as the universal computer (Zuse, Fredkin, Wolfram, Chaitin, Lloyd). In expanding its domains from abstract logical symbol manipulation to physical embedded and networked devices, computing goes beyond the Church-Turing limit (Copeland, Siegelman, Burgin, Schachter). Computational processes are distributed, reactive, interactive, agent-based and concurrent. The main criterion of success of computation is not its termination, but the adequacy of its response, its speed, generality and flexibility; adaptability, and tolerance to noise, error, faults, and damage. Interactive computing is a generalization of Turing computing, and it calls for new conceptualizations (Goldin, Wegner). In the info-computationalist framework, with computation seen as information processing, natural computation appears as the most suitable paradigm of computation, and information semantics requires logical pluralism. (shrink)
It has been argued, partly from the lack of any widely accepted solution to the measurement problem, and partly from recent results from quantum information theory, that measurement in quantum theory is best treated as a black box. However, there is a crucial difference between ‘having no account of measurement' and ‘having no solution to the measurement problem'. We know a lot about measurements. Taking into account this knowledge sheds light on quantum theory as a theory of information and computation. In particular, the scheme of ‘one-way quantum computation' takes on a new character in light of the role that reference frames play in actually carrying out any one-way quantum computation. (shrink)
Causation can be understood as a computational process once we understand causation in informational terms. I argue that if we see processes as information channels, then causal processes are most readily interpreted as the transfer of information from one state to another. This directly implies that the later state is a computation from the earlier state, given causal laws, which can also be interpreted computationally. This approach unifies the ideas of causation and computation.
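The claim above — that if a causal process is an information channel, the later state is a computation from the earlier state given causal laws — can be made concrete with a deliberately simple sketch. The "law" here is an invented toy function, not anything from the paper: it stands in for a deterministic dynamical rule, and iterating it is the "computation" that carries information from earlier to later states.

```python
def causal_law(state):
    """A toy deterministic rule standing in for a causal law:
    constant-velocity motion in one dimension."""
    position, velocity = state
    return (position + velocity, velocity)

def evolve(state, steps):
    """Iterating the law 'computes' the later state from the earlier one,
    which is the informational reading of the causal process."""
    for _ in range(steps):
        state = causal_law(state)
    return state

print(evolve((0, 2), 3))  # (6, 2): three applications of the law
```

The point of the toy is only that nothing over and above the earlier state plus the law is needed to fix the later state - exactly the sense in which the causal process can be read as a computation.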
This paper presents several observations on the connections between information, physics, and computation. In particular, the computing power of quantum computers is examined. Quantum theory is characterized by superimposed states and nonlocal interactions. It is argued that recently studied quantum computers, which are based on local interactions, cannot simulate quantum physics.
Information processing theories in psychology give rise to executive theories of consciousness. Roughly speaking, these theories maintain that consciousness is a centralized processor that we use when processing novel or complex stimuli. The computational assumptions driving the executive theories are closely tied to the computer metaphor. However, those who take the metaphor seriously — as I believe psychologists who advocate the executive theories do — end up accepting too particular a notion of a computing device. In this essay, I examine the arguments from theoretical computational considerations that cognitive psychologists use to support their general approach in order to show that they make unwarranted assumptions about the processing attributes of consciousness. I then go on to examine the assumptions behind executive theories which grow out of the computer metaphor of cognitive psychology and conclude that we may not be the sort of computational machine cognitive psychology assumes and that cognitive psychology's approach in itself does not buy us anything in developing theories of consciousness. Hence, the state space in which we may locate consciousness is vast, even within an information processing framework. (shrink)
The increased interactivity and connectivity of computational devices, along with the spreading of computational tools and computational thinking across the fields, has changed our understanding of the nature of computing. In the course of this development, computing models have been extended from the initial abstract symbol-manipulating mechanisms of stand-alone, discrete sequential machines to the models of natural computing in the physical world, generally concurrent asynchronous processes capable of modelling living systems, their informational structures and dynamics on both symbolic and sub-symbolic information processing levels. The present account of models of computation highlights several topics of importance for the development of a new understanding of computing and its role: natural computation and the relationship between the model and physical implementation, interactivity as fundamental for computational modelling of concurrent information processing systems such as living organisms and their networks, and the new developments in logic needed to support this generalized framework. Computing understood as information processing is closely related to natural sciences; it helps us recognize connections between sciences, and provides a unified approach for modeling and simulating both living and non-living systems. (shrink)
This commentary on Fresco's article "Information processing as an account of concrete digital computation" illuminates the two intertwined roles that the definition of the term "information" plays in Fresco's analysis. It provides an analysis of the notion of actualizing control in information processing. The key point made is that not all control information in common computational devices can be processed.
We argue that the dynamical and computational hypotheses are compatible and in fact need each other: they are about different aspects of cognition. However, only computationalism is about the information-processing aspect. We then argue that any form of information processing relying on matching and comparing, as cognition does, must use discrete representations and computations defined over them.
Weak measurement devices resemble band pass filters: they strengthen average values in the state space or equivalently filter out some ‘frequencies’ from the conjugate Fourier transformed vector space. We thereby adjust a principle of classical communication theory for the use in quantum computation. We discuss some of the computational benefits and limitations of such an approach, including complexity analysis, some simple examples and a realistic not-so-weak approach.
Computing is changing the traditional field of Philosophy of Science in a very profound way. First, as a methodological tool, computing makes possible "experimental philosophy", which is able to provide practical tests for different philosophical ideas. At the same time, the ideal object of investigation of the Philosophy of Science is changing. For a long period of time the ideal science was Physics (e.g., Popper, Carnap, Kuhn, and Chalmers). Now the focus is shifting to the field of Computing/Informatics. There are many good reasons for this paradigm shift, one being the long-standing need for a new meeting between the sciences and humanities, for which the new discipline of Computing/Informatics offers innumerable possibilities. Contrary to Physics, Computing/Informatics is very much human-centered. It brings a potential for a new Renaissance, where Science and Humanities, Arts and Engineering can reach a new synthesis, so very much needed in our intellectually split culture. This paper investigates contemporary trends and the relation between the Philosophy of Science and the Philosophy of Computing and Information, which is equivalent to the present relation between Philosophy of Science and Philosophy of Physics. (shrink)
It has been argued that neural networks and other forms of analog computation may transcend the limits of Turing-machine computation; proofs have been offered on both sides, subject to differing assumptions. In this article I argue that the important comparisons between the two models of computation are not so much mathematical as epistemological. The Turing-machine model makes assumptions about information representation and processing that are badly matched to the realities of natural computation (information representation and processing in or inspired by natural systems). This points to the need for new models of computation addressing issues orthogonal to those that have occupied the traditional theory of computation. (shrink)
Cognitive science uses the notion of computational information processing to explain cognitive information processing. Some philosophers have argued that anything can be described as doing computational information processing; if so, it is a vacuous notion for explanatory purposes. An attempt is made to explicate the notions of cognitive information processing and computational information processing and to specify the relationship between them. It is demonstrated that the resulting notion of computational information processing can only be realized in a restrictive class of dynamical systems called physical notational systems (after Goodman's theory of notationality), and that the systems generally appealed to by cognitive science (physical symbol systems) are indeed such systems. Furthermore, it turns out that other alternative conceptions of computational information processing, Fodor's (1975) Language of Thought and Cummins' (1989) Interpretational Semantics, appeal to substantially the same restrictive class of systems. (shrink)
Simple hypotheses are intrinsically attractive, and, for this reason, need to be formulated with utmost precision if they are to be testable. Unfortunately, it is hard to see how Phillips & Singer's hypothesis might be unambiguously refuted. Despite this, the authors have provided much evidence consistent with the hypothesis, and have proposed a natural and powerful extension for information theoretic approaches to learning.
While situation theory and situation semantics (Barwise and Perry 1983) provide an appropriate framework for a realistic model-theoretic treatment of natural language, serious thinking on their `computational' aspects has only recently started (Black 1993, Nakashima et al. 1988). Existing proposals mainly offer a Prolog- or Lisp-like programming environment with varying degrees of divergence from the ontology of situation theory. In this paper, we introduce a computational medium (called BABY-SIT) based on situations (Tın and Akman 1994a, Tın and Akman 1994b). The primary motivation underlying BABY-SIT is to facilitate the development and testing of programs in domains ranging from linguistics to artificial intelligence in a unified framework built upon situation-theoretic constructs. (shrink)
This book serves as the main reference for an undergraduate course on Philosophy of Information. The book is written to be accessible to the typical undergraduate student of Philosophy and does not require propaedeutic courses in Logic, Epistemology or Ethics. Each chapter includes a rich collection of references for the student interested in furthering her understanding of the topics reviewed in the book. The book covers all the main topics of the Philosophy of Information and it should be considered an overview and not a comprehensive, in-depth analysis of a philosophical area. As a consequence, 'The Philosophy of Information: a Simple Introduction' does not contain research material as it is not aimed at graduate students or researchers. (shrink)
We establish a connection between measurement-based quantum computation and the field of mathematical logic. We show that the computational power of an important class of quantum states called graph states, representing resources for measurement-based quantum computation, is reflected in the expressive power of (classical) formal logic languages defined on the underlying mathematical graphs. In particular, we show that for all graph state resources which can yield a computational speed-up with respect to classical computation, the underlying graphs—describing the quantum correlations of the states—are associated with undecidable logic theories. Here undecidability is to be interpreted in a sense similar to Gödel’s incompleteness results, meaning that there exist propositions, expressible in the above classical formal logic, which cannot be proven or disproven. (shrink)
A Computable Universe is a collection of papers discussing computation in nature and the nature of computation, a compilation of the views of the pioneers in the contemporary area of intellectual inquiry focused on computational and informational theories of the world. This volume is the definitive source of informational/computational views of the world, and of cutting-edge models of the universe, both digital and quantum, discussed from a philosophical perspective as well as in the greatest technical detail. The book discusses the foundations of computation in relation to nature. It focuses on two main questions: What is computation? How does nature compute? The contributors are world-renowned experts who have helped shape a cutting-edge computational understanding of the universe. They discuss computation in the world from a variety of perspectives, ranging from foundational concepts to pragmatic models to ontological conceptions and their philosophical implications. The volume provides a state-of-the-art collection of technical papers and non-technical essays representing a field that takes information and computation to be key to understanding and explaining the basic structure underpinning physical reality. It also includes a new edition of Konrad Zuse's "Calculating Space", and a panel discussion transcription on the topic, featuring worldwide experts (including a Nobel laureate) in quantum mechanics, physics, cognition, computation and algorithmic complexity. (shrink)
This paper presents the first bibliometric mapping analysis of the field of computer and information ethics (C&IE). It provides a map of the relations between 400 key terms in the field. This term map can be used to get an overview of concepts and topics in the field and to identify relations between information and communication technology concepts on the one hand and ethical concepts on the other hand. To produce the term map, a data set of over a thousand articles published in leading journals and conference proceedings in the C&IE field was constructed. With the help of various computer algorithms, key terms were identified in the titles and abstracts of the articles and co-occurrence frequencies of these key terms were calculated. Based on the co-occurrence frequencies, the term map was constructed. This was done using a computer program called VOSviewer. The term map provides a visual representation of the C&IE field and, more specifically, of the organization of the field around three main concepts, namely privacy, ethics, and the Internet. (shrink)
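The co-occurrence step described above - counting how often pairs of key terms appear together in the same title or abstract - is the input that a tool like VOSviewer turns into a term map. A minimal sketch of that counting step, with invented toy term sets rather than the paper's actual data:

```python
from collections import Counter
from itertools import combinations

# Toy stand-in for extracted key terms per abstract (not the paper's data).
abstracts = [
    {"privacy", "ethics", "internet"},
    {"privacy", "surveillance"},
    {"ethics", "internet"},
]

# Count each unordered pair of terms once per abstract in which both occur.
cooccurrence = Counter()
for terms in abstracts:
    for a, b in combinations(sorted(terms), 2):
        cooccurrence[(a, b)] += 1

print(cooccurrence[("ethics", "internet")])   # 2: appears in two abstracts
print(cooccurrence[("privacy", "surveillance")])  # 1
```

A mapping program then lays out the terms so that frequently co-occurring pairs sit close together, which is what produces the clustering around concepts like privacy, ethics, and the Internet.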
The recent development of the research field of Computing and Philosophy has triggered investigations into the theoretical foundations of computing and information. This thesis consists of two parts, which are the result of studies in two areas of Philosophy of Computing (PC) and Philosophy of Information (PI) regarding the production of meaning (semantics) and the value system with applications (ethics). The first part develops a unified dual-aspect theory of information and computation, in which information is characterized as structure, and computation is the information dynamics. This enables naturalization of epistemology, based on interactive information representation and communication. In the study of systems modeling, meaning, truth and agency are discussed within the framework of the PI/PC unification. The second part of the thesis addresses the necessity of ethical judgment in rational agency, illustrated by the problem of information privacy and surveillance in the networked society. The value grounds and socio-technological solutions for securing trustworthiness of computing are analyzed. Privacy issues clearly show the need for computing professionals to contribute to the understanding of the technological mechanisms of Information and Communication Technology. The main original contribution of this thesis is the unified dual-aspect theory of computation/information. Semantics of information is seen as a part of the data-information-knowledge structuring, in which complex structures are self-organized by the computational processing of information. Within the unified model, complexity is a result of computational processes on informational structures. The thesis argues for the necessity of computing beyond the Turing-Church limit, motivated by natural computation, and wider by pancomputationalism and paninformationalism, seen as two complementary views of the same physical reality.
Moreover, it follows that pancomputationalism does not depend on the assumption that the physical world on some basic level is digital. Contrary to what many believe, it is entirely compatible with dual (analogue/digital) quantum-mechanical computing. (shrink)
In Darwin’s Dangerous Idea, Daniel Dennett claims that evolution is algorithmic. On Dennett’s analysis, evolutionary processes are trivially algorithmic because he assumes that all natural processes are algorithmic. I will argue that there are more robust ways to understand algorithmic processes that make the claim that evolution is algorithmic empirical and not conceptual. While laws of nature can be seen as compression algorithms of information about the world, it does not follow logically that they are implemented as algorithms by physical processes. For that to be true, the processes have to be part of computational systems. The basic difference between mere simulation and real computing is having proper causal structure. I will show what kind of requirements this poses for natural evolutionary processes if they are to be computational. (shrink)
As a step towards comprehensive computer models of communication, and effective human machine dialogue, some of the relationships between communication and affect are explored. An outline theory is presented of the architecture that makes various kinds of affective states possible, or even inevitable, in intelligent agents, along with some of the implications of this theory for various communicative processes. The model implies that human beings typically have many different, hierarchically organized, dispositions capable of interacting with new information to produce affective states, distract attention, interrupt ongoing actions, and so on. High "insistence" of motives is defined in relation to a tendency to penetrate an attention filter mechanism, which seems to account for the partial loss of control involved in emotions. One conclusion is that emulating human communicative abilities will not be achieved easily. Another is that it will be even more difficult to design and build computing systems that reliably achieve interesting communicative goals. (shrink)
The primary resource for quantum computation is Hilbert-space dimension. Whereas Hilbert space itself is an abstract construction, the number of dimensions available to a system is a physical quantity that requires physical resources. Avoiding a demand for an exponential amount of these resources places a fundamental constraint on the systems that are suitable for scalable quantum computation. To be scalable, the effective number of degrees of freedom in the computer must grow nearly linearly with the number of qubits in an equivalent qubit-based quantum computer. (shrink)
There is no consensus as to whether a Liar sentence is meaningful or not. Still, a widespread conviction with respect to Liar sentences (and other ungrounded sentences) is that, whether or not they are meaningful, they are useless. The philosophical contribution of this paper is to put this conviction into question. Using the framework of assertoric semantics, which is a semantic valuation method for languages of self-referential truth that has been developed by the author, we show that certain computational problems, called query structures, can be solved more efficiently by an agent who has self-referential resources (amongst which are Liar sentences) than by an agent who has only classical resources; we establish the computational power of self-referential truth. The paper concludes with some thoughts on the implications of the established result for deflationary accounts of truth.
The interpretation of quantum mechanics is an area of increasing interest to many working physicists. In particular, interest has come from those involved in quantum computing and information theory, as there has always been a strong foundational element in this field. This paper introduces one interpretation of quantum mechanics, a modern ‘many-worlds’ theory, from the perspective of quantum computation. Reasons for seeking to interpret quantum mechanics are discussed, then the specific ‘neo-Everettian’ theory is introduced and its claim as the best available interpretation defended. The main objections to the interpretation, including the so-called “problem of probability”, are shown to fail. The local nature of the interpretation is demonstrated, and the implications of this both for the interpretation and for quantum mechanics more generally are discussed. Finally, the consequences of the theory for quantum computation are investigated, and common objections to using many worlds to describe quantum computing are answered. We find that using this particular many-worlds theory as a physical foundation for quantum computation gives several distinct advantages over other interpretations, and over not interpreting quantum theory at all.
The paper offers an analysis of the problem of integrating ethical principles into the practice of software design. The approach is grounded on a review of the relevant literature from Computer Ethics and Professional Ethics. The paper is divided into four sections. The first section reviews some key questions that arise when the ethical impact of computational artefacts is analysed. The inner informational nature of such questions is used to argue in favour of the need for a specific branch of ethics called Information Ethics. This branch deals with a specific class of ethical problems, and Informational Privacy is introduced as a paradigmatic example. The second section analyses the ethical nature of computational artefacts. This section highlights the fact that this nature is impossible to comprehend without first considering designers, users, and patients alongside the artefacts they create, use and are affected by. Some key ethical concepts are discussed, such as freedom, agency, control, autonomy and accountability. The third section illustrates how autonomous computational artefacts are rapidly changing the way in which computation is used and perceived. The description of the ethical challenges posed to software engineers by this shift in perspective closes the section. The fourth and last section of the paper is dedicated to a discussion of Professional Ethics for software engineers. After establishing the limits of the professional codes of practice, it is argued that ethical considerations are best embedded directly into software design practice. In this context, the Value Sensitive Design approach is considered and insight is given into how this is being integrated into current research in ethical design methodologies.
In an effort to uncover fundamental differences between computers and brains, this paper identifies computation with a particular kind of physical process, in contrast to interpreting the behaviors of physical systems as one or more abstract computations. That is, whether or not a system is computing depends on how those aspects of the system we consider to be informational physically cause change, rather than on our capacity to describe its behaviors in computational terms. A physical framework based on the notion of causal mechanism is used to distinguish different kinds of information processing in a physically-principled way; each information processing type is associated with a particular causal mechanism. The causal mechanism associated with computation is pattern matching, which is physically defined as the fitting of physical structures such that they cause a simple change. It is argued that information processing in the brain is based on a causal mechanism different from pattern matching so defined, implying that brains do not compute, at least not in the physical sense that digital computers do. This causal difference may also mean that computers cannot have mental states.