About this topic
Summary The theory of computation is a mathematical theory of the properties of abstract computational objects, such as algorithms and Turing machines. They are abstract in the sense that they ignore or leave out features of physical implementations, such as finite memory. In contrast, computations are done by physical systems: concrete machines made of silicon and metal, or brains made of biological materials, can run algorithms or implement Turing machines. This area is concerned with questions about how the abstract objects in the purview of the theory of computation relate to physical systems.
Key works The relationship between abstract computation and physical systems such as brains is a central issue in the philosophy of mind, particularly given the rise of computational functionalism as a foundation for the study of the mind. Here the work of Chalmers 1996 provides a good starting point for bridging the theory of computation with theories of physical systems by means of an implementation relation.
Introductions A good introduction is Piccinini 2010.
Related categories

79 found
1 — 50 / 79
  1. Client-Server Based Remote Access Through the Internet: Internet Based Remote Process Control.Mohammed Abdullah Hussein - 2011 - Germany: LAP Lambert Academic Publishing.
Internet-based process control has grown in recent years, driven by industry demands ranging from factory, office, and home automation to task simplification and cost reduction. In this book, a hardware interface circuit and a software system used to control the temperature and level of a liquid tank are described. The advantage of the designed interface circuit is its simplicity and low cost. The same can be said of the software system, in which we (...)
  2. A Note on the Physical Possibility of Transfinite Computation.W. Aitken & J. A. Barrett - 2010 - British Journal for the Philosophy of Science 61 (4):867-874.
    In this note, we consider constraints on the physical possibility of transfinite Turing machines that arise from how one models the continuous structure of space and time in one's best physical theories. We conclude by suggesting a version of Church's thesis appropriate as an upper bound for physical computation given how space and time are modeled on our current physical theories.
  3. The Newell Test for a Theory of Cognition.John R. Anderson & Christian Lebiere - 2003 - Behavioral and Brain Sciences 26 (5):587-601.
    Newell proposed that cognitive theories be developed in an effort to satisfy multiple criteria and to avoid theoretical myopia. He provided two overlapping lists of 13 criteria that the human cognitive architecture would have to satisfy in order to be functional. We have distilled these into 12 criteria: flexible behavior, real-time performance, adaptive behavior, vast knowledge base, dynamic behavior, knowledge integration, natural language, learning, development, evolution, and brain realization. There would be greater theoretical progress if we evaluated theories by a (...)
  4. Review of Computability: Turing, Gödel, Church, and Beyond. [REVIEW]Andrew Arana - 2015 - Notre Dame Philosophical Reviews 3 (20).
  5. Hans Moravec, Robot. Mere Machine to Transcendent Mind, New York, NY: Oxford University Press, Inc., 1999, IX + 227 Pp., $25.00 (Cloth), ISBN 0-19-511630-. [REVIEW]Peter M. Asaro - 2001 - Minds and Machines 11 (1):143-147.
  6. Dynamic Mechanistic Explanation: Computational Modeling of Circadian Rhythms as an Exemplar for Cognitive Science.William Bechtel & Adele Abrahamsen - 2010 - Studies in History and Philosophy of Science Part A 41 (3):321-333.
    Two widely accepted assumptions within cognitive science are that (1) the goal is to understand the mechanisms responsible for cognitive performances and (2) computational modeling is a major tool for understanding these mechanisms. The particular approaches to computational modeling adopted in cognitive science, moreover, have significantly affected the way in which cognitive mechanisms are understood. Unable to employ some of the more common methods for conducting research on mechanisms, cognitive scientists’ guiding ideas about mechanism have developed in conjunction with their (...)
  7. The World is Either Digital or Analogue.Francesco Berto & Jacopo Tagliabue - 2014 - Synthese 191 (3):481-497.
    We address an argument by Floridi (Synthese 168(1):151–178, 2009; 2011a), to the effect that digital and analogue are not features of reality, only of modes of presentation of reality. One can therefore have an informational ontology, like Floridi’s Informational Structural Realism, without commitment to a supposedly digital or analogue world. After introducing the topic in Sect. 1, in Sect. 2 we explain what the proposition expressed by the title of our paper means. In Sect. 3, we describe Floridi’s argument. In (...)
  8. Cellular Automata.Francesco Berto & Jacopo Tagliabue - 2012 - Stanford Encyclopedia of Philosophy.
    Cellular automata (henceforth: CA) are discrete, abstract computational systems that have proved useful both as general models of complexity and as more specific representations of non-linear dynamics in a variety of scientific fields. Firstly, CA are (typically) spatially and temporally discrete: they are composed of a finite or denumerable set of homogeneous, simple units, the atoms or cells. At each time unit, the cells instantiate one of a finite set of states. They evolve in parallel at discrete time steps, following (...)
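    The discrete, parallel, local update that the entry describes can be made concrete with a minimal sketch of an elementary cellular automaton. The rule number (Wolfram's Rule 110) and the single-cell initial condition below are chosen purely for illustration, not taken from the entry itself:

    ```python
    # Elementary cellular automaton: each cell holds one of two states, and all
    # cells update in parallel at each discrete time step from their own state
    # and that of their two neighbours (wrapping at the edges).

    RULE = 110  # Wolfram rule number: bit i of RULE gives the next state of a
                # cell whose (left, self, right) neighbourhood encodes integer i

    def step(cells):
        """Update every cell in parallel from its three-cell neighbourhood."""
        n = len(cells)
        return [
            (RULE >> (cells[(i - 1) % n] << 2 | cells[i] << 1 | cells[(i + 1) % n])) & 1
            for i in range(n)
        ]

    # Start from a single live cell and evolve a few discrete time steps.
    cells = [0] * 15
    cells[7] = 1
    for _ in range(5):
        print("".join(".#"[c] for c in cells))
        cells = step(cells)
    ```

    Each row printed is one discrete time unit; the homogeneity of the cells shows up in the fact that a single `step` function updates every cell by the same local rule.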
  9. There’s Plenty of Boole at the Bottom: A Reversible CA Against Information Entropy.Francesco Berto, Jacopo Tagliabue & Gabriele Rossi - 2016 - Minds and Machines 26 (4):341-357.
    “There’s Plenty of Room at the Bottom”, said the title of Richard Feynman’s 1959 seminal conference at the California Institute of Technology. Fifty years on, nanotechnologies have led computer scientists to pay close attention to the links between physical reality and information processing. Not all the physical requirements of optimal computation are captured by traditional models—one still largely missing is reversibility. The dynamic laws of physics are reversible at microphysical level, distinct initial states of a system leading to distinct final (...)
  10. Why Computer Simulations Are Not Inferences, and in What Sense They Are Experiments.Florian J. Boge - 2019 - European Journal for Philosophy of Science 9 (1):13.
    The question of where, between theory and experiment, computer simulations locate on the methodological map is one of the central questions in the epistemology of simulation. The two extremes on the map have them either be a kind of experiment in their own right, 317–329, 2005; Morrison Philosophical Studies, 143, 33–57, 2009; Morrison 2015; Massimi and Bhimji Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics, 51, 71–81, 2015; Parker Synthese, 169, 483–496, (...)
  11. Agent-Based Modeling: The Right Mathematics for the Social Sciences?Paul L. Borrill & Leigh Tesfatsion - 2011 - In J. B. Davis & D. W. Hands (eds.), Elgar Companion to Recent Economic Methodology. Edward Elgar Publishers. pp. 228.
    This study provides a basic introduction to agent-based modeling (ABM) as a powerful blend of classical and constructive mathematics, with a primary focus on its applicability for social science research. The typical goals of ABM social science researchers are discussed along with the culture-dish nature of their computer experiments. The applicability of ABM for science more generally is also considered, with special attention to physics. Finally, two distinct types of ABM applications are summarized in order to illustrate concretely the duality (...)
  12. Parallel Machines.Andrew Boucher - 1997 - Minds and Machines 7 (4):543-551.
Because it is time-dependent, parallel computation is fundamentally different from sequential computation. Parallel programs are non-deterministic and are not effective procedures. Given that the brain operates in parallel, this casts doubt on AI's attempt to make sequential computers intelligent.
  13. On Communication and Computation.Paul Bohan Broderick - 2004 - Minds and Machines 14 (1):1-19.
Comparing technical notions of communication and computation leads to a surprising result: these notions are often not conceptually distinguishable. This paper will show how the two notions may fail to be clearly distinguished from each other. The most famous models of computation and communication, Turing Machines and (Shannon-style) information sources, are considered. The most significant difference lies in the types of state-transitions allowed in each sort of model. This difference does not correspond to the difference that would be expected after (...)
  14. Randomness & Complexity, From Leibniz to Chaitin.Christian Calude (ed.) - 2007 - World Scientific Pub Co.
    This book is a collection of papers written by a selection of eminent authors from around the world in honour of Gregory Chaitin's 60th birthday.
  15. Proving Darwin: Making Biology Mathematical.G. J. Chaitin - 2012 - Pantheon.
    Groundbreaking mathematician Gregory Chaitin gives us the first book to posit that we can prove how Darwin’s theory of evolution works on a mathematical level.
  16. On Effective Procedures.Carol E. Cleland - 2002 - Minds and Machines 12 (2):159-179.
    Since the mid-twentieth century, the concept of the Turing machine has dominated thought about effective procedures. This paper presents an alternative to Turing's analysis; it unifies, refines, and extends my earlier work on this topic. I show that Turing machines cannot live up to their billing as paragons of effective procedure; at best, they may be said to provide us with mere procedure schemas. I argue that the concept of an effective procedure crucially depends upon distinguishing procedures as definite courses (...)
  17. Computational Processes: A Reply to Chalmers and Copeland.Cristian Cocos - 2002 - SATS: Northern European Journal of Philosophy 3 (2):25-49.
  18. Why Build a Virtual Brain? Large-Scale Neural Simulations as Jump Start for Cognitive Computing.Matteo Colombo - 2016 - Journal of Experimental and Theoretical Artificial Intelligence.
    Despite the impressive amount of financial resources recently invested in carrying out large-scale brain simulations, it is controversial what the pay-offs are of pursuing this project. One idea is that from designing, building, and running a large-scale neural simulation, scientists acquire knowledge about the computational performance of the simulating system, rather than about the neurobiological system represented in the simulation. It has been claimed that this knowledge may usher in a new era of neuromorphic, cognitive computing systems. This study elucidates (...)
  19. Accelerating Turing Machines.B. Jack Copeland - 2002 - Minds and Machines 12 (2):281-300.
Accelerating Turing machines are Turing machines of a sort able to perform tasks that are commonly regarded as impossible for Turing machines. For example, they can determine whether or not the decimal representation of π contains n consecutive 7s, for any n; solve the Turing-machine halting problem; and decide the predicate calculus. Are accelerating Turing machines, then, logically impossible devices? I argue that they are not. There are implications concerning the nature of effective procedures and the theoretical limits of computability. Contrary (...)
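    The timing schedule behind an accelerating Turing machine can be illustrated with a short sketch. The idea (this is an illustration of the geometric schedule only, not of any actual hypercomputation) is that if step n takes 2⁻ⁿ time units, then infinitely many steps fit inside 2 time units:

    ```python
    # Idealized timing of an accelerating Turing machine: the first step takes
    # 1 time unit, the second 1/2, the third 1/4, and so on. The total time for
    # any finite number of steps stays strictly below 2, so the (notional)
    # machine completes infinitely many steps within a finite time bound.

    def elapsed_after(n_steps):
        """Total idealized time consumed by the first n_steps accelerating steps."""
        return sum(2.0 ** -k for k in range(n_steps))

    for n in (1, 10, 50):
        print(n, elapsed_after(n))  # approaches, but never reaches, 2.0
    ```

    This is why such machines can, notionally, settle questions like the halting problem: the answer is available at the 2-unit mark even though no single step is the last one.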
  20. Hypercomputation.B. Jack Copeland - 2002 - Minds and Machines 12 (4):461-502.
  21. What is Computation?B. Jack Copeland - 1996 - Synthese 108 (3):335-59.
To compute is to execute an algorithm. More precisely, to say that a device or organ computes is to say that there exists a modelling relationship of a certain kind between it and a formal specification of an algorithm and supporting architecture. The key issue is to delimit the phrase 'of a certain kind'. I call this the problem of distinguishing between standard and nonstandard models of computation. The successful drawing of this distinction guards Turing's 1936 analysis of computation against (...)
  22. Physical Computation: How General Are Gandy's Principles for Mechanisms?B. Jack Copeland & Oron Shagrir - 2007 - Minds and Machines 17 (2):217-231.
    What are the limits of physical computation? In his ‘Church’s Thesis and Principles for Mechanisms’, Turing’s student Robin Gandy proved that any machine satisfying four idealised physical ‘principles’ is equivalent to some Turing machine. Gandy’s four principles in effect define a class of computing machines (‘Gandy machines’). Our question is: What is the relationship of this class to the class of all (ideal) physical computing machines? Gandy himself suggests that the relationship is identity. We do not share this view. We (...)
  23. Even Turing Machines Can Compute Uncomputable Functions.Jack Copeland - unknown
    Accelerated Turing machines are Turing machines that perform tasks commonly regarded as impossible, such as computing the halting function. The existence of these notional machines has obvious implications concerning the theoretical limits of computability.
  24. The Broad Conception of Computation.Jack Copeland - 1997 - American Behavioral Scientist 40 (6):690-716.
    A myth has arisen concerning Turing's paper of 1936, namely that Turing set forth a fundamental principle concerning the limits of what can be computed by machine - a myth that has passed into cognitive science and the philosophy of mind, to wide and pernicious effect. This supposed principle, sometimes incorrectly termed the 'Church-Turing thesis', is the claim that the class of functions that can be computed by machines is identical to the class of functions that can be computed by (...)
  25. Computing Mechanisms and Autopoietic Systems.Joe Dewhurst - 2016 - In Vincent Müller (ed.), Computing and Philosophy. Springer Verlag. pp. 17-26.
    This chapter draws an analogy between computing mechanisms and autopoietic systems, focusing on the non-representational status of both kinds of system (computational and autopoietic). It will be argued that the role played by input and output components in a computing mechanism closely resembles the relationship between an autopoietic system and its environment, and in this sense differs from the classical understanding of inputs and outputs. The analogy helps to make sense of why we should think of computing mechanisms as non-representational, (...)
  26. Individuation Without Representation.Joe Dewhurst - 2016 - British Journal for the Philosophy of Science:axw018.
    Shagrir (2001) and Sprevak (2010) explore the apparent necessity of representation for the individuation of digits (and processors) in computational systems. I will first offer a response to Sprevak’s argument that does not mention Shagrir’s original formulation, which was more complex. I then extend my initial response to cover Shagrir’s argument, thus demonstrating that it is possible to individuate digits in non-representational computing mechanisms. I also consider the implications that the non-representational individuation of digits would have for the broader theory (...)
  27. Physical Computation and Cognitive Science.Gordana Dodig-Crnkovic - 2016 - Australasian Journal of Philosophy 94 (2): 396-399.
    This is a book review of Nir Fresco's book, published in Australasian Journal of Philosophy (0004-8402). Vol. 94 (2016), 2, p. 396-399.
  28. Unfolding Cognitive Capacities.Jacques Dubucs - 2006 - In D. Andler, M. Okada & I. Watanabe (eds.), Reasoning and Cognition. pp. 95--101.
As regards cognitive capacities, the point of view of classical Artificial Intelligence has been much challenged by the so-called emergentist point of view. This paper attempts to outline, on the basis of logical considerations dealing with practical feasibility, a general theory of incompressible unfolding that is consonant with an old Leibnizian stance rather than with the contemporary theory of complexity. I defend a variant of emergentism according to which any process that leads to endow a system with cognitive capacities is such an (...)
  29. Some Philosophical Issues in Computer Science.Amnon Eden - 2011 - Minds and Machines 21 (2):123-133.
    The essays included in the special issue dedicated to the philosophy of computer science examine new philosophical questions that arise from reflection upon conceptual issues in computer science and the insights such an enquiry provides into ongoing philosophical debates.
  30. Three Paradigms of Computer Science.Amnon H. Eden - 2007 - Minds and Machines 17 (2):135-167.
    We examine the philosophical disputes among computer scientists concerning methodological, ontological, and epistemological questions: Is computer science a branch of mathematics, an engineering discipline, or a natural science? Should knowledge about the behaviour of programs proceed deductively or empirically? Are computer programs on a par with mathematical objects, with mere data, or with mental processes? We conclude that distinct positions taken in regard to these questions emanate from distinct sets of received beliefs or paradigms within the discipline: – The rationalist (...)
  31. Searle, Syntax, and Observer-Relativity.Ronald P. Endicott - 1996 - Canadian Journal of Philosophy 26 (1):101-22.
    I critically examine some provocative arguments that John Searle presents in his book The Rediscovery of Mind to support the claim that the syntactic states of a classical computational system are "observer relative" or "mind dependent" or otherwise less than fully and objectively real. I begin by explaining how this claim differs from Searle's earlier and more well-known claim that the physical states of a machine, including the syntactic states, are insufficient to determine its semantics. In contrast, his more recent (...)
  32. Explaining Experience In Nature: The Foundations Of Logic And Apprehension.Steven Ericsson-Zenith - forthcoming - Institute for Advanced Science & Engineering.
    At its core this book is concerned with logic and computation with respect to the mathematical characterization of sentient biophysical structure and its behavior. -/- Three related theories are presented: The first of these provides an explanation of how sentient individuals come to be in the world. The second describes how these individuals operate. And the third proposes a method for reasoning about the behavior of individuals in groups. -/- These theories are based upon a new explanation of experience in (...)
  33. Simulating Physics with Computers.R. P. Feynman - 1982 - International Journal of Theoretical Physics 21 (6):467-488.
  34. Concrete Digital Computation: Competing Accounts and its Role in Cognitive Science.Nir Fresco - 2013 - Dissertation, University of New South Wales
    There are currently considerable confusion and disarray about just how we should view computationalism, connectionism and dynamicism as explanatory frameworks in cognitive science. A key source of this ongoing conflict among the central paradigms in cognitive science is an equivocation on the notion of computation simpliciter. ‘Computation’ is construed differently by computationalism, connectionism, dynamicism and computational neuroscience. I claim that these central paradigms, properly understood, can contribute to an integrated cognitive science. Yet, before this claim can be defended, a better (...)
  35. Erratum To: A Revised Attack on Computational Ontology. [REVIEW]Nir Fresco & Phillip J. Staines - 2014 - Minds and Machines 24 (1):141-141.
    Erratum to: Minds & Machines DOI 10.1007/s11023-013-9327-1Acknowledgment was omitted from the original publication of this article, and appears below.
  36. The Instructional Information Processing Account of Digital Computation.Nir Fresco & Marty J. Wolf - 2014 - Synthese 191 (7):1469-1492.
    What is nontrivial digital computation? It is the processing of discrete data through discrete state transitions in accordance with finite instructional information. The motivation for our account is that many previous attempts to answer this question are inadequate, and also that this account accords with the common intuition that digital computation is a type of information processing. We use the notion of reachability in a graph to defend this characterization in memory-based systems and underscore the importance of instructional information for (...)
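    The abstract's characterization of digital computation as discrete state transitions, checked via reachability in a graph, can be sketched as follows. The three-state toy system below is invented purely for illustration and is not drawn from the paper:

    ```python
    # A digital computation modelled as discrete state transitions over a finite
    # graph, with reachability checked by breadth-first search.
    from collections import deque

    # Toy discrete system: each state maps to the set of its possible successors.
    transitions = {
        "start":   {"working"},
        "working": {"working", "halt"},
        "halt":    set(),
    }

    def reachable(graph, source):
        """Return every state reachable from `source` via the transition relation."""
        seen, frontier = {source}, deque([source])
        while frontier:
            for nxt in graph[frontier.popleft()]:
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(nxt)
        return seen

    print(reachable(transitions, "start"))  # all three states are reachable
    ```

    On this picture, the "finite instructional information" lives in the transition table, and questions about what the system can compute become questions about which states its dynamics can reach.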
  37. Utopias and Dystopias as Cybernetic Information Systems: Envisioning the Posthuman Neuropolity.Matthew E. Gladden - 2015 - Creatio Fantastica (3 (50)).
    While it is possible to understand utopias and dystopias as particular kinds of sociopolitical systems, in this text we argue that utopias and dystopias can also be understood as particular kinds of information systems in which data is received, stored, generated, processed, and transmitted by the minds of human beings that constitute the system’s ‘nodes’ and which are connected according to specific network topologies. We begin by formulating a model of cybernetic information-processing properties that characterize utopias and dystopias. It is (...)
  38. Eric Winsberg: Science in the Age of Computer Simulation. [REVIEW]Stefan Gruner - 2013 - Minds and Machines 23 (2):251-254.
  39. To Balance a Pencil on its Tip: On the Passive Approach to Quantum Error Correction.Amit Hagar - manuscript
Quantum computers are hypothetical quantum information processing (QIP) devices that allow one to store, manipulate, and extract information while harnessing quantum physics to solve various computational problems, and do so putatively more efficiently than any known classical counterpart. Despite many ‘proofs of concept’ (Aharonov and Ben–Or 1996; Knill and Laflamme 1996; Knill et al. 1996; Knill et al. 1998) the key obstacle in realizing these powerful machines remains their scalability and susceptibility to noise: almost three decades after their conception, experimentalists (...)
  40. Ed Fredkin and the Physics of Information - An Inside Story of an Outsider Scientist.Amit Hagar - 2016 - Information and Culture 51 (3):419-443.
    This article tells the story of Ed Fredkin, a pilot, programmer, engineer, hardware designer and entrepreneur, whose work inside and outside academia has influenced major developments in computer science and in the foundations of theoretical physics for the past fifty years.
  41. Effective Procedures Versus Elementary Units of Behavior.John M. Hollerbach - 1981 - Behavioral and Brain Sciences 4 (4):625.
  42. Open Systems and Consciousness: A Philosophical Discussion.Roman Stanisław Ingarden - 2002 - Open Systems and Information Dynamics 9:125-151.
  43. Complex Organisation and Fundamental Physics.Brian D. Josephson - 2018 - Cambridge: Cambridge University Streaming Media Service.
    The file on this site provides the slides for a lecture given in Hangzhou in May 2018, and the lecture itself is available at the URL beginning 'sms' in the set of links provided in connection with this item. -/- It is commonly assumed that regular physics underpins biology. Here it is proposed, in a synthesis of ideas by various authors, that in reality structures and mechanisms of a biological character underpin the world studied by physicists, in principle supplying detail (...)
  44. Indistinguishable From Magic: Computation is Cognitive Technology. [REVIEW]John Kadvany - 2010 - Minds and Machines 20 (1):119-143.
    This paper explains how mathematical computation can be constructed from weaker recursive patterns typical of natural languages. A thought experiment is used to describe the formalization of computational rules, or arithmetical axioms, using only orally-based natural language capabilities, and motivated by two accomplishments of ancient Indian mathematics and linguistics. One accomplishment is the expression of positional value using versified Sanskrit number words in addition to orthodox inscribed numerals. The second is Pāṇini’s invention, around the fifth century BCE, of a formal (...)
  45. A Mechanistic Account of Wide Computationalism.Luke Kersten - 2017 - Review of Philosophy and Psychology 8 (3):501-517.
    The assumption that psychological states and processes are computational in character pervades much of cognitive science, what many call the computational theory of mind. In addition to occupying a central place in cognitive science, the computational theory of mind has also had a second life supporting “individualism”, the view that psychological states should be taxonomized so as to supervene only on the intrinsic, physical properties of individuals. One response to individualism has been to raise the prospect of “wide computational systems”, (...)
  46. Architectures of Intelligent Systems.David Kirsh - 1992 - Exploring Brain Functions:293-321.
    Theories of intelligence can be of use to neuroscientists if they: 1. Provide illuminating suggestions about the functional architecture of neural systems; 2. Suggest specific models of processing that neural circuits might implement. The objective of our session was to stand back and consider the prospects for this interdisciplinary exchange.
  47. Effective Physical Processes and Active Information in Quantum Computing.Ignazio Licata - 2007 - Quantum Biosystems 1 (1):51-65.
The recent debate on hypercomputation has raised new questions both on the computational abilities of quantum systems and on the role of the Church-Turing Thesis in physics. We propose here the idea of “effective physical process” as the essentially physical notion of computation. By using the Bohm and Hiley active information concept we analyze the differences between the standard form (quantum gates) and the non-standard one (adiabatic and morphogenetic) of Quantum Computing, and we point out how its Super-Turing potentialities derive from an incomputable information (...)
  48. Physics of Emergence and Organization.Ignazio Licata & Ammar Sakaji (eds.) - 2008 - World Scientific.
    This book is a state-of-the-art review on the Physics of Emergence. Foreword v Gregory J. Chaitin Preface vii Ignazio Licata Emergence and Computation at the Edge of Classical and Quantum Systems 1 Ignazio Licata Gauge Generalized Principle for Complex Systems 27 Germano Resconi Undoing Quantum Measurement: Novel Twists to the Physical Account of Time 61 Avshalom C. Elitzur and Shahar Dolev Process Physics: Quantum Theories as Models of Complexity 77 Kirsty Kitto A Cross-disciplinary Framework for the Description of Contextually Mediated (...)
  49. The Role of 'Complex' Empiricism in the Debates About Satellite Data and Climate Models.Elisabeth A. Lloyd - 2012 - Studies in History and Philosophy of Science Part A 43 (2):390-401.
Climate scientists have been engaged in a decades-long debate over the standing of satellite measurements of the temperature trends of the atmosphere above the surface of the earth. This is especially significant because skeptics of global warming and the greenhouse effect have utilized this debate to spread doubt about global climate models used to predict future states of climate. I use this case from an under-studied science to illustrate two distinct philosophical approaches to the relation among data, scientists, measurement, models, (...)
  50. Relativity in a Planck-Level Black-Hole Universe Simulation, a Simulation Hypothesis.Malcolm Macleod - manuscript
    The Simulation Hypothesis proposes that all of reality is in fact an artificial simulation, analogous to a computer simulation, and as such our reality is an illusion. It is predicated upon the assumption that enormous amounts of computing power are available. In this article I outline a method with low computational cost for reproducing relativistic mass, space and time at the Planck level. Virtual particles that oscillate between an electric wave-state and a mass point-state are mapped within an expanding black-hole (...)