About this topic
Summary Pancomputationalism is a term encompassing all paradigms of a computational world, which proceed from the realization that nature can successfully be explained by computable scientific models. It takes the concepts of functionalism and computationalism to their ultimate consequences, envisaging a world in which all physical processes are carried out by a computer. In other words, it encompasses all paradigms that see the universe as a computer program. The strongest form of pancomputationalism is the paradigm of a digital, Turing-computable world, but there are opposing paradigms with their own computational models.
Key works While for some authors the world and its natural processes are deterministic and digital, based on classical mechanics (e.g. Zuse 1969, Fredkin 1990), for others the world clearly cannot be the result of classical computation (Feynman 1982, Deutsch 1997, Lloyd 2010), because that would leave quantum phenomena unaccounted for. The main question, however, is which processes are most fundamental. Some authors hold that quantum phenomena are an emergent property of information and computation (Wheeler 1990, Wolfram 2002). The main opposing pancomputational views claim that no current scientific theory can fully account for natural phenomena such as consciousness (e.g. Penrose 1999), or for a world in which indeterministic randomness actually occurs and free will is possible (e.g. Scheidl et al. 2010); they do so, for example, by strictly assuming the Copenhagen interpretation of quantum mechanics. A weaker form of pancomputationalism entails an algorithmic view of the world and of nature (Chaitin 2012, Zenil ms), independent of the underlying computational model.
Introductions Lloyd 2010, Lloyd 2007, Seife 2007, Zenil ms
  1. Francesco Berto & Jacopo Tagliabue (2012). Cellular Automata. Stanford Encyclopedia of Philosophy.
    Cellular automata (henceforth: CA) are discrete, abstract computational systems that have proved useful both as general models of complexity and as more specific representations of non-linear dynamics in a variety of scientific fields. Firstly, CA are (typically) spatially and temporally discrete: they are composed of a finite or denumerable set of homogeneous, simple units, the atoms or cells. At each time unit, the cells instantiate one of a finite set of states. They evolve in parallel at discrete time steps, following (...)
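    (A minimal code sketch of an elementary cellular automaton appears after this list.)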
  2. John Mark Bishop (2003). Dancing with Pixies: Strong Artificial Intelligence and Panpsychism. In John M. Preston & Michael A. Bishop (eds.), Views Into the Chinese Room: New Essays on Searle and Artificial Intelligence. Oxford University Press.
  3. David J. Chalmers (1996). Does a Rock Implement Every Finite-State Automaton? Synthese 108 (3):309-33.
    Hilary Putnam has argued that computational functionalism cannot serve as a foundation for the study of the mind, as every ordinary open physical system implements every finite-state automaton. I argue that Putnam's argument fails, but that it points out the need for a better understanding of the bridge between the theory of computation and the theory of physical systems: the relation of implementation. It also raises questions about the class of automata that can serve as a basis for understanding the (...)
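    (A toy finite-state automaton sketch appears after this list.)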
  4. Ronald L. Chrisley (1994). Why Everything Doesn't Realize Every Computation. Minds and Machines 4 (4):403-20.
    Some have suggested that there is no fact of the matter as to whether or not a particular physical system realizes a particular computational description. This suggestion has been taken to imply that computational states are not real, and cannot, for example, provide a foundation for the cognitive sciences. In particular, Putnam has argued that every ordinary open physical system realizes every abstract finite automaton, implying that the fact that a particular computational characterization applies to a physical system does not (...)
  5. David Deutsch (1997). The Fabric of Reality. Allen Lane.
    An extraordinary and challenging synthesis of ideas uniting Quantum Theory and the theories of Computation, Knowledge and Evolution, Deutsch's book explores the deep connections between these strands, which reveal the fabric ...
  6. Gordana Dodig-Crnkovic (2008). Knowledge Generation as Natural Computation. Journal of Systemics, Cybernetics and Informatics 6 (2).
    Knowledge generation can be naturalized by adopting a computational model of cognition and an evolutionary approach. In this framework knowledge is seen as a result of the structuring of input data (data → information → knowledge) by an interactive computational process going on in the agent during its adaptive interplay with the environment, which presents a clear developmental advantage by increasing the agent's ability to cope with the dynamics of its situation. This paper addresses the mechanism of knowledge generation, a process that may be modeled as (...)
  7. Gordana Dodig-Crnkovic, Semantics of Information as Interactive Computation. Proceedings of the Fifth International Workshop on Philosophy and Informatics 2008.
    Computers today are not only calculation tools; they act directly in the physical world, which may itself be conceived of as the universal computer (Zuse, Fredkin, Wolfram, Chaitin, Lloyd). In expanding its domains from abstract logical symbol manipulation to physical, embedded and networked devices, computing goes beyond the Church-Turing limit (Copeland, Siegelman, Burgin, Schachter). Computational processes are distributed, reactive, interactive, agent-based and concurrent. The main criterion of success of computation is not its termination, but the adequacy of its (...)
  8. Gordana Dodig-Crnkovic (2008). Empirical Modeling and Information Semantics. Mind & Society 7 (2):157.
    This paper investigates the relationship between reality and model, information and truth. It argues that meaningful data need not be true in order to constitute information. Information to which a truth-value cannot be ascribed, partially true information, or even false information can lead to interesting outcomes such as technological innovation or scientific breakthroughs. In the research process, during the transition between two theoretical frameworks, there is a dynamic mixture of old and new concepts in which truth is not well (...)
  9. Gordana Dodig-Crnkovic (2003). Shifting the Paradigm of Philosophy of Science: Philosophy of Information and a New Renaissance. [REVIEW] Minds and Machines 13 (4):521-536.
    Computing is changing the traditional field of Philosophy of Science in a very profound way. First, as a methodological tool, computing makes possible "experimental Philosophy", which is able to provide practical tests for different philosophical ideas. At the same time the ideal object of investigation of the Philosophy of Science is changing. For a long period of time the ideal science was Physics (e.g., Popper, Carnap, Kuhn, and Chalmers). Now the focus is shifting to the field of Computing/Informatics. There are (...)
  10. Amit Hagar & Giuseppe Sergioli, Counting Steps: A Finitist Interpretation of Objective Probability in Physics.
    We propose a new interpretation of objective deterministic chances in statistical physics based on physical computational complexity. This notion applies to a single physical system (be it an experimental set-up in the lab, or a subsystem of the universe), and quantifies (1) the difficulty of realizing one physical state given another, (2) the 'distance' (in terms of physical resources) from one physical state to another, and (3) the size of the set of time-complexity functions that are compatible with the physical (...)
  11. Gualtiero Piccinini, Computation in Physical Systems. Stanford Encyclopedia of Philosophy.
  12. Gualtiero Piccinini (2007). Computational Modeling vs. Computational Explanation: Is Everything a Turing Machine, and Does It Matter to the Philosophy of Mind? Australasian Journal of Philosophy 85 (1):93-115.
    According to pancomputationalism, everything is a computing system. In this paper, I distinguish between different varieties of pancomputationalism. I find that although some varieties are more plausible than others, only the strongest variety is relevant to the philosophy of mind, but only the most trivial varieties are true. As a side effect of this exercise, I offer a clarified distinction between computational modelling and computational explanation.
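Illustration for entry 1 (Berto & Tagliabue): the abstract describes cellular automata as lattices of simple cells that each take one of finitely many states and update in parallel at discrete time steps. The following is a minimal sketch of an elementary (one-dimensional, two-state, nearest-neighbour) cellular automaton; the rule number, grid width and step count are arbitrary illustrative choices, not taken from the entry.

```python
# Minimal sketch of an elementary (1-D, two-state, radius-1) cellular automaton.
# Rule number, width and step count are illustrative choices, not from the entry.

def step(cells, rule):
    """One synchronous update: each cell's new state is determined by its
    3-cell neighbourhood (left, self, right), looked up in the rule's bits."""
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

def run(width=31, steps=15, rule=110):
    cells = [0] * width
    cells[width // 2] = 1          # start from a single live cell in the middle
    for _ in range(steps):
        print("".join("#" if c else "." for c in cells))
        cells = step(cells, rule)

if __name__ == "__main__":
    run()
```

Running the sketch prints one line of the lattice per time step, making the discreteness in both space and time described in the abstract directly visible.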
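Illustration for entry 3 (Chalmers): the debate turns on what it takes for a physical system to implement a finite-state automaton, that is, a set of states, an input alphabet, a transition function, a start state and accepting states. The toy automaton below is invented purely for illustration (its states, alphabet and transition table come from neither Chalmers nor Putnam); it accepts binary strings containing an even number of 1s. The philosophical question is what must be true of a physical system for its causal structure to mirror such a transition table.

```python
# Toy deterministic finite-state automaton: accepts binary strings containing
# an even number of 1s. States, alphabet and transitions are illustrative only.

TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}
START, ACCEPTING = "even", {"even"}

def accepts(string):
    """Run the automaton on `string`, consuming one symbol per discrete step."""
    state = START
    for symbol in string:
        state = TRANSITIONS[(state, symbol)]
    return state in ACCEPTING

if __name__ == "__main__":
    for s in ["", "0", "1", "11", "1010", "111"]:
        print(f"{s!r}: {accepts(s)}")
```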