Abstract
``Neural computing'' is a research field based on perceiving the human brain as an information system. This system reads its input continuously via the different senses, encodes data into various biophysical variables such as membrane potentials or neural firing rates, stores information using different kinds of memory (e.g., short-term memory, long-term memory, associative memory), performs some operations called ``computation'', and outputs onto various channels, including motor control commands, decisions, thoughts, and feelings. We show a natural model of neural computing that gives rise to hyper-computation. Rigorous mathematical analysis is applied, explicating our model's exact computational power and how it changes as the parameters change. Our analog neural network allows for supra-Turing power while keeping track of computational constraints, and thus embeds a possible explanation for the superiority of biological intelligence within the framework of classical computer science. We further propose it as a standard in the field of analog computation, functioning in a role similar to that of the universal Turing machine in digital computation. In particular, an analog of the Church-Turing thesis of digital computation is stated, in which the neural network takes the place of the Turing machine.
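The analog recurrent network underlying this line of work updates its neurons synchronously by an affine combination of states and inputs passed through a saturated-linear activation. The following is a minimal sketch of one such update rule; the particular weight values and network size are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def sigma(x):
    # Saturated-linear activation: clip each coordinate to [0, 1].
    return np.clip(x, 0.0, 1.0)

def step(x, u, A, B, c):
    # One synchronous update of an analog recurrent network:
    #   x(t+1) = sigma(A x(t) + B u(t) + c)
    return sigma(A @ x + B @ u + c)

# Toy instance (hypothetical weights): 2 neurons, 1 input line.
A = np.array([[0.5, 0.25],
              [0.0, 0.5]])
B = np.array([[1.0],
              [0.0]])
c = np.array([0.0, 0.1])

x = np.zeros(2)          # initial state
u = np.array([1.0])      # constant input signal
for t in range(5):
    x = step(x, u, A, B, c)
```

The computational power of such networks hinges on the precision of the weights: rational weights yield Turing-equivalent power, while real-valued weights allow super-Turing behavior, which is why the state stays in the analog interval [0, 1] rather than being quantized.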
Index Terms
- Neural and Super-Turing Computing