Philosophical Psychology 10 (4):437-49 (1997)
The dominant position in the field of artificial intelligence (AI) is computationalism, whose operative principle is that cognition in general, and consciousness in particular, can be captured by identifying the proper set of computations. This position has been attacked from several angles, most effectively, in my opinion, by John Searle in his now famous Chinese Room thought experiment. I critique this Searlean perspective on the grounds that, while it is probably correct in its essentials, it does not go far enough: quite simply, it runs afoul of the problem of emergentism. The proffered solution to this problem is that consciousness (or very rudimentary forms of it) needs to be viewed as an inherent property of organic form. While this recasting of the problem solves the emergentist dilemma, it opens up a number of other issues. The new problems, however, unlike the old, appear in principle to be amenable to scientific analysis.
Keywords: Artificial Intelligence, Computation, Consciousness, Science, Searle, J.
References found in this work
Roger Penrose (1989). The Emperor's New Mind. Oxford University Press.
David J. Chalmers (1996). The Conscious Mind: In Search of a Fundamental Theory. Oxford University Press.
Thomas Nagel (1974). What is It Like to Be a Bat? Philosophical Review 83 (October):435-50.
John R. Searle (1980). Minds, Brains and Programs. Behavioral and Brain Sciences 3 (3):417-57.
Similar books and articles
John Mark Bishop (2003). Dancing with Pixies: Strong Artificial Intelligence and Panpsychism. In John M. Preston & Michael A. Bishop (eds.), Views Into the Chinese Room: New Essays on Searle and Artificial Intelligence. Oxford University Press
Larry Hauser (1997). Searle's Chinese Box: Debunking the Chinese Room Argument. Minds and Machines 7 (2):199-226.
David J. Chalmers (1994). On Implementing a Computation. Minds and Machines 4 (4):391-402.
Dale Jacquette (1990). Fear and Loathing (and Other Intentional States) in Searle's Chinese Room. Philosophical Psychology 3 (2 & 3):287-304.
Mahesh Ananth (2010). The Scientific Study of Consciousness: Searle’s Radical Request. Psyche 16 (2):59-89.
Murat Aydede & Guven Guzeldere (2000). Consciousness, Intentionality, and Intelligence: Some Foundational Issues for Artificial Intelligence. Journal of Experimental and Theoretical Artificial Intelligence 12 (3):263-277.
Larry Hauser (2003). Nixin' Goes to China. In John M. Preston & John Mark Bishop (eds.), Views Into the Chinese Room: New Essays on Searle and Artificial Intelligence. Oxford University Press 123--143.
John M. Preston & John Mark Bishop (eds.) (2002). Views Into the Chinese Room: New Essays on Searle and Artificial Intelligence. Oxford University Press.
Kevin Warwick (2002). Alien Encounters. In John M. Preston & John Mark Bishop (eds.), Views Into the Chinese Room: New Essays on Searle and Artificial Intelligence. Oxford: Clarendon Press 308.
Roger Penrose (2003). Consciousness, Computation, and the Chinese Room. In John M. Preston & Michael A. Bishop (eds.), Views Into the Chinese Room: New Essays on Searle and Artificial Intelligence. Oxford University Press
Added to index: 2009-01-28