
Computational Theories of Conscious Experience: Between a Rock and a Hard Place

  • Original Article
  • Published in: Erkenntnis

Abstract

Very plausibly, nothing can be a genuine computing system unless it meets an input-sensitivity requirement. Otherwise all sorts of objects, such as rocks or pails of water, can count as performing computations, even such as might suffice for mentality—thus threatening computationalism about the mind with panpsychism. Maudlin (1989) and Bishop (2002a, b) have argued, however, that such a requirement creates difficulties for computationalism about conscious experience, putting it in conflict with the very intuitive thesis that conscious experience supervenes on physical activity. Klein (2008) proposes a way for computationalists about experience to avoid panpsychism while still respecting the supervenience of experience on activity. I argue that his attempt to save computational theories of experience from Maudlin’s and Bishop’s critique fails.


Notes

  1. This assumes that the machinery that is called upon for the run on (I) is physically distinct from the machinery that would be called upon otherwise, so that the latter can be removed without removing the former. But while such a scenario might be a bit unusual in actual computing machines, it would be very odd and ad hoc for computationalists to say that their theory depends upon the absence of such a scenario.

  2. Thus Pylyshyn (1984: 55): “Very few of the physically discriminable properties of [a] machine are relevant to its computational function…. In fact, any variations in physical properties of distinct components, other than those few properties to which other designated components are meant to react in particular specified ways, can be said to be irrelevant to the machine’s operation as a computer.”

  3. Bishop (2002a, b) adds that since we can turn R1 into R2 by gradually deleting unused execution traces, qualia must fade over the series of systems between R1 and R2. Yet this, he implies, is the very result that Chalmers deplores in his own ‘fading qualia’ argument (Chalmers 1995). However, as Chalmers has countered (on his website), what he denies is that a system’s qualia can fade while its functional organization is held fixed. R1 and R2 differ in their functional organization, so Chalmers is free to hold that the removal of the unactivated states does matter.

  4. Computationalists might deny this on the grounds that Klara has the wrong kind of functional architecture. They may say that experience requires not just that certain computations be performed, but that they be performed in a certain way on a certain sort of architecture. The brain’s architecture is surely nothing like a Turing machine!

    However, ploys of episodic instantiation are not restricted to Turing machines. (Maudlin used them in the interests of generality, but overlooked the limitations of their architecture.) All that needs to be shown, as I am about to explain, is that the counterfactuals supporting a wide range of a system’s responses to a certain class of input can be falsified, while leaving untouched the machinery necessary for execution on one member of that input class. There is no reason to think that this sort of manipulation is only possible in non-sentient computing systems.

  5. Barnes (1991) holds that Olympia lacks φ even when the Klara copies are present, because the input τ is not an ‘active cause’ of her activity, and hence she is not computing π(τ) at all. I have two remarks about this. Firstly, Barnes bases his argument on a dubious analogy between cognition and computation. While it is plausible that cognizing an object requires active causation by the object, that does not entail that computing a function on an input requires active causation by the input. And secondly, in any case, there are ploys of episodic instantiation in which the operations of the second system are actively caused by the input, as with Bishop’s R2.

  6. Hardcastle (1993) argues that while Olympia (with the Klara copies) may have experience φ, being a conscious system “requires more than simply exhibiting a subset of the possible phenomenal experiences” (¶25). Hardcastle’s position is puzzling, however, for a computational theory will first and foremost be a theory of conscious states. A conscious system plausibly just is a system with conscious states, and Hardcastle effectively admits that Maudlin has identified a problem for such a theory. That is the kind of theory with which I am concerned.

  7. Similarly, both Chalmers (1996) and Copeland (1996) observe that since a wild instantiation is a singular event, picked out post hoc, its computation-mimicking activity is underwritten only by material rather than counterfactual conditionals—thus we may say that it is not disposed to undergo the same activity again under similar conditions.

  8. At one point Klein hints that episodic implementations that produce experience will require some dispositions to manifest repeatedly. This suggests the idea that a wild instantiation of such an implementation would be impossible because wild instantiations are too unstable to manifest the same disposition repeatedly. However, I do not think that Klein actually adopts this view. It is not a plausible view. Even if we accept that experiences entail computations in which some dispositions manifest repeatedly, there is no reason to think that such computations cannot be mimicked by wild instantiations—only that such wild instantiations will be hugely unlikely.

  9. Another source of uncertainty is whether Klein thinks that wild instantiations possess the appropriate dispositions very fleetingly, or that they do not possess those dispositions at all. In the paper his emphasis on the importance of standing or stable dispositions implies the former, but in personal communication he asserts the latter.

  10. Indeed, in personal communication Klein says he prefers to say that they are not part of Olympia, and therefore that Olympia is not computing because the central unit on its own lacks a disposition that supports the relevant counterfactuals. (I will remark, in passing, that this seems not to fit with the episodic account of implementation. Surely Olympia, construed just as the central unit, episodically implements π(τ)? After all, the point of the account is that it jettisons the need for all the counterfactuals associated with π to be supported).

  11. Later in the footnote about Olympia, Klein seems to acknowledge the very distinction—between dispositions and activity—that I am claiming he wishes to elide. He says that if the Klara copies are part of Olympia, then since interfering with them interferes with Olympia’s dispositions, there can be no change in her computational status “without a change in either dispositional structure or actual activity” (p. 150n, my emphasis). However, this cannot be right, for it contradicts his assertion in the main text that computational activity supervenes on actual activity. In personal communication Klein confirms the error: the footnote text should read ‘both dispositional structure and actual activity’.

  12. While my own notion of physical activity entails that it is multiply realizable, it is not functionalist or computationalist. Rather, I have in mind an intrinsic characterization of activity (cf. the intrinsic structural properties suggested by Pereboom 2002). For example, the physical activity of ‘flexing’ would be realizable by rubber hoses, metal paper clips, tree branches, and many other things—but it would nevertheless be characterized intrinsically, in reference to certain structural properties that all of these items possess at the time. Of course, I do not claim to know how certain kinds of physical activity would be able to produce conscious experience. But that can hardly be held against the idea, since no one knows how any particular non-mental property produces experience.

  13. Antony’s argument is similar to Maudlin’s and Bishop’s, but targets functionalism rather than computationalism. As such, it faces more difficulty, for activity plays a less prominent role in functionalism than in computationalism. I believe that Antony’s argument begs the question against the functionalist. In another paper (Bartlett, in progress), I attempt to revise his argument in a way that eliminates its dependence on the activity thesis.

  14. Someone might argue that all the neurons in the brain are actively involved in producing one’s experience all the time because even activity that does not rise to the level of an action potential still contributes to experience in some way. Perhaps this is true. But even if it is, it doesn’t help CTE. CTE claims that no activity at all is needed in order for a neuron to play a part in supporting experience—for according to CTE, experience is (at least partly) a relational phenomenon. All it takes is for a neuron to stand in a certain abstractly-defined relation to some neurons that are active. So CTE can’t look for help from the ‘pan-activist’ view just adumbrated.

  15. Thanks to Colin Klein for drawing my attention to the Wada test.

References

  • Antony, M. V. (1994). Against functionalist theories of consciousness. Mind and Language, 9, 105–123.

  • Barnes, E. (1991). The causal history of computational activity: Maudlin and Olympia. The Journal of Philosophy, 88, 304–316.

  • Bartlett, G. (in progress). A neglected argument against functionalist theories of experience.

  • Bishop, M. (2002a). Dancing with pixies: Strong artificial intelligence and panpsychism. In J. Preston & M. Bishop (Eds.), Views into the Chinese room: New essays on Searle and artificial intelligence (pp. 360–378). Oxford: Clarendon Press.

  • Bishop, M. (2002b). Counterfactuals cannot count: A rejoinder to David Chalmers. Consciousness and Cognition, 11, 642–652.

  • Bishop, M. (2009). A cognitive computation fallacy? Cognition, computations and panpsychism. Cognitive Computation, 1, 221–233.

  • Block, N. (1978). Troubles with functionalism. In C. W. Savage (Ed.), Minnesota studies in the philosophy of science, vol. 9: Perception and cognition (pp. 261–325). Minneapolis: University of Minnesota Press.

  • Chalmers, D. J. (1995). Absent qualia, fading qualia, dancing qualia. In T. Metzinger (Ed.), Conscious experience (pp. 309–330). Schöningh: Imprint Academic.

  • Chalmers, D. J. (1996). Does a rock implement every finite-state automaton? Synthese, 108, 309–333.

  • Chalmers, D. J. (n.d.). Responses to articles on my work: Mark Bishop. Retrieved June 11, 2008, from http://consc.net/responses.html#bishop.

  • Chrisley, R. L. (1994). Why everything doesn’t realise every computation. Minds and Machines, 4, 403–420.

  • Chrisley, R. L. (2006). Counterfactual computational vehicles of consciousness. Paper presented at ‘Toward a Science of Consciousness 2006’, Tucson, April 7th. Powerpoint retrieved June 11, 2009, from http://e-asterisk.blogspot.com/2006/05/counterfactual-computational-vehicles.html.

  • Copeland, B. J. (1996). What is computation? Synthese, 108, 335–359.

  • Dretske, F. I. (1981). Knowledge and the flow of information. Cambridge, MA: MIT Press.

  • Hardcastle, V. G. (1993). Conscious computations. The Electronic Journal of Analytic Philosophy, 1. Retrieved July 23, 2008, from http://ejap.louisiana.edu/EJAP/1993.august/hardcastle.html.

  • Klein, C. (2008). Dispositional implementation solves the superfluous structure problem. Synthese, 165, 141–153.

  • Lycan, W. L. (1987). Consciousness. Cambridge, MA: The MIT Press.

  • Maudlin, T. (1989). Computation and consciousness. The Journal of Philosophy, 86, 407–432.

  • Pereboom, D. (2002). Robust nonreductive materialism. The Journal of Philosophy, 99, 499–531.

  • Piccinini, G. (2008). Computers. Pacific Philosophical Quarterly, 89, 32–73.

  • Putnam, H. (1988). Representation and reality. Cambridge, MA: The MIT Press.

  • Pylyshyn, Z. W. (1984). Computation and cognition: Toward a foundation for cognitive science. Cambridge, MA: MIT Press.

  • Searle, J. (1980). Minds, brains, and programs. Behavioral and Brain Sciences, 3, 417–424.

  • Searle, J. (1992). The rediscovery of the mind. Cambridge, MA: The MIT Press.


Acknowledgments

Thanks to Tim Maudlin, Colin Klein, and an anonymous referee for very helpful comments.

Author information


Correspondence to Gary Bartlett.


Cite this article

Bartlett, G. Computational Theories of Conscious Experience: Between a Rock and a Hard Place. Erkenn 76, 195–209 (2012). https://doi.org/10.1007/s10670-011-9325-8
