AJOB Neuroscience
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/uabn20

Consciousness Is More Complicated Than That: Theoretical Limitations of Interactive Capacity

Michał Klincewicz, Berlin School of Mind and Brain, Humboldt-Universität zu Berlin
Lily Frank, Icahn School of Medicine at Mount Sinai and The Graduate Center, City University of New York

Published online: 11 Sep 2013.

To cite this article: Michał Klincewicz & Lily Frank (2013) Consciousness Is More Complicated Than That: Theoretical Limitations of Interactive Capacity, AJOB Neuroscience, 4:4, 38–39.
To link to this article: http://dx.doi.org/10.1080/21507740.2013.827758
Currently, the minimally conscious state (MCS) is diagnosed based on a set of behavioral criteria, which often fail to distinguish the MCS from the vegetative state (VS). As a remedy for the high rate of misdiagnosis, David Fischer and Robert Truog (2013) recommend using interactive capacity (IC), which is "the ability to receive communicated information . . . and the intentional generation of a coherent response" (30), as diagnostic of MCS.

CONSCIOUSNESS OF ENVIRONMENT AND CONSCIOUSNESS OF SELF

The difference between VS and MCS is consciousness, which the authors define as "the awareness of one's self and environment" (26). But it is controversial whether these two senses of consciousness (awareness of one's self and awareness of one's environment) can be collapsed into one (Block 1995; Rosenthal 2004). The problem is compounded by the ambiguity of the concept of self-consciousness itself. Self-consciousness can refer, for example, to embarrassment, awareness of one's identity, or the ability to introspect. The sense of self-consciousness relevant to the present discussion is typically characterized as conscious experience. Conscious experiences are those of which we are in some way aware, while unconscious experiences are those of which we are not aware in any way.
One may resist this distinction and insist that there is no such thing as an experience of the environment without awareness of that experience. However, the evidence of experience without awareness is compelling (for review see Dehaene and Changeux 2011).

In light of the distinction between these two types of consciousness, the authors' proposal should be understood to be about conscious experience, that is, consciousness of oneself in the relevant sense. Consciousness of the environment absent conscious experience may not carry the same kind of moral significance relevant to management of care and end-of-life decisions. Given this, for the remainder of this commentary, we assume that what the authors propose is that IC is a sufficient condition for conscious experience. Thus clarified, the positive proposal of the article becomes suspect.

Address correspondence to Michał Klincewicz, Humboldt-Universität zu Berlin, Berlin School of Mind and Brain, Luisenstrasse 56, Haus 1, Room 317, 10117 Berlin, Germany. E-mail: michal.klincewicz@gmail.com

NEUROSCIENTIFIC ADVANCES AND THE NEED FOR INTERACTIVE CAPACITY

The evidence for IC as a sufficient condition for conscious experience comes from a number of recent neuroimaging studies of patients diagnosed as in VS (for review see Owen 2013). For example, one patient diagnosed to be in VS by behavioral criteria was shown nonetheless to intentionally generate objectively measurable neural responses indicating that the patient understood the experimenter's instructions to imagine playing tennis (Owen et al. 2006). This kind of evidence is very compelling and suggests that advances in brain imaging technology can help better distinguish patients in VS from patients in MCS. But it is not obvious how it can help establish IC as a diagnostic threshold of conscious experience. According to the authors, patients who are in MCS1, MCS2, and MCS3 are all conscious in the relevant sense.
In MCS1, consciousness is undetectable using either behavioral measures or current technology, which means that these patients are "diagnostically indistinguishable" from patients in VS. According to the authors, in order to pick out these patients and delineate them from VS patients, clinicians may be tempted to use more sensitive technological methods to detect consciousness. As the authors note, this approach is doubly problematic. On the one hand, there is a risk of mistaking neural activity indicative of consciousness of the environment, which can occur without conscious experience, for signs of conscious experience, thus overdiagnosing patients in VS as minimally conscious. On the other hand, this approach falls short of definitively indicating conscious experience, since even more advanced and sensitive technology may miss the relevant neural activity. The authors' proposal aims to avoid these problems.

LIMITATIONS OF INTERACTIVE CAPACITY

Unfortunately, the two reasons given for adopting IC as a sufficient condition for conscious experience are not convincing. The first reason given is that IC is consistent with the traditionally used behavioral criteria and would thereby be easily integrated with current practice, thus facilitating the transition for clinicians and the public. The high rate of misdiagnosis under the behavioral criteria makes it puzzling why being consistent with them would be a good thing. The authors acknowledge the high rate of misdiagnosis and seem to think that their proposal can help reduce this percentage. In light of all this, the first offered reason for accepting IC as an indirect marker of consciousness of oneself is somewhat self-defeating and not very convincing.
The second reason that the authors give is that there is a "general agreement that interactive capacity . . . is a reliable indicator of consciousness" (30). This claim is doubly misleading. First, reliable indicators are not the same as sufficient conditions. Second, there is a good deal of disagreement about what reliably indicates conscious experience.

Something is a reliable indicator of something else only if it is highly correlated with it. Smoke, for example, is a reliable indicator of fire, but it is neither a sufficient nor a necessary condition for it. There are fires that produce no smoke and ways to make smoke without fire. Perhaps this is why smoke is not used as a diagnostic of fire. What the authors seem to have in mind is the stronger claim that IC is indeed sufficient for conscious experience, not merely a reliable indicator of it. Treating IC as a diagnostic threshold of conscious experience would require as much.

But let us assume that what the authors have in mind is the weaker claim, that IC is a reliable indicator and not a sufficient condition. Can they now better support their proposal? Arguably, no, because contrary to what the authors suggest, there seems to be little agreement in the empirical or philosophical literature on what reliably correlates with conscious experience. The only generally agreed-upon reliable indicators of conscious experience seem to be subjective verbal reports (Seth et al. 2008). Given the state of the field, even if IC can serve as a reliable indicator of conscious experience, other measures could do just as well. Indeed, different neurobiological accounts of consciousness would recommend different neural correlates. Among these we could count activity in the prefrontal cortex (Lau and Passingham 2006), recurrent activity (Lamme and Roelfsema 2000), global accessibility (Baars 2005), or specific kinds of neuronal oscillations (Crick and Koch 1990).
What the authors need to support their recommendation of IC as a diagnostic marker of conscious experience is a theory-independent reason to believe that "the ability to receive communicated information . . . and the intentional generation of a coherent response" is as good as or better than any of the other indicators of consciousness. Without such a reason, IC should not be adopted as a diagnostic threshold of self-consciousness over any of these other options.

In addition, adopting IC as the marker of consciousness does not address the problem of how to treat patients who are in the MCS1 category, that is, those who "retain an interactive capacity that is undetectable by current technologies or entirely lack the capacity to interact" (30). This puts in doubt the entire enterprise of the article. If IC is undetectable under certain conditions, what reason do we have to adopt it as a diagnostic threshold that would weigh in on end-of-life decisions? To play this role, it seems, IC would have to be a necessary, and not merely a sufficient, condition for conscious experience. And it would certainly have to be more than a reliable indicator.

INTERACTIVE CAPACITY AS ONE OF THE RELIABLE INDICATORS OF MCS?

The arguments presented by Fischer and Truog for adopting IC as a diagnostic of MCS are too weak to be convincing. However, the empirical results that they cite in support of their view are compelling and suggest what might be a better way of distinguishing patients in VS and MCS. The neural correlates of IC and related capacities would certainly be useful as part of a pluralistic approach that takes into account a variety of signatures of conscious experience discussed in the neuroscience literature. The techniques used in neuroscience can provide a suite of reliable indicators of conscious experience that, in sum, can serve as a diagnostic threshold of MCS.

REFERENCES

Baars, B. J. 2005. Global workspace theory of consciousness: Toward a cognitive neuroscience of human experience. Progress in Brain Research 150: 45–53.

Block, N. 1995. On a confusion about a function of consciousness. Behavioral and Brain Sciences 18(2): 227–287.

Crick, F., and C. Koch. 1990. Towards a neurobiological theory of consciousness. Seminars in the Neurosciences 2: 263–275.

Dehaene, S., and J.-P. Changeux. 2011. Experimental and theoretical approaches to conscious processing. Neuron 70(2): 200–227.

Fischer, D. B., and R. D. Truog. 2013. Conscientious of the conscious: Interactive capacity as a threshold marker for consciousness. AJOB Neuroscience 4(4): 26–33.

Lamme, V. A. F., and P. R. Roelfsema. 2000. The distinct modes of vision offered by feedforward and recurrent processing. Trends in Neurosciences 23(11): 571–579.

Lau, H. C., and R. E. Passingham. 2006. Relative blindsight in normal observers and the neural correlate of visual consciousness. Proceedings of the National Academy of Sciences USA 103(49): 18763–18768.

Owen, A. M. 2013. Detecting consciousness: A unique role for neuroimaging. Annual Review of Psychology 64: 109–133.

Owen, A. M., M. R. Coleman, M. Boly, M. H. Davis, S. Laureys, and J. D. Pickard. 2006. Detecting awareness in the vegetative state. Science 313(5792): 1402.

Rosenthal, D. M. 2004. Varieties of higher-order theory. Advances in Consciousness Research 56: 17–44.

Seth, A. K., Z. Dienes, A. Cleeremans, M. Overgaard, and L. Pessoa. 2008. Measuring consciousness: Relating behavioural and neurophysiological approaches. Trends in Cognitive Sciences 12(8): 314–321.