From PhilPapers forum Philosophy of Mind:

2009-06-14
The 'Explanatory Gap'
Reply to Arnold Trehub

Hi Arnold.  Your theory of consciousness, as far as I can tell, follows today's scientific paradigm of mind.  Your views seem to fall under Chalmers' category #1 above (post 2):
(1) There's no explanatory gap, or one that's fairly easily closable. 

Your theory assumes that the phenomenon of consciousness is an emergent phenomenon as you state here:

AT: ... separate brain cells (things) do not individually realize consciousness, rather it is the collective activity of such cells when they are organized into the right kind of brain mechanism (e.g., the retinoid mechanism) that realizes the property of consciousness.

Further, you may be supporting strong AI here:

AT: For example, transistors, capacitors, resistors, and inductors are just "things". But when they are organized into the right kind of mechanism they constitute a radio receiver, something that realizes a property that the separate components cannot realize.

By supporting mental causation, I believe one is also forced to support downward causation.  I’d be curious to see how you might maneuver around this.  Theories of how neurons and transistors function are built on the premise that such things change state because of local causal forces acting on them.  In the case of a transistor, an applied charge forces it to change state.  In the case of a neuron, the interactions at the synapse cause it to fire.  I’m sure that description can be elaborated, but let’s not argue details.  The point is that there can’t be a single neuron or transistor that is actually affected by anything but the local causal forces acting on it, and those local causal forces are sufficient to explain the behavior of the system.  And if we agree that this is true, then there can’t be any set of neurons or transistors that is affected by anything except the local causal forces acting on each individual neuron or transistor.
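The picture I have in mind can be put as a toy sketch (my own illustration, not anything from your retinoid model): a small network of logic gates in which each gate's next state is a function of its local inputs alone.  The "global" state is nothing but the tuple of local states, and it never appears on the right-hand side of any gate's update rule.

```python
# Toy illustration (my own sketch): a ring of NAND gates, each of which
# changes state only as a function of its local inputs.  The global state
# supervenes on the local states but never feeds back into the update rule.

def nand(a, b):
    # a single gate: output depends only on its two local inputs
    return 0 if (a and b) else 1

# wiring: gate i reads the outputs of gates wires[i] = (j, k)
wires = [(1, 2), (2, 0), (0, 1)]

def step(state):
    # each gate's next state depends ONLY on its local inputs;
    # no property of the whole tuple enters the computation
    return tuple(nand(state[j], state[k]) for (j, k) in wires)

state = (0, 0, 1)
for _ in range(4):
    state = step(state)

# The entire trajectory is fixed by the local rule alone, leaving no room
# for a system-level ("emergent") property to alter any individual gate.
```

Nothing deep is claimed for the example; it just makes vivid that once every element's behavior is pinned to local causes, the system-level description is exhausted by them.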

To suggest that the emergent phenomenon of consciousness somehow influences some portion of the brain, or makes the entire brain change state, requires one to theorize that these local causal forces are insufficient to describe what the system does.  In other words, the emergent phenomenon of consciousness must intervene and cause a neuron or transistor to do something that is not determined by those local causal forces.

Certainly, we know there are no such nonlocal causal forces acting on a transistor, for example.  None of the transistors in a computational system, regardless of its complexity, changes state except because of these local causal forces.  If nonlocal causal forces cannot act on any of the transistors, if downward causation is not allowed, then there is no room for this emergent phenomenon of consciousness to enter the causal chain, either for transistors or for neurons.  The cause of a neuron's firing is fully explained by examining the physical interactions, and these physical interactions are not influenced by emergent, phenomenal properties.

What is more interesting, however, is the claim that mental causation doesn’t enter the causal chain, but that it is nevertheless reliably reported.  I would argue that this is also incorrect, for exactly the same reason.  If the emergent phenomenon of consciousness can’t change any single transistor, and can’t force a neuron to fire, then those transistors and neurons are not able to report any phenomenal experience either!  They don’t fire because they are reporting an emergent phenomenon such as the experience of pain.  They fire because there are local causal forces acting on them.  There is no room in the physical world (at a classical level) for transistors or neurons to change state because the system they reside within has produced some allegedly emergent phenomenon.  Please note, this only applies to classical interactions, not quantum ones.  Quantum mechanical systems, such as molecules, are holistic and cannot be broken down using boundary conditions as we do with classical mechanical systems.  The paradigm I’m using here only works at a classical level.

How can one claim mental causation doesn’t exist, yet the emergent phenomenon is reporting something about its own existence?  The only rebuttal I see is that these two things are purely coincidental: the report of pain just happens to occur at the same time the emergent phenomenon is occurring.  The report of pain isn’t caused by the emergent phenomenon; it is produced by local causal forces acting on neurons.  The experience could be anything.  It could be an orgasm, or it could be nothing at all (p-zombie), but the report of pain and the accompanying behavior could not change.  Imagine for a moment that, instead of pain, we felt an orgasm when sticking our hand in a pot of boiling water.  The neurons change state not because we are in pain or having an orgasm; they change state because of local interactions, and these local interactions are utterly oblivious to, and independent of, any phenomenal experience that might emerge.  It is the local causal influences on neurons that report the pain and produce the aversion to it, not any allegedly emergent phenomenal experience reporting something about itself.  The overall physical state cannot intervene at the local level.

The claim that an experience is being reliably reported must address the same problem that mental causation must address: how is the emergent phenomenon able to intervene at the local, physical level?  The emergent property can’t influence anything physical as long as there are local causal actions on which all physical interactions can be pinned.  As it stands, I see no way for emergent mental causation to intervene in any classical mechanical system, so computationalism does not allow experience to be reliably reported.  What I would be interested in is how your theory (or any computationalist theory, for that matter) can allow for mental causation, and also how any such theory can allow for experience to be reliably reported (i.e., the report of an experience is a true report that depends on the overall emergent phenomenon).