Dear Jacques,
No, I cannot be confusing functionalism with
behaviourism. (I am a bit surprised how unfamiliar a professional philosophy
forum is with functionalism and its problems!) My definition of functionalism was
a readily available one written by someone sympathetic to functionalism.
Behaviourism says that if you know
enough neurophysiology you can give a complete description of the workings of
Jim's mind without needing to know anything about what Jim feels: 'mental content' is irrelevant or even 'non-existent'. Behaviousrism is probably valid, but impotent because without the heuristic clues we get from assuming that Jim
feels something like the way we do, the job is too hard. The problem is that if
we misinterpret these heuristic clues, as I think functionalism does, we end up
with a worse stalemate.
Your account of Bob from 2.00 to 2.01 is fairly close to
functionalism, although functionalism states that Bob's mental content is
determined by the role in the world of the entire causal sequence from Q to A.
It also allows for several mental states between 2.00 and 2.01, the function in
the world of some of which might be simply to send certain ideas to memory. However, the
absurdity that you instantly recognized comes out the same if you spell out
exactly what functionalism has to entail. As long as you do not spell it out, it
sounds reasonable; but if you do, down to the last physical interaction
(literally, the output), it crashes. You seem to agree that functionalism is impossible!
The alternative position you suggest we call
functionalism must be the right one, but it cannot be functionalism. 'Function'
can have three meanings: internal action, external effect or purpose. Forget
purpose and Dennett's intentional stance. If we disallow output we disallow
effect, so we are left with internal action. However, functionalism, as I
understand it, was deliberately designed to contrast with a 'reductionism' that relates
experience to specific physical action. It holds that function is multiply
realizable: the internal states can be anything. All that matters is that they
have the right role in the world. (I think this dates back to Putnam's ideas of
meaning being external; others may correct me.) As indicated above, even if we
think of 'role in a microcosm such as a brain', we have something defined by
effects external to the bit whose role is supposed to fix its mental state. Spell
out the details and you get an infinite regress.
Information associated with a physical event is local to
the event. It does not 'carry over' in a chain in such a way that you could add
all the little actions together to be the content of consciousness. (Whatever you got would be causally/computationally meaningless.) 'Functions'
of chains of events have to be considered not as actions but in terms of their causal significance for
distant events - which is effect, and implies output. You cannot eat your cake
and have it on this one as far as I can see. The various meanings of 'function'
are constantly conflated in all branches of science and I suspect functionalism
is the prime example.