From PhilPapers forum Philosophy of Mind:

2016-11-13
RoboMary in free fall
Hi Jo, 
Could you help me out, then, and point out the ambiguity you are finding in the relational categories I mentioned, perhaps starting with the first? (Sorry, I thought it was clear, but at least we are making progress.)

Category 1 (Contextual Relation): The conscious experience relates to what the underlying processing (which could be a dynamic interaction of forces) represents, given the context. Using the example in the original post, the robot would consciously experience red in the first room, blue in the second, and a switch between red and blue in the third, because that is what the processing of the 255.0.0 signal represented in each of those contexts: the same signal in all three cases, and the same processing, but a different representation depending on the context.
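
To make the category concrete, here is a toy sketch (in Python, purely illustrative; the room-to-representation mapping is just my assumption for the example, not a claim about how the Mark 19 is actually built):

    SIGNAL = (255, 0, 0)  # the same 255.0.0 signal arrives in every room

    # What the signal represents is fixed by the context (the room),
    # not by the signal or the processing themselves.
    CONTEXT_REPRESENTS = {
        "room_1": "red",
        "room_2": "blue",
        "room_3": "a switch between red and blue",
    }

    def process(signal, room):
        # Identical signal, identical processing in all three cases;
        # only the contextual mapping differs.
        return CONTEXT_REPRESENTS[room]

    for room in ("room_1", "room_2", "room_3"):
        print(room, "->", process(SIGNAL, room))
    # room_1 -> red
    # room_2 -> blue
    # room_3 -> a switch between red and blue

The point of the sketch is that nothing in the signal or the processing differs between the rooms; the difference in representation is carried entirely by the context.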
Regarding the robot passing the Turing Test, as I had written just above it (http://philpapers.org/post/23426):

You wrote earlier in http://philpapers.org/post/23194:

"The problem with dynamic units within a robot experiencing like those in us is not really one of analogue and digital. That is a distinction that only really applies to our devices. The more relevant issue is the number of degrees of freedom. Someone with red-green colour blindness has one less degree of freedom to their experience than the rest of us. That is something we can show to fit the empirical evidence best. Whether red-green colour blind people see red as red or brown is I think an unknowable."
Would passing the Turing Test not indicate that the robot had the requisite number of degrees of freedom? I thought it would, which is why I had written:

So imagine the processing in the Mark 19 used logic gates, and could pass the Turing Test, and make distinctions at least as good as a human (which would presumably require it to have at least the same number of degrees of freedom). Would you be considering it possible that it could have conscious experiences of trees and cars etc., in a similar way to you?
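
To make concrete what I take a "degree of freedom" to be doing here, a rough sketch (again Python, again purely illustrative; the projection below is a toy assumption, not a model of real protanopia or deuteranopia): red-green colour blindness treated as a projection that collapses one axis of a three-dimensional colour space, so that signals a trichromat's processing keeps distinct land on the same point.

    def collapse_red_green(rgb):
        r, g, b = rgb
        rg = (r + g) / 2.0      # red and green are no longer distinguished
        return (rg, rg, b)      # only two independent values remain

    print(collapse_red_green((255, 0, 0)))  # red   -> (127.5, 127.5, 0)
    print(collapse_red_green((0, 255, 0)))  # green -> (127.5, 127.5, 0): the same point
    print(collapse_red_green((0, 0, 255)))  # blue  -> (0.0, 0.0, 255): still distinct

A system that could not make such discriminations would, on this picture, be missing a degree of freedom; that is why I took the Turing Test, with its demand for human-level distinction-making, to probe the number of degrees of freedom.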
But now you have stated in http://philpapers.org/post/23426 that:

... questions about whether some system will host somewhere inside it an experience with certain similarities to ours has nothing to do with its 'intelligence' or behavioural characteristics.   
So what does it have to do with, then? (I had thought that behavioural characteristics equated to the dynamic relation of the forces.)

Yours sincerely, 

Glenn