Our perception of where touch occurs on our skin shapes our interactions with the world. Most accounts of cutaneous localisation emphasise spatial transformations from a skin-based reference frame into body-centred and external egocentric coordinates. We investigated another possible method of tactile localisation based on an intrinsic perception of ‘skin space’. The arrangement of cutaneous receptive fields (RFs) could allow one to track a stimulus as it moves across the skin, similarly to the way animals navigate using path integration. We applied curved tactile motions to the hands of human volunteers. Participants identified the location midway between the start and end points of each motion path. Their bisection judgements were systematically biased towards the integrated motion path, consistent with the characteristic inward error that occurs in navigation by path integration. We thus showed that integration of continuous sensory inputs across several tactile RFs provides an intrinsic mechanism for spatial perception.
It remains controversial whether touch is a truly spatial sense or not. Many philosophers suggest that, if touch is indeed spatial, it is only through its alliances with exploratory movement and with proprioception. Here we develop the notion that a minimal yet important form of spatial perception may occur in purely passive touch. We do this by showing that the array of tactile receptive fields in the skin, appropriately relayed to the cortex, may contain the same basic informational building blocks that a creature navigating around its environment uses to build up a perception of space. We illustrate this point with preliminary evidence that perception of spatiotemporal patterns on the human skin shows some of the same features as spatial navigation in animals. We argue (a) that the receptor array defines a ‘tactile field’, (b) that this field exists in a minimal form in ‘skin space’, logically prior to any transformation into bodily or external spatial coordinates, and (c) that this field supports tactile perception without integration of concurrent proprioceptive or motor information. The basic cognitive elements of space perception may begin at lower levels of neural and perceptual organisation than previously thought.
The Bayesian brain hypothesis, as formalized by the free-energy principle, is ascendant in cognitive science. But how does the Bayesian brain obtain prior beliefs? Veissière and colleagues argue that sociocultural interaction is one important source. We offer a complementary model in which “interoceptive self-inference” guides the estimation of expected uncertainty both in ourselves and in our social conspecifics.
In a recent article in Philosophy of Science, De Vignemont and Jacob defend the view that empathy involves interpersonal similarity between an empathizer and a target person with respect to internal affective states. Focusing on empathy for pain, they propose a theory of the neural substrate of pain empathy. We point out several flaws in their interpretation of the data and argue that currently available data do not differentiate between De Vignemont and Jacob’s model and alternative models. Finally, we offer some suggestions about how this might be achieved in future research.