Auditory and tactile signals combine to influence vision during binocular rivalry, J Neurosci, 34 (3), 784-792.

Resolution of perceptual ambiguity is one function of cross-modal interactions. Here we investigate whether auditory and tactile stimuli can influence binocular rivalry generated by interocular temporal conflict in human subjects. Using dichoptic visual stimuli modulating at different temporal frequencies, we added modulating sounds or vibrations congruent with one or the other visual temporal frequency. Auditory and tactile stimulation both interacted with binocular rivalry by promoting dominance of the congruent visual stimulus. This effect depended on the cross-modal modulation strength and was absent when modulation depth declined to 33%. However, when auditory and tactile stimuli that were too weak on their own to bias binocular rivalry were combined, their influence over vision was very strong, suggesting the auditory and tactile temporal signals combined to influence vision. Similarly, interleaving discrete pulses of auditory and tactile stimuli also promoted dominance of the visual stimulus congruent with the supramodal frequency. When auditory and tactile stimuli were presented at maximum strength, but in antiphase, they had no influence over vision for low temporal frequencies, a null effect again suggesting audio-tactile combination. We also found that the cross-modal interaction was frequency-sensitive at low temporal frequencies, when information about temporal phase alignment can be perceptually tracked. These results show that auditory and tactile temporal processing is functionally linked, suggesting a common neural substrate for the two sensory modalities and that at low temporal frequencies visual activity can be synchronized by a congruent cross-modal signal in a frequency-selective way, suggesting the existence of a supramodal temporal binding mechanism.
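
The congruent and antiphase manipulations described above can be illustrated with a small numerical sketch. The Python snippet below is illustrative only: the frequencies, modulation depths and sample rate are placeholders, not the paper's stimulus parameters. It shows how congruent auditory and tactile envelopes reinforce one another while antiphase envelopes cancel to a flat signal, consistent with the null effect reported at low temporal frequencies.

```python
import numpy as np

def envelope(freq_hz, t, depth=1.0, phase=0.0):
    """Sinusoidal amplitude envelope around a mean of 1.0 (depth in [0, 1])."""
    return 1.0 + depth * np.sin(2 * np.pi * freq_hz * t + phase)

# Placeholder parameters for illustration only
fs = 1000.0                      # sample rate (Hz)
t = np.arange(0.0, 2.0, 1 / fs)  # 2 s of stimulation
f_a, f_b = 0.7, 1.5              # temporal frequencies of the two rivalling stimuli (Hz)

eye_a = envelope(f_a, t)         # contrast envelope of one eye's stimulus
eye_b = envelope(f_b, t)         # contrast envelope of the other eye's stimulus

# Cross-modal envelopes congruent with the first visual frequency
audio    = envelope(f_a, t, depth=0.5)                  # weak auditory modulation
tactile  = envelope(f_a, t, depth=0.5)                  # weak tactile modulation
combined = (audio + tactile) / 2                        # congruent signals reinforce

tactile_anti  = envelope(f_a, t, depth=0.5, phase=np.pi)
combined_anti = (audio + tactile_anti) / 2              # antiphase signals cancel (flat)

print("congruent range:", combined.min(), combined.max())             # ~0.5 to ~1.5
print("antiphase range:", combined_anti.min(), combined_anti.max())   # ~1.0 to ~1.0
```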

Interaction between Eye Movements and Vision: Perception during Saccades. In J. S. Werner & L. M. Chalupa (Eds.), The New Visual Neurosciences (2nd ed., pp. 947-962): MIT Press.

Motor commands induce time compression for tactile stimuli, J Neurosci, 34 (27), 9164-9172.

Saccades cause compression of visual space around the saccadic target, and also a compression of time, both phenomena thought to be related to the problem of maintaining saccadic stability (Morrone et al., 2005; Burr and Morrone, 2011). Interestingly, similar phenomena occur at the time of hand movements, when tactile stimuli are systematically mislocalized in the direction of the movement (Dassonville, 1995; Watanabe et al., 2009). In this study, we measured whether hand movements also cause an alteration of the perceived timing of tactile signals. Human participants compared the temporal separation between two pairs of tactile taps while moving their right hand in response to an auditory cue. The first pair of tactile taps was presented at variable times with respect to movement with a fixed onset asynchrony of 150 ms. Two seconds after test presentation, when the hand was stationary, the second pair of taps was delivered with a variable temporal separation. Tactile stimuli could be delivered to either the right moving or left stationary hand. When the tactile stimuli were presented to the motor effector just before and during movement, their perceived temporal separation was reduced. The time compression was effector-specific, as perceived time was veridical for the left stationary hand. The results indicate that time intervals are compressed around the time of hand movements. As for vision, the mislocalizations of time and space for touch stimuli may be consequences of a mechanism attempting to achieve perceptual stability during tactile exploration of objects, suggesting common strategies within different sensorimotor systems.
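
The interval-comparison logic of this paradigm can be sketched numerically. The snippet below is a hypothetical simulation, not the study's analysis or data: the compression factor, discrimination noise and probe durations are invented placeholders; only the 150 ms test interval is taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

TEST_MS = 150.0        # fixed separation of the test pair (from the abstract)
COMPRESSION = 0.7      # hypothetical perceptual compression of the perimovement interval
NOISE_MS = 20.0        # hypothetical temporal discrimination noise (SD, ms)
N_TRIALS = 200         # simulated trials per probe separation

probe_ms = np.arange(90.0, 211.0, 15.0)   # variable separations of the comparison pair

def p_probe_longer(probe, perceived_test):
    """Proportion of simulated trials on which the probe is judged longer than the test."""
    noisy_probe = probe + rng.normal(0.0, NOISE_MS, N_TRIALS)
    noisy_test = perceived_test + rng.normal(0.0, NOISE_MS, N_TRIALS)
    return np.mean(noisy_probe > noisy_test)

props = np.array([p_probe_longer(p, COMPRESSION * TEST_MS) for p in probe_ms])

# Point of subjective equality: probe separation judged longer on 50% of trials
pse = np.interp(0.5, props, probe_ms)
print(f"PSE ~ {pse:.0f} ms for a physical test interval of {TEST_MS:.0f} ms")
```

With these placeholder values the estimated PSE falls near 105 ms, well below the physical 150 ms interval, which is the signature of the compression described in the abstract.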

Buildup of spatial information over time and across eye-movements, Behavioural Brain Research.

To interact rapidly and effectively with our environment, our brain needs access to a neural representation of the spatial layout of the external world. However, the construction of such a map poses major challenges, as the images on our retinae depend on where the eyes are looking, and shift each time we move our eyes, head and body to explore the world. Research from many laboratories including our own suggests that the visual system does compute spatial maps that are anchored to real-world coordinates. However, the construction of these maps takes time (up to 500 ms) and also attentional resources. We discuss research investigating how retinotopic reference frames are transformed into spatiotopic reference frames, and how this transformation takes time to complete. These results have implications for theories about visual space coordinates and particularly for the current debate about the existence of spatiotopic representations.

The visual component to saccadic compression, J Vis, 14 (12).

Visual objects presented around the time of saccadic eye movements are strongly mislocalized towards the saccadic target, a phenomenon known as “saccadic compression.” Here we show that perisaccadic compression is modulated by the presence of a visual saccadic target. When subjects saccaded to the center of the screen with no visible target, perisaccadic localization was more veridical than when tested with a target. Presenting a saccadic target sometime before saccade initiation was sufficient to induce mislocalization. When we systematically varied the onset of the saccade target, we found that it had to be presented around 100 ms before saccade execution to cause strong mislocalization: saccadic targets presented after this time caused progressively less mislocalization. When subjects made a saccade to screen center with a reference object placed at various positions, mislocalization was focused towards the position of the reference object. The results suggest that saccadic compression is a signature of a mechanism attempting to match objects seen before the saccade with those seen after.
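
A common way to quantify this kind of mislocalization is a compression index based on the gain of reported versus physical positions. The sketch below is purely illustrative: the positions are made-up placeholders, not data from the study, and a simple linear fit stands in for whatever analysis the authors used.

```python
import numpy as np

# Hypothetical mean reported positions (deg) for four physical bar positions in a
# perisaccadic time bin; placeholder numbers, not data from the study.
physical = np.array([-10.0, -3.3, 3.3, 10.0])   # true bar positions
reported = np.array([-2.0, 0.5, 2.5, 4.0])      # mean perceived positions

def compression_index(physical, reported):
    """1 = full compression onto a single point, 0 = veridical spread."""
    gain = np.polyfit(physical, reported, 1)[0]  # slope of reported vs physical position
    return 1.0 - gain

print(f"compression index = {compression_index(physical, reported):.2f}")
```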