Transient spatiotopic integration across saccadic eye movements mediates visual stability, J Neurophysiol, 4 (109), 1117-1125.

Eye movements pose major problems to the visual system, because each new saccade changes the mapping of external objects on the retina. It is known that stimuli briefly presented around the time of saccades are systematically mislocalized, whereas continuously visible objects are perceived as spatially stable even when they undergo large transsaccadic displacements. In this study we investigated the relationship between these two phenomena and measured how human subjects perceive the position of pairs of bars briefly displayed around the time of large horizontal saccades. We show that they interact strongly, with the perisaccadic bar being drawn toward the other, dramatically altering the pattern of perisaccadic mislocalization. The interaction field extends over a wide range (200 ms and 20 degrees) and is oriented along the retinotopic trajectory of the saccade-induced motion, suggesting a mechanism that integrates pre- and postsaccadic stimuli at different retinal locations but similar external positions. We show how transient changes in spatial integration mechanisms, which are consistent with the present psychophysical results and with the properties of “remapping cells” reported in the literature, can create transient craniotopy by merging the distinct retinal images of the pre- and postsaccadic fixations to signal a single stable object.

Attention to Bright Surfaces Enhances the Pupillary Light Reflex, Journal of Neuroscience, 5 (33), 2199-2204.

One longstanding question is how early in the visual system attention exerts its influence. Here we show that an effect of attention can be measured at the earliest possible stage of visual information processing, as a change in the optics of the eye. We tested human subjects and found that covertly attending to bright surfaces results in an enhanced pupillary light reflex (PLR): the pupillary constriction that occurs in response to light increments. The PLR optimizes the optical quality of the retinal image across illumination conditions, increasing sensitivity by modulating retinal illumination, and improving acuity by reducing spherical aberrations. The attentional modulation of the PLR that we describe constitutes a new mechanism through which vision is affected by attention; we discuss three alternatives for the neural substrates of this effect, including the possibility that attention might act indirectly, via its well-established effects in early visual cortex.
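
For illustration only, the PLR measure described above can be quantified from a pupil-diameter trace as the baseline-corrected constriction following a light increment. The Python sketch below uses an assumed sampling rate and a simulated trace; it is not the authors' analysis code.

```python
# Minimal sketch of PLR amplitude estimation (assumed data, not the study's code).
import numpy as np

def plr_amplitude(pupil_mm, fs, stim_onset_s, baseline_s=0.5, response_s=1.5):
    """Baseline-corrected constriction amplitude (mm) after a light increment."""
    onset = int(stim_onset_s * fs)
    baseline = pupil_mm[onset - int(baseline_s * fs):onset].mean()
    response = pupil_mm[onset:onset + int(response_s * fs)]
    return baseline - response.min()  # positive value = constriction

fs = 60.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 3, 1 / fs)
# Simulated trace: 4 mm baseline with a ~0.3 mm constriction after a flash at t = 1 s.
trace = 4.0 - 0.3 * np.exp(-((t - 1.6) ** 2) / 0.1) * (t > 1.0)
print(round(plr_amplitude(trace, fs, stim_onset_s=1.0), 2))   # ~0.3 mm
```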

Pupil constrictions to photographs of the sun, Journal of Vision, 6 (13).

The pupil constricts in response to light increments and dilates with light decrements. Here we show that a picture of the sun, introducing a small overall decrease in light level across the field of view, results in a pupillary constriction. Thus, the pictorial representation of a high-luminance object (the sun) can override the normal pupillary dilation elicited by a light decrement. In a series of experiments that control for a variety of factors known to modulate pupil size, we show that the effect (a) does not depend on the retinal position of the images and (b) is modulated by attention. It has long been known that cognitive factors can affect pupil diameter by producing pupillary dilations. Our results indicate that high-level visual analysis (beyond the simple subcortical system mediating the pupillary response to light) can also induce pupillary constriction, with an effect size of about 0.1 mm.

Touch Interacts with Vision during Binocular Rivalry with a Tight Orientation Tuning, PLoS One, 3 (8), e58754.

Multisensory integration is a common feature of the mammalian brain that allows it to deal more efficiently with the ambiguity of sensory input by combining complementary signals from several sensory sources. Growing evidence suggests that multisensory interactions can occur as early as primary sensory cortices. Here we present incompatible visual signals (orthogonal gratings) to each eye to create visual competition between monocular inputs in primary visual cortex where binocular combination would normally take place. The incompatibility prevents binocular fusion and triggers an ambiguous perceptual response in which the two images are perceived one at a time in an irregular alternation. One key function of multisensory integration is to minimize perceptual ambiguity by exploiting cross-sensory congruence. We show that a haptic signal matching one of the visual alternatives helps disambiguate visual perception during binocular rivalry, both by prolonging the dominance period of the congruent visual stimulus and by shortening its suppression period. Importantly, this interaction is strictly tuned for orientation, with a mismatch as small as 7.5 degrees between visual and haptic orientations sufficient to annul the interaction. These results support two important conclusions: first, that vision and touch interact at early levels of visual processing where interocular conflicts are first detected and orientation tunings are narrow, and second, that haptic input can influence visual signals outside of visual awareness, bringing a stimulus made invisible by binocular rivalry suppression back to awareness sooner than would occur without congruent haptic input.
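
The dominance and suppression measures used above can be illustrated with a short sketch that computes mean phase durations from a stream of perceptual reports. The switch times, labels, and data format below are assumptions for illustration, not the study's data or code.

```python
# Hedged sketch: mean dominance durations from rivalry reports (assumed data format).
import numpy as np

def mean_phase_durations(switch_times_s, percepts):
    """percepts[i] is the label reported from switch i until switch i + 1;
    the final label has no completed phase and is ignored."""
    times = np.asarray(switch_times_s, dtype=float)
    durations = np.diff(times)               # length of each completed phase
    labels = np.asarray(percepts)[:-1]       # percept held during each phase
    return {str(lab): float(durations[labels == lab].mean())
            for lab in np.unique(labels)}

switch_times = [0.0, 2.6, 4.1, 7.0, 8.3, 11.2]             # assumed switch times (s)
reports = ["congruent", "orthogonal", "congruent",
           "orthogonal", "congruent", "orthogonal"]        # assumed report labels
print(mean_phase_durations(switch_times, reports))
# ~{'congruent': 2.8, 'orthogonal': 1.4}: longer dominance for the congruent grating
```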

Long-term effects of monocular deprivation revealed with binocular rivalry gratings modulated in luminance and in color, J Vis, 6 (13).

During development, within a specific temporal window called the critical period, the mammalian visual cortex is highly plastic and literally shaped by visual experience; to what extent this extraordinary plasticity is retained in the adult brain is still a debated issue. We tested the residual plastic potential of the adult visual cortex for both achromatic and chromatic vision by measuring binocular rivalry in adult humans following 150 minutes of monocular patching. Paradoxically, monocular deprivation resulted in lengthening of the mean phase duration of both luminance-modulated and equiluminant stimuli for the deprived eye and complementary shortening of nondeprived phase durations, suggesting an initial homeostatic compensation for the lack of information following monocular deprivation. When equiluminant gratings were tested, the effect was measurable for at least 180 minutes after reexposure to binocular vision, compared with 90 minutes for achromatic gratings. Our results suggest that chromatic vision shows a high degree of plasticity, retaining the effect for a duration (180 minutes) longer than that of the deprivation period (150 minutes) and twice as long as that found with achromatic gratings. The results are in line with evidence showing a higher vulnerability of the P pathway to the effects of visual deprivation during development and a slower development of chromatic vision in humans.

Early interaction between vision and touch during binocular rivalry, Multisens Res, 3 (26), 291-306.

Multisensory integration is known to occur at high neural levels, but there is also growing evidence that cross-modal signals can be integrated at the first stages of sensory processing. We investigated whether touch specifically affected vision during binocular rivalry, a particular type of visual bistability that engages neural competition in early visual cortices. We found that tactile signals interact with visual signals outside of awareness, when the visual stimulus congruent with the tactile one is perceptually suppressed during binocular rivalry, and that the interaction is strictly tuned for matched visuo-tactile spatial frequencies. We also found that voluntary action does not play a leading role in mediating the effect, since the interaction was also observed when tactile stimulation was passively delivered to the finger. However, simultaneous presentation of visual and tactile stimuli is necessary to elicit the interaction, and an asynchronous priming touch stimulus does not affect the onset of rivalry. These results point to a very early cross-modal interaction site, probably V1. By showing that spatial proximity between visual and tactile stimuli is a necessary condition for the interaction, we also suggest that the two sensory spatial maps are aligned according to retinotopic coordinates, corroborating the hypothesis of a very early interaction between visual and tactile signals during binocular rivalry.

Touch influences visual perception with a tight orientation-tuning, PLoS One, 11 (8), e79558.

Stimuli from different sensory modalities are thought to be processed initially in distinct unisensory brain areas prior to convergence in multisensory areas. However, signals in one modality can influence the processing of signals from other modalities, and recent studies suggest this cross-modal influence may occur early on, even in ‘unisensory’ areas. Some recent psychophysical studies have shown specific cross-modal effects between touch and vision during binocular rivalry, but these cannot completely rule out a response bias. To test for genuine cross-modal integration of haptic and visual signals, we investigated whether congruent haptic input could influence visual contrast sensitivity relative to incongruent haptic input, in three psychophysical experiments using a two-interval, two-alternative forced-choice method to eliminate response bias. The initial experiment demonstrated that contrast thresholds for a visual grating were lower when exploring a haptic grating that shared the same orientation compared to an orthogonal orientation. Two subsequent experiments mapped the orientation and spatial frequency tunings of the congruent haptic facilitation of vision, finding a clear orientation tuning but no spatial frequency tuning. In addition to an increased contrast sensitivity for iso-oriented visual-haptic gratings, we found a significant loss of sensitivity for orthogonally oriented visual-haptic gratings. We conclude that the tactile influence on vision is a result of a tactile input to orientation-tuned visual areas.
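
As a rough illustration of the two-interval 2AFC threshold measure, the sketch below fits a Weibull psychometric function with a 50% guess rate to assumed proportion-correct data and reads off the contrast giving 75% correct; it is not the study's analysis pipeline.

```python
# Illustrative 2AFC threshold fit (assumed contrasts and proportions correct).
import numpy as np
from scipy.optimize import curve_fit

def weibull_2afc(c, alpha, beta):
    """Proportion correct in 2AFC: 0.5 guess rate, approaching 1 at high contrast."""
    return 0.5 + 0.5 * (1.0 - np.exp(-(c / alpha) ** beta))

contrasts = np.array([0.005, 0.01, 0.02, 0.04, 0.08])    # assumed test contrasts
p_correct = np.array([0.52, 0.61, 0.78, 0.93, 0.99])     # assumed observed data

(alpha, beta), _ = curve_fit(weibull_2afc, contrasts, p_correct,
                             p0=[0.02, 2.0], bounds=([1e-4, 0.5], [1.0, 10.0]))
threshold_75 = alpha * np.log(2.0) ** (1.0 / beta)       # contrast giving 75% correct
print(f"75%-correct contrast threshold: {threshold_75:.3f}")
```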

What’s “up”? Working memory contents can bias orientation processing, Vision Res, (76), 46-55.

We explored the interaction between the processing of a low-level visual feature such as orientation and the contents of working memory (WM). In a first experiment, participants memorized the orientation of a Gabor patch and performed two subsequent orientation discriminations during the retention interval. The WM stimulus exerted a consistent repulsive effect on the discrimination judgments: participants were more likely to report that the discrimination stimulus was rotated clockwise relative to the oblique after being presented with a stimulus tilted anti-clockwise from the oblique. A control condition in which participants attended to the Gabor patch but did not memorize it showed a much reduced effect. The repulsive effect was stable across the two discriminations in the memory condition, but not in the control condition, where it decayed at the second discrimination. In a second experiment, we showed that the greater interference observed in the WM condition cannot be explained by a difference in cognitive demands between the WM and the control condition. We conclude that WM contents can bias perception: the interference is visual in nature, can last over delays of several seconds, and is not disrupted by the processing of intervening visual stimuli during the retention period.
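
One common way to quantify the repulsive bias reported above is as a shift in the point of subjective equality (PSE) of the orientation-discrimination psychometric function. The sketch below fits a cumulative Gaussian to assumed 'clockwise' response proportions; the numbers are illustrative, not the study's data.

```python
# Illustrative PSE-shift estimate for the repulsion effect (assumed data).
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def p_clockwise(tilt_deg, pse, sigma):
    """Probability of a 'clockwise of oblique' response for a given tilt."""
    return norm.cdf(tilt_deg, loc=pse, scale=sigma)

tilts = np.array([-6.0, -4.0, -2.0, 0.0, 2.0, 4.0, 6.0])      # deg from the oblique
p_cw = np.array([0.05, 0.12, 0.35, 0.62, 0.85, 0.95, 0.99])   # assumed proportions

(pse, sigma), _ = curve_fit(p_clockwise, tilts, p_cw,
                            p0=[0.0, 2.0], bounds=([-10.0, 0.1], [10.0, 10.0]))
print(f"PSE = {pse:.2f} deg")
# A negative PSE means 'clockwise' is reported more often at the physical oblique,
# i.e. perception is repelled away from the memorized anti-clockwise orientation.
```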