Adaptation Affects Both High and Low (Subitized) Numbers Under Conditions of High Attentional Load, Seeing and Perceiving, (24), 141-150.

It has recently been reported that, like most sensory systems, numerosity is subject to adaptation. However, the effect seemed to be limited to numerosity estimation outside the subitizing range. In this study we show that low numbers, clearly in the subitizing range, are adaptable under conditions of high attentional load. These results support the idea that numerosity is detected by a perceptual mechanism that operates over the entire range of numbers, supplemented by an attention-based system for small numbers (subitizing).

Cross-Sensory Facilitation Reveals Neural Interactions between Visual and Tactile Motion in Humans, Front Psychol, (2), 55.

Many recent studies show that the human brain integrates information across the senses and that stimuli in one sensory modality can enhance perception in another. Here we study the processes that mediate cross-modal facilitation and summation between visual and tactile motion. We find that while summation produced a generic, non-specific improvement of thresholds, probably reflecting higher-order interaction of decision signals, facilitation revealed a strong, direction-specific interaction, which we believe reflects sensory interactions. We measured visual and tactile velocity discrimination thresholds over a wide range of base velocities and conditions. Thresholds for both visual and tactile stimuli showed the characteristic “dipper function,” with the minimum thresholds occurring at a given “pedestal speed.” When coherent visual and tactile stimuli were combined (summation condition), the thresholds for these multisensory stimuli also showed a “dipper function,” with the minimum thresholds occurring in a similar range to that for unisensory signals. However, the improvement of multisensory thresholds was weak and not directionally specific, and was well predicted by the maximum-likelihood estimation model (in agreement with previous research). A different technique (facilitation) did, however, reveal direction-specific enhancement. Adding a non-informative “pedestal” motion stimulus in one sensory modality (vision or touch) selectively lowered thresholds in the other, by the same amount as pedestals in the same modality. Facilitation did not occur for neutral stimuli such as sounds (which would also have reduced temporal uncertainty), nor for motion in the opposite direction, even in blocked trials where subjects knew that the motion was in the opposite direction, showing that the facilitation was not under subject control. Cross-sensory facilitation is strong evidence for functionally relevant cross-sensory integration at early levels of sensory processing.
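
For reference, the “maximum-likelihood estimation model” invoked here is the standard optimal cue-combination rule (sketched below under the usual independent-Gaussian assumptions; the exact model fitted in the paper is given in the full text), in which the variance of the combined visuo-tactile estimate is

\sigma^2_{VT} = \dfrac{\sigma^2_V \, \sigma^2_T}{\sigma^2_V + \sigma^2_T}

so that, for matched unisensory thresholds, the multisensory threshold can improve by at most a factor of \sqrt{2}, and the predicted gain carries no direction-specific signature, consistent with the weak, non-specific improvement observed in the summation condition.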

Motion psychophysics: 1985-2010, Vision Res.

This review traces progress in the field of visual motion research from 1985 through to 2010. While it is certainly not exhaustive, it attempts to cover most of the major achievements of that period and to speculate on where the field is heading.

Spatiotopic coding and remapping in humans, Philos Trans R Soc Lond B Biol Sci, 1564 (366), 504-515.

How our perceptual experience of the world remains stable and continuous in the face of constant rapid eye movements is still a mystery. This review discusses some recent progress towards understanding the neural and psychophysical processes that accompany these eye movements. We first report recent evidence from imaging studies in humans showing that many brain regions are tuned in spatiotopic coordinates, but only for items that are actively attended. We then describe a series of experiments measuring the spatial and temporal phenomena that occur around the time of saccades, and discuss how these could be related to visual stability. Finally, we introduce the concept of the spatio-temporal receptive field to describe the local spatiotopicity exhibited by many neurons when the eyes move.

Spatiotopic Coding of BOLD Signal in Human Visual Cortex Depends on Spatial Attention, PLoS One, 7 (6), e21661.

The neural substrate of the phenomenological experience of a stable visual world remains obscure. One possible mechanism would be to construct spatiotopic neural maps, in which the response is selective for the position of the stimulus in external space rather than for retinal eccentricity, but evidence for such maps has been inconsistent. Here we show with fMRI that when human subjects concurrently performed a demanding attentional task on stimuli displayed at the fovea, BOLD responses evoked by moving stimuli irrelevant to the task were tuned mostly in retinotopic coordinates. However, under more unconstrained conditions, in which subjects could easily attend to the motion stimuli, BOLD responses were tuned not in retinal but in external coordinates (spatiotopic selectivity) in many visual areas, including MT, MST, LO, and V6, in agreement with our previous fMRI study. These results indicate that spatial attention may play an important role in mediating spatiotopic selectivity.
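
For clarity, the retinotopic/spatiotopic distinction at issue can be stated in a simple standard form (a descriptive convention, not a formula from the paper): with retinal stimulus position x_{retina} and gaze direction x_{gaze}, a retinotopic response depends only on x_{retina}, whereas a spatiotopic response depends on the position of the stimulus in external space,

x_{external} = x_{retina} + x_{gaze},

so spatiotopic selectivity requires that the visual signal be combined with an eye-position signal.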

Vision and audition do not share attentional resources in sustained tasks, Front Psychol, (2), 56.

Our perceptual capacities are limited by attentional resources. One important question is whether these resources are allocated separately to each sense or shared between them. We addressed this issue by asking subjects to perform a double task, either within the same modality or across different modalities (vision and audition). The primary task was multiple object tracking (Pylyshyn and Storm, 1988), in which observers were required to track between 2 and 5 dots for 4 s. Concurrently, they were required to identify which of three gratings presented during the interval differed in contrast or, in the auditory version of the same task, which tone differed in frequency from the two reference tones. The results show that while the concurrent visual contrast discrimination reduced tracking ability by about 0.7 d’, the concurrent auditory task had virtually no effect. This confirms previous reports that vision and audition use separate attentional resources, consistent with fMRI findings of attentional effects as early as V1 and A1. The results have clear implications for the effective design of instrumentation and audio-visual communication devices.
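
For context, d’ is the standard sensitivity index of signal detection theory; for a yes/no task it is computed from the hit rate H and the false-alarm rate F as

d' = z(H) - z(F),

where z is the inverse of the standard cumulative normal distribution. A drop of about 0.7 d’ therefore represents a substantial loss of tracking sensitivity (how d’ was derived from the tracking responses is detailed in the paper).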

Visual perception: more than meets the eye, Curr Biol, 4 (21), R159-161.

A recent study shows that objects changing in colour, luminance, size or shape appear to stop changing when they move. These and other compelling illusions provide tantalizing clues about the mechanisms and limitations of object analysis.

Perceived duration of visual and tactile stimuli depends on perceived speed, Front Integr Neurosci, (5), 51.

It is known that the perceived duration of visual stimuli is strongly influenced by speed: faster-moving stimuli appear to last longer. To test whether this is a general property of sensory systems, we asked participants to reproduce the duration of visual, tactile, and visuo-tactile gratings moving at variable speeds (3.5–15 cm/s) for three different durations (400, 600, and 800 ms). For both modalities, the apparent duration of the stimulus increased strongly with stimulus speed, more so for tactile than for visual stimuli. In addition, visual stimuli were perceived to last approximately 200 ms longer than tactile stimuli. The apparent duration of visuo-tactile stimuli lay between the unimodal estimates, as the Bayesian account predicts, but the bimodal precision of the reproduction did not show the theoretical improvement. A cross-modal speed-matching task revealed that visual stimuli were perceived to move faster than tactile stimuli. To test whether the large difference in the perceived duration of visual and tactile stimuli resulted from the difference in their perceived speed, we repeated the time reproduction task with visual and tactile stimuli matched in apparent speed. This reduced, but did not completely eliminate, the difference in apparent duration. These results show that for both vision and touch, perceived duration depends on speed, pointing to common strategies of time perception.
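
For reference, the Bayesian (optimal-combination) account mentioned here predicts, under the usual assumption of independent Gaussian noise, that the bimodal duration estimate is a precision-weighted average of the unimodal estimates,

\hat{T}_{VT} = w_V \hat{T}_V + w_T \hat{T}_T, \qquad w_i = \dfrac{1/\sigma_i^2}{1/\sigma_V^2 + 1/\sigma_T^2},

which places it between the two unimodal estimates, and that the bimodal reproduction variance should fall below either unimodal variance; it is this second prediction (the precision improvement) that the data did not show. This is a sketch of the standard model, not the specific fit reported in the paper.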

Spatiotopic Visual Maps Revealed by Saccadic Adaptation in Humans, Curr Biol, 16 (21), 1380-1384.

Saccadic adaptation, in which saccadic amplitudes change in response to false visual feedback, is a powerful experimental paradigm for probing the mechanisms of eye-movement control and spatial vision. The adaptation occurs primarily in the motor system, but there is also evidence for visual adaptation, depending on the size and permanence of the postsaccadic error. Here we confirm that adaptation has a strong visual component and show that this visual component is spatially selective in external, not retinal, coordinates. Subjects performed a memory-guided, double-saccade, outward-adaptation task designed to maximize visual adaptation and to dissociate the visual and motor corrections. When the memorized saccadic target occupied the same position in external space as the target used during adaptation training, saccade targeting was strongly influenced by adaptation (even when it did not match in retinal or cranial position); but when the target matched in retinal or cranial position while occupying a different position in external space, targeting was unaffected by adaptation, demonstrating unequivocal spatiotopic selectivity. These results point to the existence of a spatiotopic neural representation for eye-movement control that adapts in response to saccade error signals.

Reduced perceptual sensitivity for biological motion in paraplegia patients, Curr Biol, 22 (21), R910-911.

Physiological and psychophysical studies suggest that the perception and execution of movement may be linked. Here we ask whether severe impairment of locomotion could affect the capacity to perceive human locomotion. We measured sensitivity for the perception of point-light walkers (animation sequences of human biological motion portrayed only by the joints) in patients with severe spinal injury. These patients showed a huge (nearly three-fold) reduction of sensitivity for detecting and for discriminating the direction of biological motion compared with healthy controls, and also a smaller (~40%) reduction in sensitivity to simple translational motion. However, there was no statistically significant reduction in contrast sensitivity for discriminating the orientation of static gratings. The results point to an interaction between perceiving and producing motion, implicating shared algorithms and neural mechanisms.