- Visual perception and processing mechanisms
- Glaucoma and retinal disorders
- Ophthalmology and Visual Impairment Studies
- Neural dynamics and brain function
- Vestibular and auditory disorders
- Retinal Development and Disorders
- Multisensory perception and integration
- Tactile and Sensory Interactions
- Hearing Loss and Rehabilitation
- Retinal Imaging and Analysis
- Motor Control and Adaptation
- Speech and Audio Processing
- Advanced Optical Imaging Technologies
- Neuroscience and Neuropharmacology Research
- Ocular and Laser Science Research
- EEG and Brain-Computer Interfaces
- Color perception and design
- Gaze Tracking and Assistive Technology
- Anxiety, Depression, Psychometrics, Treatment, Cognitive Processes
- Advanced Vision and Imaging
- Color Science and Applications
- Neural and Behavioral Psychology Studies
- Memory and Neural Mechanisms
- Noise Effects and Management
- Action Observation and Synchronization
Cardiff University
2014-2025
MRC Institute of Hearing Research
2016
Chief Scientist Office
2016
Medical Research Council
2016
Glasgow Royal Infirmary
2016
University of Nottingham
2016
Max Planck Society
2010
Max Planck Institute for Biological Cybernetics
2010
University of California, Berkeley
1997-2000
Aston University
1994-1997
Neuronal orientation selectivity has been shown in animal models to require corticocortical network cooperation and to depend on the presence of GABAergic inhibition. However, it is not known whether variability in these fundamental neurophysiological parameters leads to variability in behavioral performance. Here, using a combination of magnetic resonance spectroscopy, magnetoencephalography and visual psychophysics, we show that individual performance on an orientation discrimination task is correlated with both the resting concentration...
The aim of this study was to examine the effect of eye movements on subjective and psychophysiological measures of the arousal and distress associated with positive and negative autobiographical memories. These memories were 'brought to mind' whilst participants engaged in eye-movement or eyes-stationary conditions in a counterbalanced within-subjects design, with pre- and post-eye-condition ratings of emotional valence and image vividness. Participants also rated current symptomatology using the Impact of Events Scale. Engagement compared...
During smooth pursuit eye movement, observers often misperceive velocity. Pursued stimuli appear slower (Aubert-Fleischl phenomenon [1: Fleischl, E.V. Physiologisch-optische Notizen, 2. Mitteilung. Sitzungsberichte der Wiener Akademie der Wissenschaften, 1882; 3: 7-25; 2: Aubert, H. Die Bewegungsempfindung. Pflügers Arch., 1886; 39: 347-370]), and stationary objects appear to move (Filehne illusion [3: Filehne, W. Über das optische Wahrnehmen von Bewegungen. Zeitschrift...
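The two phenomena are often summarised with a simple linear-gain account in which an extraretinal eye-velocity estimate, weighted by a gain below one, is added to retinal motion. The sketch below illustrates that textbook account only; it is not the specific model developed in the paper, and the gain value is an arbitrary assumption.

```python
# Generic linear-gain account (illustrative, not the paper's model).
def perceived_velocity(retinal_v, eye_v, gain=0.8):
    """Head-centred velocity estimate; gain < 1 under-weights the eye-velocity signal."""
    return retinal_v + gain * eye_v

eye_speed = 10.0  # deg/s pursuit

# Pursued target: retinal motion ~0, so perceived speed = 0.8 * 10 = 8 deg/s,
# slower than the actual 10 deg/s (Aubert-Fleischl phenomenon).
print(perceived_velocity(0.0, eye_speed))

# Stationary background: retinal motion = -10 deg/s, so perceived velocity
# = -10 + 8 = -2 deg/s, i.e. it appears to drift against pursuit (Filehne illusion).
print(perceived_velocity(-eye_speed, eye_speed))
```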
Smooth pursuit eye movements add motion to the retinal image. To compensate, the visual system can combine estimates of eye velocity and retinal motion to recover motion with respect to the head. Little attention has been paid to the temporal characteristics of this compensation process. Here, we describe how the latency difference between eye-movement and retinal-motion signals can be measured for perception during sinusoidal pursuit. In two experiments, observers compared the peak velocity of a stimulus presented in pursuit and fixation intervals. Both the pursuit target and the stimulus moved with a sinusoidal profile. The phase and amplitude...
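As a hedged illustration of the measurement logic (not the paper's analysis): if retinal and extraretinal velocity signals are same-frequency sinusoids but one lags the other, the peak of their sum depends on the lag, so peak-velocity matches made during sinusoidal pursuit can constrain the latency difference. The amplitudes and frequency below are arbitrary assumptions.

```python
import numpy as np

def combined_amplitude(a_retinal, a_extraretinal, freq_hz, latency_s):
    """Amplitude of the sum of two same-frequency sinusoids separated by a latency."""
    phi = 2 * np.pi * freq_hz * latency_s  # latency expressed as a phase lag
    return np.sqrt(a_retinal**2 + a_extraretinal**2
                   + 2 * a_retinal * a_extraretinal * np.cos(phi))

print(combined_amplitude(5.0, 4.0, freq_hz=0.5, latency_s=0.0))  # no lag: amplitudes add fully
print(combined_amplitude(5.0, 4.0, freq_hz=0.5, latency_s=0.1))  # 100 ms lag: smaller peak
```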
The dysconnection hypothesis of schizophrenia (SZ) proposes that psychosis is best understood in terms of aberrant connectivity. Specifically, it suggests that dysconnectivity arises through aberrant synaptic modulation associated with deficits in GABAergic inhibition, altered excitation-inhibition balance and disturbances of high-frequency oscillations. Using a computational model combined with a graded-difficulty visual orientation discrimination paradigm, we demonstrate that, in SZ, perceptual performance is determined by the...
Theoretical models implicating the orienting reflex as an explanatory mechanism in the eye-movement desensitization and reprocessing (EMDR) treatment protocol are contrasted and tested empirically. We also test whether EMDR effects are due to a distraction effect. A repeated-measures design is used in two experiments. The first experiment employed two independent variables: eye condition (moving vs. stationary) and tone (a pseudo-randomized series of low- and high-intensity tones). In Expt 2, the tone was replaced by an attentional...
To make vision possible, the visual nervous system must represent the most informative features in the light pattern captured by the eye. Here we use Gaussian scale-space theory to derive a multiscale model for edge analysis and test it in perceptual experiments. At all scales there are two stages of spatial filtering. An odd-symmetric, first-derivative filter provides the input to the second filter. Crucially, the output at each stage is half-wave rectified before feeding forward to the next. This creates nonlinear channels...
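A minimal runnable sketch of the two-stage, half-wave-rectified filtering scheme described above, applied to a one-dimensional blurred edge. The particular second-stage filter, the scale values and the peak read-out are illustrative assumptions, not the published model's exact parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def half_wave(x):
    return np.maximum(x, 0.0)  # half-wave rectification

def edge_response(luminance, scale1=2.0, scale2=4.0):
    stage1 = gaussian_filter1d(luminance, sigma=scale1, order=1)  # odd-symmetric 1st-derivative filter
    stage1 = half_wave(stage1)                                    # rectify: 'light-edge' channel
    stage2 = -gaussian_filter1d(stage1, sigma=scale2, order=2)    # centre-positive 2nd-derivative stage (assumed)
    return half_wave(stage2)                                      # rectify again before read-out

# Blurred step edge: the peak of the final response marks the candidate edge location.
x = np.linspace(-50, 50, 1001)
step = 1.0 / (1.0 + np.exp(-x / 5.0))
resp = edge_response(step)
print(int(np.argmax(resp)))  # index near the centre of the edge (~500)
```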
Despite good evidence for optimal audio-visual integration in stationary observers, few studies have considered the impact of self-movement on this process. When the head and/or eyes move, the integration of vision and hearing is complicated, as sensory measurements begin in different coordinate frames. To successfully integrate these signals, they must first be transformed into the same frame. We propose that audio and visual motion cues are separately transformed using self-movement signals before being integrated into body-centered estimates of motion. We tested this hypothesis...
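A minimal sketch of the proposed scheme under simple assumptions: each cue is shifted into a body-centred frame by adding a self-movement estimate, then the two are fused by inverse-variance weighting. The additive transform, the variances and the velocity values are illustrative, not taken from the study.

```python
def to_body_centred(cue_velocity, self_motion_velocity):
    """Add a self-movement estimate to convert a cue into body-centred coordinates."""
    return cue_velocity + self_motion_velocity

def integrate(est_a, var_a, est_v, var_v):
    """Reliability-weighted (inverse-variance) combination of two estimates."""
    w_a = (1 / var_a) / (1 / var_a + 1 / var_v)
    fused = w_a * est_a + (1 - w_a) * est_v
    fused_var = 1 / (1 / var_a + 1 / var_v)
    return fused, fused_var

head_velocity = 5.0  # deg/s, vestibular/proprioceptive estimate (assumed value)
eye_velocity = 3.0   # deg/s, extraretinal estimate (assumed value)

audio_body = to_body_centred(-2.0, head_velocity)                  # head-centred audio cue
visual_body = to_body_centred(-8.0, eye_velocity + head_velocity)  # retinal (eye-centred) cue
print(integrate(audio_body, 4.0, visual_body, 1.0))
```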
According to Bayesian models, perception and cognition depend on the optimal combination of noisy incoming evidence with prior knowledge of the world. Individual differences in perception should therefore be jointly determined by a person’s sensitivity to incoming evidence and his or her prior expectations. It has been proposed that individuals with autism have flatter prior distributions than do nonautistic individuals, which suggests that prior variance is linked to the degree of autistic traits in the general population. We tested this idea by studying how perceived speed...
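The Bayesian idea can be made concrete with a Gaussian prior and likelihood, where the posterior mean is a precision-weighted average; a flatter (higher-variance) prior pulls estimates less strongly towards slow speeds. The sketch below uses arbitrary numbers, not the study's fitted parameters.

```python
def posterior_speed(measured_speed, sensory_var, prior_mean=0.0, prior_var=4.0):
    """Posterior mean for Gaussian prior x Gaussian likelihood (precision-weighted average)."""
    w_prior = (1 / prior_var) / (1 / prior_var + 1 / sensory_var)
    return w_prior * prior_mean + (1 - w_prior) * measured_speed

# Narrow "slow-speed" prior: the estimate is pulled well below the measured 10 deg/s.
print(posterior_speed(10.0, sensory_var=2.0, prior_var=4.0))
# Flatter prior (larger variance): the estimate stays much closer to the measurement.
print(posterior_speed(10.0, sensory_var=2.0, prior_var=50.0))
```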
Previous research has shown that vection can be enhanced by adding horizontal simulated viewpoint oscillation to radial flow. Adding a horizontally oscillating fixation target to purely radial flow induces a superficially similar illusion of self-motion, where the observer's perceived heading oscillates left and right as their eyes pursue the moving target. This study directly compared the vection induced by these two conditions for the first time. Oscillating fixation and oscillating viewpoint displays were both found to improve vection (relative to no-oscillation control displays). Neither...
Hearing is confronted by a similar problem to vision when the observer moves. The image motion that is created remains ambiguous until the observer knows the velocity of the eye and/or head. One way the visual system solves this problem is to use motor commands, proprioception, and vestibular information. These "extraretinal signals" compensate for self-movement, converting image motion into head-centered coordinates, although not always perfectly. We investigated whether the auditory system also transforms coordinates by examining the degree of compensation for head...
We present a psychophysical technique for measuring the precision of signals encoding active self-movements. Using head movements, we show that 1) precision is greater when the rotation is judged against visual comparison stimuli rather than auditory ones; 2) precision decreases with speed (Weber’s law); 3) perceived speed is lower during rotation. The findings may reflect the steps needed to convert different cues into common units, and they challenge standard Bayesian models of motion perception.
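A generic sketch of the kind of analysis such a technique implies: precision is read off as the spread parameter (sigma) of a cumulative-Gaussian psychometric function fitted to "comparison judged faster" proportions; under Weber's law, sigma would grow in proportion to speed. The response proportions below are fabricated placeholders for illustration only.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def psychometric(comparison_speed, pse, sigma):
    """Cumulative-Gaussian psychometric function."""
    return norm.cdf(comparison_speed, loc=pse, scale=sigma)

speeds = np.array([20., 30., 40., 50., 60., 70., 80.])    # deg/s comparison speeds (illustrative)
p_faster = np.array([.02, .10, .30, .55, .80, .93, .99])  # placeholder proportions, not real data

(pse, sigma), _ = curve_fit(psychometric, speeds, p_faster, p0=[50., 10.])
print(round(pse, 1), round(sigma, 1))  # point of subjective equality and precision (sigma)
```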
Evidence that the auditory system contains specialised motion detectors is mixed. Many psychophysical studies confound speed cues with distance and duration, and present sound sources that do not appear to move in external space. Here we use the 'discrimination contours' technique to probe the probabilistic combination of speed, distance and duration cues for stimuli moving along a horizontal arc around the listener in virtual space. The technique produces a set of discrimination thresholds that define a contour in the distance-duration plane, which differs for the three cues, based on a 3-interval...
One way the visual system estimates object motion during pursuit is to combine estimates of eye velocity and retinal motion. This raises the question of whether observers need direct access to the pursuit signal. We tested this idea by varying the correlation between objective cues in a two-interval speed discrimination task. Responses were classified according to three cues: pursuit (based on measured eye movements), retinal speed, and the motion of the target relative to the stimulus. In the first experiment, feedback was based on whichever cue fit the response curves best. In the second, the simultaneous cue was removed but...
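A hedged sketch of the response-classification idea: each trial is scored against several candidate cues, and the cue that orders the observer's responses most consistently is taken as the one used. All arrays below are simulated placeholders; the cue names are assumptions based on the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200
# Signed between-interval difference that each candidate cue would signal on each trial.
cue_values = {
    "pursuit":  rng.normal(0, 1, n_trials),
    "retinal":  rng.normal(0, 1, n_trials),
    "relative": rng.normal(0, 1, n_trials),
}
# Simulate an observer who actually relies on the retinal cue, plus decision noise.
responses = (cue_values["retinal"] + rng.normal(0, 0.5, n_trials)) > 0

for name, diffs in cue_values.items():
    # Proportion of trials on which the response agrees with what this cue signals.
    consistency = np.mean(responses == (diffs > 0))
    print(name, round(consistency, 2))
```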