Real-world events typically involve both sound and vision, but how do we integrate the information from our eyes and ears into a single unified percept? For example, a sound can influence when (or at what location) an observer sees a moving dot reverse its direction. We are interested in the effects of sound on visual perception, and vice versa.
We also study somatosensory (tactile) stimuli as a third modality, with particular interest in the relative perceived timing of events across modalities. It emerges that perceived time is far more flexible than previously thought, and is especially susceptible to recent experience.
An additional area of interest is how our motor actions relate to our sensory perception. In one particularly striking illusion, the brain can be tricked into perceiving a sensory event before the motor action that produces it!