Welcome

Why do some people see yellow when they hear a sound? How do we adjust our body image to fit the boundaries of objects we use, such as our cars? How does knowing a speaker's identity and what their voice sounds like help us hear them better at a party?

The Multisensory Perception Lab studies how information from one sensory system influences processing in other sensory systems, as well as how this information is integrated in the brain. Specifically, we investigate the mechanisms underlying basic auditory, visual, and tactile interactions, synesthesia, multisensory body image perception, and visual facilitation of speech perception. Our current research examines multisensory processes using a variety of techniques, including psychophysical testing and illusions, functional MRI (fMRI) and diffusion tensor imaging (DTI), electrophysiological measures of neural activity (both EEG and iEEG), and lesion mapping in patients with brain tumors.

Our intracranial electroencephalography (iEEG/ECoG/sEEG) recordings are a unique resource that allows us to measure neural activity directly from the human brain via clinically implanted electrodes in patients. These recordings are collected while patients perform the same auditory, visual, and tactile tasks that we use in our other behavioral and neuroimaging studies, and because iEEG offers both millisecond temporal resolution and millimeter spatial precision, it provides unparalleled information about the flow of neural activity in the brain.