Research Interests

Our senses do not reflect the external world with perfect fidelity; on the contrary, sensory information is inherently noisy and often ambiguous. How do humans and animals make adaptive perceptual decisions in the face of such uncertainty?

My previous and ongoing research addresses this question on two fronts: (a) how the brain combines information from multiple sensory modalities, and (b) how it establishes a level of confidence in a decision. Confidence—the degree of belief that a pending decision will turn out to be correct—is crucial for guiding behavior in complex environments, yet only recently has it become amenable to neuroscientific investigation. We use behavioral assays to ask monkeys how confident they are in decisions about visual motion while recording and manipulating neural activity in visual cortical areas (MT, MST). To complement traditional causal methods such as electrical microstimulation, we have developed and refined optogenetic approaches for suppressing activity with greater spatial and temporal specificity than was heretofore possible. The results so far support the idea that confidence arises from the same neural mechanism that explains the choice itself (i.e., accuracy) and the time needed to decide (reaction time).
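The idea that a single mechanism yields choice, reaction time, and confidence is often formalized as bounded evidence accumulation (a drift-diffusion process). The sketch below is purely illustrative, not the lab's actual model: parameter values are arbitrary, and the confidence rule (higher confidence for faster decisions) is one common simplification among several in the literature.

```python
import math
import random

def ddm_trial(drift=0.5, bound=1.0, dt=0.001, sigma=1.0, rng=random):
    """Simulate one trial of bounded evidence accumulation.

    Evidence drifts toward the correct bound and diffuses with noise;
    the first bound crossed gives the choice, the crossing time the RT.
    """
    x, t = 0.0, 0.0
    while abs(x) < bound:
        x += drift * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        t += dt
    choice = 1 if x > 0 else -1  # +1 marks the correct bound here
    # Illustrative confidence rule: slower decisions -> lower confidence,
    # since late bound crossings tend to reflect weaker evidence.
    confidence = 1.0 / (1.0 + math.exp(-bound / (t + dt)))
    return choice, t, confidence

rng = random.Random(0)
trials = [ddm_trial(rng=rng) for _ in range(2000)]
accuracy = sum(c == 1 for c, _, _ in trials) / len(trials)
mean_rt = sum(t for _, t, _ in trials) / len(trials)
print(f"accuracy={accuracy:.2f}, mean RT={mean_rt:.2f}s")
```

With these (arbitrary) parameters, the same two numbers, drift and bound, jointly determine accuracy, the reaction-time distribution, and graded confidence, which is the sense in which one mechanism "explains" all three behavioral measures.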

Our recent work seeks to bring the experimental and theoretical toolkit of decision making to bear on the more natural case of multiple, time-varying sensory inputs. For example, as an animal moves through its environment, it can use both visual and non-visual (e.g., vestibular) cues to judge its speed and direction of self-motion (and confidence therein). Exploring how neural populations and circuits perform this feat, despite the inherent uncertainty of the incoming signals, may shed light on general principles of brain function.
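A standard idealization of this kind of cue combination is reliability-weighted (maximum-likelihood) fusion of Gaussian estimates: each cue is weighted by its inverse variance, and the fused estimate is more precise than either cue alone. The snippet below is a minimal sketch of that textbook computation, with hypothetical heading values; it is not a model of the neural circuitry itself.

```python
def fuse_cues(visual_est, visual_var, vestib_est, vestib_var):
    """Reliability-weighted fusion of two Gaussian cue estimates.

    Each cue's weight is its inverse variance (its reliability);
    the fused variance is smaller than either input variance.
    """
    w_vis = 1.0 / visual_var
    w_ves = 1.0 / vestib_var
    fused_est = (w_vis * visual_est + w_ves * vestib_est) / (w_vis + w_ves)
    fused_var = 1.0 / (w_vis + w_ves)
    return fused_est, fused_var

# Hypothetical heading estimates (degrees) from each modality:
est, var = fuse_cues(visual_est=10.0, visual_var=4.0,
                     vestib_est=14.0, vestib_var=8.0)
print(f"fused heading = {est:.2f} deg, variance = {var:.2f}")
```

Note that the fused estimate lands closer to the more reliable (lower-variance) visual cue, and the fused variance is below both inputs; the fused variance also provides a natural handle on confidence in the combined estimate.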