
Publication


Featured research published by Brian Odegaard.


PLOS Computational Biology | 2015

Biases in Visual, Auditory, and Audiovisual Perception of Space.

Brian Odegaard; David R. Wozny; Ladan Shams

Localization of objects and events in the environment is critical for survival, as many perceptual and motor tasks rely on estimation of spatial location. Therefore, it seems reasonable to assume that spatial localizations should generally be accurate. Curiously, some previous studies have reported biases in visual and auditory localizations, but these studies have used small sample sizes and the results have been mixed. Therefore, it is not clear (1) if the reported biases in localization responses are real (or due to outliers, sampling bias, or other factors), and (2) whether these putative biases reflect a bias in sensory representations of space or a priori expectations (which may be due to the experimental setup, instructions, or distribution of stimuli). Here, to address these questions, a dataset of unprecedented size (obtained from 384 observers) was analyzed to examine presence, direction, and magnitude of sensory biases, and quantitative computational modeling was used to probe the underlying mechanism(s) driving these effects. Data revealed that, on average, observers were biased towards the center when localizing visual stimuli, and biased towards the periphery when localizing auditory stimuli. Moreover, quantitative analysis using a Bayesian Causal Inference framework suggests that while pre-existing spatial biases for central locations exert some influence, biases in the sensory representations of both visual and auditory space are necessary to fully explain the behavioral data. How are these opposing visual and auditory biases reconciled in conditions in which both auditory and visual stimuli are produced by a single event? Potentially, the bias in one modality could dominate, or the biases could interact/cancel out. The data revealed that when integration occurred in these conditions, the visual bias dominated, but the magnitude of this bias was reduced compared to unisensory conditions. 
Therefore, multisensory integration not only improves the precision of perceptual estimates, but also the accuracy.
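The Bayesian Causal Inference framework referenced in this abstract (and several below) decides whether two sensory signals share a common cause by comparing their likelihoods under one-cause and two-cause hypotheses. As a rough illustration only, the following sketch implements the standard Gaussian formulation with a spatial prior centered at zero; the function names and all parameter values are our own illustrative assumptions, not the paper's fitted model.

```python
import math

def gaussian(x, mu, sigma):
    """Gaussian probability density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def marginal_single_source(x, sigma, sigma_p):
    """Likelihood of one noisy sample x when its source is drawn
    independently from a zero-mean spatial prior with width sigma_p."""
    return gaussian(x, 0.0, math.sqrt(sigma ** 2 + sigma_p ** 2))

def posterior_common_cause(x_v, x_a, sigma_v, sigma_a, sigma_p, p_common):
    """Posterior probability that visual sample x_v and auditory sample x_a
    were produced by a single event, given sensory noise widths sigma_v and
    sigma_a, prior width sigma_p, and prior binding tendency p_common."""
    # Likelihood of both samples under one shared source, with the source
    # location integrated out analytically (product of three Gaussians).
    var_sum = (sigma_v**2 * sigma_a**2 + sigma_v**2 * sigma_p**2
               + sigma_a**2 * sigma_p**2)
    like_c1 = math.exp(-0.5 * ((x_v - x_a)**2 * sigma_p**2
                               + x_v**2 * sigma_a**2
                               + x_a**2 * sigma_v**2) / var_sum) \
              / (2 * math.pi * math.sqrt(var_sum))
    # Likelihood under two independent sources.
    like_c2 = (marginal_single_source(x_v, sigma_v, sigma_p)
               * marginal_single_source(x_a, sigma_a, sigma_p))
    # Bayes' rule over the two causal structures.
    return like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))
```

Spatially coincident samples yield a high posterior probability of a common cause (and hence integration), while widely discrepant samples yield a low one; the abstract's "pre-existing spatial biases for central locations" correspond to the zero-centered prior here.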


The Journal of Neuroscience | 2017

Should a Few Null Findings Falsify Prefrontal Theories of Conscious Perception?

Brian Odegaard; Robert T. Knight; Hakwan Lau

Is activity in prefrontal cortex (PFC) critical for conscious perception? Major theories of consciousness make distinct predictions about the role of PFC, providing an opportunity to arbitrate between these views empirically. Here we address three common misconceptions: (1) PFC lesions do not affect subjective perception; (2) PFC activity does not reflect specific perceptual content; and (3) PFC involvement in studies of perceptual awareness is solely driven by the need to make reports required by the experimental tasks rather than subjective experience per se. These claims are incompatible with empirical findings, unless one focuses only on studies using methods with limited sensitivity. The literature highlights PFC's essential role in enabling the subjective experience in perception, contra the objective capacity to perform visual tasks; conflating the two can also be a source of confusion. Dual Perspectives Companion Paper: Are the Neural Correlates of Consciousness in the Front or in the Back of the Cerebral Cortex? Clinical and Neuroimaging Evidence, by Melanie Boly, Marcello Massimini, Naotsugu Tsuchiya, Bradley R. Postle, Christof Koch, and Giulio Tononi.


Psychological Science | 2016

The Brain’s Tendency to Bind Audiovisual Signals Is Stable but Not General

Brian Odegaard; Ladan Shams

Previous studies have shown a surprising amount of between-subjects variability in the strength of interactions between sensory modalities. For the same set of stimuli, some subjects exhibit strong interactions, whereas others exhibit weak interactions. To date, little is known about what underlies this variability. Sensory integration in the brain could be governed by a global mechanism or by task-specific mechanisms that could be either stable or variable across time. We used a rigorous quantitative tool (Bayesian causal inference) to investigate whether integration (i.e., binding) tendencies generalize across tasks and are stable across time. We report for the first time that individuals’ binding tendencies are stable across time but are task-specific. These results provide evidence against the hypothesis that sensory integration is governed by a single, global parameter in the brain.


Neuroscience Letters | 2016

The effects of selective and divided attention on sensory precision and integration.

Brian Odegaard; David R. Wozny; Ladan Shams

In our daily lives, our capacity to selectively attend to stimuli within or across sensory modalities enables enhanced perception of the surrounding world. While previous research on selective attention has studied this phenomenon extensively, two important questions still remain unanswered: (1) how selective attention to a single modality impacts sensory integration processes, and (2) the mechanism by which selective attention improves perception. We explored how selective attention impacts performance in both a spatial task and a temporal numerosity judgment task, and employed a Bayesian Causal Inference model to investigate the computational mechanism(s) impacted by selective attention. We report three findings: (1) in the spatial domain, selective attention improves precision of the visual sensory representations (which were relatively precise), but not the auditory sensory representations (which were fairly noisy); (2) in the temporal domain, selective attention improves the sensory precision in both modalities (both of which were fairly reliable to begin with); (3) in both tasks, selective attention did not exert a significant influence over the tendency to integrate sensory stimuli. Therefore, it may be postulated that a sensory modality must possess a certain inherent degree of encoding precision in order to benefit from selective attention. It also appears that in certain basic perceptual tasks, the tendency to integrate crossmodal signals does not depend significantly on selective attention. We conclude with a discussion of how these results relate to recent theoretical considerations of selective attention.


Trends in Cognitive Sciences | 2016

What Type of Awareness Does Binocular Rivalry Assess?

Nathan Giles; Hakwan Lau; Brian Odegaard

Recent experiments demonstrate that invisible stimulus features can induce binocular rivalry, indicating the phenomenon may be caused by differences in perceptual signal strength rather than conscious selection processes. Here, we clarify binocular rivalry's role in consciousness research by highlighting a critical difference between two distinct types of visual awareness.


Proceedings of the National Academy of Sciences of the United States of America | 2018

Superior colliculus neuronal ensemble activity signals optimal rather than subjective confidence

Brian Odegaard; Piercesare Grimaldi; Seong Hah Cho; Megan A.K. Peters; Hakwan Lau; Michele A. Basso

SIGNIFICANCE STATEMENT Previously, the neuronal correlates of perceptual confidence have been identified in neural circuits responsible for deciding what an animal sees. However, behaviorally, confidence and perceptual decision accuracy are confounded; we are usually more confident about perceptual decisions when they are accurate. To tease them apart, we introduced a task with stimulus conditions that produced similar decision accuracy but different reports of subjective confidence. We decoded decision performance from neuronal signals in nonhuman primates in a subcortical region involved in decision-making, the superior colliculus (SC), and found that SC ensemble activity tracks decision accuracy, but not subjective confidence. These results challenge current ideas about how to measure subjective confidence in experiments and inspire ways to study its neuronal mechanisms.

Recent studies suggest that neurons in sensorimotor circuits involved in perceptual decision-making also play a role in decision confidence. In these studies, confidence is often considered to be an optimal readout of the probability that a decision is correct. However, the information leading to decision accuracy and the report of confidence often covaried, leaving open the possibility that there are actually two dissociable signal types in the brain: signals that correlate with decision accuracy (optimal confidence) and signals that correlate with subjects’ behavioral reports of confidence (subjective confidence). We recorded neuronal activity from a sensorimotor decision area, the superior colliculus (SC) of monkeys, while they performed two different tasks. In our first task, decision accuracy and confidence covaried, as in previous studies. In our second task, we implemented a motion discrimination task with stimuli that were matched for decision accuracy but produced different levels of confidence, as reflected by behavioral reports.
We used a multivariate decoder to predict monkeys’ choices from neuronal population activity. As in previous studies on perceptual decision-making mechanisms, we found that neuronal decoding performance increased as decision accuracy increased. However, when decision accuracy was matched, performance of the decoder was similar between high and low subjective confidence conditions. These results show that the SC likely signals optimal decision confidence similar to previously reported cortical mechanisms, but is unlikely to play a critical role in subjective confidence. The results also motivate future investigations to determine where in the brain signals related to subjective confidence reside.


The Journal of Neuroscience | 2017

Limited Cognitive Resources Explain a Trade-Off between Perceptual and Metacognitive Vigilance

Brian Maniscalco; Li Yan McCurdy; Brian Odegaard; Hakwan Lau

Why do experimenters give subjects short breaks in long behavioral experiments? Whereas previous studies suggest it is difficult to maintain attention and vigilance over long periods of time, it is unclear precisely what mechanisms benefit from rest after short experimental blocks. Here, we evaluate decline in both perceptual performance and metacognitive sensitivity (i.e., how well confidence ratings track perceptual decision accuracy) over time and investigate whether characteristics of prefrontal cortical areas correlate with these measures. Whereas a single-process signal detection model predicts that these two forms of fatigue should be strongly positively correlated, a dual-process model predicts that rates of decline may dissociate. Here, we show that these measures consistently exhibited negative or near-zero correlations, as if engaged in a trade-off relationship, suggesting that different mechanisms contribute to perceptual and metacognitive decisions. Despite this dissociation, the two mechanisms likely depend on common resources, which could explain their trade-off relationship. Based on structural MRI brain images of individual human subjects, we assessed gray matter volume in the frontal polar area, a region that has been linked to visual metacognition. Variability of frontal polar volume correlated with individual differences in behavior, indicating the region may play a role in supplying common resources for both perceptual and metacognitive vigilance. Additional experiments revealed that reduced metacognitive demand led to superior perceptual vigilance, providing further support for this hypothesis. Overall, results indicate that during breaks between short blocks, it is the higher-level perceptual decision mechanisms, rather than lower-level sensory machinery, that benefit most from rest. 
SIGNIFICANCE STATEMENT Perceptual task performance declines over time (the so-called vigilance decrement), but the relationship between vigilance in perception and metacognition has not yet been explored in depth. Here, we show that patterns in perceptual and metacognitive vigilance do not follow the pattern predicted by a previously suggested single-process model of perceptual and metacognitive decision making. We account for these findings by showing that regions of anterior prefrontal cortex (aPFC) previously associated with visual metacognition are also associated with perceptual vigilance. We also show that relieving metacognitive task demand improves perceptual vigilance, suggesting that aPFC may house a limited cognitive resource that contributes to both metacognition and perceptual vigilance. These findings advance our understanding of the mechanisms and dynamics of perceptual metacognition.
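The metacognitive sensitivity discussed above quantifies how well confidence ratings track decision accuracy. The paper uses signal-detection-theoretic measures for this; as a simpler illustrative stand-in (not the paper's exact analysis), a minimal sketch of the related type-2 AUROC, which is the probability that a randomly chosen correct trial received higher confidence than a randomly chosen incorrect one:

```python
def type2_auroc(confidence, correct):
    """Type-2 area under the ROC curve, computed as a Mann-Whitney-style
    pairwise comparison: for every (correct, incorrect) trial pair, count a
    win when the correct trial's confidence is higher (ties count half).
    0.5 indicates no metacognitive sensitivity; 1.0 is perfect tracking."""
    conf_correct = [c for c, ok in zip(confidence, correct) if ok]
    conf_error = [c for c, ok in zip(confidence, correct) if not ok]
    wins = 0.0
    for cc in conf_correct:
        for ce in conf_error:
            if cc > ce:
                wins += 1.0
            elif cc == ce:
                wins += 0.5
    return wins / (len(conf_correct) * len(conf_error))
```

Tracking such a measure block by block, alongside raw accuracy, is one way to observe the dissociation between perceptual and metacognitive vigilance described in the abstract.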


Clinical Psychological Science | 2017

The Relationship Between Audiovisual Binding Tendencies and Prodromal Features of Schizophrenia in the General Population

Brian Odegaard; Ladan Shams

Current theoretical accounts of schizophrenia have considered the disorder within the framework of hierarchical Bayesian inference, positing that symptoms arise from a deficit in the brain’s capacity to combine incoming sensory information with preexisting priors. Here, we present the first investigation to examine the relationship between priors governing multisensory perception and subclinical, prodromal features of schizophrenia in the general population. We tested participants in two complementary tasks (one spatial, one temporal) and employed a Bayesian model to estimate both the precision of unisensory encoding and the prior tendency to integrate audiovisual signals (i.e., the “binding tendency”). Results revealed that lower binding tendency scores in the spatial task were associated with higher numbers of self-reported prodromal features. These results indicate decreased binding of audiovisual spatial information may be moderately related to the frequency of prodromal characteristics in the general population.


PeerJ | 2017

A simple and efficient method to enhance audiovisual binding tendencies

Brian Odegaard; David R. Wozny; Ladan Shams

Individuals vary in their tendency to bind signals from multiple senses. For the same set of sights and sounds, one individual may frequently integrate multisensory signals and experience a unified percept, whereas another individual may rarely bind them and often experience two distinct sensations. Thus, while this binding/integration tendency is specific to each individual, it is not clear how plastic this tendency is in adulthood, and how sensory experiences may cause it to change. Here, we conducted an exploratory investigation which provides evidence that (1) the brain’s tendency to bind in spatial perception is plastic, (2) that it can change following brief exposure to simple audiovisual stimuli, and (3) that exposure to temporally synchronous, spatially discrepant stimuli provides the most effective method to modify it. These results can inform current theories about how the brain updates its internal model of the surrounding sensory world, as well as future investigations seeking to increase integration tendencies.


Trends in Cognitive Sciences | 2016

Methodological Considerations to Strengthen Studies of Peripheral Vision

Brian Odegaard; Hakwan Lau

In a recent issue of Trends in Cognitive Sciences, Cohen et al. [1] argue that the study of visual summary statistics represents an elegant method to account for the richness of visual experience in the periphery. We resoundingly agree that employing ensemble statistics is a strong step towards resolving questions of how conscious we are of our visual surroundings. However, we think the explanatory power of this approach can be augmented by focusing on two specific areas: (i) psychophysical quantification of metacognitive capacities and decision biases associated with peripheral vision; (ii) distinction between perceptual decisions that involve different levels of detail. Consideration of these issues will facilitate the development of precise hypotheses about peripheral phenomenology and yield useful data from experiments investigating summary statistics; we explain how below.

Collaboration


Top co-authors of Brian Odegaard:

Hakwan Lau, University of California
Ladan Shams, University of California
David R. Wozny, University of California
Brian Maniscalco, National Institutes of Health
J.D. Knotts, University of California