
Publications


Featured research published by Christopher M. Conway.


Journal of Experimental Psychology: Learning, Memory, and Cognition | 2005

Modality-constrained statistical learning of tactile, visual, and auditory sequences.

Christopher M. Conway; Morten H. Christiansen

The authors investigated the extent to which touch, vision, and audition mediate the processing of statistical regularities within sequential input. Few researchers have conducted rigorous comparisons across sensory modalities; in particular, the sense of touch has been virtually ignored. The current data reveal not only commonalities but also modality constraints affecting statistical learning across the senses. To be specific, the authors found that the auditory modality displayed a quantitative learning advantage compared with vision and touch. In addition, they discovered qualitative learning biases among the senses: Primarily, audition afforded better learning for the final part of input sequences. These findings are discussed in terms of whether statistical learning is likely to consist of a single, unitary mechanism or multiple, modality-constrained ones.


Cognition | 2010

Implicit statistical learning in language processing: Word predictability is the key

Christopher M. Conway; Althea Bauernschmidt; Sean S. Huang; David B. Pisoni

Fundamental learning abilities related to the implicit encoding of sequential structure have been postulated to underlie language acquisition and processing. However, there is very little direct evidence to date supporting such a link between implicit statistical learning and language. In three experiments using novel methods of assessing implicit learning and language abilities, we show that sensitivity to sequential structure - as measured by improvements to immediate memory span for structurally-consistent input sequences - is significantly correlated with the ability to use knowledge of word predictability to aid speech perception under degraded listening conditions. Importantly, the association remained even after controlling for participant performance on other cognitive tasks, including short-term and working memory, intelligence, attention and inhibition, and vocabulary knowledge. Thus, the evidence suggests that implicit learning abilities are essential for acquiring long-term knowledge of the sequential structure of language - i.e., knowledge of word predictability - and that individual differences on such abilities impact speech perception in everyday situations. These findings provide a new theoretical rationale linking basic learning phenomena to specific aspects of spoken language processing in adults, and may furthermore indicate new fruitful directions for investigating both typical and atypical language development.


Cognitive Science Conference | 2006

Statistical Learning Within and Between Modalities: Pitting Abstract Against Stimulus-Specific Representations

Christopher M. Conway; Morten H. Christiansen

When learners encode sequential patterns and generalize their knowledge to novel instances, are they relying on abstract or stimulus-specific representations? Research on artificial grammar learning (AGL) has shown transfer of learning from one stimulus set to another, and such findings have encouraged the view that statistical learning is mediated by abstract representations that are independent of the sense modality or perceptual features of the stimuli. Using a novel modification of the standard AGL paradigm, we obtained data to the contrary. These experiments pitted abstract processing against stimulus-specific learning. The findings show that statistical learning results in knowledge that is stimulus-specific rather than abstract. They show furthermore that learning can proceed in parallel for multiple input streams along separate perceptual dimensions or sense modalities. We conclude that learning sequential structure and generalizing to novel stimuli inherently involve learning mechanisms that are closely tied to the perceptual characteristics of the input.


Trends in Cognitive Sciences | 2001

Sequential learning in non-human primates.

Christopher M. Conway; Morten H. Christiansen

Sequential learning plays a role in a variety of common tasks, such as human language processing, animal communication, and the learning of action sequences. In this article, we investigate sequential learning in non-human primates from a comparative perspective, focusing on three areas: the learning of arbitrary, fixed sequences; statistical learning; and the learning of hierarchical structure. Although primates exhibit many similarities to humans in their performance on sequence learning tasks, there are also important differences. Crucially, non-human primates appear to be limited in their ability to learn and represent the hierarchical structure of sequences. We consider the evolutionary implications of these differences and suggest that limitations in sequential learning may help explain why non-human primates lack human-like language.


Current Directions in Psychological Science | 2009

The Importance of Sound for Cognitive Sequencing Abilities: The Auditory Scaffolding Hypothesis

Christopher M. Conway; David B. Pisoni; William G. Kronenberger

Sound is inherently a temporal and sequential signal. Experience with sound therefore may help bootstrap—that is, provide a kind of “scaffolding” for—the development of general cognitive abilities related to representing temporal or sequential patterns. Accordingly, the absence of sound early in development may result in disturbances to these sequencing skills. In support of this hypothesis, we present two types of findings. First, normal-hearing adults do best on sequencing tasks when the sense of hearing, rather than sight, can be used. Second, recent findings suggest that deaf children have disturbances on exactly these same kinds of tasks that involve learning and manipulation of serial-order information. We suggest that sound provides an “auditory scaffolding” for time and serial-order behavior, possibly mediated through neural connections between the temporal and frontal lobes of the brain. Under conditions of auditory deprivation, auditory scaffolding is absent, resulting in neural reorganization and a disturbance to cognitive sequencing abilities.


Developmental Science | 2011

Implicit Sequence Learning in Deaf Children with Cochlear Implants

Christopher M. Conway; David B. Pisoni; Esperanza M. Anaya; Jennifer Karpicke; Shirley C. Henning

Deaf children with cochlear implants (CIs) represent an intriguing opportunity to study neurocognitive plasticity and reorganization when sound is introduced following a period of auditory deprivation early in development. Although it is common to consider deafness as affecting hearing alone, it may be the case that auditory deprivation leads to more global changes in neurocognitive function. In this paper, we investigate implicit sequence learning abilities in deaf children with CIs using a novel task that measured learning through improvement to immediate serial recall for statistically consistent visual sequences. The results demonstrated two key findings. First, the deaf children with CIs showed disturbances in their visual sequence learning abilities relative to the typically developing normal-hearing children. Second, sequence learning was significantly correlated with a standardized measure of language outcome in the CI children. These findings suggest that a period of auditory deprivation has secondary effects related to general sequencing deficits, and that disturbances in sequence learning may at least partially explain why some deaf children still struggle with language following cochlear implantation.


Annals of the New York Academy of Sciences | 2008

Neurocognitive basis of implicit learning of sequential structure and its relation to language processing.

Christopher M. Conway; David B. Pisoni

The ability to learn and exploit environmental regularities is important for many aspects of skill learning, of which language may be a prime example. Much of such learning proceeds in an implicit fashion, that is, it occurs unintentionally and automatically and results in knowledge that is difficult to verbalize explicitly. An important research goal is to ascertain the underlying neurocognitive mechanisms of implicit learning abilities and understand its contribution to perception, language, and cognition more generally. In this article, we review recent work that investigates the extent to which implicit learning of sequential structure is mediated by stimulus‐specific versus domain‐general learning mechanisms. Although much of previous implicit learning research has emphasized its domain‐general aspect, here we highlight behavioral work suggesting a modality‐specific locus. Even so, our data also reveal that individual variability in implicit sequence learning skill correlates with performance on a task requiring sensitivity to the sequential context of spoken language, suggesting that implicit sequence learning to some extent is domain‐general. Taking into consideration this behavioral work, in conjunction with recent imaging studies, we argue that implicit sequence learning and language processing are both complex, dynamic processes that partially share the same underlying neurocognitive mechanisms, specifically those that rely on the encoding and representation of phonological sequences.


Language and Cognitive Processes | 2012

Similar Neural Correlates for Language and Sequential Learning: Evidence from Event-Related Brain Potentials

Morten H. Christiansen; Christopher M. Conway; Luca Onnis

We used event-related potentials (ERPs) to investigate the time course and distribution of brain activity while adults performed (1) a sequential learning task involving complex structured sequences and (2) a language processing task. The same positive ERP deflection, the P600 effect, typically linked to difficult or ungrammatical syntactic processing, was found for structural incongruencies in both sequential learning as well as natural language and with similar topographical distributions. Additionally, a left anterior negativity (LAN) was observed for language but not for sequential learning. These results are interpreted as an indication that the P600 provides an index of violations and the cost of integration of expectations for upcoming material when processing complex sequential structure. We conclude that the same neural mechanisms may be recruited for both syntactic processing of linguistic stimuli and sequential learning of structured sequence patterns more generally.


European Journal of Cognitive Psychology | 2009

Seeing and hearing in space and time: Effects of modality and presentation rate on implicit statistical learning

Christopher M. Conway; Morten H. Christiansen

Across a wide range of tasks, vision appears to process input best when it is spatially rather than temporally distributed, whereas audition is the opposite. Here we explored whether such modality constraints also affect implicit statistical learning in an artificial grammar learning task. Participants were exposed to statistically governed input sequences and then tested on their ability to classify novel items. We explored three types of presentation formats—visual input distributed spatially, visual input distributed temporally, auditory input distributed temporally—and two rates of presentation: moderate (4 elements/second) and fast (8 elements/second). Overall, learning abilities were best for visual-spatial and auditory input. Additionally, at the faster presentation rate, performance declined only for the visual-temporal condition. Finally, auditory learning was mediated by increased sensitivity to the endings of input sequences, whereas vision was most sensitive to the beginnings of sequences. These results suggest that statistical learning for sequential and spatial patterns proceeds differently across the visual and auditory modalities.


Quarterly Journal of Experimental Psychology | 2011

Timing is everything: Changes in presentation rate have opposite effects on auditory and visual implicit statistical learning

Lauren L. Emberson; Christopher M. Conway; Morten H. Christiansen

Implicit statistical learning (ISL) is exclusive to neither a particular sensory modality nor a single domain of processing. Even so, differences in perceptual processing may substantially affect learning across modalities. In three experiments, statistically equivalent auditory and visual familiarizations were presented under different timing conditions that either facilitated or disrupted temporal processing (fast or slow presentation rates). We find an interaction of rate and modality of presentation: At fast rates, auditory ISL was superior to visual. However, at slow presentation rates, the opposite pattern of results was found: Visual ISL was superior to auditory. Thus, we find that changes to presentation rate differentially affect ISL across sensory modalities. Additional experiments confirmed that this modality-specific effect was not due to cross-modal interference or attentional manipulations. These findings suggest that ISL is rooted in modality-specific, perceptually based processes.

Collaboration


Dive into Christopher M. Conway's collaborations.

Top Co-Authors

David B. Pisoni
Indiana University Bloomington

Sonia Singh
Georgia State University

Anne McClure Walk
Eastern Illinois University