Publication


Featured research published by Colin Humphries.


Psychophysiology | 2000

Removing electroencephalographic artifacts by blind source separation

Tzyy-Ping Jung; Scott Makeig; Colin Humphries; Te-Won Lee; Martin J. McKeown; Vicente J. Iragui; Terrence J. Sejnowski

Eye movements, eye blinks, cardiac signals, muscle noise, and line noise present serious problems for electroencephalographic (EEG) interpretation and analysis when rejecting contaminated EEG segments results in an unacceptable data loss. Many methods have been proposed to remove artifacts from EEG recordings, especially those arising from eye movements and blinks. Often regression in the time or frequency domain is performed on parallel EEG and electrooculographic (EOG) recordings to derive parameters characterizing the appearance and spread of EOG artifacts in the EEG channels. Because EEG and ocular activity mix bidirectionally, regressing out eye artifacts inevitably involves subtracting relevant EEG signals from each record as well. Regression methods become even more problematic when a good regressing channel is not available for each artifact source, as in the case of muscle artifacts. Use of principal component analysis (PCA) has been proposed to remove eye artifacts from multichannel EEG. However, PCA cannot completely separate eye artifacts from brain signals, especially when they have comparable amplitudes. Here, we propose a new and generally applicable method for removing a wide variety of artifacts from EEG records based on blind source separation by independent component analysis (ICA). Our results on EEG data collected from normal and autistic subjects show that ICA can effectively detect, separate, and remove contamination from a wide variety of artifactual sources in EEG records with results comparing favorably with those obtained using regression and PCA methods. ICA can also be used to analyze blink-related brain activity.
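
The core of the approach is to decompose the multichannel EEG into statistically independent components, identify the components that capture artifacts, and reproject the remaining components back to the electrodes. Below is a minimal sketch of that workflow, using scikit-learn's FastICA as a stand-in for the extended Infomax ICA used in the paper; the array layout and the artifact component indices are illustrative assumptions.

```python
# Minimal sketch of ICA-based artifact removal (not the authors' exact pipeline).
# Assumes `eeg` is a channels x samples NumPy array; scikit-learn's FastICA is
# used as a stand-in for the extended Infomax ICA applied in the paper.
import numpy as np
from sklearn.decomposition import FastICA


def remove_artifact_components(eeg, artifact_idx):
    """Unmix the EEG into independent components, zero the components judged
    artifactual (e.g., blink or line-noise components), and reproject."""
    ica = FastICA(n_components=eeg.shape[0], random_state=0)
    sources = ica.fit_transform(eeg.T)           # samples x components
    sources[:, artifact_idx] = 0.0               # suppress artifact components
    return ica.inverse_transform(sources).T      # back to channels x samples


# Hypothetical usage: drop components 0 and 3 after visual inspection
# cleaned = remove_artifact_components(eeg, artifact_idx=[0, 3])
```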


Journal of Cognitive Neuroscience | 2006

Syntactic and Semantic Modulation of Neural Activity during Auditory Sentence Comprehension

Colin Humphries; Jeffrey R. Binder; David A. Medler; Einat Liebenthal

In previous functional neuroimaging studies, left anterior temporal and temporal-parietal areas responded more strongly to sentences than to randomly ordered lists of words. The smaller response for word lists could be explained by either (1) less activation of syntactic processes due to the absence of syntactic structure in the random word lists or (2) less activation of semantic processes resulting from failure to combine the content words into a global meaning. To test these two explanations, we conducted a functional magnetic resonance imaging study in which word order and combinatorial word meaning were independently manipulated during auditory comprehension. Subjects heard six different stimuli: normal sentences, semantically incongruent sentences in which content words were randomly replaced with other content words, pseudoword sentences, and versions of these three sentence types in which word order was randomized to remove syntactic structure. Effects of syntactic structure (greater activation to sentences than to word lists) were observed in the left anterior superior temporal sulcus and left angular gyrus. Semantic effects (greater activation to semantically congruent stimuli than either incongruent or pseudoword stimuli) were seen in widespread, bilateral temporal lobe areas and the angular gyrus. Of the two regions that responded to syntactic structure, the angular gyrus showed a greater response to semantic structure, suggesting that reduced activation for word lists in this area is related to a disruption in semantic processing. The anterior temporal lobe, on the other hand, was relatively insensitive to manipulations of semantic structure, suggesting that syntactic information plays a greater role in driving activation in this area.


Human Brain Mapping | 2005

Response of anterior temporal cortex to syntactic and prosodic manipulations during sentence processing.

Colin Humphries; Tracy Love; David Swinney; Gregory Hickok

Previous research has implicated a portion of the anterior temporal cortex in sentence-level processing. This region activates more to sentences than to word lists, sentences in an unfamiliar language, and environmental sound sequences. The current study sought to identify the relative contributions of syntactic and prosodic processing to anterior temporal activation. We presented auditory stimuli in which the presence of prosodic and syntactic structure was independently manipulated during functional magnetic resonance imaging (fMRI). Three "structural" conditions included normal sentences, sentences with scrambled word order, and lists of content words. These three classes of stimuli were presented either with sentence prosody or with flat supra-lexical (list-like) prosody. Sentence stimuli activated a portion of the left anterior temporal cortex in the superior temporal sulcus (STS), extending into the middle temporal gyrus, independent of prosody and to a greater extent than any of the other conditions. An interaction between the structural and prosodic conditions was seen in a more dorsal region of the anterior temporal lobe bilaterally along the superior temporal gyrus (STG). A post-hoc analysis revealed that this region responded either to syntactically structured stimuli or to nonstructured stimuli with sentence-like prosody. The results suggest a parcellation of anterior temporal cortex into 1) an STG region that is both sensitive to the presence of syntactic information and modulated by prosodic manipulations (in nonsyntactic stimuli); and 2) a more inferior left STS/MTG region that is more selective for syntactic structure.


Neuroreport | 2001

Role of anterior temporal cortex in auditory sentence comprehension: an fMRI study.

Colin Humphries; Kimberley Willard; Bradley R. Buchsbaum; Gregory Hickok

Recent neuropsychological and functional imaging evidence has suggested a role for anterior temporal cortex in sentence-level comprehension. We explored this hypothesis using event-related fMRI. Subjects were scanned while they listened to either a sequence of environmental sounds describing an event or a corresponding sentence matched as closely as possible in meaning. Both types of stimuli required subjects to integrate auditory information over time to derive a similar meaning, but differed in the processing mechanisms leading to the integration of that information, with speech input requiring syntactic mechanisms and environmental sounds utilizing non-linguistic mechanisms. Consistent with recent claims, sentences produced greater activation than environmental sounds in anterior superior temporal lobe bilaterally. A similar speech > sound activation pattern was also noted in posterior superior temporal regions on the left. Environmental sounds produced greater activation than sentences in right inferior frontal gyrus. The results provide support for the view that anterior temporal cortex plays an important role in sentence-level comprehension.


Cerebral Cortex | 2010

Neural Systems for Reading Aloud: A Multiparametric Approach

William W. Graves; Rutvik H. Desai; Colin Humphries; Mark S. Seidenberg; Jeffrey R. Binder

Reading aloud involves computing the sound of a word from its visual form. This may be accomplished 1) by direct associations between spellings and phonology and 2) by computation from orthography to meaning to phonology. These components have been studied in behavioral experiments examining lexical properties such as word frequency; length in letters or phonemes; spelling–sound consistency; semantic factors such as imageability; measures of orthographic or phonological complexity; and others. Effects of these lexical properties on specific neural systems, however, are poorly understood, partially because high intercorrelations among lexical factors make it difficult to determine whether they have independent effects. We addressed this problem by decorrelating several important lexical properties through careful stimulus selection. Functional magnetic resonance imaging data revealed distributed neural systems for mapping orthography directly to phonology, involving left supramarginal, posterior middle temporal, and fusiform gyri. Distinct from these were areas reflecting semantic processing, including left middle temporal gyrus/inferior temporal sulcus, bilateral angular gyrus, and precuneus/posterior cingulate. Left inferior frontal regions generally showed increased activation with greater task load, suggesting a more general role in attention, working memory, and executive processes. These data offer the first clear evidence, in a single study, for the separate neural correlates of orthography–phonology mapping and semantic access during reading aloud.
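
The key methodological move here is selecting stimuli so that the lexical properties of interest are no longer strongly intercorrelated. The sketch below illustrates the general idea only, not the authors' actual selection procedure: a greedy search that repeatedly drops the candidate item whose removal most reduces the largest absolute pairwise correlation among the property columns. The `props` layout is an assumption for illustration.

```python
# Hedged sketch of decorrelating lexical properties by stimulus selection.
# Illustrative greedy procedure, not the selection method used in the study.
# `props` is an items x properties array (e.g., frequency, length in letters,
# spelling-sound consistency, imageability).
import numpy as np


def max_abs_corr(props, idx):
    """Largest absolute off-diagonal correlation among property columns."""
    c = np.corrcoef(props[idx].T)
    np.fill_diagonal(c, 0.0)
    return np.max(np.abs(c))


def select_decorrelated(props, n_keep):
    """Greedily drop items until only n_keep remain, minimizing at each step
    the worst pairwise correlation among the retained items' properties."""
    keep = list(range(props.shape[0]))
    while len(keep) > n_keep:
        drop = min(keep, key=lambda i: max_abs_corr(props, [j for j in keep if j != i]))
        keep.remove(drop)
    return keep
```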


NeuroImage | 2010

Tonotopic organization of human auditory cortex

Colin Humphries; Einat Liebenthal; Jeffrey R. Binder

The organization of tonotopic fields in human auditory cortex was investigated using functional magnetic resonance imaging. Subjects were presented with stochastically alternating multi-tone sequences in six different frequency bands, centered at 200, 400, 800, 1600, 3200, and 6400 Hz. Two mirror-symmetric frequency gradients were found extending along an anterior-posterior axis from a zone on the lateral aspect of Heschl's gyrus (HG), which responds preferentially to lower frequencies, toward zones posterior and anterior to HG that are sensitive to higher frequencies. The orientation of these two principal gradients is thus roughly perpendicular to HG, rather than parallel as previously assumed. A third, smaller gradient was observed in the lateral posterior aspect of the superior temporal gyrus. The results suggest close homologies between the tonotopic organization of human and nonhuman primate auditory cortex.
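
For a concrete picture of the stimuli, the sketch below generates a single stochastic multi-tone burst for one band center. The duration, tone count, and one-octave bandwidth are illustrative assumptions rather than the study's actual parameters.

```python
# Hedged sketch of a multi-tone burst for one frequency band, in the spirit of
# the stimuli described above. All parameters are illustrative, not the study's.
import numpy as np


def multi_tone_burst(center_hz, n_tones=8, dur_s=0.5, fs=44100, seed=0):
    """Sum of random pure tones drawn from a one-octave band around center_hz."""
    rng = np.random.default_rng(seed)
    t = np.arange(int(dur_s * fs)) / fs
    freqs = rng.uniform(center_hz / np.sqrt(2), center_hz * np.sqrt(2), n_tones)
    phases = rng.uniform(0.0, 2.0 * np.pi, n_tones)
    burst = np.sin(2.0 * np.pi * freqs[:, None] * t + phases[:, None]).sum(axis=0)
    return burst / np.abs(burst).max()


# The six band centers used in the study:
# for f in (200, 400, 800, 1600, 3200, 6400):
#     stimulus = multi_tone_burst(f)
```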


NeuroImage | 2007

Time course of semantic processes during sentence comprehension: an fMRI study.

Colin Humphries; Jeffrey R. Binder; David A. Medler; Einat Liebenthal

The ability to create new meanings from combinations of words is one important function of the language system. We investigated the neural correlates of combinatorial semantic processing using fMRI. During scanning, participants performed a rating task on auditory word or pseudoword strings that differed in the presence of combinatorial and word-level semantic information. Stimuli included normal sentences composed of thematically related words that could be readily combined to produce a more complex meaning, semantically incongruent sentences in which content words were randomly replaced with other content words, pseudoword sentences, and versions of these three sentence types in which syntactic structure was removed by randomly re-ordering the words. Several regions showed greater BOLD signal for stimuli with words than for those with pseudowords, including the left angular gyrus, left superior temporal sulcus, and left inferior frontal gyrus, suggesting that these areas are involved in semantic access at the single word level. In the angular and inferior frontal gyri these differences emerged early in the course of the hemodynamic response. An effect of combinatorial semantic structure was observed in the left angular gyrus and left lateral temporal lobe, which showed greater activation for normal compared to semantically incongruent sentences. These effects appeared later in the time course of the hemodynamic response, beginning after the entire stimulus had been presented. The data indicate a complex spatiotemporal pattern of activity associated with computation of word and sentence-level semantic information, and suggest a particular role for the left angular gyrus in processing overall sentence meaning.


Neural Networks for Signal Processing VIII. Proceedings of the 1998 IEEE Signal Processing Society Workshop (Cat. No.98TH8378) | 1998

Removing electroencephalographic artifacts: comparison between ICA and PCA

Tzyy-Ping Jung; Colin Humphries; Te-Won Lee; Scott Makeig; Martin J. McKeown; Vicente J. Iragui; Terrence J. Sejnowski

Pervasive electroencephalographic (EEG) artifacts associated with blinks, eye movements, muscle noise, cardiac signals, and line noise pose a major challenge for EEG interpretation and analysis. Here, we propose a generally applicable method for removing a wide variety of artifacts from EEG records based on an extended version of the independent component analysis (ICA) algorithm for performing blind source separation on linear mixtures of independent source signals. Our results show that ICA can effectively separate and remove contamination from a wide variety of artifact sources in EEG records, with results comparing favorably to those obtained using principal component analysis (PCA).
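
A toy comparison makes the paper's central contrast concrete: ICA recovers independent sources from a linear mixture (up to order and scale), while PCA only finds orthogonal directions of maximal variance, so its components generally remain mixtures. The snippet below is a minimal synthetic illustration under those assumptions, not the paper's EEG analysis.

```python
# Toy ICA-vs-PCA comparison on a synthetic linear mixture of independent
# sources; an illustration of the contrast discussed above, not an EEG analysis.
import numpy as np
from sklearn.decomposition import FastICA, PCA

rng = np.random.default_rng(0)
t = np.linspace(0.0, 8.0, 2000)
sources = np.c_[np.sin(3.0 * t),             # rhythmic "brain" signal
                np.sign(np.sin(0.5 * t)),    # slow square-wave "artifact"
                rng.laplace(size=t.size)]    # noisy "muscle-like" artifact
mixed = sources @ rng.normal(size=(3, 3)).T  # simulated mixing at the sensors

ica_sources = FastICA(n_components=3, random_state=0).fit_transform(mixed)
pca_sources = PCA(n_components=3).fit_transform(mixed)
# Correlating each estimate with the true sources shows ICA aligning closely
# with individual sources, whereas PCA components mix them together.
```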


Journal of Sleep Research | 1998

A new method for detecting state changes in the EEG: exploratory application to sleep data

Martin J. McKeown; Colin Humphries; Peter Achermann; Alexander A. Borbély; Terrence J. Sejnowski

A new statistical method is described for detecting state changes in the electroencephalogram (EEG), based on the ongoing relationships between electrode voltages at different scalp locations. An EEG sleep recording from one NREM-REM sleep cycle from a healthy subject was used for exploratory analysis. A dimensionless function defined at discrete times t_i, u(t_i), was calculated by determining the log-likelihood of observing all scalp electrode voltages under the assumption that the data can be modeled by linear combinations of stationary relationships between derivations. The u(t_i), calculated by using independent component analysis, provided a sensitive, but non-specific, measure of changes in the global pattern of the EEG. In stage 2, abrupt increases in u(t_i) corresponded to sleep spindles. In stages 3 and 4, low frequency (≈ 0.6 Hz) oscillations occurred in u(t_i), which may correspond to slow oscillations described in cellular recordings and the EEG of sleeping cats. In stage 4 sleep, additional irregular very low frequency (≈ 0.05–0.2 Hz) oscillations were observed in u(t_i), consistent with possible cyclic changes in cerebral blood flow or changes of vigilance and muscle tone. These preliminary results suggest that the new method can detect subtle changes in the overall pattern of the EEG without the necessity of making tenuous assumptions about stationarity.
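
The general idea behind u(t_i) can be sketched as follows: fit a stationary unmixing model (here, ICA) on a reference segment, then score every time point by the negative log-likelihood of the observed channel vector under that fixed model. The logistic source density and the use of scikit-learn's FastICA below are assumptions for illustration, not the paper's exact definition.

```python
# Hedged sketch of a u(t_i)-style model-mismatch index (assumptions: logistic
# source density, scikit-learn FastICA in place of the original ICA algorithm).
import numpy as np
from sklearn.decomposition import FastICA


def mismatch_index(eeg_ref, eeg_all):
    """eeg_ref, eeg_all: channels x samples arrays. Returns one value per time
    point of eeg_all; higher values mean a poorer fit to the stationary model."""
    ica = FastICA(n_components=eeg_ref.shape[0], random_state=0).fit(eeg_ref.T)
    W = ica.components_                               # unmixing matrix
    s = W @ (eeg_all - ica.mean_[:, None])            # estimated sources
    _, logdet = np.linalg.slogdet(W)
    a = np.abs(s)
    log_density = -a - 2.0 * np.log1p(np.exp(-a))     # logistic log-density
    loglik = logdet + log_density.sum(axis=0)         # log p(x_t) per sample
    return -loglik                                    # dimensionless index
```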


Cerebral Cortex | 2013

The Role of Left Occipitotemporal Cortex in Reading: Reconciling Stimulus, Task, and Lexicality Effects

Quintino R. Mano; Colin Humphries; Rutvik H. Desai; Mark S. Seidenberg; David C. Osmon; Ben C. Stengel; Jeffrey R. Binder

Although the left posterior occipitotemporal sulcus (pOTS) has been called a visual word form area, debate persists over the selectivity of this region for reading relative to general nonorthographic visual object processing. We used high-resolution functional magnetic resonance imaging to study left pOTS responses to combinatorial orthographic and object shape information. Participants performed naming and visual discrimination tasks designed to encourage or suppress phonological encoding. During the naming task, all participants showed subregions within left pOTS that were more sensitive to combinatorial orthographic information than to object information. This difference disappeared, however, when phonological processing demands were removed. Responses were stronger to pseudowords than to words, but this effect also disappeared when phonological processing demands were removed. Subregions within the left pOTS are preferentially activated when visual input must be mapped to a phonological representation (i.e., a name) and particularly when component parts of the visual input must be mapped to corresponding phonological elements (consonant or vowel phonemes). Results indicate a specialized role for subregions within the left pOTS in the isomorphic mapping of familiar combinatorial visual patterns to phonological forms. This process distinguishes reading from picture naming and accounts for a wide range of previously reported stimulus and task effects in left pOTS.

Collaboration


Dive into Colin Humphries's collaborations.

Top Co-Authors

Jeffrey R. Binder, Medical College of Wisconsin
Einat Liebenthal, Medical College of Wisconsin
Mark S. Seidenberg, University of Wisconsin-Madison
Rutvik H. Desai, University of South Carolina
Gregory Hickok, University of California
Leonardo Fernandino, Medical College of Wisconsin
Lisa L. Conant, Medical College of Wisconsin
Merav Sabri, Medical College of Wisconsin
William L. Gross, Medical College of Wisconsin