Klas Ihme
Leipzig University
Publications
Featured research published by Klas Ihme.
Frontiers in Neuroscience | 2011
Thorsten O. Zander; Moritz Lehne; Klas Ihme; Sabine Jatzev; João Mendonça Correia; Christian Kothe; Bernd Picht; Femke Nijboer
Although it ranks among the oldest tools in neuroscientific research, electroencephalography (EEG) still forms the method of choice in a wide variety of clinical and research applications. In the context of brain–computer interfacing (BCI), EEG recently has become a tool to enhance human–machine interaction. EEG could be employed in a wider range of environments, especially for the use of BCI systems in a clinical context or at the homes of patients. However, the application of EEG in these contexts is impeded by the cumbersome preparation of the electrodes with conductive gel that is necessary to lower the impedance between electrodes and scalp. Dry electrodes could provide a solution to this barrier and allow for EEG applications outside the laboratory. In addition, dry electrodes may reduce the time needed for neurological exams in clinical practice. This study evaluates a prototype of a three-channel dry electrode EEG system, comparing it to state-of-the-art conventional EEG electrodes. Two experimental paradigms were used: first, event-related potentials (ERP) were investigated with a variant of the oddball paradigm. Second, features of the frequency domain were compared by a paradigm inducing occipital alpha. Furthermore, both paradigms were used to evaluate BCI classification accuracies of both EEG systems. Amplitude and temporal structure of ERPs as well as features in the frequency domain did not differ significantly between the EEG systems. BCI classification accuracies were equally high in both systems when the frequency domain was considered. With respect to the oddball classification accuracy, there were slight differences between the wet and dry electrode systems. We conclude that the tested dry electrodes were capable of detecting EEG signals with good quality and that these signals can be used for research or BCI applications. Easy-to-handle electrodes may help to foster the use of EEG among a wider range of potential users.
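The frequency-domain comparison described above amounts to estimating spectral power in the alpha band (roughly 8–12 Hz) for each electrode system. The sketch below shows one common way to compute such a band-power estimate with Welch's method; the sampling rate and the synthetic "wet"/"dry" signals are invented for illustration and are not the authors' data or pipeline.

```python
import numpy as np
from scipy.signal import welch

def alpha_band_power(signal, fs, band=(8.0, 12.0)):
    """Estimate power in the alpha band by integrating Welch's PSD."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return np.trapz(psd[mask], freqs[mask])

# Synthetic eyes-closed EEG: a 10 Hz alpha rhythm plus broadband noise.
fs = 250  # hypothetical sampling rate in Hz
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
wet = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
dry = np.sin(2 * np.pi * 10 * t) + 0.7 * rng.standard_normal(t.size)  # noisier contact

print(alpha_band_power(wet, fs), alpha_band_power(dry, fs))
```

On real recordings, such band-power values would be computed per channel and condition and then compared statistically between the two electrode systems.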
Brain Research | 2013
Klas Ihme; Udo Dannlowski; Vladimir Lichev; Anja Stuhrmann; Dominik Grotegerd; Nicole Rosenberg; Harald Kugel; Walter Heindel; Volker Arolt; Anette Kersting; Thomas Suslow
OBJECTIVE: Alexithymia has been characterized as the inability to identify and describe feelings. Functional imaging studies have revealed that alexithymia is linked to reactivity changes in emotion- and face-processing-relevant brain areas. In this respect, the anterior cingulate cortex (ACC), amygdala, anterior insula and fusiform gyrus (FFG) have been consistently reported. However, it remains to be clarified whether alexithymia is also associated with structural differences. METHODS: Voxel-based morphometry on T1-weighted magnetic resonance images was used to investigate gray matter volume in 17 high alexithymics (HA) and 17 gender-matched low alexithymics (LA), who were selected from a sample of 161 healthy volunteers on the basis of the 20-item Toronto Alexithymia Scale. Data were analyzed as statistical parametric maps for the comparisons LA>HA and HA>LA in a priori determined regions of interest (ROIs), i.e., ACC, amygdala, anterior insula and FFG. Moreover, an exploratory whole-brain analysis was accomplished. RESULTS: For the contrast LA>HA, significant clusters were detected in the ACC, left amygdala and left anterior insula. Additionally, the whole-brain analysis revealed volume differences in the left middle temporal gyrus. No significant differences were found for the comparison HA>LA. CONCLUSION: Our findings suggest that high compared to low alexithymics show less gray matter volume in several emotion-relevant brain areas. These structural differences might contribute to the functional alterations found in previous imaging studies of alexithymia.
Neuroscience | 2012
Uta-Susan Donges; Harald Kugel; Anja Stuhrmann; Dominik Grotegerd; Ronny Redlich; Vladimir Lichev; Nicole Rosenberg; Klas Ihme; Thomas Suslow; Udo Dannlowski
According to social psychology models of adult attachment, a fundamental dimension of attachment is anxiety. Individuals who are high in attachment anxiety are motivated to achieve intimacy in relationships, but are mistrustful of others and their availability. Behavioral research has shown that anxiously attached persons are vigilant for emotional facial expression, but the neural substrates underlying this perceptual sensitivity remain largely unknown. In the present study functional magnetic resonance imaging was used to examine automatic brain reactivity to approach-related facial emotions as a function of attachment anxiety in a sample of 109 healthy adults. Pictures of sad and happy faces were presented masked by neutral faces. The Relationship Scales Questionnaire (RSQ) was used to assess attachment style. Attachment anxiety was correlated with depressivity, trait anxiety, and attachment avoidance. Controlling for these variables, attachment-related anxiety was positively related to responses in left inferior, middle, and medial prefrontal areas, globus pallidus, claustrum, and right cerebellum to masked happy facial expression. Attachment anxiety was not found to be associated with brain activation due to masked sad faces. Our findings suggest that anxiously attached adults are automatically more responsive to positive approach-related facial expression in brain areas that are involved in the perception of facial emotion, facial mimicry, or the assessment of affective value and social distance.
Neuropsychologia | 2014
Klas Ihme; Julia Sacher; Vladimir Lichev; Nicole Rosenberg; Harald Kugel; Michael Rufer; Hans Jörgen Grabe; André Pampel; Jöran Lepsien; Anette Kersting; Arno Villringer; Richard D. Lane; Thomas Suslow
The ability to recognize subtle facial expressions can be valuable in social interaction to infer emotions and intentions of others. Research has shown that the personality trait of alexithymia is linked to difficulties labeling facial expressions, especially when these are presented with temporal constraints. The present study investigates the neural mechanisms underlying this deficit. Fifty young healthy volunteers had to label briefly presented (≤100 ms) emotional (happy, angry, fearful) facial expressions masked by a neutral expression while undergoing functional magnetic resonance imaging (fMRI). A multi-method approach (20-Item Toronto Alexithymia Scale and Toronto Structured Interview for Alexithymia) was administered to assess alexithymic tendencies. Behavioral results point to a global deficit of alexithymic individuals in labeling brief facial expressions. Alexithymia was related to decreased response of the ventral striatum to negative facial expressions. Moreover, alexithymia was associated with lowered activation in frontal, temporal and occipital cortices. Our data suggest that alexithymic individuals have difficulties in creating appropriate representations of the emotional state of other persons under temporal constraints. These deficiencies could lead to problems in labeling other people's facial emotions.
international conference on universal access in human computer interaction | 2013
Janna Protzak; Klas Ihme; Thorsten O. Zander
Tracking eye movements to control technical systems is becoming increasingly popular; the use of eye movements to direct a cursor in human-computer interaction (HCI) is particularly convenient and caters to healthy and disabled users alike. However, it is often difficult to find an appropriate substitute for the click operation, especially within the context of hands-free interaction. The most common approach is the use of dwell-times, but this can lead to the so-called Midas-Touch problem: the system incorrectly interprets fixations due to long processing times or spontaneous dwellings as a user command. The current study explores the event-related potentials (ERPs) that might indicate a user's intention to select. Therefore, electroencephalography (EEG) data were recorded from 10 participants during an interaction with a dwell-time system within a selection process. The aim was to identify EEG potentials related to the intention to interact (i.e., the selection of targets on a screen) and to classify these against EEG potentials unrelated to interaction during random fixations on the screen. As a result, we found a clear negativity over parietal electrodes for the intention of item selection. This negativity did not occur when participants fixated an object without the intention to select (no specific intention). We could robustly classify the underlying brain activity in most of our participants, with an average accuracy of 81%. The presented study provides evidence that the intention to interact evokes EEG activity that can clearly be detected by passive BCI technology. This leads to a new type of implicit interaction that holds the potential to improve human-machine interaction by increasing efficiency and making it more intuitive.
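Classifying "intention" against "no intention" fixation epochs, as described above, is commonly done by feeding windowed mean ERP amplitudes into a linear classifier. A minimal sketch on synthetic epochs follows; the channel count, time window, and effect size are invented for illustration and this is not the study's actual pipeline.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(42)
n_epochs, n_channels, n_times = 100, 8, 50

# Class 0: random fixations; class 1: fixations with intention to select,
# modeled as an added negativity on "parietal" channels in a late window.
X = rng.standard_normal((2 * n_epochs, n_channels, n_times))
y = np.repeat([0, 1], n_epochs)
X[y == 1, 4:, 25:] -= 0.8  # hypothetical parietal negativity

# Feature vector: mean amplitude per channel in the late time window.
features = X[:, :, 25:].mean(axis=2)

clf = LinearDiscriminantAnalysis()
scores = cross_val_score(clf, features, y, cv=5)
print(scores.mean())
```

With a real ERP effect of this kind, a linear classifier on windowed means is a standard baseline before trying spatial filters or regularized variants.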
Journal of Neural Engineering | 2011
Thorsten O. Zander; Klas Ihme; Matti Gärtner; Matthias Rötting
Methods of statistical machine learning have recently proven to be very useful in contemporary brain-computer interface (BCI) research based on the discrimination of electroencephalogram (EEG) patterns. Because of this, many research groups develop new algorithms for both feature extraction and classification. However, until now, no large-scale comparison of these algorithms has been accomplished due to the fact that little EEG data is publicly available. Therefore, we at Team PhyPA recorded 32-channel EEGs, electromyograms and electrooculograms of 36 participants during a simple finger movement task. The data are published on our website www.phypa.org and are freely available for downloading. We encourage BCI researchers to test their algorithms on these data and share their results. This work also presents exemplary benchmarking procedures of common feature extraction methods for slow cortical potentials and event-related desynchronization as well as for classification algorithms based on these features.
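A benchmarking procedure of the kind the paper encourages, i.e. running several classifiers over the same feature set under cross-validation, could be sketched as follows; the band-power features here are synthetic stand-ins for the published data set, and the class shift is an invented effect.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Synthetic stand-in for a motor task data set: log band-power features
# for two classes (e.g., movement vs. rest), with a shift on a few channels.
n_trials, n_features = 120, 16
X = rng.standard_normal((2 * n_trials, n_features))
y = np.repeat([0, 1], n_trials)
X[y == 1, :4] += 1.5  # hypothetical event-related desynchronization effect

results = {}
for name, clf in [("LDA", LinearDiscriminantAnalysis()),
                  ("LogReg", LogisticRegression(max_iter=1000))]:
    results[name] = cross_val_score(clf, X, y, cv=10).mean()
    print(f"{name}: {results[name]:.2f}")
```

Reporting cross-validated accuracy per algorithm on a shared, fixed data split is what makes results from different groups comparable on a public data set.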
Social Cognitive and Affective Neuroscience | 2015
Vladimir Lichev; Julia Sacher; Klas Ihme; Nicole Rosenberg; Markus Quirin; Jöran Lepsien; André Pampel; Michael Rufer; Hans Jörgen Grabe; Harald Kugel; Anette Kersting; Arno Villringer; Richard D. Lane; Thomas Suslow
It is unclear whether reflective awareness of emotions is related to extent and intensity of implicit affective reactions. This study is the first to investigate automatic brain reactivity to emotional stimuli as a function of trait emotional awareness. To assess emotional awareness the Levels of Emotional Awareness Scale (LEAS) was administered. During scanning, masked happy, angry, fearful and neutral facial expressions were presented to 46 healthy subjects, who had to rate the fit between artificial and emotional words. The rating procedure allowed assessment of shifts in implicit affectivity due to emotion faces. Trait emotional awareness was associated with increased activation in the primary somatosensory cortex, inferior parietal lobule, anterior cingulate gyrus, middle frontal and cerebellar areas, thalamus, putamen and amygdala in response to masked happy faces. LEAS correlated positively with shifts in implicit affect caused by masked happy faces. According to our findings, people with high emotional awareness show stronger affective reactivity and more activation in brain areas involved in emotion processing and simulation during the perception of masked happy facial expression than people with low emotional awareness. High emotional awareness appears to be characterized by an enhanced positive affective resonance to others at an automatic processing level.
BMC Neuroscience | 2014
Klas Ihme; Julia Sacher; Vladimir Lichev; Nicole Rosenberg; Harald Kugel; Michael Rufer; Hans-Jörgen Grabe; André Pampel; Jöran Lepsien; Anette Kersting; Arno Villringer; Thomas Suslow
Background: Alexithymia is a personality trait that is characterized by difficulties in identifying and describing feelings. Previous studies have shown that alexithymia is related to problems in recognizing others’ emotional facial expressions when these are presented with temporal constraints. These problems can be less severe when the expressions are visible for a relatively long time. Because the neural correlates of these recognition deficits are still relatively unexplored, we investigated the labeling of facial emotions and brain responses to facial emotions as a function of alexithymia. Results: Forty-eight healthy participants had to label the emotional expression (angry, fearful, happy, or neutral) of faces presented for 1 or 3 seconds in a forced-choice format while undergoing functional magnetic resonance imaging. The participants’ level of alexithymia was assessed using self-report and interview. In light of the previous findings, we focused our analysis on the alexithymia component of difficulties in describing feelings. Difficulties describing feelings, as assessed by the interview, were associated with increased reaction times for negative (i.e., angry and fearful) faces, but not with labeling accuracy. Moreover, individuals with higher alexithymia showed increased brain activation in the somatosensory cortex and supplementary motor area (SMA) in response to angry and fearful faces. These cortical areas are known to be involved in the simulation of the bodily (motor and somatosensory) components of facial emotions. Conclusion: The present data indicate that alexithymic individuals may use information related to bodily actions rather than affective states to understand the facial expressions of other persons.
Comprehensive Psychiatry | 2014
Vladimir Lichev; Michael Rufer; Nicole Rosenberg; Klas Ihme; Hans-Jörgen Grabe; Harald Kugel; Uta-Susan Donges; Anette Kersting; Thomas Suslow
The aim of this study was to evaluate psychometric properties and relations between two different methods of measuring alexithymia and one measure of emotional awareness in a German non-clinical sample. The 20-Item Toronto Alexithymia Scale (TAS-20), the Toronto Structured Interview for Alexithymia (TSIA), and the Levels of Emotional Awareness Scale (LEAS), which is a performance-based measure of emotional awareness, were administered to 84 university students. Both internal reliability and inter-rater reliability for the TSIA were acceptable. Results from exploratory factor analysis (EFA) based on all measures supported a three-factor solution previously obtained in an American sample using multiple methods of alexithymia and emotional ability measurement. In our three-factor model, direct self (TAS-20), direct other (TSIA), and indirect self (LEAS) measures were differentiated. The convergent validity of the TSIA was supported by a significant correlation with the LEAS. Our findings suggest that future research on alexithymia and emotional awareness can benefit from the use of a multi-method approach and should include objective measures.
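Internal reliability of the kind reported for the TSIA is typically quantified with Cronbach's alpha. A small sketch of that computation on simulated item scores follows; the data and effect sizes are invented, and this is not the study's actual analysis.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical scores: 200 respondents, 4 items driven by a shared trait.
rng = np.random.default_rng(3)
trait = rng.standard_normal((200, 1))
items = trait + 0.5 * rng.standard_normal((200, 4))
print(round(cronbach_alpha(items), 2))
```

Because the simulated items share a strong common factor, the resulting alpha lands well above the conventional 0.7 acceptability threshold.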
Scandinavian Journal of Psychology | 2015
Thomas Suslow; Klas Ihme; Markus Quirin; Vladimir Lichev; Nicole Rosenberg; Jochen Bauer; Luise Bomberg; Anette Kersting; Karl-Titus Hoffmann; Donald Lobsien
Previous research has revealed affect-congruity effects for the recognition of affects from faces. Little is known about the impact of affect on the perception of body language. The aim of the present study was to investigate the relationship of implicit (versus explicit) affectivity with the recognition of briefly presented affective body expressions. Implicit affectivity, which can be measured using indirect assessment methods, has been found to be more predictive of spontaneous physiological reactions than explicit (self-reported) affect. Thirty-four healthy women had to label the expression of body postures (angry, fearful, happy, or neutral) presented for 66 ms and masked by a neutral body posture in a forced-choice format while undergoing functional magnetic resonance imaging (fMRI). Participants' implicit affectivity was assessed using the Implicit Positive and Negative Affect Test. Measures of explicit state and trait affectivity were also administered. Analysis of the fMRI data was focused on a subcortical network involved in the rapid perception of affective body expressions. Only implicit negative affect (but not explicit affect) was correlated with correct labeling performance for angry body postures. As expected, implicit negative affect was positively associated with activation of the subcortical network in response to fearful and angry expressions (compared to neutral expressions). Responses of the caudate nucleus to affective body expressions were especially associated with their recognition. It appears that processes of rapid recognition of affects from body postures could be facilitated by an individual's implicit negative affect.