Veikko Jousmäki
Helsinki University of Technology
Publications
Featured research published by Veikko Jousmäki.
NeuroImage | 1997
Alfons Schnitzler; Stephan Salenius; Riitta Salmelin; Veikko Jousmäki; Riitta Hari
Functional brain imaging studies have indicated that several cortical and subcortical areas active during actual motor performance are also active during imagination or mental rehearsal of movements. Recent evidence shows that the primary motor cortex may also be involved in motor imagery. Using whole-scalp magnetoencephalography, we monitored spontaneous and evoked activity of the somatomotor cortex after right median nerve stimuli in seven healthy right-handed subjects while they kinesthetically imagined or actually executed continuous finger movements. Manipulatory finger movements abolished the poststimulus 20-Hz activity of the motor cortex and markedly affected the somatosensory evoked response. Imagination of manipulatory finger movements attenuated the 20-Hz activity by 27% with respect to the rest level but had no effect on the somatosensory response. Slight constant stretching of the fingers suppressed the 20-Hz activity less than motor imagery. The smallest possible, kinesthetically just perceivable finger movements resulted in slightly stronger attenuation of 20-Hz activity than motor imagery did. The effects were observed in both hemispheres but predominantly contralateral to the performing hand. The attempt to execute manipulatory finger movements under experimentally induced ischemia causing paralysis of the hand also strongly suppressed 20-Hz activity but did not affect the somatosensory evoked response. The results indicate that the primary motor cortex is involved in motor imagery. Both imaginative and executive motor tasks appear to utilize the cortical circuitry generating the somatomotor 20-Hz signal.
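The 27% figure above is an attenuation of band-limited amplitude relative to a rest baseline. Purely as an illustration, the minimal Python sketch below shows one way such a measure could be computed from a single MEG channel; the 15–25 Hz passband, the variable names, and the synthetic data are assumptions for the example, not details taken from the study.

```python
# Illustrative sketch: task-related attenuation of ~20-Hz somatomotor activity,
# expressed as a percentage of the rest-level band amplitude.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def band_amplitude(signal, fs, band=(15.0, 25.0)):
    """Mean Hilbert-envelope amplitude of `signal` within `band` (Hz)."""
    b, a = butter(4, band, btype="bandpass", fs=fs)
    filtered = filtfilt(b, a, signal)
    return np.abs(hilbert(filtered)).mean()

def percent_attenuation(rest, task, fs):
    """Attenuation of band-limited amplitude during the task, in % of rest."""
    a_rest = band_amplitude(rest, fs)
    a_task = band_amplitude(task, fs)
    return 100.0 * (a_rest - a_task) / a_rest

# Purely synthetic demonstration data (not from the study):
fs = 1000.0
t = np.arange(0, 10.0, 1.0 / fs)
rng = np.random.default_rng(0)
rest = np.sin(2 * np.pi * 20.0 * t) + 0.1 * rng.standard_normal(t.size)
imagery = 0.73 * np.sin(2 * np.pi * 20.0 * t) + 0.1 * rng.standard_normal(t.size)
print(f"20-Hz attenuation ~ {percent_attenuation(rest, imagery, fs):.0f} %")
```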
Human Brain Mapping | 2000
Gabriel Curio; Georg Neuloh; Jussi Numminen; Veikko Jousmäki; Riitta Hari
The voice we most often hear is our own, and proper interaction between speaking and hearing is essential for both acquisition and performance of spoken language. Disturbed audiovocal interactions have been implicated in aphasia, stuttering, and schizophrenic voice hallucinations, but paradigms for a noninvasive assessment of auditory self-monitoring of speaking and its possible dysfunctions are rare. Using magnetoencephalography we show here that self-uttered syllables transiently activate the speaker's auditory cortex around 100 ms after voice onset. These phasic responses were delayed by 11 ms in the speech-dominant left hemisphere relative to the right, whereas during listening to a replay of the same utterances the response latencies were symmetric. Moreover, the auditory cortices did not react to rare vowel changes interspersed randomly within a series of repetitively spoken vowels, in contrast to regular change-related responses evoked 100–200 ms after replayed rare vowels. Thus, speaking primes the human auditory cortex at a millisecond time scale, dampening and delaying reactions to self-produced "expected" sounds, more prominently in the speech-dominant hemisphere. Such motor-to-sensory priming of early auditory cortex responses during voicing constitutes one element of speech self-monitoring that could be compromised in central speech disorders.
Electroencephalography and Clinical Neurophysiology | 1997
François Mauguière; I Merlet; Nina Forss; Simo Vanni; Veikko Jousmäki; P Adeleine; Riitta Hari
Cortical areas responsive to somatosensory inputs were assessed by recording somatosensory evoked magnetic fields (SEFs) to electrical stimulation of the left median nerve at the wrist, using a 122-SQUID neuromagnetometer under various conditions of stimulus rate, attentional demand and detection task. Source modelling combined with magnetic resonance imaging (MRI) allowed localisation of six SEF sources on the outer aspect of the hemispheres, located respectively: (1) in the posterior bank of the rolandic fissure (area SI), the upper bank of the sylvian fissure (parietal opercular area SII) and the banks of the intraparietal fissure contralateral to stimulation; (2) in the SII area ipsilateral to stimulation; and (3) in the mid-frontal or inferior frontal gyri of both sides. All source areas were simultaneously active at 70–140 ms after the stimulus; the SI source was the only one already active at 20–60 ms. The observed activation timing suggests that somatosensory input is relayed from SI to higher-order areas through serial feedforward projections. However, the long-lasting activations of all sources and their overlap in time are also compatible with top-down control mediated via backward projections.
Current Biology | 1998
Veikko Jousmäki; Riitta Hari
Our brains continuously bind information obtained through many sensory channels to form solid percepts of objects and events. Usually these pieces of information complement and confirm each other, thereby improving the reliability of our perception [1]. But incongruities between the sensory inputs may result in unexpected percepts due to intersensory interactions, as in the well-known audiovisual McGurk illusion [2].

Audiotactile interactions have remained largely unexplored [1], although Paul von Schiller [3] noted in 1932 that sounds (noise bursts or tones repeated at regular intervals) may affect tactile perception of roughness. We describe here a novel audiotactile interaction, the 'parchment-skin illusion', which demonstrates that sounds exactly synchronous with hand-rubbing may strongly modify the resulting tactile sensations.

The subjects were seated with their forearms supported on their thighs. A microphone close to the hands recorded the sounds produced when the subjects rubbed their palms together in a back-and-forth motion at 1–2 cycles per second. The sounds were played back to the subject through headphones. This audio feedback was either identical to the original sound or modified so that the high frequencies (above 2 kHz) were either dampened or accentuated by 15 decibels (dB). In addition, the maximum sound intensity, which was adjusted to a comfortable listening level, was attenuated by either 20 or 40 dB, resulting in a randomized experiment with a 3 × 3 block design (Table 1).

[Table 1. Tactile sensation of relative skin roughness as a function of the quality of the auditory feedback; mean ± SEM of 11 subjects. The original estimates of skin roughness were given on a scale from 0 (rough or moist) to 10 (smooth or dry), but as the individual ranges varied from 3.5 to 10, the values were normalized to the range of each individual before averaging. The individual values were means of two repeated tests.]

During the pilot sessions several subjects spontaneously reported that the enhanced high-frequency feedback made the palmar skin feel drier, almost resembling parchment paper; this effect was found in 13 out of 17 healthy adults tested. Moisture on the palmar skin typically prevented the phenomenon.

Eleven subjects (six males, five females; age range 25–49 years) who reported the phenomenon consistently were asked to quantify the tactile sensations on their palms during the varying audio-feedback conditions on a scale of 0 to 10, referring to a range from rough/moist to smooth/dry. The audio feedback had a very clear effect on the tactile sensation, as is evident from Table 1. When either the proportion of high frequencies or the average sound level of the auditory feedback increased, the skin started to feel more paper-like; that is, the perceived roughness/moisture of the palmar skin decreased and the smoothness/dryness increased. The effects of both the high-frequency content and the average intensity of the feedback were statistically highly significant (by analysis of variance).

Tactile sensitivity, tested with von Frey hairs in two subjects while they performed the task with audio feedback, was not modified during the illusion, as compared with a no-feedback condition with the hands at rest. An additional experiment with two experienced subjects showed that delaying the audio feedback by more than 100 milliseconds clearly diminished the illusion. Efficient binding of multisensory inputs evidently requires accurate temporal coincidence, or a temporal window for multisensory integration (as discussed in [4]), which naturally occurs when the subjects hear the sounds produced by their own hand movements.

We hypothesize that the parchment-skin illusion reflects an omnipresent intersensory integration phenomenon, which helps the subject to make accurate tactile decisions about the roughness and stiffness of the different textures they manipulate.

References: [1] Stein B, Meredith M. [2] McGurk H, MacDonald J: Hearing lips and seeing voices. Nature 1976; 264:746–748. [3] von Schiller P: Die Rauhigkeit als intermodale Erscheinung. Z Psychol 1932; 127:265–289. [4] Wallace MT, Wilkinson LK, Stein BE: Representation and integration of multiple sensory inputs in primate superior colliculus. J Neurophysiol 1996; 76:1246–1266.
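Table 1 notes that the roughness estimates were normalized to each individual's range before averaging across subjects. As a non-authoritative sketch of that step, assuming the ratings are collected into a hypothetical subjects-by-conditions array:

```python
# Illustrative sketch of per-subject range normalization before group averaging.
# `ratings` is a hypothetical array of shape (n_subjects, n_conditions): each
# row holds one subject's roughness/dryness estimates (0-10 scale) for the nine
# feedback conditions, already averaged over the two repeated tests.
import numpy as np

def normalize_to_individual_range(ratings):
    """Rescale each subject's ratings to span 0-1 over that subject's own range."""
    lo = ratings.min(axis=1, keepdims=True)
    hi = ratings.max(axis=1, keepdims=True)
    return (ratings - lo) / (hi - lo)

def group_mean_and_sem(ratings):
    """Across-subject mean and standard error per condition, after normalization."""
    norm = normalize_to_individual_range(ratings)
    mean = norm.mean(axis=0)
    sem = norm.std(axis=0, ddof=1) / np.sqrt(norm.shape[0])
    return mean, sem
```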
NeuroImage | 1997
Stephan Salenius; Alfons Schnitzler; Riitta Salmelin; Veikko Jousmäki; Riitta Hari
We studied modulation of cortical neuromagnetic rhythms in association with left and right median nerve stimulation, during rest, finger movements, and passive tactile hand stimulation, in seven healthy, right-handed adults. In the rest condition, the amplitude of the rhythmic sensorimotor activity decreased immediately after the median nerve stimuli and increased above the prestimulus level within 0.4 s afterward, especially in the 7- to 25-Hz band. The rebound occurred 100-300 ms earlier for 20-Hz (15-25 Hz) than for 10-Hz (7-15 Hz) activity. Suppressions and rebounds were strongest in the contralateral sensorimotor hand area for the 20-Hz, but not for the 10-Hz, activity. The maximum rebound was on average 22-34% stronger in the left than in the right hemisphere. Active exploration of objects abolished rebounds of both 10- and 20-Hz signals in the contralateral hemisphere and markedly diminished them ipsilaterally. Finger movements without touching an object and passive tactile stimulation produced weaker effects. The sensorimotor rhythms thus show a characteristic suppression and subsequent rebound after electrical median nerve stimulation. The rebound is left-hemisphere dominant in right-handed subjects, and its suppression reveals bilateral cortical activation during both motor tasks and passive tactile stimulation, especially for explorative finger movements.
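The suppressions and rebounds described above are level changes of a band-limited rhythm relative to its prestimulus baseline. Below is a minimal, illustrative sketch of one way such a time course could be computed from stimulus-locked epochs; the band limits, array names, and processing steps are assumptions standing in for the authors' actual analysis pipeline, not a reproduction of it.

```python
# Illustrative sketch: trial-averaged band envelope around the stimulus, and its
# percent change from the prestimulus baseline (suppression < 0, rebound > 0).
# `epochs` is a hypothetical (n_trials, n_times) array of one MEG channel,
# time-locked to the median nerve stimuli; `fs` is the sampling rate in Hz.
import numpy as np
from scipy.signal import butter, filtfilt

def band_envelope_time_course(epochs, fs, band=(15.0, 25.0)):
    """Trial-averaged rectified band-pass amplitude as a function of time."""
    b, a = butter(4, band, btype="bandpass", fs=fs)
    rectified = np.abs(filtfilt(b, a, epochs, axis=1))
    return rectified.mean(axis=0)

def percent_change_from_baseline(time_course, baseline_slice):
    """Express the time course relative to the mean prestimulus level."""
    base = time_course[baseline_slice].mean()
    return 100.0 * (time_course - base) / base
```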
Current Biology | 1998
Sari Levänen; Veikko Jousmäki; Riitta Hari
Considerable changes take place in the number of cerebral neurons, synapses and axons during development, mainly as a result of competition between different neural activities [1-4]. Studies in animals suggest that when input from one sensory modality is deprived early in development, the affected neural structures have the potential to mediate functions for the remaining modalities [5-8]. We now show that a similar potential exists in the human auditory system: vibrotactile stimuli, applied to the palm and fingers of a congenitally deaf adult, activated his auditory cortices. The recorded magnetoencephalographic (MEG) signals also indicated that the auditory cortices were able to discriminate between the applied 180 Hz and 250 Hz vibration frequencies. Our findings suggest that human cortical areas, normally subserving hearing, may process vibrotactile information in the congenitally deaf.
Electroencephalography and Clinical Neurophysiology | 1997
François Mauguière; I Merlet; Nina Forss; Simo Vanni; Veikko Jousmäki; P Adeleine; Riitta Hari
In this study we used a repeated-measures design and univariate analysis of variance to study the respective effects of interstimulus interval (ISI), spatial attention and stimulus detection on the strengths of the sources previously identified by modelling SEFs during the 200 ms following mentally counted left median nerve stimuli delivered at long and random ISIs (Part I). We compared the SEF source strengths in response to frequent and rare stimuli, both in detection and ignoring conditions. This permitted us to establish a hierarchy in the effects of ISI, attention and stimulus detection on the activation of the cortical network of SEF sources distributed in SI and the posterior parietal cortex contralateral to stimulation, and in the parietal operculum (SII) and premotor frontal cortex of both hemispheres. In all experimental conditions the SI and parietal opercular sources were the most active. All sources were more active in response to stimuli delivered at long and random ISIs, and the frontal sources were activated only in this stimulation condition. Driving the subjects' attention toward the stimulated side had no detectable effect on the activity of SEF sources at short ISIs. At long ISIs, mental counting of the stimuli increased the responses of all sources except SI. These results suggest that activation of the frontal sources during mental counting could reflect a working memory process, and that of the posterior parietal sources a spatial attention effect detectable only at long ISIs.
Pain | 1997
Riitta Hari; K. Portin; Birgit Kettenmann; Veikko Jousmäki; Gerd Kobal
We recorded whole-scalp cerebral magnetic fields of healthy adults in response to painful CO2 pulses (duration 200 ms, concentration 65–90%), delivered to the left or right nostril once every 20 or 30 s. The stimuli were embedded in a continuous airflow (140 ml/s, 36.5°C, relative humidity 80%) to prevent alterations in the mechanical and thermal conditions of the nasal mucosa. The recording passband was 0.03–90 Hz and 16 single responses were averaged per run. Five of the nine subjects showed replicable and artefact-free responses 280–400 ms after stimulus onset. The main responses originated close to the second somatosensory cortex (SII), most frequently in the right hemisphere, and also in the rolandic areas, mostly on the left. The signals were considerably stronger over the right than the left frontotemporal region, with a right-to-left ratio of 2.3 for areal mean signal amplitudes calculated across 16 channels, for both left- and right-nostril stimuli. Air puffs delivered to the nasal mucosa resulted in a trend toward right-hemisphere-dominant responses, but responses to air-puff stimulation of the lip and the forehead were symmetric. The right-hemisphere dominance of the SII responses may be associated with the painful, and thus unpleasant, nature of the CO2 stimulus, suggesting involvement of the right hemisphere in emotional/motivational aspects of trigeminal pain, in agreement with the role of the trigeminal pathways as a general warning system.
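The hemispheric asymmetry above is summarized as a right-to-left ratio of areal mean signal amplitudes over 16 channels per hemisphere. The short sketch below illustrates one plausible way to form such a comparison; the amplitude definition (mean across channels of each channel's peak absolute amplitude) and the variable names are assumptions for the example, not necessarily the definition used in the study.

```python
# Illustrative sketch: right-to-left ratio of areal mean amplitudes.
# `right_chs` and `left_chs` are hypothetical (n_channels, n_times) arrays of
# averaged evoked responses from 16 frontotemporal MEG channels per hemisphere.
import numpy as np

def areal_mean_amplitude(channels):
    """Mean, across channels, of each channel's peak absolute amplitude."""
    return np.abs(channels).max(axis=1).mean()

def right_to_left_ratio(right_chs, left_chs):
    """Ratio of areal mean amplitudes, right hemisphere over left."""
    return areal_mean_amplitude(right_chs) / areal_mean_amplitude(left_chs)
```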
NeuroImage | 2006
Martin Schürmann; Gina Caetano; Yevhen Hlushchuk; Veikko Jousmäki; Riitta Hari
Vibrotactile stimuli can facilitate hearing, both in hearing-impaired and in normally hearing people. Accordingly, the sounds of hands exploring a surface contribute to the explorer's haptic percepts. As a possible brain basis of such phenomena, functional brain imaging has identified activations specific to audiotactile interaction in the secondary somatosensory cortex, the auditory belt area, and the posterior parietal cortex, depending on the quality and relative salience of the stimuli. We studied 13 subjects with non-invasive functional magnetic resonance imaging (fMRI) to search for auditory brain areas that would be activated by touch. Vibration bursts of 200 Hz were delivered to the subjects' fingers and palm, and tactile pressure pulses to their fingertips. Noise bursts served to identify the auditory cortex. Vibrotactile-auditory co-activation, addressed with minimal smoothing to obtain a conservative estimate, was found in an 85-mm³ region in the posterior auditory belt area. This co-activation could be related to facilitated hearing at the behavioral level, reflecting the analysis of sound-like temporal patterns in vibration. However, even tactile pulses (without any vibration) activated parts of the posterior auditory belt area, which therefore might subserve processing of audiotactile events that arise during dynamic contact between hands and the environment.
Electroencephalography and Clinical Neurophysiology | 1996
N.E. Loveless; Sari Levänen; Veikko Jousmäki; Mikko Sams; Riitta Hari
The cortical mechanisms of auditory sensory memory were investigated by analysis of neuromagnetic evoked responses. The major deflection of the auditory evoked field (N100m) appears to comprise an early posterior component (N100mP) and a late anterior component (N100mA), which is sensitive to temporal factors. When pairs of identical sounds are presented at intervals of less than about 250 msec, the second sound evokes an N100mA with enhanced amplitude at a latency of about 150 msec. We suggest that N100mA may index the activity of two distinct processes in auditory sensory memory. Its recovery cycle may reflect the activity of a memory trace which, according to previous studies, can retain processed information about an auditory sequence for about 10 sec. The enhancement effect may reflect the activity of a temporal integration process whose time constant is such that sensation persists for 200-300 msec after stimulus offset, and which thus serves as a short memory store. Sound sequences falling within this window of integration seem to be coded holistically as unitary events.