
Publications


Featured research published by Seppo J. Laukka.


Biological Psychology | 1995

Frontal midline theta related to learning in a simulated driving task.

Seppo J. Laukka; Timo Järvilehto; Yuri I. Alexandrov; Juhani Lindqvist

The occurrence of frontal midline theta activity (4-7 Hz) was studied in a simulated driving task during consecutive phases of goal-directed behaviour. Electrical activity of the forebrain (Fz) was analysed in a simulated traffic situation in which the subject had to find the correct way to drive a car through a set of roads in a computer game. The occurrence of theta activity was analysed during seven consecutive sections of the game. The results showed that the occurrence of theta activity increased during learning: successful behaviour produced more theta than unsuccessful behaviour. In some sections of the game the percentage of theta was larger than in others. It is suggested that the theta activity reflects relaxed concentration after mastering the game.
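
For illustration only, the sketch below shows one simple way such a theta-occurrence measure could be computed from a single Fz channel: band-pass the signal to 4-7 Hz and report, per task section, the percentage of time the theta envelope exceeds an amplitude threshold. The sampling rate, threshold, and scoring criterion are assumptions, not the method used in the paper.

```python
# Hedged sketch: share of "theta present" time at Fz per task section.
# Assumptions (not from the paper): 250 Hz sampling rate, an amplitude-threshold
# criterion, and equal-length sections; the authors' actual scoring may differ.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def theta_occurrence(fz, fs=250.0, n_sections=7, threshold_uv=20.0):
    """Return the percentage of samples whose 4-7 Hz envelope exceeds a threshold,
    computed separately for each consecutive section of the recording."""
    # 4th-order Butterworth band-pass for the theta band (4-7 Hz).
    b, a = butter(4, [4.0 / (fs / 2), 7.0 / (fs / 2)], btype="band")
    theta = filtfilt(b, a, fz)
    envelope = np.abs(hilbert(theta))  # instantaneous theta amplitude
    percentages = []
    for section in np.array_split(envelope, n_sections):
        percentages.append(100.0 * np.mean(section > threshold_uv))
    return percentages

# Usage with synthetic data: 7 minutes of noise standing in for an Fz recording.
rng = np.random.default_rng(0)
fz_signal = rng.normal(0.0, 15.0, size=int(250 * 60 * 7))
print(theta_occurrence(fz_signal))
```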


Developmental Neuropsychology | 2009

The Effect of Age on Attentional Modulation in Dichotic Listening

Fiia Takio; Mika Koivisto; Laura Jokiranta; Faramosh Rashid; Johanna Kallio; Tuulikki Tuominen; Seppo J. Laukka; Heikki Hämäläinen

The right-ear advantage (REA) in dichotic listening (DL) reflects stimulus-driven bottom-up asymmetry in speech processing. The REA can be modified by top-down attentional control. We investigated attentional control in a DL task as a function of age. A total of 186 participants between the ages of 5 and 79 years were tested. The youngest children demonstrated an REA that was not modified by attention, suggesting that bottom-up functional asymmetry was present. The 10–11-year-olds began to show the ability to voluntarily modify DL, but only young adults were fully capable of doing so. In 59–79-year-olds, this top-down attentional control was lost again.
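
The REA itself is commonly quantified with an ear-advantage (laterality) index; the small sketch below illustrates the conventional formula on hypothetical report counts, and is not necessarily the exact score used in this study.

```python
# Hedged sketch: the conventional laterality index used to quantify the right-ear
# advantage (REA) in dichotic listening, applied to hypothetical report counts.
def laterality_index(right_correct, left_correct):
    """Percent ear advantage: positive values indicate a right-ear advantage."""
    total = right_correct + left_correct
    if total == 0:
        return 0.0
    return 100.0 * (right_correct - left_correct) / total

# Hypothetical example: 24 right-ear and 16 left-ear syllables reported correctly.
print(laterality_index(24, 16))  # 20.0 -> right-ear advantage
```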


PLOS ONE | 2015

Predicting the Valence of a Scene from Observers’ Eye Movements

Hamed Rezazadegan Tavakoli; Adham Atyabi; Antti Rantanen; Seppo J. Laukka; Samia Nefti-Meziani; Janne Heikkilä

Multimedia analysis benefits from understanding the emotional content of a scene in a variety of tasks such as video genre classification and content-based image retrieval. Recently, there has been an increasing interest in applying human bio-signals, particularly eye movements, to recognize the emotional gist of a scene such as its valence. In order to determine the emotional category of images using eye movements, the existing methods often learn a classifier using several features that are extracted from eye movements. Although it has been shown that eye movements are potentially useful for recognition of scene valence, the contribution of each feature is not well studied. To address this issue, we study the contribution of features extracted from eye movements in the classification of images into pleasant, neutral, and unpleasant categories. We assess ten features and their fusion. The features are histogram of saccade orientation, histogram of saccade slope, histogram of saccade length, histogram of saccade duration, histogram of saccade velocity, histogram of fixation duration, fixation histogram, top-ten salient coordinates, and saliency map. We utilize a machine learning approach to analyze the performance of the features by learning a support vector machine and exploiting various feature fusion schemes. The experiments reveal that ‘saliency map’, ‘fixation histogram’, ‘histogram of fixation duration’, and ‘histogram of saccade slope’ are the features that contribute most. The selected features signify the influence of fixation information and angular behavior of eye movements in the recognition of the valence of images.
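
As a rough illustration of the general approach, the sketch below fuses synthetic stand-ins for such feature histograms by concatenation and trains a support vector machine with scikit-learn; the feature dimensions, data, and fusion scheme are assumptions, not the authors' exact setup.

```python
# Hedged sketch: early fusion of eye-movement feature histograms followed by an SVM,
# in the spirit of the approach described above. All data here is synthetic and the
# feature dimensions are illustrative assumptions.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n_images = 300

# Stand-ins for per-image features: e.g. a saccade-orientation histogram,
# a fixation-duration histogram, and a flattened saliency-map descriptor.
saccade_orientation_hist = rng.random((n_images, 8))
fixation_duration_hist = rng.random((n_images, 10))
saliency_descriptor = rng.random((n_images, 64))

# Early fusion: concatenate the feature vectors.
X = np.hstack([saccade_orientation_hist, fixation_duration_hist, saliency_descriptor])
y = rng.integers(0, 3, size=n_images)  # 0 = unpleasant, 1 = neutral, 2 = pleasant

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```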


Laterality | 2013

Visual rightward spatial bias varies as a function of age

Fiia Takio; Mika Koivisto; Tuulikki Tuominen; Seppo J. Laukka; Heikki Hämäläinen

Age-related changes in visual spatial biases in children, young adults, and older adults were studied with unilateral and bilateral stimulus conditions in fast-paced linguistic and non-linguistic attention tasks. Only rightward spatial biases were observed. The incidence of the biases changed as a function of age: in childhood and in old age the rightward spatial biases were more common than in young adulthood. The present rightward spatial biases were similar to those observed in the corresponding auditory spatial linguistic and non-linguistic attention tests (Takio, Koivisto, Laukka, & Hämäläinen, 2011) and in the dichotic listening forced-attention task (Takio et al., 2009). We suggest that the multimodal rightward spatial bias observed under intensive attentional load is related to a right hemispace preference and modulated by age-dependent changes in executive functions.


Developmental Neuropsychology | 2011

Auditory Rightward Spatial Bias Varies as a Function of Age

Fiia Takio; Mika Koivisto; Seppo J. Laukka; Heikki Hämäläinen

Age-related changes in the auditory spatial perception of linguistic and non-linguistic stimuli were studied in participants between 5 and 79 years of age. The results show that the strength of the rightward perceptual bias in the linguistic bilateral (dichotic) stimulus condition changes as a function of age. In childhood and old age, other rightward spatial biases were also observed, in linguistic as well as non-linguistic stimulus conditions. We propose that these auditory rightward spatial biases are not specific to language and are probably modulated by the early development and late decline of executive functions.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2012

Multimodal emotion recognition by combining physiological signals and facial expressions: A preliminary study

Jukka Kortelainen; Suvi Tiinanen; Xiaohua Huang; Xiaobai Li; Seppo J. Laukka; Matti Pietikäinen; Tapio Seppänen

Lately, multimodal approaches for automatic emotion recognition have gained significant scientific interest. In this paper, emotion recognition by combining physiological signals and facial expressions was studied. Heart rate variability parameters, respiration frequency, and facial expressions were used to classify participants' emotions while they watched pictures with emotional content. Three classes were used for both valence and arousal. The preliminary results show that, over the proposed channels, detecting arousal seems to be easier than detecting valence. While a classification performance of 54.5% was attained for arousal, only 38.0% of the samples were classified correctly in terms of valence. In the future, additional modalities as well as feature selection will be used to improve the results.
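
For illustration, the sketch below computes the kind of time-domain heart rate variability parameters that such a multimodal setup might extract from R-R intervals before classification; the specific parameters and values are assumptions rather than the ones used in the paper.

```python
# Hedged sketch: time-domain heart rate variability (HRV) features from R-R intervals,
# of the kind that could feed a multimodal emotion classifier alongside respiration
# frequency and facial-expression features. Values and parameter choices are illustrative.
import numpy as np

def hrv_time_domain(rr_intervals_ms):
    """Return basic time-domain HRV parameters from R-R intervals in milliseconds."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    diff = np.diff(rr)
    return {
        "mean_rr_ms": rr.mean(),                        # average beat-to-beat interval
        "mean_hr_bpm": 60000.0 / rr.mean(),             # mean heart rate
        "sdnn_ms": rr.std(ddof=1),                      # overall variability
        "rmssd_ms": np.sqrt(np.mean(diff ** 2)),        # short-term variability
        "pnn50": 100.0 * np.mean(np.abs(diff) > 50.0),  # % successive differences > 50 ms
    }

# Usage with a synthetic R-R series (roughly 70 bpm with some variability).
rng = np.random.default_rng(1)
rr_series = 857 + rng.normal(0, 40, size=120)
print(hrv_time_domain(rr_series))
```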


Alcohol | 2000

Neuronal subserving of behavior before and after chronic ethanol treatment

Yu. I. Alexandrov; Yu. V. Grinchenko; M. V. Bodunov; V. N. Matz; A. V. Korpusova; Seppo J. Laukka; M. Sams

We have previously shown that an acute ethanol dose (1 g/kg), sufficient to impair the performance of a healthy rabbit, also reversibly depresses the activity of those limbic-cortex neurons that are specifically activated during recently learned behavioral acts. Our new morphological and neurophysiological data suggest the death of such neurons after nine months of chronic ethanol treatment. The effect of acute ethanol administration on neurons and on performance speed in alcoholic rabbits was opposite to that found in healthy animals. Our results help to explain why the neurocognition of alcoholics changes and why acute low-level alcohol ingestion affects them differently than it does healthy individuals.


Multimedia Tools and Applications | 2016

MORE --- a multimodal observation and analysis system for social interaction research

Anja Keskinarkaus; Sami Huttunen; Antti Siipo; Jukka Holappa; Magda Laszlo; Ilkka Juuso; Eero Väyrynen; Janne Heikkilä; Matti Lehtihalmes; Tapio Seppänen; Seppo J. Laukka

The MORE system is designed for observation and machine-aided analysis of social interaction in real-life situations, such as classroom teaching scenarios and business meetings. The system uses a multichannel approach to data collection, whereby multiple streams of data in a number of different modalities are obtained from each situation. Typically, the system collects a 360-degree video and an audio feed from multiple microphones set up in the space. The system includes an advanced server backend component that is capable of performing video processing, feature extraction, and archiving operations on behalf of the user. The feature extraction services form a key part of the system and rely on advanced signal analysis techniques, such as speech processing, motion activity detection, and facial expression recognition, in order to speed up the analysis of large data sets. The provided web interface weaves the multiple streams of information together, utilizes the extracted features as metadata on the audio and video data, and lets the user dive into analyzing the recorded events. The objective of the system is to facilitate easy navigation of multimodal data and enable analysis of the recorded situations for purposes such as behavioral studies, teacher training, and business development. A further unique feature of the system is its low setup overhead and high portability, as the lightest MORE setup requires only a laptop computer and the selected set of sensors on site.
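
As one concrete illustration of such a feature extraction service, the sketch below estimates per-frame motion activity by simple frame differencing with OpenCV; the video path and threshold are hypothetical placeholders, and the MORE system's actual detectors are not described here.

```python
# Hedged sketch: a simple motion-activity measure by frame differencing, illustrating
# the kind of signal a feature extraction service could attach as metadata to a video
# stream. The file path and threshold are hypothetical placeholders.
import cv2
import numpy as np

def motion_activity(video_path, diff_threshold=25):
    """Return, per frame, the fraction of pixels whose intensity changed noticeably."""
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    activity = []
    if not ok:
        return activity
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        diff = cv2.absdiff(gray, prev_gray)
        activity.append(float(np.mean(diff > diff_threshold)))
        prev_gray = gray
    cap.release()
    return activity

# Hypothetical usage:
# print(motion_activity("meeting_recording.mp4")[:10])
```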


Quarterly Journal of Experimental Psychology | 2017

How Young Adults with Autism Spectrum Disorder Watch and Interpret Pragmatically Complex Scenes.

Linda Lönnqvist; Soile Loukusa; Tuula Marketta Hurtig; Leena Mäkinen; Antti Siipo; Eero Väyrynen; Pertti Palo; Seppo J. Laukka; Laura Mämmelä; Marja-Leena Mattila; Hanna Ebeling

The aim of the current study was to investigate subtle characteristics of social perception and interpretation in high-functioning individuals with autism spectrum disorders (ASDs), and to study the relation between watching and interpreting. As a novelty, we used an approach that combined moment-by-moment eye tracking and verbal assessment. Sixteen young adults with ASD and 16 neurotypical control participants watched a video depicting a complex communication situation while their eye movements were tracked. The participants also completed a verbal task with questions related to the pragmatic content of the video. We compared verbal task scores and eye movements between groups, and assessed correlations between task performance and eye movements. Individuals with ASD had more difficulty than the controls in interpreting the video, and during two short moments there were significant group differences in eye movements. Additionally, we found significant correlations between verbal task scores and moment-level eye movement in the ASD group, but not among the controls. We concluded that participants with ASD had slight difficulties in understanding the pragmatic content of the video stimulus and attending to social cues, and that the connection between pragmatic understanding and eye movements was more pronounced for participants with ASD than for neurotypical participants.


Neuroscience and Behavioral Physiology | 2005

Neuron activity in the anterolateral motor cortex in operant food-acquiring and alcohol-acquiring behavior

Yu. I. Aleksandrov; Yu. V. Grinchenko; D. G. Shevchenko; V. N. Mats; Seppo J. Laukka; R. G. Averkin

The interactions of the neuronal mechanisms of food-acquiring behavior and newly formed operant alcohol-acquiring behavior were studied by recording the activity of individual neurons in the anterolateral area of the motor cortex in chronically alcoholized rabbits. Adult animals first learned a food-acquiring behavior in a cage with two feeders and two pedals located in its corners (food was presented in a feeder after the corresponding pedal was pressed). After nine months of chronic alcoholization, the same rabbits learned an alcohol-acquiring behavior in the same experimental cage (gelatin capsules filled with 15% ethanol solution were placed in the feeders instead of food). Analysis of neuron activity showed that the sets of neurons involved in supporting food-acquiring and alcohol-acquiring behaviors overlapped, though not completely. These experiments not only help us understand the neuronal mechanisms of newly formed and previously formed behaviors, but also support the view that the neuronal mechanisms of long-term memory are similar to the long-term modifications of the nervous system that occur with repeated intake of addictive substances.

Collaboration


Dive into Seppo J. Laukka's collaboration.

Top Co-Authors

Yu. V. Grinchenko (Russian Academy of Sciences)
V. N. Mats (Russian Academy of Sciences)
Yuri I. Alexandrov (Russian Academy of Sciences)