Publication


Featured research published by Toni Vanhala.


International Conference of the IEEE Engineering in Medicine and Biology Society | 2011

A Wearable, Wireless Gaze Tracker with Integrated Selection Command Source for Human-Computer Interaction

Ville Rantanen; Toni Vanhala; Outi Tuisku; Pekka-Henrik Niemenlehto; Jarmo Verho; Veikko Surakka; Martti Juhola; Jukka Lekkala

A light-weight, wearable, wireless gaze tracker with an integrated selection command source for human-computer interaction is introduced. The prototype system combines head-mounted, video-based gaze tracking with capacitive facial movement detection, enabling multimodal interaction by pointing with gaze and making selections with facial gestures. The system is targeted mainly at people with disabilities that limit the use of their hands. The hardware was made wireless to remove the need to take off the device when moving away from the computer, and to allow future use in more mobile contexts. The algorithms that determine the eye and head orientations in order to map gaze direction to on-screen coordinates are presented, together with the algorithm that detects movements from the measured capacitance signal. Point-and-click experiments were conducted to assess the performance of the multimodal system. The results show decent performance in laboratory and office conditions. The overall point-and-click accuracy in the multimodal experiments is comparable to the errors reported in previous research on head-mounted, single-modality gaze tracking that does not compensate for changes in head orientation.
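
As a rough illustration of the selection-command side of such a system, the sketch below detects candidate facial-gesture "clicks" by thresholding a capacitive signal against a running baseline. It is not the authors' algorithm; the sampling rate, threshold, and refractory period are invented for the example.

```python
# A minimal sketch (not the authors' implementation) of detecting selection
# "clicks" from a capacitive facial-movement signal by thresholding the
# deviation from a running baseline. All parameter values are hypothetical.
import numpy as np

def detect_selections(signal, fs=100.0, baseline_s=0.5, threshold=3.0, refractory_s=0.3):
    """Return sample indices where the capacitance deviates from its running
    baseline by more than `threshold` standard deviations."""
    n_base = int(baseline_s * fs)
    refractory = int(refractory_s * fs)
    events = []
    last = -refractory
    for i in range(n_base, len(signal)):
        window = signal[i - n_base:i]
        mu, sigma = window.mean(), window.std() + 1e-9
        if abs(signal[i] - mu) > threshold * sigma and i - last >= refractory:
            events.append(i)
            last = i
    return events

# Example: a synthetic noise signal with two facial-movement deflections.
rng = np.random.default_rng(0)
sig = rng.normal(0.0, 0.01, 1000)
sig[300:320] += 0.5   # simulated smile-induced capacitance change
sig[700:720] += 0.5
print(detect_selections(sig))  # ~[300, 700]
```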


Intelligent User Interfaces | 2005

Person-independent estimation of emotional experiences from facial expressions

Timo Partala; Veikko Surakka; Toni Vanhala

The aim of this research was to develop methods for the automatic, person-independent estimation of experienced emotions from facial expressions. Ten subjects watched a series of emotionally arousing pictures and videos while the electromyographic (EMG) activity of two facial muscles, the zygomaticus major (activated in smiling) and the corrugator supercilii (activated in frowning), was registered. Based on the changes in the activity of these two facial muscles, it was possible to distinguish between ratings of positive and negative emotional experiences at a rate of almost 70% for pictures and over 80% for videos. Using these methods, the computer could adapt its behavior according to the user's emotions during human-computer interaction.
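
The underlying decision idea lends itself to a compact illustration: compare the baseline-relative activity changes of the two muscles. The rule below is a toy sketch under that assumption, not the paper's actual estimation method.

```python
# A toy illustration (an assumption, not the paper's method) of the core idea:
# positive experiences raise zygomaticus major (smiling) EMG activity,
# negative experiences raise corrugator supercilii (frowning) activity.
def estimate_valence(zygomaticus_change, corrugator_change):
    """Classify valence from baseline-relative changes in mean EMG amplitude.

    Arguments are ratios of stimulus-period activity to pre-stimulus baseline;
    the decision rule and its symmetry are illustrative assumptions.
    """
    score = zygomaticus_change - corrugator_change
    if score > 0:
        return "positive"
    elif score < 0:
        return "negative"
    return "neutral"

print(estimate_valence(1.4, 0.9))  # smiling muscle more active -> "positive"
print(estimate_valence(1.0, 1.6))  # frowning muscle more active -> "negative"
```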


Eye Tracking Research & Applications | 2012

The effect of clicking by smiling on the accuracy of head-mounted gaze tracking

Ville Rantanen; Jarmo Verho; Jukka Lekkala; Outi Tuisku; Veikko Surakka; Toni Vanhala

The effect of facial behaviour on gaze tracking accuracy was studied using a prototype system that integrated head-mounted, video-based gaze tracking with capacitive facial movement detection for pointing at and selecting objects, respectively, in a simple graphical user interface. Experiments were carried out to determine how the voluntary smiling movements used to indicate clicks affect the accuracy of gaze tracking, through the combination of the user's eye movement behaviour and the operation of the gaze tracking algorithms. The results showed no observable degradation of gaze tracking accuracy when voluntary smiling was used for object selections.
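
Gaze tracking accuracy in studies like this is typically reported in degrees of visual angle. The sketch below shows the standard conversion from on-screen pixel error to visual angle; the display and viewing geometry values are illustrative assumptions, not taken from the paper.

```python
# A hedged sketch of how pointing accuracy is commonly reported in gaze
# tracking studies: on-screen pixel error converted to degrees of visual
# angle. The geometry values below are invented for the example.
import math

def error_in_visual_angle(gaze_xy, target_xy, px_per_mm, viewing_distance_mm):
    """Convert the pixel distance between the gaze point and the target into
    degrees of visual angle at the given viewing distance."""
    dx = gaze_xy[0] - target_xy[0]
    dy = gaze_xy[1] - target_xy[1]
    err_mm = math.hypot(dx, dy) / px_per_mm
    return math.degrees(math.atan2(err_mm, viewing_distance_mm))

# Example: ~40 px error on a ~3.8 px/mm display viewed from 600 mm.
print(round(error_in_visual_angle((512, 400), (540, 428), 3.8, 600.0), 2))
# ~1 degree of visual angle
```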


Tests and Proofs | 2012

Voluntary facial activations regulate physiological arousal and subjective experiences during virtual social stimulation

Toni Vanhala; Veikko Surakka; Matthieu Courgeon; Jean-Claude Martin

Exposure to distressing computer-generated stimuli, and feedback of physiological changes during exposure, have been effective in the treatment of anxiety disorders (e.g., social phobia). Here we studied voluntary facial activations as a method for regulating more spontaneous physiological changes during virtual social stimulation. Twenty-four participants with a low or high level of social anxiety activated either the corrugator supercilii (used in frowning) or the zygomaticus major (used in smiling) facial muscle to keep a female or a male computer character walking towards them. The more socially anxious participants had a higher level of skin conductance throughout the trials than the less anxious participants. Within both groups, short-term skin conductance responses were enhanced both during and after facial activations, and corrugator supercilii activations facilitated longer-term electrodermal relaxation. Zygomaticus major activations had opposite effects on the subjective emotional ratings of the less and the more socially anxious participants. In sum, voluntary facial activations were effective in regulating emotional arousal during virtual social exposure. Corrugator supercilii activation was found to be an especially promising method for facilitating autonomic relaxation.
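
Short-term skin conductance responses of the kind analysed here are conventionally scored trough-to-peak. The sketch below is a minimal version of that convention, assuming a common 0.05 microsiemens amplitude criterion rather than the authors' exact pipeline.

```python
# A simplified sketch of trough-to-peak skin conductance response (SCR)
# counting. The 0.05 microsiemens criterion is a common convention, used
# here as an assumption rather than the authors' exact analysis.
def count_scrs(sc, min_amplitude_us=0.05):
    """Count rises from a local trough that exceed the amplitude criterion."""
    n_responses = 0
    trough = sc[0]
    in_response = False
    for prev, x in zip(sc, sc[1:]):
        if x < prev:
            # Falling: after a counted response, start tracking a new trough.
            trough = x if in_response else min(trough, x)
            in_response = False
        elif not in_response and x - trough >= min_amplitude_us:
            n_responses += 1
            in_response = True
    return n_responses

# Example: two clear responses in a synthetic trace (microsiemens).
trace = [2.00, 1.99, 1.98, 2.10, 2.20, 2.05, 1.97, 1.96, 2.08, 2.15, 2.10]
print(count_scrs(trace))  # 2
```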


Advances in Human-Computer Interaction | 2013

Text entry by gazing and smiling

Outi Tuisku; Veikko Surakka; Ville Rantanen; Toni Vanhala; Jukka Lekkala

Face Interface is a wearable prototype that combines the use of voluntary gaze direction and facial activations for pointing at and selecting objects on a computer screen, respectively. The aim was to investigate the functionality of the prototype for entering text. First, three on-screen keyboard layout designs were developed and tested (n = 10) to find a layout that would be more suitable for text entry with the prototype than the traditional QWERTY layout. The task was to enter one word ten times with each of the layouts by pointing at letters with gaze and selecting them by smiling. Subjective ratings showed that a layout with large keys on the edge and small keys near the center of the keyboard was rated as the most enjoyable, clearest, and most functional. Second, using this layout, the aim of the second experiment (n = 12) was to compare entering text with Face Interface to entering text with a mouse. The results showed that the text entry rate was 20 characters per minute (cpm) for Face Interface and 27 cpm for the mouse. For Face Interface, the keystrokes per character (KSPC) value was 1.1 and the minimum string distance (MSD) error rate was 0.12. These values compare especially well with those of other similar techniques.
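
The reported metrics (cpm, KSPC, and MSD error rate) have standard definitions in the text-entry literature. The sketch below computes them under the common Soukoreff and MacKenzie conventions; the sample strings and timings are invented.

```python
# A hedged sketch of the standard text-entry metrics reported above:
# characters per minute (cpm), keystrokes per character (KSPC), and the
# minimum string distance (MSD) error rate. The sample data are invented.
def cpm(transcribed, seconds):
    """Entry rate in characters per minute."""
    return len(transcribed) / (seconds / 60.0)

def kspc(keystrokes, transcribed):
    """Keystrokes needed per character of produced text."""
    return keystrokes / len(transcribed)

def msd_error_rate(presented, transcribed):
    """Levenshtein distance normalized by the longer string's length."""
    m, n = len(presented), len(transcribed)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if presented[i - 1] == transcribed[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[m][n] / max(m, n)

print(round(cpm("hello world", 33.0), 1))                   # 20.0 cpm
print(round(kspc(12, "hello world"), 2))                    # 1.09
print(round(msd_error_rate("hello world", "helo world"), 2))  # 0.09
```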


Computer Animation and Virtual Worlds | 2010

Virtual proximity and facial expressions of computer agents regulate human emotions and attention

Toni Vanhala; Veikko Surakka; Harri Siirtola; Kari-Jouko Räihä; Benoît Morel; Laurent Ach

Realistic character animation requires elaborate rigging built on top of high-quality 3D models. Sophisticated anatomically based rigs are often the choice of visual effects studios where life-like animation of CG characters is the primary objective. However, rigging a character with a muscular-skeletal system is a very involved and time-consuming process, even for professionals. Although there have been recent research efforts to automate all or some parts of the rigging process, the complexity of anatomically based rigging nonetheless opens up new research challenges. We propose a new method to automate anatomically based rigging that transfers an existing rig from one character to another. The method is based on data interpolation in the surface and volume domains, where various rigging elements can be transferred between different models. As it only requires a small number of corresponding input feature points, users can produce highly detailed rigs for a variety of desired characters with ease.
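
The interpolation idea sketched in the abstract, transferring data between models from a small set of corresponding feature points, can be illustrated with radial basis functions. The code below is a toy Gaussian-RBF sketch of scattered-data transfer, not the paper's surface-and-volume method; all shapes and the kernel width are assumptions.

```python
# A minimal sketch of interpolating a mapping between two models from sparse
# feature-point correspondences using Gaussian radial basis functions.
# This is an illustrative stand-in, not the paper's actual method.
import numpy as np

def rbf_transfer(src_feats, dst_feats, src_points, eps=1.0):
    """Map src_points toward the target model's space using Gaussian RBFs
    fitted to the displacements between corresponding feature points."""
    def kernel(a, b):
        d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * eps ** 2))
    # Solve for RBF weights that exactly reproduce the feature displacements.
    weights = np.linalg.solve(kernel(src_feats, src_feats), dst_feats - src_feats)
    return src_points + kernel(src_points, src_feats) @ weights

# Example: 4 corresponding feature points in 3D, then map arbitrary vertices.
src = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], float)
dst = src * 1.2 + 0.1                       # target is scaled and shifted
verts = np.array([[0.5, 0.5, 0.0], [0.2, 0.1, 0.3]])
print(rbf_transfer(src, dst, verts))
```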


Affective Computing and Intelligent Interaction | 2007

Facial Activation Control Effect (FACE)

Toni Vanhala; Veikko Surakka

The present study was the first in a series of experiments investigating the possibilities of using voluntarily produced physiological signals in computer-assisted therapy. The current aim was to find out whether computer-guided voluntary facial activations have an effect on autonomic nervous system activity. Twenty-seven participants performed a series of voluntary facial muscle activations while wireless electrocardiography and subjective experiences were recorded. Each task consisted of activating either the corrugator supercilii muscle (activated when frowning) or the zygomaticus major muscle (activated when smiling) at one of three activation intensities (i.e., low, medium, and high). Our results showed a voluntary facial activation control effect (FACE) on psychological (i.e., level of experience) and physiological activity. Different muscle activations produced both task-specific emotional experiences and significant changes in heart rate and heart rate variability. Low-intensity activations of both muscles were the most effective, easy to perform, and pleasant. We conclude that the FACE can clearly open the route to regulating involuntary physiological processes.
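
Heart rate and heart rate variability are derived from the R-R intervals of the electrocardiogram. The sketch below computes mean heart rate and RMSSD, a common time-domain HRV index; the paper does not state its exact index, so RMSSD and the sample intervals are assumptions.

```python
# A hedged sketch of two standard measures behind "heart rate and heart rate
# variability": mean heart rate and RMSSD from R-R intervals. RMSSD is a
# common convention assumed here, not necessarily the authors' index.
import math

def mean_heart_rate(rr_intervals_s):
    """Mean heart rate in beats per minute from R-R intervals (seconds)."""
    return 60.0 / (sum(rr_intervals_s) / len(rr_intervals_s))

def rmssd(rr_intervals_s):
    """Root mean square of successive R-R interval differences (milliseconds)."""
    diffs = [(b - a) * 1000.0 for a, b in zip(rr_intervals_s, rr_intervals_s[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [0.82, 0.85, 0.80, 0.88, 0.84]       # invented R-R intervals (s)
print(round(mean_heart_rate(rr), 1))      # ~71.6 bpm
print(round(rmssd(rr), 1))                # ~53.4 ms
```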


Archive | 2008

Computer-Assisted Regulation of Emotional and Social Processes

Toni Vanhala; Veikko Surakka

Imagine a person who has a fear of other people. Let us call her Anna. She is afraid of people watching her every move as she stands in a line or walks down the street. Meeting new people is almost impossible, as she always feels stared at and judged by everyone. This fear, or maybe even a phobia, can make Anna’s life very complicated. It is difficult for her to travel through public spaces in order to get to work, to deal with a bus or taxi driver, to shop for groceries, etc. Anna’s leisure time activities are also very limited. The situation is indeed a vicious cycle, as it is even difficult for her to seek treatment and go to a therapist.

In the USA alone, there are approximately 15 million people like Anna who suffer from social anxiety disorder (Anxiety Disorders Association of America, 2008). A total of 40 million people suffer from different anxiety disorders. The associated yearly costs of mental health care exceed 42 billion U.S. dollars. Thus, emotional disorders are a significant public health issue, and there is a need for demonstrably effective and efficient new methods of therapy.

Computer systems have recently been applied to the treatment of many emotional disorders, including different phobias (Krijn et al., 2004; Wiederhold & Bullinger, 2005). These systems provide controlled virtual exposure to the object of the disorder, for example, a computer simulation of a spider or a room filled with other people. In this form of behavioural therapy, patients are systematically desensitized by gradual exposure to a computer-generated representation of the object of their fear (Weiten, 2007; Krijn et al., 2004). At first, the level of exposure is kept mild and constant, for example, by keeping the object of the fear visually distant. Then, the level of exposure is increased little by little, for example, by moving a virtual spider closer or increasing the number of virtual people. The underlying theory is that such exposure replaces anxiety-provoking memories and thoughts with more neutral ones that are acquired in a safe, controlled environment.

It has been shown that people react to computer-generated stimuli in the same manner as to authentic, real-life stimuli. For example, socially anxious people are cautious about disturbing embodied artificial characters in virtual reality (Garau et al., 2005). People have also reported higher anxiety and shown increased somatic responses when speaking to negative as compared to neutral and positive audiences consisting of virtual agent characters (Pertaub et al., 2002). As these studies have shown that virtual characters are able to evoke emotions or anxiety, computer-generated stimuli show clear potential as a new method for treating different social and emotional disorders by enabling controlled exposure to anxiety-provoking stimuli.


Nordic Conference on Human-Computer Interaction | 2008

Measuring bodily responses to virtual faces with a pressure sensitive chair

Toni Vanhala; Veikko Surakka; Jenni Anttonen

The present aim was to study emotion-related body movement responses using an unobtrusive measurement chair embedded with electromechanical film (EMFi) sensors. Thirty participants viewed images of a male and a female computer agent while the magnitude and direction of their body movements were measured. The facial expressions (i.e., frowning, neutral, smiling) and the size of the agents were varied. The results showed that participants leaned towards the agent for a statistically significantly longer time when it displayed a frowning or a smiling expression as compared to a neutral expression. Also, their body movements were reduced while viewing the agents. The results suggest that the EMFi chair is a promising tool for detecting human activity related to social and emotional behaviour. In particular, the EMFi chair may support unobtrusive measurement of bodily responses in less strictly controlled contexts of human-computer interaction.
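
One simple way to turn multiple seat pressure sensors into a lean magnitude and direction is a center-of-pressure estimate. The sketch below assumes four sensors at the seat corners; this layout and the computation are illustrative assumptions, not the EMFi chair's documented processing.

```python
# A toy sketch (an assumption, not the EMFi chair's actual processing) of
# estimating lean direction from four pressure sensors at the seat corners:
# a weighted centroid of the readings gives a center-of-pressure shift.
def lean_vector(front_left, front_right, back_left, back_right):
    """Return an (x, y) center-of-pressure offset in [-1, 1]^2, where +y
    means leaning forward and +x means leaning to the right."""
    total = front_left + front_right + back_left + back_right
    if total == 0:
        return (0.0, 0.0)
    x = ((front_right + back_right) - (front_left + back_left)) / total
    y = ((front_left + front_right) - (back_left + back_right)) / total
    return (x, y)

print(lean_vector(1.2, 1.2, 0.8, 0.8))  # leaning forward: x = 0, y ~ 0.2
```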


Interacting with Computers | 2006

Real-time estimation of emotional experiences from facial expressions

Timo Partala; Veikko Surakka; Toni Vanhala

Collaboration


Dive into Toni Vanhala's collaborations.

Top Co-Authors

Jukka Lekkala (Tampere University of Technology)
Ville Rantanen (Tampere University of Technology)
Jarmo Verho (Tampere University of Technology)
Timo Partala (Tampere University of Technology)