Juha Kela
VTT Technical Research Centre of Finland
Publications
Featured research published by Juha Kela.
IEEE Pervasive Computing | 2003
Panu Korpipää; Jani Mäntyjärvi; Juha Kela; Heikki Keränen; Esko-Juhani Malm
We present a uniform mobile terminal software framework that provides systematic methods for acquiring and processing useful context information from a user's surroundings and delivering it to applications. The framework simplifies the development of context-aware mobile applications by managing raw context information gained from multiple sources and enabling higher-level context abstractions.
ubiquitous computing | 2006
Juha Kela; Panu Korpipää; Jani Mäntyjärvi; Sanna Kallio; Giuseppe Savino; Luca Jozzo; Di Marca
Accelerometer-based gesture control is studied as a supplementary or an alternative interaction modality. Gesture commands freely trainable by the user can be used for controlling external devices with a handheld wireless sensor unit. Two user studies are presented. The first study concerns finding gestures for controlling a design environment (Smart Design Studio), a TV, a VCR, and lighting. The results indicate that different people usually prefer different gestures for the same task, and hence it should be possible to personalise them. The second user study evaluates the usefulness of the gesture modality compared to other interaction modalities for controlling a design environment. The other modalities were speech, RFID-based physical tangible objects, a laser-tracked pen, and a PDA stylus. The results suggest that gestures are a natural modality for certain tasks and can augment other modalities. Gesture commands were found to be natural, especially for commands with a spatial association in design environment control.
mobile and ubiquitous multimedia | 2004
Panu Korpipää; Jonna Häkkilä; Juha Kela; Sami Ronkainen; Ilkka Känsälä
Context Studio, an application personalisation tool for semi-automated context-based adaptation, has been proposed to provide a flexible means of implementing context-aware features. In this paper, Context Studio is further developed for the end users of small-screen mobile devices. Navigation and information presentation are designed for small screens, especially for the Series 60 mobile phone user interface. A context ontology, with an enhanced vocabulary model, is utilized to offer scalable representation and easy navigation of context and action information in the UI. The ontology vocabulary hierarchy is transformed into a folder-file model representation in the graphical user interface. UI elements can be directly updated, according to extensions and modifications to the ontology vocabularies, automatically in an online system. A rule model is utilized to allow systematic management and presentation of context-action rules in the user interface. The chosen ontology-based UI model is evaluated with a usability study.
systems, man and cybernetics | 2003
Sanna Kallio; Juha Kela; Jani Mäntyjärvi
This paper introduces an accelerometer-based online gesture recognition system. Recognition of gestures can be utilised as part of human-computer interaction for mobile devices, e.g. cell phones, PDAs and remote controllers. Gestures are captured with a small wireless sensor box that produces a three-dimensional acceleration signal. The acceleration signal is preprocessed, vector quantised and finally classified using Hidden Markov Models. The design of online gesture recognition for mobile devices sets requirements for data processing. Thus, the system uses a small codebook and simple preprocessing methods. The recognition accuracy of the system is tested with gestures of four degrees of complexity. Experimental results show great potential for recognising simple and even more complex gestures with good accuracy.
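The pipeline described above (acceleration samples → vector quantisation against a small codebook → discrete-HMM classification) can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the codebook, model parameters, and function names are assumptions for demonstration, and the preprocessing step is omitted.

```python
import math

def quantize(signal, codebook):
    """Map each 3-D acceleration sample to the index of its nearest codebook vector."""
    def nearest(sample):
        return min(range(len(codebook)),
                   key=lambda k: sum((s - c) ** 2 for s, c in zip(sample, codebook[k])))
    return [nearest(x) for x in signal]

def forward_log_likelihood(obs, pi, A, B):
    """Scaled forward algorithm for a discrete HMM.
    pi: initial state probabilities, A: state transition matrix,
    B: emission matrix (state x codebook symbol)."""
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    scale = sum(alpha)
    log_lik = math.log(scale)
    alpha = [a / scale for a in alpha]
    for o in obs[1:]:
        alpha = [sum(alpha[j] * A[j][i] for j in range(n)) * B[i][o] for i in range(n)]
        scale = sum(alpha)
        log_lik += math.log(scale)
        alpha = [a / scale for a in alpha]
    return log_lik

def classify(signal, codebook, models):
    """Return the name of the gesture model that best explains the signal."""
    obs = quantize(signal, codebook)
    return max(models, key=lambda name: forward_log_likelihood(obs, *models[name]))
```

In practice, one HMM per gesture is trained (e.g. with Baum-Welch) on quantised example repetitions, and an incoming gesture is assigned to the model with the highest forward likelihood; the small codebook keeps both quantisation and the forward pass cheap enough for mobile hardware.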
International Journal of Pattern Recognition and Artificial Intelligence | 2006
Sanna Kallio; Juha Kela; Panu Korpipää; Jani Mäntyjärvi
Accelerometer-based gesture recognition facilitates a complementary interaction modality for controlling mobile devices and home appliances. Using gestures for the task of home appliance control requires use of the same device and gestures by different persons, i.e. user independent gesture recognition. The practical application in small embedded low-resource devices also requires high computational performance. The user independent gesture recognition accuracy was evaluated with a set of eight gestures and seven users, with a total of 1120 gestures in the dataset. Twenty-state continuous HMM yielded an average of 96.9% user independent recognition accuracy, which was cross-validated by leaving one user in turn out of the training set. Continuous and discrete five-state HMM computational performances were compared with a reference test in a PC environment, indicating that discrete HMM is 20% faster. Computational performance of discrete five-state HMM was evaluated in an embedded hardware environment with a 104 MHz ARM-9 processor and Symbian OS. The average recognition time per gesture calculated from 1120 gesture repetitions was 8.3 ms. With this result, the computational performance difference between the compared methods is considered insignificant in terms of practical application. Continuous HMM is hence recommended as a preferred method due to its better suitability for a continuous-valued signal, and better recognition accuracy. The results suggest that, according to both evaluation criteria, HMM is feasible for practical user independent gesture control applications in mobile low-resource embedded environments.
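The leave-one-user-out evaluation described above generalises to any recognizer: each user is held out in turn, the model is trained on the remaining users' gestures, and the held-out user's accuracy is averaged over all folds. A minimal sketch, with hypothetical `train_fn`/`predict_fn` callables standing in for HMM training and classification:

```python
def leave_one_user_out(samples, train_fn, predict_fn):
    """Leave-one-user-out cross-validation for user-independent recognition.
    samples: list of (user_id, gesture_label, features) tuples.
    train_fn(train_pairs) -> model; predict_fn(model, features) -> label.
    Returns the mean per-user accuracy."""
    users = sorted({u for u, _, _ in samples})
    accuracies = []
    for held_out in users:
        # Train on every user except the held-out one.
        train = [(g, x) for u, g, x in samples if u != held_out]
        test = [(g, x) for u, g, x in samples if u == held_out]
        model = train_fn(train)
        correct = sum(predict_fn(model, x) == g for g, x in test)
        accuracies.append(correct / len(test))
    return sum(accuracies) / len(accuracies)
```

With seven users, this yields seven folds, each training on six users' gestures, matching the evaluation protocol the abstract describes.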
advanced visual interfaces | 2006
Sanna Kallio; Juha Kela; Jani Mäntyjärvi; Johan Plomp
A visualization method is proposed as an additional feature for accelerometer-based gesture control. The motivation for visualizing gesture control is justified and the related challenges are presented. The gesture control is based on Hidden Markov Models. This paper describes the basic concepts of gesture visualization and studies how well the developed visualization method can animate the hand movement performed during gesture control. The results indicate that visualization clearly provides information about the performed gesture, and it could be utilized to provide essential feedback and guidance to the user in future gesture control applications.
tangible and embedded interaction | 2008
Jukka Linjama; Panu Korpipää; Juha Kela; Tapani Rantakokko
This study addresses the issue of how to aid adoption of new interaction means for mobile devices. The research problem is how to promote and guide the use of new movement interaction modalities to a novice user, who has no prior knowledge of gesture control. The aim was to create a pleasurable experience that invites users to learn how mobile device movement control works. The main contribution is an interaction tutorial application that combines gesture control with a physical visual tangible object in a mobile device, demonstrating interaction elements that are potentially applicable in future mobile devices.
IEEE Pervasive Computing | 2006
Panu Korpipää; Esko-Juhani Malm; Tapani Rantakokko; Vesa Kyllönen; Juha Kela; Jani Mäntyjärvi; Jonna Häkkilä; Ilkka Känsälä
Journal of Multimedia | 2005
Jani Mäntyjärvi; Sanna Kallio; Panu Korpipää; Juha Kela; Johan Plomp
mobile and ubiquitous multimedia | 2004
Giulio Jacucci; Juha Kela; Johan Plomp