Publication


Featured research published by Topi Kaaresoja.


human factors in computing systems | 2006

Feel who's talking: using tactons for mobile phone alerts

Lorna M. Brown; Topi Kaaresoja

While the sense of touch is capable of processing complex stimuli, the vibration feedback used in mobile phones is generally very simple. Using more complex vibrotactile messages would enable the communication of more information through phone alerts; however, it has been suggested that phone vibration motors are not capable of presenting complex messages. This paper reports a study investigating the use of Tactons (tactile icons), presented using a standard mobile phone vibration motor, to represent mobile phone alerts. The recognition rate of 72% achieved for Tactons encoding two pieces of information is comparable to results achieved in a previous experiment with a high-specification transducer, indicating that it is possible to communicate multi-dimensional information in mobile phone alerts. These results will help designers understand the possibilities offered by standard phone vibration motors for communicating complex information.
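
As an illustration of the general idea (not the paper's actual stimuli), the sketch below encodes two hypothetical alert dimensions, caller group and urgency, as an on/off timing pattern that a simple phone vibration motor could play back; all rhythm and gap values are made-up examples.

```python
# Illustrative sketch (not the paper's Tacton set): encode two dimensions of
# an alert (caller group and urgency) as on/off steps for a simple motor.

# Hypothetical rhythm patterns (pulse on-times in ms) per caller group.
RHYTHMS_MS = {
    "family": [200, 200, 200],   # three even pulses
    "work":   [400, 100, 400],   # long-short-long
}

# Hypothetical gap lengths (ms) between pulses per urgency level.
GAPS_MS = {
    "low": 300,
    "high": 100,
}

def tacton_pattern(group: str, urgency: str) -> list[tuple[bool, int]]:
    """Return a list of (motor_on, duration_ms) steps encoding two dimensions."""
    pattern = []
    gap = GAPS_MS[urgency]
    for pulse in RHYTHMS_MS[group]:
        pattern.append((True, pulse))   # motor on for the pulse duration
        pattern.append((False, gap))    # motor off between pulses
    return pattern

if __name__ == "__main__":
    print(tacton_pattern("work", "high"))
```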


international conference on multimodal interfaces | 2008

Feel-good touch: finding the most pleasant tactile feedback for a mobile touch screen button

Emilia Koskinen; Topi Kaaresoja; Pauli Laitinen

Earlier research has shown the benefits of tactile feedback for touch screen widgets in all metrics: performance, usability and user experience. In our current research the goal was to gain a deeper understanding of the characteristics of a tactile click for virtual buttons. More specifically, we wanted to find the tactile click that is the most pleasant to use with a finger. We used two actuator solutions in a small mobile touch screen: piezo actuators and a standard vibration motor. We conducted three experiments: the first and second aimed to find the most pleasant tactile feedback produced with the piezo actuators and with the vibration motor, respectively, and the third combined and compared the results from the first two. The results from the first two experiments showed significant differences in the perceived pleasantness of the tactile clicks, and we used the most pleasant clicks in the comparison experiment, in addition to a condition with no tactile feedback. Our findings confirmed results from earlier studies showing that tactile feedback is superior to a non-tactile condition when virtual buttons are used with the finger, regardless of the technology behind the tactile feedback. Another finding suggests that users perceived the feedback produced with piezo actuators as slightly more pleasant than the vibration-motor-based feedback, although the difference was not statistically significant. These results indicate that the characteristics of virtual button tactile clicks can be tuned towards the most pleasant ones, and this knowledge can help designers create better touch screen virtual buttons and keyboards.


symposium on haptic interfaces for virtual environment and teleoperator systems | 2005

Perception of short tactile pulses generated by a vibration motor in a mobile phone

Topi Kaaresoja; Jukka Linjama

This paper describes an experimental setup and the results of user tests focusing on the perception of the temporal characteristics of vibration in a mobile device. The experiment consisted of six vibration stimuli of different lengths. We asked the subjects to score their subjective perception level on a five-point Likert scale. The results suggest that the optimal duration of the control signal should be between 50 and 200 ms in this specific case. Longer durations were perceived as irritating.
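
As a minimal illustration of how this finding might be applied, the sketch below clamps a requested vibration pulse to the 50 to 200 ms range suggested by the study; the helper itself is hypothetical and not part of the paper.

```python
# Minimal sketch based on the study's suggestion that vibration-motor control
# pulses of roughly 50-200 ms are well perceived, while longer pulses become
# irritating. The clamping helper is a hypothetical application of that range.

RECOMMENDED_MIN_MS = 50
RECOMMENDED_MAX_MS = 200

def clamp_pulse_duration(requested_ms: int) -> int:
    """Keep a tactile pulse within the duration range suggested by the study."""
    return max(RECOMMENDED_MIN_MS, min(requested_ms, RECOMMENDED_MAX_MS))

if __name__ == "__main__":
    for d in (10, 120, 500):
        print(d, "->", clamp_pulse_duration(d), "ms")
```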


human factors in computing systems | 2009

Audio or tactile feedback: which modality when?

Eve E. Hoggan; Andrew Crossan; Stephen A. Brewster; Topi Kaaresoja

When designing interfaces for mobile devices it is important to take into account the variety of contexts of use. We present a study that examines how changing noise and disturbance in the environment affects user performance in a touchscreen typing task with the interface being presented through visual only, visual and tactile, or visual and audio feedback. The aim of the study is to show at what exact environmental levels audio or tactile feedback become ineffective. The results show significant decreases in performance for audio feedback at levels of 94 dB and above, as well as decreases in performance for tactile feedback at vibration levels of 9.18 g/s. These results suggest that at these levels, feedback should be presented by a different modality. These findings will allow designers to take advantage of sensor-enabled mobile devices to adapt the provided feedback to the user's current context.
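
A hedged sketch of the kind of sensor-driven adaptation the abstract points towards: only the 94 dB and 9.18 g/s thresholds come from the study, while the function and its inputs are illustrative assumptions.

```python
# Sketch of context-adaptive feedback selection: drop a modality once the
# environment exceeds the levels at which the study found it ineffective
# (audio at/above ~94 dB ambient noise, tactile at/above ~9.18 g/s vibration).
# Only these thresholds come from the abstract; the rest is illustrative.

AUDIO_NOISE_LIMIT_DB = 94.0          # audio feedback degraded at/above this level
TACTILE_VIBRATION_LIMIT_GPS = 9.18   # tactile feedback degraded at/above this level

def choose_feedback(ambient_noise_db: float, ambient_vibration_gps: float) -> set[str]:
    """Pick feedback modalities for a touchscreen keyboard given sensed context."""
    modalities = {"visual"}  # visual feedback is always kept
    if ambient_noise_db < AUDIO_NOISE_LIMIT_DB:
        modalities.add("audio")
    if ambient_vibration_gps < TACTILE_VIBRATION_LIMIT_GPS:
        modalities.add("tactile")
    return modalities

if __name__ == "__main__":
    print(choose_feedback(ambient_noise_db=70.0, ambient_vibration_gps=2.0))
    print(choose_feedback(ambient_noise_db=100.0, ambient_vibration_gps=12.0))
```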


international conference on multimodal interfaces | 2008

Crossmodal congruence: the look, feel and sound of touchscreen widgets

Eve E. Hoggan; Topi Kaaresoja; Pauli Laitinen; Stephen A. Brewster

Our research considers the following question: how can visual, audio and tactile feedback be combined in a congruent manner for use with touchscreen graphical widgets? For example, if a touchscreen display presents different styles of visual buttons, what should each of those buttons feel and sound like? This paper presents the results of an experiment conducted to investigate methods of congruently combining visual and combined audio/tactile feedback by manipulating the different parameters of each modality. The results indicate trends in how individual visual parameters such as shape, size and height can be combined congruently with audio/tactile parameters such as texture, duration and different actuator technologies. We draw further on the experiment results, using individual quality ratings to evaluate the perceived quality of our touchscreen buttons, and reveal a correlation between perceived quality and crossmodal congruence. The results of this research will enable mobile touchscreen UI designers to create realistic, congruent buttons by selecting the most appropriate audio and tactile counterparts of visual button styles.


international conference on multimodal interfaces | 2010

Feedback is... late: measuring multimodal delays in mobile device touchscreen interaction

Topi Kaaresoja; Stephen A. Brewster

Multimodal interaction is becoming common in many kinds of devices, particularly mobile phones. If care is not taken in design and implementation, latencies in the timing of feedback in the different modalities may have unintended effects on users. This paper introduces an easy-to-implement multimodal latency measurement tool for touchscreen interaction. It uses off-the-shelf components and free software and is capable of accurately measuring latencies between different interaction events in different modalities. The tool uses a high-speed camera, a mirror, a microphone and an accelerometer to measure the touch, visual, audio and tactile feedback events that occur in touchscreen interaction. The microphone and the accelerometer are both interfaced with a standard PC soundcard, which makes the measurement and analysis simple. The latencies are obtained by hand and eye using a slow-motion video player and an audio editor. To validate the tool, we measured four commercial mobile phones. Our results show that there are significant differences in latencies, not only between the devices, but also between different applications and modalities within one device. In this paper the focus is on mobile touchscreen devices, but with minor modifications our tool could also be used in other domains.
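
The paper extracts the latencies manually from slow-motion video and an audio editor; the sketch below only illustrates the underlying idea for the two soundcard channels (microphone and accelerometer) by finding the first above-threshold sample on each channel and reporting the offset. The function names and threshold value are assumptions for illustration.

```python
# Toy illustration of measuring a touch-to-tactile delay from a two-channel
# soundcard recording: microphone channel carries the touch/audio event,
# accelerometer channel carries the tactile actuator onset.

def first_onset(samples: list[float], threshold: float = 0.1) -> int:
    """Index of the first sample whose magnitude exceeds the threshold."""
    for i, s in enumerate(samples):
        if abs(s) >= threshold:
            return i
    raise ValueError("no onset found above threshold")

def latency_ms(mic: list[float], accel: list[float], sample_rate_hz: int = 44100) -> float:
    """Delay from the touch onset (mic channel) to the tactile onset (accelerometer)."""
    delta_samples = first_onset(accel) - first_onset(mic)
    return 1000.0 * delta_samples / sample_rate_hz

if __name__ == "__main__":
    # Toy signals: touch click at sample 100, tactile pulse starting at sample 1000.
    mic = [0.0] * 100 + [0.5] + [0.0] * 2000
    accel = [0.0] * 1000 + [0.3] * 50 + [0.0] * 1000
    print(round(latency_ms(mic, accel), 1), "ms")  # ~20.4 ms at 44.1 kHz
```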


tests and proofs | 2014

Towards the Temporally Perfect Virtual Button: Touch-Feedback Simultaneity and Perceived Quality in Mobile Touchscreen Press Interactions

Topi Kaaresoja; Stephen A. Brewster; Vuokko Lantz

Pressing a virtual button is still the major interaction method in touchscreen mobile phones. Although phones are becoming more and more powerful, operating system software is getting more and more complex, causing latency in interaction. We were interested in gaining insight into touch-feedback simultaneity and the effects of latency on the perceived quality of touchscreen buttons. In an experiment, we varied the latency between touch and feedback between 0 and 300 ms for tactile, audio, and visual feedback modalities. We modelled the proportion of simultaneity perception as a function of latency for each modality condition, using a Gaussian model fitted to the observations with the maximum likelihood estimation method. These models showed that the point of subjective simultaneity (PSS) was 5 ms for tactile, 19 ms for audio, and 32 ms for visual feedback. Our study also included the scoring of perceived quality for all of the different latency conditions. The perceived quality dropped significantly between the 70 and 100 ms latency conditions when the feedback modality was tactile or audio, and between 100 and 150 ms when the feedback modality was visual. When the latency was 300 ms, the quality of the buttons was rated significantly lower than in all of the other latency conditions for all feedback modalities, suggesting that a long latency between a touch on the screen and feedback is problematic for users. Together with the PSS and these quality ratings, a 75% threshold was established to define a guideline for the recommended latency range between touch and feedback. Our guideline suggests that tactile feedback latency should be between 5 and 50 ms, audio feedback latency between 20 and 70 ms, and visual feedback latency between 30 and 85 ms. Using these values will ensure that users perceive the feedback as simultaneous with the finger's touch; they also ensure that users do not perceive reduced quality. These results will guide engineers and designers of touchscreen interactions by showing the trade-offs between latency and user preference and the effects that their choices might have on the quality of the interactions and feedback they design.
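
As a simple way to apply the guideline, the sketch below checks a measured touch-to-feedback latency against the recommended ranges reported in the abstract; the helper itself is illustrative and not part of the paper.

```python
# Design-time check against the paper's recommended touch-to-feedback latency
# ranges: tactile 5-50 ms, audio 20-70 ms, visual 30-85 ms.

GUIDELINE_MS = {
    "tactile": (5, 50),
    "audio":   (20, 70),
    "visual":  (30, 85),
}

def within_guideline(modality: str, measured_latency_ms: float) -> bool:
    """True if a measured feedback latency falls inside the recommended range."""
    low, high = GUIDELINE_MS[modality]
    return low <= measured_latency_ms <= high

if __name__ == "__main__":
    for modality, latency in (("tactile", 35), ("audio", 90), ("visual", 60)):
        status = "ok" if within_guideline(modality, latency) else "outside guideline"
        print(modality, latency, "ms ->", status)
```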


world haptics conference | 2011

The effect of tactile feedback latency in touchscreen interaction

Topi Kaaresoja; Emilia Anttila; Eve E. Hoggan

Touchscreens are becoming more and more popular, especially in mobile devices. There is also clear evidence of the benefits of tactile feedback in touchscreen interaction. However, the effect of the evident latency in interaction has been completely neglected in earlier investigations of touchscreen interaction. In this study we examined the effect of tactile feedback latencies on the usability of a touchscreen keypad. We used a realistic use case for number and character keypads: users entered three-number sequences and short sentences using the virtual buttons on the touch display. The experiments differed from each other in terms of the tactile feedback type (press only, or press and release) and the keypad layout (number or QWERTY). The results were unexpected, but consistent across all three experiments: performance did not drop significantly within the latency values used. However, the users rated the keypad with the shortest feedback latency as more pleasant to use than the others. We can conclude that latency makes the user experience worse, even though performance does not decrease significantly.


international conference on human-computer interaction | 2011

Playing with tactile feedback latency in touchscreen interaction: two approaches

Topi Kaaresoja; Eve E. Hoggan; Emilia Anttila

A great deal of research has investigated the potential parameters of tactile feedback for virtual buttons. However, these studies do not take the possible effects of feedback latencies into account. Therefore, this research investigates the impact of tactile feedback delays on touchscreen keyboard usage. The first experiment investigated four tactile feedback delay conditions during a number entry task. The results showed that keypads with a constant delay (18 ms) and the smallest feedback delay variation were faster to use and produced fewer errors than conditions with wider delay variability. The experiment also produced an unexpected finding: users seemed to perceive buttons with longer delays as heavier, requiring greater force when pressing. Therefore another experiment was conducted to investigate this phenomenon. Seven delay conditions were tested using a magnitude estimation method. The results indicate that different latencies can be used to represent tactile weight in touchscreen interaction.


human factors in computing systems | 2014

Dynamic edge: finding eyes-free controls on orientation-agnostic devices

Johan Kildal; Teemu Ahmaniemi; Topi Kaaresoja

Clean and minimalistic industrial designs dominate current multi-device ecosystems. One intended feature of such designs is that a device can be picked up and used in any orientation. However, the persistent presence of a few physical buttons forces most users to look for them by rotating the device before using it. We propose that tangible but virtual controls could appear where the user expects them to be when the device is picked up. This would support the intended industrial design. In a user study, we evaluate two methods for synthesizing such controls (continuous and discrete), implemented in a functional prototype. While both were found to be highly usable, an optimal implementation would be a hybrid of the two, in which continuous feedback supports locating the exact position of the control; then either method could be used to obtain the best user experience for button pressing, depending on the use case.
