Jani Väyrynen
University of Oulu
Publication
Featured research published by Jani Väyrynen.
Mobile and Ubiquitous Multimedia | 2015
Jonna Häkkilä; Farnaz Vahabpour; Ashley Colley; Jani Väyrynen; Timo Koskela
Until today, mobile computing has been very much confined to conventional computing form factors, i.e. laptops, tablets and smartphones, which have achieved de facto design standards in appearance and shape. However, wearable devices are emerging, and glasses in particular are an appealing form factor for future devices. Currently, although companies such as Google have productized a solution, little user research and design exploration has been published on either user preferences or the technology. We set out to explore design directions for smart glasses through user-research-grounded use cases and design alternatives. We describe our user research utilizing a smart glasses design probe in an experience sampling method study (n=12), and present a focus group based study (n=14) providing results on perceptions of alternative industrial designs for smart glasses.
International Symposium on Pervasive Displays | 2015
Ashley Colley; Jani Väyrynen; Jonna Häkkilä
In this paper, we explore a novel interaction technique for the automotive domain that distinguishes between different fingers when interacting with a touch screen, and compare it against standard and multi-finger gesture interaction. We conducted a pilot test (n=6) and a final user evaluation of the interaction techniques (n=15) in an in-car context. We report that users subjectively found both alternative interaction techniques to require less visual attention than normal touch screen interaction. Additionally, multi-finger interaction using 4 fingers simultaneously was found challenging by many users. Our approach aims to provide alternative interaction methods for touch screen UIs in cars that reduce the amount of attention required for the interaction, and hence reduce distraction from the concurrent driving task.
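The core idea of finger-specific touch input can be illustrated with a minimal sketch. This is not the paper's sensing technique: it assumes some upstream step has already identified which finger produced the touch, and the action names are hypothetical.

```python
# Illustrative sketch of finger-specific touch dispatch (assumption:
# an upstream step already identifies which finger produced a touch;
# the paper's actual sensing technique is not reproduced here).

ACTIONS = {
    "index":  "select item under touch",
    "middle": "adjust volume",
    "ring":   "skip track",
}

def on_touch(finger: str, x: float, y: float) -> str:
    """Map an identified finger to an in-car UI action, so the same
    screen location can trigger different commands without looking."""
    return ACTIONS.get(finger, "ignore touch")

print(on_touch("middle", 0.4, 0.7))  # -> 'adjust volume'
```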
International Conference on Human-Computer Interaction | 2015
Ashley Colley; Jani Väyrynen; Jonna Häkkilä
We present a technique that classifies users’ age group, i.e., child or adult, from touch coordinates captured on touch-screen devices. Our technique delivered 86.5% accuracy (user-independent) on a dataset of 119 participants (89 children aged 3 to 6) when classifying each touch event one at a time, and up to 99% accuracy when using a window of 7+ consecutive touches. Our results establish that it is possible to reliably classify a smartphone user on the fly as a child or an adult using only basic data about their touches, and will inform new, automatically adaptive interfaces for touch-screen devices.
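The windowed-classification idea can be sketched as follows: a per-touch classifier votes child or adult, and a majority vote over a sliding window of recent touches gives the final label. The features and the placeholder decision rule here are assumptions, not the paper's trained model.

```python
from collections import deque

# Minimal sketch of windowed touch classification (illustrative only;
# the per-touch model and features are assumptions, not the paper's).

def classify_touch(x, y, pressure, size):
    """Hypothetical per-touch classifier returning 'child' or 'adult'.
    A real model would be trained on labelled touch data."""
    # Placeholder rule: larger, firmer touches lean 'adult'.
    return "adult" if size > 0.5 and pressure > 0.4 else "child"

def windowed_label(touches, window=7):
    """Majority vote over the last `window` per-touch predictions."""
    votes = deque(maxlen=window)
    for t in touches:
        votes.append(classify_touch(*t))
    if not votes:
        return None
    return max(set(votes), key=list(votes).count)

# Example: seven touches as (x, y, pressure, size)
touches = [(120, 340, 0.6, 0.7)] * 7
print(windowed_label(touches))  # -> 'adult'
```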
Nordic Conference on Human-Computer Interaction | 2014
Ashley Colley; Olli Koskenranta; Jani Väyrynen; Leena Ventä-Olkkonen; Jonna Häkkilä
Mobile projection offers an interesting technology for creating displays on any surface without a situated screen. In this paper, we investigate two concepts that use handheld projection to see into other places through a virtual window. First, we present a projector-phone based prototype which, when pointed at the walls of a room, reveals images and a video stream from the physical space on the other side of the wall. Second, we present a novel handheld dual-display virtual reality browser that opens a virtual window to a remote location. This prototype combines two displays: a screen and a projected display. Both concepts were evaluated in user studies (n=22 and n=23). We report, for example, that mobile projector based browsing was considered more fun and inspiring than a screen-and-mouse format, and that the horizon of the projected image should be kept level when browsing.
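One way to realise the horizon-levelling finding is to counter-rotate the rendered frame with the device roll estimated from gravity. The sketch below is an assumption about how such a correction could be done, not the prototype's code, and the axis convention depends on how the projector is held.

```python
import math

# Sketch: keep the projected image's horizon level by counter-rotating the
# rendered frame with the device roll estimated from the accelerometer
# (illustrative assumption, not the prototype's actual implementation).

def roll_from_gravity(ax: float, az: float) -> float:
    """Roll angle (radians) about the projection axis, from accelerometer x/z."""
    return math.atan2(ax, az)

def frame_rotation_deg(ax: float, az: float) -> float:
    """Rotation to apply to the projected frame so its horizon stays level."""
    return -math.degrees(roll_from_gravity(ax, az))

print(round(frame_rotation_deg(1.7, 9.66), 1))  # device rolled ~10 deg -> rotate frame -10 deg
```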
Human-Computer Interaction with Mobile Devices and Services | 2017
Ismo Alakärppä; Elisa Susanna Jaakkola; Jani Väyrynen; Jonna Häkkilä
We present a concept, prototype and in-the-wild evaluation of a mobile augmented reality (AR) application in which physical items from nature are used as AR markers. By blending the physical and digital, AR technology has the potential to create an enhanced learning experience compared to paper-based solutions and conventional mobile applications. Our prototype, an application running on a tablet computer, uses natural markers such as leaves and pinecones in a game-like nature quiz. The system was evaluated using interviews with and observations of 6- to 12-year-old children (n=11) who played the game, as well as focus group discussions with play club counsellors (n=4) and primary school teachers (n=7). Our salient findings suggest that the concept has sound potential in its mixture of physical activity and educational elements in an outdoor context. In particular, teachers found the use of natural objects to be an appealing approach and a factor contributing to the learning experience.
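The quiz logic around a natural marker can be sketched as below. The recogniser is a hypothetical stand-in for whatever marker or image recognition the prototype uses, and the quiz content is invented for illustration.

```python
# Minimal sketch of the game-like nature quiz logic (illustrative;
# `recognise_object` stands in for the prototype's marker recognition).

QUIZ = {
    "pinecone": "Which tree did this pinecone fall from?",
    "birch_leaf": "Name the tree this leaf belongs to.",
}

def recognise_object(camera_frame) -> str:
    """Hypothetical recogniser returning a label for the physical item."""
    return "pinecone"  # placeholder result

def next_question(camera_frame):
    """Use a recognised natural object as the 'marker' that triggers a quiz item."""
    label = recognise_object(camera_frame)
    return QUIZ.get(label, "Find another natural object to scan!")

print(next_question(camera_frame=None))
```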
International Symposium on Pervasive Displays | 2015
Ashley Colley; Lasse Virtanen; Jani Väyrynen; Jonna Häkkilä
This paper presents a novel approach to guiding people in how to interact with touch screen based public displays. In our approach, we place an additional layer of transparent plastic, with laser-cut holes of different shapes, on top of the public display. The holes restrict the touch input area and provide physical guidance to the user about the interactive elements on the screen. This enables interaction styles with the screen that are not practical in the unguided case. Moreover, the guiding plastic layer can easily be replaced with another one with a different pattern. This provides opportunities to apply different interaction templates, e.g. according to the display's location or users.
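On the software side, each interchangeable template could be represented as a set of hole regions: touches inside a hole map to a UI element, touches elsewhere are blocked by the plastic and never arrive. The region names and coordinates below are assumptions for illustration, not the paper's implementation.

```python
# Sketch of representing an overlay template in software (illustrative;
# element names and coordinates are assumptions). Each laser-cut hole
# becomes a touch-accepting rectangle in normalised screen coordinates.

TEMPLATE_A = {
    "next": (0.80, 0.40, 0.95, 0.60),   # (x0, y0, x1, y1)
    "play": (0.40, 0.40, 0.60, 0.60),
}

def hit_test(template, x, y):
    """Return the UI element whose hole contains the touch, or None."""
    for name, (x0, y0, x1, y1) in template.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

print(hit_test(TEMPLATE_A, 0.85, 0.5))  # -> 'next'
print(hit_test(TEMPLATE_A, 0.10, 0.5))  # -> None (area covered by the plastic layer)
```

Swapping the physical layer then corresponds to swapping the template dictionary, which is what makes location- or user-specific interaction templates cheap to deploy.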
International Symposium on Pervasive Displays | 2017
Jonna Häkkilä; Ashley Colley; Paula Roinesalo; Tuomas Lappalainen; Inka Rantala; Jani Väyrynen
We explore a concept of mobile augmented reality (AR) use for wellness data. We are especially interested in using aesthetic wearables as markers for AR applications. We present a focus group based user study, and a concept design and prototype of a mobile AR based wellness wearable interface. The work extends the body of research on wellness wearables in an under-explored design direction.
Mobile and Ubiquitous Multimedia | 2016
Jani Väyrynen; Ashley Colley; Jonna Häkkilä
This paper describes the use of a head mounted display as a design tool to gain a better understanding of the issues faced by visually impaired people. The Oculus Rift Head Mounted Display (HMD) was used to navigate through a virtual 3D city model whilst conducting wayfinding and target location tasks. In addition to the baseline of normal vision, we simulated visual disabilities (macular degeneration, cataracts, glaucoma and myopia), such that the user experienced them in the first person. We evaluated the system with 14 design students, who found it useful and helpful in understanding the challenges faced by the visually impaired. The method can be applied to the early design phases of architectural and space design, where 3D models are increasingly commonly used.
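Impairment simulations of this kind are commonly approximated as image-space filters applied to each rendered frame. The sketch below shows two such filters, a peripheral-vision mask for glaucoma and a crude blur for cataracts; it illustrates the general approach only and is not the prototype's rendering pipeline.

```python
import numpy as np

# Sketch of two impairment simulations as per-frame image filters
# (illustration of the general approach, not the prototype's shaders).

def simulate_glaucoma(frame: np.ndarray, radius_frac: float = 0.3) -> np.ndarray:
    """Keep only a central circular field of view; black out the periphery."""
    h, w = frame.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = h / 2, w / 2
    mask = ((yy - cy) ** 2 + (xx - cx) ** 2) <= (radius_frac * min(h, w)) ** 2
    out = frame.copy()
    out[~mask] = 0
    return out

def simulate_cataracts(frame: np.ndarray, kernel: int = 15) -> np.ndarray:
    """Crude box blur standing in for reduced contrast and acuity."""
    pad = kernel // 2
    padded = np.pad(frame, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    out = np.zeros_like(frame, dtype=np.float32)
    for dy in range(kernel):
        for dx in range(kernel):
            out += padded[dy:dy + frame.shape[0], dx:dx + frame.shape[1]]
    return (out / kernel ** 2).astype(frame.dtype)

frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
print(simulate_glaucoma(frame).shape, simulate_cataracts(frame).shape)
```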
International Symposium on Pervasive Displays | 2017
Juho Rantakari; Jani Väyrynen; Ashley Colley; Jonna Häkkilä
In this paper, we investigate the use of stereoscopic 3D (S3D) displays for presenting multilevel map based information to improve indoor navigation task performance in 3D spaces, such as multi-floor buildings. Current visualizations of multilevel 3D spaces map the floor levels to flat 2D presentations, which do not directly represent the 3D space. We present a laboratory based user study (n=17), where the participants completed two types of tasks with an S3D device: searching for a route through an S3D maze, and assessing different visualization techniques for S3D indoor map user interfaces (UIs). The findings suggest that wayfinding tasks can be completed more quickly with the S3D approach than with a 2D representation where floor levels are shown side-by-side.
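The basic difference from the side-by-side 2D layout is that each floor level can be given its own stereo disparity, so the floors are perceived at different depths on the S3D display. The sketch below illustrates that idea with an assumed per-floor disparity value; it is not the study's renderer.

```python
# Sketch: assign each floor level a different stereo disparity so floors
# appear stacked in depth on an S3D display (illustrative assumption only).

def eye_offsets(floor_index: int, disparity_per_floor: float = 4.0):
    """Horizontal pixel shifts for the left/right eye images of one floor plan."""
    d = floor_index * disparity_per_floor
    return -d / 2, +d / 2   # (left-eye shift, right-eye shift)

for floor in range(3):
    left, right = eye_offsets(floor)
    print(f"floor {floor}: left eye {left:+.1f}px, right eye {right:+.1f}px")
```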
Human Factors in Computing Systems | 2017
Markus Löchtefeld; Tuomas Lappalainen; Jani Väyrynen; Ashley Colley; Jonna Häkkilä
We investigate the use of thermal feedback, i.e. the feeling of warmth and cold, as an output mechanism for hand-held device user interfaces (UIs). In a prototype implementation, we enhanced a console game controller with thermal elements positioned under the user's fingertips. The prototype was evaluated using a simple video game, where the user was required to locate targets based on output cues. In a user study (n=21), the performance and user experience of visual, vibrotactile and thermal forms of feedback in the game were compared. Our salient findings suggest that thermal UI feedback is suited to presenting ambient information cues or creating an atmosphere.
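A thermal cue of this kind amounts to mapping game state onto a temperature setpoint for the thermal elements. The sketch below assumes a warmer-when-closer mapping and a hypothetical hardware call; neither is taken from the paper.

```python
# Sketch of driving a thermal cue from game state (illustrative; the
# warmer-when-closer mapping and `set_thermal_element` are assumptions,
# not the prototype's controller firmware).

NEUTRAL_C = 32.0      # approximately skin-neutral temperature
MAX_DELTA_C = 6.0     # keep warm/cold cues within a comfortable range

def cue_to_temperature(target_proximity: float) -> float:
    """Map proximity to the target in [0, 1] to a setpoint in degrees Celsius."""
    target_proximity = max(0.0, min(1.0, target_proximity))
    return NEUTRAL_C + (2 * target_proximity - 1) * MAX_DELTA_C

def set_thermal_element(celsius: float):
    print(f"set thermal element to {celsius:.1f} C")  # placeholder for hardware I/O

set_thermal_element(cue_to_temperature(0.9))   # close to the target -> warm cue
set_thermal_element(cue_to_temperature(0.1))   # far from the target -> cool cue
```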