
Publication


Featured research published by Wim Fikkert.


GW '09: Proceedings of the 8th International Conference on Gesture in Embodied Communication and Human-Computer Interaction | 2009

Gestures for large display control

Wim Fikkert; Paul E. van der Vet; Gerrit C. van der Veer; Anton Nijholt

The hands are highly suited to interacting with large public displays. It is, however, not apparent which gestures come naturally for easy and robust use of the interface. We first explored how uninstructed users gesture when asked to perform basic tasks. Our subjects gestured with great similarity and readily produced gestures they had seen before, though not necessarily in a human-computer interface. In a second investigation, these and other gestures were rated by a hundred subjects. A gesture set for explicit command-giving to large displays emerged from these ratings. Notably, for a selection task, tapping the index finger in mid-air, as with a traditional mouse button, scored highest by far. It seems that the mouse has become a metaphor in everyday life.


International Symposium on Multimedia | 2010

User-Evaluated Gestures for Touchless Interactions from a Distance

Wim Fikkert; Paul E. van der Vet; Anton Nijholt

Very big displays are now commonplace, but interactions with them are limited and even poorly understood. Touch-based interactions have recently received a great deal of attention due to the popularity and low cost of these displays. Their direct extension, touchless interactions, has not. In this paper we evaluated gesture-based interactions with very big interactive screens to learn which gestures are suited and why. In other words, did ‘Minority Report’ get it right? We aim to discover to what extent these gesture interfaces are technology-driven and influenced by prototyped, commercial and fictive interfaces. A qualitative evaluation of a gesture interface for wall-sized displays is presented in which subjects experienced the interface while completing several simple puzzle tasks. We found that simple gestures based on the act of pressing buttons were the most intuitive.


Intelligent User Interfaces | 2010

Gestures in an Intelligent User Interface

Wim Fikkert; Paul E. van der Vet; Anton Nijholt

In this chapter we investigated which hand gestures are intuitive for controlling a large display multimedia interface from a user’s perspective. Over the course of two sequential user evaluations, we defined a simple gesture set that allows users to fully and intuitively control a large display multimedia interface. First, we evaluated numerous gesture possibilities for a set of commands that can be issued to the interface. These gestures were selected from the literature, science fiction movies and a previous exploratory study. Second, we implemented a working prototype with which users could interact, using both hands and the preferred hand gestures, with 2D and 3D visualizations of biochemical structures. We found that the gestures are influenced to a significant extent by the fast-paced developments in multimedia interfaces such as the Apple iPhone and the Nintendo Wii, and to no lesser degree by decades of experience with more traditional WIMP-based interfaces.


Lecture Notes in Computer Science | 2009

Gestures to Intuitively Control Large Displays

Wim Fikkert; Paul E. van der Vet; Han Rauwerda; Timo M. Breit; Anton Nijholt

Large displays are highly suited to supporting discussions in empirical science: they can present project results on a large digital surface to feed the discussion. This paper describes our approach to closely involving multidisciplinary omics scientists in the design of intuitive display control through hand gestures. The interface is based upon a gesture repertoire, and we describe how that repertoire was designed based on observations of, and scripted task experiments with, omics scientists.


Intelligent Technologies for Interactive Entertainment | 2009

Navigating a Maze with Balance Board and Wiimote

Wim Fikkert; Niek Hoeijmakers; Paul E. van der Vet; Anton Nijholt

Input from the lower body in human-computer interfaces can be beneficial, enjoyable and even entertaining when users are expected to perform tasks simultaneously. Users can navigate a virtual (game) world or even an (empirical) dataset while keeping their hands free to issue commands. We compared the Wii Balance Board to a hand-held Wiimote for navigating a maze and found that users completed this task more slowly with the Balance Board. However, the Balance Board was considered more intuitive, easier to learn and ‘much fun’.


Advances in Computer Entertainment Technology | 2009

FeelSound: interactive acoustic music making

Wim Fikkert; Michiel Hakvoort; Paul E. van der Vet; Anton Nijholt

FeelSound is a multi-user, multi-touch application for collaboratively composing acoustic music in an entertaining way. Simultaneous input from each of up to four users enables collaborative composing; the process as well as the resulting music is entertaining. Sensor-packed intelligent environments, varying from the home to schools and other public spaces, form the target location of FeelSound. Inhabitants of these environments can be creative and artistic through the act of composing music. We continue our previous research on entertainment through music [3] by bringing it to touch-sensitive tables. Composers stand around the FeelSound tabletop interface and create acoustic samples by touching the table with multiple fingers and hands. Each instrument is represented by virtual composer stones that a user touches. Upon touching such a stone, an input field is shown on which samples can be created by drawing shapes. Through user identification, multiple composers can create music samples simultaneously and compose these samples into a score. Each user may position any sample in the score, subject to social agreement from their fellow composers.


International Conference on Advanced Learning Technologies | 2006

Estimating the Gaze Point of a Student in a Driving Simulator

Wim Fikkert; D.K.J. Heylen; B. van Dijk; Anton Nijholt; J. Kuipers; A. Brugman

In this paper we discuss an approach to passively observing students in a driving simulator. The goal is to enhance the learning experience for students taking lessons in this simulator. To this end, a virtual driving instructor is provided with additional information consisting of the gaze behavior of its student. The gaze behavior is defined by estimated head locations and orientations. The learning experience is enhanced by providing added feedback to the student based on their observed behavior.


Lecture Notes in Computer Science | 2010

Gestures for Large Display Control

Wim Fikkert; Paul E. van der Vet; Gerrit C. van der Veer; Anton Nijholt; Stefan Kopp; Ipke Wachsmuth


Applied Physics Letters | 2008

Measuring Behavior using Motion Capture Symposium

Wim Fikkert; Herman van der Kooij; Zsófia Ruttkay; Herwin van Welbergen; A.J. Spink; M.R. Ballintijn; N.D. Bogers; F. Grieco; L.W.S. Loijens; L.P.J.J. Noldus; G. Smit; P.H. Zimmerman


International Journal of Arts and Technology | 2010

Fun and Efficiency of the Wii Balance Interface

Wim Fikkert; Niek Hoeijmakers; Paul E. van der Vet; Anton Nijholt

Collaboration


Dive into Wim Fikkert's collaborations.

Han Rauwerda

University of Amsterdam
