Publication


Featured research published by Chris Raymaekers.


IEEE International Workshop on Horizontal Interactive Human Computer Systems | 2008

IntuPaint: Bridging the gap between physical and digital painting

Peter Vandoren; T. Van Laerhoven; Luc Claesen; Johannes Taelman; Chris Raymaekers; F. Van Reeth

This paper presents a novel interface for a digital paint system: IntuPaint. A tangible interface for a digital paint easel has been developed, using an interactive surface and electronic brushes with a tuft of bristles. The flexible brush bristles conduct light by means of total internal reflection inside the individual bristles. This makes it possible to capture subtle paint nuances of the artist in a way that was not possible with previous technologies. The approach provides natural interaction and enables detailed tracking of specific brush strokes. Additional tangible and finger-based input techniques allow for specific paint operations or effects. IntuPaint also offers an extensive model-based paint simulation, rendering realistic paint results. The reality-based approach to combining user interface and paint software is a new step toward bridging the gap between physical and digital painting, as initial user tests demonstrate.
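
To make the imaging-based capture concrete, here is a minimal sketch, not the authors' implementation, of how an imaged brush footprint could be blended into a digital canvas; it assumes the surface delivers a grayscale frame in which lit bristle tips appear bright, and all parameter names are illustrative.

```python
import numpy as np

def deposit_footprint(canvas, footprint, pigment, flow=0.2, threshold=0.1):
    """Blend pigment into the canvas wherever the imaged footprint is bright.

    canvas    -- H x W x 3 float array in [0, 1]
    footprint -- H x W float array in [0, 1], one camera frame of the brush
    pigment   -- length-3 RGB colour of the loaded paint
    flow      -- how much paint a fully lit bristle deposits per frame
    threshold -- intensity below which a pixel is treated as background noise
    """
    mask = np.clip(footprint - threshold, 0.0, None) / (1.0 - threshold)
    alpha = (flow * mask)[..., None]              # per-pixel deposit strength
    return (1.0 - alpha) * canvas + alpha * np.asarray(pigment, dtype=float)

# Stamp one synthetic, roughly round footprint onto a white canvas.
canvas = np.ones((64, 64, 3))
yy, xx = np.mgrid[:64, :64]
footprint = np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / 150.0)
canvas = deposit_footprint(canvas, footprint, pigment=(0.1, 0.2, 0.8))
```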


Interactive Tabletops and Surfaces | 2009

FluidPaint: an interactive digital painting system using real wet brushes

Peter Vandoren; Luc Claesen; Tom Van Laerhoven; Johannes Taelman; Chris Raymaekers; Eddy Flerackers; Frank Van Reeth

This paper presents FluidPaint, a novel digital paint system using real wet brushes. A new interactive canvas that registers brush footprints and paint strokes with high precision has been developed. It is based on real-time imaging of brushes and other painting instruments, as well as real-time co-located rendering of the painting results. This new painting user interface enhances the user experience and the artist's expressiveness. User tests demonstrate the intuitive nature of FluidPaint, which naturally integrates interface elements of traditional painting into a digital paint system.


IEEE International Conference on Rehabilitation Robotics | 2009

Arm training in Multiple Sclerosis using Phantom: Clinical relevance of robotic outcome measures

Peter Feys; Geert Alders; Domien Gijbels; Joan De Boeck; Tom De Weyer; Karin Coninx; Chris Raymaekers; Veronik Truyens; Patric Groenen; Kenneth Meijer; Hans Savelberg; Bert O. Eijnde

Upper limb weakness due to Multiple Sclerosis has a major negative effect on the functional activities of the patient. Promising developments in the field of rehabilitation robotics may enable additional exercise. This study aims to investigate which types of robotic outcome measures are clinically relevant, in preparation for the evaluation of intervention studies. Within this context, appropriate movement tasks and tests for the haptic PHANTOM end-effector robot were designed in a virtual environment. These tasks focused on spatial accuracy, object manipulation and speed. Outcome measures were: 1) virtual movement tests, recorded by the robot to quantify motor control; 2) clinical outcome measures such as the Motricity Index, Jamar and MicroFET hand-held dynamometers to evaluate muscle strength, and the Nine Hole Peg Test, Purdue Pegboard, ARAT and TEMPA to assess upper limb function and manual dexterity. Ten healthy controls performed the virtual movement tasks using the PHANTOM as interface. Twenty-one MS subjects with upper limb dysfunction caused by muscle weakness were included in an interventional training study. Pearson correlations were calculated at baseline between the performance on the three virtual movement tasks and the clinical tests at impairment and activity level. The virtual movement tests discriminated between healthy controls and MS patients with hand dysfunction. In the MS patient group, no significant correlations were found between muscle strength tests and virtual movement tasks, while significant correlations were found mainly between specific functional measures (specifically the ARAT and Purdue Pegboard test) and the virtual movement tasks.
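
As an illustration of the baseline analysis step described above, a short sketch of computing Pearson correlations between a robotic outcome measure and clinical scores; the arrays hold placeholder values, not data from the study.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-patient scores (one entry per MS patient) -- illustrative only.
virtual_task_time = np.array([12.1, 9.8, 15.3, 11.0, 13.7, 10.4])   # seconds
purdue_pegboard   = np.array([10, 13, 7, 12, 9, 11])                # pegs placed
jamar_grip        = np.array([22.0, 30.5, 18.2, 27.0, 20.1, 25.4])  # kg

for name, clinical in [("Purdue Pegboard", purdue_pegboard),
                       ("Jamar grip strength", jamar_grip)]:
    r, p = pearsonr(virtual_task_time, clinical)
    print(f"{name}: r = {r:.2f}, p = {p:.3f}")
```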


Proceedings of the 2007 Workshop on Multimodal Interfaces in Semantic Interaction | 2007

Introducing semantic information during conceptual modelling of interaction for virtual environments

Lode Vanacken; Chris Raymaekers; Karin Coninx

The integration of semantic information in virtual environment interaction is still mostly ad hoc. The framework is usually designed so that it incorporates the semantic information, which can then be exploited during interaction. We introduce a model-based user interface approach that introduces semantic information, represented using ontologies, during the modelling phase. This semantic information itself is created during the design of the virtual world. The approach we propose is system-independent and allows the semantic information to be chosen and adapted freely, without considering the underlying framework. We incorporate semantics in NiMMiT, our notation for multimodal interaction modelling. We present two case studies that validate the flexibility of our approach.
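
A minimal sketch of the underlying idea (not the NiMMiT notation itself): virtual-world objects carry ontology concepts assigned at design time, and an interaction technique queries those concepts without knowing how the scene is implemented; all class and concept names are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class SceneObject:
    name: str
    concepts: set = field(default_factory=set)   # ontology concepts attached at design time

def selectable(objects, required_concept):
    """Return only the objects whose semantics allow the current interaction."""
    return [o for o in objects if required_concept in o.concepts]

world = [
    SceneObject("table",  {"Furniture"}),
    SceneObject("mug",    {"Grabbable", "Container"}),
    SceneObject("poster", {"Readable"}),
]

# During a grab interaction, only semantically grabbable objects are offered.
print([o.name for o in selectable(world, "Grabbable")])   # ['mug']
```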


DSVIS'05: Proceedings of the 12th International Conference on Interactive Systems: Design, Specification, and Verification | 2005

A model-based design process for interactive virtual environments

Erwin Cuppens; Chris Raymaekers; Karin Coninx

Nowadays, interactive systems are not limited to the desktop. On the one hand they are deployed on handheld and embedded devices, and on the other hand they evolve into interactive virtual environments controlled by direct-manipulation interaction techniques. However, the development of these virtual environment user interfaces is not a straightforward process and is thus not easily accessible to non-programmers. In this paper, we envision a model-based design process for these highly interactive applications, in order to bridge the gap between the designer and the programmer of the application. The process is based on the requirements of both model-based user interface development processes and virtual environment development tools and toolkits. To evaluate the envisioned approach, a tool was created that supports the described process, and a case study has been performed.


Multimedia Tools and Applications | 2011

Adaptation in virtual environments: conceptual framework and user models

Johanna Renny Octavia; Chris Raymaekers; Karin Coninx

When interacting in a virtual environment, users are confronted with a number of interaction techniques. These interaction techniques may complement each other, but in some circumstances they can be used interchangeably. This makes it difficult for the user to determine which interaction technique to use. Furthermore, the use of multimodal feedback, such as haptics and sound, has proven beneficial for some, but not all, users. This complicates the development of such a virtual environment, as designers are not sure about the implications of adding interaction techniques and multimodal feedback. A promising approach to solving this problem lies in the use of adaptation and personalization. By incorporating knowledge of a user's preferences and habits, the user interface should adapt to the current context of use. This could mean that only a subset of all possible interaction techniques is presented to the user. Alternatively, the interaction techniques themselves could be adapted, e.g. by changing the sensitivity or the nature of the feedback. In this paper, we propose a conceptual framework for realizing adaptive personalized interaction in virtual environments. We also discuss how to establish, verify and apply a user model, which forms the first, important step in implementing the proposed conceptual framework. This study results in general and individual user models, which are then verified to benefit users interacting in virtual environments. Furthermore, we investigate how users react to a specific type of adaptation in virtual environments (i.e. switching between interaction techniques). When such an adaptation is integrated in a virtual environment, users respond positively to it: their performance improves significantly and their level of frustration decreases.
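
One possible adaptation rule along these lines, as a hedged sketch rather than the paper's user model: keep a running per-user estimate of completion time for each interaction technique and switch only when an alternative is clearly better; the technique names and the margin are illustrative.

```python
class UserModel:
    def __init__(self, techniques, margin=0.15):
        self.times = {t: [] for t in techniques}   # observed completion times per technique
        self.margin = margin                       # required relative improvement before switching

    def record(self, technique, completion_time):
        self.times[technique].append(completion_time)

    def best_technique(self, current):
        means = {t: sum(v) / len(v) for t, v in self.times.items() if v}
        if current not in means:
            return current
        best = min(means, key=means.get)
        # Only adapt when the alternative is clearly (by `margin`) faster.
        return best if means[best] < (1.0 - self.margin) * means[current] else current

model = UserModel(["ray-casting", "virtual hand"])
for t in (4.2, 4.5, 4.1):
    model.record("ray-casting", t)
for t in (2.9, 3.1):
    model.record("virtual hand", t)
print(model.best_technique("ray-casting"))   # switches to 'virtual hand'
```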


Symposium on 3D User Interfaces | 2006

Using the Non-Dominant Hand for Selection in 3D

J. De Boeck; T. De Weyer; Chris Raymaekers; Karin Coninx

Although 3D virtual environments are designed to provide the user with an intuitive interface to view or manipulate highly complex data, current solutions are still not ideal. In order to make the interaction as natural as possible, metaphors are used that allow users to apply their everyday knowledge in the generated environment. Many experiments describing new or improved metaphors can be found in the literature. In our earlier work, we presented the 'Object In Hand' metaphor [4], which addresses some problems regarding the access of objects and menus in a 3D world. Although the metaphor turned out to be very promising, the solution shifted the problem towards a selection problem. Based on the insights of our previous work, we believe the non-dominant hand can play a role in solving this problem. In this paper we formally compare three well-known selection metaphors and check their suitability for use with the non-dominant hand, in order to seamlessly integrate the most suitable selection metaphor within the 'Object In Hand' metaphor.


Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems | 2005

An empirical approach for the evaluation of haptic algorithms

Chris Raymaekers; J. De Boeck; Karin Coninx

The number of haptic algorithms has grown over the past few years. However, little research has been performed on evaluating these algorithms. This paper discusses how force-feedback algorithms can be empirically evaluated for correctness and performance.
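
As an illustration of the performance side of such an evaluation, a sketch that times a force computation against the roughly 1 kHz update rate (1 ms per frame) commonly required for stable force feedback; the penalty-based plane contact below is a stand-in, not one of the evaluated algorithms.

```python
import time
import numpy as np

def penalty_force(probe_pos, surface_y=0.0, stiffness=800.0):
    """Simple penalty-based contact force pushing the probe out of a horizontal plane."""
    penetration = surface_y - probe_pos[1]
    if penetration <= 0.0:
        return np.zeros(3)
    return np.array([0.0, stiffness * penetration, 0.0])

N = 10_000
probe = np.array([0.0, -0.002, 0.0])              # probe 2 mm below the surface
start = time.perf_counter()
for _ in range(N):
    penalty_force(probe)
per_update = (time.perf_counter() - start) / N
print(f"{per_update * 1e6:.1f} us per update "
      f"({'within' if per_update < 1e-3 else 'exceeds'} the 1 ms budget)")
```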


Database and Expert Systems Applications | 2010

Data Management for Multimodal Rehabilitation Games

Sofie Notelaers; Tom De Weyer; Chris Raymaekers; Karin Coninx; Hanne Bastiaens; Ilse Lamers

Rehabilitation games form a promising type of serious game. The goal is to provide patients with personalized training. In order to realize this, much information must be handled, including general information about the games and parameters regarding modalities, as well as specific information about the patients and their therapy sessions. Therapists must be able to specify the values of all parameters involved. However, the different levels of these parameters should be grouped in a sensible manner so as not to overwhelm therapists with too much, overly detailed information. This paper discusses a system for the rehabilitation of Multiple Sclerosis patients and explains how this information can be managed by the therapists.
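
A hedged sketch of the kind of parameter grouping argued for above: defaults live with the game, patient-specific settings with the patient, and one session combines the two, so a therapist only edits the level that matters; all class, field and game names are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class GameParameters:              # defaults shared by everyone playing this game
    name: str
    difficulty: int = 1
    haptic_feedback: bool = True

@dataclass
class PatientProfile:              # per-patient settings entered by the therapist
    patient_id: str
    max_reach_cm: float
    overrides: dict = field(default_factory=dict)

@dataclass
class TherapySession:              # what one concrete session actually uses
    game: GameParameters
    patient: PatientProfile

    def effective_parameters(self):
        params = {"difficulty": self.game.difficulty,
                  "haptic_feedback": self.game.haptic_feedback,
                  "max_reach_cm": self.patient.max_reach_cm}
        params.update(self.patient.overrides)      # patient-level settings take precedence
        return params

session = TherapySession(GameParameters("reach-and-grab"),
                         PatientProfile("MS-017", max_reach_cm=35.0,
                                        overrides={"difficulty": 2}))
print(session.effective_parameters())
```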


International Workshop on Semantic Media Adaptation and Personalization | 2007

Runtime Personalization of Multi-Device User Interfaces: Enhanced Accessibility for Media Consumption in Heterogeneous Environments by User Interface Adaptation

Carl Bruninx; Chris Raymaekers; Kris Luyten; Karin Coninx

The diversity of end-user devices, in combination with a growing user base, poses important challenges for providing easy access to the huge amount of content and services currently available. Each device has its typical set of capabilities and characteristics that must be taken into account to create an appropriate user interface providing interactive access to multimedia data and services. Furthermore, end-users have their own specific requirements that influence the accessibility of data and services for individual access. The approach we present in this paper is geared towards the idea of universal access to interactive multimedia data and services for everyone, independent of user characteristics or end-user device capabilities. For this purpose, we combine user and device models with high-level user interface description languages in order to decouple the interface presentation from its platform, and to generate the most suitable interface on a per-user, per-device basis, making use of the semantics provided by the user and device profiles.
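
A minimal sketch of the selection step implied above (not the authors' system): a concrete presentation for an abstract UI element is chosen by combining a device model with a user model; the models and rules here are illustrative only.

```python
def choose_presentation(element, device, user):
    """Map an abstract UI element to a concrete widget for this device and user."""
    if element == "video_list":
        widget = "thumbnail_list" if device["screen_inches"] < 7 else "preview_grid"
        if user.get("low_vision"):
            widget += "_large_text"
        return widget
    return "generic_" + element

tv    = {"screen_inches": 42, "input": "remote"}
phone = {"screen_inches": 4.5, "input": "touch"}

print(choose_presentation("video_list", phone, {"low_vision": True}))  # thumbnail_list_large_text
print(choose_presentation("video_list", tv, {}))                       # preview_grid
```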

Collaboration


Dive into Chris Raymaekers's collaborations.

Top Co-Authors


Karin Coninx

Transnational University Limburg
