
Publication


Featured research published by Kris Luyten.


EHCI-DSVIS'04 Proceedings of the 2004 international conference on Engineering Human Computer Interaction and Interactive Systems | 2004

DynaMo-AID: a design process and a runtime architecture for dynamic model-based user interface development

Tim Clerckx; Kris Luyten; Karin Coninx

In recent years, considerable research effort has been devoted to user interfaces for pervasive computing. This paper presents DynaMo-AID, a design process and runtime architecture that provide design and runtime support for context-aware user interfaces. The process focuses on specifying the tasks the user and the application have to perform, together with other task-related entities such as dialog and presentation. We show how tasks, dialogs, and presentation can be modeled when the designer wants to develop context-sensitive user interfaces. Besides the design process, we present a runtime architecture that supports context-sensitive user interfaces. Pervasive user interfaces can change while the interactive application is running, due to a change of context or because a service becomes available to the application. We show that traditional models such as the task, environment, and dialog models have to be extended to tackle these new problems, which is why we provide modeling and runtime support for the design and development of context-sensitive user interfaces.


Archive | 2007

Task Models and Diagrams for Users Interface Design

Karin Coninx; Kris Luyten; Kevin A. Schneider

Invited Paper:
- Meta-User Interfaces for Ambient Spaces

Tool Support:
- Tool Support for Handling Mapping Rules from Domain to Task Models
- Towards Visual Analysis of Usability Test Logs Using Task Models

Model-Based Interface Development:
- Dialog Modeling for Multiple Devices and Multiple Interaction Modalities
- Model-Based Support for Specifying eService eGovernment Applications
- A Model-Based Approach to Develop Interactive System Using IMML

User Interface Patterns:
- PIM Tool: Support for Pattern-Driven and Model-Based UI Development
- Pattern-Based UI Design: Adding Rigor with User and Context Variables
- Error Patterns: Systematic Investigation of Deviations in Task Models
- Using an Interaction-as-Conversation Diagram as a Glue Language for HCI Design Patterns on the Web

Bridging the Gap: Driven by Models:
- An MDA Approach for Generating Web Interfaces with UML ConcurTaskTrees and Canonical Abstract Prototypes
- High-Level Modeling of Multi-user Interactive Applications
- Goals: Interactive Multimedia Documents Modeling

Task-Centered Design:
- Using Task Models for Cascading Selective Undo
- Exploring Interaction Space as Abstraction Mechanism for Task-Based User Interface Design

Multi-modal User Interfaces:
- Comparing NiMMiT and Data-Driven Notations for Describing Multimodal Interaction
- Incorporating Tilt-Based Interaction in Multimodal User Interfaces for Mobile Devices
- An HCI Model for Usability of Sonification Applications

Reflections on Tasks and Activities in Modeling:
- Non-functional User Interface Requirements Notation (NfRn) for Modeling the Global Execution Context of Tasks
- Requirements Elicitation and Elaboration in Task-Based Design Needs More Than Task Modelling: A Case Study
- Discovering Multitasking Behavior at Work: A Context-Based Ontology
- The Tacit Dimension of User Tasks: Elicitation and Contextual Representation

Context and Plasticity:
- The Comets Inspector: Towards Run Time Plasticity Control Based on a Semantic Network
- A Prototype-Driven Development Process for Context-Aware User Interfaces


CADUI | 2005

Generating Context-Sensitive Multiple Device Interfaces from Design

Tim Clerckx; Kris Luyten; Karin Coninx

This paper presents a technique that allows adaptive user interfaces, spanning multiple devices, to be rendered from the task specification at runtime, taking the context of use into account. The designer can specify a task model and its context-dependent parts using the ConcurTaskTrees notation, and deploy the user interface directly from that specification. By defining a set of context rules at design time, the appropriate context-dependent parts of the task specification are selected before the concrete interfaces are rendered. The context is resolved by the runtime environment and requires no manual intervention, so the same task specification can be deployed for several different contexts of use. Traditionally, a context-sensitive task specification only took into account a single, variable deployment device; this paper extends that approach to task specifications that can be executed by multiple co-operating devices.
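The mechanism of design-time context rules selecting a task-specification variant before rendering can be sketched in a few lines. The rule format, the variant names, and the `device`/`network` context attributes are assumptions for illustration, not the paper's actual rule language:

```python
# Hypothetical sketch: ordered context rules pick which context-dependent
# part of a task specification is deployed before concrete rendering.
def resolve_variant(rules, context):
    """Return the variant of the first rule whose condition holds."""
    for condition, variant in rules:
        if condition(context):
            return variant
    raise LookupError("no context rule matched")

rules = [
    (lambda ctx: ctx["device"] == "pda" and ctx["network"] == "offline", "cached-browse"),
    (lambda ctx: ctx["device"] == "pda", "mobile-browse"),
    (lambda ctx: True, "desktop-browse"),  # default variant
]

print(resolve_variant(rules, {"device": "pda", "network": "online"}))      # mobile-browse
print(resolve_variant(rules, {"device": "desktop", "network": "online"}))  # desktop-browse
```

Because the runtime environment evaluates the rules against the sensed context, the same task specification deploys differently on each device without manual intervention.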


CADUI | 2009

Design by Example of Graphical User Interfaces Adapting to Available Screen Size

Alexandre Demeure; Kris Luyten; Karin Coninx

Currently, it is difficult for a designer to create user interfaces that are of high aesthetic quality for a continuously growing range of devices with varied screen sizes. Most existing approaches use abstractions that only support form-based user interfaces. These user interfaces may be usable but are of low aesthetic quality. In this paper, we present a technique to design adaptive graphical user interfaces by example (i.e., user interfaces that can adapt to the target platform, the user, etc.), which can produce user interfaces of high aesthetic quality while reducing the development cost inherent to manual approaches. Designing adaptive user interfaces by example could lead to a new generation of design tools that put adaptive user interface development within the reach of designers as well as developers.


Archive | 2006

Designing Interactive Systems in Context: From Prototype to Deployment

Tim Clerckx; Kris Luyten; Karin Coninx

The ability to communicate with the (in)direct environment through other devices, and to observe that same environment, allows us to develop ambient intelligent applications that have knowledge of both the environment and how the applications are used. Despite existing support for developing this kind of software, gaps remain that make it difficult to create consistent, usable user interfaces. This paper discusses a technique that can be integrated into existing models and architectures and that supports the interface designer in creating consistent context-sensitive user interfaces. We present an architecture and methodology that allow context information to be used at two different levels, the dialogue and interdialogue levels, and that ensure the consistency of the interface is maintained when the context changes while the software is in use.
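The two-level idea can be sketched as a dispatcher that routes a context change either to an update within the current dialog or to a transition between dialogs. The `scope` field, dialog names, and data layout are invented for illustration, not the paper's architecture:

```python
# Hypothetical sketch: small context changes adjust the current dialog
# (dialogue level); larger ones switch dialogs (interdialogue level).
def handle_context_change(ui, change):
    """Route a context change to the dialogue or interdialogue level."""
    if change["scope"] == "dialog":
        # e.g. lighting changed: adapt settings of the dialog in use
        ui["current_dialog"]["settings"].update(change["data"])
    else:
        # e.g. user moved to another room: transition to another dialog
        ui["current_dialog"] = ui["dialogs"][change["data"]["dialog"]]
    return ui

ui = {"dialogs": {
    "overview": {"name": "overview", "settings": {}},
    "detail": {"name": "detail", "settings": {}},
}}
ui["current_dialog"] = ui["dialogs"]["overview"]

handle_context_change(ui, {"scope": "dialog", "data": {"contrast": "high"}})
print(ui["current_dialog"]["name"], ui["current_dialog"]["settings"])
# overview {'contrast': 'high'}
handle_context_change(ui, {"scope": "interdialog", "data": {"dialog": "detail"}})
print(ui["current_dialog"]["name"])  # detail
```

Keeping both kinds of change behind one dispatcher is one way to preserve interface consistency when the context shifts mid-use.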


international conference on e-health networking, applications and services | 2014

ReHoblet — A home-based rehabilitation game on the tablet

Marijke Vandermaesen; Karel Robert; Kris Luyten; Karin Coninx

We present ReHoblet, a physical rehabilitation game on tablets designed for use in a residential setting. ReHoblet trains two gross motor movements of the upper limbs, lifting (up-down) and transporting (left-right) the tablet, to control a simple platform game. Using its accelerometers and gyroscope, the tablet detects the user's movements and steers the interaction based on this data. A formative evaluation with five Multiple Sclerosis (MS) patients and their therapists showed high appreciation for ReHoblet. Patients stated they liked ReHoblet not only for improving their physical abilities, but also for training technology-related tasks. Based on the results, we reflect on tablet-based games in home-based rehabilitation.
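The mapping from raw motion-sensor readings to the two trained movements can be sketched as a simple classifier. The axis convention, threshold value, and function name are assumptions for illustration; the actual game would calibrate these per patient:

```python
def classify_movement(ax, ay, threshold=2.0):
    """Map accelerometer axes to the two trained gross motor movements:
    lifting (up/down) and transporting (left/right). Hypothetical sketch;
    ax/ay are accelerations in m/s^2 on the tablet's x and y axes."""
    if abs(ay) >= abs(ax):                 # lifting dominates
        if ay > threshold:
            return "up"
        if ay < -threshold:
            return "down"
    else:                                  # transporting dominates
        if ax > threshold:
            return "right"
        if ax < -threshold:
            return "left"
    return "rest"                          # below threshold: no game input

print(classify_movement(0.0, 3.5))   # up
print(classify_movement(-4.0, 1.0))  # left
print(classify_movement(0.5, -0.5))  # rest
```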


CADUI | 2002

Specifying User Interfaces for Runtime Modal Independent Migration

Kris Luyten; Tom Van Laerhoven; Karin Coninx; Frank Van Reeth

The usage of computing systems has evolved dramatically over the last few years. Starting from low-level procedural usage, in which a process for executing one or several tasks is carried out, computers now tend to be used in a problem-oriented way. Future computer usage will be centered around particular services rather than platforms or applications, and these services should be independent of the technology used to interact with them. This paper presents an approach that provides a uniform interface to such services, without any dependence on modality, platform, or programming language. Through general user interface descriptions expressed in XML and converted using XSLT, a uniform framework for runtime migration of user interfaces is presented. As a consequence, future services become easily extensible to all kinds of devices and modalities. As a proof of concept, an implementation is developed that converts, at runtime, a joystick in a 3D virtual environment into a 2D dialog-based user interface.
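The core step, transforming an abstract XML interface description into concrete widgets for one modality, can be sketched in Python. A small dictionary lookup stands in here for the XSLT stylesheet; the element names (`range`, `trigger`) and the widget mapping are invented for illustration:

```python
import xml.etree.ElementTree as ET

# Hypothetical modality-independent UI description, as in the XML approach.
ABSTRACT_UI = """
<interface>
  <range id="throttle" min="0" max="100"/>
  <trigger id="fire"/>
</interface>
"""

# Assumed mapping from abstract interactors to 2D dialog widgets; the real
# system expresses this as an XSLT transformation per target modality.
MAPPING_2D = {"range": "slider", "trigger": "button"}

def render_2d(xml_text):
    """Return (widget, id) pairs for a 2D dialog rendering of the UI."""
    root = ET.fromstring(xml_text)
    return [(MAPPING_2D[el.tag], el.get("id")) for el in root]

print(render_2d(ABSTRACT_UI))  # [('slider', 'throttle'), ('button', 'fire')]
```

Swapping in a different mapping (or stylesheet) retargets the same description to another modality, which is what makes runtime migration possible.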


robot and human interactive communication | 2016

Toward specifying Human-Robot Collaboration with composite events

Jan Van den Bergh; Fredy Cuenca Lucero; Kris Luyten; Karin Coninx

Human-robot collaboration is increasingly considered in manufacturing to better combine the strengths of humans and robots. Establishing this collaboration may require multi-modal interaction; input to and output from the robot can both use multiple channels, in sequence or in parallel. Designing effective interaction requires expertise from different domains, possibly contributed by people with different backgrounds. In our work we explore how composite events (hierarchical compositions of events) can be used in a way that eases communication within a multi-disciplinary team. In this paper, we show how composite events can create different layers of abstraction that ease the prototyping and discussion of human-robot collaboration with stakeholders, through a supporting tool called Hasselt UIMS. At the lower level(s) of abstraction, the composite events can be mapped to the message-based communication implemented in the Robot Operating System (ROS), which is used to program collaborative robots such as the Baxter robot from Rethink Robotics.
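The notion of a composite event built hierarchically from lower-level events can be illustrated with a minimal sketch. This is not the Hasselt UIMS notation; the `Sequence` class and the hand-over event names are invented for illustration:

```python
class Sequence:
    """Hypothetical composite event: fires once its child event names have
    been observed in order; unrelated intermediate events are ignored."""
    def __init__(self, *names):
        self.names, self.pos = names, 0

    def observe(self, name):
        if self.pos < len(self.names) and name == self.names[self.pos]:
            self.pos += 1
        return self.pos == len(self.names)   # True once the sequence completes

# A hypothetical hand-over gesture composed from lower-level events that,
# at the bottom layer, would arrive as ROS messages.
handover = Sequence("part_detected", "gripper_open", "human_grasp")
fired = [handover.observe(e)
         for e in ["part_detected", "noise", "gripper_open", "human_grasp"]]
print(fired)  # [False, False, False, True]
```

Stakeholders can discuss the design in terms of "hand-over" while developers map its child events onto message topics, which is the layering the paper advocates.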


Collaboration in Creative Design | 2016

Storyboards as a Lingua Franca in Multidisciplinary Design Teams

Mieke Haesen; Davy Vanacken; Kris Luyten; Karin Coninx

Design, and in particular user-centered design processes for interactive systems, typically involves multidisciplinary teams. The different and complementary perspectives of the team members enrich the design ideas and decisions, and the involvement of all team members is needed to achieve a user interface that carefully considers all aspects, ranging from user needs to technical requirements. The difficulty lies in getting all team members involved in the early stages of design, and in communicating design ideas and decisions in a way that every team member can understand and use appropriately in later stages of the process. This chapter describes the COMuICSer storyboarding technique, which presents the scenario of use of a future system in a way that is understandable to each team member, regardless of their background. Based on an observational study in which multidisciplinary teams collaboratively created storyboards during a co-located session, we present recommendations for facilitating co-located collaborative storyboarding sessions for multidisciplinary teams, and for digital tool support for this type of group work.


Proceedings of Software Techniques for Embedded and Pervasive Systems | 2005

A component-based infrastructure for pervasive user interaction

Peter Rigole; Chris Vandervelpen; Kris Luyten; Yves Vandewoude; Karin Coninx; Yolande Berbers
