Tim Clerckx
University of Hasselt
Publications
Featured research published by Tim Clerckx.
Ambient Intelligence | 2004
Davy Preuveneers; Jan Van den Bergh; Dennis Wagelaar; Andy Georges; Peter Rigole; Tim Clerckx; Yolande Berbers; Karin Coninx; Viviane Jonckers; Koen De Bosschere
To realise an Ambient Intelligence environment, it is paramount that applications have access to information about the context in which they operate, preferably in a very general manner. For this purpose, various types of information should be assembled into a representation of the context of the device on which these applications run. To allow interoperability in an Ambient Intelligence environment, the context terminology must be commonly understood by all participating devices. In this paper we propose an adaptable and extensible context ontology for creating context-aware computing infrastructures, ranging from small embedded devices to high-end service platforms. The ontology has been designed to address several key challenges in Ambient Intelligence, such as application adaptation, automatic code generation, code mobility, and the generation of device-specific user interfaces.
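The core idea of an adaptable and extensible ontology can be illustrated with a minimal sketch: a shared hierarchy of context concepts that any device can extend while still agreeing on the common terms. The concept names and property representation below are illustrative assumptions; the paper's actual ontology is far richer and expressed in a dedicated ontology language.

```python
# Minimal sketch of an extensible context vocabulary (illustrative only).
class Concept:
    """A named context concept with a parent (for subsumption) and properties."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent
        self.properties = {}

    def is_a(self, other):
        """True if this concept is `other` or a descendant of it."""
        concept = self
        while concept is not None:
            if concept is other:
                return True
            concept = concept.parent
        return False

# Shared core concepts, commonly understood by all participating devices.
entity = Concept("Entity")
device = Concept("Device", parent=entity)

# A platform-specific extension: an embedded device refines the shared term,
# so interoperability with peers that only know "Device" is preserved.
embedded_device = Concept("EmbeddedDevice", parent=device)
embedded_device.properties["memory_kb"] = 256
```

Because reasoning only relies on the shared upper concepts, a high-end service platform can interpret context reported by a small embedded device without knowing its specific extensions.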
Lecture Notes in Computer Science | 2003
Kris Luyten; Tim Clerckx; Karin Coninx; Jean Vanderdonckt
Over the last few years, Model-Based User Interface Design has become an important tool for creating multi-device User Interfaces. By providing information about several aspects of the User Interface, such as the task for which it is being built, different User Interfaces can be generated that fulfil the same needs while having a different concrete appearance. In a Model-Based Design approach, several models can be used: a task model, a dialog model, a user model, a data model, etc. Intuitively, using more models provides more (detailed) information and will create more appropriate User Interfaces. Nevertheless, the designer must take care to keep the different models consistent with each other. This paper presents an algorithm to extract the dialog model (partially) from the task model. A task model and a dialog model are closely related because the dialog model defines a sequence of user interactions, an activity chain, to reach the goal postulated in the task specification. We formalise the activity chain as a State Transition Network, and this chain can be partially extracted from the task specification. The designer benefits from this approach since the task and dialog models remain consistent. This approach is useful in automatic User Interface generation where several different dialogs are involved: the transitions between dialogs can be handled smoothly without explicitly implementing them.
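The relation between the two models can be sketched as follows: each set of tasks that is enabled at the same time becomes a dialog state, and completing those tasks triggers a transition to the next state. This is a deliberately simplified illustration, assuming a purely sequential task specification; the paper's algorithm handles the full range of temporal operators in the task model.

```python
# Toy derivation of a dialog State Transition Network from a sequential
# task specification (a simplifying assumption, not the full algorithm).
def dialog_stn(task_sequence):
    """Each enabled task set becomes a dialog state; finishing the tasks of
    one state transitions the dialog to the next state."""
    states = [frozenset(task_set) for task_set in task_sequence]
    transitions = {}
    for i in range(len(states) - 1):
        transitions[states[i]] = states[i + 1]
    return states, transitions

# Hypothetical task specification: three enabled task sets in sequence.
states, transitions = dialog_stn([
    ["enter name"],
    ["choose product", "set amount"],
    ["confirm"],
])
```

Because the states and transitions are derived rather than hand-authored, a change in the task specification automatically yields a consistent dialog model.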
EHCI-DSVIS'04 Proceedings of the 2004 international conference on Engineering Human Computer Interaction and Interactive Systems | 2004
Tim Clerckx; Kris Luyten; Karin Coninx
In recent years, considerable research effort has been devoted to user interfaces for pervasive computing. This paper presents DynaMo-AID, a design process and runtime architecture providing design and runtime support for context-aware user interfaces. In the process, attention is focused on the specification of the tasks the user and the application will have to perform, together with other entities related to tasks, such as dialog and presentation. We show how tasks, dialogs, and presentation can be modelled when the designer wants to develop context-sensitive user interfaces. Besides the design process, a runtime architecture supporting context-sensitive user interfaces is presented. Pervasive user interfaces can change during the runtime of an interactive application due to a change of context or when a service becomes available to the application. We show that traditional models such as the task, environment, and dialog models have to be extended to tackle these new problems. For this reason, we provide modelling and runtime support for the design and development of context-sensitive user interfaces.
Task Models and Diagrams for User Interface Design | 2004
Tim Clerckx; Kris Luyten; Karin Coninx
Model-Based User Interface Development uses a multitude of models which are related in one way or another. Usually there is some kind of process that starts with the design of the abstract models and progresses gradually towards the more concrete models, resulting in the final user interface when the design process is complete. Progressing from one model to another involves transforming the model and mapping pieces of information contained in the source model onto the target model. Most existing development environments propose solutions that apply these steps (semi-)automatically in one direction only (from abstract to concrete models). Manual intervention that changes the target model (e.g. the dialog model) to the designer's preferences is not reflected in the source model (e.g. the task model), so this step can introduce inconsistencies between the different models. In this paper, we identify some rules that can be manually applied to the model after a transformation has taken place. The effects on the target and source models are shown, together with how the different models involved in the transformation can be updated accordingly to ensure consistency between models.
CADUI | 2005
Tim Clerckx; Kris Luyten; Karin Coninx
This paper shows a technique that allows adaptive user interfaces, spanning multiple devices, to be rendered from the task specification at runtime, taking into account the context of use. The designer can specify a task model using the ConcurTaskTrees notation and its context-dependent parts, and deploy the user interface immediately from the specification. By defining a set of context rules in the design stage, the appropriate context-dependent parts of the task specification are selected before the concrete interfaces are rendered. The context is resolved by the runtime environment and does not require any manual intervention. This way, the same task specification can be deployed for several different contexts of use. Traditionally, a context-sensitive task specification only took into account a single, variable deployment device. This paper extends that approach by supporting task specifications that can be executed by multiple co-operating devices.
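The runtime selection step described above can be sketched as a rule table evaluated against the sensed context: each rule maps a predicate over the context to one context-dependent variant of the task specification. The rule format, the context attributes, and the variant names below are all illustrative assumptions, not the paper's concrete rule language.

```python
# Hedged sketch: context rules choosing a context-dependent task variant
# at runtime, with no manual intervention by the user.
def resolve(task_variants, context, rules):
    """Return the variant whose predicate matches the sensed context,
    falling back to the default variant when no rule fires."""
    for predicate, variant_name in rules:
        if predicate(context):
            return task_variants[variant_name]
    return task_variants["default"]

# Hypothetical variants of one task specification and one context rule.
variants = {"default": "full UI", "small": "compact UI"}
rules = [(lambda ctx: ctx["screen_width"] < 320, "small")]

on_phone = resolve(variants, {"screen_width": 240}, rules)
on_desktop = resolve(variants, {"screen_width": 1024}, rules)
```

The same task specification is thus deployed unchanged; only the resolved context decides which of its parts reach the renderer.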
Engineering Interactive Systems | 2008
Jo Vermeulen; Yves Vandriessche; Tim Clerckx; Kris Luyten; Karin Coninx
Semantic service descriptions have paved the way for flexible interaction with services in a mobile computing environment. Services can be automatically discovered, invoked and even composed. In contrast, the user interfaces for interacting with these services are often still designed by hand. This approach poses a serious threat to the overall flexibility of the system. To make the user interface design process scale, it should be automated as much as possible. We propose to augment service descriptions with high-level user interface models to support automatic user interface adaptation. Our method builds upon OWL-S, an ontology for Semantic Web Services, by connecting a collection of OWL-S services to a hierarchical task structure and selected presentation information. This allows end-users to interact with services on a variety of platforms.
Task Models and Diagrams for User Interface Design | 2005
Tim Clerckx; Frederik Winters; Karin Coninx
Since mobile devices are expected to become more and more influenced by various kinds of context information in the near future, context needs to be taken into consideration when user interfaces are developed for these systems. When user interfaces are developed using a model-based approach, developers need to design several models which together describe the entire user interface. These models tend to be very complex: models for applications requiring a lot of interaction can be huge and mutually interconnected. This is particularly the case when external context information can act on the user interface. In this paper we describe a development process for context-aware user interfaces. We focus on the design part of the development process, with attention to tool support for constructing, editing, and viewing the models.
Intelligent User Interfaces | 2006
Tim Clerckx; Chris Vandervelpen; Kris Luyten; Karin Coninx
This paper presents a modular runtime architecture supporting our model-based user interface design approach for designing context-aware, distributable user interfaces for ambient intelligent environments.
Engineering Interactive Systems | 2008
Tim Clerckx; Chris Vandervelpen; Karin Coninx
This paper describes an approach that uses task modelling for the development of distributed and multimodal user interfaces. We propose to enrich tasks with possible interaction modalities in order to allow the user to perform these tasks using an appropriate modality. The information in the augmented task model can then be used in a generic runtime architecture, which we have extended to support runtime decisions for distributing the user interface among several devices based on the specified interaction modalities. The approach was tested in the implementation of several case studies, one of which is presented in this paper to clarify the approach.
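The distribution decision described above can be sketched as a matching step: each task carries the modalities through which it may be performed, and the runtime assigns it to a device whose capabilities overlap with them. The task names, modality labels, and first-match policy below are illustrative assumptions, not the paper's actual decision procedure.

```python
# Illustrative sketch: distributing modality-annotated tasks over devices.
def distribute(tasks, devices):
    """Assign each task to the first device that supports at least one of
    the task's allowed interaction modalities."""
    assignment = {}
    for task, modalities in tasks.items():
        for device, supported in devices.items():
            if set(modalities) & set(supported):
                assignment[task] = device
                break
    return assignment

# Hypothetical task model annotations and device capabilities.
tasks = {"select song": ["touch", "voice"], "dictate note": ["voice"]}
devices = {"pda": ["touch"], "headset": ["voice"]}
assignment = distribute(tasks, devices)
```

A task whose modalities no available device supports would simply stay unassigned here; a real runtime would need a fallback strategy for that case.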
Pervasive and Mobile Computing | 2007
Peter Rigole; Tim Clerckx; Yolande Berbers; Karin Coninx
This article presents a strategy for deploying component-based applications gradually in order to match the functionality of pervasive computing applications to the current needs of the user. We establish this deployment strategy by linking component composition models with task models at design-time, from which a run-time deployment plan is deduced. Enhanced with a Markov model, this deployment plan is able to drive a component life cycle manager to anticipate future deployments. The result is a seamless integration of pervasive computing applications with the user's tasks, guaranteeing the availability of the required functionality without wasting computing resources on components that are not currently needed.
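The anticipation step can be illustrated with a minimal sketch, assuming a first-order Markov model over tasks: given the task the user is performing now, the life cycle manager preloads the components required by the most probable next task. The task names, probabilities, and component mapping are hypothetical.

```python
# Minimal sketch: a first-order Markov model drives anticipatory deployment.
def most_likely_next(transition_probs, current_task):
    """Return the successor task with the highest transition probability."""
    successors = transition_probs[current_task]
    return max(successors, key=successors.get)

def components_to_preload(transition_probs, task_components, current_task):
    """Components the life cycle manager should deploy ahead of time."""
    return task_components[most_likely_next(transition_probs, current_task)]

# Hypothetical transition probabilities and task-to-component mapping.
probs = {"browse catalogue": {"edit order": 0.7, "search": 0.3}}
components = {"edit order": ["OrderForm", "Validator"], "search": ["Indexer"]}

preload = components_to_preload(probs, components, "browse catalogue")
```

Only the components of the likeliest successor are deployed early, which is how the plan keeps functionality available without wasting resources on components that are not currently needed.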