Kai Breiner
Kaiserslautern University of Technology
Publications
Featured research published by Kai Breiner.
Human Factors in Computing Systems | 2010
Jan Van den Bergh; Gerrit Meixner; Kai Breiner; Andreas Pleuss; Stefan Sauer; Heinrich Hussmann
The Workshop on Model-Driven Development of Advanced User Interfaces will be a forum for multi-disciplinary discussion on how to integrate model-driven development with the often more informal methodologies used in user-centered design. The starting point of the discussion will be the tools, models, methods, and experiences of the workshop participants.
International Conference on Human-Computer Interaction | 2009
Kai Breiner; Daniel Görlich; Oliver Maschino; Gerrit Meixner; Detlef Zühlke
The SmartFactory KL is an arbitrarily modifiable and expandable (flexible) intelligent production environment that connects components from multiple manufacturers (networked), enables its components to perform context-related tasks autonomously (self-organizing), and emphasizes user-friendliness (user-oriented). This paper presents the results of a research project focusing on the run-time generation and adaptation of a universal, task-oriented user interface for such intelligent production environments. It employs a Room-based Use Model (RUM) developed within an ongoing series of research projects on universal remote control devices for intelligent production environments. The SmartFactory KL is the first ambient intelligent production environment for demonstration and development purposes worldwide. After three years of research, a first prototype has been completed that allows the production line to be controlled through a single remote user interface, which adapts to varying remote devices according to the actual context of use using a complex, model-based approach.
Proceedings of the 1st International Workshop on Pattern-Driven Engineering of Interactive Computing Systems | 2010
Kai Breiner; Marc Seissler; Gerrit Meixner; Peter Forbrig; Ahmed Seffah; Kerstin Klöckner
Despite intense research activities in recent years, HCI patterns still lack a standardized description and organization. This makes it difficult for developers to identify the relevant patterns for solving a problem and to apply them appropriately in the problem context. To fully benefit from HCI patterns in the engineering of interactive computing systems, they have to be prepared for integration into a model-based user interface development process. This workshop on Pattern-Driven Engineering of Interactive Computing Systems (PEICS) focuses on bringing together various research approaches in order to jointly advance the state of the art. The contributions address semantics, formalization, languages, tool support, and research directions.
International Conference on Ergonomics and Health Aspects of Work with Computers | 2011
Kai Breiner; Tobias Wüchner; Malte Brunnlieb
Today's software systems, and especially their graphical user interfaces, are mostly designed to fit the needs of an ideal target audience, often focusing purely on young, physically and mentally healthy persons. Not only the development of user interfaces tailored, for example, to elderly people, but also their testing is a challenging task, because a large set of test persons with specific impairments needs to be recruited, which in practice is often infeasible and leads to statistically insignificant results. Nevertheless, software systems and their graphical user interfaces have to be designed to cope with the special needs of impaired persons as well. In this paper, we introduce a method that supports the targeted design and evaluation of such graphical user interfaces by simulating specific disabilities and typical impairments. To this end, we emulate the influence of such impairments on the performance of using a graphical user interface by applying specific filter algorithms to the target interface. This enables evaluation of the GUI under realistic conditions without having to involve actually impaired participants.
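The abstract describes filter algorithms applied to the target interface but gives no code. The following is a minimal, hedged sketch of one such filter, assuming that reduced visual acuity is approximated by blurring a screenshot of the UI; the library choice (Pillow), the blur radius, and the file names are assumptions made for illustration, not details from the paper.

```python
# Illustrative sketch only: approximate one impairment (reduced visual
# acuity) by blurring a screenshot of the target user interface.
# The filter choice, blur radius, and file names are assumptions.
from PIL import Image, ImageFilter

def simulate_low_vision(screenshot_path: str, output_path: str, radius: float = 4.0) -> None:
    """Apply a Gaussian blur to a UI screenshot to mimic reduced visual acuity."""
    ui = Image.open(screenshot_path)
    degraded = ui.filter(ImageFilter.GaussianBlur(radius=radius))
    degraded.save(output_path)

if __name__ == "__main__":
    # Inspect or user-test the degraded rendering instead of recruiting
    # participants with the corresponding impairment.
    simulate_low_vision("login_screen.png", "login_screen_low_vision.png")
```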
Analysis, Design, and Evaluation of Human-Machine Systems | 2010
Marc Seissler; Gerrit Meixner; Kai Breiner
Ubiquitous information access within intelligent environments, such as the SmartFactoryKL, will become more and more important in everyday life. Model-based User Interface Development (MBUID) processes can help to handle the complexity emerging from the numerous interaction devices and usage scenarios, facilitating run-time adaptation of the user interface to the users' needs and preferences. To improve the usability of the user interfaces, the designers' knowledge has to be formalized and integrated into the generation process. HCI patterns are a promising approach for representing solutions to recurring design problems in an abstract, machine-processable format that can be used to improve the model-based development of user interfaces, but they still lack formalization, organization, and tool support. The presented approach demonstrates the feasibility of integrating HCI patterns (layout as well as task patterns) into an established automatic user interface generation process, thereby supporting the separation between content and visualization (separation of concerns) within MBUID.
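The abstract speaks of representing pattern solutions in an abstract, machine-processable format. As a purely illustrative sketch of what such a record might look like (the field names, the example pattern, and its values are assumptions, not the notation used in the paper):

```python
# Hypothetical machine-processable HCI pattern record; field names and the
# example pattern are invented for illustration, not the paper's notation.
from dataclasses import dataclass, field

@dataclass
class HciPattern:
    name: str                 # pattern identifier, e.g. "Wizard"
    problem: str              # recurring design problem the pattern addresses
    target_model: str         # model it is woven into: "task", "dialog", or "presentation"
    solution_fragment: dict = field(default_factory=dict)  # abstract solution part

wizard = HciPattern(
    name="Wizard",
    problem="Guide the user through a long, rarely performed task",
    target_model="dialog",
    solution_fragment={"steps": ["collect input", "confirm", "execute"]},
)
print(f"{wizard.name} pattern targets the {wizard.target_model} model")
```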
Ambient Intelligence | 2007
Sebastian Adam; Kai Breiner; Kizito Ssamula Mukasa; Marcus Trapp
Context-awareness, personalization, and adaptation are among the salient features of Ambient Intelligence (AmI) systems. The user interfaces (UIs) in AmI environments should therefore also be dynamic at runtime. Developing such UIs is challenging, since many aspects have to be considered. Most existing approaches follow Model-Driven Engineering (MDE) as a solution, but they only address design-time issues. We successfully applied MDE for the runtime generation of UIs in the BelAmI project, but encountered challenges, which we present in this paper together with possible solutions.
Engineering Interactive Computing Systems | 2010
Marc Seissler; Kai Breiner; Gerrit Meixner; Peter Forbrig; Ahmed Seffah; Kerstin Kloeckner
For over a decade, patterns have been gaining interest in the domain of Human-Computer Interaction (HCI) engineering. It is generally agreed that patterns can facilitate the exchange of best practices and knowledge between the interdisciplinary team members involved in the design of interactive systems. Despite intense research activities in recent years, HCI patterns still lack a standardized description and organization. This makes it difficult for developers to identify the relevant patterns for solving a problem and to apply them appropriately in the problem context. To fully benefit from HCI patterns in the engineering of interactive computing systems, they have to be prepared for integration into a model-based user interface development process. Instead of merely guiding and advising UI developers on which solution to apply, HCI patterns should enable the easy reuse of already designed model or code fragments. To enable the integration of HCI patterns into the model-based development process, the informal textual or graphical notation of HCI patterns has to be overcome: HCI patterns have to support a formal description of their solution part, which allows the direct integration of these solution parts into the different models, such as the task, dialog, and presentation models.
Requirements Engineering: Foundation for Software Quality | 2015
Kai Breiner; Michael Gillmann; Axel Kalenborn; Christian Müller
[Context and motivation] Before a software project officially starts, there is a stage that has not received much consideration in the literature: the precontract, or bidding, stage. [Question/problem] In this phase, basic Requirements Engineering (RE) activities are conducted without yet having a budget. In this paper, the SmartOffer project is described, which aims at improving RE during this precontract phase. [Principal idea/results] To this end, the bidding processes of several organizations were analyzed and commonalities and differences were identified. The consolidated process is described in this paper. It consists of four abstract phases: assessment of demand, conception, proposal, and actual project execution. Mandatory and optional process steps within these phases allow the process to be tailored to different companies and products. [Contribution] The consolidated bidding process provides the potential for automation and tool support. As a consequence, the precontract phase will become more efficient and effective. Building a tool that supports this process and evaluating it will be addressed in future work to complement this research preview.
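To make the idea of mandatory and optional steps within the four phases concrete, here is a small hedged sketch; the step names and the tailoring function are invented for illustration and do not come from the SmartOffer project.

```python
# Illustrative sketch of the consolidated bidding process as plain data:
# four abstract phases with mandatory and optional steps that can be
# tailored per company. All step names are invented for this example.
PROCESS = {
    "assessment of demand": {"mandatory": ["record customer request"],
                             "optional": ["on-site visit"]},
    "conception":           {"mandatory": ["elicit key requirements"],
                             "optional": ["build UI mock-up"]},
    "proposal":             {"mandatory": ["estimate effort", "write offer"],
                             "optional": ["legal review"]},
    "project execution":    {"mandatory": ["kick-off meeting"],
                             "optional": ["retrospective"]},
}

def tailor(process, skip_optional):
    """Drop optional steps a company does not need; mandatory steps always stay."""
    return {
        phase: {"mandatory": steps["mandatory"],
                "optional": [s for s in steps["optional"] if s not in skip_optional]}
        for phase, steps in process.items()
    }

if __name__ == "__main__":
    print(tailor(PROCESS, skip_optional={"legal review", "retrospective"}))
```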
International Conference on Human-Computer Interaction | 2011
Kai Breiner; Kai Bizik; Thilo Rauch; Marc Seissler; Gerrit Meixner; Philipp Diebold
Model-based universal interaction devices are already capable of reacting to contextual changes by automatically adapting the user interface, but without considering the usefulness of the resulting user interface. Often, tasks can no longer be executed, or execution orders result in deadlocks caused by unavailable functionality. We present our approach to investigating this property of adapted models, using the example of the SmartMote in our living lab, the SmartFactoryKL. Given the task description of the user interaction, we derive a dialog model in the form of a state machine, which is required in our user interface generation process, and determine the possible execution orders leading to the accept state of this state machine. Using these execution orders, the initial task model can be adapted, all misleading tasks can be removed, and the resulting user interface offers only valid user interactions.
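A minimal hedged sketch of the underlying idea: keep only transitions that lie on some path to the accept state of the dialog state machine and drop the rest. The transition-list representation, the state names, and the pruning function are assumptions made for this example, not the SmartMote implementation.

```python
# Illustrative sketch only: prune dialog-model transitions that cannot lead
# to the accept state. The transition-list representation and state names
# are assumed for this example; they are not from the SmartMote tooling.
from collections import defaultdict

def states_reaching(transitions, accept_states):
    """Backward reachability: states from which some accept state is reachable."""
    preds = defaultdict(set)
    for src, _task, dst in transitions:
        preds[dst].add(src)
    reachable = set(accept_states)
    frontier = list(accept_states)
    while frontier:
        state = frontier.pop()
        for pred in preds[state]:
            if pred not in reachable:
                reachable.add(pred)
                frontier.append(pred)
    return reachable

def prune(transitions, accept_states):
    """Keep only transitions whose target can still reach an accept state."""
    ok = states_reaching(transitions, accept_states)
    return [t for t in transitions if t[2] in ok]

if __name__ == "__main__":
    # After adaptation the "calibrate" function is unavailable, so its branch
    # becomes a dead end and is removed from the dialog model.
    transitions = [("idle", "start", "running"),
                   ("running", "stop", "idle"),
                   ("running", "calibrate", "blocked"),
                   ("running", "finish", "done")]
    print(prune(transitions, accept_states={"done"}))
```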
International Conference on Human-Computer Interaction | 2011
Christian Wiehr; Nathalie Aquino; Kai Breiner; Marc Seissler; Gerrit Meixner
This paper presents an approach that adds flexibility to the variety of user interfaces that can be generated by model-based user interface development processes. The approach is used at design time. Its ideas have been extended for use at runtime and applied to the SmartMote, a universal interaction device for industrial environments.