Florian Nothdurft
University of Ulm
Publication
Featured research published by Florian Nothdurft.
intelligent environments | 2014
Frank Honold; Pascal Bercher; Felix Richter; Florian Nothdurft; Thomas Geier; Roland Barth; Thilo Hörnle; Felix Schüssel; Stephan Reuter; Matthias Rau; Gregor Bertrand; Bastian Seegebarth; Peter Kurzok; Bernd Schattenberg; Wolfgang Minker; Michael Weber; Susanne Biundo
The properties of multimodality, individuality, adaptability, availability, cooperativeness, and trustworthiness are the focus of research on Companion Systems. In this article, we describe the key components of such a system and the way they interact with each other. Along with the article comes a video, in which we demonstrate a fully functional prototypical implementation and explain the involved scientific contributions in a simplified manner. The realized technology considers the entire situation of the user and the environment in current and past states. The gained knowledge reflects the context of use and serves as the basis for decision-making in the presented adaptive system.
intelligent environments | 2013
Frank Honold; Felix Schüssel; Michael Weber; Florian Nothdurft; Gregor Bertrand; Wolfgang Minker
This article presents a context-adaptive approach to multimodal interaction for use in cognitive technical systems, so-called companion systems. We present a system architecture and use a layered context model to clarify on which levels context awareness occurs. The focus is on dialog management, multimodal fusion, and multimodal fission as the main participants in interaction. An implemented prototype is presented, yielding concrete instances of the described context models and of the adaptation to them.
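The abstract describes the layered context model only at a high level; the following minimal Python sketch illustrates, under stated assumptions, what such a model feeding a fission decision might look like. All class names, fields, and the selection rule are hypothetical and do not reproduce the authors' implementation.

```python
from dataclasses import dataclass, field

# Hypothetical three-layer context model; the layering and field names are
# assumptions for illustration only.
@dataclass
class DeviceContext:          # lowest layer: available I/O devices
    available_modalities: set = field(default_factory=lambda: {"speech", "touch"})

@dataclass
class EnvironmentContext:     # middle layer: physical situation
    noise_level_db: float = 40.0
    user_distance_m: float = 1.0

@dataclass
class UserContext:            # highest layer: user state and preferences
    prefers_speech: bool = True
    is_busy_visually: bool = False

def select_output_modality(dev: DeviceContext, env: EnvironmentContext, usr: UserContext) -> str:
    """Toy fission rule: pick an output modality from the layered context."""
    if "speech" in dev.available_modalities and usr.prefers_speech and env.noise_level_db < 60:
        return "speech"
    if "touch" in dev.available_modalities:
        return "graphical"
    return "text"

if __name__ == "__main__":
    # In a noisy room the toy rule falls back to graphical output.
    print(select_output_modality(DeviceContext(), EnvironmentContext(noise_level_db=72.0), UserContext()))
```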
Neurocomputing | 2015
Michael Glodek; Frank Honold; Thomas Geier; Gerald Krell; Florian Nothdurft; Stephan Reuter; Felix Schüssel; Thilo Hörnle; Klaus Dietmayer; Wolfgang Minker; Susanne Biundo; Michael Weber; Günther Palm; Friedhelm Schwenker
Recent trends in human-computer interaction (HCI) show a development towards cognitive technical systems (CTS) that provide natural and efficient operating principles. To do so, a CTS has to rely on data from multiple sensors, which must be processed and combined by fusion algorithms. Furthermore, additional sources of knowledge have to be integrated to put the observations into the correct context. Research in this field often focuses on optimizing the performance of the individual algorithms rather than reflecting the requirements of CTS. This paper presents the information fusion principles in the CTS architectures we developed for Companion Technologies. The combination of information generally goes along with increasing abstractness, time granularity, and robustness, such that large CTS architectures must perform fusion gradually on different levels, from sensor-based recognition to highly abstract logical inference. In our CTS application we divide information fusion approaches into three categories: perception-level fusion, knowledge-based fusion, and application-level fusion. For each category, we introduce examples of characteristic algorithms. In addition, we provide a detailed account of the implementation carried out in order to study the interplay of the developed algorithms.
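As a rough illustration of gradual fusion across the three categories named in the abstract, the sketch below chains a perception-level combination, a knowledge-based blend with contextual priors, and an application-level decision. The function names, the averaging and weighting rules, and the thresholds are assumptions, not the algorithms developed in the paper.

```python
from statistics import mean

def perception_level_fusion(sensor_scores: list[float]) -> float:
    """Combine low-level recognizer confidences, e.g. audio, video, physiology."""
    return mean(sensor_scores)

def knowledge_based_fusion(perceived: float, prior_from_context: float, weight: float = 0.7) -> float:
    """Blend the perceived estimate with prior knowledge about the situation."""
    return weight * perceived + (1 - weight) * prior_from_context

def application_level_fusion(fused_estimate: float, threshold: float = 0.5) -> str:
    """Map the fused estimate onto an application-level decision."""
    return "offer_assistance" if fused_estimate >= threshold else "stay_passive"

if __name__ == "__main__":
    audio_visual = perception_level_fusion([0.62, 0.48, 0.71])
    contextualised = knowledge_based_fusion(audio_visual, prior_from_context=0.3)
    print(application_level_fusion(contextualised))
```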
annual meeting of the special interest group on discourse and dialogue | 2014
Florian Nothdurft; Felix Richter; Wolfgang Minker
Human-computer trust has been shown to be a critical factor influencing the complexity and frequency of interaction with technical systems. In particular, incomprehensible situations in human-computer interaction may reduce the user's trust in the system and thereby influence the style of interaction. Analogous to human-human interaction, explaining these situations can help to remedy negative effects. In this paper we present our approach of augmenting task-oriented dialogs with selected explanation dialogs to foster the human-computer trust relationship in such situations. We have conducted a web-based study testing the effects of different goals of explanations on the components of human-computer trust. Subsequently, we show how these results can be used in our probabilistic trust handling architecture to augment pre-defined task-oriented dialogs.
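The general idea of trust-based dialog augmentation can be sketched as follows: maintain an estimate of the user's trust, lower it after incomprehensible situations, and prepend an explanation dialog when the estimate falls below a threshold. The update rule, numbers, and names below are illustrative assumptions and do not reproduce the probabilistic model from the paper.

```python
def update_trust(belief: float, event_was_comprehensible: bool) -> float:
    """Simple exponential update of a trust belief in [0, 1]."""
    target = 1.0 if event_was_comprehensible else 0.0
    return 0.8 * belief + 0.2 * target

def maybe_augment(dialog: list[str], belief: float, threshold: float = 0.6) -> list[str]:
    """Prepend an explanation dialog if trust is estimated to be low."""
    return (["explanation_dialog"] + dialog) if belief < threshold else dialog

belief = 0.9
for comprehensible in [True, False, False]:   # two incomprehensible situations in a row
    belief = update_trust(belief, comprehensible)
print(maybe_augment(["select_device", "start_task"], belief))
```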
intelligent environments | 2010
Florian Nothdurft; Gregor Bertrand; Tobias Heinroth; Wolfgang Minker
In this paper, we describe the development of a dialogue model that integrates emotional dialogue strategies and explanations in a simple yet powerful way. As intelligent environments make inroads into the market, the need for user-friendly interaction with these systems grows. Proactive reaction to user knowledge and emotions is one of the key points in the user-friendly adaptation of dialogue systems and therefore one of the main topics of research. As intelligent environments grow in complexity and field of application, the knowledge requirements for the user grow as well. Therefore, it is vitally important to impart knowledge and information in an emotionally sensitive and user-aware way. In our dialogue model, we treat the natural structure of a non-trivial dialogue as one divided into several goals. These goals are protected by so-called guards, which represent preconditions that have to be fulfilled before the related goal can be tackled.
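The goal-and-guard structure described in the abstract can be illustrated with a minimal sketch: each goal carries a list of guard predicates over the dialogue state, and only goals whose guards all hold are currently available. The class names, the example guard, and the state keys are assumptions, not the authors' actual model.

```python
from dataclasses import dataclass, field
from typing import Dict

@dataclass
class Goal:
    name: str
    # A guard is a predicate over the dialogue state; the goal may only be
    # tackled once every guard evaluates to True.
    guards: list = field(default_factory=list)

    def is_enabled(self, state: Dict[str, bool]) -> bool:
        return all(guard(state) for guard in self.guards)

user_knows_device = lambda state: state.get("explained_device", False)

goals = [Goal("explain_device"),                        # no guard: always available
         Goal("configure_device", guards=[user_knows_device])]

state = {"explained_device": False}
print([g.name for g in goals if g.is_enabled(state)])   # only 'explain_device' is enabled
```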
computer software and applications conference | 2011
Gregor Bertrand; Florian Nothdurft; Frank Honold; Felix Schüssel
In the research area of spoken language dialogue systems there are many ways of modeling dialogues. A dialogue model's particular structure depends on the algorithm used to interpret it. In most cases, a dialogue model is quite difficult to understand and to create. We present a novel technique for modeling dialogues, based on ready-to-use open-source tools, in an easy and understandable way. Using our approach, a dialogue designer unfamiliar with the internals of the dialogue manager can easily develop even complex and adaptive dialogues. The dialogues are then ready to be interpreted by the dialogue management and thus integrate seamlessly into the spoken language dialogue system.
annual meeting of the special interest group on discourse and dialogue | 2015
Florian Nothdurft; Gregor Behnke; Pascal Bercher; Susanne Biundo; Wolfgang Minker
Technical systems evolve from simple dedicated task solvers to cooperative and competent assistants, helping the user with increasingly complex and demanding tasks. For this, they may proactively take over some of the user's responsibilities and help to find or reach a solution to the user's task at hand, using, e.g., Artificial Intelligence (AI) planning techniques. However, this intertwining of user-centered dialog and AI planning systems, often called mixed-initiative planning (MIP), not only facilitates more intelligent and competent systems but also raises new questions related to the alignment of AI and human problem solving. In this paper, we describe our approach to integrating AI planning techniques into a dialog system, explain the reasons for and effects of the problems that arise, and at the same time provide our solutions, resulting in a coherent, user-friendly, and efficient mixed-initiative system. Finally, we evaluate our MIP system and provide remarks on the use of explanations for MIP-related phenomena.
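A toy sketch of the mixed-initiative interplay between dialog and planner may help: the planner proposes a plan, the user fixes a decision through the dialog, the planner is re-invoked, and a change in the plan is explained. The domain, the stand-in planner, and the explanation text are illustrative assumptions, not the MIP system described in the paper.

```python
def plan(goal: str, fixed_choices: dict) -> list[str]:
    """Stand-in for an AI planner: returns a step sequence respecting user choices."""
    transport = fixed_choices.get("transport", "train")
    return [f"book_{transport}", "reserve_hotel", "create_itinerary"]

def mixed_initiative(goal: str) -> list[str]:
    choices = {}
    # The system takes the initiative and proposes a plan ...
    proposal = plan(goal, choices)
    # ... the user rejects a step, so the dialog elicits an alternative
    # and the planner is re-invoked with the user's decision fixed.
    choices["transport"] = "car"
    revised = plan(goal, choices)
    if revised != proposal:
        print("explanation: the plan changed because you preferred to travel by car")
    return revised

if __name__ == "__main__":
    print(mixed_initiative("organize_trip"))
```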
computer software and applications conference | 2012
Florian Nothdurft; Gregor Bertrand; Helmut Lang; Wolfgang Minker
One of the most important challenges in the field of human-computer interaction is maintaining and enhancing the willingness of a user to interact with a technical system. This cooperativeness provides a solid basis for a real dialogue between user and technical system; without it, the goals and tasks of an intelligent technical system seem unrealistic. In particular, intelligent technical systems that continually assist the user in everyday life degenerate, without this willingness for dialogue, into helpers for quite simple tasks and cannot fulfil their original purpose as intelligent assistants for complex tasks. Trust has been shown to be an important factor influencing the frequency and kind of usage. If the user does not understand system actions or instructions, the user's trust in the system will decrease, which can lead to reduced usage or, in the worst case, to a total cessation of usage. Therefore, the intelligibility of a technical system should be upheld. This paper is concerned with how the intelligibility of an intelligent technical system can be upheld by providing explanations to the user. Providing explanations may prevent or at least decrease the loss of trust. However, trust is a complex construct consisting of different bases. We show why and how these bases of trust should be addressed by giving individual kinds of explanations.
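The idea of addressing each base of trust with its own kind of explanation could be sketched as a simple mapping, as below. The base names follow common human-computer trust terminology, and the mapping itself is an assumption for illustration, not the exact assignment argued for in the paper.

```python
# Hypothetical mapping from weakened bases of trust to kinds of explanations.
explanation_for_base = {
    "understandability": "transparency explanation (why did the system do that?)",
    "reliability":       "justification (why is this action appropriate?)",
    "competence":        "conceptual explanation (what can the system do?)",
}

def respond_to_trust_loss(affected_base: str) -> str:
    """Pick the kind of explanation that addresses the weakened base of trust."""
    return explanation_for_base.get(affected_base,
                                    "transparency explanation (why did the system do that?)")

print(respond_to_trust_loss("reliability"))
```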
intelligent environments | 2014
Florian Nothdurft; Frank Honold; Kseniya Zablotskaya; Amr Diab; Wolfgang Minker
In this paper we present a prototypical dialog system adaptive to the user's verbal intelligence. Verbal intelligence (VI) is the ability to analyze information and to solve problems using language-based reasoning. VI can be estimated from the number of reused words, lemmas, n-grams, cosine similarity, and other features. Here we concentrate on the application of VI in human-computer interaction (HCI) and on how this value can be used by the dialog management to adapt the dialog flow and complexity at run-time. In our work, the complexity as well as the informative value of the presented information can be reduced or increased during human-computer interaction by individually adapting to a less or more verbally intelligent user. Especially in intelligent environments, where users may rely on speech as their primary interaction modality, the adaptation of system instructions to the user's VI could prove helpful.
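One of the features mentioned, cosine similarity of reused words, can be illustrated with a small sketch that derives a crude VI-like score from a user's retelling of a reference text and then selects an instruction variant of matching complexity. The scoring, the threshold, and the instruction texts are assumptions for illustration, not the feature set or adaptation policy from the paper.

```python
from collections import Counter
from math import sqrt

def cosine_similarity(text_a: str, text_b: str) -> float:
    """Cosine similarity of word-count vectors of two texts."""
    a, b = Counter(text_a.lower().split()), Counter(text_b.lower().split())
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def instruction_for(vi_score: float) -> str:
    """Pick a more or less complex instruction depending on the VI-like score."""
    if vi_score >= 0.4:
        return "Connect the gateway to your router and confirm the DHCP lease."
    return "Plug the small box into your internet router, then press OK."

reference = "the gateway obtains an address from the router via dhcp"
user_retelling = "the box gets an address from the router"
print(instruction_for(cosine_similarity(reference, user_retelling)))
```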
international conference on multimodal interfaces | 2012
Florian Nothdurft; Frank Honold; Peter Kurzok
In this demo paper we present a system that is capable of adapting the dialogue between a human and a so-called companion system in real time. Companion systems are continually available, cooperative, and reliable assistants that adapt to a user's capabilities, preferences, requirements, and current needs. Typically, state-of-the-art human-computer interfaces adapt the interaction only to pre-defined levels of expertise. In contrast, the presented system adapts the structure and content of the interaction to each user by including explanations that prepare the user for upcoming tasks to be solved together with the companion system.