
Publication


Featured research published by Nikola Bubalo.


Psychiatry Research: Neuroimaging | 2012

Prone to excitement: Adolescent females with non-suicidal self-injury (NSSI) show altered cortical pattern to emotional and NSS-related material

Paul L. Plener; Nikola Bubalo; Anne K. Fladung; Andrea G. Ludolph; Dorothée Lulé

Emotion-regulation difficulties have been identified as one of the core components in Non-suicidal self-injury (NSSI), a behaviour often beginning in adolescence. This pilot study evaluated differences in emotion processing between 18 female adolescents with and without NSSI by using verbal responses and functional magnetic resonance imaging (fMRI). Responses to pictures taken from the International Affective Picture System and slides with reference to NSSI were recorded both by verbal rating of valence and arousal and by fMRI. The NSSI group rated pictures with self-injurious reference as significantly more arousing than controls. For emotional pictures, the NSSI group showed a significantly stronger brain response in the amygdala, hippocampus and anterior cingulate cortex bilaterally. Depression explained differences between groups in the limbic area. Furthermore, the NSSI group also showed increased activity in the middle orbitofrontal cortex, and inferior and middle frontal cortex when viewing NSSI picture material. Participants with NSSI showed decreased activity in correlation to arousal in the occipital cortex and to valence in inferior frontal cortex when watching emotional pictures. The fMRI data support the notion that individuals with NSSI show an altered neural pattern for emotional and NSSI pictures. Behavioural data highlight proneness to excitement regarding NSSI topics. This fMRI study provides evidence for emotion-regulation deficits in the developing brain of adolescents with NSSI.


International Conference on Multimodal Interfaces | 2014

Multimodal Interaction History and its use in Error Detection and Recovery

Felix Schüssel; Frank Honold; Miriam Schmidt; Nikola Bubalo; Anke Huckauf; Michael Weber

Multimodal systems still tend to ignore the individual input behavior of users and, at the same time, suffer from erroneous sensor inputs. Although many researchers have described user behavior in specific settings and tasks, little to nothing is known about the applicability of such information when it comes to increasing the robustness of a system for multimodal inputs. We conducted a gamified experimental study to investigate individual user behavior and the error types found in an actual running system. It is shown that previous ways of describing input behavior by a simple classification scheme (such as simultaneous vs. sequential) are not suited to building up an individual interaction history. Instead, we propose to use temporal distributions of different metrics derived from multimodal event timings. We identify the major errors that can occur in multimodal interactions and finally show how such an interaction history can practically be applied for error detection and recovery. Applying the proposed approach to the experimental data, the initial error rate is reduced from 4.9% to a minimum of 1.2%.
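The abstract's core idea, maintaining a per-user temporal distribution of multimodal event timings and flagging inputs whose timing is improbable, can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' implementation: the class name, the normal-distribution assumption, and the z-score threshold are all assumptions made here for the example.

```python
import statistics

class InteractionHistory:
    """Per-user history of temporal gaps between multimodal inputs
    (e.g. gaze event -> speech event). Illustrative sketch only:
    the normal model and the 3-sigma threshold are assumptions."""

    def __init__(self, z_threshold=3.0):
        self.gaps = []              # observed gaps in seconds
        self.z_threshold = z_threshold

    def record(self, gap):
        self.gaps.append(gap)

    def is_plausible(self, gap):
        # With too little history, accept every input.
        if len(self.gaps) < 5:
            return True
        mean = statistics.fmean(self.gaps)
        sd = statistics.stdev(self.gaps) or 1e-9
        return abs(gap - mean) / sd <= self.z_threshold

history = InteractionHistory()
for g in [0.21, 0.25, 0.19, 0.23, 0.22, 0.20]:
    history.record(g)

print(history.is_plausible(0.24))  # typical gap -> True
print(history.is_plausible(2.50))  # improbable timing -> False
```

An input flagged as implausible would then trigger a recovery strategy (for instance, asking for confirmation) instead of being fused directly.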


Conference on Human Information Interaction and Retrieval | 2017

Towards Identifying User Intentions in Exploratory Search using Gaze and Pupil Tracking

Thomas Low; Nikola Bubalo; Tatiana Gossen; Michael Kotzyba; André Brechmann; Anke Huckauf; Andreas Nürnberger

Exploration in large multimedia collections is challenging because the user often navigates into misleading directions or information areas. The vision of our project is to develop an assistive technology that is able to support the individual user and enhance the efficiency of an ongoing exploratory search. Such a technical search aid should be able to find out about the user's current interests and goals. Respective parameters can be found in the central and in the peripheral nervous system as well as in overt behavior. Therefore, we aim at using eye movements, pupillometry, and EEG to assess the respective information. Here, we describe the set-up and the first results of a preliminary user study investigating the effects of searching an image collection on eye movements and pupil dilations. First data show that the number of fixations, fixation durations, and pupil dilations differ systematically when looking at a subsequently selected target as compared with non-selected items. These results support our vision that further research additionally investigating EEG can in fact result in better prediction of the searcher's goals and next choices.


European Conference on Cognitive Ergonomics | 2016

User Expertise in Multimodal HCI

Nikola Bubalo; Frank Honold; Felix Schüssel; Michael Weber; Anke Huckauf

Previous experience with the same or similar computer systems can affect the behavior and preferences of a user in multimodal human-computer interaction (HCI). Consequently, adaptive systems need to identify and react to different kinds of previous user experiences. In this paper we examine possible effects of user expertise on a user's choice of modality, performance, and the temporal parameters of the interaction. Results show that increasing user expertise elevates the use of multimodal interaction modalities in general and the use of the familiar multimodal modality in particular. Increasing unimodal expertise, however, does not. Our results further show that the level of user expertise can affect the performance of the user. Based on our findings, this can be deduced from temporal factors of the interaction, such as the temporal gap between two subsequent inputs.


Companion Technology | 2017

Management of Multimodal User Interaction in Companion-Systems

Felix Schüssel; Frank Honold; Nikola Bubalo; Michael Weber; Anke Huckauf

While interacting, human beings continuously adapt their way of communication to their surroundings and their communication partner. Although present context-aware ubiquitous systems gather a lot of information to maximize their functionality, they predominantly offer rather static ways to communicate. In order to fulfill the user’s communication needs and demands, ubiquitous sensors’ varied information could be used to dynamically adapt the user interface. Considering such an adaptive user interface management as a major and relevant component for a Companion-Technology, we also have to cope with emotional and dispositional user input as a source of implicit user requests and demands. In this chapter we demonstrate how multimodal fusion based on evidential reasoning and probabilistic fission with adaptive reasoning can act together to form a highly adaptive and model-driven interactive system component for multimodal interaction. The presented interaction management (IM) can handle uncertain or ambiguous data throughout the complete interaction cycle with a user. In addition, we present the IM’s architecture and its model-driven concept. Finally, we discuss its role within the framework of the other constituents of a Companion-Technology.
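The evidential reasoning mentioned for multimodal fusion is commonly realized with Dempster-Shafer theory. The following is a generic textbook sketch of Dempster's rule of combination applied to evidence from two modalities, not the chapter's actual component; the example hypotheses ("lamp", "tv") and mass values are invented for illustration.

```python
def combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose
    focal elements are frozensets of hypotheses (illustrative sketch)."""
    combined = {}
    conflict = 0.0
    for a, w1 in m1.items():
        for b, w2 in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2       # mass assigned to contradictions
    k = 1.0 - conflict                    # normalize by non-conflicting mass
    return {s: w / k for s, w in combined.items()}

# Two modalities give uncertain evidence about the intended target.
speech = {frozenset({"lamp"}): 0.6, frozenset({"lamp", "tv"}): 0.4}
gesture = {frozenset({"lamp"}): 0.5, frozenset({"tv"}): 0.3,
           frozenset({"lamp", "tv"}): 0.2}

fused = combine(speech, gesture)
print(fused)  # mass concentrates on {"lamp"}
```

The appeal of this representation for an interaction manager is that it can keep mass on sets of hypotheses (such as {"lamp", "tv"}) instead of being forced into a premature single decision.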


Companion Technology | 2017

Interaction History in Adaptive Multimodal Interaction

Nikola Bubalo; Felix Schüssel; Frank Honold; Michael Weber; Anke Huckauf

Modern Companion-Technologies provide multimodal and adaptive interaction possibilities. However, it is still unclear which user characteristics should be used in which manner to optimally support the interaction. An important aspect is that users themselves learn and adapt their behavior and preferences based on their own experiences. In other words, certain characteristics of user behavior are slowly but continuously changed and updated by the users themselves over multiple encounters with the Companion-Technology. Thus, a biological adaptive multimodal system observes and interacts with an electronic one, and vice versa. Consequently, such a user-centered interaction history is essential and should be integrated in the prediction of user behavior. Doing so enables the Companion to achieve more robust predictions of user behavior, which in turn leads to better fusion decisions and more efficient customization of the UI. We present the development of an experimental paradigm based on visual search tasks. The setup allows the induction of various user experiences as well as the testing of their effects on user behavior and preferences during multimodal interaction.


Proceedings of the Workshop on Multimodal Analyses enabling Artificial Agents in Human-Machine Interaction | 2016

Increasing robustness of multimodal interaction via individual interaction histories

Felix Schüssel; Frank Honold; Nikola Bubalo; Michael Weber

Multimodal input fusion can be considered a well-researched topic, and yet it is rarely found in real-world applications. One reason for this could be the lack of robustness in real-world situations, especially regarding unimodal recognition technologies like speech and gesture, which tend to produce erroneous inputs that cannot be detected by the subsequent multimodal input fusion mechanism. Previous work implying the possibility of detecting and overcoming such errors through knowledge of individual temporal behaviors has neither provided a real-time implementation nor evaluated the real benefit of such an approach. We present such an implementation, applying individual interaction histories in order to increase the robustness of multimodal inputs within a smartwatch scenario. We show how such knowledge can be created and maintained at runtime, present evaluation data from an experiment conducted in a realistic scenario, and compare the approach to the state of the art known from the literature. Our approach is ready to use in other applications and existing systems, with the prospect of increasing the overall robustness of future multimodal systems.


Proceedings of the 2016 SAI Computing Conference | 2016

Measuring effects of user-specific behaviour on selection tasks in HCI

Nikola Bubalo; Frank Honold; Felix Schüssel; Michael Weber; Anke Huckauf

Assessing user behaviour is essential for adaptive multimodal dialogues in human-computer interaction (HCI). However, in recent research, the vast range of possibly interacting parameters makes it difficult to examine the influence of individual factors such as previous experience on user behaviour. In this paper we apply a new experimental setup, which enables independent examination of a large range of factors, in order to examine the effects of previous experience on the user's choice of modality and performance. The results show that previous experience in terms of an interaction history has a systematic effect and is thus a key factor for the development of adaptive dialogue systems.


International Conference on Human-Computer Interaction | 2016

In-Depth Analysis of Multimodal Interaction: An Explorative Paradigm

Felix Schüssel; Frank Honold; Nikola Bubalo; Anke Huckauf; Harald C. Traue; Dilana Hazer-Rau


Interactions | 2015

Benefiting from legacy bias

Anne Köpsel; Nikola Bubalo

Collaboration


Dive into Nikola Bubalo's collaboration.

Top Co-Authors

Andreas Nürnberger

Otto-von-Guericke University Magdeburg

André Brechmann

Leibniz Institute for Neurobiology
