Featured Research

Human Computer Interaction

On the Readability of Abstract Set Visualizations

Set systems are used to model data that naturally arises in many contexts: social networks have communities, musicians have genres, and patients have symptoms. Visualizations that accurately reflect the information in the underlying set system make it possible to identify the set elements, the sets themselves, and the relationships between the sets. In static contexts, such as print media or infographics, it is necessary to capture this information without the help of interactions. With this in mind, we consider three different systems for medium-sized set data: LineSets, EulerView, and MetroSets. We report the results of a controlled human-subjects experiment comparing their effectiveness. Specifically, we evaluate performance, in terms of time and error, on tasks that cover the spectrum of static set-based tasks. We also collect and analyze qualitative data about the three visualization systems. Our results include statistically significant differences, suggesting that MetroSets performs and scales better.

Human Computer Interaction

On the design of text editors

Text editors are written by and for developers. They come with a large set of default and implicit choices in terms of layout, typography, colorization, and interaction that hardly change from one editor to another. It is not clear whether these implicit choices derive from ignorance of alternatives or from developers' habits, reproducing what they are used to. The goal of this article is to characterize these implicit choices and to illustrate some alternatives, without prescribing any particular one.

Human Computer Interaction

Online Knowledge Base for Designing Shape-changing Interfaces using Modular Workshop Elements

Building and maintaining knowledge about specific interface technologies is a challenge. Current solutions include standard file-based document repositories, wikis, and other online tools. However, these solutions are often only available in intranets, become outdated, and do not support efficient knowledge acquisition. The effort to gain an overview and detailed knowledge about novel interface technologies can be overwhelming and requires researching and reading many technical reports and scientific publications. We propose to implement open-source online knowledge bases that include building blocks for creating custom workshops to understand and apply the contained knowledge. We demonstrate this concept with a knowledge base for shape-changing interfaces (SCI-KB). The SCI-KB is hosted online on GitHub and fosters the collaborative creation of knowledge elements accompanied by practical exercises and workshop elements that can be combined and adapted by individuals or groups new to the topic of shape-changing interfaces.

Human Computer Interaction

Online LDA based brain-computer interface system to aid disabled people

This paper aims to develop a brain-computer interface (BCI) system based on electroencephalography (EEG) that can aid disabled people in daily life. The system relies on one of the most effective event-related potential waves, the P300, which can be elicited by the oddball paradigm. The developed application provides a basic interaction tool that enables disabled people to convey their needs to others by selecting related objects. These objects flash pseudo-randomly in a visual interface on a computer screen, and the user must focus on the object corresponding to the desired need. The system conveys the desired need by detecting the P300 wave in the acquired 14-channel EEG signal and classifying it with a linear discriminant analysis (LDA) classifier in just 15 seconds. Experiments were carried out on 19 volunteers to validate the developed BCI system, achieving an online accuracy of 90.83%.
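
As a rough illustration of the classification step described above, the sketch below trains a linear discriminant analysis classifier to separate target (P300-expected) epochs from non-target epochs. The sampling rate, epoch window, and synthetic data are assumptions for illustration; only the 14-channel count comes from the abstract.

```python
# Hypothetical sketch: LDA classification of P300 vs. non-P300 epochs.
# Sampling rate, epoch length, and the random data are placeholders.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

FS = 128            # assumed sampling rate (Hz)
N_CHANNELS = 14     # channel count stated in the abstract
EPOCH_SEC = 0.8     # assumed post-stimulus window

rng = np.random.default_rng(0)
n_epochs = 600
n_samples = int(FS * EPOCH_SEC)

# Placeholder epochs: in practice these are stimulus-locked EEG segments.
X = rng.normal(size=(n_epochs, N_CHANNELS, n_samples))
y = rng.integers(0, 2, size=n_epochs)   # 1 = target stimulus, 0 = non-target

# Flatten each epoch into a feature vector; LDA finds the linear projection
# that best separates target from non-target responses.
X_flat = X.reshape(n_epochs, -1)
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
scores = cross_val_score(clf, X_flat, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f}")
```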

Human Computer Interaction

Online Mobile App Usage as an Indicator of Sleep Behavior and Job Performance

Sleep is critical to human function, mediating factors like memory, mood, energy, and alertness; therefore, it is commonly conjectured that a good night's sleep is important for job performance. However, both real-world sleep behavior and job performance are hard to measure at scale. In this work, we show that people's everyday interactions with online mobile apps can reveal insights into their job performance in real-world contexts. We present an observational study in which we objectively tracked the sleep behavior and job performance of salespeople (N = 15) and athletes (N = 19) for 18 months, using a mattress sensor and an online mobile app. We first demonstrate that cumulative sleep measures are correlated with job performance metrics, showing that an hour of daily sleep loss for a week was associated with a 9.0% and 9.5% reduction in the performance of salespeople and athletes, respectively. We then examine the utility of online app interaction time as a passively collectible and scalable performance indicator. We show that app interaction time is correlated with the performance of the athletes, but not the salespeople. To support that our app-based performance indicator captures meaningful variation in psychomotor function and is robust against potential confounds, we conducted a second study to evaluate the relationship between sleep behavior and app interaction time in a cohort of 274 participants. Using a generalized additive model to control for per-participant random effects, we demonstrate that participants who lost one hour of daily sleep for a week exhibited 5.0% slower app interaction times. We also find that app interaction time exhibits meaningful, chronobiologically consistent correlations with sleep history, time awake, and circadian rhythms. Our findings reveal an opportunity for online app developers to generate new insights regarding cognition and productivity.
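
One plausible shape for the statistical model mentioned above is sketched below: a generalized additive model relating sleep loss and time awake to app interaction time, with a per-participant factor term standing in for the per-participant effects. The predictors, column names, and synthetic data are assumptions, not the study's variables or data.

```python
# Hypothetical sketch of a GAM relating sleep loss to app interaction time.
# All variable names and data are invented for illustration.
import numpy as np
import pandas as pd
from pygam import LinearGAM, s, f

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "participant": rng.integers(0, 274, size=n),        # cohort size from the abstract
    "weekly_sleep_loss_h": rng.uniform(0, 2, size=n),    # assumed predictor
    "hours_awake": rng.uniform(0, 16, size=n),           # assumed predictor
})
# Synthetic response: interaction slows with cumulative sleep loss.
df["interaction_time_ms"] = (
    300 + 15 * df["weekly_sleep_loss_h"] + rng.normal(0, 10, size=n)
)

X = df[["weekly_sleep_loss_h", "hours_awake", "participant"]].to_numpy()
y = df["interaction_time_ms"].to_numpy()

# Smooth terms for the continuous predictors, a factor term per participant.
gam = LinearGAM(s(0) + s(1) + f(2)).fit(X, y)
gam.summary()
```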

Human Computer Interaction

Open-Source Concealed EEG Data Collection for Brain-Computer-Interfaces -- Real-World Neural Observation Through OpenBCI Amplifiers with Around-the-Ear cEEGrid Electrodes

Observing brain activity in real-world settings offers exciting possibilities such as supporting physical health, mental well-being, and thought-controlled interaction modalities. The development of such applications is, however, strongly impeded by poor access to research-grade neural data and by a lack of easy-to-use and comfortable sensors. This work presents the cost-effective adaptation of concealed around-the-ear EEG electrodes (cEEGrids) to the open-source OpenBCI EEG signal acquisition platform to provide a promising new toolkit. An integrated system design is described that combines publicly available electronic components with newly designed 3D-printed parts to form an easily replicable, versatile, single-unit around-the-ear EEG recording system for prolonged use and easy application development. To demonstrate the system's feasibility, observations of experimentally induced changes in visual stimulation and mental workload are presented. Lastly, as the cEEGrids have not previously been applied in HCI contexts, a novel application area for the system is investigated, namely the observation of flow experiences through temporal alpha power changes. Support for a link between temporal alpha power and flow is found, indicating an efficient engagement of verbal-analytic reasoning with intensified flow experiences, and specifically intensified task absorption.
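
To make the flow-related measure concrete, the sketch below estimates alpha-band (8-12 Hz) power from a single EEG channel with Welch's method. The sampling rate and the placeholder signal are assumptions, not the system's actual acquisition settings.

```python
# Hypothetical sketch: alpha-band power from one cEEGrid channel.
import numpy as np
from scipy.signal import welch

FS = 250                        # assumed sampling rate (Hz)
rng = np.random.default_rng(2)
eeg = rng.normal(size=FS * 60)  # one minute of placeholder single-channel EEG

freqs, psd = welch(eeg, fs=FS, nperseg=FS * 2)
alpha_mask = (freqs >= 8) & (freqs <= 12)

# Integrate the power spectral density over the alpha band.
alpha_power = np.trapz(psd[alpha_mask], freqs[alpha_mask])
print(f"alpha-band power: {alpha_power:.4f}")
```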

Human Computer Interaction

Optimal Action-based or User Prediction-based Haptic Guidance: Can You Do Even Better?

Recent advances in robotics technology enable robots to assist users in their daily lives. Haptic guidance (HG) improves users' task performance through physical interaction between robots and users. It can be classified into optimal action-based HG (OAHG), which assists users with an optimal action, and user prediction-based HG (UPHG), which assists users with their next predicted action. This study aims to understand the difference between OAHG and UPHG and to propose a combined HG (CombHG) that achieves optimal performance by complementing each HG type, which has important implications for HG design. We propose implementation methods for each HG type using deep learning-based approaches. A user study (n=20) in a haptic task environment indicated that UPHG induces better subjective evaluations, such as naturalness and comfort, than OAHG. In addition, the proposed CombHG further decreases the disagreement between user intention and HG without reducing the objective and subjective scores.
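
The paper's CombHG method is not detailed in the abstract, but the sketch below shows one simple way two guidance force vectors could be blended: weight the user-prediction-based force more heavily when it agrees with the optimal-action force. The weighting rule and force values are invented for illustration and are not the proposed method.

```python
# Hypothetical sketch: blending OAHG and UPHG force vectors by agreement.
import numpy as np

def comb_hg(oahg_force: np.ndarray, uphg_force: np.ndarray) -> np.ndarray:
    """Blend two guidance forces based on their directional agreement."""
    denom = np.linalg.norm(oahg_force) * np.linalg.norm(uphg_force)
    if denom == 0:
        return oahg_force
    # Cosine similarity in [-1, 1], mapped to a blend weight in [0, 1].
    agreement = float(oahg_force @ uphg_force) / denom
    w = (agreement + 1.0) / 2.0
    return w * uphg_force + (1.0 - w) * oahg_force

# Example: guidance forces (N) in a 2-D haptic workspace.
optimal = np.array([1.0, 0.0])
predicted = np.array([0.8, 0.3])
print(comb_hg(optimal, predicted))
```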

Human Computer Interaction

Outcome-Explorer: A Causality Guided Interactive Visual Interface for Interpretable Algorithmic Decision Making

The widespread adoption of algorithmic decision-making systems has brought about the necessity to interpret the reasoning behind these decisions. The majority of these systems are complex black-box models, and auxiliary models are often used to approximate and then explain their behavior. However, recent research suggests that such explanations are not readily accessible to non-expert users and can lead to incorrect interpretation of the underlying model. In this paper, we show that a predictive and interactive model based on causality is inherently interpretable, does not require any auxiliary model, and allows both expert and non-expert users to understand the model comprehensively. To demonstrate our method, we developed Outcome-Explorer, a causality-guided interactive interface, and evaluated it by conducting think-aloud sessions with three expert users and a user study with 18 non-expert users. All three expert users found our tool to be comprehensive in supporting their explanation needs, while the non-expert users were able to understand the inner workings of the model easily.
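
As a toy illustration of why a causal, predictive model can be read directly, the sketch below uses a two-equation linear structural causal model in which editing one input propagates through explicit equations to the outcome. The variables and coefficients are invented and are not Outcome-Explorer's model.

```python
# Hypothetical sketch: a tiny structural causal model a user can probe directly.
def predict_outcome(education_years: float, work_hours: float) -> float:
    # Each equation is visible, so the path from input to outcome can be read
    # off directly instead of being approximated by an auxiliary explainer.
    income = 2.0 * education_years + 0.5 * work_hours    # assumed structural equation
    outcome = 0.8 * income + 1.5 * education_years       # assumed structural equation
    return outcome

baseline = predict_outcome(education_years=12, work_hours=40)
what_if = predict_outcome(education_years=16, work_hours=40)  # user edits one input
print(f"baseline: {baseline:.1f}, after intervention: {what_if:.1f}")
```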

Human Computer Interaction

PAR: Personal Activity Radius Camera View for Contextual Sensing

Contextual sensing using wearable cameras has seen a variety of camera angles proposed to capture a wide gamut of visual scenes. In this paper, we propose a new camera view that aims to capture, from a single viewpoint, the same visual information as many of these camera positions and orientations combined. The camera, mounted on the corner of a glasses frame, points downwards towards the floor, a field of view we name the Personal Activity Radius (PAR). The PAR field of view captures the visual information around a wearer's personal bubble, including items they interact with, their body motion, and their surrounding environment. In our evaluation, we tested the PAR view's interpretability by human labelers in two different activity-tracking scenarios: food-related behaviors and exercise tracking. Human labelers achieved high precision overall, identifying body motions in exercise tracking with 91% precision and eating/drinking motions with 96% precision, while item interaction identification reached 86% precision for labeling grocery categories. We give a high-level overview of the device setup and the contextual views we were able to capture with the device. The camera's wide angle captures different activities such as driving, shopping, gym exercises, walking, and eating, and can observe the specific item the user interacts with as well as the immediate contextual surroundings.

Human Computer Interaction

PRAGMA: Interactively Constructing Functional Brain Parcellations

A prominent goal of neuroimaging studies is mapping the human brain in order to identify and delineate functionally meaningful regions and elucidate their roles in cognitive behaviors. These brain regions are typically represented by atlases that capture general trends over large populations. Despite being indispensable to neuroimaging experts, population-level atlases do not capture individual differences in functional organization. In this work, we present an interactive visualization method, PRAGMA, that allows domain experts to derive scan-specific parcellations from established atlases. PRAGMA features a user-driven, hierarchical clustering scheme for defining temporally correlated parcels at varying granularity. The visualization design supports the user in making decisions on how to perform clustering, namely when to expand, collapse, or merge parcels. This is accomplished through a set of linked and coordinated views for understanding the user's current hierarchy, assessing intra-cluster variation, and relating parcellations to an established atlas. We assess the effectiveness of PRAGMA through a user study with four neuroimaging domain experts; our results suggest that PRAGMA has the potential to enable exploration of individualized and state-specific brain parcellations and to offer interesting insights into functional brain networks.
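
A minimal sketch of the underlying clustering idea is given below: region time series are grouped hierarchically by temporal correlation, and cutting the resulting tree at different heights yields parcels of varying granularity. The region count, time series, and cut threshold are placeholders, not PRAGMA's implementation.

```python
# Hypothetical sketch: hierarchical clustering of region time series by
# temporal correlation, cut at a chosen granularity.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(3)
n_regions, n_timepoints = 100, 200
ts = rng.normal(size=(n_regions, n_timepoints))  # placeholder fMRI time series

# Correlation distance between region time series (1 - Pearson r).
dist = pdist(ts, metric="correlation")
tree = linkage(dist, method="average")

# Cutting the dendrogram at different heights is analogous to expanding or
# collapsing parcels interactively.
parcels = fcluster(tree, t=0.8, criterion="distance")
print(f"number of parcels: {parcels.max()}")
```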

