Featured Research

Human Computer Interaction

3D4ALL: Toward an Inclusive Pipeline to Classify 3D Contents

Algorithmic content moderation manages the explosive volume of user-created content shared online every day. Although a massive number of 3D designs are freely available to download, share, and 3D print, detecting sensitive content with transparency and fairness remains contentious. Sensitive 3D content may have a greater impact than other media because it can be reproduced and replicated without restriction, yet prevailing unawareness has led to the proliferation of sensitive 3D models online and a lack of discussion on transparent and fair 3D content moderation. Because 3D content exists on the web as documents consisting mainly of text and images, we first study existing algorithmic efforts based on text and images, along with prior endeavors to incorporate transparency and fairness into moderation, which can also be useful in the 3D printing domain. At the same time, we identify 3D-specific features that should be addressed to advance 3D-specialized algorithmic moderation. As a potential solution, we suggest a human-in-the-loop pipeline using augmented learning, powered by various stakeholders with different backgrounds and perspectives on understanding the content. Our pipeline aims to minimize personal biases by enabling diverse stakeholders to be vocal in reflecting the various factors needed to interpret the content. We also present an initial proposal for redesigning the metadata of open 3D repositories, to encourage users to act responsibly by obtaining consent from the subjects of their content before sharing it freely in public spaces.
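
To make the idea of a human-in-the-loop moderation pipeline concrete, here is a minimal sketch in Python. The classifier functions, thresholds, and keyword list are hypothetical placeholders for illustration only; they are not the authors' 3D4ALL system.

```python
# Sketch of a human-in-the-loop moderation pipeline for 3D model listings.
# All classifiers and thresholds below are illustrative stand-ins.
from dataclasses import dataclass, field

@dataclass
class Listing:
    description: str                      # text metadata of the 3D model page
    image_scores: list                    # sensitivity scores from an image classifier
    reviews: list = field(default_factory=list)  # 1 = reviewer flags, 0 = approves

def text_sensitivity(description: str) -> float:
    """Placeholder for a text-based sensitivity classifier returning 0..1."""
    keywords = ("weapon", "firearm", "explosive")
    return 1.0 if any(k in description.lower() for k in keywords) else 0.1

def moderate(listing: Listing, low=0.2, high=0.8) -> str:
    """Combine text and image signals; defer uncertain cases to diverse reviewers."""
    score = max([text_sensitivity(listing.description), *listing.image_scores])
    if score >= high:
        return "flagged"
    if score <= low:
        return "approved"
    # Uncertain band: collect judgements from stakeholders with different
    # backgrounds so that no single reviewer's bias decides the outcome.
    if len(listing.reviews) < 3:
        return "needs_review"
    flagged_votes = sum(listing.reviews)
    return "flagged" if flagged_votes > len(listing.reviews) / 2 else "approved"

print(moderate(Listing("3D printable firearm frame", [0.4])))  # -> flagged
```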

Read more
Human Computer Interaction

A Blast From the Past: Personalizing Predictions of Video-Induced Emotions using Personal Memories as Context

A key challenge in accurately predicting viewers' emotional responses to video stimuli in real-world applications is accounting for person- and situation-specific variation. An important contextual influence shaping an individual's subjective experience of a video is the personal memories it triggers. Prior research has found that this memory influence explains more variation in video-induced emotions than other contextual variables commonly used for personalizing predictions, such as viewers' demographics or personality. In this article, we show that (1) automatic analysis of text describing viewers' video-triggered memories can account for variation in their emotional responses, and (2) combining such an analysis with analysis of a video's audiovisual content enhances the accuracy of automatic predictions. We discuss the relevance of these findings for improving on state-of-the-art approaches to automated affective video analysis in personalized contexts.
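
As an illustration of the kind of fusion described above, the sketch below concatenates toy memory-text features with audiovisual features and fits a regressor. The feature extractors, the two-dimensional audiovisual features, and the valence labels are all our invented placeholders, not the authors' model or data.

```python
# Illustrative early-fusion sketch: memory-description features + audiovisual
# features -> predicted valence. All features and labels are synthetic.
import numpy as np
from sklearn.linear_model import Ridge

def memory_text_features(text: str) -> np.ndarray:
    """Toy stand-in for text analysis of a memory description
    (crude sentiment counts and length)."""
    positive = sum(w in text.lower() for w in ("happy", "love", "fun"))
    negative = sum(w in text.lower() for w in ("sad", "loss", "fear"))
    return np.array([positive, negative, len(text.split())], dtype=float)

def fuse(av_features: np.ndarray, memory_text: str) -> np.ndarray:
    """Early fusion: concatenate audiovisual and memory-derived features."""
    return np.concatenate([av_features, memory_text_features(memory_text)])

# Hypothetical training data: two viewers watching the same video report
# different memories and different valence ratings.
X = np.stack([
    fuse(np.array([0.3, 0.7]), "a happy summer with friends"),
    fuse(np.array([0.3, 0.7]), "the loss of a close relative"),
])
y = np.array([0.8, -0.6])  # self-reported valence in [-1, 1]

model = Ridge(alpha=1.0).fit(X, y)
print(model.predict(X))
```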

Read more
Human Computer Interaction

A Design Space of Vision Science Methods for Visualization Research

A growing number of efforts aim to understand what people see when using a visualization. These efforts provide scientific grounding to complement design intuitions, leading to more effective visualization practice. However, published visualization research currently reflects a limited set of available methods for understanding how people process visualized data. Alternative methods from vision science offer a rich suite of tools for understanding visualizations, but no curated collection of these methods exists in either perception or visualization research. We introduce a design space of experimental methods for empirically investigating the perceptual processes involved with viewing data visualizations to ultimately inform visualization design guidelines. This paper provides a shared lexicon for facilitating experimental visualization research. We discuss popular experimental paradigms, adjustment types, response types, and dependent measures used in vision science research, rooting each in visualization examples. We then discuss the advantages and limitations of each technique. Researchers can use this design space to create innovative studies and progress scientific understanding of design choices and evaluations in visualization. We highlight a history of collaborative success between visualization and vision science research and advocate for a deeper relationship between the two fields that can elaborate on and extend the methodological design space for understanding visualization and vision.

Read more
Human Computer Interaction

A First Step Towards On-Device Monitoring of Body Sounds in the Wild

Body sounds provide rich information about the state of the human body and can be useful in many medical applications. Auscultation, the practice of listening to body sounds, has been used for centuries in respiratory and cardiac medicine to diagnose or track disease progression. To date, however, its use has been confined to clinical and highly controlled settings. Our work addresses this limitation: we devise a chest-mounted wearable for continuous monitoring of body sounds that leverages data processing algorithms running on-device. We concentrate on the detection of heart sounds to perform heart rate monitoring. To improve robustness to ambient noise and motion artefacts, our device uses an algorithm that explicitly segments the collected audio into the phases of the cardiac cycle. Our pilot study with 9 users demonstrates that it is possible to obtain heart rate estimates that are competitive with commercial heart rate monitors, with power consumption low enough for continuous use.
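
For readers unfamiliar with heart-sound processing, the sketch below estimates heart rate from chest audio under the simplifying assumption that the first heart sound (S1) dominates a band-passed energy envelope. The filter band, window sizes, and peak constraints are illustrative choices, not the paper's on-device algorithm.

```python
# Minimal heart-rate estimation from chest-mounted audio (illustrative only).
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def heart_rate_bpm(audio: np.ndarray, fs: int) -> float:
    """Estimate heart rate in beats per minute; assumes fs well above 300 Hz."""
    # Band-pass around typical heart-sound frequencies (~25-150 Hz).
    b, a = butter(2, [25 / (fs / 2), 150 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, audio)
    # Smooth the energy envelope with a 50 ms moving average.
    win = max(1, int(0.05 * fs))
    envelope = np.convolve(filtered ** 2, np.ones(win) / win, mode="same")
    # S1 peaks must be at least ~0.4 s apart (caps the estimate at ~150 bpm).
    peaks, _ = find_peaks(envelope, distance=int(0.4 * fs), height=envelope.mean())
    if len(peaks) < 2:
        return float("nan")
    intervals = np.diff(peaks) / fs   # seconds between consecutive beats
    return 60.0 / intervals.mean()
```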

Read more
Human Computer Interaction

A Framework for Evaluating Dashboards in Healthcare

In the era of "information overload", effective information provision is essential for enabling rapid response and critical decision making. In making sense of diverse information sources, data dashboards have become an indispensable tool, providing fast, effective, adaptable, and personalized access to information for professionals and the general public alike. However, these objectives place heavy requirements on dashboards as information systems, often resulting in poor usability and ineffective design. Understanding these shortfalls is a challenge given the absence of a consistent and comprehensive approach to dashboard evaluation. In this paper we systematically review literature on dashboard implementation in the healthcare domain, a field where dashboards have been employed widely and in which there is widespread interest in improving the current state of the art, and subsequently analyse the approaches taken towards evaluation. We draw upon consolidated dashboard literature and our own observations to introduce a general definition of dashboards which is more relevant to current trends, together with a dashboard task-based classification, which underpin our subsequent analysis. From a total of 81 papers we derive seven evaluation scenarios: task performance, behaviour change, interaction workflow, perceived engagement, potential utility, algorithm performance, and system implementation. These scenarios distinguish different evaluation purposes, which we illustrate through measurements, example studies, and common challenges in evaluation study design. We provide a breakdown of each evaluation scenario and highlight some of the subtle and less well-posed questions. We conclude by outlining a number of active discussion points and a set of dashboard evaluation best practices for the academic, clinical, and software development communities alike.

Read more
Human Computer Interaction

A Framework for Integrating Gesture Generation Models into Interactive Conversational Agents

Embodied conversational agents (ECAs) benefit from non-verbal behavior for natural and efficient interaction with users. Gesticulation - hand and arm movements accompanying speech - is an essential part of non-verbal behavior. Gesture generation models have been developed for several decades, starting with rule-based methods and moving toward mainly data-driven ones. To date, however, recent end-to-end gesture generation methods have not been evaluated in real-time interaction with users. We present a proof-of-concept framework intended to facilitate the evaluation of modern gesture generation models in interaction. We demonstrate an extensible open-source framework that contains three components: 1) a 3D interactive agent; 2) a chatbot backend; 3) a gesticulating system. Each component can be replaced, making the proposed framework applicable for investigating the effect of different gesturing models in real-time interactions with different communication modalities, chatbot backends, or agent appearances. The code and video are available at the project page: this https URL.
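
The replaceable three-component architecture can be pictured as a set of small interfaces wired together in one interaction loop. The class and method names below are hypothetical, chosen only to illustrate the idea; they are not the project's actual API.

```python
# Sketch of a swappable-component architecture for an interactive ECA.
from abc import ABC, abstractmethod

class ChatbotBackend(ABC):
    @abstractmethod
    def respond(self, user_utterance: str) -> str:
        """Produce the agent's reply text."""

class GestureGenerator(ABC):
    @abstractmethod
    def generate(self, speech_text: str) -> list:
        """Return a sequence of gesture frames (e.g. joint rotations)."""

class EmbodiedAgent(ABC):
    @abstractmethod
    def play(self, speech_text: str, gesture_frames: list) -> None:
        """Render speech and synchronized gestures on the 3D avatar."""

def interaction_step(user_utterance: str,
                     chatbot: ChatbotBackend,
                     gesturer: GestureGenerator,
                     agent: EmbodiedAgent) -> None:
    """One turn of the loop: each component can be swapped independently."""
    reply = chatbot.respond(user_utterance)
    frames = gesturer.generate(reply)
    agent.play(reply, frames)
```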

Read more
Human Computer Interaction

A Game-Based Approach for Helping Designers Learn Machine Learning Concepts

Machine Learning (ML) is becoming more prevalent in the systems we use daily, yet the designers of these systems are under-equipped to design with these technologies. Recently, interactive visualizations have been used to present ML concepts to non-experts. However, little research exists evaluating how designers build an understanding of ML in these environments or how to design interfaces that guide their learning. In a user study (n=21), we observe how designers interact with our interactive visualizer, QUBE, which visualizes Q-Learning through a game metaphor. We analyze how designers approach interactive visualizations and game metaphors to form an understanding of ML concepts, and the challenges they face along the way. We found that the interactive visualization significantly improved participants' high-level understanding of ML concepts. However, it did not support their ability to design with these concepts. We present themes on the challenges our participants faced when learning an ML concept and on their self-guided learning behaviors. Our findings suggest design recommendations for supporting an understanding of ML concepts through guided learning interfaces and game metaphors.
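
For context on the concept QUBE visualizes, here is a minimal tabular Q-learning sketch on a toy one-dimensional "game" of our own invention (it is not the study's environment): the agent learns that moving right toward the goal yields the highest long-term value.

```python
# Minimal tabular Q-learning on a toy 1-D board; purely illustrative.
import random

ACTIONS = ["left", "right"]
GOAL = 4                                  # reaching cell 4 ends the episode

def step(state: int, action: str):
    next_state = max(0, state - 1) if action == "left" else min(GOAL, state + 1)
    reward = 1.0 if next_state == GOAL else 0.0
    return next_state, reward, next_state == GOAL

Q = {(s, a): 0.0 for s in range(GOAL + 1) for a in ACTIONS}
alpha, gamma = 0.5, 0.9

for _ in range(200):                      # training episodes
    state, done = 0, False
    while not done:
        action = random.choice(ACTIONS)   # explore randomly (Q-learning is off-policy)
        next_state, reward, done = step(state, action)
        best_next = max(Q[(next_state, a)] for a in ACTIONS)
        # Core update: move Q toward reward + discounted best future value.
        Q[(state, action)] += alpha * (reward + gamma * best_next - Q[(state, action)])
        state = next_state

print(max(ACTIONS, key=lambda a: Q[(0, a)]))  # learned best first move: "right"
```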

Read more
Human Computer Interaction

A Multi-Platform Study of Crowd Signals Associated with Successful Online Fundraising

The growing popularity of online fundraising (aka "crowdfunding") has attracted significant research on the subject. In contrast to previous studies that attempt to predict the success of crowdfunded projects based on specific characteristics of the projects and their creators, we present a more general approach that focuses on crowd dynamics and is robust to the particularities of different crowdfunding platforms. We rely on a multi-method analysis to investigate the correlates, predictive importance, and quasi-causal effects of features that describe crowd dynamics in determining the success of crowdfunded projects. By applying this analysis to fundraising in three different online markets, we uncover general crowd dynamics that ultimately decide which projects will succeed. In all analyses and across the three platforms, we consistently find that funders' behavioural signals (1) are significantly correlated with fundraising success; (2) approximate fundraising outcomes better than the characteristics of projects and their creators, such as credit grade, company valuation, and subject domain; and (3) have significant quasi-causal effects on fundraising outcomes while controlling for potentially confounding project variables. By showing that universal features deduced from crowd behaviour are predictive of fundraising success on different crowdfunding platforms, our work provides design-relevant insights about novel types of collective decision-making online. This research thus suggests potential ways to leverage cues from the crowd and catalyses further research into crowd-aware system design.
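
One ingredient of such a multi-method analysis, comparing the predictive importance of crowd-dynamics features against project characteristics, could look like the sketch below. The data is synthetic and the feature names are our placeholders, not the study's variables or results.

```python
# Illustrative comparison of feature importance: crowd signals vs. project traits.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 1000
# Crowd-dynamics signals (e.g. early funder count, funding velocity).
early_funders = rng.poisson(5, n)
funding_velocity = rng.gamma(2.0, 1.0, n)
# Project characteristics (e.g. goal size, category).
goal_size = rng.lognormal(8, 1, n)
category = rng.integers(0, 5, n)
# Synthetic outcome: success driven mostly by the crowd signals.
success = (early_funders + 2 * funding_velocity + rng.normal(0, 1, n) > 8).astype(int)

X = np.column_stack([early_funders, funding_velocity, goal_size, category])
names = ["early_funders", "funding_velocity", "goal_size", "category"]
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, success)
for name, imp in sorted(zip(names, model.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: {imp:.2f}")
```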

Read more
Human Computer Interaction

A Neuro-inspired Theory of Joint Human-Swarm Interaction

Human-swarm interaction (HSI) is an active research challenge in the realms of swarm robotics and human-factors engineering. Here we apply a cognitive systems engineering perspective and introduce a neuro-inspired joint systems theory of HSI. This perspective yields predictions about adaptive, robust, and scalable HSI dynamics and therefore has the potential to inform the design of the human-swarm loop.

Read more
Human Computer Interaction

A Probabilistic Interpretation of Motion Correlation Selection Techniques

Motion correlation interfaces present targets moving in different patterns, which the user can select by matching their motion. In this paper, we re-formulate the task of target selection as a probabilistic inference problem. We demonstrate that previous interaction techniques can be modelled using a Bayesian approach, and show how modelling the selection task as the transmission of information can help make explicit the assumptions behind similarity measures. We propose ways of incorporating uncertainty into the decision-making process and demonstrate how the concept of entropy can illuminate the measurement of the quality of a design. We apply these techniques in a case study and suggest guidelines for future work.
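
To make the probabilistic framing concrete, the sketch below maintains a posterior over candidate targets, updates it with each correlation observation, and reports the entropy of the belief as a measure of remaining selection uncertainty. The Gaussian likelihood over correlation scores is our assumption for illustration, not the paper's specific model.

```python
# Bayesian target selection from motion-correlation observations (illustrative).
import numpy as np

def update_posterior(prior: np.ndarray, correlations: np.ndarray,
                     sigma: float = 0.2) -> np.ndarray:
    """Bayes rule with a likelihood that peaks when the user's motion
    matches a target perfectly (correlation = 1)."""
    likelihood = np.exp(-((1.0 - correlations) ** 2) / (2 * sigma ** 2))
    posterior = prior * likelihood
    return posterior / posterior.sum()

def entropy_bits(p: np.ndarray) -> float:
    """Shannon entropy in bits; low entropy means a confident selection."""
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Three moving targets; the user's motion correlates best with target 1.
belief = np.full(3, 1 / 3)
for corr in ([0.6, 0.9, 0.2], [0.5, 0.95, 0.1]):
    belief = update_posterior(belief, np.array(corr))
    print(belief.round(3), f"entropy={entropy_bits(belief):.2f} bits")
# A selection could be committed once the posterior (or its entropy)
# crosses a chosen decision threshold.
```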

Read more
