Featured Researches

Human Computer Interaction

Images, Emotions, and Credibility: Effect of Emotional Facial Images on Perceptions of News Content Bias and Source Credibility in Social Media

Images are an indispensable part of the news content we consume. Highly emotional images from sources of misinformation can greatly influence our judgements. We present two studies on the effects of emotional facial images on users' perception of bias in news content and the credibility of sources. In study 1, we investigate the impact of happy and angry facial images on users' decisions. In study 2, we focus on sources' systematic emotional treatment of specific politicians. Our results show that, depending on the political orientation of the source, the cumulative effect of angry facial emotions impacts users' perceived content bias and source credibility. When sources systematically portray specific politicians as angry, users are more likely to find those sources less credible and their content more biased. These results highlight how implicit visual propositions manifested by emotions in facial expressions might have a substantial effect on our trust in news content and sources.

Read more
Human Computer Interaction

Implications of ageing for the design of cognitive interaction systems

We are living longer during the greatest technological revolution humanity has ever seen. Understanding how these two facts interact, and more specifically how to maximise the benefits that new developments could offer for enhancing the quality of life of older adults, is a task on which we have already begun to work. In particular, the rapid growth of cognitive interaction systems (CISs), technologies that learn and interact with humans to extend what humans and machines could do on their own, offers a promising landscape of possibilities.

Read more
Human Computer Interaction

Importance of Instruction for Pedestrian-Automated Driving Vehicle Interaction with an External Human Machine Interface: Effects on Pedestrians' Situation Awareness, Trust, Perceived Risks and Decision Making

Compared to a manual driving vehicle (MV), an automated driving vehicle (AV) lacks a way to communicate with pedestrians through its driver, because the driver usually does not participate in the driving task. Thus, an external human machine interface (eHMI) can be viewed as a novel explicit communication method for conveying the AV's driving intentions to pedestrians when the two need to negotiate an interaction, e.g., an encounter scene. However, the eHMI may not guarantee that pedestrians will fully recognize the intention of the AV. In this paper, we propose that instructing pedestrians about the eHMI's rationale can help them correctly understand the AV's driving intentions and predict its behavior, thereby also improving their subjective feelings (i.e., feeling of danger, trust in the AV, and feeling of relief) and their decision-making. The results of an interaction experiment in a road-crossing scene indicate that participants found it more difficult to be aware of the situation when they encountered an AV without an eHMI than when they encountered an MV; further, their subjective feelings and hesitation in decision-making also deteriorated significantly. When the AV was equipped with an eHMI, participants' situation awareness, subjective feelings, and decision-making improved. After the instruction, it was easier for participants to understand the driving intention and predict the driving behavior of the AV with the eHMI; further, their subjective feelings and the hesitation related to decision-making improved to the same level as for the MV.

Read more
Human Computer Interaction

Improving Artificial Teachers by Considering How People Learn and Forget

The paper presents a novel model-based method for intelligent tutoring, with particular emphasis on the problem of selecting teaching interventions in interaction with humans. Whereas previous work has focused on either personalization of teaching or optimization of teaching intervention sequences, the proposed individualized model-based planning approach represents a convergence of these two lines of research. Model-based planning picks the best interventions via interactive learning of a user memory model's parameters. The approach is novel in its use of a cognitive model that can account for several key individual- and material-specific characteristics related to recall/forgetting, along with a planning technique that considers users' practice schedules. Using a rule-based approach as a baseline, the authors evaluated the method's benefits in a controlled study of artificial teaching in second-language vocabulary learning (N=53).
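
As a rough illustration of what model-based planning over a memory model might look like (not the authors' actual model or planner, whose equations the abstract does not give), here is a minimal Python sketch that assumes a simple exponential forgetting curve and a greedy rule that reviews the item whose predicted recall has fallen furthest below a target level:

import math

class ForgettingModel:
    """Toy per-item memory model: recall probability decays exponentially
    with time since last practice; each practice slows future decay."""

    def __init__(self, decay=0.1):
        self.decay = decay            # hypothetical individual forgetting rate
        self.strength = {}            # item -> accumulated memory strength
        self.last_seen = {}           # item -> time of last practice

    def practice(self, item, now):
        self.strength[item] = self.strength.get(item, 0.0) + 1.0
        self.last_seen[item] = now

    def recall_probability(self, item, now):
        if item not in self.last_seen:
            return 0.0
        elapsed = now - self.last_seen[item]
        return math.exp(-self.decay * elapsed / (1.0 + self.strength[item]))

def choose_next_item(model, items, now, target=0.8):
    """Greedy planning step: pick the item predicted to be least well retained."""
    scored = [(model.recall_probability(it, now), it) for it in items]
    below = [(p, it) for p, it in scored if p < target]
    return min(below)[1] if below else min(scored)[1]

A real planner would additionally fit the decay parameter to each learner's observed recall history and look ahead over the user's practice schedule rather than acting greedily.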

Read more
Human Computer Interaction

Improving Engagement of Animated Visualization with Visual Foreshadowing

Animated visualization is becoming increasingly popular as a compelling way to illustrate changes in time series data. However, maintaining the viewer's focus throughout the entire animation is difficult because of its time-consuming nature. Viewers are likely to become bored and distracted during the ever-changing animated visualization. Informed by the role of foreshadowing in building expectation in film and literature, we introduce visual foreshadowing to improve the engagement of animated visualizations. Specifically, we propose designs of visual foreshadowing that keep the audience engaged while watching the animation. To demonstrate our approach, we built a proof-of-concept animated visualization authoring tool that incorporates visual foreshadowing techniques with various styles. Our user study indicates the effectiveness of our foreshadowing techniques in improving engagement with animated visualization.

Read more
Human Computer Interaction

Influences of Temporal Factors on GPS-based Human Mobility Lifestyle

Analysis of human mobility from GPS trajectories has become crucial in many areas, such as policy planning for urban citizens, location-based service recommendation/prediction, and especially mitigating the spread of biological and mobile viruses. In this paper, we propose a method to find temporal factors affecting human mobility lifestyle. We collected GPS data from 100 smartphone users in Japan and designed a model consisting of 13 temporal patterns. We then applied multiple linear regression and found that people tend to keep their mobility habits on Thursdays and on days in the second week of a month, but tend to lose those habits on Fridays. We also discuss possible reasons behind these findings.
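
To make the regression setup concrete, the following Python sketch regresses a per-day mobility score on temporal indicator features (day of week, week of month) with ordinary least squares. The feature set, the mobility score, and the random data are invented for illustration and need not match the paper's 13 temporal patterns:

import numpy as np
from datetime import date, timedelta

def temporal_features(d: date):
    dow = [1.0 if d.weekday() == i else 0.0 for i in range(7)]   # Mon..Sun indicators
    week_of_month = (d.day - 1) // 7                              # 0..4
    wom = [1.0 if week_of_month == i else 0.0 for i in range(5)]
    return dow + wom + [1.0]                                      # plus intercept

# Hypothetical data: one "mobility habit" score per day (e.g., similarity of the
# day's visited locations to the user's routine); here just random placeholders.
rng = np.random.default_rng(0)
days = [date(2021, 1, 1) + timedelta(days=i) for i in range(120)]
X = np.array([temporal_features(d) for d in days])
y = rng.random(len(days))

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
names = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun",
         "wk1", "wk2", "wk3", "wk4", "wk5", "intercept"]
for name, c in zip(names, coef):
    print(f"{name}: {c:+.3f}")

In such a setup, a positive coefficient on the Thursday or second-week indicators and a negative one on Friday would correspond to the kind of finding the abstract reports.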

Read more
Human Computer Interaction

Insights From Experiments With Rigor in an EvoBio Design Study

Design study is an established approach to conducting problem-driven visualization research. The academic visualization community has produced a large body of work for reporting on design studies, informed by a handful of theoretical frameworks, and applied to a broad range of application areas. The result is an abundance of reported insights into visualization design, with an emphasis on novel visualization techniques and systems as the primary contribution of these studies. In recent work we proposed a new, interpretivist perspective on design study and six companion criteria for rigor that highlight the opportunities for researchers to contribute knowledge that extends beyond visualization idioms and software. In this work we conducted a year-long collaboration with evolutionary biologists to develop an interactive tool for visual exploration of multivariate datasets and phylogenetic trees. During this design study we experimented with methods to support three of the rigor criteria: ABUNDANT, REFLEXIVE, and TRANSPARENT. As a result we contribute two novel visualization techniques for the analysis of multivariate phylogenetic datasets, three methodological recommendations for conducting design studies drawn from reflections on our process of experimentation, and two writing devices for reporting interpretivist design study. We offer this work as an example of implementing the rigor criteria to produce a diverse range of knowledge contributions.

Read more
Human Computer Interaction

Integrated Visualization Editing via Parameterized Declarative Templates

Interfaces for creating visualizations typically embrace one of several common forms. Textual specification enables fine-grained control, shelf building facilitates rapid exploration, and chart choosing promotes immediacy and simplicity. Ideally these approaches could be unified to integrate the user- and usage-dependent benefits found in each modality, yet these forms remain distinct. We propose parameterized declarative templates, a simple abstraction mechanism over JSON-based visualization grammars, as a foundation for multimodal visualization editors. We demonstrate how templates can facilitate organization and reuse by factoring the more than 160 charts that constitute Vega-Lite's example gallery into approximately 40 templates. We exemplify the pliability of abstracting over charting grammars by implementing, as a template, the functionality of the shelf builder Polestar (a simulacrum of Tableau) and a set of templates that emulate the Google Sheets chart chooser. We show how templates support multimodal visualization editing by implementing a prototype and evaluating it through an approachability study.
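
To illustrate the general idea of a parameterized declarative template (the placeholder syntax and the tiny scatterplot spec below are invented for illustration, not the paper's actual template language), a template can be thought of as a JSON-like chart specification with named holes that are filled in at instantiation time:

SCATTER_TEMPLATE = {
    "mark": "point",
    "encoding": {
        "x": {"field": "<xField>", "type": "quantitative"},
        "y": {"field": "<yField>", "type": "quantitative"},
        "color": {"field": "<colorField>", "type": "nominal"},
    },
}

def instantiate(template, params):
    """Replace every "<name>" placeholder in the template with params[name]."""
    def fill(node):
        if isinstance(node, dict):
            return {k: fill(v) for k, v in node.items()}
        if isinstance(node, str) and node.startswith("<") and node.endswith(">"):
            return params[node[1:-1]]
        return node
    return fill(template)

spec = instantiate(SCATTER_TEMPLATE,
                   {"xField": "horsepower", "yField": "mpg", "colorField": "origin"})
print(spec)

A shelf-building or chart-choosing interface can then be viewed as a GUI that only sets template parameters, while a text editor can still open and modify the underlying grammar, which is what allows the modalities to coexist.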

Read more
Human Computer Interaction

Integrative Object and Pose to Task Detection for an Augmented-Reality-based Human Assistance System using Neural Networks

As industry becomes increasingly automated and digitized, processes are becoming more complex. Augmented Reality (AR) has shown considerable potential in assisting workers with complex tasks by enhancing user understanding and experience with spatial information. However, the acceptance and integration of AR into industrial processes is still limited due to the lack of established methods and tedious integration efforts. Meanwhile, deep neural networks have achieved remarkable results in computer vision tasks and hold great promise for enriching Augmented Reality applications. In this paper, we propose an Augmented-Reality-based human assistance system that assists workers in complex manual tasks by incorporating deep neural networks for computer vision tasks. More specifically, we combine Augmented Reality with object and action detectors to make workflows more intuitive and flexible. To evaluate our system in terms of user acceptance and efficiency, we conducted several user studies. We found a significant reduction in time to task completion for untrained workers and a decrease in error rate. Furthermore, we investigated users' learning curve with our assistance system.
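
As a very rough sketch of how detected objects and actions could drive a step-by-step workflow whose current instruction is rendered as an AR overlay (the workflow, detector outputs, and names below are hypothetical and do not reproduce the paper's architecture), consider:

from dataclasses import dataclass

@dataclass
class Step:
    instruction: str          # text shown in the AR overlay
    required_object: str      # object the detector must see
    required_action: str      # action the detector must recognize

WORKFLOW = [
    Step("Pick up the housing", "housing", "grasp"),
    Step("Insert the bearing into the housing", "bearing", "insert"),
    Step("Tighten the four screws", "screwdriver", "screw"),
]

def advance(step_index, detected_objects, detected_action):
    """Move to the next step once the current step's object and action are observed."""
    step = WORKFLOW[step_index]
    if step.required_object in detected_objects and detected_action == step.required_action:
        return min(step_index + 1, len(WORKFLOW) - 1)
    return step_index

# Hypothetical per-frame detector outputs: (set of objects, recognized action)
frames = [({"housing", "table"}, "grasp"),
          ({"bearing", "housing"}, "insert"),
          ({"screwdriver"}, "screw")]

i = 0
for objects, action in frames:
    print("AR overlay:", WORKFLOW[i].instruction)
    i = advance(i, objects, action)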

Read more
Human Computer Interaction

Interface Features and Users' Well-Being: Measuring the Sensitivity of Users' Well-Being to Resource Constraints and Feature Types

Users increasingly face multiple interface features on the one hand, and constraints on available resources (e.g., time, attention) on the other. Understanding the sensitivity of users' well-being to feature type and resource constraints is critical for informed design. Building on microeconomic theory, and focusing on social information features, we conceptualized users' interface choices as an exchange of resources (e.g., time) in return for access to goods (social information features). We studied how sensitive users' well-being is to feature type, and to features' cost level and cost type. We found that (1) increased cost of feature use leads to decreased well-being, (2) users' well-being is a function of features' cost type, and (3) users' well-being is sensitive to differences in feature type. The approach used here to quantify user well-being derived from interface features offers a basis for asynchronous feature comparison.

Read more
