
Publication


Featured research published by Daniel Afergan.


Human Factors in Computing Systems | 2014

Dynamic difficulty using brain metrics of workload

Daniel Afergan; Evan M. Peck; Erin Treacy Solovey; Andrew Jenkins; Samuel W. Hincks; Eli T. Brown; Remco Chang; Robert J. K. Jacob

Dynamic difficulty adjustments can be used in human-computer systems in order to improve user engagement and performance. In this paper, we use functional near-infrared spectroscopy (fNIRS) to obtain passive brain sensing data and detect extended periods of boredom or overload. From these physiological signals, we can adapt a simulation in order to optimize workload in real-time, which allows the system to better fit the task to the user from moment to moment. To demonstrate this idea, we ran a laboratory study in which participants performed path planning for multiple unmanned aerial vehicles (UAVs) in a simulation. Based on their state, we varied the difficulty of the task by adding or removing UAVs and found that we were able to decrease error by 35% over a baseline condition. Our results show that we can use fNIRS brain sensing to detect task difficulty in real-time and construct an interface that improves user performance through dynamic difficulty adjustment.
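The closed-loop adaptation described above can be sketched as a simple control loop: a workload estimate drives how many UAVs the operator is given. The thresholds, function names, and values below are illustrative assumptions, not taken from the paper.

```python
# Hypothetical sketch of brain-driven dynamic difficulty adjustment:
# a workload estimate in [0, 1] (e.g. from an fNIRS classifier) adds a
# UAV during underload and removes one during overload. Thresholds are
# illustrative, not the study's actual parameters.

def adjust_difficulty(num_uavs, workload, low=0.3, high=0.7,
                      min_uavs=1, max_uavs=8):
    """Add a UAV when the operator seems bored, remove one when overloaded."""
    if workload < low and num_uavs < max_uavs:
        return num_uavs + 1          # underload -> raise difficulty
    if workload > high and num_uavs > min_uavs:
        return num_uavs - 1          # overload -> lower difficulty
    return num_uavs                  # workload already in the target band

# Simulated stream of workload estimates:
uavs = 4
for w in [0.2, 0.25, 0.8, 0.5, 0.9]:
    uavs = adjust_difficulty(uavs, w)
# uavs ends at 4: two additions during underload, two removals during overload
```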


ACM Transactions on Computer-Human Interaction | 2015

Designing Implicit Interfaces for Physiological Computing: Guidelines and Lessons Learned Using fNIRS

Erin Treacy Solovey; Daniel Afergan; Evan M. Peck; Samuel W. Hincks; Robert J. K. Jacob

A growing body of recent work has shown the feasibility of brain and body sensors as input to interactive systems. However, the interaction techniques and design decisions for their effective use are not well defined. We present a conceptual framework for considering implicit input from the brain, along with design principles and patterns we have developed from our work. We also describe a series of controlled, offline studies that lay the foundation for our work with functional near-infrared spectroscopy (fNIRS) neuroimaging, as well as our real-time platform that serves as a testbed for exploring brain-based adaptive interaction techniques. Finally, we present case studies illustrating the principles and patterns for effective use of brain data in human-computer interaction. We focus on signals coming from the brain, but these principles apply broadly to other sensor data and in domains such as aviation, education, medicine, driving, and anything involving multitasking or varying cognitive workload.


IEEE Transactions on Visualization and Computer Graphics | 2016

Improving Bayesian Reasoning: The Effects of Phrasing, Visualization, and Spatial Ability

Alvitta Ottley; Evan M. Peck; Lane Harrison; Daniel Afergan; Caroline Ziemkiewicz; Holly A. Taylor; Paul K. J. Han; Remco Chang

Decades of research have repeatedly shown that people perform poorly at estimating and understanding conditional probabilities that are inherent in Bayesian reasoning problems. Yet in the medical domain, both physicians and patients make daily, life-critical judgments based on conditional probability. Although there have been a number of attempts to develop more effective ways to facilitate Bayesian reasoning, reports of these findings tend to be inconsistent and sometimes even contradictory. For instance, the reported accuracies for individuals being able to correctly estimate conditional probability range from 6% to 62%. In this work, we show that problem representation can significantly affect accuracies. By controlling the amount of information presented to the user, we demonstrate how text and visualization designs can increase overall accuracies to as high as 77%. Additionally, we found that for users with high spatial ability, our designs can further improve their accuracies to as high as 100%. By and large, our findings provide explanations for the inconsistent reports on accuracy in Bayesian reasoning tasks and show a significant improvement over existing methods. We believe that these findings can have immediate impact on risk communication in health-related fields.
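The kind of conditional-probability problem studied in this line of work can be made concrete with Bayes' rule. The numbers below are the classic mammography-style figures often used in this literature, chosen here for illustration rather than drawn from the paper itself.

```python
# Illustrative Bayesian reasoning problem (hypothetical numbers): given
# disease prevalence and test characteristics, what is the probability of
# disease given a positive test?

def posterior(prevalence, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' rule."""
    true_pos = prevalence * sensitivity                    # P(+, disease)
    false_pos = (1 - prevalence) * false_positive_rate     # P(+, no disease)
    return true_pos / (true_pos + false_pos)

# 1% prevalence, 80% sensitivity, 9.6% false-positive rate:
p = posterior(0.01, 0.80, 0.096)
# p is roughly 0.078 -- far lower than most people intuitively estimate,
# which is why problem representation matters so much.
```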


Archive | 2014

Using fNIRS to Measure Mental Workload in the Real World

Evan M. Peck; Daniel Afergan; Beste F. Yuksel; Francine Lalooses; Robert J. K. Jacob

In the past decade, functional near-infrared spectroscopy (fNIRS) has seen increasing use as a non-invasive brain sensing technology. Using optical signals to approximate blood-oxygenation levels in localized regions of the brain, the appeal of the fNIRS signal is that it is relatively robust to movement artifacts and comparable to fMRI measures. We provide an overview of research that builds towards the use of fNIRS to monitor user workload in real world environments, and eventually to act as input to biocybernetic systems. While there are still challenges for the use of fNIRS in real world environments, its unique characteristics make it an appealing alternative for monitoring the cognitive processes of a user.


User Interface Software and Technology | 2014

Brain-based target expansion

Daniel Afergan; Tomoki Shibata; Samuel W. Hincks; Evan M. Peck; Beste F. Yuksel; Remco Chang; Robert J. K. Jacob

The bubble cursor is a promising cursor expansion technique, improving a user's movement time and accuracy in pointing tasks. We introduce a brain-based target expansion system, which improves the efficacy of the bubble cursor by increasing the expansion of high-importance targets at the optimal time, based on brain measurements correlated to a particular type of multitasking. We demonstrate through controlled experiments that brain-based target expansion can deliver a graded and continuous level of assistance to a user according to their cognitive state, thereby improving task and speed-accuracy metrics, even without explicit visual changes to the system. Such an adaptation is ideal for use in complex systems to steer users toward higher priority goals during times of increased demand.
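The graded assistance described above can be sketched as a scaling rule: a target's activation radius grows with both its priority and a workload estimate. The function, scaling factor, and values are illustrative assumptions, not the paper's actual model.

```python
# Hypothetical sketch of graded, brain-based target expansion: the
# activation radius of a target grows with a multitasking-related workload
# estimate in [0, 1], weighted by target priority. The scaling is
# illustrative, not from the paper.

def effective_radius(base_radius, priority, workload, max_boost=1.5):
    """Expand high-priority targets more when measured workload is high."""
    boost = 1.0 + (max_boost - 1.0) * workload * priority
    return base_radius * boost

# A high-priority target under heavy multitasking gets the largest bubble:
r_busy = effective_radius(20.0, priority=1.0, workload=1.0)   # 30.0
# A low-priority target is unaffected, so assistance stays graded:
r_low = effective_radius(20.0, priority=0.0, workload=1.0)    # 20.0
```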


International Conference on Augmented Cognition | 2015

Phylter: A System for Modulating Notifications in Wearables Using Physiological Sensing

Daniel Afergan; Samuel W. Hincks; Tomoki Shibata; Robert J. K. Jacob

As wearable computing becomes more mainstream, it holds the promise of delivering timely, relevant notifications to the user. However, these devices can potentially inundate the user, distracting them at the wrong times and providing the wrong amount of information. As physiological sensing also becomes consumer-grade, it holds the promise of helping to control these notifications. To address this, we built Phylter, a system that uses physiological sensing to modulate notifications to the user. Phylter receives streaming data about a user's cognitive state and uses this to decide whether the user should receive the information. We discuss the components of the system and how they interact.
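The modulation idea can be sketched as a gating decision: deliver a notification only when its importance outweighs the user's current cognitive load. This is an assumed minimal design for illustration, not Phylter's actual implementation.

```python
# Minimal sketch (assumed design, not Phylter's actual logic) of
# physiologically modulated notifications: a notification is delivered
# only when its importance outweighs the user's current cognitive load.

def should_deliver(importance, cognitive_load, threshold=0.0):
    """Deliver iff importance exceeds load by more than `threshold`."""
    return importance - cognitive_load > threshold

# A low-importance alert is held back while the user is heavily loaded,
# while an urgent one still gets through:
held = should_deliver(0.3, 0.8)      # False
delivered = should_deliver(0.9, 0.4) # True
```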


User Interface Software and Technology | 2014

Using brain-computer interfaces for implicit input

Daniel Afergan

Passive brain-computer interfaces, in which implicit input is derived from a user's changing brain activity without conscious effort from the user, may be one of the most promising applications of brain-computer interfaces because they can improve user performance without additional effort on the user's part. I seek to use physiological signals that correlate to particular brain states in order to adapt an interface while the user behaves normally. My research aims to develop strategies for adapting the interface to the user and the user's cognitive state using functional near-infrared spectroscopy (fNIRS), a non-invasive, lightweight brain-sensing technique. While passive brain-computer interfaces are currently being developed and researchers have shown their utility, there has been little effort to develop a framework or hierarchy for adaptation strategies.


User Interface Software and Technology | 2014

Building implicit interfaces for wearable computers with physiological inputs: zero shutter camera and phylter

Tomoki Shibata; Evan M. Peck; Daniel Afergan; Samuel W. Hincks; Beste F. Yuksel; Robert J. K. Jacob

We propose implicit interfaces that use passive physiological input as additional communication channels between wearable devices and wearers. A defining characteristic of physiological input is that it is implicit and continuous, distinguishing it from conventional event-driven action on a keyboard, for example, which is explicit and discrete. By considering the fundamental differences between the two types of inputs, we introduce a core framework to support building implicit interfaces, following three key principles: Subscription, Accumulation, and Interpretation of implicit inputs. Unlike a conventional event-driven system, our framework subscribes to continuous streams of input data, accumulates the data in a buffer, and subsequently attempts to recognize patterns in the accumulated data -- upon request from the application, rather than directly in response to the input events. Finally, to demonstrate the impact of implicit interfaces in the real world, we introduce two prototype applications for Google Glass: Zero Shutter Camera, which triggers a camera snapshot, and Phylter, which filters notifications; both leverage the wearer's physiological state information.
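The three principles named above can be sketched as a minimal pull-based pipeline: the system subscribes to a continuous stream, accumulates samples in a bounded buffer, and interprets them only when the application asks. Class and method names are illustrative assumptions, not the paper's actual API.

```python
# Sketch of the Subscription / Accumulation / Interpretation principles as
# a pull-based pipeline. Names are illustrative, not the framework's API.
from collections import deque

class ImplicitInputBuffer:
    def __init__(self, maxlen=100):
        self.samples = deque(maxlen=maxlen)   # Accumulation: bounded buffer

    def on_sample(self, value):
        """Subscription: called for each sample in the continuous stream."""
        self.samples.append(value)

    def interpret(self):
        """Interpretation: pattern recognition on request, not per event."""
        if not self.samples:
            return None
        return sum(self.samples) / len(self.samples)   # e.g. mean workload

buf = ImplicitInputBuffer()
for v in [1.0, 2.0, 3.0]:
    buf.on_sample(v)
estimate = buf.interpret()   # 2.0 -- the application pulls this when needed
```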


Human Factors in Computing Systems | 2016

Learn Piano with BACh: An Adaptive Learning Interface that Adjusts Task Difficulty Based on Brain State

Beste F. Yuksel; Kurt B. Oleson; Lane Harrison; Evan M. Peck; Daniel Afergan; Remco Chang; Robert J. K. Jacob


Augmented Human International Conference | 2013

Investigation of fNIRS brain sensing as input to information filtering systems

Evan M. Peck; Daniel Afergan; Robert J. K. Jacob

Collaboration


Dive into Daniel Afergan's collaborations.

Top Co-Authors
Lane Harrison

Worcester Polytechnic Institute
