Mishel Johns
Stanford University
Publications
Featured research published by Mishel Johns.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2015
David Miller; Annabel Sun; Mishel Johns; Hillary Page Ive; David Sirkin; Sudipto Aich; Wendy Ju
As vehicle automation proliferates, the current emphasis on preventing driver distraction needs to transition to maintaining driver availability. During automated driving, vehicle operators are likely to use brought-in devices to access entertainment and information. Do these media devices need to be controlled by the vehicle in order to manage driver attention? In a driving simulation study (N=48) investigating driver performance shortly after transitions from automated to human control, we found that participants watching videos or reading on a tablet were far less likely (6% versus 27%) to exhibit behaviors indicative of drowsiness than participants overseeing the automated driving system; irrespective of the pre-driving activity, post-transition driving performance after a five-second structured handoff was not impaired. There was no significant difference in collision avoidance reaction time or minimum headway distance between the supervision and media consumption conditions, irrespective of whether messages were presented on the tablet device or only on the instrument panel, or whether there was a single- or two-stage handoff.
Human-Robot Interaction | 2016
Mishel Johns; Brian K. Mok; David Sirkin; Nikhil Gowda; Catherine Smith; Walter J. Talamonti; Wendy Ju
Automated driving systems that share control with human drivers through haptic feedback on the steering wheel have been shown to have advantages over both fully automated systems and manual driving. Here, we describe an experiment to elicit tacit expectations of behavior from such a system. A gaming steering wheel electronically coupled to the steering wheel in a full-car driving simulator allowed two participants to share control of the vehicle: one participant used the gaming wheel to act as the automated driving agent while the other acted as the car driver. The course provided different information and visuals to the driving agent and the driver to simulate possible automation failures and conflicts between automation and driver. The driving agent was also given prompts that specified a communicative goal at various points along the course. Both participants were interviewed before and after the drive, and vehicle data and drive video were collected. Our results suggest that drivers were able to interpret simple trajectory intentions, such as a lane change, conveyed by the driving agent. However, the driving agent was not able to effectively communicate more nuanced, higher-level ideas such as availability, primarily because the steering wheel was the control mechanism. Torque on the steering wheel without warning was most often interpreted as an automation failure, while gentle, steady steering movements were viewed more favorably.
Human Factors in Computing Systems | 2017
Brian K. Mok; Mishel Johns; David Miller; Wendy Ju
In partially automated driving, rapid transitions of control present a severe hazard. How long does it take a driver to take back control of the vehicle when engaged with non-driving tasks? In this driving simulator study, we examined the performance of participants (N=30) after an abrupt loss of automated vehicle control. We tested three transition time conditions, with an unstructured transition of control occurring 2 s, 5 s, or 8 s before entering a curve. Because participants were occupied with an active secondary task (playing a game on a tablet) while the automated driving mode was enabled, they needed to disengage from the task and regain control of the car when the transition occurred. Few drivers in the 2 s condition were able to safely negotiate the road hazard, while the majority of drivers in the 5 s and 8 s conditions navigated it safely.
Automotive User Interfaces and Interactive Vehicular Applications | 2014
Mishel Johns; Srinath Sibi; Wendy Ju
Current research on cognitive workload in driving focuses on reducing workload to improve driving performance. In this paper, we discuss the cognitive load experienced by drivers of autonomous cars and their performance under varying cognitive loads. Our previous studies indicate that low to no workload is likely to induce drowsiness in drivers of autonomous vehicles. We hypothesize that there is an optimal cognitive load for a driver during autonomous driving that yields the best performance after transfer of control from autonomous to manual driving. We propose an experiment to study the cognitive load on the driver of a simulated autonomous car and its effects on manual driving performance, and we describe our use of biometric devices to obtain physiological measures indicative of cognitive workload.
Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2016
David Miller; Mishel Johns; Brian K. Mok; Nikhil Gowda; David Sirkin; Key Jung Lee; Wendy Ju
Stating that one trusts a system is markedly different from demonstrating that trust. To investigate trust in automation, we introduce the trust fall: a two-stage behavioral test of trust. In the trust fall paradigm, one first learns the capabilities of the system; in the second phase, the ‘fall,’ one’s choices demonstrate trust or distrust. Our first studies using this method suggest the value of measuring behaviors that demonstrate trust, compared with self-reports of trust. Designing interfaces that encourage appropriate trust in automation will be critical for the safe and successful deployment of partially automated vehicles, and this will rely on a solid understanding of whether these interfaces actually inspire trust and encourage supervision.
User Interface Software and Technology | 2017
Brian K. Mok; Mishel Johns; Stephen Yang; Wendy Ju
In this paper, we introduce two transforming steering wheel systems that can augment the user experience in future partially and fully autonomous vehicles. The first is a robotic steering wheel that transforms mechanically, using actuators to move its components into different positions. The second is an LED steering wheel that transforms visually, using LEDs embedded along the rim of the wheel to change colors. Both steering wheel systems contain onboard microcontrollers developed to interface with our driving simulator. The main function of these two systems is to provide emergency warnings to drivers in a variety of safety-critical scenarios, although the design space we propose for these steering wheels also includes use as interactive user interfaces. To evaluate the effectiveness of the emergency alerts, we conducted a driving simulator study examining the performance of participants (N=56) after an abrupt loss of autonomous vehicle control. Drivers who experienced the robotic steering wheel performed significantly better than those who experienced the LED steering wheel. The results of this study suggest that alerts utilizing mechanical movement are more effective than purely visual warnings.
Human-Robot Interaction | 2016
David Sirkin; Nik Martelaro; Hamish Tennent; Mishel Johns; Brian K. Mok; Wendy Ju; Guy Hoffman; Heather Knight; Bilge Mutlu; Leila Takayama
This tutorial is a hands-on introduction to human-centered design topics and practices for human-robot interaction. It is intended for researchers with a variety of backgrounds, particularly those with little or no prior experience in design. In the morning, participants will learn about user needs and needfinding as ways to understand the stakeholders in research outcomes, to guide the selection of participants, and as possible measures of success. We then focus on design sketching, including ways to represent objects, people, and their interactions through storyboards. Design sketching is not intended to be art, but rather a way to develop and build upon ideas with oneself and to quickly communicate them to colleagues. In the afternoon, participants will work with tools and materials and learn techniques for lightweight physical prototyping and improvisation. Participants will build a small paper robot (not actuated) of their own design to practice puppeteering, explore bodily movement, and prototype interactions.
Automotive User Interfaces and Interactive Vehicular Applications | 2016
Amir H. Ghasemi; Mishel Johns; Benjamin Garber; Paul Boehm; Paramsothy Jayakumar; Wendy Ju; R. Brent Gillespie
Haptic shared control is a promising means of combining the best of human manual control with automatic control, keeping the human in the loop while avoiding automation pitfalls. In this study, we consider a situation in which both the human and the automation system recognize an obstacle but choose different avoidance paths. Although the driver and the automation perceive the situation similarly, the commands they issue are incompatible, and their simple sum is most likely dangerous. To resolve this issue, this study explores how roles (i.e., leader and follower) in a haptic collaboration can be negotiated and exchanged between the two partners. Specifically, we test the influence of the timing of cues to promote adoption of leader and follower roles in a shared control task. Preliminary results suggest that haptic feedback can enhance drivability and prevent accidents.
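The hazard of naively summing incompatible steering commands can be illustrated with a minimal sketch. This is a hypothetical example, not the authors' implementation: the function name, the torque values, and the `human_authority` weighting parameter are all assumptions introduced for illustration.

```python
# Hypothetical sketch of role-weighted torque blending in haptic shared
# control. A simple sum of incompatible commands can cancel out; weighting
# by a negotiated leader/follower role keeps the blended command close to
# one partner's intended avoidance path.

def blend_torque(human_torque: float, automation_torque: float,
                 human_authority: float) -> float:
    """Blend steering torques by role weight.

    human_authority: 1.0 when the human leads, 0.0 when the
    automation leads (illustrative parameter, not from the paper).
    """
    if not 0.0 <= human_authority <= 1.0:
        raise ValueError("human_authority must be in [0, 1]")
    return (human_authority * human_torque
            + (1.0 - human_authority) * automation_torque)

# Simple sum: human steers left (-2.0), automation steers right (+2.0).
# The net torque is 0.0, i.e. straight toward the obstacle both tried
# to avoid.
unsafe = -2.0 + 2.0

# A role-weighted blend with the human as leader follows the human's path.
safe = blend_torque(-2.0, 2.0, human_authority=0.9)  # -1.6
```

The role weight here stands in for whatever the haptic cues negotiate over time; in a real shared-control loop it would shift smoothly as the leader/follower roles are exchanged.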
International Conference on Intelligent Transportation Systems | 2015
Brian K. Mok; Mishel Johns; Key Jung Lee; David Miller; David Sirkin; Page Ive; Wendy Ju
IEEE Intelligent Vehicles Symposium | 2015
Brian K. Mok; Mishel Johns; Key Jung Lee; Hillary Page Ive; David Miller; Wendy Ju