
Publications


Featured research published by David Sirkin.


International Conference on Persuasive Technology | 2010

Animate objects: how physical motion encourages public interaction

Wendy Ju; David Sirkin

The primary challenge for information terminals, kiosks, and incidental-use systems of all sorts is getting the “first click” from busy passersby. This paper presents two studies that investigate the role of motion and physicality in drawing people to look at and actively interact with generic information kiosks. The first study used a 2x2 factorial design (physical vs. on-screen gesturing, hand vs. arrow motion) on a kiosk deployed in two locations, a bookstore and a computer science building lobby. The second study examined the effect of physical vs. projected gesturing, and included a follow-up survey. Over twice as many passersby interacted in the physical condition as in the on-screen condition in the first study, and 60% more interacted in the second. Together, these studies indicate that physical gesturing does significantly attract more looks at and use of the information kiosk, and that form affects people’s impressions and interpretations of these gestures.
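The comparison reported above (interaction counts under physical vs. on-screen gesturing) is a count comparison across conditions. The sketch below shows how such data might be tested with a chi-square test of independence; the paper does not state which test it used, and the counts here are hypothetical, not the study's figures.

```python
# Hypothetical example: testing whether interaction rate differs between
# physical and on-screen gesturing. Counts are illustrative only, not the
# figures reported in the paper.
from scipy.stats import chi2_contingency

# rows: condition (physical, on-screen); columns: interacted, passed by
observed = [
    [42, 158],   # physical gesturing: 42 of 200 passersby interacted
    [19, 181],   # on-screen gesturing: 19 of 200 passersby interacted
]

chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}, dof = {dof}")
```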


Proceedings of the Human Factors and Ergonomics Society Annual Meeting | 2015

Distraction Becomes Engagement in Automated Driving

David Miller; Annabel Sun; Mishel Johns; Hillary Page Ive; David Sirkin; Sudipto Aich; Wendy Ju

As vehicle automation proliferates, the current emphasis on preventing driver distraction needs to transition to maintaining driver availability. During automated driving, vehicle operators are likely to use brought-in devices to access entertainment and information. Do these media devices need to be controlled by the vehicle in order to manage driver attention? In a driving simulation study (N=48) investigating driver performance shortly after transitions from automated to human control, we found that participants watching videos or reading on a tablet were far less likely (6% versus 27%) to exhibit behaviors indicative of drowsiness than when overseeing the automated driving system; irrespective of the pre-driving activity, post-transition driving performance after a five-second structured handoff was not impaired. There was no significant difference in collision avoidance reaction time or minimum headway distance between supervision and media consumption conditions, irrespective of whether messages were presented on the tablet device or only on the instrument panel, or whether there was a single or two-stage handoff.


Human-Robot Interaction | 2015

Mechanical Ottoman: How Robotic Furniture Offers and Withdraws Support

David Sirkin; Brian K. Mok; Stephen Yang; Wendy Ju

This paper describes our approach to designing, developing behaviors for, and exploring the use of, a robotic footstool, which we named the mechanical ottoman. By approaching unsuspecting participants and attempting to get them to place their feet on the footstool, and then later attempting to break the engagement and get people to take their feet down, we sought to understand whether and how motion can be used by non-anthropomorphic robots to engage people in joint action. In several embodied design improvisation sessions, we observed a tension between people perceiving the ottoman as a living being, such as a pet, and simultaneously as a functional object, which requests that they place their feet on it—something they would not ordinarily do with a pet. In a follow-up lab study (N=20), we found that most participants did make use of the footstool, although several chose not to place their feet on it for this reason. We also found that participants who rested their feet understood a brief lift and drop movement as a request to withdraw, and formed detailed notions about the footstool’s agenda, ascribing intentions based on its movement alone.


Robot and Human Interactive Communication | 2016

Ghost driver: A field study investigating the interaction between pedestrians and driverless vehicles

Dirk Rothenbücher; Jamy Li; David Sirkin; Brian K. Mok; Wendy Ju

How will pedestrians and bicyclists interact with autonomous vehicles when there is no human driver? In this paper, we outline a novel method for performing observational field experiments to investigate interactions with driverless cars. We provide a proof-of-concept study (N=67), conducted at a crosswalk and a traffic circle, which applies this method. In the study, participants encountered a vehicle that appeared to have no driver, but which in fact was driven by a human confederate hidden inside. We constructed a car seat costume to conceal the driver, who was specially trained to emulate an autonomous system. Data included video recordings and participant responses to post-interaction questionnaires. Pedestrians who encountered the car reported that they saw no driver, yet they managed interactions smoothly, except when the car misbehaved by moving into the crosswalk just as they were about to cross. This method is the first of its kind, and we believe that it contributes a valuable technique for safely acquiring empirical data and insights about driverless vehicle interactions. These insights can then be used to design vehicle behaviors well in advance of the broad deployment of autonomous technology.


IEEE Intelligent Vehicles Symposium | 2016

Monitoring driver cognitive load using functional near infrared spectroscopy in partially autonomous cars

Srinath Sibi; Hasan Ayaz; David P. Kuhns; David Sirkin; Wendy Ju

In partially automated cars, it is vital to understand the driver's state, especially the driver's cognitive load. This might indicate whether the driver is alert or distracted, and whether the car can safely transfer control of driving. In order to better understand the relationship between cognitive load and driver performance in a partially autonomous vehicle, functional near infrared spectroscopy (fNIRS) measures were employed to study the activation of the prefrontal cortex of drivers in a simulated environment. We studied a total of 14 participants while they drove a partially autonomous car and performed common secondary tasks. We observed that when participants were asked to monitor the driving of an autonomous car, they had lower cognitive load than when the same participants were asked to perform a secondary reading or video watching task on a brought-in device. This observation was in line with the increased drowsy behavior observed during intervals of autonomous system monitoring in previous studies. Results demonstrate that fNIRS signals from the prefrontal cortex indicate additional cognitive load during manual driving compared to autonomous driving. Such brain function metrics could be used with minimally intrusive and low-cost sensors to enable real-time assessment of driver state in future autonomous vehicles, improving the safety and efficacy of transfer of control.
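fNIRS systems typically infer changes in oxygenated and deoxygenated hemoglobin from light-attenuation changes at two wavelengths via the modified Beer-Lambert law. The sketch below illustrates that conversion; the paper does not describe its processing pipeline, so the coefficients, pathlength factors, and measurement values here are illustrative assumptions, not the study's parameters.

```python
# Minimal sketch of the modified Beer-Lambert law used in fNIRS processing.
# All numbers below are placeholders for illustration; real analyses use
# tabulated extinction coefficients and calibrated pathlength factors.
import numpy as np

# Change in optical density at two wavelengths (e.g., ~760 nm and ~850 nm)
delta_od = np.array([0.012, 0.009])          # dimensionless, illustrative

# Extinction coefficients [HbO, HbR] per wavelength (illustrative values)
extinction = np.array([
    [0.15, 0.39],   # wavelength 1: more sensitive to HbR
    [0.25, 0.18],   # wavelength 2: more sensitive to HbO
])

source_detector_distance = 30.0              # mm, illustrative
dpf = np.array([6.0, 5.5])                   # differential pathlength factors

# Modified Beer-Lambert law, per wavelength i:
#   delta_od[i] = (extinction[i] @ delta_c) * source_detector_distance * dpf[i]
# Solve the 2x2 linear system for concentration changes [dHbO, dHbR].
effective_path = source_detector_distance * dpf
delta_c = np.linalg.solve(extinction, delta_od / effective_path)

print(f"dHbO = {delta_c[0]:.6f}, dHbR = {delta_c[1]:.6f} (arbitrary units)")
```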


Robot and Human Interactive Communication | 2015

Experiences developing socially acceptable interactions for a robotic trash barrel

Stephen Yang; Brian K. Mok; David Sirkin; Hillary Page Ive; Rohan Maheshwari; Kerstin Fischer; Wendy Ju

Service robots in public places need to both understand environmental cues and move in ways that people can understand and predict. We developed and tested interactions with a trash barrel robot to better understand the implicit protocols for public interaction. In eight lunch-time sessions spread across two crowded campus dining destinations, we experimented with piloting our robot in Wizard of Oz fashion, initiating and responding to requests for impromptu interactions centered on collecting people's trash. Our studies progressed from open-ended experimentation to testing specific interaction strategies that seemed to evoke clear engagement and responses, both positive and negative. Observations and interviews show that a) people most welcome the robot's presence when they need its services and it actively advertises its intent through movement; b) people create mental models of the trash barrel as having intentions and desires; c) mistakes in navigation are read as indicators of autonomous control rather than of a remote operator; and d) repeated mistakes and struggling behavior polarized responses, from ignoring the robot to finding it endearing.


Human-Robot Interaction | 2016

Exploring Shared Control in Automated Driving

Mishel Johns; Brian K. Mok; David Sirkin; Nikhil Gowda; Catherine Smith; Walter J. Talamonti; Wendy Ju

Automated driving systems that share control with human drivers by using haptic feedback through the steering wheel have been shown to have advantages over fully automated systems and manual driving. Here, we describe an experiment to elicit tacit expectations of behavior from such a system. A gaming steering wheel electronically coupled to the steering wheel in a full-car driving simulator allows two participants to share control of the vehicle. One participant was asked to use the gaming wheel to act as the automated driving agent while another participant acted as the car driver. The course provided different information and visuals to the driving agent and the driver to simulate possible automation failures and conflict situations between automation and the driver. The driving agent was also given prompts that specified a communicative goal at various points along the course. Both participants were interviewed before and after the drive, and vehicle data and drive video were collected. Our results suggest that drivers were able to interpret simple trajectory intentions, such as a lane change, conveyed by the driving agent. However, the driving agent was not able to effectively communicate more nuanced, higher level ideas such as availability, primarily due to the steering wheel being the control mechanism. Torque on the steering wheel without warning was seen most often as a failure of automation. Gentle and steady steering movements were viewed more favorably.
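Haptic shared control of the kind studied here is commonly implemented by adding a guidance torque to the driver's own input on the steering column. The sketch below shows one generic scheme (proportional guidance toward a reference steering angle, scaled by an authority gain); it is an illustration under assumed gains and names, not the coupled-wheel setup used in this experiment.

```python
# Minimal sketch of haptic shared control on a steering wheel: the automation
# adds a guidance torque pulling the wheel toward its reference angle, scaled
# by an authority gain. Generic illustration; gains and names are assumptions.

def shared_control_torque(driver_torque: float,
                          wheel_angle: float,
                          reference_angle: float,
                          stiffness: float = 2.0,
                          authority: float = 0.5) -> float:
    """Return the net torque applied to the steering column (N*m)."""
    guidance_torque = -stiffness * (wheel_angle - reference_angle)
    return driver_torque + authority * guidance_torque

# Example: the driver holds the wheel straight while the automation nudges
# toward a gentle lane change at 0.2 rad.
net = shared_control_torque(driver_torque=0.0,
                            wheel_angle=0.0,
                            reference_angle=0.2)
print(f"net torque on column: {net:.2f} N*m")
```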


Human-Robot Interaction | 2015

Adventures of an Adolescent Trash Barrel

Stephen Yang; Brian K. Mok; David Sirkin; Wendy Ju

Our demonstration presents the roving trash barrel, a robot that we developed to understand how people perceive and respond to a mobile trashcan that offers its service in public settings. In a field study, we found that considerable coordination is involved in actively collecting trash, including capturing someone's attention, signaling an intention to interact, acknowledging the willingness--or implicit signs of unwillingness--to interact, and closing the interaction. In post-interaction interviews, we discovered that people believed the robot was intrinsically motivated to collect trash, and attributed social mishaps to higher levels of autonomy.


Automotive User Interfaces and Interactive Vehicular Applications | 2015

Ghost driver: a platform for investigating interactions between pedestrians and driverless vehicles

Dirk Rothenbücher; Jamy Li; David Sirkin; Brian K. Mok; Wendy Ju

How will pedestrians and cyclists interact with self-driving cars when there is no human driver? To find answers to this question, we need a safe experimental design in which pedestrians can interact with a car that appears to drive on its own. In Ghost Driver, we staged a fake autonomous car by installing LIDARs, cameras, and decals on the outside of the vehicle and by covering the driver with a seat costume, so that it appeared there was no driver in the car. In initial field studies we found that this Wizard-of-Oz technique convinced more than 80% of participants that the car was driving autonomously without a driver. Consequently, the Ghost Driver methodology could become a platform for further investigation of how pedestrians or cyclists interact with driverless vehicles.


Automotive User Interfaces and Interactive Vehicular Applications | 2015

DAZE: a real-time situation awareness measurement tool for driving

Nikolas Martelaro; David Sirkin; Wendy Ju

A driver's situation awareness (SA) while on the road is a critical factor in his or her ability to make decisions to avoid hazards, plan routes, and maintain safe travel. Understanding SA can therefore be a great help when designing or evaluating new interfaces intended to support these decisions. However, existing tools for measuring SA in simulated environments require halting the in-progress simulation to question participants, or administering post-drive questionnaires. Through our experience developing interfaces for autonomous vehicles in both simulated and on-road environments, we have found a need for real-time SA measurement designed specifically for both simulation and on-road driving scenarios. We are developing a tool, inspired by the Waze™ driving app, to measure SA through real-time on-road event questions. The system has been tested in the lab and will be further evaluated against current SA measurement tools.
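The tool delivers SA probes tied to real-time on-road events. Since the abstract does not describe its implementation, the sketch below is a purely hypothetical illustration of how event-triggered questions and response logging might be structured; the class and field names are assumptions, not the authors' design.

```python
# Hypothetical sketch of an event-triggered situation-awareness probe, in the
# spirit of real-time tools like DAZE. Structure and names are assumptions,
# not the authors' implementation.
import time
from dataclasses import dataclass, field

@dataclass
class SAProbe:
    event_id: str                 # e.g., "construction_zone"
    question: str                 # shown to the driver when the event fires
    correct_answer: str
    responses: list = field(default_factory=list)

    def trigger(self, get_driver_response) -> None:
        """Present the question and log the answer, correctness, and latency."""
        asked_at = time.monotonic()
        answer = get_driver_response(self.question)
        self.responses.append({
            "event": self.event_id,
            "answer": answer,
            "correct": answer == self.correct_answer,
            "latency_s": time.monotonic() - asked_at,
        })

# Example with a canned response function standing in for the driver-facing UI.
probe = SAProbe("construction_zone", "Is there a work zone ahead?", "yes")
probe.trigger(lambda q: "yes")
print(probe.responses[0])
```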
