Stephen Yang
Stanford University
Publications
Featured research published by Stephen Yang.
ACM Transactions on Computer Systems | 2015
John K. Ousterhout; Arjun Gopalan; Ashish Gupta; Ankita Kejriwal; Collin Lee; B. Montazeri; Diego Ongaro; Seo Jin Park; Henry Qin; Mendel Rosenblum; Stephen M. Rumble; Ryan Stutsman; Stephen Yang
RAMCloud is a storage system that provides low-latency access to large-scale datasets. To achieve low latency, RAMCloud stores all data in DRAM at all times. To support large capacities (1 PB or more), it aggregates the memories of thousands of servers into a single coherent key-value store. RAMCloud ensures the durability of DRAM-based data by keeping backup copies on secondary storage. It uses a uniform log-structured mechanism to manage both DRAM and secondary storage, which results in high performance and efficient memory usage. RAMCloud uses a polling-based approach to communication, bypassing the kernel to communicate directly with NICs; with this approach, client applications can read small objects from any RAMCloud storage server in less than 5 μs, and durable writes of small objects take about 13.5 μs. RAMCloud does not keep multiple copies of data online; instead, it provides high availability by recovering from crashes very quickly (1 to 2 seconds). RAMCloud's crash recovery mechanism harnesses the resources of the entire cluster working concurrently, so that recovery performance scales with cluster size.
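The sub-10-microsecond figures above hinge on the polling, kernel-bypass dispatch the abstract mentions. The C++ sketch below illustrates that structure only; NicRxQueue, poll, and handleRpc are hypothetical placeholders for illustration, not RAMCloud's actual transport API.

    // Minimal sketch of polling-based, kernel-bypass dispatch.
    // All names here are hypothetical; RAMCloud's real transport differs.
    #include <cstdint>
    #include <cstddef>
    #include <cstdio>

    struct Packet { const uint8_t* data; size_t length; };

    struct NicRxQueue {
        // In a real kernel-bypass driver this would scan a DMA descriptor
        // ring mapped into userspace; stubbed here so the sketch compiles.
        bool poll(Packet* pkt) { (void)pkt; return false; }
    };

    void handleRpc(const Packet& pkt) {
        // Parse the request and service it from the in-memory log.
        std::printf("rpc of %zu bytes\n", pkt.length);
    }

    void dispatchLoop(NicRxQueue& rx) {
        Packet pkt;
        for (;;) {
            // Busy-poll rather than block: an interrupt plus kernel wakeup
            // costs microseconds, which would dominate the ~5 us reads the
            // paper reports.
            if (rx.poll(&pkt)) handleRpc(pkt);
        }
    }

    int main() {
        NicRxQueue rx;
        dispatchLoop(rx);  // spins forever in this stub
    }

The trade-off in this style is that one core spins at full utilization even when idle, in exchange for removing interrupt and kernel-wakeup latency from every request.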
human-robot interaction | 2015
David Sirkin; Brian K. Mok; Stephen Yang; Wendy Ju
This paper describes our approach to designing, developing behaviors for, and exploring the use of, a robotic footstool, which we named the mechanical ottoman. By approaching unsuspecting participants and attempting to get them to place their feet on the footstool, and then later attempting to break the engagement and get people to take their feet down, we sought to understand whether and how motion can be used by non-anthropomorphic robots to engage people in joint action. In several embodied design improvisation sessions, we observed a tension between people perceiving the ottoman as a living being, such as a pet, and simultaneously as a functional object, which requests that they place their feet on it—something they would not ordinarily do with a pet. In a follow-up lab study (N=20), we found that most participants did make use of the footstool, although several chose not to place their feet on it for this reason. We also found that participants who rested their feet understood a brief lift and drop movement as a request to withdraw, and formed detailed notions about the footstool’s agenda, ascribing intentions based on its movement alone.
robot and human interactive communication | 2015
Stephen Yang; Brian K. Mok; David Sirkin; Hillary Page Ive; Rohan Maheshwari; Kerstin Fischer; Wendy Ju
Service robots in public places need to both understand environmental cues and move in ways that people can understand and predict. We developed and tested interactions with a trash barrel robot to better understand the implicit protocols for public interaction. In eight lunch-time sessions spread across two crowded campus dining destinations, we experimented with piloting our robot in Wizard of Oz fashion, initiating and responding to requests for impromptu interactions centered on collecting people's trash. Our studies progressed from open-ended experimentation to testing specific interaction strategies that seemed to evoke clear engagement and responses, both positive and negative. Observations and interviews show that a) people most welcome the robot's presence when they need its services and it actively advertises its intent through movement; b) people create mental models of the trash barrel as having intentions and desires; c) mistakes in navigation are read as indicators of autonomous control rather than of a remote operator; and d) repeated mistakes and struggling behavior polarized responses, with people either ignoring the robot or finding it endearing.
human-robot interaction | 2015
Stephen Yang; Brian K. Mok; David Sirkin; Wendy Ju
Our demonstration presents the roving trash barrel, a robot that we developed to understand how people perceive and respond to a mobile trashcan that offers its service in public settings. In a field study, we found that considerable coordination is involved in actively collecting trash, including capturing someone's attention, signaling an intention to interact, acknowledging the willingness--or implicit signs of unwillingness--to interact, and closing the interaction. In post-interaction interviews, we discovered that people believed that the robot was intrinsically motivated to collect trash, and attributed social mishaps to higher levels of autonomy.
human-robot interaction | 2014
Brian K. Mok; Stephen Yang; David Sirkin; Wendy Ju
The role of human-robot interaction is becoming more important as everyday robotic devices begin to permeate our lives. In this study, we video-prototyped a user's interactions with a set of robotic drawers. The user and robot each displayed one of five emotional states: angry, happy, indifferent, sad, or timid. The results of our study indicated that the participants of our online questionnaire preferred empathetic drawers to neutral ones. They disliked robotic drawers that displayed emotions orthogonal to the user's emotions. This showed the importance of displaying emotions, and empathy in particular, when designing robotic devices that share our living and working spaces.
Archive | 2016
David Sirkin; Brian K. Mok; Stephen Yang; Rohan Maheshwari; Wendy Ju
Over the last 2 years, we have been following an improvisational approach to physical interaction design research. It emphasizes the use of exploratory lab and field experiments as a way to (a) source novel ideas about how people might interact with expressive objects such as robots and active spaces, (b) appraise the performance of our prototypes of these technologies, and (c) build frameworks to understand users’ mental models and develop new insights into interaction. We have focused, in particular, on staging environments—whether in public settings or recreated in our workspace—where we can provoke discussion about what behaviors and emotions would be desirable or natural. This paper describes how we design and run experiments to evaluate how people interact with expressive robots built from everyday objects, including a mechanical ottoman, emotive dresser drawers and roving trash barrel.
user interface software and technology | 2017
Brian K. Mok; Mishel Johns; Stephen Yang; Wendy Ju
In this paper, we introduce two different transforming steering wheel systems that can be used to augment the user experience in future partially and fully autonomous vehicles. The first is a robotic steering wheel that transforms mechanically, using its actuators to move its various components into different positions. The second is an LED steering wheel that transforms visually, using LEDs embedded along the rim of the wheel to change colors. Both steering wheel systems contain onboard microcontrollers developed to interface with our driving simulator. The main function of these two systems is to provide emergency warnings to drivers in a variety of safety-critical scenarios, although the design space that we propose for these steering wheel systems also includes their use as interactive user interfaces. To evaluate the effectiveness of the emergency alerts, we conducted a driving simulator study examining the performance of participants (N=56) after an abrupt loss of autonomous vehicle control. Drivers who experienced the robotic steering wheel performed significantly better than those who experienced the LED steering wheel. The results of this study suggest that alerts utilizing mechanical movement are more effective than purely visual warnings.
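The abstract does not give firmware details, but an Arduino-style C++ sketch conveys how an LED rim warning of this kind could be driven. Everything below is assumed for illustration: a NeoPixel-style addressable strip, the pin and pixel count, and a one-byte serial protocol ('A' for alert, 'N' for normal) from the simulator PC.

    // Hypothetical sketch of an LED steering wheel emergency alert.
    // The hardware and serial protocol are assumptions, not the paper's design.
    #include <Adafruit_NeoPixel.h>

    const int RIM_PIN = 6;      // data pin for the LED strip (assumed)
    const int RIM_PIXELS = 24;  // LEDs along the rim (assumed)

    Adafruit_NeoPixel rim(RIM_PIXELS, RIM_PIN, NEO_GRB + NEO_KHZ800);
    bool alerting = false;

    void setup() {
      Serial.begin(115200);     // link to the driving simulator PC
      rim.begin();
      rim.show();               // start with all LEDs off
    }

    void loop() {
      if (Serial.available()) { // one-byte commands from the simulator
        char cmd = Serial.read();
        if (cmd == 'A') alerting = true;
        if (cmd == 'N') alerting = false;
      }
      // Flash the whole rim red at roughly 2 Hz while an alert is active.
      bool on = alerting && (millis() / 250) % 2 == 0;
      for (int i = 0; i < RIM_PIXELS; i++) {
        rim.setPixelColor(i, on ? rim.Color(255, 0, 0) : 0);
      }
      rim.show();
    }

A purely visual alert like this is exactly what the study compared against mechanical movement, which proved the more effective warning channel.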
human-robot interaction | 2015
David Sirkin; Brian K. Mok; Stephen Yang; Wendy Ju
This video introduces a robotic footstool--the mechanical ottoman--which explores how non-humanlike robots can coordinate joint action. It approaches seated people and offers to support their feet, then attempts to take leave during the interaction.
human-robot interaction | 2015
David Sirkin; Brian K. Mok; Stephen Yang; Wendy Ju
This demonstration presents a robotic footstool--the mechanical ottoman--which approaches seated people and offers to support their feet, or can alternatively serve as a seat or side table, then bids to take leave once engaged in the interaction.
human-robot interaction | 2015
Brian K. Mok; Stephen Yang; David Sirkin; Wendy Ju
In this video, we explored how everyday household robots should behave when performing collaborative tasks with human users. We ran a Wizard of Oz study (N=20) that utilized a set of robotic drawers. The participants were asked to assemble a cube by working together with the drawers, which contained the tools needed to accomplish the task. We conducted a between-subjects test, varying two factors (expressivity and proactivity) to yield a 2x2 factorial design.