Ayse Kucukyilmaz
Koç University
Publications
Featured research published by Ayse Kucukyilmaz.
The International Journal of Robotics Research | 2012
Alexander Mörtl; Martin Lawitzky; Ayse Kucukyilmaz; Metin Sezgin; Cagatay Basdogan; Sandra Hirche
As recent achievements in robotics research soften the strict separation between human and robot workspaces, close interaction between humans and robots is rapidly coming within reach. In this context, physical human–robot interaction raises a number of questions regarding desirable, intuitive robot behavior. The continuous bilateral exchange of information and energy requires appropriate continuous robot feedback. For a cooperative manipulation task, the desired behavior is a combination of an urge to fulfill the task, a smooth instantaneous reactive behavior to human force inputs, and an assignment of the task effort to the cooperating agents. In this paper, a formal analysis of human–robot cooperative load transport is presented, and three different possibilities for assigning the task effort are proposed. Two proposed dynamic role exchange mechanisms adjust the robot's urge to complete the task based on human feedback; for comparison, a static role allocation strategy that does not rely on human agreement feedback is investigated as well. All three role allocation mechanisms are evaluated in a user study involving large-scale kinesthetic interaction and full-body human motion. The results reveal trade-offs between subjective and objective performance measures and show a clear objective advantage for the proposed dynamic role allocation scheme.
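As an illustration of the dynamic role-exchange idea, the following minimal sketch adapts the robot's share of the task effort from the human's force feedback. The scalar share alpha, the dot-product agreement measure, and the gain K_ALPHA are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

# A minimal sketch of a dynamic role-exchange update: the robot's
# share of the task effort, alpha, is adapted from the human's force
# feedback. The adaptation law, gain, and agreement measure below are
# assumptions, not the paper's exact formulation.

K_ALPHA = 0.5    # adaptation gain (assumed)
DT = 0.01        # control period in seconds (assumed)

def update_effort_share(alpha, f_human, v_desired):
    """Raise the robot's urge when the human's force agrees with the
    desired task direction; back off when it opposes it."""
    direction = v_desired / (np.linalg.norm(v_desired) + 1e-9)
    agreement = float(np.dot(f_human, direction))  # > 0: human assists
    alpha += K_ALPHA * DT * agreement
    return float(np.clip(alpha, 0.0, 1.0))

def robot_force(alpha, f_task_nominal):
    """The robot contributes a fraction alpha of the nominal task
    force; the remainder is left to the human partner."""
    return alpha * f_task_nominal
```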
IEEE Haptics Symposium | 2010
S. Ozgur Oguz; Ayse Kucukyilmaz; Tevfik Metin Sezgin; Cagatay Basdogan
We investigate how collaborative guidance can be realized in multimodal virtual environments for dynamic tasks involving motor control. Haptic guidance in our context is any form of force/tactile feedback that the computer generates to help a user execute a task faster, more accurately, and in a subjectively more pleasing fashion. In particular, we are interested in determining guidance mechanisms that best facilitate task performance and evoke a natural sense of collaboration. We suggest that a haptic guidance system can be further improved if it is supplemented with a role exchange mechanism, which allows the computer to adjust the forces it applies to the user in response to his/her actions. Recent work on collaboration and role exchange has presented new perspectives on defining roles and interaction; however, existing approaches mainly focus on relatively basic environments whose state can be defined with a few parameters. We designed and implemented a complex, highly dynamic multimodal game for testing our interaction model. Since the state space of our application is complex, role exchange needs to be implemented carefully. We define a novel negotiation process that facilitates dynamic communication between the user and the computer and realizes the exchange of roles using a three-state finite state machine. Our preliminary results indicate that even though the negotiation and role exchange mechanism we adopted does not improve performance on every evaluation criterion, it introduces a more personal and human-like interaction model.
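The abstract mentions a three-state finite state machine that realizes the exchange of roles. A minimal sketch of such a machine follows; the state names and the force-based transition thresholds are assumptions, since the abstract does not specify them.

```python
from enum import Enum, auto

# A sketch of a three-state machine for negotiated role exchange.
# State names and trigger conditions are assumed for illustration.

class Role(Enum):
    USER_IN_CONTROL = auto()
    NEGOTIATION = auto()
    COMPUTER_IN_CONTROL = auto()

REQUEST_THRESHOLD = 2.0  # N, force level signalling a request (assumed)
YIELD_THRESHOLD = 0.5    # N, force level signalling consent (assumed)

def step(state, user_force, computer_force):
    """Advance the role-exchange FSM by one control tick, given the
    force magnitudes currently applied by each party."""
    if state is Role.USER_IN_CONTROL:
        if computer_force > REQUEST_THRESHOLD:  # computer requests control
            return Role.NEGOTIATION
    elif state is Role.NEGOTIATION:
        if user_force < YIELD_THRESHOLD:        # user consents by relaxing
            return Role.COMPUTER_IN_CONTROL
        if user_force > REQUEST_THRESHOLD:      # user insists, keeps control
            return Role.USER_IN_CONTROL
    elif state is Role.COMPUTER_IN_CONTROL:
        if user_force > REQUEST_THRESHOLD:      # user requests control back
            return Role.NEGOTIATION
    return state
```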
IEEE Transactions on Haptics | 2013
Ayse Kucukyilmaz; Tevfik Metin Sezgin; Cagatay Basdogan
In human-computer collaboration involving haptics, a key open issue is establishing intuitive communication between the partners. Even though computers are widely used to aid human operators in teleoperation, guidance, and training, their ability to improve efficiency and effectiveness in dynamic tasks is limited because they lack the adaptability, versatility, and awareness of a human. We suggest that communication between a human and a computer can be improved if it involves a decision-making process in which the computer is programmed to infer the intentions of the human operator and dynamically adjust the control levels of the interacting parties to facilitate a more intuitive interaction setup. In this paper, we investigate the utility of such a dynamic role exchange mechanism, where partners negotiate through the haptic channel to trade their control levels on a collaborative task. We examine energy consumption, the work done on the manipulated object, and the joint efficiency of the partners in addition to task performance. We show that, compared to an equal-control condition, the role exchange mechanism improves task performance and the partners' joint efficiency. We also show that augmenting the system with additional informative visual and vibrotactile cues, which display the state of the interaction, allows users to become aware of the underlying role exchange mechanism and exploit it in favor of the task. These cues also improve the users' sense of interaction and reinforce their belief that the computer aids in the execution of the task.
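The efficiency measures discussed above can be made concrete with a short sketch. The discrete integrals below are plain Riemann sums over sampled force and velocity signals; the paper's exact definitions of energy, work, and joint efficiency may differ.

```python
import numpy as np

# Sketch of trial-level efficiency measures of the kind the paper
# reports, computed from sampled (T, 3) force and velocity arrays.

def work_done(forces, velocities, dt):
    """Net work an agent does on the object: integral of F . v dt."""
    return float(np.sum(np.einsum('ij,ij->i', forces, velocities)) * dt)

def energy_consumed(forces, velocities, dt):
    """Energy spent regardless of direction: integral of |F . v| dt."""
    return float(np.sum(np.abs(np.einsum('ij,ij->i', forces, velocities))) * dt)

def joint_efficiency(work_on_object, energy_human, energy_computer):
    """Share of the total expended energy that ends up as useful work
    on the object (one plausible definition, assumed here)."""
    return work_on_object / (energy_human + energy_computer + 1e-9)
```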
IEEE Transactions on Haptics | 2015
Cigil Ece Madan; Ayse Kucukyilmaz; Tevfik Metin Sezgin; Cagatay Basdogan
The development of robots that can physically cooperate with humans has attracted interest over the last decades. This effort requires a deep understanding of the intrinsic properties of interaction. Up to now, many researchers have focused on inferring human intent in terms of intermediate or terminal goals in physical tasks. However, to work side by side with people, an autonomous robot additionally needs in-depth information about the haptic interaction patterns that are typically encountered during human-human cooperation. To our knowledge, no study has yet focused on characterizing such detailed information; in this sense, this work pioneers the effort to gain a deeper understanding of interaction patterns involving two or more humans in a physical task. We present a labeled human-human interaction dataset, which captures the interaction of two humans who collaboratively transport an object in a haptics-enabled virtual environment. Based on what we learned from studying this dataset, we propose that the actions of cooperating partners can be examined under three interaction types: in any cooperative task, the interacting humans either 1) work in harmony, 2) cope with conflicts, or 3) remain passive during interaction. In line with this conception, we present a taxonomy of human interaction patterns and propose five feature sets, comprising force-, velocity-, and power-related information, for classifying these patterns. Our evaluation shows that a multi-class support vector machine (SVM) classifier achieves a correct classification rate of 86 percent for identifying the interaction patterns, an accuracy obtained by fusing the most informative features as selected by the minimum redundancy maximum relevance (mRMR) feature selection method.
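A sketch of the classification pipeline described above, assuming feature vectors have already been extracted from the force, velocity, and power signals. The greedy mRMR criterion and the RBF-kernel SVM below follow the standard formulations; they are illustrative, not the paper's exact implementation.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import mutual_info_classif, mutual_info_regression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# X: (n_samples, n_features) features from force/velocity/power
# signals; y: interaction-pattern labels (harmony/conflict/passive).

def mrmr_select(X, y, k):
    """Greedy minimum-redundancy maximum-relevance selection."""
    relevance = mutual_info_classif(X, y)       # MI(feature; class)
    selected = [int(np.argmax(relevance))]
    while len(selected) < min(k, X.shape[1]):
        best, best_score = None, -np.inf
        for j in range(X.shape[1]):
            if j in selected:
                continue
            # average MI between the candidate and chosen features
            redundancy = np.mean([
                mutual_info_regression(X[:, [j]], X[:, s])[0]
                for s in selected])
            score = relevance[j] - redundancy
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

def fit_classifier(X, y, k=10):
    cols = mrmr_select(X, y, k)
    clf = make_pipeline(StandardScaler(), SVC(kernel='rbf'))
    clf.fit(X[:, cols], y)                      # multi-class by default
    return clf, cols
```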
IEEE Transactions on Haptics | 2012
Salih Ozgur Oguz; Ayse Kucukyilmaz; Tevfik Metin Sezgin; Cagatay Basdogan
An active research goal in human-computer interaction is to allow humans to communicate with computers in an intuitive and natural fashion, especially in real-life interaction scenarios. One advocated approach is to build computer systems with human-like qualities and capabilities. In this paper, we present insight into how human-computer interaction can be enriched by endowing computers with behavioral patterns that naturally appear in human-human negotiation. For this purpose, we introduce a two-party negotiation game built specifically for studying the effectiveness of haptic and audio-visual cues in conveying negotiation-related behaviors. The game is centered on a real-time, continuous, two-party negotiation scenario grounded in the existing game theory and negotiation literature. During the game, humans face a computer opponent that can display different behaviors, such as concession, competition, and negotiation. Through a user study, we show that behaviors associated with human negotiation can be incorporated into human-computer interaction, and that the addition of haptic cues yields a statistically significant increase in the human recognition accuracy of machine-displayed behaviors. Beyond conveying these negotiation-related behaviors, we also report on game-theoretical aspects of the overall interaction experience. In particular, we show that, as reported in the game theory literature, certain negotiation strategies such as tit-for-tat can generate maximum combined utility for the negotiating parties, providing an excellent balance between the energy spent by the user and the combined utility of the negotiating parties.
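The tit-for-tat strategy mentioned in the abstract can be illustrated with a toy repeated negotiation over a unit pie; the demands, the opening concession, and the agreement condition below are illustrative assumptions, since the paper's game is continuous, real-time, and haptic.

```python
# Toy tit-for-tat: each side concedes exactly as much as its opponent
# conceded in the previous round. All numbers are assumed.

OPENING_CONCESSION = 0.05  # first-round concession (assumed)

def tit_for_tat(demand, opponent_concession):
    """Lower my demand by whatever the opponent conceded last round."""
    step = (opponent_concession if opponent_concession is not None
            else OPENING_CONCESSION)
    return max(0.0, demand - step)

# Two tit-for-tat agents start with incompatible demands on a unit pie.
a_demand, b_demand = 0.9, 0.9
a_conceded = b_conceded = None
for round_no in range(1, 20):
    new_a = tit_for_tat(a_demand, b_conceded)
    new_b = tit_for_tat(b_demand, a_conceded)
    a_conceded, b_conceded = a_demand - new_a, b_demand - new_b
    a_demand, b_demand = new_a, new_b
    if a_demand + b_demand <= 1.0:  # demands compatible: agreement
        print(f"agreement in round {round_no}: "
              f"{a_demand:.2f} / {b_demand:.2f}")
        break
```

Run as written, the two agents mirror each other's 0.05 concessions and split the pie evenly in round 8, which is the balanced outcome the abstract alludes to.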
Robot and Human Interactive Communication | 2015
Ayse Kucukyilmaz; Yiannis Demiris
An emerging research problem in assistive robotics is the design of methodologies that allow robots to provide human-like assistance to users. Within the rehabilitation domain in particular, a grand challenge is to program a robot to mimic the operation of an occupational therapist, intervening with the user when necessary so as to improve the therapeutic power of the assistive robotic system. We propose a method to estimate assistance policies from expert demonstrations in order to provide human-like intervention during navigation in a powered wheelchair setup. For this purpose, we constructed a setting where a human offers assistance to the user over a haptic shared control system. The robot learns from the human's assistance demonstrations while the user actively drives the wheelchair in an unconstrained environment. We train a Gaussian process regression model to learn assistance commands given the past and current actions of the user and the state of the environment. The results indicate that the model can estimate human assistance after only a single demonstration, i.e., in one shot, so that the robot can help the user by selecting the appropriate assistance in a human-like fashion.
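A minimal sketch of the learning step described above: a Gaussian process maps the environment state together with past and current user actions to assistance commands, trained from a single demonstration. The feature layout, history length, and kernel choice are assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def fit_assistance_model(states, user_actions, assist_commands, history=5):
    """Fit a GP from one demonstration.
    states: (T, d_s); user_actions: (T, d_u); assist_commands: (T, d_a)."""
    X, Y = [], []
    for t in range(history, len(states)):
        past = user_actions[t - history:t + 1].ravel()  # past + current
        X.append(np.concatenate([states[t], past]))
        Y.append(assist_commands[t])
    kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=1e-2)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(np.asarray(X), np.asarray(Y))
    return gp, history

def assist(gp, history, states, user_actions, t):
    """Predict the assistive command for the current time step."""
    past = user_actions[t - history:t + 1].ravel()
    x = np.concatenate([states[t], past])[None, :]
    return gp.predict(x)[0]
```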
Immersive Multimodal Interactive Presence | 2012
Ayse Kucukyilmaz; Salih Ozgur Oguz; Tevfik Metin Sezgin; Cagatay Basdogan
Even though in many systems computers have been programmed to share control with human operators in order to increase task performance, the interaction in such systems is still artificial compared to natural human-human cooperation. In complex tasks, cooperating human partners may have their own agendas and take initiatives during the task. Such initiatives contribute to a richer interaction between the cooperating parties, yet little research exists on how this can be established between a human and a computer. In a cooperation involving haptics, the coupling between the human and the computer should be defined so that the computer can understand the intentions of the human operator and respond accordingly. We believe this will make haptic interactions between the human and the computer more natural and human-like. In this regard, we suggest (1) a role exchange mechanism that is activated based on the magnitude of the force applied by the cooperating parties, and (2) a negotiation model that enables more human-like coupling between them. We argue that, when presented through the haptic channel, the proposed role exchange mechanism and negotiation model let the cooperating parties communicate dynamically, naturally, and seamlessly, in addition to improving the task efficiency of the user. In this chapter, we explore how human-computer cooperation can be improved using a role exchange mechanism and a haptic negotiation framework. We also discuss the use of haptic negotiation in assigning different behaviors to the computer, and the effectiveness of visual and haptic cues in conveying negotiation-related complex affective states. Throughout this chapter, we adopt a broad terminology and refer to cooperative systems in which both parties take some part in control as shared control schemes, where the term "control" merely addresses the partners' manipulation capacities on the task.
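Point (1) above, a role exchange activated by force magnitude, can be sketched as a thresholded control-share update with hysteresis; the thresholds, the step size, and the linear blending of commands are illustrative assumptions.

```python
import numpy as np

# Force-magnitude-triggered role exchange with hysteresis. Values
# are assumed; the chapter does not give numeric parameters.

TAKEOVER_FORCE = 3.0  # N, magnitude signalling a control request (assumed)
RELEASE_FORCE = 1.0   # N, lower bound to avoid chattering (assumed)

def update_control_share(share, f_user):
    """share in [0, 1]: 1 means the user fully controls the task."""
    mag = np.linalg.norm(f_user)
    if mag > TAKEOVER_FORCE:
        share = min(1.0, share + 0.02)  # user pushes hard: yield control
    elif mag < RELEASE_FORCE:
        share = max(0.0, share - 0.02)  # user relaxes: computer takes over
    return share

def blended_command(share, u_user, u_computer):
    """Linearly blend the two parties' commands by the current share."""
    return share * u_user + (1.0 - share) * u_computer
```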
Signal Processing and Communications Applications Conference | 2013
Ayse Kucukyilmaz; Tevfik Metin Sezgin; Cagatay Basdogan
This paper summarizes our efforts to enable dynamic role allocation between humans and robots in physical collaboration tasks. A major goal in physical human-robot interaction research is to develop tacit and natural communication between partners. In previous work, we suggested that the communication between a human and a robot would benefit from a decision-making process in which the robot can dynamically adjust its control level during the task based on the intentions of the human. To this end, we define leader and follower roles for the partners and, using a role exchange mechanism, enable them to negotiate solely through force information to exchange roles. We show that, compared to an "equal control" condition, the role exchange mechanism improves task performance and the joint efficiency of the partners.
World Haptics Conference | 2011
Ayse Kucukyilmaz; T. Metin Sezgin; Cagatay Basdogan
Archive | 2018
Rama Krishna Reddy Dyava; Marco Aggravi; Gionata Salvietti; Alexander Mörtl; Martin Lawitzky; Ayse Kucukyilmaz; Domenico Prattichizzo