Hatice Kose
Istanbul Technical University
Publications
Featured research published by Hatice Kose.
International Journal of Social Robotics | 2012
Hatice Kose; Rabia Yorganci; Esra H. Algan; Dag Sverre Syrdal
The results are from an ongoing study which aims to assist in teaching Sign Language (SL) to hearing-impaired children by means of non-verbal communication and imitation-based interaction games between a humanoid robot and the child. In this study, the robot expresses a word in SL, chosen from a predefined set, using hand movements and body and face gestures; having comprehended the word, the child gives relevant feedback to the robot. This paper reports the findings of such an evaluation on a subset of sample words chosen from Turkish Sign Language (TSL), via a comparison of their video representations as performed by human teachers and the Nao H25 robot. Within this study, several surveys and user studies were carried out to reveal the resemblance between the two types of videos, involving the performance of the robot simulator and the human teacher for each chosen word. To investigate the perceived level of similarity between human and robot behavior, participants of different sign language acquaintance levels and age groups were asked to evaluate the videos using paper-based and online questionnaires. The results of these surveys are summarized, and the most significant factors affecting the comprehension of TSL words are discussed.
International Journal of Humanoid Robotics | 2014
Hatice Kose; Neziha Akalin; Pinar Uluer
This paper investigates the role of interaction and communication kinesics in human–robot interaction. The study is part of a novel research project on sign language (SL) tutoring through interaction games with humanoid robots. The main goal is to motivate children with communication problems to understand and imitate the signs performed by the robot using basic upper-torso gestures and sound. We present an empirical and exploratory study investigating the effect of basic non-verbal gestures, consisting of hand movements and body and face gestures, expressed by a humanoid robot; having comprehended the word, the participants give relevant feedback in SL. In this way the participant is both a passive observer and an active imitator throughout the learning process, in different phases of the game. A five-fingered Robovie R3 robot platform and a three-fingered Nao H25 robot are employed in the games. Vision-, sound-, touch- and motion-based cues are used for multimodal communication between the robot, child and therapist/parent. This paper presents the preliminary results of the proposed game, tested with adult participants. The aim is to evaluate the participants' ability to learn SL from a robot and to compare different robot platforms within this setup.
IEEE-RAS International Conference on Humanoid Robots | 2011
Hatice Kose; Rabia Yorganci
This paper is part of an ongoing study which aims to assist in teaching Sign Language (SL) to hearing-impaired children by means of non-verbal communication and imitation-based interaction games between a humanoid robot and the child. In this study, the robot expresses a word in SL, chosen from a predefined set, using hand movements and body and face gestures. Having comprehended the word, the child gives relevant feedback to the robot. In the current study, we propose an interactive storytelling game between a Nao H25 humanoid robot and preschool children based on Turkish Sign Language (TSL). Since most of the children cannot yet read and write and are not familiar with sign language, we prepared a short story including specially selected words, which the robot performs both verbally and in sign language. The children are expected to give feedback to the robot with matching colored flashcards when it performs a word in sign language. The robotic event covered 106 preschool children. The aim is to evaluate the children's ability to learn sign language from a robot and to compare these results with the results of video-based studies.
International Journal of Social Robotics | 2015
Pinar Uluer; Neziha Akalin; Hatice Kose
This paper presents a socially interactive humanoid robot-assisted system for sign language (SL) tutoring for children with communication impairments by means of imitation-based interaction games. In this study, a five-fingered robot platform, Robovie R3, is used to express a set of chosen words in Turkish Sign Language (TSL) using hand and body movements combined with facial expressions. The robot is able to recognize signs through an RGB-D camera and give vocal, visual and motional (as signs) feedback. The proposed game consists of an introductory phase, where participants are introduced to the robot and the signs; an imitation-based learning phase, where participants are motivated to imitate the signs demonstrated by the robot; and a test phase, where the signs taught in the previous phases are tested within a guessing game. The current paper presents results from studies with three different test groups. The humanoid robot is used as an assistive social companion in the game context, using sign language and visual cues to interact with the children. The robot is evaluated according to the participants' sign recognition ability within different setups. The results indicate that the robot has a significant effect on the sign learning performance of the participants.
Signal Processing and Communications Applications Conference | 2013
Bekir Sıtkı Ertuğrul; Cemal Gurpinar; Hasan Kivrak; Hatice Kose
This work is part of an ongoing effort on sign language tutoring with imitation-based turn-taking and interaction games (iSign) involving humanoid robots and children with communication impairments. The paper focuses on the extension of the game, mainly for children with autism. Autism Spectrum Disorder (ASD) involves communication impairments, limited social interaction, and limited imagination. Many such children show interest in robots and find them engaging, and robots can facilitate social interaction between the child and the teacher. In this work, a Nao H25 humanoid robot assisted the human teacher in teaching signs and basic upper-torso actions, which were observed and imitated by the participants. A Kinect camera-based system was used to recognize the signs and other actions, and the robot gave visual and auditory feedback to the participants based on their performance.
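The recognize-then-respond loop described above can be sketched as a simple turn of the imitation game. This is a minimal illustration, not the project's actual code: the function names, the sign labels, and the stand-in recognizer (which just compares labels rather than processing Kinect frames) are all assumptions.

```python
# Hypothetical sketch of one turn of the imitation game: the robot
# demonstrates a sign, observes the child, and gives multimodal feedback.
# All names and the toy recognizer are illustrative assumptions.

def recognize_sign(observation, expected):
    """Stand-in for the Kinect-based recognizer: a real system would
    classify skeletal motion; here we just compare an attached label."""
    return observation.get("label") == expected

def feedback_for(success):
    """Choose the robot's visual and auditory feedback for this turn."""
    if success:
        return {"led": "green", "speech": "Well done!"}
    return {"led": "yellow", "speech": "Let's try again."}

def run_turn(expected_sign, observation):
    """One turn: check the child's imitation, then respond."""
    return feedback_for(recognize_sign(observation, expected_sign))

print(run_turn("hello", {"label": "hello"}))
```

In the actual system the recognizer consumes skeletal tracking data and the feedback is delivered through the robot's LEDs, speech and gestures; the point here is only the turn-taking structure.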
Signal Processing and Communications Applications Conference | 2012
Itauma Isong Itauma; Hasan Kivrak; Hatice Kose
This study is part of an ongoing project which aims to assist in teaching Sign Language (SL) to hearing-impaired children by means of non-verbal communication and imitation-based interaction games between a humanoid robot and a child. In this paper, the problem is geared towards a robot learning to imitate basic upper-torso gestures (SL signs) using different machine learning techniques. An RGB-D sensor (Microsoft Kinect) is employed to track the skeletal model of humans and create a training set. A novel method called Decision Based Rule is proposed. Additionally, linear regression models are compared to find which learning technique achieves higher accuracy on gesture prediction. The learning technique with the highest accuracy is then used to simulate an imitation system in which the Nao robot imitates the learned gestures as observed from the users.
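A rule-based gesture classifier of the kind the abstract's "Decision Based Rule" method suggests can be sketched from skeletal features. This is a hedged illustration only: the joint names, angle thresholds and gesture labels below are invented for the example and are not taken from the paper.

```python
# Illustrative rule-based gesture classification from joint angles.
# Feature names, thresholds and labels are hypothetical, not the paper's.

def extract_angles(skeleton):
    """Reduce a skeletal frame (here a dict of joint angles in degrees)
    to a feature tuple; a real system would derive these angles from
    Kinect 3-D joint positions."""
    return (skeleton["r_shoulder_pitch"], skeleton["r_elbow_roll"])

def classify_gesture(skeleton):
    """Map joint-angle features to a gesture label with simple
    decision rules (illustrative thresholds)."""
    shoulder, elbow = extract_angles(skeleton)
    if shoulder > 60 and elbow < 30:
        return "arm_raised"   # straight arm lifted above the shoulder
    if shoulder > 60:
        return "wave"         # lifted arm with a bent elbow
    return "rest"

frame = {"r_shoulder_pitch": 80.0, "r_elbow_roll": 10.0}
print(classify_gesture(frame))   # -> arm_raised
```

The regression-based alternatives the paper compares would instead fit a model to the recorded joint trajectories; the predicted label (or joint configuration) is then what the Nao robot replays in the imitation system.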
International Journal of Social Robotics | 2015
Hatice Kose; Pinar Uluer; Neziha Akalin; Rabia Yorganci; Ahmet Özkul; Gökhan Ince
This paper presents interactive games for sign language tutoring assisted by humanoid robots. The games are specially designed for children with communication impairments. In this study, different robot platforms, a Nao H25 and a Robovie R3 humanoid robot, are used to express a set of chosen signs in Turkish Sign Language using hand and arm movements. Two games involving physically and virtually embodied robots are designed. In the game involving the physically embodied robot, the robot communicates with the participant by recognizing colored flashcards through a camera-based system and, in return, generating a selected subset of signs including motivational facial gestures. A mobile version of the game is also implemented, to be used as part of children's education and therapy for the purpose of teaching signs. The humanoid robot acts as a social peer and assistant in the games to motivate the child, teach a selected set of signs, evaluate the child's effort, and give appropriate feedback to improve the children's learning and recognition rate. The current paper presents results from the preliminary study with different test groups, where children played with the physical robot platform, R3, and with a mobile game incorporating videos of the robot performing the signs; thus the effect of the assistive robot's embodiment is analyzed within these games. The results indicate that physical embodiment plays a significant role in improving the children's performance, engagement and motivation.
Robotics and Biomimetics | 2011
Hatice Kose; Rabia Yorganci; Itauma Isong Itauma
This paper is part of an ongoing study which aims to assist in teaching Sign Language to hearing-impaired children by means of non-verbal communication and imitation-based interaction games between a humanoid robot and the child. In this study, the robot expresses a word in Sign Language, chosen from a predefined set, using hand movements and body and face gestures; having comprehended the word, the child gives relevant feedback to the robot. This study proposes an interactive game between a Nao H25 humanoid robot and preschool children based on Sign Language. Currently the demo is in Turkish Sign Language (TSL), but it will be extended to ASL as well. Since the children cannot yet read and write and are not familiar with sign language, we prepared a short story in which the robot performs each specially selected word in sign language while also pronouncing it verbally. After performing each special word in sign language, the robot waits for a response from the children, who are asked to show the colored flashcard illustrating the word. If the flashcard and the word match, the robot pronounces the word verbally and continues telling the story. At the end of the story, the robot performs the words one by one in sign language, in random order, and asks the children to put the sticker of the relevant flashcard on their play cards, which contain the story with illustrations of the flashcards. We also ported the game to web and tablet PC environments. The aim is to evaluate the children's ability to learn sign language from a robot in different embodiments, and to make the system available to children regardless of the cost of the robot, transportation and know-how issues.
Robot and Human Interactive Communication | 2013
Neziha Akalin; Pinar Uluer; Hatice Kose
The work presented in this paper was carried out within an ongoing project which aims to assist in sign language tutoring for children with communication problems by means of interaction games with a humanoid robot. In this project, the robot is able to express words in Turkish Sign Language (TSL) and American Sign Language (ASL); having comprehended the word, the child is motivated to give relevant feedback to the robot with colored flashcards, signs and sound. Within the multimodal turn-taking games, the child learns both the meaning and the realization of the words, and also has the chance to test this knowledge through imitation-based interaction games. The performance of the children is also verified with flashcard-based activities as part of the game, as feedback for the teacher/parent. The child's performance is recognized and evaluated by the robot through an RGB-D camera-based system. This paper summarizes a longitudinal exploratory study of classroom work, where children were taught six gestures, including signs from ASL and TSL, with the assistance of a Nao H25 humanoid robot. The performances of the children and the robot are presented and discussed in terms of the children's subjective and objective evaluations.
Intelligent Assistive Robots | 2015
Hatice Kose; Neziha Akalin; Rabia Yorganci; Bekir Sıtkı Ertuğrul; Hasan Kivrak; Semih Kavak; Ahmet Özkul; Cemal Gurpinar; Pinar Uluer; Gökhan Ince
This paper investigates the role of interaction and communication kinesics in human–robot interaction. It is based on a project on Sign Language (SL) tutoring through interaction games with humanoid robots. The aim of the study is to design a computational framework which makes it possible to motivate children with communication problems (i.e., ASD and hearing impairments) to understand and imitate the signs performed by the robot, using basic upper-torso gestures and sound in a turn-taking manner. This framework consists of modular computational components that endow the robot with the capability of perceiving the actions of the children, carrying out a game or storytelling task, and tutoring the children in any desired mode, i.e., supervised or semi-supervised. Visual (colored cards), vocal (storytelling, music), touch (tactile sensors on the robot) and motion (recognition and performance of gestures, including signs) cues are proposed for multimodal communication between the robot, child and therapist/parent. We present an empirical and exploratory study investigating the effect of basic non-verbal gestures, consisting of hand movements and body and face gestures, expressed by a humanoid robot; having comprehended the word, the child gives relevant feedback in SL or visually to the robot, according to the context of the game.
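The modular framework described above, with separate perception, game-logic and tutoring components wired into a turn-taking loop, can be illustrated with a small sketch. All module interfaces, cue names and the feedback policy below are assumptions invented for the example, not the project's actual software.

```python
# Illustrative sketch of a modular turn-taking tutoring loop; module
# boundaries and cue names are hypothetical, not from the project.

from dataclasses import dataclass

@dataclass
class Cue:
    modality: str   # "visual", "vocal", "touch" or "motion"
    payload: str    # e.g. a card color, a spoken word, a gesture label

def perceive(raw_event):
    """Perception module: turn a raw sensor event into a Cue."""
    return Cue(modality=raw_event["sensor"], payload=raw_event["value"])

def game_step(cue, expected_sign):
    """Game-logic module: did the child's cue match the sign the robot
    just demonstrated?"""
    return cue.modality == "motion" and cue.payload == expected_sign

def tutor_feedback(matched, mode="supervised"):
    """Tutoring module: pick a response; in supervised mode a
    therapist could override this choice."""
    return "praise" if matched else "repeat_demo"

event = {"sensor": "motion", "value": "hello"}
print(tutor_feedback(game_step(perceive(event), "hello")))   # -> praise
```

Keeping the three stages behind small interfaces like this is what lets the same framework drive either a game or a storytelling task, and swap the tutoring mode, without touching the perception side.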