
Publication


Featured research published by Neziha Akalin.


International Journal of Humanoid Robotics | 2014

Socially Interactive Robotic Platforms as Sign Language Tutors

Hatice Kose; Neziha Akalin; Pinar Uluer

This paper investigates the role of interaction and communication kinesics in human–robot interaction. The study is part of a novel research project on sign language (SL) tutoring through interaction games with humanoid robots. The main goal is to motivate children with communication problems to understand and imitate the signs performed by the robot using basic upper-torso gestures and sound. We present an empirical and exploratory study investigating the effect of basic nonverbal gestures, consisting of hand movements and body and face gestures, expressed by a humanoid robot; having comprehended the word, the participant gives relevant feedback in SL. In this way the participant is both a passive observer and an active imitator throughout the different phases of the learning game. A five-fingered Robovie R3 platform and a three-fingered Nao H-25 robot are employed within the games. Vision-, sound-, touch- and motion-based cues are used for multimodal communication between the robot, child and therapist/parent. This paper presents the preliminary results of the proposed game tested with adult participants. The aim is to evaluate participants' ability to learn SL from a robot and to compare the different robot platforms within this setup.


International Journal of Social Robotics | 2015

A New Robotic Platform for Sign Language Tutoring

Pinar Uluer; Neziha Akalin; Hatice Kose

This paper presents a socially interactive, humanoid robot-assisted system for sign language (SL) tutoring for children with communication impairments by means of imitation-based interaction games. In this study, a five-fingered robot platform, Robovie R3, is used to express a set of chosen words in Turkish Sign Language (TSL) using hand and body movements combined with facial expressions. The robot is able to recognize signs through an RGB-D camera and to give vocal, visual and motional (signed) feedback. The proposed game consists of an introductory phase, where participants are introduced to the robot and the signs; an imitation-based learning phase, where participants are motivated to imitate the signs demonstrated by the robot; and a test phase, where the signs taught in the previous phases are tested within a guessing game. The current paper presents results from studies with three different test groups. The humanoid robot is used as an assistive social companion in the game context, using sign language and visual cues to interact with the children, and is evaluated according to the participants' sign recognition ability within different setups. The results indicate that the robot has a significant effect on the participants' sign learning performance.
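The three-phase game structure described in this abstract can be pictured as a simple sequence of states. The sketch below is purely illustrative and assumes hypothetical robot methods (demonstrate_sign, observe_participant_sign, give_feedback, record_guess, collect_scores) standing in for the actual control and RGB-D recognition software, which the paper does not publish.

```python
from enum import Enum, auto


class Phase(Enum):
    INTRODUCTION = auto()  # participants meet the robot and see the signs
    LEARNING = auto()      # participants imitate the signs demonstrated by the robot
    TEST = auto()          # the taught signs are checked within a guessing game


def run_session(robot, signs):
    """Walk through the three phases of the game for a list of target signs."""
    for phase in (Phase.INTRODUCTION, Phase.LEARNING, Phase.TEST):
        for sign in signs:
            robot.demonstrate_sign(sign)                     # hypothetical robot API
            if phase is Phase.LEARNING:
                observed = robot.observe_participant_sign()  # RGB-D based recognition (assumed)
                robot.give_feedback(observed == sign)        # vocal/visual/motional feedback
            elif phase is Phase.TEST:
                robot.record_guess(sign)                     # one round of the guessing game (assumed)
    return robot.collect_scores()                            # assumed scoring helper
```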


International Journal of Social Robotics | 2015

The Effect of Embodiment in Sign Language Tutoring with Assistive Humanoid Robots

Hatice Kose; Pinar Uluer; Neziha Akalin; Rabia Yorganci; Ahmet Özkul; Gökhan Ince

This paper presents interactive games for sign language tutoring assisted by humanoid robots. The games are specially designed for children with communication impairments. In this study, two robot platforms, a Nao H25 and a Robovie R3 humanoid robot, are used to express a set of chosen signs in Turkish Sign Language using hand and arm movements. Two games, one involving a physically embodied robot and one involving a virtually embodied robot, are designed. In the game with the physically embodied robot, the robot communicates with the participant by recognizing colored flashcards through a camera-based system and generating a selected subset of signs, including motivational facial gestures, in return. A mobile version of the game is also implemented to be used as part of children's education and therapy for the purpose of teaching signs. The humanoid robot acts as a social peer and assistant in the games to motivate the child, teach a selected set of signs, evaluate the child's effort, and give appropriate feedback to improve children's learning and recognition rates. The current paper presents results from a preliminary study with different test groups, where children played with the physical robot platform, R3, and with a mobile game incorporating videos of the robot performing the signs; the effect of the assistive robot's embodiment is analyzed within these games. The results indicate that physical embodiment plays a significant role in improving the children's performance, engagement and motivation.


Robot and Human Interactive Communication | 2013

Ispy-usign humanoid assisted interactive sign language tutoring games

Neziha Akalin; Pinar Uluer; Hatice Kose

The work presented in this paper was carried out within an ongoing project which aims to assist sign language tutoring for children with communication problems by means of interaction games with a humanoid robot. In this project, the robot is able to express words in Turkish Sign Language (TSL) and American Sign Language (ASL); having comprehended a word, the child is motivated to give relevant feedback to the robot with colored flashcards, signs and sound. Within the multimodal turn-taking games, the child learns both the semantics and the realization of the words, and also has the chance to test this knowledge through imitation-based interaction games. The children's performance is further verified with flashcard-based activities as part of the game, providing feedback for the teacher/parent. The child's performance is recognized and evaluated by the robot through an RGB-D camera-based system. This paper summarizes a longitudinal exploratory study of classroom work, in which children were taught 6 gestures, including signs from ASL and TSL, with the assistance of a Nao H25 humanoid robot. The performances of the children and the robot are presented and discussed in terms of the children's subjective and objective evaluations.


Intelligent Assistive Robots | 2015

iSign: An Architecture for Humanoid Assisted Sign Language Tutoring

Hatice Kose; Neziha Akalin; Rabia Yorganci; Bekir Sıtkı Ertuğrul; Hasan Kivrak; Semih Kavak; Ahmet Özkul; Cemal Gurpinar; Pinar Uluer; Gökhan Ince

This paper investigates the role of interaction and communication kinesics in human-robot interaction. It is based on a project on Sign Language (SL) tutoring through interaction games with humanoid robots. The aim of the study is to design a computational framework that motivates children with communication problems (e.g., ASD and hearing impairments) to understand and imitate the signs implemented by the robot, using basic upper-torso gestures and sound in a turn-taking manner. The framework consists of modular computational components that endow the robot with the capability of perceiving the actions of the children, carrying out a game or storytelling task, and tutoring the children in any desired mode, i.e., supervised or semi-supervised. Visual (colored cards), vocal (storytelling, music), touch (tactile sensors on the robot), and motion (recognition and implementation of gestures, including signs) based cues are proposed for multimodal communication between the robot, child and therapist/parent. We present an empirical and exploratory study investigating the effect of basic non-verbal gestures, consisting of hand movements and body and face gestures, expressed by a humanoid robot; having comprehended the word, the child gives relevant feedback in SL or visually to the robot, according to the context of the game.
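As a rough illustration only, the modular structure sketched in this abstract (perception of the child's actions, a game or storytelling task, multimodal cues, and a supervised or semi-supervised tutoring mode) might be wired together along the lines below. All class and method names here are assumptions made for the sketch; the paper does not publish this code.

```python
from dataclasses import dataclass
from typing import Dict, Protocol


class CueChannel(Protocol):
    """One modality of the multimodal interface: visual cards, voice, touch, or motion."""
    def sense(self) -> str: ...                   # what was just perceived on this channel
    def express(self, message: str) -> None: ...  # what the robot outputs on this channel


@dataclass
class TutoringSession:
    channels: Dict[str, CueChannel]  # e.g. {"visual": ..., "vocal": ..., "touch": ..., "motion": ...}
    supervised: bool = True          # True: therapist/parent confirms; False: perception decides alone

    def step(self, target_sign: str, therapist_says_correct: bool = False) -> bool:
        # The robot demonstrates the target sign on the motion channel.
        self.channels["motion"].express(target_sign)
        # The child's response is perceived on the motion channel (gesture recognition)
        # and, as a fallback, on the visual channel (colored cards).
        perceived = self.channels["motion"].sense()
        correct = perceived == target_sign or self.channels["visual"].sense() == target_sign
        if self.supervised:
            # In supervised mode the therapist/parent has the final word.
            correct = therapist_says_correct
        # Feedback is given on the vocal channel.
        self.channels["vocal"].express("well done" if correct else "let's try again")
        return correct
```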


IEEE-RAS International Conference on Humanoid Robots | 2014

Non-verbal communication with a social robot peer: Towards robot assisted interactive sign language tutoring

Neziha Akalin; Pinar Uluer; Hatice Kose

This paper presents a humanoid robot-assisted, imitation-based interactive game for Sign Language (SL) tutoring. The game is specially designed for children with communication impairments. The work presented is part of the Robot Sign Language Tutor project. In this study, a Robovie R3 humanoid robot is used to express a set of chosen words in SL using hand movements and body and face gestures. The robot platform is specially modified with LEDs in the face (for non-manual facial gestures), additional DoFs in the wrists and five independent fingers on each hand for robust SL generation. The robot is able to communicate with the participant by recognizing signs and colored flashcards through an RGB-D camera-based system and generating a selected subset of signs, including motivating facial gestures, in return. The game also aims to improve children's imitation and turn-taking skills and to teach the words semantically. The current paper presents results from a preliminary study with a group of hearing-impaired children, in which children achieved almost 100% scores in recognizing the robot's signs from a subset of 12 words.
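The colored-flashcard recognition mentioned in this abstract could, in principle, be approximated with simple HSV thresholding on the color stream of the RGB-D camera; the sketch below uses OpenCV for that purpose. The color ranges, area threshold and camera index are made-up values for illustration, not those of the actual system.

```python
from typing import Optional

import cv2
import numpy as np

# Illustrative HSV ranges for a few card colors; the real system's calibration is not published.
CARD_COLORS = {
    "red":   ((0, 120, 70),   (10, 255, 255)),
    "green": ((40, 70, 70),   (80, 255, 255)),
    "blue":  ((100, 150, 50), (140, 255, 255)),
}


def detect_flashcard(frame_bgr: np.ndarray, min_area: int = 5000) -> Optional[str]:
    """Return the name of the dominant card color in the frame, if any region is large enough."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    best_color, best_area = None, 0
    for name, (lo, hi) in CARD_COLORS.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        area = int(cv2.countNonZero(mask))
        if area >= min_area and area > best_area:
            best_color, best_area = name, area
    return best_color


if __name__ == "__main__":
    cap = cv2.VideoCapture(0)  # any RGB stream stands in for the RGB-D camera in this sketch
    ok, frame = cap.read()
    if ok:
        print(detect_flashcard(frame))
    cap.release()
```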


Signal Processing and Communications Applications Conference | 2014

A new approach to sign language teaching: Humanoid robot usage

Neziha Akalin; Hatice Kose

This work is part of an ongoing project on sign language tutoring through imitation-based turn-taking and interaction games with humanoid robots. The aim of the project is to motivate children with communication problems to understand and imitate the signs implemented by the humanoid robots. In this study, we implemented selected words from the Turkish Sign Language Dictionary on autonomous humanoid robots (Nao H-25 and Robovie R3). We also examined the effects of the interaction games between the humanoid robot and participants from different age groups on sign language teaching.


IEEE-RAS International Conference on Humanoid Robots | 2014

Learning sign language from a social robot peer by playing an interactive game

Pinar Uluer; Neziha Akalin; Hatice Kose

Summary form only given. This work presents a humanoid robot-assisted interactive game for Sign Language (SL) tutoring. The game is specially designed for children with communication impairments. The children play the game with a modified Robovie R3 humanoid robot which is able to express a set of chosen words in SL using hand movements and body and face gestures. The robotic platform is specially modified with LEDs in the face, additional DOFs in the wrists and five independent fingers on each hand for robust SL generation. The robot is able to communicate with the children by recognizing colored flashcards through an RGB-D camera-based system and generating a selected subset of signs, including motivating facial gestures, in return. The presented game is composed of three stages: 1) introduction of the selected signs to a group of children; 2) each child's one-to-one play with the robot using flashcards; and 3) a paper-based test to evaluate recognition of the signs demonstrated by the robot. The video presents screenshots from a preliminary study with a group of hearing-impaired children (7-14 years), in which children achieved almost 100% scores in recognizing the robot's signs from a subset of 10 words.


Advanced Robotics and its Social Impacts | 2013

Humanoid robots communication with participants using sign language: An interaction based sign language game

Neziha Akalin; Pinar Uluer; Hatice Kose; Gökhan Ince


Signal Processing and Communications Applications Conference | 2018

Emotion recognition in valence-arousal scale by using physiological signals

Neziha Akalin; Hatice Kose

Collaboration


Dive into Neziha Akalin's collaborations.

Top Co-Authors

Hatice Kose, Istanbul Technical University
Pinar Uluer, Istanbul Technical University
Gökhan Ince, Istanbul Technical University
Ahmet Özkul, Istanbul Technical University
Cemal Gurpinar, Istanbul Technical University
Rabia Yorganci, Istanbul Technical University
Hasan Kivrak, Istanbul Technical University
Semih Kavak, Istanbul Technical University