Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Shuyin Li is active.

Publication


Featured research published by Shuyin Li.


Intelligent Robots and Systems | 2006

BIRON, where are you? Enabling a robot to learn new places in a real home environment by integrating spoken dialog and visual localization

Thorsten P. Spexard; Shuyin Li; Britta Wrede; Jannik Fritsch; Gerhard Sagerer; Olaf Booij; Zoran Zivkovic; Bas Terwijn; Ben J. A. Kröse

An ambitious goal in modern robotics is to build mobile robots that can interact as companions in real-world environments. Especially for the care of elderly people, a system that works robustly in private homes is essential, requiring a very natural and human-oriented way of communication. Since home environments are usually highly individual, a first task for a newly acquired robot is to become familiar with its new surroundings. This paper gives a short overview of how we integrated vision-based localization, exploiting the advantages of a very modular architecture, and extended a spoken dialog system for online labeling of and interaction about different locations. We present results from the integrated system working in a real, fully furnished home environment, where it was able to learn the names of different rooms. This system enables us to perform real user studies in the future without having to fall back on Wizard-of-Oz experiments. Ongoing work aims at enabling the robot to take the initiative by asking about unknown locations. A future extension is the ability to generalize over features of known rooms to make predictions when encountering unknown rooms.
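As a rough illustration of the kind of integration the abstract describes, here is a minimal Python sketch of how a spoken room label might be attached to the current output of a visual localization module and later used to answer a "where are you?" query. All class and function names are hypothetical and not taken from the paper.

```python
from dataclasses import dataclass, field


@dataclass
class PlaceMemory:
    """Hypothetical store mapping visual location IDs to spoken room labels."""
    labels: dict = field(default_factory=dict)  # location_id -> room name

    def label_current_place(self, location_id: str, spoken_label: str) -> None:
        # Called when the dialog system parses e.g. "This is the kitchen."
        self.labels[location_id] = spoken_label

    def answer_where_am_i(self, location_id: str) -> str:
        # Called when the user asks "BIRON, where are you?"
        if location_id in self.labels:
            return f"I am in the {self.labels[location_id]}."
        return "I do not know this place yet. What is it called?"


if __name__ == "__main__":
    memory = PlaceMemory()
    # The visual localization module would supply the current location ID.
    memory.label_current_place("loc_07", "kitchen")
    print(memory.answer_where_am_i("loc_07"))   # -> I am in the kitchen.
    print(memory.answer_where_am_i("loc_12"))   # -> asks for a label instead
```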


Systems, Man and Cybernetics | 2004

BIRON, let me show you something: evaluating the interaction with a robot companion

Shuyin Li; Marcus Kleinehagenbrock; Jannik Fritsch; Britta Wrede; Gerhard Sagerer

Current research on the interaction with a robot is driven by the desire to build intuitive and natural interaction schemes. In order for our robot BIRON to behave naturally, we integrated an attention system that enables the robot to search for and eventually focus on human communication partners by detecting and tracking persons. Via a natural language interface the user can then interact with BIRON and teach it new objects or ask it to follow them. First evaluation results from 21 users interacting with the robot indicate that users appreciate the natural language capabilities of BIRON. However, users are very sensitive to speech recognition failures, even though all of our subjects had prior experience with speech recognition systems. The results also indicate that feedback on the internal status of the robot is extremely helpful for users.
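The attention behavior described above can be pictured as a small state machine that switches between searching for and focusing on a communication partner. The following toy sketch is only an illustration under that assumption; the names are hypothetical and not from the paper.

```python
from enum import Enum, auto


class AttentionState(Enum):
    SEARCHING = auto()   # scan the surroundings for potential communication partners
    FOCUSED = auto()     # a person is being tracked and addressed


def update_attention(state: AttentionState, person_detected: bool) -> AttentionState:
    """Toy transition rule: focus when a person is found, resume search when lost."""
    if state is AttentionState.SEARCHING and person_detected:
        return AttentionState.FOCUSED
    if state is AttentionState.FOCUSED and not person_detected:
        return AttentionState.SEARCHING
    return state


if __name__ == "__main__":
    state = AttentionState.SEARCHING
    for detected in [False, True, True, False]:
        state = update_attention(state, detected)
        # Verbal feedback on this internal state is what users found helpful.
        print(detected, state.name)
```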


Robot and Human Interactive Communication | 2006

A dialog system for comparative user studies on robot verbal behavior

Shuyin Li; Britta Wrede; Gerhard Sagerer

In domestic social robot systems the dialog system is often the main user interface. The verbal behavior of such a robot therefore plays a crucial role in human-robot interaction. Comparative user studies on various verbal behaviors of a robot can effectively contribute to human-robot interaction research. In this paper we present a dialog system that can be easily configured to demonstrate different verbal, initiative-taking behaviors of a robot and can thus be used as a platform for such comparative user studies. The pilot study we conducted not only provides strong evidence for this suitability, but also reveals the benefits of comparative studies on a real robot in general.
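To make the idea of a "configurable verbal behavior" concrete, here is a minimal sketch of how one dialog engine could emit different utterances per study condition. This is an assumption-laden illustration; the configuration fields and utterances are invented, not taken from the paper.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class VerbalBehaviorConfig:
    """Hypothetical study condition describing the robot's verbal behavior."""
    takes_initiative: bool   # does the robot proactively ask questions?
    verbose_feedback: bool   # does it comment on its internal state?


def idle_utterance(config: VerbalBehaviorConfig) -> str:
    # The same dialog engine produces different behavior per study condition.
    if config.takes_initiative:
        return "Is there something you would like to show me?"
    if config.verbose_feedback:
        return "I am waiting for your instructions."
    return ""  # purely reactive condition: stay silent until addressed


if __name__ == "__main__":
    for cfg in (VerbalBehaviorConfig(True, True), VerbalBehaviorConfig(False, False)):
        print(cfg, "->", repr(idle_utterance(cfg)))
```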


Robot and Human Interactive Communication | 2007

A study of interaction between dialog and decision for human-robot collaborative task achievement

Aurélie Clodic; Rachid Alami; Vincent Montreuil; Shuyin Li; Britta Wrede; Agnes Swadzba

Human-robot collaboration requires both communicative and decision-making skills from a robot. To enable flexible coordination and turn-taking between human users and a robot in joint tasks, the robot's dialog and decision-making mechanisms have to be synchronized in a meaningful way. In this paper, we propose an integration framework that combines the dialog and decision-making processes. With this framework, we investigate various task negotiation situations for a social robot in a fetch-and-carry scenario. For the technical realization of the framework, the interface specification between the dialog and decision-making systems is also presented. Further, we discuss several challenging issues identified in our integration effort that should be addressed in the future.
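A dialog/decision-making interface of the kind mentioned here could be pictured as a pair of message types crossing the boundary between the two systems. The sketch below is a hypothetical simplification, not the paper's interface specification; all names and message fields are invented.

```python
from dataclasses import dataclass
from typing import Literal


@dataclass
class TaskProposal:
    """Decision layer -> dialog layer: ask the user to confirm a task."""
    task: str  # e.g. "fetch the cup from the kitchen"


@dataclass
class UserVerdict:
    """Dialog layer -> decision layer: outcome of the negotiation."""
    task: str
    verdict: Literal["accepted", "rejected", "modified"]


def negotiate(proposal: TaskProposal, user_answer: str) -> UserVerdict:
    # A real system would run a full dialog; here a single yes/no answer decides.
    verdict = "accepted" if user_answer.strip().lower() in {"yes", "ok"} else "rejected"
    return UserVerdict(task=proposal.task, verdict=verdict)


if __name__ == "__main__":
    print(negotiate(TaskProposal("fetch the cup from the kitchen"), "yes"))
```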


International Conference on Multimodal Interfaces | 2005

Human-style interaction with a robot for cooperative learning of scene objects

Shuyin Li; Axel Haasch; Britta Wrede; Jannik Fritsch; Gerhard Sagerer

In research on human-robot interaction the interest is currently shifting from uni-modal dialog systems to multi-modal interaction schemes. We present a system for human-style interaction with a robot that is integrated on our mobile robot BIRON. To model the dialog we adopt an extended grounding concept with a mechanism to handle multi-modal input and output, where object references are resolved through interaction with an object attention system (OAS). The OAS integrates input from multiple sources, e.g., the object and gesture recognition systems, and provides the information for a common representation. This representation can be accessed by both modules and combines symbolic verbal attributes with sensor-based features. We argue that such a representation is necessary to achieve robust and efficient information processing.
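A common representation that combines symbolic verbal attributes with sensor-based features might look roughly like the data structure below. This is a minimal sketch under that assumption; the class, field names, and matching rule are hypothetical, not from the paper.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class SceneObject:
    """Hypothetical shared object representation.

    Combines symbolic attributes contributed by the dialog system with
    sensor-based features contributed by an object attention system.
    """
    object_id: int
    verbal_attributes: List[str] = field(default_factory=list)  # e.g. ["red", "cup"]
    visual_features: List[float] = field(default_factory=list)  # e.g. a color histogram

    def matches(self, spoken_words: List[str]) -> bool:
        # Resolve a verbal reference ("the red cup") against stored attributes.
        return all(word in self.verbal_attributes for word in spoken_words)


if __name__ == "__main__":
    cup = SceneObject(1, ["red", "cup"], [0.8, 0.1, 0.1])
    print(cup.matches(["red", "cup"]))   # True
    print(cup.matches(["blue", "cup"]))  # False
```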


Annual Meeting of the Special Interest Group on Discourse and Dialogue | 2009

A computational model of multi-modal grounding for human robot interaction

Shuyin Li; Britta Wrede; Gerhard Sagerer

Dialog systems for mobile robots operating in the real world should enable a mixed-initiative dialog style, handle the multi-modal information involved in the communication, and be relatively independent of the domain knowledge. Most dialog systems developed for mobile robots today, however, are system-oriented and have limited capabilities. We present an agent-based dialog model that is specifically designed for human-robot interaction and provide evidence for its efficiency with our implemented system.
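In a grounding-based dialog model, contributions are presented and later acknowledged before they become part of the common ground. The toy bookkeeping below only illustrates that general idea; it is not the paper's computational model, and all names are invented.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Contribution:
    """One presented unit of information awaiting acknowledgement."""
    content: str
    modality: str          # e.g. "speech", "gesture"
    grounded: bool = False


@dataclass
class GroundingModel:
    """Toy common-ground bookkeeping for multi-modal contributions."""
    pending: List[Contribution] = field(default_factory=list)

    def present(self, content: str, modality: str) -> None:
        self.pending.append(Contribution(content, modality))

    def acknowledge(self) -> None:
        # Positive evidence of understanding grounds the oldest open contribution.
        for contribution in self.pending:
            if not contribution.grounded:
                contribution.grounded = True
                break


if __name__ == "__main__":
    gm = GroundingModel()
    gm.present("This is the kitchen.", "speech")
    gm.present("<points at the fridge>", "gesture")
    gm.acknowledge()
    print([(c.content, c.grounded) for c in gm.pending])
```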


Conference of the International Speech Communication Association | 2004

A Multi-modal Dialog System for a Mobile Robot

Ioannis Toptsis; Shuyin Li; Britta Wrede; Gernot A. Fink


Meeting of the Association for Computational Linguistics | 2006

A computational model of multi-modal grounding

Shuyin Li; Britta Wrede; Gerhard Sagerer


Intelligent Robots and Systems | 2006

Integrating Miscommunication Analysis in Natural Language Interface Design for a Service Robot

Anders Green; Kerstin Severinson Eklundh; Britta Wrede; Shuyin Li


National Conference on Artificial Intelligence | 2007

Why and how to model multi-modal interaction for a mobile robot companion

Shuyin Li; Britta Wrede

Collaboration


Dive into Shuyin Li's collaboration.

Top Co-Authors

Gernot A. Fink (Technical University of Dortmund)
Bas Terwijn (University of Amsterdam)
Olaf Booij (University of Amsterdam)