Publication


Featured research published by Sebastien Rougeaux.


ieee international conference on automatic face and gesture recognition | 2000

Real-time stereo tracking for head pose and gaze estimation

Rhys Newman; Yoshio Matsumoto; Sebastien Rougeaux; Alexander Zelinsky

Computer systems which analyse human face/head motion have attracted significant attention recently as there are a number of interesting and useful applications. Not least among these is the goal of tracking the head in real time. A useful extension of this problem is to estimate the subject's gaze point in addition to his/her head pose. This paper describes a real-time stereo vision system which determines the head pose and gaze direction of a human subject. Its accuracy makes it useful for a number of applications, including human/computer interaction, consumer research and ergonomic assessment.
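As a rough illustration of the kind of computation involved (not the system described in the paper), the sketch below shows how an eye-gaze direction measured in the head frame can be combined with a tracked head pose to obtain a gaze ray in world coordinates; the function and parameter names are hypothetical.

    import numpy as np

    def gaze_ray_in_world(head_rotation, head_position, gaze_in_head):
        """Rotate an eye-gaze direction expressed in the head frame into world
        coordinates using a tracked head pose (illustrative sketch only)."""
        direction = head_rotation @ gaze_in_head
        direction = direction / np.linalg.norm(direction)
        return head_position, direction  # gaze ray: origin and unit direction

    # Hypothetical usage: head turned 30 degrees about the vertical axis,
    # eyes looking straight ahead relative to the head.
    theta = np.radians(30.0)
    head_rotation = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                              [0.0,           1.0, 0.0          ],
                              [-np.sin(theta), 0.0, np.cos(theta)]])
    origin, direction = gaze_ray_in_world(head_rotation, np.zeros(3),
                                          np.array([0.0, 0.0, 1.0]))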


Interacting with Computers | 2002

Visual gesture interfaces for virtual environments

Rochelle O'Hagan; Alexander Zelinsky; Sebastien Rougeaux

Virtual environments provide a whole new way of viewing and manipulating 3D data. Current technology moves the images out of desktop monitors and into the space immediately surrounding the user. Users can literally put their hands on the virtual objects. Unfortunately, techniques for interacting with such environments are yet to mature. Gloves and sensor-based trackers are unwieldy, constraining and uncomfortable to use. A natural, more intuitive method of interaction would be to allow the user to grasp objects with their hands and manipulate them as if they were real objects. We are investigating the use of computer vision in implementing a natural interface based on hand gestures. A framework for a gesture recognition system is introduced along with results of experiments in colour segmentation, feature extraction and template matching for finger and hand tracking, and simple hand pose recognition. Implementation of a gesture interface for navigation and object manipulation in virtual environments is presented.
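To make the colour-segmentation step concrete, here is a minimal sketch of skin-colour segmentation in HSV space using OpenCV. The thresholds and function names are assumptions, not values from the paper; a full system would run feature extraction and template matching on top of such a mask.

    import cv2
    import numpy as np

    def skin_mask(frame_bgr):
        """Very rough skin-colour segmentation in HSV space (illustrative only;
        the thresholds are assumptions, not taken from the paper)."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        lower = np.array([0, 40, 60], dtype=np.uint8)
        upper = np.array([25, 180, 255], dtype=np.uint8)
        mask = cv2.inRange(hsv, lower, upper)
        # Remove speckle before any feature extraction or template matching.
        kernel = np.ones((5, 5), dtype=np.uint8)
        return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)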


international conference on robotics and automation | 1994

Cooperation by observation: the framework and basic task patterns

Yasuo Kuniyoshi; Nobuyuki Kita; Sebastien Rougeaux; Shigeyuki Sakane; Masaru Ishii; Masayoshi Kakikura

A novel framework for multiple-robot cooperation called cooperation by observation is presented. It introduces many interesting issues such as viewpoint constraints and role interchange, as well as novel concepts like attentional structure. The framework has the potential to realize a high level of task coordination by decentralized autonomous robots while requiring minimal explicit communication. Its source of power lies in an advanced capability given to each robot for recognizing other agents' actions by (primarily visual) observation. This provides rich information about the current task situation around each robot, which facilitates highly structured task coordination. The basic visuo-motor routines are described. Concrete examples and experiments using real mobile robots are also presented.


asian conference on computer vision | 1995

Active Stereo Vision System with Foveated Wide Angle Lenses

Yasuo Kuniyoshi; Nobuyuki Kita; Sebastien Rougeaux; Takashi Suehiro

A novel active stereo vision system with a pair of foveated wide-angle lenses is presented. The projection curve of the lens is designed so that it facilitates active vision algorithms for motion analysis, object identification, and precise fixation. A pair of such lenses is mounted on a specially designed active stereo vision platform, which is compact and light so that it can be mounted on a mobile robot or a manipulator. A real-time stereo tracking system is constructed using the platform, a dual-processor servo controller, and pipelined image processors with a multi-processor backend.


robot and human interactive communication | 1997

Deferred imitation of human head movements by an active stereo vision head

J. Demiris; Sebastien Rougeaux; G.M. Hayes; Luc Berthouze; Yasuo Kuniyoshi

Designing a mechanism that allows a robot to imitate the actions of a human is not only interesting, as it opens possibilities for efficient social learning through observation and imitation, but also challenging, since it requires the integration of information from the visual, memory and motor systems. This paper deals with the implementation of an imitation architecture on an active stereo vision head, and describes our experiments on the deferred imitation of human head movements.


intelligent robots and systems | 1994

Vision-based behaviors for multi-robot cooperation

Yasuo Kuniyoshi; J. Riekki; Masaru Ishii; Sebastien Rougeaux; Nobuyuki Kita; Shigeyuki Sakane; Masayoshi Kakikura

This paper presents some advanced examples of reactive vision-based cooperative behaviors: 1) chasing and posing against another robot among others; 2) unblocking the path of another robot by removing an obstacle; and 3) passing an object from one robot to another. These behaviors are demonstrated using real mobile robots equipped with CCD cameras, in a complex environment, and with no central controller or explicit communication among the robots. The action observation is based on real-time optical flow analysis and stereo tracking. An extended behavior-based architecture for cooperation by observation is presented. The core extension consists of a mobile space buffer and an image space buffer with manipulable markers which control the internal flow of information, thereby coordinating parallel behaviors and achieving purposive tasks in complex and dynamic environments.


intelligent robots and systems | 1997

Robust real-time tracking on an active vision head

Sebastien Rougeaux; Yasuo Kuniyoshi

Achieving the first step of a framework for human-robot interaction, we have designed a binocular tracking system which uses disparity and velocity information for the detection and pursuit of moving objects in cluttered environments, without a priori knowledge of the target shape or texture. The implemented system robustly tracks deformable objects such as human hands and faces in real time, taking advantage of the mechanical and optical properties of ESCHeR, a high-performance active vision head equipped with foveated wide-angle lenses.


intelligent robots and systems | 1994

Binocular tracking based on virtual horopters

Sebastien Rougeaux; Nobuyuki Kita; Yasuo Kuniyoshi; Shigeyuki Sakane; Florent Chavand

This paper presents a stereo active vision system which performs tracking tasks on smoothly moving objects in complex backgrounds. Dynamic control of the vergence angle adapts the horopter geometry to the target position and allows the target to be picked up easily on the basis of stereoscopic disparity features. We introduce a novel vergence control strategy based on the computation of virtual horopters to track target movements that generate rapid changes in disparity. The control strategy is implemented on a binocular head whose right and left pan angles are controlled independently. Experimental results of gaze holding on a smoothly moving target, translating and rotating in complex surroundings, demonstrate the efficiency of the tracking system.
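For readers unfamiliar with horopter geometry, the sketch below shows the textbook relationship between baseline, fixation distance and symmetric vergence angle, and how a residual angular disparity suggests a new vergence setting. It illustrates the underlying geometry only, not the paper's virtual-horopter controller; all names are hypothetical.

    import math

    def vergence_angle(baseline_m, fixation_distance_m):
        """Symmetric vergence angle (radians) that places the fixation point,
        and hence the horopter, at the given distance on the midline.
        Textbook geometry, not the paper's control strategy."""
        return 2.0 * math.atan2(baseline_m / 2.0, fixation_distance_m)

    def refixated_distance(baseline_m, current_distance_m, disparity_rad):
        """Distance at which to refixate so that a target currently seen with
        the given residual angular disparity would lie near zero disparity."""
        new_vergence = vergence_angle(baseline_m, current_distance_m) + disparity_rad
        return baseline_m / (2.0 * math.tan(new_vergence / 2.0))

    # Hypothetical usage: 0.2 m baseline, fixating at 1.5 m, 0.5 degree disparity.
    d_new = refixated_distance(0.2, 1.5, math.radians(0.5))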


computer vision and pattern recognition | 1997

Velocity and disparity cues for robust real-time binocular tracking

Sebastien Rougeaux; Yasuo Kuniyoshi

We have designed and implemented a real-time binocular tracking system which uses two independent cues commonly found in the primary functions of biological visual systems to robustly track moving targets in complex environments, without a priori knowledge of the target shape or texture: a fast optical flow segmentation algorithm quickly locates independently moving objects for target acquisition and provides a reliable velocity estimate for smooth tracking. In parallel, target position is generated from the output of a zero-disparity filter, where a phase-based disparity estimation technique allows dynamic control of the camera vergence to adapt the horopter geometry to the target location. The system takes advantage of the optical properties of our custom-designed foveated wide-angle lenses, which exhibit a wide field of view along with a high-resolution fovea. Methods to cope with the distortions introduced by the space-variant resolution, and a robust real-time implementation on a high-performance active vision head, are presented.
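The core idea of phase-based disparity estimation can be sketched with a single complex Gabor filter: the phase difference between the left and right filter responses, divided by the filter's centre frequency, gives a sub-pixel shift estimate. The sketch below is a simplified, single-scale illustration and not the paper's implementation; the names and filter parameters are assumptions.

    import numpy as np

    def phase_disparity(left_row, right_row, wavelength_px=8.0, sigma_px=4.0):
        """Phase-based disparity estimate (pixels) between corresponding rows of
        the left and right images. Single-filter sketch of the general technique;
        only valid for shifts smaller than about half the filter wavelength."""
        omega = 2.0 * np.pi / wavelength_px          # centre spatial frequency
        n = len(left_row)
        x = np.arange(n) - n // 2
        gabor = np.exp(-x**2 / (2.0 * sigma_px**2)) * np.exp(1j * omega * x)

        # Complex responses at the patch centre; their phase difference encodes
        # the horizontal shift between the two rows.
        resp_left = np.vdot(gabor, left_row)    # sum(conj(gabor) * left_row)
        resp_right = np.vdot(gabor, right_row)
        dphi = np.angle(resp_left * np.conj(resp_right))
        return dphi / omega                      # disparity = phase diff / frequency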


international symposium on experimental robotics | 2000

Advancing Active Vision Systems by Improved Design and Control

Orson Sutherland; Harley Truong; Sebastien Rougeaux; Alexander Zelinsky

This paper presents the mechanical hardware and control software of a novel high-performance active vision system. It is the latest in an ongoing research effort to develop real-world vision systems based on cable-driven transmissions. The head presented in this paper is the laboratory's first fully cable-driven binocular rig and builds on several successful aspects of previous monocular prototypes: an increased payload capacity, a more compact transmission, and a design optimised for rigidity. In addition, we have developed a simple and compact controller for real-time tracking applications. It consists of two behavioural subgroups, saccade and smooth pursuit. By using a single trapezoidal profile motion (TPM) algorithm, we show that saccade time and motion smoothness can be optimised.
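As an illustration of what a trapezoidal profile motion (TPM) algorithm computes, the sketch below generates a generic textbook trapezoidal velocity profile (acceleration, cruise, deceleration) for a given travel distance. It is not the controller described in the paper; all names and parameters are hypothetical.

    def trapezoidal_profile(distance, v_max, a_max, dt=0.001):
        """Sample positions along a trapezoidal velocity profile that covers
        `distance` with peak velocity `v_max` and acceleration `a_max`.
        Generic textbook profile, given only as an illustration of TPM."""
        t_acc = v_max / a_max                     # time to reach cruise velocity
        d_acc = 0.5 * a_max * t_acc**2            # distance covered while accelerating
        if 2.0 * d_acc >= distance:
            # Short move: triangular profile, cruise phase disappears.
            t_acc = (distance / a_max) ** 0.5
            t_cruise = 0.0
        else:
            t_cruise = (distance - 2.0 * d_acc) / v_max
        t_total = 2.0 * t_acc + t_cruise

        positions, t = [], 0.0
        while t <= t_total:
            if t < t_acc:                          # acceleration phase
                p = 0.5 * a_max * t**2
            elif t < t_acc + t_cruise:             # constant-velocity cruise
                p = 0.5 * a_max * t_acc**2 + v_max * (t - t_acc)
            else:                                  # deceleration phase (mirror image)
                td = t_total - t
                p = distance - 0.5 * a_max * td**2
            positions.append(p)
            t += dt
        return positions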

Collaboration


Dive into Sebastien Rougeaux's collaboration.

Top Co-Authors

Nobuyuki Kita, National Institute of Advanced Industrial Science and Technology
Alexander Zelinsky, Australian National University
Harley Truong, Australian National University
Orson Sutherland, Australian National University
Samer Abdallah, Australian National University
Masaru Ishii, Massachusetts Institute of Technology
Jochen Heinzmann, Australian National University