Publication


Featured research published by Philippe Giguère.


IEEE Computer | 2007

AQUA: An Amphibious Autonomous Robot

Gregory Dudek; Philippe Giguère; Chris Prahacs; Shane Saunderson; Junaed Sattar; Luz Abril Torres-Méndez; Michael Jenkin; Andrew German; Andrew Hogue; Arlene Ripsman; James E. Zacher; Evangelos E. Milios; Hui Liu; Pifu Zhang; Martin Buehler; Christina Georgiades

AQUA, an amphibious robot that swims via the motion of its legs rather than using thrusters and control surfaces for propulsion, can walk along the shore, swim along the surface in open water, or walk on the bottom of the ocean. The vehicle uses a variety of sensors to estimate its position with respect to local visual features and provide a global frame of reference.


Intelligent Robots and Systems | 2005

A visually guided swimming robot

Gregory Dudek; Michael Jenkin; Chris Prahacs; Andrew Hogue; Junaed Sattar; Philippe Giguère; Andrew German; Hui Liu; Shane Saunderson; Arlene Ripsman; Saul Simhon; Luz Abril Torres; Evangelos E. Milios; Pifu Zhang; Ioannis Rekleitis

We describe recent results obtained with AQUA, a mobile robot capable of swimming, walking and amphibious operation. Designed to rely primarily on visual sensors, the AQUA robot uses vision to navigate underwater using servo-based guidance, and also to obtain high-resolution range scans of its local environment. This paper describes some of the pragmatic and logistic obstacles encountered, and provides an overview of some of the basic capabilities of the vehicle and its associated sensors. Moreover, this paper presents the first ever amphibious transition from walking to swimming.


IEEE Transactions on Robotics | 2011

A Simple Tactile Probe for Surface Identification by Mobile Robots

Philippe Giguère; Gregory Dudek

This paper describes a tactile probe designed for surface identification in the context of all-terrain low-velocity mobile robotics. The proposed tactile probe is made of a small metallic rod with a single-axis accelerometer attached near its tip. Surface identification is based on analyzing acceleration patterns induced at the tip of this mechanically robust tactile probe while it is passively dragged along a surface. A training dataset was collected over ten different indoor and outdoor surfaces. Classification results for an artificial neural network were positive, with an 89.9% and 94.6% success rate for 1- and 4-s time windows of data, respectively. We also demonstrated that the same tactile probe can be used for unsupervised learning of terrains. For 1-s time windows of data, the classification success rate was only reduced to 74.1%. Finally, a blind mobile robot, performing real-time classification of surfaces, demonstrated the feasibility of this tactile probe as a guidance mechanism.
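
A rough sketch of the classification pipeline described above is given below. It is a minimal illustration, not the authors' implementation: the sampling rate, the choice of log-magnitude FFT features over 1-s windows, and the use of scikit-learn's MLPClassifier are assumptions made for the example.

# Minimal sketch of windowed accelerometer-based surface classification
# (assumed pipeline, not the authors' code).
import numpy as np
from sklearn.neural_network import MLPClassifier

FS = 1000          # assumed accelerometer sampling rate (Hz)
WINDOW = FS * 1    # 1-second windows, the paper's shortest setting

def window_features(signal):
    """Split a 1-D acceleration trace into windows and compute log-FFT features."""
    n = len(signal) // WINDOW
    windows = signal[:n * WINDOW].reshape(n, WINDOW)
    return np.log1p(np.abs(np.fft.rfft(windows, axis=1)))

def train_classifier(traces, labels):
    """traces: one acceleration trace per surface; labels: surface identifiers."""
    X = np.vstack([window_features(t) for t in traces])
    y = np.concatenate([[l] * (len(t) // WINDOW) for t, l in zip(traces, labels)])
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500)
    clf.fit(X, y)
    return clf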


Intelligent Robots and Systems | 2008

Enabling autonomous capabilities in underwater robotics

Junaed Sattar; Gregory Dudek; Olivia Chiu; Ioannis M. Rekleitis; Philippe Giguère; Alec Mills; Nicolas Plamondon; Chris Prahacs; Yogesh A. Girdhar; Meyer Nahon; John-Paul Lobos

Underwater operations present unique challenges and opportunities for robotic applications. These can be attributed in part to limited sensing capabilities, and to locomotion behaviours requiring control schemes adapted to specific tasks or changes in the environment. From enhancing teleoperation procedures, to providing high-level instruction, all the way to fully autonomous operations, enabling autonomous capabilities is fundamental for the successful deployment of underwater robots. This paper presents an overview of the approaches used during underwater sea trials in the coral reefs of Barbados, for two amphibious mobile robots and a set of underwater sensor nodes. We present control mechanisms used for maintaining a preset trajectory during enhanced teleoperation and discuss their experimental results. This is followed by a discussion on amphibious data gathering experiments conducted on the beach. We then present a tetherless underwater communication approach based on pure vision for high-level control of an underwater vehicle. Finally, the construction details, together with preliminary results from a set of distributed underwater sensor nodes, are outlined.


Intelligent Robots and Systems | 2005

A visual servoing system for an aquatic swimming robot

Junaed Sattar; Philippe Giguère; Gregory Dudek; Chris Prahacs

This paper describes a visual servoing system for an underwater legged robotic system named AQUA and initial experiments with the system performed in the open sea. A large class of significant applications can be leveraged by allowing such a robot to follow a diver or some other moving target. The robot uses a suite of sensing technologies, primarily based on computer vision, to allow it to navigate in shallow-water environments. The visual servoing system described here allows the robot to track and follow a given target underwater. The servo package is made up of two distinct parts: a tracker and a feedback controller. The system has been evaluated in sea water under natural lighting conditions, and with minor modifications it can also be used while the robot is walking on the ground.
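
As a rough illustration of the tracker-plus-controller split described above, the sketch below maps the tracked target's pixel offset from the image centre to yaw and pitch rate commands with a proportional law. The image size, gains, and command interface are hypothetical; the actual AQUA controller is more involved.

# Minimal proportional visual-servo loop (illustrative only; gains and the
# command interface are hypothetical, not AQUA's actual controller).
IMG_W, IMG_H = 640, 480          # assumed camera resolution
K_YAW, K_PITCH = 0.002, 0.002    # hypothetical proportional gains

def servo_command(target_u, target_v):
    """Map the tracked target's pixel position to yaw/pitch rate commands."""
    err_u = target_u - IMG_W / 2.0   # horizontal error (pixels)
    err_v = target_v - IMG_H / 2.0   # vertical error (pixels)
    return -K_YAW * err_u, -K_PITCH * err_v

# Each frame: run the tracker, then send the commands to the vehicle, e.g.
#   u, v = tracker.update(frame)
#   yaw_rate, pitch_rate = servo_command(u, v)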


Robotics and Autonomous Systems | 2014

Autonomous tactile perception: A combined improved sensing and Bayesian nonparametric approach

Patrick Dallaire; Philippe Giguère; Daniel Émond; Brahim Chaib-draa

In recent years, autonomous robots have increasingly been deployed in unknown environments and required to manipulate or categorize unknown objects. In order to cope with these unfamiliar situations, improvements must be made both in sensing technologies and in the capability to autonomously train perception models. In this paper, we explore this problem in the context of tactile surface identification and categorization. Using a highly discriminant tactile probe based upon a large-bandwidth, triple-axis accelerometer that is sensitive to surface texture and material properties, we demonstrate that unsupervised learning for surface identification with this tactile probe is feasible. To this end, we derived a Bayesian nonparametric approach based on Pitman-Yor processes to model power-law distributions, an extension of our previous work using Dirichlet processes (Dallaire et al., 2011). When tested against a large collection of surfaces and without providing the actual number of surfaces, the tactile probe combined with our proposed approach demonstrated near-perfect recognition in many cases and achieved perfect recognition given the right conditions. We consider that our combined improvements demonstrate the feasibility of effective autonomous tactile perception systems.
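
The Pitman-Yor idea can be illustrated with the sequential, Chinese-restaurant form of the process: each new feature vector joins an existing surface cluster with probability proportional to (cluster size minus discount) times its likelihood under that cluster, or opens a new cluster with probability proportional to (concentration plus discount times the number of clusters). The sketch below shows only this assignment rule with a placeholder likelihood; the model in the paper is considerably richer.

# Sketch of Pitman-Yor (Chinese-restaurant style) cluster assignment for
# tactile feature vectors. Likelihood and parameter values are placeholders.
import numpy as np

def pitman_yor_assign(x, clusters, d=0.5, alpha=1.0, likelihood=None):
    """Sample a cluster index for feature vector x under a Pitman-Yor prior.

    clusters: list of lists of previously assigned feature vectors.
    d: discount (0 <= d < 1); alpha: concentration (> -d).
    likelihood: optional f(x, cluster) -> density; defaults to a constant.
    """
    lik = likelihood or (lambda x, c: 1.0)
    k = len(clusters)
    weights = [(len(c) - d) * lik(x, c) for c in clusters]  # join existing cluster
    weights.append(alpha + d * k)                           # open a new cluster
    probs = np.array(weights) / np.sum(weights)
    choice = np.random.choice(k + 1, p=probs)
    if choice == k:
        clusters.append([x])        # a previously unseen surface
    else:
        clusters[choice].append(x)  # same surface as an existing cluster
    return choice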


Intelligent Robots and Systems | 2012

Multi-domain monitoring of marine environments using a heterogeneous robot team

Florian Shkurti; Anqi Xu; Malika Meghjani; Juan Camilo Gamboa Higuera; Yogesh A. Girdhar; Philippe Giguère; Bir Bikram Dey; Jimmy Li; Arnold Kalmbach; Chris Prahacs; Katrine Turgeon; Ioannis M. Rekleitis; Gregory Dudek

In this paper we describe a heterogeneous multi-robot system for assisting scientists in environmental monitoring tasks, such as the inspection of marine ecosystems. This team of robots is comprised of a fixed-wing aerial vehicle, an autonomous airboat, and an agile legged underwater robot. These robots interact with off-site scientists and operate in a hierarchical structure to autonomously collect visual footage of interesting underwater regions, from multiple scales and mediums. We discuss organizational and scheduling complexities associated with multi-robot experiments in a field robotics setting. We also present results from our field trials, where we demonstrated the use of this heterogeneous robot team to achieve multi-domain monitoring of coral reefs, based on real-time interaction with a remotely-located marine biologist.


Robotics: Science and Systems | 2006

Environment Identification for a Running Robot Using Inertial and Actuator Cues.

Philippe Giguère; Gregory Dudek; Shane Saunderson; Chris Prahacs

In this paper, we explore the idea of using inertial and actuator information to accurately identify the environment of an amphibious robot. In particular, in our work with a legged robot we use internal sensors to measure the dynamics and interaction forces experienced by the robot. From these measurements we use simple machine learning methods to probabilistically infer properties of the environment, and therefore identify it. The robot’s gait can then be automatically selected in response to environmental changes. Experimental results show that for several environments (sand, water, snow, ice, etc.), the identification process is over 90 per cent accurate. The requisite data can be collected during a half-leg rotation (about 250 ms), making it one of the fastest and most economical environment identifiers for a dynamic robot. For the littoral setting, a gait-change experiment is done as a proof-of-concept of a robot automatically adapting its gait to suit the environment.
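
A crude version of the inference step described above might look like the sketch below: summary statistics are computed over one short window of inertial and actuator signals (on the order of the 250 ms half-leg rotation mentioned in the abstract) and fed to a simple probabilistic classifier. The feature set and the choice of Gaussian naive Bayes are assumptions made purely for illustration.

# Illustrative terrain classifier built from inertial/actuator windows
# (assumed features and classifier; not the authors' implementation).
import numpy as np
from sklearn.naive_bayes import GaussianNB

def window_stats(imu, motor_current):
    """Summary statistics over one ~250 ms window of sensor data."""
    return np.array([
        imu.mean(), imu.std(), np.abs(np.diff(imu)).mean(),
        motor_current.mean(), motor_current.std(),
    ])

def train_terrain_model(windows, labels):
    """windows: list of (imu, motor_current) arrays; labels: 'sand', 'water', ..."""
    X = np.vstack([window_stats(i, m) for i, m in windows])
    clf = GaussianNB()
    clf.fit(X, labels)
    return clf

# A gait controller could then switch gaits whenever the predicted label changes.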


The International Journal of Robotics Research | 2014

Autonomous adaptive exploration using realtime online spatiotemporal topic modeling

Yogesh A. Girdhar; Philippe Giguère; Gregory Dudek

The exploration of dangerous environments such as underwater coral reefs and shipwrecks is a difficult and potentially life-threatening task for humans, which naturally makes the use of an autonomous robotic system very appealing. This paper presents such a system, capable of autonomous exploration, and shows its use in a series of experiments to collect image data in challenging underwater marine environments. We present novel contributions on three fronts. First, we present an online topic-modeling-based technique to describe what is being observed using a low-dimensional semantic descriptor. This descriptor attempts to be invariant to observations of different corals belonging to the same species, or observations of similar types of rocks observed from different viewpoints. Second, we use the topic descriptor to compute the surprise score of the current observation. This is done by maintaining an online summary of observations thus far, and then computing the surprise score as the distance of the current observation from the summary in the topic space. Finally, we present a novel control strategy for an underwater robot that allows for intelligent traversal, hovering over surprising observations, and swimming quickly over previously seen corals and rocks.
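
The surprise mechanism described above can be summarised in a few lines: keep a small summary set of topic descriptors, score each new observation by its distance to the closest summary element, and slow down (and add the observation to the summary) when that distance is large. The distance metric, threshold, and speed policy below are illustrative assumptions, not the paper's exact formulation.

# Sketch of topic-space surprise scoring for adaptive exploration
# (distance metric, threshold, and speed policy are assumptions).
import numpy as np

class SurpriseExplorer:
    def __init__(self, threshold=0.3):
        self.summary = []          # descriptors of previously surprising scenes
        self.threshold = threshold

    def surprise(self, topic_vec):
        """Distance from the current topic descriptor to the running summary."""
        if not self.summary:
            return 1.0
        return min(np.linalg.norm(topic_vec - s) for s in self.summary)

    def step(self, topic_vec, v_max=1.0, v_min=0.1):
        """Return a forward speed: hover over novel scenes, speed past familiar ones."""
        if self.surprise(topic_vec) > self.threshold:
            self.summary.append(topic_vec)   # remember this novel observation
            return v_min                     # slow down / hover
        return v_max                         # familiar terrain: move on quickly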


Canadian Conference on Computer and Robot Vision | 2007

Fourier tags: Smoothly degradable fiducial markers for use in human-robot interaction

Junaed Sattar; Eric Bourque; Philippe Giguère; Gregory Dudek

In this paper we introduce the Fourier tag, a synthetic fiducial marker used to visually encode information and provide controllable positioning. The Fourier tag is a synthetic target akin to a bar-code that specifies multi-bit information which can be efficiently and robustly detected in an image. Moreover, the Fourier tag has the beneficial property that the bit string it encodes has variable length as a function of the distance between the camera and the target. This follows from the fact that the effective resolution decreases as an effect of perspective. This paper introduces the Fourier tag, describes its design, and illustrates its properties experimentally.
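
The graceful-degradation property has a simple one-dimensional analogue: encode bits as the amplitudes of increasing spatial frequencies along a radial intensity profile and decode them with an FFT. As the tag shrinks or blurs in the image, the high frequencies disappear first, so the low-order bits remain readable. The sketch below is a toy analogue, not the actual Fourier tag layout or decoder.

# Toy 1-D analogue of a Fourier tag: bits stored as amplitudes of spatial
# frequencies along an intensity profile (illustrative only).
import numpy as np

N = 256            # samples along the radial intensity profile
BASE_FREQ = 4      # assumed lowest encoding frequency (cycles per profile)

def encode(bits):
    x = np.arange(N)
    profile = np.zeros(N)
    for i, b in enumerate(bits):   # bit i -> frequency BASE_FREQ + i
        profile += (1.0 if b else 0.0) * np.cos(2 * np.pi * (BASE_FREQ + i) * x / N)
    return profile

def decode(profile, n_bits):
    spectrum = np.abs(np.fft.rfft(profile)) / (N / 2)   # normalised amplitudes
    return [spectrum[BASE_FREQ + i] > 0.5 for i in range(n_bits)]

# Blurring the profile (viewing the tag from farther away) wipes out the
# high-frequency bits first, while the low-frequency bits stay decodable.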

Collaboration


Dive into Philippe Giguère's collaborations.

Top Co-Authors

Ioannis M. Rekleitis

University of South Carolina

Yogesh A. Girdhar

Woods Hole Oceanographic Institution
