
Publication


Featured research published by Brian Yamauchi.


Unmanned Ground Vehicle Technology Conference | 2004

PackBot: a versatile platform for military robotics

Brian Yamauchi

The iRobot PackBot is a combat-tested, man-portable UGV that has been deployed in Afghanistan and Iraq. The PackBot is also a versatile platform for mobile robotics research and development that supports a wide range of payloads suitable for many different mission types. In this paper, we describe four R&D projects that developed experimental payloads and software using the PackBot platform. CHARS was a rapid development project to develop a chemical/radiation sensor for the PackBot. We developed the CHARS payload in six weeks and deployed it to Iraq to search for chemical and nuclear weapons. Griffon was a research project to develop a flying PackBot that combined the capabilities of a UGV and a UAV. We developed a Griffon prototype equipped with a steerable parafoil and gasoline-powered motor, and we completed successful flight tests including remote-controlled launch, ascent, cruising, descent, and landing. Valkyrie is an ongoing research and development project to develop a PackBot payload that will assist medics in retrieving casualties from the battlefield. Wayfarer is an applied research project to develop autonomous urban navigation capabilities for the PackBot using laser, stereo vision, GPS, and INS sensors.


Proceedings of SPIE, the International Society for Optical Engineering | 2006

Autonomous urban reconnaissance using man-portable UGVs

Brian Yamauchi

For the Wayfarer Project, funded by the US Army through TARDEC, we have developed technologies that enable man-portable PackBot Wayfarer UGVs to perform autonomous reconnaissance in urban terrain. Each Wayfarer UGV can autonomously follow urban streets and building perimeters while avoiding obstacles and building a map of the terrain. Each UGV is equipped with a 3D stereo vision system, a 360-degree planar LIDAR, GPS, INS, compass, and odometry. The Hough transform is applied to LIDAR range data to detect building walls for street following and perimeter following. We have demonstrated Wayfarer's ability to autonomously follow roads in urban and rural environments, while building a map of the surrounding terrain. Recently, we have developed a ruggedized version of the Wayfarer Navigation Payload for use in rough terrain and all-weather conditions. The new payload incorporates a compact Tyzx G2 stereo vision module and a high-performance Athena Guidestar INS/GPS unit.
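
The Hough-transform wall detection mentioned above can be sketched as follows: scan points are converted to Cartesian coordinates, each point votes for every candidate line (rho, theta) passing through it, and the accumulator peak gives the dominant wall. This is an illustrative reconstruction, not the Wayfarer implementation; the function name and resolution parameters are assumptions.

```python
import numpy as np

def detect_wall_hough(ranges, bearings, theta_res=np.deg2rad(1.0), rho_res=0.1):
    """Detect the dominant straight wall in a planar LIDAR scan.

    ranges/bearings: polar scan (meters, radians). Returns (rho, theta) of the
    line rho = x*cos(theta) + y*sin(theta) that received the most votes.
    Illustrative sketch only; parameter values are hypothetical.
    """
    x = ranges * np.cos(bearings)
    y = ranges * np.sin(bearings)
    thetas = np.arange(0.0, np.pi, theta_res)
    rho_max = ranges.max() + rho_res
    n_rho = int(2 * rho_max / rho_res) + 1
    acc = np.zeros((n_rho, len(thetas)), dtype=int)
    for xi, yi in zip(x, y):
        # Each point votes for all lines through it, one per theta bin.
        rhos = xi * np.cos(thetas) + yi * np.sin(thetas)
        idx = np.round((rhos + rho_max) / rho_res).astype(int)
        acc[idx, np.arange(len(thetas))] += 1
    r_i, t_i = np.unravel_index(np.argmax(acc), acc.shape)
    return r_i * rho_res - rho_max, thetas[t_i]
```

In practice, the inliers of the peak line would be handed to the street- and perimeter-following behaviors as the wall to track.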


Industrial Robot: An International Journal | 2004

Griffon: a man‐portable hybrid UGV/UAV

Brian Yamauchi; Pavlo Rudakevych

To demonstrate proof‐of‐concept of the Griffon man‐portable hybrid unmanned ground vehicle/unmanned aerial vehicle (UGV/UAV) based on the iRobot PackBot, we developed the Griffon air mobility system (AMS), consisting of a gasoline‐powered propeller engine, a steerable parafoil, and a radio‐controlled servo system. We integrated the AMS with a PackBot prototype, and we conducted ground and flight tests to validate this concept. The Griffon prototype was capable of remote‐controlled flight, take‐off, and landing. The Griffon achieved speeds of over 20 mph and altitudes of up to 200 feet. We demonstrated the feasibility of developing a man‐portable hybrid UGV/UAV. Future work may explore the possibilities for teleoperated, semi‐autonomous, and fully autonomous control using the Griffon concept. The parafoil wing limits the usability of this vehicle in windy conditions, but this could be addressed using a lightweight fixed wing instead. Man‐portable hybrid UGV/UAVs may be used by the military to perform reconnaissance...


Defense and Security Symposium | 2007

Daredevil: ultra-wideband radar sensing for small UGVs

Brian Yamauchi

We are developing an ultra wideband (UWB) radar sensor payload for the man-portable iRobot PackBot UGV. Our goal is to develop a sensor array that will allow the PackBot to navigate autonomously through foliage (such as tall grass) while avoiding obstacles and building a map of the terrain. We plan to use UWB radars in conjunction with other sensors such as LIDAR and vision. We propose an algorithm for using polarimetric (dual-polarization) radar arrays to classify radar returns as either vertically-aligned foliage or solid objects based on their differential reflectivity, a function of their aspect ratio. We have conducted preliminary experiments to measure the ability of UWB radars to detect solid objects through foliage. Our initial results indicate that UWB radars are very effective at penetrating sparse foliage, but less effective at penetrating dense foliage.
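
The proposed polarimetric classification can be illustrated with a differential-reflectivity check: vertically aligned scatterers such as grass stalks reflect the vertical polarization more strongly, driving Z_DR negative, while roughly isotropic solid objects give Z_DR near zero. The threshold value below is a hypothetical illustration, not a figure from the paper.

```python
import math

def classify_return(p_h, p_v, zdr_thresh_db=-3.0):
    """Classify a dual-polarization radar return by differential reflectivity.

    Z_DR = 10*log10(P_H / P_V), the power ratio of the horizontally and
    vertically polarized returns in dB. Strongly negative Z_DR suggests a
    vertically aligned (high aspect ratio) scatterer, i.e. foliage; values
    near zero suggest a solid object. Threshold is an assumption.
    """
    zdr = 10.0 * math.log10(p_h / p_v)
    return "foliage" if zdr < zdr_thresh_db else "solid"
```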


Proceedings of SPIE, the International Society for Optical Engineering | 2008

All-weather perception for small autonomous UGVs

Brian Yamauchi

For the TARDEC-funded Daredevil Project, iRobot Corporation is developing capabilities that will allow small UGVs to navigate autonomously in adverse weather and in foliage. Our system will fuse sensor data from ultra wideband (UWB) radar, LIDAR, stereo vision, GPS, and INS to build maps of the environment showing which areas are passable (e.g. covered by tall grass) and which areas must be avoided (i.e. solid obstacles). In Phase I of this project, we demonstrated that UWB radar sensors can see through precipitation, smoke/fog, and foliage and detect solid obstacles. In Phase II, we are integrating all of these sensors with an iRobot PackBot. By the end of Phase II, we will demonstrate a fully-autonomous Daredevil PackBot that can avoid obstacles, build maps, and navigate to waypoints in all-weather conditions and through foliage.


Defense and Security Symposium | 2007

A man portable hybrid UAV/UGV system

Pavlo Rudakevych; Brian Yamauchi

We developed and demonstrated a UAV package that works in conjunction with the PackBot UGV to allow medium range missions. Both the UAV and UGV are man-portable, and the combined system can be used from unimproved airfields such as soccer pitches. The UAV is capable of up to 75 lbs of payload, while weighing less than 30 lbs. This document describes the initial proof of concept prototype, the associated ground and flight tests, and areas for further development.


International Conference on Robotics and Automation | 2010

All-weather perception for man-portable robots using ultra-wideband radar

Brian Yamauchi

Autonomous man-portable robots have the potential to provide a wide range of new capabilities for both military and civilian applications. Previous research in autonomy for small robots has focused on vision, LIDAR, and sonar sensors. While vision and LIDAR work well in clear weather, they are seriously impaired by rain, snow, fog, and smoke. Sonar can penetrate adverse weather, but has limited range outdoors, and suffers from specular reflections indoors. For the Daredevil Project, we have investigated the use of ultra-wideband (UWB) radar to provide obstacle detection capabilities for man-portable robots. Our research shows that UWB radar can effectively penetrate adverse weather, including dense fog, and detect obstacles that would be undetectable by vision or LIDAR under the same conditions. We have developed filtering algorithms that process the raw radar returns to eliminate reflections from ground clutter and make obstacles easier to detect. We have tested this system on an iRobot PackBot equipped with both UWB radar and LIDAR, and we have demonstrated how UWB radar can be used for obstacle detection in obscured environments.
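
One minimal form of the clutter filtering described above is background subtraction against a stored clutter-only range profile, followed by thresholding. This is a sketch under that assumption, not the paper's actual filter chain; names and the threshold are hypothetical.

```python
import numpy as np

def filter_clutter(scan, background, thresh=0.2):
    """Suppress static ground clutter in a raw UWB radar range profile.

    Subtracts a stored background (clutter-only) profile, rectifies the
    residual, and thresholds it; returns the range-bin indices of candidate
    obstacle returns. Illustrative only.
    """
    residual = np.abs(scan - background)
    return np.flatnonzero(residual > thresh)
```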


Proceedings of SPIE | 2010

Fusing ultra-wideband radar and lidar for small UGV navigation in all-weather conditions

Brian Yamauchi

Autonomous small UGVs have the potential to greatly increase force multiplication capabilities for infantry units. In order for these UGVs to be useful on the battlefield, they must be able to operate under all-weather conditions. For the Daredevil Project, we have explored the use of ultra-wideband (UWB) radar, LIDAR, and stereo vision for all-weather navigation capabilities. UWB radar provides the capability to see through rain, snow, smoke, and fog. LIDAR and stereo vision provide greater accuracy and resolution in clear weather but have difficulty with precipitation and obscurants. We investigate the ways in which the sensor data from UWB radar, LIDAR, and stereo vision can be combined to provide improved performance over the use of a single sensor modality. Our research includes both traditional sensor fusion, where data from multiple sensors is combined in a single representation, and behavior-based sensor fusion, where the data from one sensor is used to activate and deactivate behaviors using other sensor modalities. We use traditional sensor fusion to combine LIDAR and stereo vision for improved obstacle avoidance in clear air, and we use behavior-based sensor fusion to select between radar-based and LIDAR/vision-based obstacle avoidance based on current environmental conditions.
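
Behavior-based sensor fusion of this kind can be sketched as an arbiter that activates one whole avoidance behavior at a time rather than merging raw data. Using the fraction of valid LIDAR returns as the environmental-condition estimate, and the hysteresis thresholds, are illustrative assumptions, not details from the paper.

```python
class BehaviorSelector:
    """Behavior-based fusion: gate entire obstacle-avoidance behaviors on/off
    based on an estimate of current conditions. Hysteresis (separate on/off
    thresholds) avoids chattering between behaviors near the boundary.
    Threshold values are hypothetical."""

    def __init__(self, on_thresh=0.85, off_thresh=0.65):
        self.on_thresh = on_thresh    # switch to LIDAR/vision above this
        self.off_thresh = off_thresh  # fall back to radar below this
        self.active = "lidar_vision_avoidance"

    def update(self, lidar_return_fraction):
        if self.active == "lidar_vision_avoidance":
            if lidar_return_fraction < self.off_thresh:
                self.active = "radar_avoidance"  # obscurants degrade LIDAR
        elif lidar_return_fraction > self.on_thresh:
            self.active = "lidar_vision_avoidance"  # clear air again
        return self.active
```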


Proceedings of SPIE | 2011

Driver assist behaviors for high-speed small UGVs

Brian Yamauchi

Currently deployed small UGVs operate at speeds up to around 6 mph and have proven their usefulness in explosive ordnance disposal (EOD) missions. As part of the TARDEC-funded Stingray Project, iRobot is investigating techniques to increase the speed of small UGVs so they can be useful in a wider range of missions, such as high-speed reconnaissance and infantry assault missions. We have developed a prototype Stingray PackBot, using wheels rather than tracks, that is capable of traveling at speeds up to 18 mph. A key issue when traveling at such speeds is how to maintain stability during sharp turns and over rough terrain. We are developing driver assist behaviors that will provide dynamic stability control for high-speed small UGVs using techniques such as dynamic weight shifting to limit oversteer and understeer. These driver assist behaviors will enable operators to use future high-speed small UGVs in high-optempo infantry missions and keep warfighters out of harm's way.


Proceedings of SPIE | 2009

Stingray: high-speed control of small UGVs in urban terrain

Brian Yamauchi; Kent Massey

For the TARDEC-funded Stingray Project, iRobot Corporation and Chatten Associates are developing technologies that will allow small UGVs to operate at tactically useful speeds. In previous work, we integrated a Chatten Head-Aimed Remote Viewer (HARV) with an iRobot Warrior UGV, and used the HARV to drive the Warrior, as well as a small, high-speed, gas-powered UGV surrogate. In this paper, we describe our continuing work implementing semi-autonomous driver-assist behaviors to help an operator control a small UGV at high speeds. We have implemented an IMU-based heading control behavior that enables tracked vehicles to maintain accurate heading control even over rough terrain. We are also developing a low-latency, low-bandwidth, high-quality digital video protocol to support immersive visual telepresence. Our experiments show that a video compression codec using the H.264 algorithm can produce several times better resolution than a Motion JPEG video stream, while utilizing the same limited bandwidth, and the same low latency. With further enhancements, our H.264 codec will provide an order of magnitude greater quality, while retaining a low latency comparable to Motion JPEG, and operating within the same bandwidth.
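
An IMU-based heading control behavior for a tracked (skid-steer) vehicle can be sketched as a proportional controller on the wrapped heading error, converted to differential track speeds. The gain, half-track width, and function name below are hypothetical, not values from the Stingray work.

```python
import math

def heading_control(target_hdg, imu_hdg, forward_speed, kp=2.0):
    """Hold a commanded heading on a skid-steer vehicle using IMU feedback.

    Computes a turn-rate command proportional to the shortest-path heading
    error (wrapped to [-pi, pi] via atan2) and maps it to left/right track
    speeds. All constants are illustrative assumptions.
    """
    err = math.atan2(math.sin(target_hdg - imu_hdg),
                     math.cos(target_hdg - imu_hdg))  # wrapped error, rad
    turn_rate = kp * err        # commanded yaw rate, rad/s
    half_track = 0.25           # half the track separation in meters (assumed)
    left = forward_speed - turn_rate * half_track
    right = forward_speed + turn_rate * half_track
    return left, right
```

Over rough terrain, the IMU's yaw estimate keeps the error signal valid even when track slip makes odometry-based heading unreliable, which is the motivation for closing the loop on inertial heading.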
