
Publications


Featured research published by Silvio M. Maeta.


Proceedings of the IEEE | 2003

Internet-based solutions in the development and operation of an unmanned robotic airship

Josué Jr. Guimarães Ramos; Silvio M. Maeta; Luiz G. B. Mirisola; Samuel Siqueira Bueno; Marcel Bergerman; Bruno G. Faria; Gabriel E. M. Pinto; Augusto H. Bruciapaglia

Internet-based robotic systems play a different role in the development and operation of aerial robots than they do for ground robots. For ground robotic vehicles, the Internet is useful for remote operation and remote development; for aerial vehicles, safety must additionally be considered very carefully. This paper addresses the requirements for the specific case of Internet-connected aerial robots. It describes the conceptual and implementation aspects of an Internet-based software architecture for the AURORA unmanned autonomous airship project, showing its use in the different project phases.


Intelligent Robots and Systems | 1999

Development of a VRML/Java unmanned airship simulating environment

Josué Jr. Guimarães Ramos; Silvio M. Maeta; Marcel Bergerman; Samuel Siqueira Bueno; Luiz G. B. Mirisola; Augusto H. Bruciapaglia

We present one of the first Internet-accessible airship simulators, based on a comprehensive airship dynamic model. The simulator is meant to be used as a tool for the development of control and navigation methods for autonomous and semi-autonomous robotic airships, and as a testbed for airship pilot training. Realistic views of both the airship in flight and of a virtual pilot are provided, as are commands to apply thrust to the engines, swivel them up and down, and deflect the control surfaces. This work is significant for providing robotics researchers with a means to safely experiment with airship control.


Intelligent Robots and Systems | 2013

3D perception for accurate row following: Methodology and results

Ji Zhang; Andrew Chambers; Silvio M. Maeta; Marcel Bergerman; Sanjiv Singh

Rows of trees planted in straight parallel lines, such as in orchards, can provide navigation cues for autonomous machines that operate between them. When the tree canopies are well managed, tree rows resemble corridor walls and a simple 2D sensing scheme suffices. However, when the tree canopies are three-dimensional, or ground vegetation occludes the tree trunks, a three-dimensional sensing mode is necessary. An additional complication under prolific canopies is that GPS is unreliable and hence unsuitable for registering data from sensors onboard a traversing vehicle. Here, we present a method to register 3D data from a lidar sensor onboard a vehicle that must accurately determine its pose relative to the rows. We first register the point clouds into a common reference frame and then determine the position of nearby tree rows and trunks to estimate the vehicle pose. Our method is tested online and with data from commercial orchards. Experimental results show that the accuracy is sufficient to enable accurate traversal between tree rows even when the tree canopies do not approximate planar walls.
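The row-following idea in the abstract above, registered lidar points on either side yielding the vehicle's pose relative to the rows, can be illustrated with a minimal sketch. This is not the authors' code; it assumes points have already been registered, segmented into left and right rows, and projected onto the ground plane (vehicle frame: x forward, y left), and it fits each row with a least-squares line to recover heading error and lateral offset.

```python
import numpy as np

def row_pose(left_pts, right_pts):
    """Estimate heading error (rad) and lateral offset (m) of a vehicle
    between two tree rows, given 2D points on each row in the vehicle
    frame (x forward, y left). Each side is fit with a line y = a*x + b."""
    def fit(pts):
        x, y = pts[:, 0], pts[:, 1]
        A = np.column_stack([x, np.ones_like(x)])
        (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
        return a, b

    a_l, b_l = fit(left_pts)
    a_r, b_r = fit(right_pts)
    heading = np.arctan((a_l + a_r) / 2.0)  # row direction vs. vehicle x-axis
    offset = (b_l + b_r) / 2.0              # signed distance from row centerline
    return heading, offset

# Vehicle perfectly centered between straight rows 4 m apart:
xs = np.linspace(0.0, 10.0, 50)
left = np.column_stack([xs, np.full_like(xs, 2.0)])
right = np.column_stack([xs, np.full_like(xs, -2.0)])
h, d = row_pose(left, right)
```

A real system would use a robust fit (e.g., RANSAC) to reject canopy outliers; plain least squares keeps the sketch short.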


Journal of Field Robotics | 2017

Robust Autonomous Flight in Constrained and Visually Degraded Shipboard Environments

Zheng Fang; Shichao Yang; Sezal Jain; Geetesh Dubey; Stephan Roth; Silvio M. Maeta; Stephen Nuske; Yu Zhang; Sebastian Scherer

This paper addresses the problem of autonomous navigation of a micro aerial vehicle (MAV) for inspection and damage assessment inside a constrained shipboard environment, which might be perilous or inaccessible for humans, especially in emergency scenarios. The environment is GPS-denied and visually degraded, containing narrow passageways, doorways, and small objects protruding from the walls. This causes existing two-dimensional LIDAR, vision, or mechanical bumper-based autonomous navigation solutions to fail. To realize autonomous navigation in such challenging environments, we first propose a robust state estimation method that fuses estimates from a real-time odometry estimation algorithm and a particle filtering localization algorithm with other sensor information in a two-layer fusion framework. Then, an online motion-planning algorithm that combines trajectory optimization with a receding horizon control framework is proposed for fast obstacle avoidance. All computations are done in real time on the onboard computer. We validate the system by running experiments under different environmental conditions in both laboratory and practical shipboard environments. The field experiment results of over 10 runs show that our vehicle can robustly navigate corridors 20 m long and only 1 m wide, and pass through a very narrow doorway (66 cm wide, with only 4 cm of clearance on each side) autonomously, even when the environment is completely dark or full of light smoke. These experiments show that despite the challenges of flying robustly in shipboard environments, it is possible to use a MAV to autonomously fly into a confined shipboard environment and rapidly gather situational information to guide firefighting and rescue efforts.


IEEE Robotics & Automation Magazine | 2015

Robot Farmers: Autonomous Orchard Vehicles Help Tree Fruit Production

Marcel Bergerman; Silvio M. Maeta; Ji Zhang; Gustavo M. Freitas; Bradley Hamner; Sanjiv Singh; George Kantor

This article presents perception and navigation systems for a family of autonomous orchard vehicles. The systems are customized to enable safe and reliable driving in modern planting environments. The perception system is based on a global positioning system (GPS)-free sensor suite composed of a two-dimensional (2-D) laser scanner, wheel and steering encoders, and algorithms that process the sensor data and output the vehicle's location in the orchard along with guidance commands for row following and turning. Localization is based on range data to premapped landmarks, currently one at the beginning and one at the end of each tree row. The navigation system takes as inputs the vehicle's current location and guidance commands, plans trajectories for row following and turning, and drives the motors to achieve fully autonomous block coverage. The navigation system also includes an obstacle detection subsystem that prevents the vehicle from colliding with people, trees, and bins. To date, the vehicles sporting the perception and navigation infrastructure have traversed over 350 km in research and commercial orchards and nurseries in several U.S. states. Time trials showed that the autonomous orchard vehicles enable efficiency gains of up to 58% for fruit production tasks conducted on the top part of trees when compared with the same tasks performed on ladders. Anecdotal evidence collected from growers and workers indicates that replacing ladders with autonomous vehicles will make orchard work safer and more comfortable.
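The localization scheme described above, range measurements to premapped landmarks at the ends of each row, can be illustrated by a generic two-circle intersection. This is a sketch of the general technique, not the authors' implementation: given ranges r1 and r2 to two known landmarks, there are two candidate positions, which a real system would disambiguate with a motion prior.

```python
import math

def localize(l1, l2, r1, r2):
    """Candidate 2D positions given ranges r1, r2 to known landmarks
    l1, l2 (classic two-circle intersection). Returns both solutions,
    mirrored across the baseline between the landmarks."""
    (x1, y1), (x2, y2) = l1, l2
    d = math.hypot(x2 - x1, y2 - y1)        # baseline length
    a = (r1**2 - r2**2 + d**2) / (2 * d)    # distance from l1 along baseline
    h = math.sqrt(max(r1**2 - a**2, 0.0))   # perpendicular offset
    ex, ey = (x2 - x1) / d, (y2 - y1) / d   # unit vector l1 -> l2
    px, py = x1 + a * ex, y1 + a * ey       # foot point on the baseline
    return (px - h * ey, py + h * ex), (px + h * ey, py - h * ex)
```

With landmarks at the two ends of a row, the correct candidate is simply the one on the vehicle's side of the row line.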


Revised Papers from the International Workshop on Sensor Based Intelligent Robots | 2000

Modelling, Control and Perception for an Autonomous Robotic Airship

Alberto Elfes; Samuel Siqueira Bueno; Josué Jr. Guimarães Ramos; Ely Carneiro de Paiva; Marcel Bergerman; José R. H. Carvalho; Silvio M. Maeta; Luiz G. B. Mirisola; Bruno G. Faria; José Raul Azinheira

Robotic unmanned aerial vehicles have an enormous potential as observation and data-gathering platforms for a wide variety of applications. These applications include environmental and biodiversity research and monitoring, urban planning and traffic control, inspection of man-made structures, mineral and archaeological prospecting, surveillance and law enforcement, communications, and many others. Robotic airships, in particular, are of great interest as observation platforms, due to their potential for extended mission times, low platform vibration characteristics, and hovering capability. In this paper we provide an overview of Project AURORA (Autonomous Unmanned Remote Monitoring Robotic Airship), a research effort that focusses on the development of the technologies required for substantially autonomous robotic airships. We discuss airship modelling and control, autonomous navigation, and sensor-based flight control. We also present the hardware and software architectures developed for the airship. Additionally, we discuss our current research in airborne perception and monitoring, including mission-specific target acquisition, discrimination and identification tasks. The paper also presents experimental results from our work.


Field and Service Robotics | 2016

Robust Autonomous Flight in Constrained and Visually Degraded Environments

Zheng Fang; Shichao Yang; Sezal Jain; Geetesh Dubey; Silvio M. Maeta; Stephan Roth; Sebastian Scherer; Yu Zhang; Stephen Nuske

This paper addresses the problem of autonomous navigation of a micro aerial vehicle (MAV) inside a constrained shipboard environment for inspection and damage assessment, which might be perilous or inaccessible for humans, especially in emergency scenarios. The environment is GPS-denied and visually degraded, containing narrow passageways, doorways, and small objects protruding from the walls. This makes existing 2D LIDAR, vision, or mechanical bumper-based autonomous navigation solutions fail. To realize autonomous navigation in such challenging environments, we propose a fast and robust state estimation algorithm that fuses estimates from a direct depth odometry method and a Monte Carlo localization algorithm with other sensor information in an EKF framework. Then, an online motion planning algorithm that combines trajectory optimization with a receding horizon control framework is proposed for fast obstacle avoidance. All computations are done in real time onboard our customized MAV platform. We validate the system by running experiments in different environmental conditions. The results of over 10 runs show that our vehicle robustly navigates corridors 20 m long and only 1 m wide, and passes through a very narrow doorway (66 cm wide, with only 4 cm of clearance on each side) completely autonomously, even when the environment is completely dark or full of light smoke.
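The fusion step named in the abstract above, combining odometry and Monte Carlo localization estimates in an EKF framework, boils down to the standard Kalman measurement update. The sketch below shows only that generic update, not the paper's two-layer architecture; treating the localization module's pose output as a direct measurement of the state (H = identity) is an assumption made here for illustration.

```python
import numpy as np

def ekf_fuse(x, P, z, R, H=None):
    """One Kalman measurement update: fuse a state estimate (x, P) with a
    measurement z of covariance R. H defaults to identity, i.e. the
    localization module is assumed to measure the pose directly."""
    x = np.atleast_1d(np.asarray(x, dtype=float))
    z = np.atleast_1d(np.asarray(z, dtype=float))
    P = np.atleast_2d(np.asarray(P, dtype=float))
    R = np.atleast_2d(np.asarray(R, dtype=float))
    n = x.size
    H = np.eye(n) if H is None else np.atleast_2d(H)

    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x + K @ (z - H @ x)         # corrected state
    P_new = (np.eye(n) - K @ H) @ P     # corrected covariance
    return x_new, P_new
```

With equal confidence in prediction and measurement, the update lands halfway between them and halves the uncertainty, which is the behavior the fused estimator relies on.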


International Conference on Robotics and Automation | 2001

Autonomous flight experiment with a robotic unmanned airship

Josué Jr. Guimarães Ramos; E.C. de Paiva; José Raul Azinheira; Samuel Siqueira Bueno; Silvio M. Maeta; Luiz G. B. Mirisola; Marcel Bergerman; Bruno G. Faria


Montreal, Quebec, Canada, July 13–16, 2014 | 2014

Mapping Orchards for Autonomous Navigation

Ji Zhang; Silvio M. Maeta; Marcel Bergerman; Sanjiv Singh


Lecture Notes in Computer Science | 2002

Modelling, control and perception for an autonomous Robotic airship

Alberto Elfes; Samuel Siqueira Bueno; Josué Jr. Guimarães Ramos; Ely Carneiro de Paiva; Marcel Bergerman; José R. H. Carvalho; Silvio M. Maeta; Luiz G. B. Mirisola; Bruno G. Faria; José Raul Azinheira

Collaboration


Dive into Silvio M. Maeta's collaborations.

Top Co-Authors


Marcel Bergerman

Federal University of Rio de Janeiro


Sanjiv Singh

Carnegie Mellon University


Ji Zhang

Carnegie Mellon University


Sebastian Scherer

Carnegie Mellon University
