Ricardo Mendonça
Universidade Nova de Lisboa
Publications
Featured research published by Ricardo Mendonça.
Robotics and Biomimetics | 2014
Eduardo Pinto; Francisco Marques; Ricardo Mendonça; André Lourenço; Pedro F. Santana; José Barata
This paper presents RIVERWATCH, an autonomous surface-aerial marsupial robotic team for riverine environmental monitoring. The robotic system is composed of an Autonomous Surface Vehicle (ASV) piggybacking a multirotor Unmanned Aerial Vehicle (UAV) with vertical take-off and landing capabilities. The ASV provides the team with long-range transportation in all-weather conditions, whereas the UAV assures an augmented perception of the environment. The coordinated aerial, underwater, and surface-level perception allows the team to assess navigation cost from the near field to the far field, which is key for safe navigation and for gathering environmental monitoring data. The robotic system is validated on a set of field trials.
Robotics and Biomimetics | 2012
Pedro F. Santana; Ricardo Mendonça; José Barata
This paper proposes a model for water detection in video sequences, which is a key asset of any robot operating in natural environments. By searching the visual input for water's typically chaotic dynamic texture, the model is able to filter out the static background and even any dynamic object present in the scene. In this work, the water's signature is defined, mostly, in terms of an entropy measure computed from the optical flow obtained across several frames. To foster the classification of motionless regions in the visual input, usually associated with the far field, a segmentation-guided label propagation method is used. The model is experimentally validated on 12 diverse videos, acquired from static and moving cameras.
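The central cue, high temporal entropy of the optical-flow direction, can be sketched roughly as follows. The grid size, bin count, and threshold are illustrative assumptions rather than the paper's actual parameters, and OpenCV's Farneback flow stands in for whatever flow estimator the authors used.

```python
# Minimal sketch of the entropy-of-optical-flow cue for water detection.
# Parameters (grid_size, n_bins, entropy_threshold) are illustrative assumptions.
import cv2
import numpy as np


def water_mask_from_flow(frames, grid_size=16, n_bins=8, entropy_threshold=1.8):
    """Label grid cells whose optical-flow direction is chaotic over time as water."""
    h, w = frames[0].shape[:2]
    gh, gw = h // grid_size, w // grid_size
    hist = np.zeros((gh, gw, n_bins), dtype=np.float64)

    prev = cv2.cvtColor(frames[0], cv2.COLOR_BGR2GRAY)
    for frame in frames[1:]:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        flow = cv2.calcOpticalFlowFarneback(prev, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        ang = np.arctan2(flow[..., 1], flow[..., 0])            # flow direction per pixel
        bins = ((ang + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
        for b in range(n_bins):
            # Fraction of pixels in direction bin b per grid cell (area-averaged resize).
            cell = cv2.resize((bins == b).astype(np.float32), (gw, gh),
                              interpolation=cv2.INTER_AREA)
            hist[..., b] += cell
        prev = gray

    p = hist / np.maximum(hist.sum(axis=-1, keepdims=True), 1e-9)
    entropy = -np.sum(p * np.log(p + 1e-9), axis=-1)            # Shannon entropy per cell
    return entropy > entropy_threshold                           # True where motion is chaotic (water-like)
```

Cells with coherent flow (road, vegetation, camera ego-motion) keep a peaked direction histogram and low entropy, whereas rippling water accumulates a near-uniform histogram and crosses the threshold.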
Systems, Man and Cybernetics | 2013
Ricardo Mendonça; Pedro F. Santana; Francisco Marques; André Lourenço; João de Abreu e Silva; José Barata
Testing and debugging real hardware is a time-consuming task, in particular in the case of aquatic robots, which must be transported to and deployed on the water. Performing waterborne and airborne field experiments with expensive hardware embedded in not yet fully functional prototypes is a highly risky endeavour. In this sense, physics-based 3D simulators are key for a fast-paced and affordable development of such robotic systems. This paper contributes a modular, open-source, ROS-based multi-robot simulator, soon to be freely available online, focused on aerial and water-surface vehicles. The simulator is being developed as part of the RIVERWATCH experiment within the European FP7 ECHORD project. This experiment aims at demonstrating a multi-robot system for remote monitoring of riverine environments.
Doctoral Conference on Computing, Electrical and Industrial Systems | 2014
Eduardo Pinto; Pedro F. Santana; Francisco Marques; Ricardo Mendonça; André Lourenço; José Barata
This paper presents the core ideas of the RIVERWATCH experiment and describes its hardware architecture. The RIVERWATCH experiment considers the use of autonomous surface vehicles piggybacking multi-rotor unmanned aerial vehicles for the automatic monitoring of riverine environments. While the surface vehicle benefits from the aerial vehicle to extend its field of view, the aerial vehicle benefits from the surface vehicle to ensure long-range mobility. This symbiotic relation between the two robots is expected to enhance the robustness and endurance of the ensemble. The hardware architecture includes a considerable set of state-of-the-art sensory modalities and is abstracted from the perception and navigation algorithms through the Robot Operating System (ROS). A set of field trials shows the ability of the prototype to scan a closed water body. The datasets obtained from the field trials are freely available to the robotics community.
Oceans Conference | 2016
Ricardo Mendonça; M. Marques; Francisco Marques; André Lourenço; Eduardo Pinto; Pedro F. Santana; Fernando Vieira Coito; Victor Lobo; José Barata
The sea's vast extent makes a pre-emptive, long-lasting search for shipwreck survivors difficult. The operational cost of deploying manned teams with such a proactive strategy is high and, thus, these teams are only deployed reactively once a disaster such as a shipwreck has been reported. To reduce the financial costs involved, unmanned robotic systems could instead be used as background surveillance teams patrolling the seas. In this sense, a robotic team for search and rescue (SAR) operations at sea is presented in this work. Composed of an Unmanned Surface Vehicle (USV) piggybacking a watertight Unmanned Aerial Vehicle (UAV) with vertical take-off and landing capabilities, the proposed cooperative system is capable of searching for, tracking, and providing basic life support to human survivors while reporting their position to better-prepared manned rescue teams. The USV provides long-range transportation of the UAV and of basic survival kits for victims. The UAV assures an augmented perception of the environment due to its high vantage point.
Applied Soft Computing | 2013
Pedro F. Santana; Ricardo Mendonça; Luis M. Correia; José Barata
This paper extends an existing saliency-based model for path detection and tracking so that the appearance of the path being followed can be learned and used to bias the saliency computation process. The goal is to reduce ambiguities in the presence of strong distractors. In both the original and the extended path detectors, neural and swarm models are layered in order to attain a hybrid solution. With generalisation to other tasks in mind, these detectors are presented as instances of a generic neural-swarm layered architecture for visual saliency computation. The architecture considers a swarm-based substrate for the extraction of high-level perceptual representations, given the low-level perceptual representations extracted by a neural-based substrate. The goal of this division of labour is to ensure parallelism across the vision system while maintaining scalability and tractability. The proposed model is shown to exhibit, at 20 Hz, a 98.67% success rate on a diverse dataset composed of 39 videos encompassing a total of 29,789 640×480 frames. An open-source implementation of the model, fully encapsulated as a node of the Robot Operating System (ROS), is available for download.
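A minimal sketch of the appearance-biasing idea, assuming a colour-histogram backprojection model of the tracked path; this stands in for the paper's actual learning mechanism, and the bottom-up saliency computation itself is not reproduced here.

```python
# Hedged sketch: bias a bottom-up saliency map with the learned appearance
# of the path currently being followed. Histogram backprojection is an
# illustrative stand-in for the paper's appearance-learning mechanism.
import cv2
import numpy as np


def learn_path_appearance(frame_hsv, path_mask, bins=(30, 32)):
    """Hue-saturation histogram of pixels currently labelled as path."""
    hist = cv2.calcHist([frame_hsv], [0, 1], path_mask, list(bins), [0, 180, 0, 256])
    cv2.normalize(hist, hist, 0, 255, cv2.NORM_MINMAX)
    return hist


def biased_saliency(frame_hsv, saliency_map, path_hist, weight=0.5):
    """Blend bottom-up saliency with a top-down path-appearance likelihood."""
    likelihood = cv2.calcBackProject([frame_hsv], [0, 1], path_hist,
                                     [0, 180, 0, 256], scale=1)
    likelihood = likelihood.astype(np.float32) / 255.0
    return (1.0 - weight) * saliency_map + weight * likelihood
```

Regions resembling the learned path appearance are boosted, so strong distractors that are merely salient but look nothing like the path lose influence on the final map.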
Conference of the Industrial Electronics Society | 2016
João L. Gomes; Francisco Marques; André Lourenço; Ricardo Mendonça; Pedro F. Santana; José Barata
The proposed telemetry system consists of a graphical user interface with multiple reconfigurable widgets whose transmission is prioritised based on the user's gaze. The approach encompasses a novel framework for telemetry of remote machines, based on ROS (Robot Operating System). It includes a modular ensemble of a GUI, gaze-device interoperability, and a ROS sensory-topic modulator, which alternates between different strategies to optimise the transmission quality of all displayed information in a way that best suits the user. The approach was validated in a teleoperation scenario in which multiple users controlled a robot through a designed course.
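A rough sketch of what a gaze-driven topic modulator could look like as a rospy node: topic names, message types, the gaze-focus encoding, and the fixed decimation factor are assumptions for illustration, not the published system's interfaces.

```python
#!/usr/bin/env python
# Sketch of a gaze-driven ROS topic modulator: the stream the user is looking at
# is relayed at full rate, the others are decimated. All names are hypothetical.
import rospy
from std_msgs.msg import String
from sensor_msgs.msg import CompressedImage


class GazeModulator(object):
    def __init__(self, topics):
        self.focused = None                      # topic currently under the user's gaze
        self.counters = {t: 0 for t in topics}
        self.pubs = {t: rospy.Publisher(t + '/prioritised', CompressedImage,
                                        queue_size=1) for t in topics}
        for t in topics:
            rospy.Subscriber(t, CompressedImage, self.relay, callback_args=t,
                             queue_size=1)
        rospy.Subscriber('/gaze/focused_widget', String, self.on_gaze)

    def on_gaze(self, msg):
        self.focused = msg.data                  # widget/topic the user is looking at

    def relay(self, msg, topic):
        # Forward every frame of the focused stream; decimate the others.
        self.counters[topic] += 1
        keep_every = 1 if topic == self.focused else 5
        if self.counters[topic] % keep_every == 0:
            self.pubs[topic].publish(msg)


if __name__ == '__main__':
    rospy.init_node('gaze_modulator')
    GazeModulator(['/camera/front/compressed', '/camera/rear/compressed'])
    rospy.spin()
```

Frame decimation is only one possible strategy; the paper describes alternating between strategies (e.g. adjusting quality rather than rate), which a modulator like this could switch among per topic.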
Robotics and Biomimetics | 2014
João Lita da Silva; Ricardo Mendonça; Francisco Marques; Paulo M. M. Rodrigues; Pedro F. Santana; José Barata
This paper presents a method for vision-based landing of a multirotor unmanned aerial vehicle (UAV) on an autonomous surface vehicle (ASV) equipped with a helipad. The method includes a mechanism for behavioural search of the helipad when it is outside the UAV's field of view, a learning, saliency-based mechanism for visually tracking the helipad, and a cooperative strategy for the final vision-based landing phase. Learning to track the helipad from above occurs during takeoff, and cooperation results from having the ASV track the UAV to assist its landing. A set of experimental results with both simulated and physical robots shows the feasibility of the presented method.
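The three-phase logic (behavioural search, saliency-based tracking, cooperative final descent) can be summarised as a small state machine. The states, threshold, and controller commands below are illustrative assumptions, not the paper's implementation.

```python
# Hedged sketch of the three-phase landing logic; names and values are hypothetical.
from enum import Enum


class Phase(Enum):
    SEARCH = 1    # helipad outside the field of view: fly a search pattern
    TRACK = 2     # helipad visible: keep it centred while descending
    LAND = 3      # close enough: cooperative final descent with the ASV


def landing_step(helipad_detection, altitude, land_altitude=1.5):
    """Decide the current phase and a high-level command for the UAV controller."""
    if helipad_detection is None:
        return Phase.SEARCH, 'expand_search_spiral'
    if altitude > land_altitude:
        # Keep the detected helipad centred in the image while reducing altitude.
        return Phase.TRACK, ('centre_and_descend', helipad_detection)
    # Hand over to the cooperative phase: the ASV also tracks the UAV to assist.
    return Phase.LAND, 'request_asv_assistance'
```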
Sensors | 2016
Pedro Deusdado; Magno Guedes; André Silva; Francisco Marques; Eduardo Pinto; Paulo M. M. Rodrigues; André Lourenço; Ricardo Mendonça; Pedro F. Santana; José Corisco; Susana Marta Lopes Almeida; Luís Portugal; Raquel Caldeira; José Barata; Luis Flores
This paper presents a robotic team suited for bottom-sediment sampling and retrieval in mudflats, targeting environmental monitoring tasks. The robotic team encompasses a four-wheel-steering ground vehicle, equipped with a drilling tool designed to retain wet soil, and a multi-rotor aerial vehicle for dynamic aerial imagery acquisition. On-demand aerial imagery, properly fused into an aerial mosaic, is used by remote human operators for specifying the robotic mission and supervising its execution. This is crucial for the success of an environmental monitoring study, as it often depends on human expertise to ensure the statistical significance and accuracy of the sampling procedures. Although the literature is rich in environmental monitoring sampling procedures for mudflats, there is a gap regarding the inclusion of robotic elements. This paper closes that gap by also proposing a preliminary experimental protocol tailored to exploit the capabilities offered by the robotic system. Field trials on the south bank of the Tagus river estuary show the ability of the robotic system to successfully extract and transport bottom-sediment samples for offline analysis. The results also show the efficiency of the extraction and its benefits when compared with conventional human-based sampling.
Robot | 2016
Pedro Deusdado; Eduardo Pinto; Magno Guedes; Francisco Marques; Paulo M. M. Rodrigues; André Lourenço; Ricardo Mendonça; André Silva; Pedro F. Santana; José Corisco; Marta Mateus de Almeida; Luís Portugal; Raquel Caldeira; José Barata; Luis Flores
This paper presents an aerial-ground field robotic team, designed to collect and transport soil and biota samples in estuarine mudflats. The robotic system has been devised so that its sampling and storage capabilities are suited for radionuclides and heavy metals environmental monitoring. Automating these time-consuming and physically demanding tasks is expected to positively impact both their scope and frequency. The success of an environmental monitoring study heavily depends on the statistical significance and accuracy of the sampling procedures, which most often require frequent human intervention. The bird’s-eye view provided by the aerial vehicle aims at supporting remote mission specification and execution monitoring. This paper also proposes a preliminary experimental protocol tailored to exploit the capabilities offered by the robotic system. Preliminary field trials in real estuarine mudflats show the ability of the robotic system to successfully extract and transport soil samples for offline analysis.