Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where Dionisio Andújar is active.

Publication


Featured research published by Dionisio Andújar.


Sensors | 2011

Accuracy and Feasibility of Optoelectronic Sensors for Weed Mapping in Wide Row Crops

Dionisio Andújar; Angela Ribeiro; César Fernández-Quintanilla; José Dorado

The main objectives of this study were to assess the accuracy of a ground-based weed mapping system that included optoelectronic sensors for weed detection, and to determine the sampling resolution required for accurate weed maps in maize crops. The optoelectronic sensors were located in the inter-row area of maize to distinguish weeds against the soil background. The system was evaluated in three maize fields in the early spring. System verification was performed with highly reliable data from digital images obtained in a regular 12 m × 12 m grid throughout the three fields. The comparison at all these sample points showed a good relationship (83% agreement on average) between the weed presence/absence data obtained from the optoelectronic mapping system and the values derived from image processing software (“ground truth”). Regarding the optimization of sampling resolution, the comparison of the detailed maps (sensors on all crop rows, separated by 0.75 m) with maps obtained with various simulated distances between sensors (from 1.5 m to 6.0 m) indicated that a 4.5 m distance (equivalent to one in six crop rows) would be acceptable for constructing accurate weed maps. This spatial resolution makes the system cheap and robust enough to generate maps of inter-row weeds.
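As a sketch of the verification and resolution steps described above, presence/absence agreement between a sensor-derived map and an image-derived ground truth can be computed cell by cell, and wider sensor spacings can be simulated by subsampling the detailed map. The function names and the nearest-row fill strategy below are illustrative assumptions, not the study's actual code:

```python
import numpy as np

def percent_agreement(sensor_map, truth_map):
    """Percentage of grid cells where the sensor map and the
    image-derived ground truth agree on weed presence/absence."""
    sensor_map = np.asarray(sensor_map, dtype=bool)
    truth_map = np.asarray(truth_map, dtype=bool)
    return float((sensor_map == truth_map).mean() * 100.0)

def simulate_row_spacing(detailed_map, keep_every_n):
    """Simulate a wider sensor spacing by keeping one in every
    n sensed crop rows (axis 0) and repeating each kept row to
    fill the gap (a simple nearest-row fill)."""
    detailed_map = np.asarray(detailed_map)
    kept = detailed_map[::keep_every_n]
    filled = np.repeat(kept, keep_every_n, axis=0)
    return filled[: detailed_map.shape[0]]
```

On a 0.75 m row grid, the simulated 4.5 m spacing would correspond to `keep_every_n=6`.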


Sensors | 2012

An Ultrasonic System for Weed Detection in Cereal Crops

Dionisio Andújar; Martin Weis; Roland Gerhards

Site-specific weed management requires sensing of the actual weed infestation levels in agricultural fields so that management can be adapted accordingly. However, sophisticated sensor systems are not yet in wide practical use, since they are not easily available to farmers and both their handling and the associated management practice require additional effort. A new sensor-based weed detection method is presented in this paper and its applicability to cereal crops is evaluated. An ultrasonic distance sensor was used to determine plant heights for weed detection. It was hypothesised that weed-infested zones have a higher amount of biomass than non-infested areas and that this can be determined by plant height measurements. Ultrasonic distance measurements were taken in a winter wheat field infested by grass weeds and broad-leaved weeds. A total of 80 and 40 circular samples of different weed densities and compositions were assessed at two different dates. The sensor was pointed directly at the ground for height determination. Subsequently, weeds were counted and then removed from the sample locations, with grass weeds and broad-leaved weeds removed separately. Differences between weed-infested and weed-free measurements were determined. Dry matter of weeds and crop was assessed and evaluated together with the sensor measurements. RGB images were taken before and after weed removal to determine the coverage percentages of weeds and crop per sampling point. Image processing steps included EGI (excess green index) computation and thresholding to separate plants from background. The relationship between the ultrasonic readings and the corresponding coverage of the crop and weeds was assessed using multiple regression analysis. Results revealed a height difference between infested and non-infested sample locations. The density and biomass of weeds present in a sample influenced the ultrasonic readings.
The possibility of discriminating weed groups was assessed by discriminant analysis. The ultrasonic readings permitted the separation of weed-infested zones from non-infested areas with a success rate of up to 92.8%. This system could potentially reduce the cost of weed detection and offers an opportunity for its use in non-selective methods of weed control.
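The EGI computation and thresholding step mentioned above can be sketched as follows; the threshold value here is illustrative, not the one used in the study:

```python
import numpy as np

def excess_green(rgb):
    """Excess green index, EGI = 2G - R - B, per pixel of an
    H x W x 3 RGB image (float arithmetic to avoid uint8 overflow)."""
    rgb = np.asarray(rgb, dtype=float)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return 2.0 * g - r - b

def plant_mask(rgb, threshold=20.0):
    """Binary plant/background mask by thresholding the EGI.
    Green vegetation scores high; soil pixels stay near zero."""
    return excess_green(rgb) > threshold
```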


Pest Management Science | 2014

Potential use of ground‐based sensor technologies for weed detection

Gerassimos Peteinatos; Martin Weis; Dionisio Andújar; Victor Rueda Ayala; Roland Gerhards

Site-specific weed management is the part of precision agriculture (PA) that aims to control weed infestations effectively with the least economic and environmental burden. This can be achieved with the aid of ground-based or near-range sensors in combination with decision rules and precise application technologies. Near-range sensor technologies, developed for mounting on a vehicle, have been emerging for PA applications during the last three decades. These technologies focus on identifying plants and measuring their physiological status with the aid of their spectral and morphological characteristics. Cameras, spectrometers, fluorometers and distance sensors are the most prominent sensors for PA applications. The objective of this article is to describe ground-based sensors that have the potential to be used for weed detection and measurement of weed infestation level. An overview of current sensor systems is presented, describing their concepts, the results that have been achieved, commercial systems already in use, and problems that persist. A perspective for the development of these sensors is given.


Sensors | 2013

Discriminating Crop, Weeds and Soil Surface with a Terrestrial LIDAR Sensor

Dionisio Andújar; Victor Rueda-Ayala; Hugo Moreno; Joan R. Rosell-Polo; Alexandre Escolà; Constantino Valero; Roland Gerhards; César Fernández-Quintanilla; José Dorado; Hans-Werner Griepentrog

This study evaluated the accuracy and performance of a light detection and ranging (LIDAR) sensor, using distance and reflection measurements to detect and discriminate maize plants and weeds from the soil surface. The study continues previous work carried out in a maize field in Spain with a LIDAR sensor using exclusively one index, the height profile; the current system uses a combination of height and reflection measurements. The experiment was carried out in a maize field at growth stage 12–14, at 16 different locations selected to represent the widest possible range of densities of four weed species: Echinochloa crus-galli (L.) P.Beauv., Lamium purpureum L., Galium aparine L. and Veronica persica Poir. A terrestrial LIDAR sensor was mounted on a tripod pointing to the inter-row area, with its horizontal axis and field of view pointing vertically downwards to the ground, scanning a vertical plane with the potential presence of vegetation. Immediately after the LIDAR data acquisition (distance and reflection measurements), actual plant heights were estimated; for that purpose, digital images were taken of each sampled area. Data showed a high correlation between LIDAR-measured heights and actual plant heights (R2 = 0.75). Binary logistic regression between weed presence/absence and the sensor readings (LIDAR height and reflection values) was used to validate the accuracy of the sensor. This permitted the discrimination of vegetation from the ground with an accuracy of up to 95%. In addition, a canonical discriminant analysis (CDA) was able to discriminate mostly between soil and vegetation and, to a far lesser extent, between crop and weeds. The studied methodology emerges as a good system for weed detection which, in combination with other principles, such as vision-based technologies, could improve the efficiency and accuracy of herbicide spraying.
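The binary logistic regression used above to separate vegetation from ground can be illustrated on synthetic height/reflection readings; the data, cluster parameters and learning rate below are made up for the sketch and are not from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic LIDAR samples: vegetation returns tend to show larger
# height and reflection values than bare soil (illustrative numbers).
n = 200
soil = np.column_stack([rng.normal(0.02, 0.01, n), rng.normal(0.20, 0.05, n)])
veg = np.column_stack([rng.normal(0.30, 0.10, n), rng.normal(0.60, 0.05, n)])
X = np.vstack([soil, veg])
y = np.concatenate([np.zeros(n), np.ones(n)])

# Gradient-descent fit of P(vegetation) = sigmoid(w . x + b).
Xb = np.column_stack([X, np.ones(len(X))])  # append bias column
w = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    w -= 0.5 * Xb.T @ (p - y) / len(y)  # mean log-loss gradient step

pred = (1.0 / (1.0 + np.exp(-Xb @ w))) > 0.5
accuracy = (pred == y).mean()
```

With such well-separated clusters the fitted model discriminates nearly all points; real field readings overlap far more.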


Sensors | 2016

An Approach to the Use of Depth Cameras for Weed Volume Estimation

Dionisio Andújar; José Dorado; César Fernández-Quintanilla; Angela Ribeiro

The use of depth cameras in precision agriculture is increasing day by day. This type of sensor has been used for plant structure characterization of several crops. However, the discrimination of small plants, such as weeds, is still a challenge within agricultural fields. Improvements in the new Microsoft Kinect v2 sensor make it possible to capture the details of plants. A dual methodology using height selection and RGB (Red, Green, Blue) segmentation can separate crops, weeds, and soil. This paper explores the possibilities of this sensor by using Kinect Fusion algorithms to reconstruct 3D point clouds of weed-infested maize crops under real field conditions. The processed models showed good consistency between the 3D depth images and the actual structural parameters measured in the field. Maize plants were identified in the samples by height selection of the connected faces and showed a correlation of 0.77 with maize biomass. The lower height of the weeds made RGB recognition necessary to separate them from the soil microrelief of the samples, achieving a good correlation of 0.83 with weed biomass. In addition, weed density showed a good correlation with the volumetric measurements. A canonical discriminant analysis showed promising results for classification into monocots and dicots. These results suggest that volume estimation using the Kinect methodology can be a highly accurate method for crop status determination and weed detection. It offers several possibilities for the automation of agricultural processes through the construction of a new system integrating these sensors and the development of algorithms to properly process the information they provide.
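The dual height/RGB methodology described above can be sketched for a coloured point cloud; the height cut and excess-green threshold are illustrative assumptions, not the parameters used in the paper:

```python
import numpy as np

def classify_points(points, colors, crop_height=0.30, egi_threshold=20.0):
    """Label each 3D point: points above the height cut are 'crop';
    lower points are split into 'weed' and 'soil' by an excess-green
    test (2G - R - B) on the per-point RGB colour."""
    z = np.asarray(points, dtype=float)[:, 2]
    c = np.asarray(colors, dtype=float)
    egi = 2.0 * c[:, 1] - c[:, 0] - c[:, 2]
    return np.where(z > crop_height, "crop",
                    np.where(egi > egi_threshold, "weed", "soil"))
```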


Weed Science | 2011

Spatial Distribution Patterns of Johnsongrass (Sorghum halepense) in Corn Fields in Spain

Dionisio Andújar; David Ruiz; Angela Ribeiro; César Fernández-Quintanilla; José Dorado

This study describes the distribution patterns of Johnsongrass populations present in 38 commercial corn fields located in three major corn-growing regions of Spain. A total of 232 ha were visually assessed from the cabin of a combine during harvesting, using a three-category ranking (high density, low density, no presence) and recording the georeferenced data on a tablet personal computer. On average, 10.3% and 3.9% of the surveyed area were infested with high and low densities of Johnsongrass, respectively. Most of the infested area was concentrated in a few large patches with irregular shapes. Small patches (less than 1,000 m²) represented only 27% of the infested area. Management factors could explain much of the spatial distribution of this weed in the studied fields. Tillage direction was the main factor explaining patch shape: the length:width ratio of the patches was greater than two in the tillage direction. In sprinkler-irrigated fields, higher levels of infestation were generally observed close to the sprinkler lines. Areas close to the edges of a field had a higher risk of infestation than areas in the middle: a negative relationship between distance from the edge and weed abundance was established. Because a few patches, located in predictable parts of the field such as field edges, represent most of the seriously infested area, site-specific treatment of these areas could reduce herbicide inputs until more reliable, spatially precise and practical detection, mapping, and spraying systems are developed. Nomenclature: Johnsongrass, Sorghum halepense (L.) Pers. SORHA; corn, Zea mays L.


Sensors | 2015

Matching the Best Viewing Angle in Depth Cameras for Biomass Estimation Based on Poplar Seedling Geometry

Dionisio Andújar; César Fernández-Quintanilla; José Dorado

In energy crops for biomass production, a proper plant structure is important to optimize wood yields. A precise crop characterization in early stages may contribute to the choice of proper cropping techniques. This study assesses the potential of the Microsoft Kinect for Windows v.1 sensor to determine the best viewing angle for estimating plant biomass based on poplar seedling geometry. Kinect Fusion algorithms were used to generate a 3D point cloud from the depth video stream. The sensor was mounted in different positions facing the tree in order to obtain depth (RGB-D) images from different angles. Individuals of two different ages, i.e., one month and one year old, were scanned. Four different viewing angles were compared: top view (0°), 45° downwards view, front view (90°) and upwards view from the ground (−45°). The ground truth used to validate the sensor readings consisted of destructive sampling in which the height, leaf area and biomass (dry weight basis) were measured for each individual plant. The depth image models agreed well with the 45°, 90° and −45° measurements in one-year-old poplar trees. Good correlations (0.88 to 0.92) between dry biomass and the area measured with the Kinect were found. In addition, plant height was estimated accurately, with an error of only a few centimetres. The comparison between viewing angles revealed that top views gave poorer results because the top leaves occluded the rest of the tree, whereas the other views led to good results. Conversely, small poplars showed better correlations with actual parameters from the top view (0°). Therefore, although the Microsoft Kinect for Windows v.1 sensor provides good opportunities for biomass estimation, the viewing angle must be chosen taking into account the developmental stage of the crop and the desired parameters.
The results of this study indicate that the Kinect is a promising tool for rapid canopy characterization, i.e., for estimating crop biomass production, with several important advantages: low cost, low power needs and a high frame rate when dynamic measurements are required.
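The reported area–biomass correlations can be reproduced in miniature with a Pearson correlation over paired measurements; the numbers below are invented for illustration, not data from the study:

```python
import numpy as np

# Hypothetical paired measurements for a few poplar seedlings:
# Kinect-derived canopy area (m^2) vs. dry biomass (g).
area = np.array([0.10, 0.15, 0.22, 0.30, 0.41])
biomass = np.array([12.0, 17.0, 25.0, 33.0, 46.0])

r = np.corrcoef(area, biomass)[0, 1]  # Pearson correlation coefficient
```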


Computers and Electronics in Agriculture | 2016

Route planning for agricultural tasks

Jesús Conesa-Muñoz; José M. Bengochea-Guevara; Dionisio Andújar; Angela Ribeiro

Highlights: The proposed route planner addresses a broad range of agricultural problems. The planner considers vehicles with different features and the variability of the field. The planner optimizes for different criteria, even simultaneously. The planner is validated by solving several illustrative problems. The planner outperforms other approaches by up to 17% and 21% in headland distance.

Route planning in agricultural fields is a major challenge closely related to the amount of inputs consumed and the associated soil compaction. Current approaches primarily focus on reducing the travelled distances (i.e., the trajectories that vehicles have to cover to carry out the task) and generally do not consider other optimization criteria such as input costs (e.g., fuel, herbicides, labor). Furthermore, although a few approaches consider more than one vehicle, none of them takes into consideration vehicles with different characteristics, such as different speeds or turning radii, and some sources of field variability, such as the weed distribution, have not been studied yet. All these factors affect the cost of the routes to be followed to accomplish agricultural tasks such as site-specific treatments. In this context, this study proposes a very general approach to route optimization that considers: (1) different criteria, such as the travelled distance, the time required to perform the task and the input costs, even simultaneously; (2) vehicles with different features (e.g., working speeds, both intra- and inter-crop, turning radii, fuel consumption, tank capacities and spraying costs); (3) the variability of the field; and (4) the possibility of tank refilling. The proposed approach has special relevance for route planning in site-specific herbicide applications.
This case requires a tank on board the vehicle to store an agrochemical product, and its capacity must be considered because it affects the routes to be followed, specifically in those cases in which the tank capacity may not be sufficient to treat the entire field even when working in cooperation with other vehicles. In such cases, refilling (i.e., a round trip to the refilling depot) may be essential despite the extra cost involved in this operation. The proposed approach was validated by solving several illustrative problems. The results showed that the proposed route planner covers a broad range of agricultural situations and that the optimal routes may vary considerably depending on the features of the fleet vehicles, the variability of the field and the optimization criteria selected. Finally, a comparative study against other well-known agricultural planners was carried out, yielding routes that improved on those produced by the reference approaches.
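A toy version of the headland-distance criterion can be brute-forced for a handful of rows; this is a hypothetical stand-in for the paper's optimiser, which also weighs time and input costs and handles fleets and refilling:

```python
import itertools

def headland_distance(order, row_x):
    """Headland travel for visiting rows in the given order: the sum
    of gaps between consecutive row positions (in metres)."""
    return sum(abs(row_x[a] - row_x[b]) for a, b in zip(order, order[1:]))

def best_order(row_x):
    """Exhaustively search the row-visit order minimising headland
    travel. Only feasible for a few rows; field-sized instances
    require heuristic search instead."""
    return min(itertools.permutations(range(len(row_x))),
               key=lambda order: headland_distance(order, row_x))
```

For rows at x = [0, 3, 1, 2] metres, the best order walks the rows in spatial sequence, giving 3 m of headland travel.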


Computers and Electronics in Agriculture | 2016

Using depth cameras to extract structural parameters to assess the growth state and yield of cauliflower crops

Dionisio Andújar; Angela Ribeiro; César Fernández-Quintanilla; José Dorado

Display Omitted Depth cameras accurately estimate the yield of cauliflower plants before harvesting.Kinect is a useful tool for determining degree of maturity of cauliflower fruits.Depth cameras are suitable to create precise 3D models for cauliflower plants.A Kinect-based automated system for plant selection at harvest can be designed. The use of robotic systems for horticultural crops is widely known. However, the use of these systems in cruciferous vegetables remains a challenge. The case of cauliflower crops is of special relevance because it is a hand-harvested crop for which the cutting time is visually chosen. This methodology leads to a yield reduction, as some inflorescences are cut before ripening because the leaves hide their real state of maturity. This work proposes the use of depth cameras instead of visual estimation. Using Kinect Fusion algorithms, depth cameras create a 3D point cloud from the depth video stream and consequently generate solid 3D models, which have been compared to the actual structural parameters of cauliflower plants. The results show good consistency among depth image models and ground truth from the actual structural parameters. In addition, the best time for individual fruit cutting could be detected using these models, which enabled the optimization of harvesting and increased yields. The accuracy of the models deviated from the ground truth by less than 2cm in diameter/height, whereas the fruit volume estimation showed an error below 0.6% overestimation. Analysis of the structural parameters revealed a significant correlation between estimated and actual values of the volume of plants and fruit weight. These results show the potential of depth cameras to be used as a precise tool in estimating the degree of ripeness during the harvesting of cauliflower and thereby optimizing the crop profitability.


Sensors | 2016

Merge fuzzy visual servoing and GPS-based planning to obtain a proper navigation behavior for a small crop-inspection robot

José M. Bengochea-Guevara; Jesús Conesa-Muñoz; Dionisio Andújar; Angela Ribeiro

The concept of precision agriculture, which proposes farming management adapted to crop variability, has emerged in recent years. To effectively implement precision agriculture, data must be gathered from the field in an automated manner at minimal cost. In this study, a small autonomous field inspection vehicle was developed to minimise the impact of the scouting on the crop and soil compaction. The proposed approach integrates a camera with a GPS receiver to obtain a set of basic behaviours required of an autonomous mobile robot to inspect a crop field with full coverage. A path planner considered the field contour and the crop type to determine the best inspection route. An image-processing method capable of extracting the central crop row under uncontrolled lighting conditions in real time from images acquired with a reflex camera positioned on the front of the robot was developed. Two fuzzy controllers were also designed and developed to achieve vision-guided navigation. A method for detecting the end of a crop row using camera-acquired images was developed. In addition, manoeuvres necessary for the robot to change rows were established. These manoeuvres enabled the robot to autonomously cover the entire crop by following a previously established plan and without stepping on the crop row, which is an essential behaviour for covering crops such as maize without damaging them.
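A minimal flavour of vision-guided fuzzy steering can be given with triangular membership functions and weighted-average defuzzification; the membership shapes and output values here are illustrative assumptions, not the controllers developed in the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a to b and
    falling from b to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_steer(offset):
    """Map the normalised lateral offset of the detected crop row
    in the image (-1 = far left, +1 = far right) to a steering
    command via three rules and weighted-average defuzzification."""
    mu = {
        "left": tri(offset, -2.0, -1.0, 0.0),    # row appears left
        "center": tri(offset, -0.5, 0.0, 0.5),   # row roughly centred
        "right": tri(offset, 0.0, 1.0, 2.0),     # row appears right
    }
    out = {"left": -1.0, "center": 0.0, "right": 1.0}  # steer toward row
    total = sum(mu.values())
    return sum(mu[k] * out[k] for k in mu) / total if total else 0.0
```

Overlapping memberships make the command vary smoothly with the offset instead of switching abruptly between fixed turns.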

Collaboration


Dive into Dionisio Andújar's collaborations.

Top Co-Authors

José Dorado (Spanish National Research Council)
Angela Ribeiro (Spanish National Research Council)
Carolina San Martín (Spanish National Research Council)
D. Campos (Spanish National Research Council)
José M. Bengochea-Guevara (Spanish National Research Council)