
Publication


Featured research published by David Reiser.


Sensors | 2016

3-D Imaging Systems for Agricultural Applications—A Review

Manuel Vázquez-Arellano; Hans W. Griepentrog; David Reiser; Dimitris S. Paraforos

Increasing resource efficiency through the automation of agriculture requires more information about the production process, as well as about process and machinery status. Sensors are necessary for monitoring the status and condition of production by recognizing surrounding structures such as objects, field structures, natural or artificial markers, and obstacles. Three-dimensional (3-D) sensors are now economically affordable and technologically advanced to a great extent, so a breakthrough is already possible if enough research projects are commercialized. The aim of this review is to investigate the state of the art of 3-D vision systems in agriculture, and the role and value that 3-D data in particular can have in providing information about environmental structures, based on the recent progress in optical 3-D sensors. The review first gives an overview of the different optical 3-D vision techniques based on their underlying principles; afterwards, their applications in agriculture are reviewed. The main focus lies on vehicle navigation, and crop and animal husbandry. The depth dimension brought by 3-D sensors provides key information that greatly facilitates the implementation of automation and robotics in agriculture.


Remote Sensing | 2015

3D Maize Plant Reconstruction Based on Georeferenced Overlapping LiDAR Point Clouds

Miguel Garrido; Dimitris S. Paraforos; David Reiser; Manuel Vázquez Arellano; Hans W. Griepentrog; Constantino Valero

3D crop reconstruction with a high temporal resolution and the use of non-destructive measuring technologies can support the automation of plant phenotyping processes. The availability of such 3D data can give valuable information about plant development and the interaction of the plant genotype with the environment. This article presents a new methodology for georeferenced 3D reconstruction of maize plant structure. For this purpose, a total station, an IMU, and several 2D LiDARs with different orientations were mounted on an autonomous vehicle. The presented multistep methodology, based on the Iterative Closest Point (ICP) algorithm for point cloud fusion, made it possible to overlap the georeferenced point clouds. The overlapping showed that the aerial points (corresponding mainly to plant parts) were reduced to 1.5%–9% of the total registered data; the remainder were redundant or ground points. Including different LiDAR points of view of the scene yields a more realistic representation of the surroundings by incorporating new useful information, but also noise. Georeferenced 3D maize plant reconstruction at different growth stages, combined with the accuracy of the total station, could be highly useful when performing precision agriculture at the level of individual crop plants.
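The ICP-based fusion step can be illustrated with a minimal point-to-point ICP in NumPy. This is a sketch on a hypothetical toy cloud, not the paper's georeferenced LiDAR pipeline: each iteration matches nearest neighbours and recovers the rigid transform with the SVD-based Kabsch method.

```python
import numpy as np

def icp_step(src, dst):
    """One point-to-point ICP iteration: pair each source point with its
    nearest neighbour in dst, then solve the rigid transform via SVD."""
    d = np.linalg.norm(src[:, None, :] - dst[None, :, :], axis=2)
    matched = dst[d.argmin(axis=1)]
    mu_s, mu_d = src.mean(axis=0), matched.mean(axis=0)
    H = (src - mu_s).T @ (matched - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return src @ R.T + t

# hypothetical target cloud: a 3 x 3 x 3 grid of points
g = np.arange(3, dtype=float)
cloud = np.array(np.meshgrid(g, g, g)).reshape(3, -1).T
theta = 0.05                       # small rotation about the z axis
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0, 0.0, 1.0]])
moved = cloud @ Rz.T + np.array([0.03, -0.02, 0.01])
aligned = moved
for _ in range(5):                 # a few iterations suffice here
    aligned = icp_step(aligned, cloud)
residual = np.abs(aligned - cloud).max()
```

Because the misalignment is small relative to the grid spacing, the nearest-neighbour correspondences are correct and the residual collapses to numerical precision; real LiDAR clouds need outlier handling and a good initial alignment, which is what the georeferencing by total station and IMU provides in the paper.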


Computers and Electronics in Agriculture | 2018

3-D reconstruction of maize plants using a time-of-flight camera

Manuel Vázquez-Arellano; David Reiser; Dimitris S. Paraforos; Miguel Garrido-Izard; Marlowe Edgar Cortes Burce; Hans W. Griepentrog

Point cloud rigid registration and stitching for plants with complex architecture is a challenging task; however, it is an important process for exploiting the full potential of 3-D cameras in plant phenotyping and in characterizing agricultural production environments for automation. This research proposes a methodology for three-dimensional (3-D) reconstruction of maize crop rows, using high-resolution 3-D images that were mapped into the colour images using state-of-the-art software. The point cloud registration methodology was based on the Iterative Closest Point (ICP) algorithm. The incoming point cloud was first filtered using the Random Sample Consensus (RANSAC) algorithm, reducing the number of soil points until a threshold value was reached; this threshold was calculated from the approximate number of plant points in a single 3-D image. After registration and stitching of the crop rows, plant/soil segmentation was performed, again relying on the RANSAC algorithm. A quantitative comparison showed that the number of points obtained with a time-of-flight (TOF) camera was roughly 23 times larger than that from two light detection and ranging (LiDAR) sensors in a previous study. Finally, the reconstruction was validated by comparing the seedling positions, as ground truth, with the point cloud clusters representing the plant stem positions, obtained using k-means clustering. The resulting maize positions closely agreed with the ground truth, with an average mean and standard deviation of 3.4 cm and ±1.3 cm, respectively.
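The RANSAC-based soil filtering can be sketched as a plane fit that labels points near the dominant plane as soil. The synthetic cloud, iteration count, and distance tolerance below are hypothetical, not the paper's parameters.

```python
import numpy as np

def ransac_ground(pts, n_iter=200, tol=0.02, seed=1):
    """RANSAC plane fit: sample 3 points, count points within `tol` of
    their plane, keep the plane with the most support (the soil)."""
    rng = np.random.default_rng(seed)
    best_mask = np.zeros(len(pts), dtype=bool)
    for _ in range(n_iter):
        p = pts[rng.choice(len(pts), 3, replace=False)]
        normal = np.cross(p[1] - p[0], p[2] - p[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:            # degenerate (collinear) sample
            continue
        mask = np.abs((pts - p[0]) @ (normal / norm)) < tol
        if mask.sum() > best_mask.sum():
            best_mask = mask
    return best_mask

rng = np.random.default_rng(1)
soil = np.c_[rng.random((300, 2)), rng.normal(0, 0.005, 300)]    # flat ground
plants = np.c_[rng.random((60, 2)), 0.1 + 0.3 * rng.random(60)]  # above ground
cloud = np.vstack([soil, plants])
soil_mask = ransac_ground(cloud)
```

Dropping (or keeping only a fraction of) the points in `soil_mask` mirrors the paper's step of reducing soil points until the plant-point threshold is reached.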


Journal of Imaging | 2017

3D Imaging with a Sonar Sensor and an Automated 3-Axes Frame for Selective Spraying in Controlled Conditions

David Reiser; Javier M. Martín-López; Emir Memic; Manuel Vázquez-Arellano; Steffen Brandner; Hans W. Griepentrog

Autonomous selective spraying could be a way for agriculture to reduce production costs, save resources, protect the environment and help fulfill specific pesticide regulations. The objective of this paper was to investigate the use of a low-cost sonar sensor for autonomous selective spraying of single plants. For this, a belt-driven autonomous robot was used with an attached 3-axes frame with three degrees of freedom. A sonar sensor and a spray valve were attached at the tool center point (TCP) of the 3-axes frame to create a point cloud representation of the surface, detect plants in the area and perform selective spraying. The autonomous robot was tested on replicates of artificial crop plants. The location of each plant was identified from the acquired point cloud with the help of Euclidean clustering. The obtained plant positions were spatially transformed from the sonar sensor's coordinates to the valve location to determine the exact irrigation points. The results showed that the robot was able to automatically detect the position of each plant with an accuracy of 2.7 cm and could spray on these selected points. This selective spraying reduced the liquid used by 72% compared to a conventional spraying method under the same conditions.
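The Euclidean clustering step can be sketched as a single-linkage region-growing pass, the same idea behind PCL-style Euclidean cluster extraction; the two-plant 2-D cloud and tolerance below are hypothetical.

```python
import numpy as np

def euclidean_clusters(pts, tol=0.05):
    """Group points transitively: any point within `tol` of a cluster
    member joins that cluster (single-linkage region growing)."""
    unvisited = set(range(len(pts)))
    clusters = []
    while unvisited:
        seed = unvisited.pop()
        frontier, members = [seed], [seed]
        while frontier:
            i = frontier.pop()
            near = [j for j in list(unvisited)
                    if np.linalg.norm(pts[i] - pts[j]) < tol]
            for j in near:
                unvisited.remove(j)
            frontier += near
            members += near
        clusters.append(pts[members])
    return clusters

# hypothetical sonar footprints of two well-separated plants (x, y in m)
rng = np.random.default_rng(2)
plant_a = np.array([0.2, 0.3]) + rng.normal(0, 0.01, (40, 2))
plant_b = np.array([0.7, 0.6]) + rng.normal(0, 0.01, (40, 2))
clusters = euclidean_clusters(np.vstack([plant_a, plant_b]))
centres = sorted((c.mean(axis=0) for c in clusters), key=lambda m: m[0])
```

Each cluster centroid is a candidate spray point; in the paper, these points are then transformed from the sensor frame to the valve location before spraying.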


Robot | 2016

Crop Row Detection in Maize for Developing Navigation Algorithms Under Changing Plant Growth Stages

David Reiser; Garrido Miguel; Manuel Vázquez Arellano; Hans W. Griepentrog; Dimitris S. Paraforos

To develop robust algorithms for agricultural navigation, different growth stages of the plants have to be considered. For fast validation and repeatable testing of algorithms, a dataset was recorded by a 4-wheeled robot equipped with a frame of different sensors, which was guided through maize rows. The robot position was simultaneously tracked by a total station to obtain a precise reference for the sensor data. Plant positions and parameters were measured for comparison with the sensor values. Horizontal laser scanner data and corresponding total station data were recorded 7 times over a period of 6 weeks and used to evaluate the performance of a common RANSAC row-detection algorithm. Results showed the best heading detection at a mean growth height of 0.268 m.
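A RANSAC row-detection pass of the kind evaluated here can be sketched as a 2-D line fit: sample point pairs, count inliers by perpendicular distance, and keep the best-supported direction. The row data, weed points, and tolerances below are hypothetical.

```python
import numpy as np

def ransac_row(pts, n_iter=100, tol=0.05, seed=3):
    """RANSAC 2-D line fit; returns the unit direction of the best line."""
    rng = np.random.default_rng(seed)
    best_count, best_dir = -1, None
    for _ in range(n_iter):
        a, b = pts[rng.choice(len(pts), 2, replace=False)]
        d = b - a
        n = np.linalg.norm(d)
        if n < 1e-9:
            continue
        d /= n
        normal = np.array([-d[1], d[0]])
        count = (np.abs((pts - a) @ normal) < tol).sum()
        if count > best_count:
            best_count, best_dir = count, d
    return best_dir

# hypothetical maize row along the y axis, plus a few off-row weed points
rng = np.random.default_rng(3)
row = np.c_[rng.normal(0, 0.02, 30), np.linspace(0, 3, 30)]
weeds = rng.random((5, 2)) * [1.0, 3.0] + [0.3, 0.0]
d = ransac_row(np.vstack([row, weeds]))
heading = np.degrees(np.arctan2(d[1], d[0])) % 180   # row heading in degrees
```

The recovered heading lies close to 90° (the row runs along y); as the paper observes, taller plants add leaf points far from the stem line, which degrades exactly this kind of inlier count.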


Computers and Electronics in Agriculture | 2018

Determination of stem position and height of reconstructed maize plants using a time-of-flight camera

Manuel Vázquez-Arellano; Dimitris S. Paraforos; David Reiser; Miguel Garrido-Izard; Hans W. Griepentrog

Three-dimensional (3-D) reconstruction of maize plant morphology by proximal sensing in agriculture brings high-definition data that can be used for a number of applications related to precision agriculture and agricultural robotics. However, 3-D reconstruction without methodologies for extracting useful information is a fruitless strategy. In this research, a methodology for stem position estimation is presented, relying on the merging of four point clouds, generated from different 3-D perspective views, using the Iterative Closest Point algorithm. The proposed methodology is based on bivariate point density histograms for detecting the regional maxima, and a radius filter based on the closest Euclidean distance. Single plant segmentation was then performed by projecting a cylindrical spatial boundary around the estimated stem positions on a merged plant and soil point cloud. After performing a local Random Sample Consensus, the segmented plant point cloud was clustered using the Density-Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm. Additionally, a height profile was generated by rasterizing the plant and soil point clouds separately, with different cell widths. The rasterized soil point cloud was meshed, and the distance from the rasterized plant points to the soil mesh was calculated. The resulting plant stem positions were estimated with an average mean error and standard deviation of 24 mm and 14 mm, respectively. Equivalently, the average mean error and standard deviation of the individual plant height estimation were 30 mm and 35 mm, respectively. Finally, the overall plant height profile mean error average was 8.7 mm. Thus, it is possible to determine the stem position and plant height of reconstructed maize plants using a low-cost time-of-flight camera.
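The bivariate point-density histogram idea for stem detection can be sketched as follows: rasterize the x–y plane into a 2-D count histogram, then keep cells that are regional maxima. The grid size, count threshold, and two-stem cloud are hypothetical, not the paper's values.

```python
import numpy as np

def stem_candidates(xy, edges, min_count=10):
    """Bivariate density histogram; return centres of cells that are
    regional maxima with at least `min_count` points (candidate stems)."""
    H, xe, ye = np.histogram2d(xy[:, 0], xy[:, 1], bins=[edges, edges])
    peaks = []
    for i in range(H.shape[0]):
        for j in range(H.shape[1]):
            if H[i, j] < min_count:
                continue
            i0, i1 = max(i - 1, 0), min(i + 2, H.shape[0])
            j0, j1 = max(j - 1, 0), min(j + 2, H.shape[1])
            if H[i, j] == H[i0:i1, j0:j1].max():   # regional maximum
                peaks.append(((xe[i] + xe[i + 1]) / 2,
                              (ye[j] + ye[j + 1]) / 2))
    return peaks

# hypothetical cloud: dense points at two stems plus sparse leaf noise
rng = np.random.default_rng(4)
stems = [np.array([0.22, 0.22]), np.array([0.73, 0.62])]
xy = np.vstack([s + rng.normal(0, 0.01, (80, 2)) for s in stems]
               + [rng.random((30, 2))])
peaks = sorted(stem_candidates(xy, np.arange(0, 1.01, 0.05)))
```

Stems concentrate many points in a small x–y footprint, so their cells dominate the neighbourhood; the paper additionally applies a radius filter to merge near-duplicate maxima.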


Advances in Animal Biosciences | 2017

Clustering of Laser Scanner Perception Points of Maize Plants

David Reiser; Manuel Vázquez-Arellano; M. Garrido Izard; Dimitris S. Paraforos; Galibjon M. Sharipov; Hans W. Griepentrog

The goal of this work was to cluster maize plant perception points at six different growth stages in noisy 3D point clouds with known positions. The 3D point clouds were assembled with a 2D laser scanner mounted at the front of a mobile robot, fusing the data with the precise robot position obtained from a total station and an Inertial Measurement Unit. For clustering the single plants in the resulting point cloud, a graph-cut based algorithm was used. The algorithm results were compared with the corresponding measured values of plant height and stem position. An accuracy of 1.55 cm for the estimated height and 2.05 cm for the stem position was achieved.
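Once a plant cluster has been isolated (the paper uses a graph-cut algorithm for that step, not sketched here), extracting the two compared quantities is straightforward. The sketch below estimates height and stem position from one hypothetical plant cluster; the stem is taken as the x–y centroid of the lowest slice of points, an assumption of this sketch.

```python
import numpy as np

def height_and_stem(cluster, ground_z=0.0, slice_h=0.05):
    """Plant height = top point above ground; stem position = x-y
    centroid of the lowest `slice_h` metres of points in the cluster."""
    height = cluster[:, 2].max() - ground_z
    low = cluster[cluster[:, 2] < cluster[:, 2].min() + slice_h]
    return height, low[:, :2].mean(axis=0)

# hypothetical plant cluster: stem at (0.5, 0.4), 0.3 m tall; the lateral
# spread of scanner points grows with height (leaves fan outwards)
rng = np.random.default_rng(6)
z = rng.random(200) * 0.3
spread = 0.01 + 0.1 * z
xy = np.array([0.5, 0.4]) + rng.normal(0, 1, (200, 2)) * spread[:, None]
height, stem = height_and_stem(np.c_[xy, z])
```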


Robotics | 2018

Leaf Area Estimation of Reconstructed Maize Plants Using a Time-of-Flight Camera Based on Different Scan Directions

Manuel Vázquez-Arellano; David Reiser; Dimitrios S. Paraforos; Miguel Garrido-Izard; Hans W. Griepentrog

Leaf area is an important plant parameter for assessing plant status and crop yield. In this paper, a low-cost time-of-flight camera, the Kinect v2, was mounted on a robotic platform to acquire 3-D data of maize plants in a greenhouse. The robotic platform drove through the maize rows and acquired 3-D images that were later registered and stitched. Three different maize row reconstruction approaches were compared: merging point clouds generated from both sides of the row in both directions, merging point clouds scanned from just one side, and merging point clouds scanned from opposite directions of the row. The resulting point cloud was subsampled and rasterized, and the normals were computed and re-oriented with a Fast Marching algorithm. Poisson surface reconstruction was applied to the point cloud, and new vertices and faces generated by the algorithm were removed. The results showed that the approaches of aligning and merging four point clouds per row and of merging two point clouds scanned from the same side produced very similar average mean absolute percentage errors of 8.8% and 7.8%, respectively. The largest error, 32.3%, resulted from merging two point clouds scanned from both sides in opposite directions.
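The subsampling step can be sketched as a voxel-grid filter that keeps one centroid per occupied voxel; the voxel size and random cloud below are hypothetical, and the paper's pipeline covers further steps (normals, Poisson meshing) not shown here.

```python
import numpy as np

def voxel_downsample(pts, voxel=0.02):
    """Replace all points falling in the same voxel by their centroid."""
    keys = np.floor(pts / voxel).astype(int)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    inv = inv.reshape(-1)
    counts = np.bincount(inv).astype(float)
    out = np.empty((len(counts), pts.shape[1]))
    for dim in range(pts.shape[1]):
        out[:, dim] = np.bincount(inv, weights=pts[:, dim]) / counts
    return out

rng = np.random.default_rng(7)
dense = rng.random((500, 3))           # hypothetical dense plant cloud
sparse = voxel_downsample(dense, voxel=0.25)
```

Voxel downsampling evens out the point density before surface reconstruction, which matters when several overlapping scans of the same row are merged.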


Computers in Industry | 2018

Iterative individual plant clustering in maize with assembled 2D LiDAR data

David Reiser; Manuel Vázquez-Arellano; Dimitris S. Paraforos; Miguel Garrido-Izard; Hans W. Griepentrog

A two-dimensional (2D) laser scanner was mounted at the front of a small 4-wheel autonomous robot with differential steering, at an angle of 30° pointing downwards. The machine was able to drive between maize rows and collect concurrent time-stamped data. A robotic total station tracked the position of a prism mounted on the vehicle. The total station and laser scanner data were fused to generate a three-dimensional (3D) point cloud. This 3D representation was used to detect individual plant positions, which are of particular interest for applications such as phenotyping, individual plant treatment and precision weeding. Two different methodologies were applied to the 3D point cloud to estimate the positions of the individual plants. The first applied Euclidean clustering to the entire point cloud. The second used the position of an initial plant and the fixed plant spacing to search iteratively for the best clusters. The two algorithms were applied at three different plant growth stages. For the first method, results indicated a detection rate of up to 73.7% with a root mean square error of 3.6 cm. The second method was able to detect all plants (100% detection rate) with an accuracy of 2.7–3.0 cm, taking the plant spacing of 13 cm into account.
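The second methodology, stepping from a known initial plant by the nominal plant spacing and refining each prediction on the local points, can be sketched as below. The straight-row assumption, search radius, and synthetic data are hypothetical simplifications of the paper's iterative cluster search.

```python
import numpy as np

def plants_by_spacing(xy, start, spacing=0.13, radius=0.05, n_plants=10):
    """Iteratively predict the next plant one spacing ahead, then refine
    the prediction to the centroid of the points found nearby."""
    positions = [np.asarray(start, dtype=float)]
    direction = np.array([1.0, 0.0])          # assumed row direction
    for _ in range(n_plants - 1):
        guess = positions[-1] + spacing * direction
        near = xy[np.linalg.norm(xy - guess, axis=1) < radius]
        positions.append(near.mean(axis=0) if len(near) else guess)
    return np.array(positions)

# hypothetical row: 10 plants at 0.13 m spacing with small placement noise
rng = np.random.default_rng(5)
true = np.c_[0.13 * np.arange(10), np.zeros(10)] + rng.normal(0, 0.01, (10, 2))
cloud = np.vstack([p + rng.normal(0, 0.005, (25, 2)) for p in true])
est = plants_by_spacing(cloud, start=true[0])
errors = np.linalg.norm(est - true, axis=1)
```

Unlike plain Euclidean clustering, the spacing prior yields exactly one estimate per expected plant, which is why the paper's second method reaches a 100% detection rate.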


Precision Agriculture | 2017

Autonomous field navigation, data acquisition and node location in wireless sensor networks

David Reiser; Dimitris S. Paraforos; M.T. Khan; Hans W. Griepentrog; Manuel Vázquez-Arellano

Collaboration


Dive into David Reiser's collaborations.

Top Co-Authors

Miguel Garrido-Izard (Technical University of Madrid)
M.T. Khan (University of Hohenheim)
Garrido Miguel (Technical University of Madrid)
M. Garrido Izard (Technical University of Madrid)