Network


Latest external collaborations at the country level.

Hotspot


Dive into the research topics where J. Hemming is active.

Publication


Featured research published by J. Hemming.


Autonomous Robots | 2002

An Autonomous Robot for Harvesting Cucumbers in Greenhouses

E.J. van Henten; J. Hemming; B.A.J. van Tuijl; J.G. Kornet; J. Meuleman; J. Bontsema; E.A. van Os

This paper describes the concept of an autonomous robot for harvesting cucumbers in greenhouses. A description is given of the working environment of the robot and the logistics of harvesting. It is stated that for a 2 ha Dutch nursery, 4 harvesting robots and one docking station are needed during the peak season. Based on these preliminaries, the design specifications of the harvest robot are defined. The main requirement is that a single harvest operation may take at most 10 s. Then, the paper focuses on the individual hardware and software components of the robot. These include the autonomous vehicle, the manipulator, the end-effector, the two computer vision systems for detection and 3D imaging of the fruit and the environment and, finally, a control scheme that generates collision-free motions for the manipulator during harvesting. The manipulator has seven degrees-of-freedom (DOF). This is sufficient for the harvesting task. The end-effector is designed such that it handles the soft fruit without loss of quality. The thermal cutting device included in the end-effector prevents the spreading of viruses through the greenhouse. The computer vision system is able to detect more than 95% of the cucumbers in a greenhouse. Using geometric models, the ripeness of the cucumbers is determined. A motion planner based on the A*-search algorithm assures collision-free eye-hand co-ordination. In autumn 2001 system integration took place and the harvesting robot was tested in a greenhouse. With a success rate of 80%, field tests confirmed the ability of the robot to pick cucumbers without human interference. On average the robot needed 45 s to pick one cucumber. Future research focuses on hardware and software solutions to improve the picking speed and accuracy of the eye-hand co-ordination of the robot.
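
The motion planner mentioned above uses A*-search to obtain collision-free motions. Purely as an illustration, a minimal A* grid search is sketched below; the 2D occupancy grid, unit step costs and helper names are assumptions for the example and not the robot's actual joint-space planner for the 7-DOF manipulator.

```python
# Minimal A* search on a 2D occupancy grid (illustrative only; the cucumber
# robot plans collision-free motions for a 7-DOF manipulator in joint space).
import heapq

def astar(grid, start, goal):
    """grid: 2D list, 0 = free, 1 = obstacle; start/goal: (row, col) tuples."""
    rows, cols = len(grid), len(grid[0])

    def h(node):  # admissible heuristic: Manhattan distance to the goal
        return abs(node[0] - goal[0]) + abs(node[1] - goal[1])

    open_set = [(h(start), 0, start, None)]   # (f, g, node, parent)
    came_from, g_cost = {}, {start: 0}
    while open_set:
        f, g, node, parent = heapq.heappop(open_set)
        if node in came_from:                 # already expanded with lower cost
            continue
        came_from[node] = parent
        if node == goal:                      # reconstruct path back to start
            path = [node]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_set, (ng + h((nr, nc)), ng, (nr, nc), node))
    return None                               # no collision-free path found

# Example: plan around a single obstacle block
grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (2, 3)))
```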


Journal of Field Robotics | 2014

Harvesting Robots for High-value Crops: State-of-the-art Review and Challenges Ahead

C. Wouter Bac; Eldert J. van Henten; J. Hemming; Yael Edan

This review article analyzes state-of-the-art and future perspectives for harvesting robots in high-value crops. The objectives were to characterize the crop environment relevant for robotic harvesting, to perform a literature review on the state-of-the-art of harvesting robots using quantitative measures, and to reflect on the crop environment and literature review to formulate challenges and directions for future research and development. Harvesting robots were reviewed regarding the crop harvested in a production environment, performance indicators, design process techniques used, hardware design decisions, and algorithm characteristics. On average, localization success was 85%, detachment success was 75%, harvest success was 66%, fruit damage was 5%, peduncle damage was 45%, and cycle time was 33 s. A kiwi harvesting robot achieved the shortest cycle time of 1 s. Moreover, the performance of harvesting robots did not improve in the past three decades, and none of these 50 robots was commercialized. Four future challenges with R&D directions were identified to realize a positive trend in performance and to successfully implement harvesting robots in practice: (1) simplifying the task, (2) enhancing the robot, (3) defining requirements and measuring performance, and (4) considering additional requirements for successful implementation. This review article may provide new directions for future automation projects in high-value crops.


Sensors | 2009

Root Zone Sensors for Irrigation Management in Intensive Agriculture

Alberto Pardossi; Luca Incrocci; Giorgio Incrocci; Fernando Malorgio; Piero Battista; Laura Bacci; Bernardo Rapi; Paolo Marzialetti; J. Hemming; Jos Balendonck

Crop irrigation uses more than 70% of the world's water, and thus, improving irrigation efficiency is crucial to sustaining the food demand of a fast-growing world population. This objective may be accomplished by cultivating more water-efficient crop species and/or through the application of efficient irrigation systems, which includes the implementation of a suitable method for precise scheduling. At the farm level, irrigation is generally scheduled based on the grower's experience or on the determination of the soil water balance (weather-based method). An alternative approach entails the measurement of soil water status. Expensive and sophisticated root zone sensors (RZS), such as neutron probes, are available for use by soil and plant scientists, while cheap and practical devices are needed for irrigation management in commercial crops. The paper illustrates the main features of RZSs (for both soil moisture and salinity) marketed for the irrigation industry and discusses how such sensors may be integrated in a wireless network for computer-controlled irrigation and used for innovative irrigation strategies, such as deficit or dual-water irrigation. The paper also considers the main results of recent and current research conducted by the authors in Tuscany (Italy) on the irrigation management of container-grown ornamental plants, which is an important agricultural sector in Italy.
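
The weather-based scheduling method mentioned above keeps a running soil water balance and triggers irrigation once the root zone is sufficiently depleted. The sketch below illustrates that idea only; the crop coefficient, root-zone capacity, depletion fraction and irrigation gift are assumed example values, not figures from the paper.

```python
# Illustrative weather-based irrigation scheduling via a daily soil water balance.
# All parameter values are assumptions for the example, not from the paper.

TAW_MM = 40.0         # total available water in the root zone (mm), assumed
MAD = 0.4             # management-allowed depletion fraction, assumed
KC = 0.9              # crop coefficient, assumed
IRRIGATION_MM = 20.0  # fixed irrigation gift per event (mm), assumed

def schedule(daily_et0_mm, daily_rain_mm):
    """Return a list of booleans: irrigate on day i or not."""
    depletion = 0.0
    events = []
    for et0, rain in zip(daily_et0_mm, daily_rain_mm):
        etc = KC * et0                       # crop evapotranspiration
        depletion = max(0.0, depletion + etc - rain)
        if depletion > MAD * TAW_MM:         # readily available water exhausted
            events.append(True)
            depletion = max(0.0, depletion - IRRIGATION_MM)
        else:
            events.append(False)
    return events

# One week of example weather data (mm/day)
print(schedule([4.1, 4.5, 5.0, 4.8, 5.2, 4.9, 4.4],
               [0.0, 2.0, 0.0, 0.0, 0.0, 6.0, 0.0]))
```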


Sensors | 2014

Fruit Detectability Analysis for Different Camera Positions in Sweet-Pepper

J. Hemming; Jos Ruizendaal; J.W. Hofstee; Eldert J. van Henten

For robotic harvesting of sweet-pepper fruits in greenhouses a sensor system is required to detect and localize the fruits on the plants. Due to the complex structure of the plant, most fruits are (partially) occluded when an image is taken from one viewpoint only. In this research the effect of multiple camera positions and viewing angles on fruit visibility and detectability was investigated. A recording device was built that allowed the camera to be placed under different azimuth and zenith angles and to be moved horizontally along the crop row. Fourteen camera positions were chosen and the fruit visibility in the recorded images was manually determined for each position. For images taken from a single position, with a criterion of at most 50% occlusion per fruit, the fruit detectability (FD) was in no case higher than 69%. The best single positions were the front views and viewing upwards at a zenith angle of 60°. The FD increased when multiple viewpoint positions were combined. With a combination of the five most favourable positions the maximum FD was 90%.
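
The combined detectability follows from a simple rule: a fruit counts as detectable if at least one of the selected viewpoints shows it with no more than 50% occlusion. The sketch below illustrates that combination rule with invented visibility data; the position names and occlusion fractions are not the measured values from the paper.

```python
# Illustrative computation of fruit detectability (FD) for single camera
# positions and for combinations of positions, using the <=50% occlusion
# criterion. The visibility data below are invented for the example.
from itertools import combinations

# occlusion[position][fruit_id] = fraction of the fruit that is occluded
occlusion = {
    "front":     {1: 0.1, 2: 0.7, 3: 0.4, 4: 0.9},
    "zenith_60": {1: 0.6, 2: 0.3, 3: 0.2, 4: 0.8},
    "side_45":   {1: 0.2, 2: 0.9, 3: 0.6, 4: 0.3},
}
fruits = {1, 2, 3, 4}

def detectable(positions, max_occlusion=0.5):
    """Fruits visible (occlusion <= threshold) from at least one position."""
    return {f for f in fruits
            if any(occlusion[p][f] <= max_occlusion for p in positions)}

for p in occlusion:                                   # single-view FD
    print(p, len(detectable([p])) / len(fruits))

best = max(combinations(occlusion, 2),                # best pair of views
           key=lambda pair: len(detectable(pair)))
print(best, len(detectable(best)) / len(fruits))
```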


Journal of Field Robotics | 2017

Performance Evaluation of a Harvesting Robot for Sweet Pepper

C. Wouter Bac; J. Hemming; B.A.J. van Tuijl; Ruud Barth; Ehud Wais; Eldert J. van Henten

This paper evaluates a robot developed for autonomous harvesting of sweet peppers in a commercial greenhouse. Objectives were to assess robot performance under unmodified and simplified crop conditions, using two types of end effector (Fin Ray and Lip type), and to evaluate the performance contribution of stem-dependent determination of the grasp pose. We describe and discuss the performance of hardware and software components developed for fruit harvesting in a complex environment that includes lighting variation, occlusions, and densely spaced obstacles. After simplifying the crop, harvest success significantly improved from 6% to 26% (Fin Ray) and from 2% to 33% (Lip type). We observed a decrease in stem damage and an increase in grasp success after enabling stem-dependent determination of the grasp pose. Generally, the robot had difficulty in successfully picking sweet peppers, and we discuss possible causes. The robot's novel capability of perceiving the stem of a plant may serve as useful functionality for future robots.


IFAC Proceedings Volumes | 2013

Robotics in Protected Cultivation

E.J. van Henten; C.W. Bac; J. Hemming; Yael Edan

This paper reviews robotics for protected cultivation systems. Based on a short description of the greenhouse crop production process, the current state of greenhouse mechanization and the challenges for robotics in protected cultivation are identified. Examples of current greenhouse robotics research are presented. Since the complex working environment constitutes a considerable challenge for robotics, opportunities to deal with this complexity are identified. Solutions can be found in developing more advanced technology, including multiple-sensor approaches, sensor fusion and artificial intelligence concepts, and in human-robot collaboration. Alternatively, solutions can be found in modifying the working environment. The paper also pleads for the development of more generic solutions to deal with the small and very scattered markets for robots in protected cultivation.


IFAC Proceedings Volumes | 2013

Pixel classification and post-processing of plant parts using multi-spectral images of sweet-pepper

C.W. Bac; J. Hemming; E.J. van Henten

As part of the development of a sweet-pepper harvesting robot, obstacles should be detected. Objectives were to classify sweet-pepper vegetation into five plant parts: stem, top of a leaf (TL), bottom of a leaf (BL), fruit and petiole (Pet), and to improve classification results by post-processing. A multi-spectral imaging set-up with artificial lighting was developed to acquire images of sweet-pepper plants. The background was segmented from the vegetation, and the vegetation was classified into five plant parts through a sequence of four two-class classification problems. The true-positive detection rate / scaled false-positive rate achieved, on a pixel basis, were 40.0%/179% for stem, 78.7%/59.2% for TL, 68.5%/54.8% for BL, 54.5%/17.2% for fruit and 49.5%/176.0% for Pet, before post-processing. The opening operations applied were unable to reduce false stem detections to an acceptable rate. Also, many false detections of TL (>10%), BL (14%) and Pet (>15%) remained after post-processing, but these false detections are not critical for the application because these three plant parts are soft obstacles. Furthermore, results indicate that TL and BL can be distinguished. Green fruits were post-processed using a sequence of fill-up, opening and area-based segmentation. Several area-based thresholds were tested, and the most effective threshold resulted in a true-positive detection rate, on a blob basis, of 56.7% and a scaled false-positive detection rate of 6.7% for green fruits (N=60). Such fruit detection rates are a reasonable starting point for detecting obstacles for sweet-pepper harvesting, but additional work is required to complement the obstacle map into a complete representation of the environment.
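
The fruit post-processing sequence mentioned above (fill-up, opening, area-based segmentation) can be illustrated with standard morphological operations. The sketch below applies SciPy to a toy binary mask; the structuring element and area threshold are assumptions for the example, not the parameters used in the paper.

```python
# Illustrative post-processing of a binary fruit mask: hole filling,
# morphological opening, then area-based removal of small blobs.
# Structuring element size and area threshold are assumed example values.
import numpy as np
from scipy import ndimage

mask = np.zeros((20, 20), dtype=bool)
mask[2:10, 2:10] = True        # a large "fruit" blob
mask[5, 5] = False             # with a hole inside
mask[14:17, 14:17] = True      # a small 3x3 false detection

filled = ndimage.binary_fill_holes(mask)                             # fill-up
opened = ndimage.binary_opening(filled, structure=np.ones((3, 3)))   # opening

labels, n = ndimage.label(opened)                            # connected blobs
areas = ndimage.sum(opened, labels, index=list(range(1, n + 1)))
MIN_AREA = 20                                                # assumed threshold
keep = {i + 1 for i, a in enumerate(areas) if a >= MIN_AREA}
cleaned = np.isin(labels, list(keep))                        # area-based segmentation

print("blobs before/after area filter:", n, len(keep))
```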


Computers and Electronics in Agriculture | 2018

Data synthesis methods for semantic segmentation in agriculture: A Capsicum annuum dataset

R. Barth; Joris IJsselmuiden; J. Hemming; E.J. van Henten

This paper provides synthesis methods for large-scale semantic image segmentation datasets of agricultural scenes, with the objective of bridging the gap between state-of-the-art computer vision performance and that of computer vision in the agricultural robotics domain. We propose a novel methodology to generate renders of random meshes of plants based on empirical measurements, including the automated generation of per-pixel class and depth labels for multiple plant parts. A running example is given for Capsicum annuum (sweet or bell pepper) in a high-tech greenhouse. A synthetic dataset of 10,500 images was rendered through Blender, using scenes with 42 procedurally generated plant models with randomised plant parameters. These parameters were based on 21 empirically measured plant properties at 115 positions on 15 plant stems. Fruit models were obtained by 3D scanning, and plant part textures were gathered photographically. As a reference dataset for modelling and for evaluating segmentation performance, 750 empirical images of 50 plants were collected in a greenhouse from multiple angles and distances, using the image acquisition hardware of a sweet-pepper harvest robot prototype. We hypothesised high similarity between synthetic and empirical images, which we showed by analysing and comparing both sets qualitatively and quantitatively. The sets and models are publicly released with the intention of allowing performance comparisons between agricultural computer vision methods, obtaining feedback for modelling improvements and further validating the usability of synthetic bootstrapping and empirical fine-tuning. Finally, we provide a brief perspective on our hypothesis that related synthetic dataset bootstrapping and empirical fine-tuning can be used for improved learning.
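
A central step in the synthesis pipeline is drawing randomised plant parameters from empirical measurements before procedural generation and rendering. The sketch below illustrates that step in isolation by fitting simple normal distributions to invented measurements; the property names and values are not the 21 measured plant properties, and the actual models were generated and rendered in Blender.

```python
# Illustrative sampling of randomised plant parameters from empirical
# measurements, as done before procedural plant generation and rendering.
# Property names and measurement values are invented for the example.
import random
import statistics

# Empirical measurements per plant property (e.g. taken at many stem positions)
measurements = {
    "internode_length_cm": [5.2, 6.1, 5.8, 6.4, 5.5, 6.0],
    "leaf_length_cm":      [14.0, 15.5, 13.2, 16.1, 14.8],
    "stem_diameter_mm":    [9.1, 10.4, 9.8, 10.0, 9.5],
}

def sample_plant_parameters(rng):
    """Draw one randomised parameter set from per-property normal fits."""
    params = {}
    for name, values in measurements.items():
        mu = statistics.mean(values)
        sigma = statistics.stdev(values)
        params[name] = max(0.0, rng.gauss(mu, sigma))  # clamp to physical range
    return params

rng = random.Random(42)                      # fixed seed for reproducibility
for _ in range(3):                           # three randomised plant models
    print(sample_plant_parameters(rng))
```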


IFAC Proceedings Volumes | 2010

On-line Monitoring of the Energy and Moisture Flows in Greenhouses

J. Bontsema; R.J.C. van Ooteghem; J. Hemming; E.J. van Henten; A. van 't Ooster; H.J.J. Janssen

As greenhouses in the Netherlands keep increasing in size, the on-line monitoring of certain climate quantities and crop properties that are not directly measured becomes important for good management of the greenhouse production system. For the ventilation rate, the crop evaporation and the air exchange through screens, several monitors have been developed that calculate these quantities on-line from standard climate measurements. The design, implementation and performance in practice of the proposed methods are presented.
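
One common way to estimate a greenhouse air exchange rate from standard measurements is a CO2 mass balance. The sketch below shows that approach purely as an illustration; it is not necessarily the monitoring method used in the paper, and all values are invented.

```python
# Illustrative estimate of greenhouse ventilation flow from a CO2 mass balance
# (not necessarily the monitoring method of the paper; all values invented):
#
#   V * dC_in/dt = Q_inj + G * (C_out - C_in)
#
# with V the greenhouse air volume [m3], C_in/C_out CO2 concentrations [g/m3],
# Q_inj the CO2 injection rate [g/s] and G the ventilation flow [m3/s].

V = 30000.0  # greenhouse air volume (m3), assumed

def ventilation_flow(c_in_prev, c_in, c_out, q_inj, dt):
    """Solve the discretised mass balance for the ventilation flow G."""
    dcdt = (c_in - c_in_prev) / dt
    return (q_inj - V * dcdt) / (c_in - c_out)

# Example: 5-minute interval, indoor CO2 dropping slightly despite injection
g = ventilation_flow(c_in_prev=1.10, c_in=1.08, c_out=0.75, q_inj=8.0, dt=300.0)
print(f"estimated ventilation flow: {g:.1f} m3/s")
```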


Archive | 2018

Precisietechnologie Tuinbouw: PPS Autonoom onkruid verwijderen: Eindrapportage [Precision Technology Horticulture: PPP Autonomous Weed Removal: Final Report]

J. Hemming; Pieter Blok; Jos Ruizendaal

Work package 2 of the Precision Technology Horticulture programme focuses on autonomous weed removal. This final report covers four deliverables: D2.1 Module for recognition of red lettuce, D2.2 Vision-based crop row guidance module, D2.3 Machine for hoeing more than 8 crop rows simultaneously, and D2.6 Actuator for controlling weeds in full-field crops. D2.1 reports on the software for an extra colour segmentation algorithm that has been added to the Steketee IC-cultivator. With this algorithm it is possible to adequately detect and distinguish non-green plants, such as red lettuce, from weeds. For D2.2 a standalone module for crop row guidance for hoeing between the rows has been developed. D2.3 describes the extension in hardware and software that makes it possible to hoe up to 24 crop rows simultaneously. For D2.6, research has been conducted into different robotic arms to move the end effector to the right spot for weed control in full-field crops. Different arms were compared and the maximum possible driving speed was calculated. A test device for full-field weed control was built based on an x-z positioning unit. The chapter on publications and media provides an overview of the dissemination of the project results throughout the duration of the project.
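
The colour segmentation of D2.1 distinguishes non-green crop plants, such as red lettuce, from green weeds. A minimal sketch of such a segmentation in HSV colour space is given below; the thresholds and the use of OpenCV are assumptions for the example, not the algorithm implemented on the Steketee IC-cultivator.

```python
# Illustrative HSV colour segmentation separating red (crop, e.g. red lettuce)
# from green (weed) pixels. Thresholds are assumed example values and are not
# the settings of the IC-cultivator implementation.
import numpy as np
import cv2

def segment_red_and_green(bgr_image):
    """Return (red_mask, green_mask) as uint8 masks (255 = pixel in class)."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)

    # Red wraps around hue 0 in OpenCV's 0-179 hue range, so use two bands.
    red_lo = cv2.inRange(hsv, (0, 60, 40), (10, 255, 255))
    red_hi = cv2.inRange(hsv, (170, 60, 40), (179, 255, 255))
    red_mask = cv2.bitwise_or(red_lo, red_hi)

    green_mask = cv2.inRange(hsv, (35, 60, 40), (85, 255, 255))
    return red_mask, green_mask

# Tiny synthetic test image: one reddish and one greenish pixel (BGR order)
img = np.zeros((1, 2, 3), dtype=np.uint8)
img[0, 0] = (40, 30, 200)
img[0, 1] = (40, 200, 30)
red, green = segment_red_and_green(img)
print(red.ravel(), green.ravel())
```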

Collaboration


Dive into J. Hemming's collaboration.

Top Co-Authors

E.J. van Henten (Wageningen University and Research Centre)
J. Bontsema (Wageningen University and Research Centre)
B.A.J. van Tuijl (Wageningen University and Research Centre)
Eldert J. van Henten (Wageningen University and Research Centre)
J. Balendonck (Wageningen University and Research Centre)
Yael Edan (Ben-Gurion University of the Negev)
C.W. Bac (Wageningen University and Research Centre)
Ruud Barth (Wageningen University and Research Centre)
A.T. Nieuwenhuizen (Wageningen University and Research Centre)
P.O. Bleeker (Wageningen University and Research Centre)