Publications:Upper limit for context-based crop classification in robotic weeding applications

From ISLAB/CAISR



Title: Upper limit for context-based crop classification in robotic weeding applications
Author:
Year: 2016
PublicationType: Journal Paper
Journal: Biosystems Engineering
HostPublication:
Conference:
DOI: http://dx.doi.org/10.1016/j.biosystemseng.2016.01.012
DiVA URL: http://hh.diva-portal.org/smash/record.jsf?searchId=1&pid=diva2:939043
Abstract

Knowledge of the precise position of crop plants is a prerequisite for effective mechanical weed control in robotic weeding applications, such as in crops like sugar beet that are sensitive to mechanical stress. Visual detection and recognition of crop plants based on their shapes has been described many times in the literature. In this paper the potential of using knowledge about the crop seed pattern is investigated, based on simulated output from a perception system. The reliability of position-based crop plant detection is shown to depend on the weed density (ρ, measured in weed plants per square metre) and the crop plant pattern position uncertainty (σ_x and σ_y, measured in metres along and perpendicular to the crop row, respectively). The recognition reliability can be described by the positive predictive value (PPV), which is limited by the seeding pattern uncertainty and the weed density according to the inequality PPV ≤ (1 + 2πρσ_xσ_y)^(−1). This result matches computer simulations of two novel methods for position-based crop recognition as well as earlier reported field-based trials. © 2016 IAgrE
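
The inequality stated in the abstract can be evaluated directly. The following is a minimal Python sketch (not code from the paper) that computes the upper bound on the PPV for given values of the weed density ρ and the seed-pattern uncertainties σ_x and σ_y; the example numbers are illustrative assumptions, not results from the study.

```python
import math

def ppv_upper_bound(rho, sigma_x, sigma_y):
    """Upper bound on the positive predictive value (PPV) for
    position-based crop recognition, as given in the abstract:

        PPV <= 1 / (1 + 2 * pi * rho * sigma_x * sigma_y)

    rho      -- weed density (weed plants per square metre)
    sigma_x  -- seed-pattern position uncertainty along the row (metres)
    sigma_y  -- seed-pattern position uncertainty across the row (metres)
    """
    return 1.0 / (1.0 + 2.0 * math.pi * rho * sigma_x * sigma_y)

if __name__ == "__main__":
    # Hypothetical example values (not taken from the paper):
    # 50 weeds per square metre and 1 cm uncertainty in each direction.
    print(ppv_upper_bound(rho=50.0, sigma_x=0.01, sigma_y=0.01))  # ~0.97
```

As the sketch illustrates, the bound approaches 1 when either the weed density or the seed-pattern uncertainty becomes small, and decreases as their product grows.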