Publications:Vision Based Perception for Mechatronic Weed Control
<div style='display: none'>
== Do not edit this section ==
</div>
{{PublicationSetupTemplate|Author=Björn Åstrand
|PID=237912
|Name=Åstrand, Björn (bjorn) (Högskolan i Halmstad (2804), Sektionen för Informationsvetenskap, Data- och Elektroteknik (IDE) (3905), Halmstad Embedded and Intelligent Systems Research (EIS) (3938))
|Title=Vision Based Perception for Mechatronic Weed Control
|PublicationType=PhD Thesis
|LocalId=2082/1039
|ArchiveNumber=
|Keywords=Vision-based perception;Plant recognition;Row following;Weed Control;Mechatronics in agriculture;Mobile robots
|Categories=Teknik och teknologier (2)
|ResearchSubjects=
|Projects=
|Notes=<p><strong>Also in series:</strong> Technical report. D (Department of Computer Science and Engineering, Chalmers University of Technology), 1653-1787 ; 9</p><p><strong>Additional papers:</strong></p><p>Åstrand, B., Baerveldt, A.-J., Plant recognition and localization using context information, Proceedings of the IEEE Conference Mechatronics and Robotics 2004 – special session Autonomous Machines in Agriculture, Aachen, Germany, September 13-15, pp. 1191-1196, (2004).</p><p>Åstrand, B., Baerveldt, A.-J., Plant recognition and localization using context information and individual plant features, Submitted to International Journal of Pattern Recognition and Artificial Intelligence, June 2005.</p>
|Abstract=<p>The use of computer-based signal processing and sensor technology to guide and control different types of agricultural field implements increases the performance of traditional implements and even makes it possible to create new ones. This thesis increases the knowledge on vision-based perception for mechatronic weed control. The contributions are of four different kinds:</p><p>First, a vision-based system for row guidance of agricultural field machinery has been proposed. The system uses a novel method, based on the Hough transform, for recognition of crop rows.</p><p>Second is a proposal for a vision-based perception system to discriminate between crops and weeds, using images from real situations in the field. Most crops are cultivated in rows and sown in a defined pattern, i.e. with a constant inter-plant distance. The proposed method introduces the concept of using these geometrical properties of the scene (context) for single plant recognition and localization. A mathematical model of a crop row has been derived that models the probability for the positions of consecutive crops in a row. Based on this mathematical model, two novel methods for context-based classification between crops and weeds have been developed. Furthermore, a novel method that combines geometrical features of the scene (context) and individual plant features has been proposed. The method has been evaluated on two datasets of images of sugar beet rows. The classification rates were 92 % and 98 %, respectively.</p><p>The third contribution is the design of a mobile agricultural robot equipped with these perception systems and a mechanical weeding tool intended for intra-row weed control in ecologically cultivated crops.</p><p>The fourth contribution is a demonstration of the feasibility of the perception systems in real field environments, especially with respect to robustness and real-time performance. The row guidance system has been implemented in three different row cultivators and has performed inter-row weed control at two commercial farms. The robot has proven able to follow a row structure by itself, while performing weed control within the seed line of a crop row, i.e. intra-row cultivation.</p>
|Opponents=Slaughter, David, Professor
|Supervisors=
|Subject=
|Uppsok=
|DefencePlace=Wigforssalen, Högskolan i Halmstad, Kristian IV:s väg 3, Halmstad
|DefenceLanguage=eng
|DefenceDate=2005-09-08T13:15
|CreatedDate=2007-05-28
|PublicationDate=2007-05-28
|LastUpdated=2014-05-06
|diva=http://hh.diva-portal.org/smash/record.jsf?searchId=1&pid=diva2:237912}}
<div style='display: none'>
== Do not edit this section ==
</div>
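The abstract's first contribution, crop-row recognition based on the Hough transform, can be illustrated with a minimal numpy-only voting sketch. This is not the thesis's implementation; the function name, parameterisation, and all numbers are invented for the example. Each plant centroid votes for every line rho = x·cos(theta) + y·sin(theta) passing through it, and the accumulator peak gives the dominant row.

```python
import numpy as np

def hough_row_line(points, n_theta=180, rho_res=1.0, rho_max=200.0):
    """Return (theta, rho) of the strongest line through the given points."""
    thetas = np.deg2rad(np.arange(n_theta))        # candidate angles, 0..179 deg
    n_rho = int(2 * rho_max / rho_res) + 1
    acc = np.zeros((n_theta, n_rho), dtype=int)    # (theta, rho) accumulator
    for x, y in points:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)
        idx = np.round((rhos + rho_max) / rho_res).astype(int)
        ok = (idx >= 0) & (idx < n_rho)            # keep votes inside the grid
        acc[np.nonzero(ok)[0], idx[ok]] += 1
    t, r = np.unravel_index(np.argmax(acc), acc.shape)
    return thetas[t], r * rho_res - rho_max

# Synthetic plant centroids along a vertical crop row at x = 20:
pts = [(20.0, float(y)) for y in range(0, 100, 10)]
theta, rho = hough_row_line(pts)
print(round(np.rad2deg(theta)), round(rho))        # dominant line: theta 0, rho 20
```

In a field image the points would come from segmenting vegetation against soil and taking connected-component centroids; the recovered line then serves as the row-guidance reference.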
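The context idea in the second contribution — crops are sown at a roughly constant inter-plant distance, so position along the row is itself a classification cue — can be caricatured as follows. The thesis derives a probabilistic model of consecutive crop positions; this sketch replaces that with a hard tolerance test, and every name and number here is invented for illustration.

```python
def classify_by_spacing(positions, first_crop, spacing, tol):
    """Label each along-row position 'crop' if it lies within tol of an
    integer multiple of the sowing distance from the first crop, else 'weed'."""
    labels = []
    for p in positions:
        k = round((p - first_crop) / spacing)      # nearest sowing index
        expected = first_crop + k * spacing        # predicted crop position
        labels.append('crop' if abs(p - expected) <= tol else 'weed')
    return labels

# Plants at 10 cm nominal spacing; the ones off the sowing grid are weeds.
print(classify_by_spacing([0.0, 9.8, 14.0, 20.3, 32.0],
                          first_crop=0.0, spacing=10.0, tol=1.0))
# ['crop', 'crop', 'weed', 'crop', 'weed']
```

The thesis goes further by combining this contextual cue with individual plant features, which is what lifts the reported classification rates to 92 % and 98 % on the two sugar beet datasets.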