Philipp Lottes

Postdoc
Contact:
Email: philipp.lottes@igg.uni-bonn.de
Tel: +49 – 228 – 73 – N/A
Fax: +49 – 228 – 73 – 27 12
Office: Nussallee 15, Ground floor, room 0.014
Address:
University of Bonn
Photogrammetry, IGG
Nussallee 15
53115 Bonn

Short CV

Philipp Lottes was a PhD student at the Photogrammetry Lab of the University of Bonn from November 2015 to January 2021 and is now a postdoc. He received his master’s degree from the Institute of Geodesy and Geoinformation in 2015. During his master studies, he worked as a student assistant for the Institute of Geodesy and Geoinformation as well as for the Photogrammetry Lab. Before moving to Bonn, he finished his bachelor studies in Surveying Engineering at the Bochum University of Applied Sciences in 2012 and subsequently worked as a surveying engineer for Marx Ingenieurgesellschaft mbH for 1.5 years. He also worked as a research assistant in the EU-funded project FLOURISH (2015-2018). In his research, he focuses on machine learning and probabilistic techniques for developing plant classification systems for agricultural ground robots as well as unmanned aerial robots. He is particularly interested in unsupervised learning, transfer learning, and deep learning. One of his main research questions is: How can we design machine learning algorithms that provide high and robust classification performance in erratically changing environments such as agricultural fields?

Research Interests

  • Machine Learning
  • Computer Vision
  • Agricultural Robotics and Breeding Applications

Projects

  • FLOURISH – Developing an adaptable robotic solution for precision farming applications. By combining the aerial survey capabilities of a small autonomous multi-copter Unmanned Aerial Vehicle (UAV) with a multi-purpose agricultural Unmanned Ground Vehicle (UGV), the system will be able to survey a field from the air, perform targeted intervention on the ground, and provide detailed information for decision support, all with minimal user intervention.
    Within this project, I developed a classification system for both the UGV and the UAV that enables the robots to identify crops and weeds in the field.
    Demo video of the online sugar beet vs. weed classification in the field.
  • Pheno-Inspect (EFRE-funded Start-Up, TRANSFER.NRW) – Accelerating and improving breeding towards more efficient crops and varieties is key to increasing yield and improving the resilience of plants. For plant breeders, it is important to observe and document the phenotypic traits describing the appearance of plants in the field in order to evaluate the quality and success of the breeding process. With Pheno-Inspect, we aim to offer growers and farmers novel software solutions for automated high-throughput phenotyping in the field. For sensing, we rely on small and lightweight aerial platforms, which enable a flexible, large-area, and time-efficient survey of fields and plot experiments. With our software toolchain, we provide breeders and farmers with a tool to gain precise knowledge about the crop and individual plants. We automatically detect phenotypic traits of crop plants as well as the species of plants and weeds in the field and derive site- or plot-specific statistics (a minimal illustrative sketch follows below). Our approach relies on state-of-the-art machine learning methods optimized for the agricultural domain to semantically interpret the captured image data and to extract the desired parameters about the plants. The learning procedures we developed use expert knowledge provided by the user to adapt efficiently and thus deliver the desired results quickly and effectively with regard to individual problems and local characteristics of the environment.
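    To make the idea of plot-specific statistics concrete, the following is a minimal Python sketch that computes per-class coverage and area from a semantic segmentation mask. The class ids, mask format, and ground resolution used here are illustrative assumptions and do not describe the actual Pheno-Inspect toolchain or its data formats.

    # Minimal sketch: deriving plot-specific statistics from a per-pixel
    # label mask. Class ids and units are assumptions for illustration.
    import numpy as np

    SOIL, CROP, WEED = 0, 1, 2  # hypothetical class ids

    def plot_statistics(mask: np.ndarray, ground_res_m: float) -> dict:
        """Compute simple coverage statistics for one plot.

        mask: 2D array of class ids (one entry per pixel).
        ground_res_m: ground sampling distance in meters per pixel.
        """
        pixel_area = ground_res_m ** 2              # area of one pixel [m^2]
        total = mask.size
        crop_px = int(np.sum(mask == CROP))
        weed_px = int(np.sum(mask == WEED))
        return {
            "crop_cover_pct": 100.0 * crop_px / total,
            "weed_cover_pct": 100.0 * weed_px / total,
            "crop_area_m2": crop_px * pixel_area,
            "weed_area_m2": weed_px * pixel_area,
        }

    if __name__ == "__main__":
        # Toy 4x4 plot with some crop (1) and weed (2) pixels.
        demo = np.array([[0, 1, 1, 0],
                         [0, 1, 2, 0],
                         [0, 1, 2, 0],
                         [0, 0, 0, 0]])
        print(plot_statistics(demo, ground_res_m=0.01))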

Awards

  • Winner of the Best Paper Award in Automation of the IEEE Robotics and Automation Society at ICRA 2017, “UAV-Based Crop and Weed Classification for Smart Farming”
  • Finalist for the Best Application Paper Award of the IEEE/RSJ International Conference on Intelligent Robots and Systems at IROS 2017, “Semi-Supervised Online Visual Crop and Weed Classification in Precision Farming Exploiting Plant Arrangement”
  • Finalist for the Best Service Paper Award of the IEEE Robotics and Automation Society at ICRA 2018, “Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs”

Dataset Releases

  • Crop-Weed for Semantic Segmentation: We released 12,340 labeled images containing pixel-wise annotations of sugar beets and weeds. On average, we recorded data three times per week over six weeks within the season, which captures the period relevant for weed control, starting at the emergence of the plants. The robot carried a 4-channel multi-spectral camera.
  • Crop, Weed, Grass, and Plant Stems for Semantic Segmentation: We use FCNs to predict the stem positions of plants as well as the semantic segmentation of the scene into the classes (1) soil, (2) sugar beet, (3) dicot-weed, and (4) grass-weed. We published the datasets containing 921 RGB+NIR and 400 RGB-only images as well as their corresponding annotations for the semantic segmentation and the stem detection task (a minimal loading sketch follows below). Link to the paper.
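    As a rough illustration of how such image–annotation pairs could be consumed for training a semantic segmentation network, here is a minimal PyTorch-style loading sketch. The directory layout, file naming, and label encoding are assumptions made for this example only; please consult the dataset documentation for the actual format.

    # Minimal loading sketch for an RGB+NIR crop/weed segmentation dataset.
    # Directory layout, file names, and label encoding are assumptions.
    from pathlib import Path

    import numpy as np
    from PIL import Image
    from torch.utils.data import Dataset

    class CropWeedDataset(Dataset):
        """Pairs of 4-channel (RGB+NIR) images and pixel-wise label masks."""

        def __init__(self, root: str):
            base = Path(root)
            self.rgb_files = sorted((base / "rgb").glob("*.png"))
            self.nir_dir = base / "nir"            # assumed: one NIR image per RGB image
            self.mask_dir = base / "annotations"   # assumed: integer label masks

        def __len__(self) -> int:
            return len(self.rgb_files)

        def __getitem__(self, idx: int):
            rgb_path = self.rgb_files[idx]
            rgb = np.asarray(Image.open(rgb_path), dtype=np.float32) / 255.0
            nir = np.asarray(Image.open(self.nir_dir / rgb_path.name).convert("L"),
                             dtype=np.float32) / 255.0
            mask = np.asarray(Image.open(self.mask_dir / rgb_path.name), dtype=np.int64)
            # Stack into a channels-first 4-channel input for a CNN.
            image = np.concatenate([rgb, nir[..., None]], axis=-1).transpose(2, 0, 1)
            return image, mask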

Teaching

  • 3D Coordinate Systems, WS 2014/2015
  • 3D Coordinate Systems, WS 2015/2016
  • Solving Online Perception Problems in ROS, WS 2015/2016
  • 3D Coordinate Systems, WS 2016/2017
  • 3D Coordinate Systems, WS 2017/2018
  • Master Project: Automated Field Analysis for Crop Farming, SS 2018
  • Master Project: Automated Field Analysis for Crop Farming, WS 2018/2019
  • 3D Coordinate Systems, WS 2018/2019
  • Master Project: Machine Learning for Smart Breeding Applications, SS 2019

Publications

2023

  • F. Magistri, J. Weyler, D. Gogoll, P. Lottes, J. Behley, N. Petrinic, and C. Stachniss, “From one Field to Another – Unsupervised Domain Adaptation for Semantic Segmentation in Agricultural Robotics,” Computers and Electronics in Agriculture, vol. 212, p. 108114, 2023. doi:10.1016/j.compag.2023.108114
    [BibTeX] [PDF]
    @article{magistri2023cea,
    author = {F. Magistri and J. Weyler and D. Gogoll and P. Lottes and J. Behley and N. Petrinic and C. Stachniss},
    title = {From one Field to Another – Unsupervised Domain Adaptation for Semantic Segmentation in Agricultural Robotics},
    journal = cea,
    year = {2023},
    volume = {212},
    pages = {108114},
    doi = {https://doi.org/10.1016/j.compag.2023.108114},
    }

  • Y. L. Chong, J. Weyler, P. Lottes, J. Behley, and C. Stachniss, “Unsupervised Generation of Labeled Training Images for Crop-Weed Segmentation in New Fields and on Different Robotic Platforms,” IEEE Robotics and Automation Letters (RA-L), vol. 8, iss. 8, pp. 5259–5266, 2023. doi:10.1109/LRA.2023.3293356
    [BibTeX] [PDF] [Code]
    @article{chong2023ral,
    author = {Y.L. Chong and J. Weyler and P. Lottes and J. Behley and C. Stachniss},
    title = {{Unsupervised Generation of Labeled Training Images for Crop-Weed Segmentation in New Fields and on Different Robotic Platforms}},
    journal = ral,
    volume = {8},
    number = {8},
    pages = {5259--5266},
    year = 2023,
    issn = {2377-3766},
    doi = {10.1109/LRA.2023.3293356},
    note = {accepted},
    codeurl = {https://github.com/PRBonn/StyleGenForLabels}
    }

2022

  • J. Weyler, J. Quakernack, P. Lottes, J. Behley, and C. Stachniss, “Joint Plant and Leaf Instance Segmentation on Field-Scale UAV Imagery,” IEEE Robotics and Automation Letters (RA-L), vol. 7, iss. 2, pp. 3787-3794, 2022. doi:10.1109/LRA.2022.3147462
    [BibTeX] [PDF]
    @article{weyler2022ral,
    author = {J. Weyler and J. Quakernack and P. Lottes and J. Behley and C. Stachniss},
    title = {{Joint Plant and Leaf Instance Segmentation on Field-Scale UAV Imagery}},
    journal = ral,
    year = 2022,
    doi = {10.1109/LRA.2022.3147462},
    issn = {2377-3766},
    volume = {7},
    number = {2},
    pages = {3787-3794},
    }

2021

  • A. Barreto, P. Lottes, F. R. Ispizua, S. Baumgarten, N. A. Wolf, C. Stachniss, A. -K. Mahlein, and S. Paulus, “Automatic UAV-based counting of seedlings in sugar-beet field and extension to maize and strawberry,” Computers and Electronics in Agriculture, 2021.
    [BibTeX] [PDF]
    @article{barreto2021cea,
    author = {A. Barreto and P. Lottes and F.R. Ispizua and S. Baumgarten and N.A. Wolf and C. Stachniss and A.-K. Mahlein and S. Paulus},
    title = {Automatic UAV-based counting of seedlings in sugar-beet field and extension to maize and strawberry},
    journal = {Computers and Electronics in Agriculture},
    year = {2021},
    }

  • A. Pretto, S. Aravecchia, W. Burgard, N. Chebrolu, C. Dornhege, T. Falck, F. Fleckenstein, A. Fontenla, M. Imperoli, R. Khanna, F. Liebisch, P. Lottes, A. Milioto, D. Nardi, S. Nardi, J. Pfeifer, M. Popovic, C. Potena, C. Pradalier, E. Rothacker-Feder, I. Sa, A. Schaefer, R. Siegwart, C. Stachniss, A. Walter, V. Winterhalter, X. Wu, and J. Nieto, “Building an Aerial-Ground Robotics System for Precision Farming: An Adaptable Solution,” IEEE Robotics & Automation Magazine, vol. 28, iss. 3, 2021.
    [BibTeX] [PDF]
    @Article{pretto2021ram,
    title = {{Building an Aerial-Ground Robotics System for Precision Farming: An Adaptable Solution}},
    author = {A. Pretto and S. Aravecchia and W. Burgard and N. Chebrolu and C. Dornhege and T. Falck and F. Fleckenstein and A. Fontenla and M. Imperoli and R. Khanna and F. Liebisch and P. Lottes and A. Milioto and D. Nardi and S. Nardi and J. Pfeifer and M. Popovic and C. Potena and C. Pradalier and E. Rothacker-Feder and I. Sa and A. Schaefer and R. Siegwart and C. Stachniss and A. Walter and V. Winterhalter and X. Wu and J. Nieto},
    journal = ram,
    volume = 28,
    number = 3,
    year = {2021},
    url={https://www.ipb.uni-bonn.de/pdfs/pretto2021ram.pdf}
    }

  • F. Görlich, E. Marks, A. Mahlein, K. König, P. Lottes, and C. Stachniss, “UAV-Based Classification of Cercospora Leaf Spot Using RGB Images,” Drones, vol. 5, iss. 2, 2021. doi:10.3390/drones5020034
    [BibTeX] [PDF]

    Plant diseases can impact crop yield. Thus, the detection of plant diseases using sensors that can be mounted on aerial vehicles is in the interest of farmers to support decision-making in integrated pest management and to breeders for selecting tolerant or resistant genotypes. This paper investigated the detection of Cercospora leaf spot (CLS), caused by Cercospora beticola in sugar beet using RGB imagery. We proposed an approach to tackle the CLS detection problem using fully convolutional neural networks, which operate directly on RGB images captured by a UAV. This efficient approach does not require complex multi- or hyper-spectral sensors, but provides reliable results and high sensitivity. We provided a detection pipeline for pixel-wise semantic segmentation of CLS symptoms, healthy vegetation, and background so that our approach can automatically quantify the grade of infestation. We thoroughly evaluated our system using multiple UAV datasets recorded from different sugar beet trial fields. The dataset consisted of a training and a test dataset and originated from different fields. We used it to evaluate our approach under realistic conditions and analyzed its generalization capabilities to unseen environments. The obtained results correlated to visual estimation by human experts significantly. The presented study underlined the potential of high-resolution RGB imaging and convolutional neural networks for plant disease detection under field conditions. The demonstrated procedure is particularly interesting for applications under practical conditions, as no complex and cost-intensive measuring system is required.

    @Article{goerlich2021drones,
    AUTHOR = {Görlich, Florian and Marks, Elias and Mahlein, Anne-Katrin and König, Kathrin and Lottes, Philipp and Stachniss, Cyrill},
    TITLE = {{UAV-Based Classification of Cercospora Leaf Spot Using RGB Images}},
    JOURNAL = {Drones},
    VOLUME = {5},
    YEAR = {2021},
    NUMBER = {2},
    ARTICLE-NUMBER = {34},
    URL = {https://www.mdpi.com/2504-446X/5/2/34/pdf},
    ISSN = {2504-446X},
    ABSTRACT = {Plant diseases can impact crop yield. Thus, the detection of plant diseases using sensors that can be mounted on aerial vehicles is in the interest of farmers to support decision-making in integrated pest management and to breeders for selecting tolerant or resistant genotypes. This paper investigated the detection of Cercospora leaf spot (CLS), caused by Cercospora beticola in sugar beet using RGB imagery. We proposed an approach to tackle the CLS detection problem using fully convolutional neural networks, which operate directly on RGB images captured by a UAV. This efficient approach does not require complex multi- or hyper-spectral sensors, but provides reliable results and high sensitivity. We provided a detection pipeline for pixel-wise semantic segmentation of CLS symptoms, healthy vegetation, and background so that our approach can automatically quantify the grade of infestation. We thoroughly evaluated our system using multiple UAV datasets recorded from different sugar beet trial fields. The dataset consisted of a training and a test dataset and originated from different fields. We used it to evaluate our approach under realistic conditions and analyzed its generalization capabilities to unseen environments. The obtained results correlated to visual estimation by human experts significantly. The presented study underlined the potential of high-resolution RGB imaging and convolutional neural networks for plant disease detection under field conditions. The demonstrated procedure is particularly interesting for applications under practical conditions, as no complex and cost-intensive measuring system is required.},
    DOI = {10.3390/drones5020034}
    }

2020

  • D. Gogoll, P. Lottes, J. Weyler, N. Petrinic, and C. Stachniss, “Unsupervised Domain Adaptation for Transferring Plant Classification Systems to New Field Environments, Crops, and Robots,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2020.
    [BibTeX] [PDF] [Video]
    @inproceedings{gogoll2020iros,
    author = {D. Gogoll and P. Lottes and J. Weyler and N. Petrinic and C. Stachniss},
    title = {{Unsupervised Domain Adaptation for Transferring Plant Classification Systems to New Field Environments, Crops, and Robots}},
    booktitle = iros,
    year = {2020},
    url = {https://www.ipb.uni-bonn.de/pdfs/gogoll2020iros.pdf},
    videourl = {https://www.youtube.com/watch?v=6K79Ih6KXTs},
    }

  • X. Wu, S. Aravecchia, P. Lottes, C. Stachniss, and C. Pradalier, “Robotic Weed Control Using Automated Weed and Crop Classification,” Journal of Field Robotics, vol. 37, pp. 322-340, 2020.
    [BibTeX] [PDF]
    @Article{wu2020jfr,
    title = {Robotic Weed Control Using Automated Weed and Crop Classification},
    author = {X. Wu and S. Aravecchia and P. Lottes and C. Stachniss and C. Pradalier},
    journal = jfr,
    year = {2020},
    volume = {37},
    number = {2},
    pages = {322-340},
    url = {https://www.ipb.uni-bonn.de/pdfs/wu2020jfr.pdf},
    }

  • P. Lottes, J. Behley, N. Chebrolu, A. Milioto, and C. Stachniss, “Robust joint stem detection and crop-weed classification using image sequences for plant-specific treatment in precision farming,” Journal of Field Robotics, vol. 37, pp. 20-34, 2020. doi:10.1002/rob.21901
    [BibTeX] [PDF]
    @Article{lottes2020jfr,
    title = {Robust joint stem detection and crop-weed classification using image sequences for plant-specific treatment in precision farming},
    author = {Lottes, P. and Behley, J. and Chebrolu, N. and Milioto, A. and Stachniss, C.},
    journal = jfr,
    volume = {37},
    number = {1},
    pages = {20-34},
    year = {2020},
    doi = {https://doi.org/10.1002/rob.21901},
    url = {https://www.ipb.uni-bonn.de/pdfs/lottes2019jfr.pdf},
    }

  • R. Sheikh, A. Milioto, P. Lottes, C. Stachniss, M. Bennewitz, and T. Schultz, “Gradient and Log-based Active Learning for Semantic Segmentation of Crop and Weed for Agricultural Robots,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2020.
    [BibTeX] [PDF] [Video]
    @InProceedings{sheikh2020icra,
    title = {Gradient and Log-based Active Learning for Semantic Segmentation of Crop and Weed for Agricultural Robots},
    author = {R. Sheikh and A. Milioto and P. Lottes and C. Stachniss and M. Bennewitz and T. Schultz},
    booktitle = icra,
    year = {2020},
    url = {https://www.ipb.uni-bonn.de/pdfs/sheikh2020icra.pdf},
    videourl = {https://www.youtube.com/watch?v=NySa59gxFAg},
    }

2019

  • A. Pretto, S. Aravecchia, W. Burgard, N. Chebrolu, C. Dornhege, T. Falck, F. Fleckenstein, A. Fontenla, M. Imperoli, R. Khanna, F. Liebisch, P. Lottes, A. Milioto, D. Nardi, S. Nardi, J. Pfeifer, M. Popović, C. Potena, C. Pradalier, E. Rothacker-Feder, I. Sa, A. Schaefer, R. Siegwart, C. Stachniss, A. Walter, W. Winterhalter, X. Wu, and J. Nieto, “Building an Aerial-Ground Robotics System for Precision Farming,” arXiv Preprint, 2019.
    [BibTeX] [PDF]
    @article{pretto2019arxiv,
    author = {A. Pretto and S. Aravecchia and W. Burgard and N. Chebrolu and C. Dornhege and T. Falck and F. Fleckenstein and A. Fontenla and M. Imperoli and R. Khanna and F. Liebisch and P. Lottes and A. Milioto and D. Nardi and S. Nardi and J. Pfeifer and M. Popović and C. Potena and C. Pradalier and E. Rothacker-Feder and I. Sa and A. Schaefer and R. Siegwart and C. Stachniss and A. Walter and W. Winterhalter and X. Wu and J. Nieto},
    title = {{Building an Aerial-Ground Robotics System for Precision Farming}},
    journal = arxiv,
    year = 2019,
    eprint = {1911.03098v1},
    url = {https://arxiv.org/pdf/1911.03098v1},
    keywords = {cs.RO},
    }

  • E. Palazzolo, J. Behley, P. Lottes, P. Giguère, and C. Stachniss, “ReFusion: 3D Reconstruction in Dynamic Environments for RGB-D Cameras Exploiting Residuals,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2019.
    [BibTeX] [PDF] [Code] [Video]
    @InProceedings{palazzolo2019iros,
    author = {E. Palazzolo and J. Behley and P. Lottes and P. Gigu\`ere and C. Stachniss},
    title = {{ReFusion: 3D Reconstruction in Dynamic Environments for RGB-D Cameras Exploiting Residuals}},
    booktitle = iros,
    year = {2019},
    url = {https://www.ipb.uni-bonn.de/pdfs/palazzolo2019iros.pdf},
    codeurl = {https://github.com/PRBonn/refusion},
    videourl = {https://youtu.be/1P9ZfIS5-p4},
    }

  • N. Chebrolu, P. Lottes, T. Laebe, and C. Stachniss, “Robot Localization Based on Aerial Images for Precision Agriculture Tasks in Crop Fields,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2019.
    [BibTeX] [PDF] [Video]
    @InProceedings{chebrolu2019icra,
    author = {N. Chebrolu and P. Lottes and T. Laebe and C. Stachniss},
    title = {{Robot Localization Based on Aerial Images for Precision Agriculture Tasks in Crop Fields}},
    booktitle = icra,
    year = 2019,
    url = {https://www.ipb.uni-bonn.de/pdfs/chebrolu2019icra.pdf},
    videourl = {https://youtu.be/TlijLgoRLbc},
    }

  • P. Lottes, N. Chebrolu, F. Liebisch, and C. Stachniss, “UAV-based Field Monitoring for Precision Farming,” in Proc. of the 25th Workshop für Computer-Bildanalyse und unbemannte autonom fliegende Systeme in der Landwirtschaft, 2019.
    [BibTeX] [PDF]
    @InProceedings{lottes2019cbaws,
    title={UAV-based Field Monitoring for Precision Farming},
    author={P. Lottes and N. Chebrolu and F. Liebisch and C. Stachniss},
    booktitle= {Proc. of the 25th Workshop f\"ur Computer-Bildanalyse und unbemannte autonom fliegende Systeme in der Landwirtschaft},
    year= {2019},
    url= {https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes2019cbaws.pdf},
    }

2018

  • I. Sa, M. Popovic, R. Khanna, Z. Chen, P. Lottes, F. Liebisch, J. Nieto, C. Stachniss, and R. Siegwart, “WeedMap: A Large-Scale Semantic Weed Mapping Framework Using Aerial Multispectral Imaging and Deep Neural Network for Precision Farming,” Remote Sensing, vol. 10, 2018. doi:10.3390/rs10091423
    [BibTeX] [PDF]

    The ability to automatically monitor agricultural fields is an important capability in precision farming, enabling steps towards more sustainable agriculture. Precise, high-resolution monitoring is a key prerequisite for targeted intervention and the selective application of agro-chemicals. The main goal of this paper is developing a novel crop/weed segmentation and mapping framework that processes multispectral images obtained from an unmanned aerial vehicle (UAV) using a deep neural network (DNN). Most studies on crop/weed semantic segmentation only consider single images for processing and classification. Images taken by UAVs often cover only a few hundred square meters with either color only or color and near-infrared (NIR) channels. Although a map can be generated by processing single segmented images incrementally, this requires additional complex information fusion techniques which struggle to handle high fidelity maps due to their computational costs and problems in ensuring global consistency. Moreover, computing a single large and accurate vegetation map (e.g., crop/weed) using a DNN is non-trivial due to difficulties arising from: (1) limited ground sample distances (GSDs) in high-altitude datasets, (2) sacrificed resolution resulting from downsampling high-fidelity images, and (3) multispectral image alignment. To address these issues, we adopt a stand sliding window approach that operates on only small portions of multispectral orthomosaic maps (tiles), which are channel-wise aligned and calibrated radiometrically across the entire map. We define the tile size to be the same as that of the DNN input to avoid resolution loss. Compared to our baseline model (i.e., SegNet with 3 channel RGB inputs) yielding an area under the curve (AUC) of [background=0.607, crop=0.681, weed=0.576], our proposed model with 9 input channels achieves [0.839, 0.863, 0.782]. Additionally, we provide an extensive analysis of 20 trained models, both qualitatively and quantitatively, in order to evaluate the effects of varying input channels and tunable network hyperparameters. Furthermore, we release a large sugar beet/weed aerial dataset with expertly guided annotations for further research in the fields of remote sensing, precision agriculture, and agricultural robotics.
    @Article{sa2018rs,
    author = {I. Sa and M. Popovic and R. Khanna and Z. Chen and P. Lottes and F. Liebisch and J. Nieto and C. Stachniss and R. Siegwart},
    title = {{WeedMap: A Large-Scale Semantic Weed Mapping Framework Using Aerial Multispectral Imaging and Deep Neural Network for Precision Farming}},
    journal = rs,
    year = 2018,
    volume = 10,
    issue = 9,
    url = {https://www.mdpi.com/2072-4292/10/9/1423/pdf},
    doi = {10.3390/rs10091423},
    abstract = {The ability to automatically monitor agricultural fields is an important capability in precision farming, enabling steps towards more sustainable agriculture. Precise, high-resolution monitoring is a key prerequisite for targeted intervention and the selective application of agro-chemicals. The main goal of this paper is developing a novel crop/weed segmentation and mapping framework that processes multispectral images obtained from an unmanned aerial vehicle (UAV) using a deep neural network (DNN). Most studies on crop/weed semantic segmentation only consider single images for processing and classification. Images taken by UAVs often cover only a few hundred square meters with either color only or color and near-infrared (NIR) channels. Although a map can be generated by processing single segmented images incrementally, this requires additional complex information fusion techniques which struggle to handle high fidelity maps due to their computational costs and problems in ensuring global consistency. Moreover, computing a single large and accurate vegetation map (e.g., crop/weed) using a DNN is non-trivial due to difficulties arising from: (1) limited ground sample distances (GSDs) in high-altitude datasets, (2) sacrificed resolution resulting from downsampling high-fidelity images, and (3) multispectral image alignment. To address these issues, we adopt a stand sliding window approach that operates on only small portions of multispectral orthomosaic maps (tiles), which are channel-wise aligned and calibrated radiometrically across the entire map. We define the tile size to be the same as that of the DNN input to avoid resolution loss. Compared to our baseline model (i.e., SegNet with 3 channel RGB inputs) yielding an area under the curve (AUC) of [background=0.607, crop=0.681, weed=0.576], our proposed model with 9 input channels achieves [0.839, 0.863, 0.782]. Additionally, we provide an extensive analysis of 20 trained models, both qualitatively and quantitatively, in order to evaluate the effects of varying input channels and tunable network hyperparameters. Furthermore, we release a large sugar beet/weed aerial dataset with expertly guided annotations for further research in the fields of remote sensing, precision agriculture, and agricultural robotics.},
    }

  • P. Lottes, J. Behley, A. Milioto, and C. Stachniss, “Fully Convolutional Networks with Sequential Information for Robust Crop and Weed Detection in Precision Farming,” IEEE Robotics and Automation Letters (RA-L), vol. 3, pp. 3097-3104, 2018. doi:10.1109/LRA.2018.2846289
    [BibTeX] [PDF] [Video]
    @Article{lottes2018ral,
    author = {P. Lottes and J. Behley and A. Milioto and C. Stachniss},
    title = {Fully Convolutional Networks with Sequential Information for Robust Crop and Weed Detection in Precision Farming},
    journal = ral,
    year = {2018},
    volume = {3},
    issue = {4},
    pages = {3097-3104},
    doi = {10.1109/LRA.2018.2846289},
    url = {https://www.ipb.uni-bonn.de/pdfs/lottes2018ral.pdf},
    videourl = {https://www.youtube.com/watch?v=vTepw9HRLh8},
    }

  • P. Lottes, J. Behley, N. Chebrolu, A. Milioto, and C. Stachniss, “Joint Stem Detection and Crop-Weed Classification for Plant-specific Treatment in Precision Farming,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2018.
    [BibTeX] [PDF] [Video]

    Applying agrochemicals is the default procedure for conventional weed control in crop production, but has negative impacts on the environment. Robots have the potential to treat every plant in the field individually and thus can reduce the required use of such chemicals. To achieve that, robots need the ability to identify crops and weeds in the field and must additionally select effective treatments. While certain types of weed can be treated mechanically, other types need to be treated by (selective) spraying. In this paper, we present an approach that provides the necessary information for effective plant-specific treatment. It outputs the stem location for weeds, which allows for mechanical treatments, and the covered area of the weed for selective spraying. Our approach uses an end-to-end trainable fully convolutional network that simultaneously estimates stem positions as well as the covered area of crops and weeds. It jointly learns the class-wise stem detection and the pixel-wise semantic segmentation. Experimental evaluations on different real-world datasets show that our approach is able to reliably solve this problem. Compared to state-of-the-art approaches, our approach not only substantially improves the stem detection accuracy, i.e., distinguishing crop and weed stems, but also provides an improvement in the semantic segmentation performance.

    @InProceedings{lottes2018iros,
    author = {P. Lottes and J. Behley and N. Chebrolu and A. Milioto and C. Stachniss},
    title = {Joint Stem Detection and Crop-Weed Classification for Plant-specific Treatment in Precision Farming},
    booktitle = iros,
    year = 2018,
    url = {https://www.ipb.uni-bonn.de/pdfs/lottes18iros.pdf},
    videourl = {https://www.youtube.com/watch?v=C9mjZxE_Sxg},
    abstract = {Applying agrochemicals is the default procedure for conventional weed control in crop production, but has negative impacts on the environment. Robots have the potential to treat every plant in the field individually and thus can reduce the required use of such chemicals. To achieve that, robots need the ability to identify crops and weeds in the field and must additionally select effective treatments. While certain types of weed can be treated mechanically, other types need to be treated by (selective) spraying. In this paper, we present an approach that provides the necessary information for effective plant-specific treatment. It outputs the stem location for weeds, which allows for mechanical treatments, and the covered area of the weed for selective spraying. Our approach uses an end-to- end trainable fully convolutional network that simultaneously estimates stem positions as well as the covered area of crops and weeds. It jointly learns the class-wise stem detection and the pixel-wise semantic segmentation. Experimental evaluations on different real-world datasets show that our approach is able to reliably solve this problem. Compared to state-of-the-art approaches, our approach not only substantially improves the stem detection accuracy, i.e., distinguishing crop and weed stems, but also provides an improvement in the semantic segmentation performance.}
    }

  • A. Milioto, P. Lottes, and C. Stachniss, “Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2018.
    [BibTeX] [PDF] [Video]

    Precision farming robots, which target to reduce the amount of herbicides that need to be brought out in the fields, must have the ability to identify crops and weeds in real time to trigger weeding actions. In this paper, we address the problem of CNN-based semantic segmentation of crop fields separating sugar beet plants, weeds, and background solely based on RGB data. We propose a CNN that exploits existing vegetation indexes and provides a classification in real time. Furthermore, it can be effectively re-trained to so far unseen fields with a comparably small amount of training data. We implemented and thoroughly evaluated our system on a real agricultural robot operating in different fields in Germany and Switzerland. The results show that our system generalizes well, can operate at around 20Hz, and is suitable for online operation in the fields.

    @InProceedings{milioto2018icra,
    author = {A. Milioto and P. Lottes and C. Stachniss},
    title = {Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs},
    year = {2018},
    booktitle = icra,
    abstract = {Precision farming robots, which target to reduce the amount of herbicides that need to be brought out in the fields, must have the ability to identify crops and weeds in real time to trigger weeding actions. In this paper, we address the problem of CNN-based semantic segmentation of crop fields separating sugar beet plants, weeds, and background solely based on RGB data. We propose a CNN that exploits existing vegetation indexes and provides a classification in real time. Furthermore, it can be effectively re-trained to so far unseen fields with a comparably small amount of training data. We implemented and thoroughly evaluated our system on a real agricultural robot operating in different fields in Germany and Switzerland. The results show that our system generalizes well, can operate at around 20Hz, and is suitable for online operation in the fields.},
    url = {https://arxiv.org/abs/1709.06764},
    videourl = {https://youtu.be/DXcTkJmdWFQ},
    }

  • A. Walter, R. Khanna, P. Lottes, C. Stachniss, R. Siegwart, J. Nieto, and F. Liebisch, “Flourish – A robotic approach for automation in crop management,” in Proc. of the Intl. Conf. on Precision Agriculture (ICPA), 2018.
    [BibTeX] [PDF]

    The Flourish project aims to bridge the gap between current and desired capabilities of agricultural robots by developing an adaptable robotic solution for precision farming. Combining the aerial survey capabilities of a small autonomous multi-copter Unmanned Aerial Vehicle (UAV) with a multi-purpose agricultural Unmanned Ground Vehicle (UGV), the system will be able to survey a field from the air, perform targeted intervention on the ground, and provide detailed information for decision support, all with minimal user intervention. The system can be adapted to a wide range of farm management activities and to different crops by choosing different sensors, status indicators and ground treatment packages. The research project thereby touches a selection of topics addressed by ICPA such as sensor application in managing in-season crop variability, precision nutrient management and crop protection as well as remote sensing applications in precision agriculture and engineering technologies and advances. This contribution will introduce the Flourish consortium and concept using the results of three years of active development, testing, and measuring in field campaigns. Two key parts of the project will be shown in more detail: First, mapping of the field by drones for detection of sugar beet nitrogen status variation and weed pressure in the field and second the perception of the UGV as related to weed classification and subsequent precision weed management. The field mapping by means of an UAV will be shown for crop nitrogen status estimation and weed pressure with examples for subsequent crop management decision support. For nitrogen status, the results indicate that drones are up to the task to deliver crop nitrogen variability maps utilized for variable rate application that are of comparable quality to current on-tractor systems. The weed pressure mapping is viable as basis for the UGV showcase of precision weed management. For this, we show the automated image acquisition by the UGV and a subsequent plant classification with a four-step pipeline, differentiating crop from weed in real time. Advantages and disadvantages as well as future prospects of such approaches will be discussed.

    @InProceedings{walter2018icpa,
    Title = {Flourish - A robotic approach for automation in crop management},
    Author = {A. Walter and R. Khanna and P. Lottes and C. Stachniss and R. Siegwart and J. Nieto and F. Liebisch},
    Booktitle = icpa,
    Year = 2018,
    abstract = {The Flourish project aims to bridge the gap between current and desired capabilities of agricultural robots by developing an adaptable robotic solution for precision farming. Combining the aerial survey capabilities of a small autonomous multi-copter Unmanned Aerial Vehicle (UAV) with a multi-purpose agricultural Unmanned Ground Vehicle (UGV), the system will be able to survey a field from the air, perform targeted intervention on the ground, and provide detailed information for decision support, all with minimal user intervention. The system can be adapted to a wide range of farm management activities and to different crops by choosing different sensors, status indicators and ground treatment packages. The research project thereby touches a selection of topics addressed by ICPA such as sensor application in managing in-season crop variability, precision nutrient management and crop protection as well as remote sensing applications in precision agriculture and engineering technologies and advances. This contribution will introduce the Flourish consortium and concept using the results of three years of active development, testing, and measuring in field campaigns. Two key parts of the project will be shown in more detail: First, mapping of the field by drones for detection of sugar beet nitrogen status variation and weed pressure in the field and second the perception of the UGV as related to weed classification and subsequent precision weed management. The field mapping by means of an UAV will be shown for crop nitrogen status estimation and weed pressure with examples for subsequent crop management decision support. For nitrogen status, the results indicate that drones are up to the task to deliver crop nitrogen variability maps utilized for variable rate application that are of comparable quality to current on-tractor systems. The weed pressure mapping is viable as basis for the UGV showcase of precision weed management. For this, we show the automated image acquisition by the UGV and a subsequent plant classification with a four-step pipeline, differentiating crop from weed in real time. Advantages and disadvantages as well as future prospects of such approaches will be discussed.},
    }

2017

  • F. Liebisch, M. Popovic, J. Pfeifer, R. Khanna, P. Lottes, C. Stachniss, A. Pretto, I. Sa, J. Nieto, R. Siegwart, and A. Walter, “Automatic UAV-based field inspection campaigns for weeding in row crops,” in Proc. of the 10th EARSeL SIG Imaging Spectroscopy Workshop, 2017.
    [BibTeX]
    @InProceedings{liebisch2017earsel,
    title = {Automatic UAV-based field inspection campaigns for weeding in row crops},
    author = {F. Liebisch and M. Popovic and J. Pfeifer and R. Khanna and P. Lottes and C. Stachniss and A. Pretto and S. In Kyu and J. Nieto and R. Siegwart and A. Walter},
    booktitle = {Proc. of the 10th EARSeL SIG Imaging Spectroscopy Workshop},
    year = {2017},
    }

  • P. Lottes, M. Höferlin, S. Sander, and C. Stachniss, “Effective Vision-based Classification for Separating Sugar Beets and Weeds for Precision Farming,” Journal of Field Robotics, vol. 34, pp. 1160-1178, 2017. doi:10.1002/rob.21675
    [BibTeX] [PDF]
    @Article{lottes2017jfr,
    title = {Effective Vision-based Classification for Separating Sugar Beets and Weeds for Precision Farming},
    author = {Lottes, Philipp and H\"oferlin, Markus and Sander, Slawomir and Stachniss, Cyrill},
    journal = {Journal of Field Robotics},
    year = {2017},
    volume = {34},
    issue = {6},
    pages = {1160-1178},
    doi = {10.1002/rob.21675},
    issn = {1556-4967},
    timestamp = {2016.10.5},
    url = {https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes16jfr.pdf},
    }

  • N. Chebrolu, P. Lottes, A. Schaefer, W. Winterhalter, W. Burgard, and C. Stachniss, “Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields,” The Intl. Journal of Robotics Research, 2017. doi:10.1177/0278364917720510
    [BibTeX] [PDF]
    @Article{chebrolu2017ijrr,
    title = {Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields},
    author = {N. Chebrolu and P. Lottes and A. Schaefer and W. Winterhalter and W. Burgard and C. Stachniss},
    journal = ijrr,
    year = {2017},
    doi = {10.1177/0278364917720510},
    url = {https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/chebrolu2017ijrr.pdf},
    }

  • P. Lottes, R. Khanna, J. Pfeifer, R. Siegwart, and C. Stachniss, “UAV-Based Crop and Weed Classification for Smart Farming,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2017.
    [BibTeX] [PDF]
    @InProceedings{lottes2017icra,
    title = {UAV-Based Crop and Weed Classification for Smart Farming},
    author = {P. Lottes and R. Khanna and J. Pfeifer and R. Siegwart and C. Stachniss},
    booktitle = icra,
    year = {2017},
    url = {https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes17icra.pdf},
    }

  • P. Lottes and C. Stachniss, “Semi-Supervised Online Visual Crop and Weed Classification in Precision Farming Exploiting Plant Arrangement,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2017.
    [BibTeX] [PDF]
    @InProceedings{lottes2017iros,
    title = {Semi-Supervised Online Visual Crop and Weed Classification in Precision Farming Exploiting Plant Arrangement},
    author = {P. Lottes and C. Stachniss},
    booktitle = iros,
    year = {2017},
    url = {https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes17iros.pdf},
    }

  • A. Milioto, P. Lottes, and C. Stachniss, “Real-time Blob-wise Sugar Beets vs Weeds Classification for Monitoring Fields using Convolutional Neural Networks,” in Proc. of the ISPRS Conf. on Unmanned Aerial Vehicles in Geomatics (UAV-g), 2017.
    [BibTeX] [PDF]

    UAVs are becoming an important tool for field monitoring and precision farming. A prerequisite for observing and analyzing fields is the ability to identify crops and weeds from image data. In this paper, we address the problem of detecting the sugar beet plants and weeds in the field based solely on image data. We propose a system that combines vegetation detection and deep learning to obtain a high-quality classification of the vegetation in the field into value crops and weeds. We implemented and thoroughly evaluated our system on image data collected from different sugar beet fields and illustrate that our approach allows for accurately identifying the weeds on the field.

    @InProceedings{milioto2017uavg,
    title = {Real-time Blob-wise Sugar Beets vs Weeds Classification for Monitoring Fields using Convolutional Neural Networks},
    author = {A. Milioto and P. Lottes and C. Stachniss},
    booktitle = uavg,
    year = {2017},
    abstract = {UAVs are becoming an important tool for field monitoring and precision farming. A prerequisite for observing and analyzing fields is the ability to identify crops and weeds from image data. In this paper, we address the problem of detecting the sugar beet plants and weeds in the field based solely on image data. We propose a system that combines vegetation detection and deep learning to obtain a high-quality classification of the vegetation in the field into value crops and weeds. We implemented and thoroughly evaluated our system on image data collected from different sugar beet fields and illustrate that our approach allows for accurately identifying the weeds on the field.},
    url = {https://www.ipb.uni-bonn.de/pdfs/milioto17uavg.pdf},
    }

2016

  • F. Liebisch, J. Pfeifer, R. Khanna, P. Lottes, C. Stachniss, T. Falck, S. Sander, R. Siegwart, A. Walter, and E. Galceran, “Flourish – A robotic approach for automation in crop management,” in Proc. of the Workshop für Computer-Bildanalyse und unbemannte autonom fliegende Systeme in der Landwirtschaft, 2016.
    [BibTeX] [PDF]
    @InProceedings{liebisch16wslw,
    title = {Flourish -- A robotic approach for automation in crop management},
    author = {F. Liebisch and J. Pfeifer and R. Khanna and P. Lottes and C. Stachniss and T. Falck and S. Sander and R. Siegwart and A. Walter and E. Galceran},
    booktitle = {Proc. of the Workshop f\"ur Computer-Bildanalyse und unbemannte autonom fliegende Systeme in der Landwirtschaft},
    year = {2016},
    timestamp = {2016.06.15},
    url = {https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/liebisch16cbaws.pdf},
    }

  • P. Lottes, M. Höferlin, S. Sander, M. Müter, P. Schulze-Lammers, and C. Stachniss, “An Effective Classification System for Separating Sugar Beets and Weeds for Precision Farming Applications,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2016.
    [BibTeX] [PDF]
    @InProceedings{lottes2016icra,
    title = {An Effective Classification System for Separating Sugar Beets and Weeds for Precision Farming Applications},
    author = {P. Lottes and M. H\"oferlin and S. Sander and M. M\"uter and P. Schulze-Lammers and C. Stachniss},
    booktitle = icra,
    year = {2016},
    timestamp = {2016.01.15},
    url = {https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes16icra.pdf},
    }

Other Articles

2017

  • P. Lottes, R. Khanna, J. Pfeifer, R. Siegwart, C. Stachniss, “UAV-Based Crop and Weed Classification for Future Farming”, RoboHub, 06/2017
  • H. Blum, R. Pude, P. Lottes, S. Brell, “Mechanische Unkrautregulierung im Arznei- und Gewürzpflanzenanbau am Campus Klein-Altendorf”, in Zeitschrift für Arznei- und Gewürzpflanzen (Journal of Medicinal and Spice Plants, ZAG), 01/2017

2016

  • P. Lottes, “Roboter auf dem Feld”, in Gartenbau Profi, Monatszeitschrift für Obst, Gemüse und Zierpflanzen, 11/2016

2014

  • L. Klingbeil, P. Lottes, and H. Kuhlmann, “Laserscanning-Technologie auf sich bewegenden Plattformen”, in Schriftenreihe des DVW Terrestrisches Laserscanning, 2014

Project Reports

2015

  • P. Lottes, “Residuenanalyse zur Detektion von Deformationen am Radioteleskop in Effelsberg”, 2015.
    [PDF]
  • P. Lottes, “Systemkalibrierung eines Profillaserscanners in einem Mobilen-Mapping-System in der Praxis”, 2015.
    [PDF]

2014

  • P. Lottes, “Systemkalibrierung profilmessender Laserscanner in Mobilen-Mapping-Systemen innerhalb einer Testfeldumgebung”, 2014.
    [PDF]