Philipp Lottes

PhD Student
Contact:
Email: philipp.lottes@igg.uni-bonn.de
Tel: +49 - 228 - 73 - 29 03
Fax: +49 - 228 - 73 - 27 12
Office: Nussallee 15, 1st floor, room 1.007
Address:
University of Bonn
Photogrammetry, IGG
Nussallee 15
53115 Bonn
Profiles: Google Scholar | LinkedIn | ResearchGate

Short CV

Philipp Lottes has been a PhD student at the Photogrammetry Lab of the University of Bonn since November 2015. He received his master's degree from the Institute of Geodesy and Geoinformation in 2015. During his master's studies, he worked as a student assistant for the Institute of Geodesy and Geoinformation as well as for the Photogrammetry Lab. Before moving to Bonn, he completed his bachelor's degree in Surveying Engineering at the Bochum University of Applied Sciences in 2012 and subsequently worked as a surveying engineer for Marx Ingenieurgesellschaft mbH for 1.5 years.
He is currently working as a research assistant on the EU-funded project FLOURISH. In his research, he focuses on machine learning and probabilistic techniques for developing plant classification systems for agricultural ground robots as well as unmanned aerial robots. He is further interested in unsupervised learning, transfer learning, and deep learning.

Research Interests

  • Machine Learning
  • Visual and Laser Perception for Robotics
  • Computer Vision

Projects

  • FLOURISH – Developing an adaptable robotic solution for precision farming applications. By combining the aerial survey capabilities of a small autonomous multi-copter Unmanned Aerial Vehicle (UAV) with a multi-purpose agricultural Unmanned Ground Vehicle (UGV), the system will be able to survey a field from the air, perform targeted intervention on the ground, and provide detailed information for decision support, all with minimal user intervention.
    Within this project, I am developing a classification system for both the UGV and the UAV that enables them to identify crops and weeds in the field; a minimal sketch of such a pipeline is shown below.
    Demo video: online sugar beet vs. weed classification in the field.
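
    To give an idea of how such a crop/weed classification pipeline can be structured, the Python sketch below first separates vegetation from soil using the Excess Green (ExG) vegetation index, extracts connected plant blobs, and hands each blob to a classifier. This is a minimal illustrative sketch, not the actual FLOURISH implementation; in particular, classify_blob is a hypothetical placeholder for a trained crop-vs-weed model.

    # Minimal illustrative sketch of a vegetation-based crop/weed classification
    # pipeline (not the actual FLOURISH system): ExG-based vegetation masking,
    # blob extraction, and per-blob classification.
    import cv2
    import numpy as np

    def vegetation_mask(bgr: np.ndarray) -> np.ndarray:
        """Binary vegetation mask from the Excess Green index ExG = 2g - r - b."""
        img = bgr.astype(np.float32) / 255.0
        b, g, r = cv2.split(img)
        exg = 2.0 * g - r - b
        exg_u8 = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        # Otsu thresholding separates soil/background from vegetation.
        _, mask = cv2.threshold(exg_u8, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        return mask

    def classify_blob(patch: np.ndarray) -> str:
        """Placeholder: plug in a trained crop/weed classifier (e.g., a CNN) here."""
        return "crop" if patch.size > 0 else "unknown"

    def classify_field_image(bgr: np.ndarray, min_area: int = 50):
        """Return (bounding box, label) for every vegetation blob in the image."""
        mask = vegetation_mask(bgr)
        n, labels, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
        results = []
        for i in range(1, n):  # label 0 is the background
            x, y, w, h, area = stats[i]
            if area < min_area:
                continue  # ignore tiny specks / segmentation noise
            results.append(((x, y, w, h), classify_blob(bgr[y:y + h, x:x + w])))
        return results

    On the real robots, the placeholder classifier would be replaced by the learned models described in the publications below, and the geometric plant arrangement can additionally be exploited as a feature.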

Teaching

  • Exercises for 3D Coordinate Systems, winter semester 2014/2015
  • Exercises for 3D Coordinate Systems, winter semester 2015/2016
  • Solving online perception problems in ROS, winter semester 2015/2016

Publications

2017

  • F. Liebisch, M. Popovic, J. Pfeifer, R. Khanna, P. Lottes, C. Stachniss, A. Pretto, I. Sa, J. Nieto, R. Siegwart, and A. Walter, “Automatic UAV-based field inspection campaigns for weeding in row crops,” in Proceedings of the 10th EARSeL SIG Imaging Spectroscopy Workshop, 2017.
    [BibTeX]
    @InProceedings{liebisch17earsel,
    Title = {Automatic UAV-based field inspection campaigns for weeding in row crops},
    Author = {F. Liebisch and M. Popovic and J. Pfeifer and R. Khanna and P. Lottes and C. Stachniss and A. Pretto and I. Sa and J. Nieto and R. Siegwart and A. Walter},
    Booktitle = {Proceedings of the 10th EARSeL SIG Imaging Spectroscopy Workshop},
    Year = {2017}
    }

  • N. Chebrolu, P. Lottes, A. Schaefer, W. Winterhalter, W. Burgard, and C. Stachniss, “Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields,” The International Journal of Robotics Research, 2017. doi:10.1177/0278364917720510
    [BibTeX] [PDF]
    @Article{chebrolu2017ijrr,
    Title = {Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields},
    Author = {N. Chebrolu and P. Lottes and A. Schaefer and W. Winterhalter and W. Burgard and C. Stachniss},
    Journal = ijrr,
    Year = {2017},
    Doi = {10.1177/0278364917720510},
    Url = {http://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/chebrolu2017ijrr.pdf}
    }

  • P. Lottes, R. Khanna, J. Pfeifer, R. Siegwart, and C. Stachniss, “UAV-Based Crop and Weed Classification for Smart Farming,” in Proceedings of the IEEE Int. Conf. on Robotics & Automation (ICRA), 2017.
    [BibTeX] [PDF]
    @InProceedings{lottes17icra,
    Title = {UAV-Based Crop and Weed Classification for Smart Farming},
    Author = {P. Lottes and R. Khanna and J. Pfeifer and R. Siegwart and C. Stachniss},
    Booktitle = ICRA,
    Year = {2017},
    Url = {http://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes17icra.pdf}
    }

  • P. Lottes and C. Stachniss, “Semi-Supervised Online Visual Crop and Weed Classification in Precision Farming Exploiting Plant Arrangement,” in Proceedings of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2017.
    [BibTeX] [PDF]
    @InProceedings{lottes17iros,
    Title = {Semi-Supervised Online Visual Crop and Weed Classification in Precision Farming Exploiting Plant Arrangement},
    Author = {P. Lottes and C. Stachniss},
    Booktitle = IROS,
    Year = {2017},
    Url = {http://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes17iros.pdf}
    }

  • A. Milioto, P. Lottes, and C. Stachniss, “Real-time Blob-wise Sugar Beets vs Weeds Classification for Monitoring Fields using Convolutional Neural Networks,” in Proceedings of the ISPRS Conference on Unmanned Aerial Vehicles in Geomatics (UAV-g), 2017.
    [BibTeX] [PDF]
    UAVs are becoming an important tool for field monitoring and precision farming. A prerequisite for observing and analyzing fields is the ability to identify crops and weeds from image data. In this paper, we address the problem of detecting the sugar beet plants and weeds in the field based solely on image data. We propose a system that combines vegetation detection and deep learning to obtain a high-quality classification of the vegetation in the field into value crops and weeds. We implemented and thoroughly evaluated our system on image data collected from different sugar beet fields and illustrate that our approach allows for accurately identifying the weeds on the field.

    @InProceedings{milioto17uavg,
    Title = {Real-time Blob-wise Sugar Beets vs Weeds Classification for Monitoring Fields using Convolutional Neural Networks},
    Author = {A. Milioto and P. Lottes and C. Stachniss},
    Booktitle = uavg,
    Year = {2017},
    Abstract = {UAVs are becoming an important tool for field monitoring and precision farming. A prerequisite for observing and analyzing fields is the ability to identify crops and weeds from image data. In this paper, we address the problem of detecting the sugar beet plants and weeds in the field based solely on image data. We propose a system that combines vegetation detection and deep learning to obtain a high-quality classification of the vegetation in the field into value crops and weeds. We implemented and thoroughly evaluated our system on image data collected from different sugar beet fields and illustrate that our approach allows for accurately identifying the weeds on the field.},
    url = {http://www.ipb.uni-bonn.de/pdfs/milioto17uavg.pdf}
    }

  • A. Milioto, P. Lottes, and C. Stachniss, “Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs,” arXiv preprint:1709.06764, 2017.
    [BibTeX] [PDF]
    Precision farming robots, which target to reduce the amount of herbicides that need to be brought out in the fields, must have the ability to identify crops and weeds in real time to trigger weeding actions. In this paper, we address the problem of CNN-based semantic segmentation of crop fields separating sugar beet plants, weeds, and background solely based on RGB data. We propose a CNN that exploits existing vegetation indexes and provides a classification in real time. Furthermore, it can be effectively re-trained to so far unseen fields with a comparably small amount of training data. We implemented and thoroughly evaluated our system on a real agricultural robot operating in different fields in Germany and Switzerland. The results show that our system generalizes well, can operate at around 20Hz, and is suitable for online operation in the fields.

    @Article{milioto17arxiv,
    Author = {A. Milioto and P. Lottes and C. Stachniss},
    Title = {Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs},
    Year = {2017},
    Journal = {arXiv preprint:1709.06764},
    Abstract = {Precision farming robots, which target to reduce the amount of herbicides that need to be brought out in the fields, must have the ability to identify crops and weeds in real time to trigger weeding actions. In this paper, we address the problem of CNN-based semantic segmentation of crop fields separating sugar beet plants, weeds, and background solely based on RGB data. We propose a CNN that exploits existing vegetation indexes and provides a classification in real time. Furthermore, it can be effectively re-trained to so far unseen fields with a comparably small amount of training data. We implemented and thoroughly evaluated our system on a real agricultural robot operating in different fields in Germany and Switzerland. The results show that our system generalizes well, can operate at around 20Hz, and is suitable for online operation in the fields.},
    Url = {https://arxiv.org/abs/1709.06764}
    }

2016

  • F. Liebisch, J. Pfeifer, R. Khanna, P. Lottes, C. Stachniss, T. Falck, S. Sander, R. Siegwart, A. Walter, and E. Galceran, “Flourish — A robotic approach for automation in crop management,” in Proceedings of the Workshop für Computer-Bildanalyse und unbemannte autonom fliegende Systeme in der Landwirtschaft, 2016.
    [BibTeX] [PDF]
    @InProceedings{liebisch16wslw,
    Title = {Flourish -- A robotic approach for automation in crop management},
    Author = {F. Liebisch and J. Pfeifer and R. Khanna and P. Lottes and C. Stachniss and T. Falck and S. Sander and R. Siegwart and A. Walter and E. Galceran},
    Booktitle = {Proceedings of the Workshop f\"ur Computer-Bildanalyse und unbemannte autonom fliegende Systeme in der Landwirtschaft},
    Year = {2016},
    Timestamp = {2016.06.15},
    Url = {http://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/liebisch16cbaws.pdf}
    }

  • P. Lottes, M. Höferlin, S. Sander, M. Müter, P. Schulze-Lammers, and C. Stachniss, “An Effective Classification System for Separating Sugar Beets and Weeds for Precision Farming Applications,” in Proceedings of the IEEE Int. Conf. on Robotics & Automation (ICRA), 2016.
    [BibTeX] [PDF]
    @InProceedings{lottes16icra,
    Title = {An Effective Classification System for Separating Sugar Beets and Weeds for Precision Farming Applications},
    Author = {P. Lottes and M. H\"oferlin and S. Sander and M. M\"uter and P. Schulze-Lammers and C. Stachniss},
    Booktitle = ICRA,
    Year = {2016},
    Timestamp = {2016.01.15},
    Url = {http://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes16icra.pdf}
    }

  • P. Lottes, M. Höferlin, S. Sander, and C. Stachniss, “Effective Vision-based Classification for Separating Sugar Beets and Weeds for Precision Farming,” Journal of Field Robotics, 2016. doi:10.1002/rob.21675
    [BibTeX] [PDF]
    @Article{lottes16jfr,
    Title = {Effective Vision-based Classification for Separating Sugar Beets and Weeds for Precision Farming},
    Author = {Lottes, Philipp and H\"oferlin, Markus and Sander, Slawomir and Stachniss, Cyrill},
    Journal = {Journal of Field Robotics},
    Year = {2016},
    Doi = {10.1002/rob.21675},
    ISSN = {1556-4967},
    Timestamp = {2016.10.5},
    Url = {http://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes16jfr.pdf}
    }

Awards

  • Winner of the IEEE Robotics and Automation Society Best Paper Award in Automation at ICRA 2017 for “UAV-Based Crop and Weed Classification for Smart Farming”

Other Articles

2017

  • P. Lottes, R. Khanna, J. Pfeifer, R. Siegwart, C. Stachniss, “UAV-Based Crop and Weed Classification for Future Farming”, RoboHub, 06/2017
  • H. Blum, R. Pude, P. Lottes, S. Brell, “Mechanische Unkrautregulierung im Arznei- und Gewürzpflanzenanbau am Campus Klein-Altendorf”, in Zeitschrift für Arznei- und Gewürzpflanzen (Journal of Medicinal and Spice Plants, ZAG), 01/2017

2016

  • P. Lottes, “Roboter auf dem Feld”, in Gartenbau Profi, Monatszeitschrift für Obst, Gemüse und Zierpflanzen, 11/2016

2014

  • L. Klingbeil, P. Lottes, and H. Kuhlmann, “Laserscanning-Technologie auf sich bewegenden Plattformen”, in Schriftenreihe des DVW: Terrestrisches Laserscanning, 2014

Project Reports

2015

  • P. Lottes, “Residuenanalyse zur Detektion von Deformationen am Radioteleskop in Effelsberg”, 2015.
    [PDF]
  • P. Lottes, “Systemkalibrierung eines Profillaserscanners in einem Mobilen-Mapping-System in der Praxis”, 2015.
    [PDF]

2014

  • P. Lottes, “Systemkalibrierung profilmessender Laserscanner in Mobilen-Mapping-Systemen innerhalb einer Testfeldumgebung”, 2014.
    [PDF]