Philipp Lottes

PhD Student
Contact:
Email: philipp.lottes@igg.uni-bonn.de
Tel: +49 – 228 – 73 – 29 03
Fax: +49 – 228 – 73 – 27 12
Office: Nussallee 15, 1st floor, room 1.007
Address:
University of Bonn
Photogrammetry, IGG
Nussallee 15
53115 Bonn
Profiles: Google Scholar | LinkedIn | ResearchGate

Short CV

Philipp Lottes has been a PhD student at the Photogrammetry Lab of the University of Bonn since November 2015. He received his master's degree from the Institute of Geodesy and Geoinformation in 2015. During his master's studies, he worked as an assistant for the Institute of Geodesy and Geoinformation as well as for the Photogrammetry Lab. Before moving to Bonn, he completed his bachelor's degree in Surveying Engineering at the Bochum University of Applied Sciences in 2012 and subsequently worked as a surveying engineer for Marx Ingenieurgesellschaft mbH for 1.5 years.
He currently works as a research assistant on the EU-funded project FLOURISH. His research focuses on machine learning and probabilistic techniques for developing plant classification systems for agricultural ground robots as well as unmanned aerial robots. He is further interested in unsupervised learning, transfer learning, and deep learning.

Research Interests

  • Machine Learning
  • Computer Vision
  • Visual and Laser Perception for Robotics

Projects

  • FLOURISH – Developing an adaptable robotic solution for precision farming applications. By combining the aerial survey capabilities of a small autonomous multi-copter Unmanned Aerial Vehicle (UAV) with a multi-purpose agricultural Unmanned Ground Vehicle (UGV), the system will be able to survey a field from the air, perform targeted intervention on the ground, and provide detailed information for decision support, all with minimal user intervention.
    Within this project, I am currently developing a classification system for both the UGV and the UAV that enables them to identify crops and weeds in the field.
    Demo video of the online sugar beet vs. weed classification in the field.
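Such a crop/weed classification pipeline typically begins by separating vegetation from soil. The sketch below illustrates this first step using the standard NDVI computed from near-infrared (NIR) and red channels; the threshold and toy data are illustrative assumptions, not values from the FLOURISH system.

```python
# Minimal sketch of vegetation masking via NDVI (illustrative only).
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized Difference Vegetation Index, in [-1, 1] per pixel."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / np.maximum(nir + red, 1e-6)

def vegetation_mask(nir: np.ndarray, red: np.ndarray,
                    threshold: float = 0.4) -> np.ndarray:
    """Binary mask: True where a pixel is likely vegetation.
    The threshold 0.4 is an assumed, illustrative value."""
    return ndvi(nir, red) > threshold

# Toy 2x2 example: high NIR + low red reflectance indicates vegetation.
nir = np.array([[200, 50], [180, 40]], dtype=np.uint8)
red = np.array([[40, 60], [50, 45]], dtype=np.uint8)
print(vegetation_mask(nir, red))  # True for the two vegetated pixels
```

The mask restricts all downstream classification to vegetated pixels, which is what makes the subsequent crop-vs-weed decision tractable at field scale.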

Teaching

  • Exercises for 3D-Coordinate systems, WS 2014/2015
  • Exercises for 3D-Coordinate systems, WS 2015/2016
  • Solving online perception problems in ROS, WS 2015/2016
  • Exercises for 3D-Coordinate systems, WS 2016/2017
  • Exercises for 3D-Coordinate systems, WS 2017/2018
  • Master-Project: Automated Field Analysis for Crop Farming, SS 2018

Publications

2018

  • I. Sa, M. Popovic, R. Khanna, Z. Chen, P. Lottes, F. Liebisch, J. Nieto, C. Stachniss, and R. Siegwart, “WeedMap: A Large-Scale Semantic Weed Mapping Framework Using Aerial Multispectral Imaging and Deep Neural Network for Precision Farming,” Remote Sensing, vol. 10, 2018. doi:10.3390/rs10091423
    [BibTeX] [PDF]

    The ability to automatically monitor agricultural fields is an important capability in precision farming, enabling steps towards more sustainable agriculture. Precise, high-resolution monitoring is a key prerequisite for targeted intervention and the selective application of agro-chemicals. The main goal of this paper is developing a novel crop/weed segmentation and mapping framework that processes multispectral images obtained from an unmanned aerial vehicle (UAV) using a deep neural network (DNN). Most studies on crop/weed semantic segmentation only consider single images for processing and classification. Images taken by UAVs often cover only a few hundred square meters with either color only or color and near-infrared (NIR) channels. Although a map can be generated by processing single segmented images incrementally, this requires additional complex information fusion techniques which struggle to handle high fidelity maps due to their computational costs and problems in ensuring global consistency. Moreover, computing a single large and accurate vegetation map (e.g., crop/weed) using a DNN is non-trivial due to difficulties arising from: (1) limited ground sample distances (GSDs) in high-altitude datasets, (2) sacrificed resolution resulting from downsampling high-fidelity images, and (3) multispectral image alignment. To address these issues, we adopt a stand sliding window approach that operates on only small portions of multispectral orthomosaic maps (tiles), which are channel-wise aligned and calibrated radiometrically across the entire map. We define the tile size to be the same as that of the DNN input to avoid resolution loss. Compared to our baseline model (i.e., SegNet with 3 channel RGB inputs) yielding an area under the curve (AUC) of [background=0.607, crop=0.681, weed=0.576], our proposed model with 9 input channels achieves [0.839, 0.863, 0.782]. Additionally, we provide an extensive analysis of 20 trained models, both qualitatively and quantitatively, in order to evaluate the effects of varying input channels and tunable network hyperparameters. Furthermore, we release a large sugar beet/weed aerial dataset with expertly guided annotations for further research in the fields of remote sensing, precision agriculture, and agricultural robotics.

    @Article{sa2018rs,
    author = {I. Sa and M. Popovic and R. Khanna and Z. Chen and P. Lottes and F. Liebisch and J. Nieto and C. Stachniss and R. Siegwart},
    title = {{WeedMap: A Large-Scale Semantic Weed Mapping Framework Using Aerial Multispectral Imaging and Deep Neural Network for Precision Farming}},
    journal = {Remote Sensing},
    year = 2018,
    volume = 10,
    issue = 9,
    url = {http://www.mdpi.com/2072-4292/10/9/1423/pdf},
    doi = {10.3390/rs10091423},
    abstract = {The ability to automatically monitor agricultural fields is an important capability in precision farming, enabling steps towards more sustainable agriculture. Precise, high-resolution monitoring is a key prerequisite for targeted intervention and the selective application of agro-chemicals. The main goal of this paper is developing a novel crop/weed segmentation and mapping framework that processes multispectral images obtained from an unmanned aerial vehicle (UAV) using a deep neural network (DNN). Most studies on crop/weed semantic segmentation only consider single images for processing and classification. Images taken by UAVs often cover only a few hundred square meters with either color only or color and near-infrared (NIR) channels. Although a map can be generated by processing single segmented images incrementally, this requires additional complex information fusion techniques which struggle to handle high fidelity maps due to their computational costs and problems in ensuring global consistency. Moreover, computing a single large and accurate vegetation map (e.g., crop/weed) using a DNN is non-trivial due to difficulties arising from: (1) limited ground sample distances (GSDs) in high-altitude datasets, (2) sacrificed resolution resulting from downsampling high-fidelity images, and (3) multispectral image alignment. To address these issues, we adopt a stand sliding window approach that operates on only small portions of multispectral orthomosaic maps (tiles), which are channel-wise aligned and calibrated radiometrically across the entire map. We define the tile size to be the same as that of the DNN input to avoid resolution loss. Compared to our baseline model (i.e., SegNet with 3 channel RGB inputs) yielding an area under the curve (AUC) of [background=0.607, crop=0.681, weed=0.576], our proposed model with 9 input channels achieves [0.839, 0.863, 0.782]. 
Additionally, we provide an extensive analysis of 20 trained models, both qualitatively and quantitatively, in order to evaluate the effects of varying input channels and tunable network hyperparameters. Furthermore, we release a large sugar beet/weed aerial dataset with expertly guided annotations for further research in the fields of remote sensing, precision agriculture, and agricultural robotics.},
    }

  • P. Lottes, J. Behley, A. Milioto, and C. Stachniss, “Fully Convolutional Networks with Sequential Information for Robust Crop and Weed Detection in Precision Farming,” IEEE Robotics and Automation Letters (RA-L), vol. 3, pp. 3097-3104, 2018. doi:10.1109/LRA.2018.2846289
    [BibTeX] [PDF] [Video]
    @Article{lottes2018ral,
    author = {P. Lottes and J. Behley and A. Milioto and C. Stachniss},
    title = {Fully Convolutional Networks with Sequential Information for Robust Crop and Weed Detection in Precision Farming},
    journal = {IEEE Robotics and Automation Letters (RA-L)},
    year = {2018},
    volume = {3},
    issue = {4},
    pages = {3097-3104},
    doi = {10.1109/LRA.2018.2846289},
    url = {http://www.ipb.uni-bonn.de/pdfs/lottes2018ral.pdf},
    videourl = {https://www.youtube.com/watch?v=vTepw9HRLh8},
    }

  • P. Lottes, J. Behley, N. Chebrolu, A. Milioto, and C. Stachniss, “Joint Stem Detection and Crop-Weed Classification for Plant-specific Treatment in Precision Farming,” in Proceedings of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS) , 2018.
    [BibTeX] [PDF] [Video]

    Applying agrochemicals is the default procedure for conventional weed control in crop production, but has negative impacts on the environment. Robots have the potential to treat every plant in the field individually and thus can reduce the required use of such chemicals. To achieve that, robots need the ability to identify crops and weeds in the field and must additionally select effective treatments. While certain types of weed can be treated mechanically, other types need to be treated by (selective) spraying. In this paper, we present an approach that provides the necessary information for effective plant-specific treatment. It outputs the stem location for weeds, which allows for mechanical treatments, and the covered area of the weed for selective spraying. Our approach uses an end-to-end trainable fully convolutional network that simultaneously estimates stem positions as well as the covered area of crops and weeds. It jointly learns the class-wise stem detection and the pixel-wise semantic segmentation. Experimental evaluations on different real-world datasets show that our approach is able to reliably solve this problem. Compared to state-of-the-art approaches, our approach not only substantially improves the stem detection accuracy, i.e., distinguishing crop and weed stems, but also provides an improvement in the semantic segmentation performance.

    @InProceedings{lottes2018iros,
    author = {P. Lottes and J. Behley and N. Chebrolu and A. Milioto and C. Stachniss},
    title = {Joint Stem Detection and Crop-Weed Classification for Plant-specific Treatment in Precision Farming},
    booktitle = {Proceedings of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS)},
    year = 2018,
    url = {http://www.ipb.uni-bonn.de/pdfs/lottes18iros.pdf},
    videourl = {https://www.youtube.com/watch?v=C9mjZxE_Sxg},
    abstract = {Applying agrochemicals is the default procedure for conventional weed control in crop production, but has negative impacts on the environment. Robots have the potential to treat every plant in the field individually and thus can reduce the required use of such chemicals. To achieve that, robots need the ability to identify crops and weeds in the field and must additionally select effective treatments. While certain types of weed can be treated mechanically, other types need to be treated by (selective) spraying. In this paper, we present an approach that provides the necessary information for effective plant-specific treatment. It outputs the stem location for weeds, which allows for mechanical treatments, and the covered area of the weed for selective spraying. Our approach uses an end-to-end trainable fully convolutional network that simultaneously estimates stem positions as well as the covered area of crops and weeds. It jointly learns the class-wise stem detection and the pixel-wise semantic segmentation. Experimental evaluations on different real-world datasets show that our approach is able to reliably solve this problem. Compared to state-of-the-art approaches, our approach not only substantially improves the stem detection accuracy, i.e., distinguishing crop and weed stems, but also provides an improvement in the semantic segmentation performance.}
    }
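Stem detectors of the kind described above commonly output a per-pixel score map, from which stem coordinates are read off as local maxima above a confidence threshold. The sketch below shows this generic peak-picking step in plain numpy; it is a common post-processing pattern, not necessarily the exact procedure used in the paper, and the threshold is an assumed value.

```python
# Generic local-maximum extraction from a stem score map (illustrative).
import numpy as np

def stem_peaks(heatmap: np.ndarray, thresh: float = 0.5):
    """Return (row, col) positions of local maxima above thresh.

    A pixel is a peak if it attains the maximum of its 3x3 neighborhood
    and its score exceeds the (assumed) confidence threshold."""
    H, W = heatmap.shape
    padded = np.pad(heatmap, 1, constant_values=-np.inf)
    # Max over the 3x3 neighborhood via nine shifted views of the map.
    neigh = np.max(np.stack([padded[i:i + H, j:j + W]
                             for i in range(3) for j in range(3)]), axis=0)
    mask = (heatmap >= neigh) & (heatmap > thresh)
    return list(zip(*np.nonzero(mask)))

# Toy 5x5 score map with two confident stem responses.
hm = np.zeros((5, 5))
hm[2, 3] = 0.9
hm[0, 0] = 0.6
print(stem_peaks(hm))
```

Each returned coordinate would then be handed to the treatment unit, e.g. as the target of a mechanical stamping tool.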

  • A. Milioto, P. Lottes, and C. Stachniss, “Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs,” in Proceedings of the IEEE Int. Conf. on Robotics & Automation (ICRA) , 2018.
    [BibTeX] [PDF] [Video]

    Precision farming robots, which target to reduce the amount of herbicides that need to be brought out in the fields, must have the ability to identify crops and weeds in real time to trigger weeding actions. In this paper, we address the problem of CNN-based semantic segmentation of crop fields separating sugar beet plants, weeds, and background solely based on RGB data. We propose a CNN that exploits existing vegetation indexes and provides a classification in real time. Furthermore, it can be effectively re-trained to so far unseen fields with a comparably small amount of training data. We implemented and thoroughly evaluated our system on a real agricultural robot operating in different fields in Germany and Switzerland. The results show that our system generalizes well, can operate at around 20Hz, and is suitable for online operation in the fields.

    @InProceedings{milioto2018icra,
    author = {A. Milioto and P. Lottes and C. Stachniss},
    title = {Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs},
    year = {2018},
    booktitle = {Proceedings of the IEEE Int. Conf. on Robotics \& Automation (ICRA)},
    abstract = {Precision farming robots, which target to reduce the amount of herbicides that need to be brought out in the fields, must have the ability to identify crops and weeds in real time to trigger weeding actions. In this paper, we address the problem of CNN-based semantic segmentation of crop fields separating sugar beet plants, weeds, and background solely based on RGB data. We propose a CNN that exploits existing vegetation indexes and provides a classification in real time. Furthermore, it can be effectively re-trained to so far unseen fields with a comparably small amount of training data. We implemented and thoroughly evaluated our system on a real agricultural robot operating in different fields in Germany and Switzerland. The results show that our system generalizes well, can operate at around 20Hz, and is suitable for online operation in the fields.},
    url = {https://arxiv.org/abs/1709.06764},
    videourl = {https://youtu.be/DXcTkJmdWFQ},
    }
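The paper above feeds vegetation indexes to the CNN alongside the raw RGB data. As a rough illustration, the sketch below computes the widely used Excess Green index (ExG) and stacks it as a fourth input channel; ExG and the 4-channel layout are plausible, assumed choices for illustration, not the paper's exact set of inputs.

```python
# Hedged sketch: augmenting RGB input with a vegetation-index channel.
import numpy as np

def excess_green(rgb: np.ndarray) -> np.ndarray:
    """ExG = 2g - r - b on chromaticity-normalized RGB, shape (H, W, 3)."""
    rgb = rgb.astype(np.float64)
    s = np.maximum(rgb.sum(axis=-1, keepdims=True), 1e-6)
    r, g, b = np.moveaxis(rgb / s, -1, 0)
    return 2.0 * g - r - b

def with_exg_channel(rgb: np.ndarray) -> np.ndarray:
    """Stack ExG onto the RGB image, yielding a (H, W, 4) network input."""
    return np.concatenate([rgb.astype(np.float64),
                           excess_green(rgb)[..., None]], axis=-1)

img = np.random.randint(0, 256, (64, 64, 3), dtype=np.uint8)
print(with_exg_channel(img).shape)  # (64, 64, 4)
```

Because such indexes encode domain knowledge about what vegetation looks like, the network needs less training data to adapt to a previously unseen field, which is the re-training behavior the paper reports.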

  • A. Walter, R. Khanna, P. Lottes, C. Stachniss, R. Siegwart, J. Nieto, and F. Liebisch, “Flourish – A robotic approach for automation in crop management,” in Proceedings of the Intl. Conference on Precision Agriculture (ICPA) , 2018.
    [BibTeX]

    The Flourish project aims to bridge the gap between current and desired capabilities of agricultural robots by developing an adaptable robotic solution for precision farming. Combining the aerial survey capabilities of a small autonomous multi-copter Unmanned Aerial Vehicle (UAV) with a multi-purpose agricultural Unmanned Ground Vehicle (UGV), the system will be able to survey a field from the air, perform targeted intervention on the ground, and provide detailed information for decision support, all with minimal user intervention. The system can be adapted to a wide range of farm management activities and to different crops by choosing different sensors, status indicators and ground treatment packages. The research project thereby touches a selection of topics addressed by ICPA such as sensor application in managing in-season crop variability, precision nutrient management and crop protection as well as remote sensing applications in precision agriculture and engineering technologies and advances. This contribution will introduce the Flourish consortium and concept using the results of three years of active development, testing, and measuring in field campaigns. Two key parts of the project will be shown in more detail: First, mapping of the field by drones for detection of sugar beet nitrogen status variation and weed pressure in the field and second the perception of the UGV as related to weed classification and subsequent precision weed management. The field mapping by means of an UAV will be shown for crop nitrogen status estimation and weed pressure with examples for subsequent crop management decision support. For nitrogen status, the results indicate that drones are up to the task to deliver crop nitrogen variability maps utilized for variable rate application that are of comparable quality to current on-tractor systems. The weed pressure mapping is viable as basis for the UGV showcase of precision weed management. 
For this, we show the automated image acquisition by the UGV and a subsequent plant classification with a four-step pipeline, differentiating crop from weed in real time. Advantages and disadvantages as well as future prospects of such approaches will be discussed.

    @InProceedings{walter2018icpa,
    Title = {Flourish - A robotic approach for automation in crop management},
    Author = {A. Walter and R. Khanna and P. Lottes and C. Stachniss and R. Siegwart and J. Nieto and F. Liebisch},
    Booktitle = {Proceedings of the Intl. Conference on Precision Agriculture (ICPA)},
    Year = 2018,
    abstract = {The Flourish project aims to bridge the gap between current and desired capabilities of agricultural robots by developing an adaptable robotic solution for precision farming. Combining the aerial survey capabilities of a small autonomous multi-copter Unmanned Aerial Vehicle (UAV) with a multi-purpose agricultural Unmanned Ground Vehicle (UGV), the system will be able to survey a field from the air, perform targeted intervention on the ground, and provide detailed information for decision support, all with minimal user intervention. The system can be adapted to a wide range of farm management activities and to different crops by choosing different sensors, status indicators and ground treatment packages. The research project thereby touches a selection of topics addressed by ICPA such as sensor application in managing in-season crop variability, precision nutrient management and crop protection as well as remote sensing applications in precision agriculture and engineering technologies and advances. This contribution will introduce the Flourish consortium and concept using the results of three years of active development, testing, and measuring in field campaigns. Two key parts of the project will be shown in more detail: First, mapping of the field by drones for detection of sugar beet nitrogen status variation and weed pressure in the field and second the perception of the UGV as related to weed classification and subsequent precision weed management. The field mapping by means of an UAV will be shown for crop nitrogen status estimation and weed pressure with examples for subsequent crop management decision support. For nitrogen status, the results indicate that drones are up to the task to deliver crop nitrogen variability maps utilized for variable rate application that are of comparable quality to current on-tractor systems. The weed pressure mapping is viable as basis for the UGV showcase of precision weed management. 
For this, we show the automated image acquisition by the UGV and a subsequent plant classification with a four-step pipeline, differentiating crop from weed in real time. Advantages and disadvantages as well as future prospects of such approaches will be discussed.},
    }

2017

  • F. Liebisch, M. Popovic, J. Pfeifer, R. Khanna, P. Lottes, C. Stachniss, A. Pretto, I. Sa, J. Nieto, R. Siegwart, and A. Walter, “Automatic UAV-based field inspection campaigns for weeding in row crops,” in Proceedings of the 10th EARSeL SIG Imaging Spectroscopy Workshop , 2017.
    [BibTeX]
    @InProceedings{liebisch2017earsel,
    title = {Automatic UAV-based field inspection campaigns for weeding in row crops},
    author = {F. Liebisch and M. Popovic and J. Pfeifer and R. Khanna and P. Lottes and C. Stachniss and A. Pretto and I. Sa and J. Nieto and R. Siegwart and A. Walter},
    booktitle = {Proceedings of the 10th EARSeL SIG Imaging Spectroscopy Workshop},
    year = {2017},
    }

  • P. Lottes, M. Höferlin, S. Sander, and C. Stachniss, “Effective Vision-based Classification for Separating Sugar Beets and Weeds for Precision Farming,” Journal of Field Robotics, vol. 34, pp. 1160-1178, 2017. doi:10.1002/rob.21675
    [BibTeX] [PDF]
    @Article{lottes2017jfr,
    title = {Effective Vision-based Classification for Separating Sugar Beets and Weeds for Precision Farming},
    author = {Lottes, Philipp and H\"oferlin, Markus and Sander, Slawomir and Stachniss, Cyrill},
    journal = {Journal of Field Robotics},
    year = {2017},
    volume = {34},
    issue = {6},
    pages = {1160-1178},
    doi = {10.1002/rob.21675},
    issn = {1556-4967},
    timestamp = {2016.10.5},
    url = {http://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes16jfr.pdf},
    }

  • N. Chebrolu, P. Lottes, A. Schaefer, W. Winterhalter, W. Burgard, and C. Stachniss, “Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields,” The International Journal of Robotics Research, 2017. doi:10.1177/0278364917720510
    [BibTeX] [PDF]
    @Article{chebrolu2017ijrr,
    title = {Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields},
    author = {N. Chebrolu and P. Lottes and A. Schaefer and W. Winterhalter and W. Burgard and C. Stachniss},
    journal = {The International Journal of Robotics Research},
    year = {2017},
    doi = {10.1177/0278364917720510},
    url = {http://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/chebrolu2017ijrr.pdf},
    }

  • P. Lottes, R. Khanna, J. Pfeifer, R. Siegwart, and C. Stachniss, “UAV-Based Crop and Weed Classification for Smart Farming,” in Proceedings of the IEEE Int. Conf. on Robotics & Automation (ICRA) , 2017.
    [BibTeX] [PDF]
    @InProceedings{lottes2017icra,
    title = {UAV-Based Crop and Weed Classification for Smart Farming},
    author = {P. Lottes and R. Khanna and J. Pfeifer and R. Siegwart and C. Stachniss},
    booktitle = {Proceedings of the IEEE Int. Conf. on Robotics \& Automation (ICRA)},
    year = {2017},
    url = {http://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes17icra.pdf},
    }

  • P. Lottes and C. Stachniss, “Semi-Supervised Online Visual Crop and Weed Classification in Precision Farming Exploiting Plant Arrangement,” in Proceedings of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS) , 2017.
    [BibTeX] [PDF]
    @InProceedings{lottes2017iros,
    title = {Semi-Supervised Online Visual Crop and Weed Classification in Precision Farming Exploiting Plant Arrangement},
    author = {P. Lottes and C. Stachniss},
    booktitle = {Proceedings of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS)},
    year = {2017},
    url = {http://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes17iros.pdf},
    }

  • A. Milioto, P. Lottes, and C. Stachniss, “Real-time Blob-wise Sugar Beets vs Weeds Classification for Monitoring Fields using Convolutional Neural Networks,” in Proceedings of the ISPRS Conference on Unmanned Aerial Vehicles in Geomatics (UAV-g) , 2017.
    [BibTeX] [PDF]

    UAVs are becoming an important tool for field monitoring and precision farming. A prerequisite for observing and analyzing fields is the ability to identify crops and weeds from image data. In this paper, we address the problem of detecting the sugar beet plants and weeds in the field based solely on image data. We propose a system that combines vegetation detection and deep learning to obtain a high-quality classification of the vegetation in the field into value crops and weeds. We implemented and thoroughly evaluated our system on image data collected from different sugar beet fields and illustrate that our approach allows for accurately identifying the weeds on the field.

    @InProceedings{milioto2017uavg,
    title = {Real-time Blob-wise Sugar Beets vs Weeds Classification for Monitoring Fields using Convolutional Neural Networks},
    author = {A. Milioto and P. Lottes and C. Stachniss},
    booktitle = {Proceedings of the ISPRS Conference on Unmanned Aerial Vehicles in Geomatics (UAV-g)},
    year = {2017},
    abstract = {UAVs are becoming an important tool for field monitoring and precision farming. A prerequisite for observing and analyzing fields is the ability to identify crops and weeds from image data. In this paper, we address the problem of detecting the sugar beet plants and weeds in the field based solely on image data. We propose a system that combines vegetation detection and deep learning to obtain a high-quality classification of the vegetation in the field into value crops and weeds. We implemented and thoroughly evaluated our system on image data collected from different sugar beet fields and illustrate that our approach allows for accurately identifying the weeds on the field.},
    url = {http://www.ipb.uni-bonn.de/pdfs/milioto17uavg.pdf},
    }

2016

  • F. Liebisch, J. Pfeifer, R. Khanna, P. Lottes, C. Stachniss, T. Falck, S. Sander, R. Siegwart, A. Walter, and E. Galceran, “Flourish — A robotic approach for automation in crop management,” in Proceedings of the Workshop für Computer-Bildanalyse und unbemannte autonom fliegende Systeme in der Landwirtschaft , 2016.
    [BibTeX] [PDF]
    @InProceedings{liebisch16wslw,
    title = {Flourish -- A robotic approach for automation in crop management},
    author = {F. Liebisch and J. Pfeifer and R. Khanna and P. Lottes and C. Stachniss and T. Falck and S. Sander and R. Siegwart and A. Walter and E. Galceran},
    booktitle = {Proceedings of the Workshop f\"ur Computer-Bildanalyse und unbemannte autonom fliegende Systeme in der Landwirtschaft},
    year = {2016},
    timestamp = {2016.06.15},
    url = {http://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/liebisch16cbaws.pdf},
    }

  • P. Lottes, M. Höferlin, S. Sander, M. Müter, P. Schulze-Lammers, and C. Stachniss, “An Effective Classification System for Separating Sugar Beets and Weeds for Precision Farming Applications,” in Proceedings of the IEEE Int. Conf. on Robotics & Automation (ICRA) , 2016.
    [BibTeX] [PDF]
    @InProceedings{lottes2016icra,
    title = {An Effective Classification System for Separating Sugar Beets and Weeds for Precision Farming Applications},
    author = {P. Lottes and M. H\"oferlin and S. Sander and M. M\"uter and P. Schulze-Lammers and C. Stachniss},
    booktitle = {Proceedings of the IEEE Int. Conf. on Robotics \& Automation (ICRA)},
    year = {2016},
    timestamp = {2016.01.15},
    url = {http://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes16icra.pdf},
    }

Awards

  • Winner of Best Paper Award in Automation of the IEEE Robotics and Automation Society at ICRA 2017, “UAV-Based Crop and Weed Classification for Smart Farming”
  • Finalist for the Best Application Paper Award of the IEEE/RSJ International Conference on Intelligent Robots and Systems at IROS 2017, “Semi-Supervised Online Visual Crop and Weed Classification in Precision Farming Exploiting Plant Arrangement”
  • Finalist for the Best Service Paper Award of the IEEE Robotics and Automation Society at ICRA 2018, “Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs”

Other Articles

2017

  • P. Lottes, R. Khanna, J. Pfeifer, R. Siegwart, C. Stachniss, “UAV-Based Crop and Weed Classification for Future Farming”, RoboHub, 06/2017
  • H. Blum, R. Pude, P. Lottes, S. Brell, “Mechanische Unkrautregulierung im Arznei- und Gewürzpflanzenanbau am Campus Klein-Altendorf”, in Zeitschrift für Arznei- und Gewürzpflanzen (Journal of Medicinal and Spice Plants, ZAG), 01/2017

2016

  • P. Lottes, “Roboter auf dem Feld”, in Gartenbau Profi, Monatszeitschrift für Obst, Gemüse und Zierpflanzen, 11/2016

2014

  • L. Klingbeil, P. Lottes, and H. Kuhlmann, “Laserscanning-Technologie auf sich bewegenden Plattformen”, in Schriftenreihe des DVW Terrestrisches Laserscanning, 2014

Project Reports

2015

  • P. Lottes, “Residuenanalyse zur Detektion von Deformationen am Radioteleskop in Effelsberg”, 2015.
    [PDF]
  • P. Lottes, “Systemkalibrierung eines Profillaserscanners in einem Mobilen-Mapping-System in der Praxis”, 2015.
    [PDF]

2014

  • P. Lottes, “Systemkalibrierung profilmessender Laserscanner in Mobilen-Mapping-Systemen innerhalb einer Testfeldumgebung”, 2014.
    [PDF]