Andres Milioto

PhD Student
Contact:
Email: amilioto@uni-bonn.de
Tel: +49 - 228 - 73 - 60 190
Fax: +49 - 228 - 73 - 27 12
Office: Nussallee 15, 1st floor, room 1.008
Address:
University of Bonn
Photogrammetry, IGG
Nussallee 15
53115 Bonn
Profiles: Google Scholar | LinkedIn | ResearchGate | GitHub

Research Interests

  • Probabilistic Robotics
  • Machine Learning

Short CV

Andres Milioto has been a Research Assistant and Ph.D. student at the Rheinische Friedrich-Wilhelms-Universität Bonn since February 2017. He received his Master's degree in Electrical Engineering from the Universidad Nacional de Rosario in Rosario, Argentina, in June 2016, graduating at the top of his class. During his Master's he was involved in several robotics projects for private companies in Argentina, Mexico, and Italy, including the construction of a large-scale iron pellet stacker and software development for robotic arms in welding applications. In the two years preceding his Ph.D. he worked in the United States for iRobot, doing software development and hardware integration and developing behaviors and communication protocols for state-of-the-art, SLAM-enabled consumer robots.

Publications

2017

  • A. Milioto, P. Lottes, and C. Stachniss, “Real-time Blob-wise Sugar Beets vs Weeds Classification for Monitoring Fields using Convolutional Neural Networks,” in Proceedings of the ISPRS Conference on Unmanned Aerial Vehicles in Geomatics (UAV-g), 2017.
    [BibTeX] [PDF]
    UAVs are becoming an important tool for field monitoring and precision farming. A prerequisite for observing and analyzing fields is the ability to identify crops and weeds from image data. In this paper, we address the problem of detecting the sugar beet plants and weeds in the field based solely on image data. We propose a system that combines vegetation detection and deep learning to obtain a high-quality classification of the vegetation in the field into value crops and weeds. We implemented and thoroughly evaluated our system on image data collected from different sugar beet fields and illustrate that our approach allows for accurately identifying the weeds on the field.

    @InProceedings{milioto17uavg,
    Title = {Real-time Blob-wise Sugar Beets vs Weeds Classification for Monitoring Fields using Convolutional Neural Networks},
    Author = {A. Milioto and P. Lottes and C. Stachniss},
    Booktitle = {Proceedings of the ISPRS Conference on Unmanned Aerial Vehicles in Geomatics (UAV-g)},
    Year = {2017},
    Abstract = {UAVs are becoming an important tool for field monitoring and precision farming. A prerequisite for observing and analyzing fields is the ability to identify crops and weeds from image data. In this paper, we address the problem of detecting the sugar beet plants and weeds in the field based solely on image data. We propose a system that combines vegetation detection and deep learning to obtain a high-quality classification of the vegetation in the field into value crops and weeds. We implemented and thoroughly evaluated our system on image data collected from different sugar beet fields and illustrate that our approach allows for accurately identifying the weeds on the field.},
    Url = {http://www.ipb.uni-bonn.de/pdfs/milioto17uavg.pdf}
    }

  • A. Milioto, P. Lottes, and C. Stachniss, “Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs,” arXiv preprint:1709.06764, 2017.
    [BibTeX] [PDF]
    Precision farming robots, which target to reduce the amount of herbicides that need to be brought out in the fields, must have the ability to identify crops and weeds in real time to trigger weeding actions. In this paper, we address the problem of CNN-based semantic segmentation of crop fields separating sugar beet plants, weeds, and background solely based on RGB data. We propose a CNN that exploits existing vegetation indexes and provides a classification in real time. Furthermore, it can be effectively re-trained to so far unseen fields with a comparably small amount of training data. We implemented and thoroughly evaluated our system on a real agricultural robot operating in different fields in Germany and Switzerland. The results show that our system generalizes well, can operate at around 20Hz, and is suitable for online operation in the fields.

    @Article{milioto17arxiv,
    Author = {A. Milioto and P. Lottes and C. Stachniss},
    Title = {Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs},
    Year = {2017},
    Journal = {arXiv preprint:1709.06764},
    Abstract = {Precision farming robots, which target to reduce the amount of herbicides that need to be brought out in the fields, must have the ability to identify crops and weeds in real time to trigger weeding actions. In this paper, we address the problem of CNN-based semantic segmentation of crop fields separating sugar beet plants, weeds, and background solely based on RGB data. We propose a CNN that exploits existing vegetation indexes and provides a classification in real time. Furthermore, it can be effectively re-trained to so far unseen fields with a comparably small amount of training data. We implemented and thoroughly evaluated our system on a real agricultural robot operating in different fields in Germany and Switzerland. The results show that our system generalizes well, can operate at around 20Hz, and is suitable for online operation in the fields.},
    Url = {https://arxiv.org/abs/1709.06764}
    }
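
Both papers above build on the idea of feeding vegetation indices, computed directly from the camera image, into a CNN that separates crop from weed. As a rough illustration of what such a preprocessing step can look like, here is a minimal sketch assuming the standard Excess Green index, a hand-picked threshold, and NumPy; it is not the pipeline actually used in the papers.

    # Minimal sketch, not the authors' implementation: compute the Excess Green
    # (ExG) vegetation index, threshold it into a vegetation mask, and stack it
    # as an extra CNN input channel. Index choice, threshold, and channel layout
    # are illustrative assumptions only.
    import numpy as np

    def excess_green(rgb: np.ndarray) -> np.ndarray:
        """ExG = 2g - r - b on chromatic coordinates, for an HxWx3 RGB image."""
        rgb = rgb.astype(np.float32)
        total = rgb.sum(axis=2, keepdims=True) + 1e-6  # avoid division by zero
        r, g, b = np.split(rgb / total, 3, axis=2)     # normalized channels
        return (2.0 * g - r - b).squeeze(axis=2)

    def vegetation_mask(rgb: np.ndarray, threshold: float = 0.1) -> np.ndarray:
        """Binary mask of likely vegetation pixels (threshold is a guess)."""
        return excess_green(rgb) > threshold

    def cnn_input(rgb: np.ndarray) -> np.ndarray:
        """Stack normalized RGB and the ExG channel into an HxWx4 tensor."""
        exg = excess_green(rgb)[..., np.newaxis]
        return np.concatenate([rgb.astype(np.float32) / 255.0, exg], axis=2)

    if __name__ == "__main__":
        img = np.random.randint(0, 256, size=(480, 640, 3), dtype=np.uint8)
        print(vegetation_mask(img).mean(), cnn_input(img).shape)

In this kind of setup the index channel or mask only acts as a prior on where vegetation is; the actual crop-vs-weed decision is left to the learned classifier, as described in the abstracts.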

Institute of Geodesy and Geoinformation | Faculty of Agriculture | University of Bonn