Andres Milioto

PhD Student
Contact:
Email: amilioto@uni-bonn.de
Tel: +49 228 73 60190
Fax: +49 228 73 2712
Office: Nussallee 15, 1st floor, room 1.008
Address:
University of Bonn
Photogrammetry, IGG
Nussallee 15
53115 Bonn
Profiles: Google Scholar | LinkedIn | ResearchGate | GitHub

Research Interests

  • Deep Learning for Robotics
  • Computer Vision for Robotics
  • Artificial Intelligence
  • Real-time Systems

Projects

  • Flourish – Developing computer vision algorithms to detect crops and weeds for autonomous precision-agriculture robots, with a focus on novel machine learning methods that support the autonomy of agricultural robotics solutions.
  • Bonnet – An open-source training and deployment framework for semantic segmentation in robotics.
  • Alfred – Leading the development of an autonomous multi-sensor robotic platform based on the Clearpath Husky A200.

  • goPro-meta – Software for extracting per-frame metadata, such as GPS information, from GoPro Hero5 cameras (see the sketch below).
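
    GoPro Hero5 cameras embed their telemetry (GPS, IMU, and so on) in a dedicated "gpmd" data stream inside the MP4 container. As a rough illustration of the extraction problem goPro-meta addresses, the Python sketch below locates that stream with ffprobe; the actual goPro-meta tooling may work differently, and the function name is ours.

    import json
    import subprocess

    def find_gopro_telemetry_stream(video_path):
        """Locate the GoPro telemetry ('gpmd') stream in an MP4 via ffprobe.

        Illustrative sketch only; goPro-meta's own pipeline may differ.
        Returns the stream index, or None if no telemetry track is found.
        """
        probe = subprocess.run(
            ["ffprobe", "-v", "quiet", "-print_format", "json",
             "-show_streams", video_path],
            capture_output=True, text=True, check=True,
        ).stdout
        for stream in json.loads(probe)["streams"]:
            # GoPro stores its telemetry in a data stream tagged 'gpmd'
            if stream.get("codec_tag_string") == "gpmd":
                return stream["index"]
        return None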

Short CV

Andres Milioto has been a Research Assistant and Ph.D. student at the University of Bonn since February 2017. He received his Electrical Engineering degree from Universidad Nacional de Rosario, Argentina, in June 2016, graduating at the top of his class. During his studies, he was involved in several robotics projects for private companies in Argentina, Mexico, and Italy, including the construction of a large-scale iron-pellet stacker and software development for robotic arms in welding applications. In the two years preceding his Ph.D., he worked for iRobot (USA) on software development and hardware integration, developing behaviors and communication protocols for state-of-the-art, SLAM-enabled consumer robots.

Teaching

  • M26-PIROS – Solving Online Perception Problems in ROS – Winter Semester 2017
  • M26-APMR – Advanced Perception for Mobile Robotics – Winter Semester 2018

Awards

  • Winner, “Best Demo” Award (Bonnet), Workshop on Multimodal Robot Perception, ICRA 2018, Brisbane, Australia.
  • Finalist, “IEEE ICRA Best Paper Award in Service Robotics,” ICRA 2018, Brisbane, Australia.
  • Best in 2015/2016 class, Electrical Engineering, Universidad Nacional de Rosario, Argentina.

Publications

2018

  • P. Lottes, J. Behley, A. Milioto, and C. Stachniss, “Fully Convolutional Networks with Sequential Information for Robust Crop and Weed Detection in Precision Farming,” IEEE Robotics and Automation Letters (RA-L), vol. 3, pp. 3097-3104, 2018. doi:10.1109/LRA.2018.2846289
    [BibTeX] [PDF] [Video]
    @Article{lottes2018ral,
    author = {P. Lottes and J. Behley and A. Milioto and C. Stachniss},
    title = {Fully Convolutional Networks with Sequential Information for Robust Crop and Weed Detection in Precision Farming},
    journal = {IEEE Robotics and Automation Letters (RA-L)},
    year = {2018},
    volume = {3},
    issue = {4},
    pages = {3097-3104},
    doi = {10.1109/LRA.2018.2846289},
    url = {http://www.ipb.uni-bonn.de/pdfs/lottes2018ral.pdf},
    videourl = {https://www.youtube.com/watch?v=vTepw9HRLh8},
    }

  • P. Regier, A. Milioto, P. Karkowski, C. Stachniss, and M. Bennewitz, “Classifying Obstacles and Exploiting Knowledge about Classes for Efficient Humanoid Navigation,” in Proceedings of the IEEE-RAS Int. Conf. on Humanoid Robots (HUMANOIDS), 2018.
    [BibTeX]
    @InProceedings{regier2018humanoids,
    author = {P. Regier and A. Milioto and P. Karkowski and C. Stachniss and M. Bennewitz},
    title = {{Classifying Obstacles and Exploiting Knowledge about Classes for Efficient Humanoid Navigation}},
    booktitle = {Proceedings of the IEEE-RAS Int. Conf. on Humanoid Robots (HUMANOIDS)},
    year = 2018,
    }

  • P. Lottes, J. Behley, N. Chebrolu, A. Milioto, and C. Stachniss, “Joint Stem Detection and Crop-Weed Classification for Plant-specific Treatment in Precision Farming,” in Proceedings of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS), 2018.
    [BibTeX] [PDF] [Video]

    Applying agrochemicals is the default procedure for conventional weed control in crop production, but has negative impacts on the environment. Robots have the potential to treat every plant in the field individually and thus can reduce the required use of such chemicals. To achieve that, robots need the ability to identify crops and weeds in the field and must additionally select effective treatments. While certain types of weed can be treated mechanically, other types need to be treated by (selective) spraying. In this paper, we present an approach that provides the necessary information for effective plant-specific treatment. It outputs the stem location for weeds, which allows for mechanical treatments, and the covered area of the weed for selective spraying. Our approach uses an end-to-end trainable fully convolutional network that simultaneously estimates stem positions as well as the covered area of crops and weeds. It jointly learns the class-wise stem detection and the pixel-wise semantic segmentation. Experimental evaluations on different real-world datasets show that our approach is able to reliably solve this problem. Compared to state-of-the-art approaches, our approach not only substantially improves the stem detection accuracy, i.e., distinguishing crop and weed stems, but also provides an improvement in the semantic segmentation performance.

    @InProceedings{lottes2018iros,
    author = {P. Lottes and J. Behley and N. Chebrolu and A. Milioto and C. Stachniss},
    title = {Joint Stem Detection and Crop-Weed Classification for Plant-specific Treatment in Precision Farming},
    booktitle = {Proceedings of the IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS)},
    year = 2018,
    url = {http://www.ipb.uni-bonn.de/pdfs/lottes18iros.pdf},
    videourl = {https://www.youtube.com/watch?v=C9mjZxE_Sxg},
    abstract = {Applying agrochemicals is the default procedure for conventional weed control in crop production, but has negative impacts on the environment. Robots have the potential to treat every plant in the field individually and thus can reduce the required use of such chemicals. To achieve that, robots need the ability to identify crops and weeds in the field and must additionally select effective treatments. While certain types of weed can be treated mechanically, other types need to be treated by (selective) spraying. In this paper, we present an approach that provides the necessary information for effective plant-specific treatment. It outputs the stem location for weeds, which allows for mechanical treatments, and the covered area of the weed for selective spraying. Our approach uses an end-to-end trainable fully convolutional network that simultaneously estimates stem positions as well as the covered area of crops and weeds. It jointly learns the class-wise stem detection and the pixel-wise semantic segmentation. Experimental evaluations on different real-world datasets show that our approach is able to reliably solve this problem. Compared to state-of-the-art approaches, our approach not only substantially improves the stem detection accuracy, i.e., distinguishing crop and weed stems, but also provides an improvement in the semantic segmentation performance.}
    }
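
    As an illustration of the joint architecture the abstract above describes (one shared encoder feeding both a pixel-wise segmentation head and a stem-location head), here is a minimal PyTorch sketch; layer sizes and structure are invented for clarity and do not reproduce the paper's actual network.

    import torch
    import torch.nn as nn

    class JointStemSegNet(nn.Module):
        """Minimal sketch of a shared-encoder network with two task heads:
        pixel-wise crop/weed/soil segmentation and a stem-location heatmap.
        Layer sizes are illustrative only, not the paper's architecture."""

        def __init__(self, n_classes=3):
            super().__init__()
            self.encoder = nn.Sequential(               # shared feature extractor
                nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
            )
            self.seg_head = nn.Conv2d(64, n_classes, 1)  # per-pixel class scores
            self.stem_head = nn.Conv2d(64, 1, 1)         # per-pixel stem "heat"

        def forward(self, x):
            feats = self.encoder(x)
            return self.seg_head(feats), self.stem_head(feats)

    # Joint training would sum a segmentation loss and a stem-heatmap loss.
    net = JointStemSegNet()
    seg_logits, stem_map = net(torch.randn(1, 3, 128, 128))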

  • A. Milioto, P. Lottes, and C. Stachniss, “Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs,” in Proceedings of the IEEE Int. Conf. on Robotics & Automation (ICRA), 2018.
    [BibTeX] [PDF] [Video]

    Precision farming robots, which target to reduce the amount of herbicides that need to be brought out in the fields, must have the ability to identify crops and weeds in real time to trigger weeding actions. In this paper, we address the problem of CNN-based semantic segmentation of crop fields separating sugar beet plants, weeds, and background solely based on RGB data. We propose a CNN that exploits existing vegetation indexes and provides a classification in real time. Furthermore, it can be effectively re-trained to so far unseen fields with a comparably small amount of training data. We implemented and thoroughly evaluated our system on a real agricultural robot operating in different fields in Germany and Switzerland. The results show that our system generalizes well, can operate at around 20Hz, and is suitable for online operation in the fields.

    @InProceedings{milioto2018icra,
    author = {A. Milioto and P. Lottes and C. Stachniss},
    title = {Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs},
    year = {2018},
    booktitle = {Proceedings of the IEEE Int. Conf. on Robotics \& Automation (ICRA)},
    abstract = {Precision farming robots, which target to reduce the amount of herbicides that need to be brought out in the fields, must have the ability to identify crops and weeds in real time to trigger weeding actions. In this paper, we address the problem of CNN-based semantic segmentation of crop fields separating sugar beet plants, weeds, and background solely based on RGB data. We propose a CNN that exploits existing vegetation indexes and provides a classification in real time. Furthermore, it can be effectively re-trained to so far unseen fields with a comparably small amount of training data. We implemented and thoroughly evaluated our system on a real agricultural robot operating in different fields in Germany and Switzerland. The results show that our system generalizes well, can operate at around 20Hz, and is suitable for online operation in the fields.},
    url = {https://arxiv.org/abs/1709.06764},
    videourl = {https://youtu.be/DXcTkJmdWFQ},
    }
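
    The core idea of the paper above, feeding hand-crafted vegetation indexes alongside RGB as additional CNN input channels, can be sketched in a few lines. The example below uses the Excess Green and Excess Red indexes with their common definitions; the paper's exact channel set may differ.

    import numpy as np

    def vegetation_channels(rgb):
        """Stack RGB with simple vegetation indexes as extra CNN input channels.

        rgb: float32 array of shape (H, W, 3), values in [0, 1].
        Returns an (H, W, 5) array. Index definitions follow common usage
        (ExG = 2g - r - b, ExR = 1.4r - g); the paper's set may differ.
        """
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        exg = 2.0 * g - r - b   # Excess Green: highlights vegetation
        exr = 1.4 * r - g       # Excess Red: highlights soil/background
        return np.dstack([rgb, exg, exr]).astype(np.float32)

    # Usage: the (H, W, 5) tensor replaces the plain RGB input of the CNN.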

  • A. Milioto and C. Stachniss, “Bonnet: An Open-Source Training and Deployment Framework for Semantic Segmentation in Robotics using CNNs,” ICRA Workshop on Perception, Inference, and Learning for Joint Semantic, Geometric, and Physical Understanding, 2018.
    [BibTeX] [PDF] [Code] [Video]
    @Article{milioto2018icraws,
    author = {A. Milioto and C. Stachniss},
    title = "{Bonnet: An Open-Source Training and Deployment Framework for Semantic Segmentation in Robotics using CNNs}",
    journal = {ICRA Workshop on Perception, Inference, and Learning for Joint Semantic, Geometric, and Physical Understanding},
    eprint = {1802.08960},
    primaryclass = "cs.RO",
    keywords = {Computer Science - Robotics, Computer Science - Computer Vision and Pattern Recognition},
    year = 2018,
    month = may,
    url = {https://arxiv.org/abs/1802.08960},
    codeurl = {https://github.com/Photogrammetry-Robotics-Bonn/bonnet},
    videourl = {https://www.youtube.com/watch?v=tfeFHCq6YJs},
    }

  • K. Franz, R. Roscher, A. Milioto, S. Wenzel, and J. Kusche, “Ocean Eddy Identification and Tracking using Neural Networks,” in IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 2018.
    [BibTeX] [PDF]
    @InProceedings{franz2018ocean,
    author = {Franz, K. and Roscher, R. and Milioto, A. and Wenzel, S. and Kusche, J.},
    title = {Ocean Eddy Identification and Tracking using Neural Networks},
    booktitle = {IEEE International Geoscience and Remote Sensing Symposium (IGARSS)},
    year = {2018},
    note = {accepted},
    url = {https://arxiv.org/abs/1803.07436},
    }

2017

  • A. Milioto, P. Lottes, and C. Stachniss, “Real-time Blob-wise Sugar Beets vs Weeds Classification for Monitoring Fields using Convolutional Neural Networks,” in Proceedings of the ISPRS Conference on Unmanned Aerial Vehicles in Geomatics (UAV-g), 2017.
    [BibTeX] [PDF]

    UAVs are becoming an important tool for field monitoring and precision farming. A prerequisite for observing and analyzing fields is the ability to identify crops and weeds from image data. In this paper, we address the problem of detecting the sugar beet plants and weeds in the field based solely on image data. We propose a system that combines vegetation detection and deep learning to obtain a high-quality classification of the vegetation in the field into value crops and weeds. We implemented and thoroughly evaluated our system on image data collected from different sugar beet fields and illustrate that our approach allows for accurately identifying the weeds on the field.

    @InProceedings{milioto2017uavg,
    title = {Real-time Blob-wise Sugar Beets vs Weeds Classification for Monitoring Fields using Convolutional Neural Networks},
    author = {A. Milioto and P. Lottes and C. Stachniss},
    booktitle = {Proceedings of the ISPRS Conference on Unmanned Aerial Vehicles in Geomatics (UAV-g)},
    year = {2017},
    abstract = {UAVs are becoming an important tool for field monitoring and precision farming. A prerequisite for observing and analyzing fields is the ability to identify crops and weeds from image data. In this paper, we address the problem of detecting the sugar beet plants and weeds in the field based solely on image data. We propose a system that combines vegetation detection and deep learning to obtain a high-quality classification of the vegetation in the field into value crops and weeds. We implemented and thoroughly evaluated our system on image data collected from different sugar beet fields and illustrate that our approach allows for accurately identifying the weeds on the field.},
    url = {http://www.ipb.uni-bonn.de/pdfs/milioto17uavg.pdf},
    }
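
    The two-stage pipeline sketched in the abstract above (vegetation detection first, then per-blob CNN classification) could look roughly like the following; classify_blob is a hypothetical stand-in for the trained CNN, and the threshold value is illustrative.

    import numpy as np
    from scipy import ndimage

    def blobwise_classify(rgb, classify_blob, exg_thresh=0.1):
        """Sketch of blob-wise crop/weed classification:
        1. detect vegetation via an Excess-Green threshold,
        2. split the mask into connected blobs,
        3. classify each blob crop-vs-weed with a CNN (a stand-in here).
        """
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        veg_mask = (2.0 * g - r - b) > exg_thresh      # vegetation detection
        labels, n_blobs = ndimage.label(veg_mask)      # connected components
        results = {}
        for blob_id in range(1, n_blobs + 1):
            ys, xs = np.nonzero(labels == blob_id)
            patch = rgb[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
            results[blob_id] = classify_blob(patch)    # e.g. 'crop' or 'weed'
        return labels, results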

Institute of Geodesy and Geoinformation | Faculty of Agriculture | University of Bonn