Andres Milioto

PhD Student
Contact:
Email: amilioto@uni-bonn.de
Tel: +49 – 228 – 73 – 60 190
Fax: +49 – 228 – 73 – 27 12
Office: Nussallee 15, 1st floor, room 1.008
Address:
University of Bonn
Photogrammetry, IGG
Nussallee 15
53115 Bonn
Profiles: Google Scholar | LinkedIn | ResearchGate | GitHub

Research Interests

  • Deep Learning for Robotics
  • Computer Vision for Robotics
  • Artificial Intelligence
  • Real-time Perception Systems

Projects

  • Flourish – Developing computer vision algorithms to detect crops and weeds for autonomous precision-agriculture robots, focusing on novel machine learning methods that aid the autonomy of agricultural robotics solutions.
  • Bonnet – an open-source training and deployment framework for semantic segmentation in robotics.
  • Alfred: Leading development effort of autonomous multi-sensor robotic platform based on Clearpath Husky A200.
  • goPro-meta – Software for extracting metadata, such as per-frame GPS information, from GoPro Hero5 cameras.

Short CV

Andres Milioto has been a Research Assistant and Ph.D. student at the University of Bonn since February 2017. He received his Electrical Engineering degree from Universidad Nacional de Rosario, Argentina, in June 2016, graduating at the top of his class. During his studies, he was involved in several robotics projects for private companies, including the construction of a large-scale iron-pellet stacker and software development for robotic arms in welding applications, in Argentina, Mexico, and Italy. In the two years preceding his Ph.D., he worked for iRobot (USA) on software development and hardware integration, developing behaviors and communication protocols for state-of-the-art, SLAM-enabled consumer robots.

Teaching

  • M26-PIROS – Solving Online Perception Problems in ROS

Awards

  • Winner, “Best Demo” Award (Bonnet), Workshop on Multimodal Robot Perception, ICRA 2018, Brisbane, Australia.
  • Finalist, IEEE ICRA Best Paper Award in Service Robotics, ICRA 2018, Brisbane, Australia.
  • Best of the 2015/2016 class, Electrical Engineering, Universidad Nacional de Rosario, Argentina.

Publications

2018

  • P. Lottes, J. Behley, A. Milioto, and C. Stachniss, “Fully Convolutional Networks with Sequential Information for Robust Crop and Weed Detection in Precision Farming,” IEEE Robotics and Automation Letters (RA-L), 2018. doi:10.1109/LRA.2018.2846289
    [BibTeX] [PDF]
    @Article{lottes2018ral,
    author = {P. Lottes and J. Behley and A. Milioto and C. Stachniss},
    title = {Fully Convolutional Networks with Sequential Information for Robust Crop and Weed Detection in Precision Farming},
    journal = {IEEE Robotics and Automation Letters (RA-L)},
    year = {2018},
    doi = {10.1109/LRA.2018.2846289},
    url = {http://www.ipb.uni-bonn.de/pdfs/lottes2018ral.pdf},
    }

  • A. Milioto, P. Lottes, and C. Stachniss, “Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs,” in Proceedings of the IEEE Int. Conf. on Robotics & Automation (ICRA), 2018.
    [BibTeX] [PDF] [Video]

    Precision farming robots, which target to reduce the amount of herbicides that need to be brought out in the fields, must have the ability to identify crops and weeds in real time to trigger weeding actions. In this paper, we address the problem of CNN-based semantic segmentation of crop fields separating sugar beet plants, weeds, and background solely based on RGB data. We propose a CNN that exploits existing vegetation indexes and provides a classification in real time. Furthermore, it can be effectively re-trained to so far unseen fields with a comparably small amount of training data. We implemented and thoroughly evaluated our system on a real agricultural robot operating in different fields in Germany and Switzerland. The results show that our system generalizes well, can operate at around 20Hz, and is suitable for online operation in the fields.

    @InProceedings{milioto2018icra,
    author = {A. Milioto and P. Lottes and C. Stachniss},
    title = {Real-time Semantic Segmentation of Crop and Weed for Precision Agriculture Robots Leveraging Background Knowledge in CNNs},
    year = {2018},
    booktitle = {Proceedings of the IEEE Int. Conf. on Robotics \& Automation (ICRA)},
    abstract = {Precision farming robots, which target to reduce the amount of herbicides that need to be brought out in the fields, must have the ability to identify crops and weeds in real time to trigger weeding actions. In this paper, we address the problem of CNN-based semantic segmentation of crop fields separating sugar beet plants, weeds, and background solely based on RGB data. We propose a CNN that exploits existing vegetation indexes and provides a classification in real time. Furthermore, it can be effectively re-trained to so far unseen fields with a comparably small amount of training data. We implemented and thoroughly evaluated our system on a real agricultural robot operating in different fields in Germany and Switzerland. The results show that our system generalizes well, can operate at around 20Hz, and is suitable for online operation in the fields.},
    url = {https://arxiv.org/abs/1709.06764},
    videourl = {https://youtu.be/DXcTkJmdWFQ},
    }
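The abstract above notes that the CNN exploits existing vegetation indexes as inputs alongside RGB. As an illustrative sketch only (not the paper's actual implementation), one widely used index, Excess Green (ExG = 2g − r − b on chromatic coordinates), can be computed like this; the 0.1 threshold is a hypothetical value for the toy example:

```python
import numpy as np

def excess_green(rgb):
    """Compute the Excess Green (ExG) vegetation index, 2g - r - b,
    on chromatic (normalized) RGB coordinates."""
    rgb = rgb.astype(np.float64)
    total = rgb.sum(axis=-1, keepdims=True)
    total[total == 0] = 1.0          # avoid division by zero on black pixels
    r, g, b = np.moveaxis(rgb / total, -1, 0)
    return 2.0 * g - r - b

# Pure green pixels score highest; soil-like pixels score near zero.
img = np.array([[[0, 255, 0], [120, 100, 80]]], dtype=np.uint8)
exg = excess_green(img)
vegetation_mask = exg > 0.1          # illustrative threshold, tuned per dataset
```

In practice such index channels are fed into the network as additional inputs rather than thresholded directly.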

  • A. Milioto and C. Stachniss, “Bonnet: An Open-Source Training and Deployment Framework for Semantic Segmentation in Robotics using CNNs,” Workshop on Perception, Inference, and Learning for Joint Semantic, Geometric, and Physical Understanding, IEEE Int. Conf. on Robotics & Automation (ICRA), 2018.
    [BibTeX] [PDF] [Code] [Video]
    @Article{milioto2018icraws,
    author = {A. Milioto and C. Stachniss},
    title = "{Bonnet: An Open-Source Training and Deployment Framework for Semantic Segmentation in Robotics using CNNs}",
    journal = {Workshop on Perception, Inference, and Learning for Joint Semantic, Geometric, and Physical Understanding, IEEE Int. Conf. on Robotics \& Automation (ICRA)},
    eprint = {1802.08960},
    primaryclass = "cs.RO",
    keywords = {Computer Science - Robotics, Computer Science - Computer Vision and Pattern Recognition},
    year = 2018,
    month = may,
    url = {https://arxiv.org/abs/1802.08960},
    codeurl = {https://github.com/Photogrammetry-Robotics-Bonn/bonnet},
    videourl = {https://www.youtube.com/watch?v=tfeFHCq6YJs},
    }

  • K. Franz, R. Roscher, A. Milioto, S. Wenzel, and J. Kusche, “Ocean Eddy Identification and Tracking using Neural Networks,” in IEEE International Geoscience and Remote Sensing Symposium (IGARSS), 2018.
    [BibTeX] [PDF]
    @InProceedings{franz2018ocean,
    author = {Franz, K. and Roscher, R. and Milioto, A. and Wenzel, S. and Kusche, J.},
    title = {Ocean Eddy Identification and Tracking using Neural Networks},
    booktitle = {IEEE International Geoscience and Remote Sensing Symposium (IGARSS)},
    year = {2018},
    note = {accepted},
    url = {https://arxiv.org/abs/1803.07436},
    }

2017

  • A. Milioto, P. Lottes, and C. Stachniss, “Real-time Blob-wise Sugar Beets vs Weeds Classification for Monitoring Fields using Convolutional Neural Networks,” in Proceedings of the ISPRS Conference on Unmanned Aerial Vehicles in Geomatics (UAV-g), 2017.
    [BibTeX] [PDF]

    UAVs are becoming an important tool for field monitoring and precision farming. A prerequisite for observing and analyzing fields is the ability to identify crops and weeds from image data. In this paper, we address the problem of detecting the sugar beet plants and weeds in the field based solely on image data. We propose a system that combines vegetation detection and deep learning to obtain a high-quality classification of the vegetation in the field into value crops and weeds. We implemented and thoroughly evaluated our system on image data collected from different sugar beet fields and illustrate that our approach allows for accurately identifying the weeds on the field.

    @InProceedings{milioto2017uavg,
    title = {Real-time Blob-wise Sugar Beets vs Weeds Classification for Monitoring Fields using Convolutional Neural Networks},
    author = {A. Milioto and P. Lottes and C. Stachniss},
    booktitle = {Proceedings of the ISPRS Conference on Unmanned Aerial Vehicles in Geomatics (UAV-g)},
    year = {2017},
    abstract = {UAVs are becoming an important tool for field monitoring and precision farming. A prerequisite for observing and analyzing fields is the ability to identify crops and weeds from image data. In this paper, we address the problem of detecting the sugar beet plants and weeds in the field based solely on image data. We propose a system that combines vegetation detection and deep learning to obtain a high-quality classification of the vegetation in the field into value crops and weeds. We implemented and thoroughly evaluated our system on image data collected from different sugar beet fields and illustrate that our approach allows for accurately identifying the weeds on the field.},
    url = {http://www.ipb.uni-bonn.de/pdfs/milioto17uavg.pdf},
    }

Institute of Geodesy and Geoinformation (IGG)
Faculty of Agriculture
University of Bonn