Nived Chebrolu

PhD Student
Contact:
Email: nived.chebrolu@igg.uni-bonn.de
Tel: +49 – 228 – 73 – 29 03
Fax: +49 – 228 – 73 – 27 12
Office: Nussallee 15, 1.0G, room 1.007
Address:
University of Bonn
Photogrammetry, IGG
Nussallee 15
53115 Bonn

Research Interests

  • Agricultural Robotics
  • Simultaneous Localization and Mapping
  • Autonomous Robots
  • Long-term Registration

Short CV

Nived Chebrolu joined the Photogrammetry and Robotics Lab of the University of Bonn in September 2015. He successfully defended his Ph.D. thesis titled “Spatio-Temporal Registration Techniques for Agricultural Robots” in May 2021.

Nived received his Master’s degree in Robotics from Ecole Centrale de Nantes (ECN), France, and the University of Genoa, Italy, in 2015. He wrote his Master’s thesis on “Collaborative Visual SLAM” in Prof. Philippe Martinet’s group at ECN.

His research focuses on techniques for localization and mapping for robots in agricultural fields. He is interested in algorithms for long-term registration, as well as collaboration techniques between ground and aerial robots.

Full CV

Projects

FLOURISH (2015-2018), H2020 EU-Funded Project, University of Bonn.
Automated analysis and intervention for precision farming using autonomous robots.

Teaching

  • Techniques for Self-driving Cars [WS2020]
  • Advanced Techniques for Mobile Sensing and Robotics [WS2018/2019/2020]
  • Sensors and State Estimation [WS2018/2019/2020]

Publications

2024

  • M. V. R. Malladi, T. Guadagnino, L. Lobefaro, M. Mattamala, H. Griess, J. Schweier, N. Chebrolu, M. Fallon, J. Behley, and C. Stachniss, “Tree Instance Segmentation and Traits Estimation for Forestry Environments Exploiting LiDAR Data,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2024.
    [BibTeX] [PDF] [Video]
    @inproceedings{malladi2024icra,
    author = {M.V.R. Malladi and T. Guadagnino and L. Lobefaro and M. Mattamala and H. Griess and J. Schweier and N. Chebrolu and M. Fallon and J. Behley and C. Stachniss},
    title = {{Tree Instance Segmentation and Traits Estimation for Forestry Environments Exploiting LiDAR Data}},
    booktitle = icra,
    year = 2024,
    videourl = {https://youtu.be/14uuCxmfGco},
    }

2023

  • Y. Goel, N. Vaskevicius, L. Palmieri, N. Chebrolu, K. O. Arras, and C. Stachniss, “Semantically Informed MPC for Context-Aware Robot Exploration,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2023.
    [BibTeX] [PDF]
    @inproceedings{goel2023iros,
    author = {Y. Goel and N. Vaskevicius and L. Palmieri and N. Chebrolu and K.O. Arras and C. Stachniss},
    title = {{Semantically Informed MPC for Context-Aware Robot Exploration}},
    booktitle = iros,
    year = 2023,
    }

  • J. Weyler, F. Magistri, E. Marks, Y. L. Chong, M. Sodano, G. Roggiolani, N. Chebrolu, C. Stachniss, and J. Behley, “PhenoBench – A Large Dataset and Benchmarks for Semantic Image Interpretation in the Agricultural Domain,” arXiv preprint, vol. arXiv:2306.04557, 2023.
    [BibTeX] [PDF] [Code]
    @article{weyler2023arxiv,
    author = {Jan Weyler and Federico Magistri and Elias Marks and Yue Linn Chong and Matteo Sodano
    and Gianmarco Roggiolani and Nived Chebrolu and Cyrill Stachniss and Jens Behley},
    title = {{PhenoBench --- A Large Dataset and Benchmarks for Semantic Image Interpretation
    in the Agricultural Domain}},
    journal = {arXiv preprint},
    volume = {arXiv:2306.04557},
    year = {2023},
    codeurl = {https://github.com/PRBonn/phenobench}
    }

2022

  • Y. Goel, N. Vaskevicius, L. Palmieri, N. Chebrolu, and C. Stachniss, “Predicting Dense and Context-aware Cost Maps for Semantic Robot Navigation,” in IROS Workshop on Perception and Navigation for Autonomous Robotics in Unstructured and Dynamic Environments, 2022.
    [BibTeX] [PDF]
    @inproceedings{goel2022irosws,
    title = {{Predicting Dense and Context-aware Cost Maps for Semantic Robot Navigation}},
    author = {Y. Goel and N. Vaskevicius and L. Palmieri and N. Chebrolu and C. Stachniss},
    booktitle = {IROS Workshop on Perception and Navigation for Autonomous Robotics in Unstructured and Dynamic Environments},
    year = {2022},
    url = {https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/goel2022irosws.pdf},
    }

2021

  • A. Pretto, S. Aravecchia, W. Burgard, N. Chebrolu, C. Dornhege, T. Falck, F. Fleckenstein, A. Fontenla, M. Imperoli, R. Khanna, F. Liebisch, P. Lottes, A. Milioto, D. Nardi, S. Nardi, J. Pfeifer, M. Popovic, C. Potena, C. Pradalier, E. Rothacker-Feder, I. Sa, A. Schaefer, R. Siegwart, C. Stachniss, A. Walter, W. Winterhalter, X. Wu, and J. Nieto, “Building an Aerial-Ground Robotics System for Precision Farming: An Adaptable Solution,” IEEE Robotics & Automation Magazine, vol. 28, iss. 3, 2021.
    [BibTeX] [PDF]
    @Article{pretto2021ram,
    title = {{Building an Aerial-Ground Robotics System for Precision Farming: An Adaptable Solution}},
    author = {A. Pretto and S. Aravecchia and W. Burgard and N. Chebrolu and C. Dornhege and T. Falck and F. Fleckenstein and A. Fontenla and M. Imperoli and R. Khanna and F. Liebisch and P. Lottes and A. Milioto and D. Nardi and S. Nardi and J. Pfeifer and M. Popovic and C. Potena and C. Pradalier and E. Rothacker-Feder and I. Sa and A. Schaefer and R. Siegwart and C. Stachniss and A. Walter and W. Winterhalter and X. Wu and J. Nieto},
    journal = ram,
    volume = 28,
    number = 3,
    year = {2021},
    url={https://www.ipb.uni-bonn.de/pdfs/pretto2021ram.pdf}
    }

  • D. Schunck, F. Magistri, R. A. Rosu, A. Cornelißen, N. Chebrolu, S. Paulus, J. Léon, S. Behnke, C. Stachniss, H. Kuhlmann, and L. Klingbeil, “Pheno4D: A spatio-temporal dataset of maize and tomato plant point clouds for phenotyping and advanced plant analysis,” PLOS ONE, vol. 16, iss. 8, pp. 1-18, 2021. doi:10.1371/journal.pone.0256340
    [BibTeX] [PDF]

    Understanding the growth and development of individual plants is of central importance in modern agriculture, crop breeding, and crop science. To this end, using 3D data for plant analysis has gained attention over the last years. High-resolution point clouds offer the potential to derive a variety of plant traits, such as plant height, biomass, as well as the number and size of relevant plant organs. Periodically scanning the plants even allows for performing spatio-temporal growth analysis. However, highly accurate 3D point clouds from plants recorded at different growth stages are rare, and acquiring this kind of data is costly. Besides, advanced plant analysis methods from machine learning require annotated training data and thus generate intense manual labor before being able to perform an analysis. To address these issues, we present with this dataset paper a multi-temporal dataset featuring high-resolution registered point clouds of maize and tomato plants, which we manually labeled for computer vision tasks, such as for instance segmentation and 3D reconstruction, providing approximately 260 million labeled 3D points. To highlight the usability of the data and to provide baselines for other researchers, we show a variety of applications ranging from point cloud segmentation to non-rigid registration and surface reconstruction. We believe that our dataset will help to develop new algorithms to advance the research for plant phenotyping, 3D reconstruction, non-rigid registration, and deep learning on raw point clouds. The dataset is freely accessible at https://www.ipb.uni-bonn.de/data/pheno4d/.

    @article{schunck2021plosone,
    author = {D. Schunck and F. Magistri and R.A. Rosu and A. Corneli{\ss}en and N. Chebrolu and S. Paulus and J. L\'eon and S. Behnke and C. Stachniss and H. Kuhlmann and L. Klingbeil},
    title = {{Pheno4D: A spatio-temporal dataset of maize and tomato plant point clouds for phenotyping and advanced plant analysis}},
    journal = plosone,
    year = 2021,
    url = {https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0256340&type=printable},
    volume = {16},
    number = {8},
    doi = {10.1371/journal.pone.0256340},
    pages = {1-18},
    abstract = {Understanding the growth and development of individual plants is of central importance in modern agriculture, crop breeding, and crop science. To this end, using 3D data for plant analysis has gained attention over the last years. High-resolution point clouds offer the potential to derive a variety of plant traits, such as plant height, biomass, as well as the number and size of relevant plant organs. Periodically scanning the plants even allows for performing spatio-temporal growth analysis. However, highly accurate 3D point clouds from plants recorded at different growth stages are rare, and acquiring this kind of data is costly. Besides, advanced plant analysis methods from machine learning require annotated training data and thus generate intense manual labor before being able to perform an analysis. To address these issues, we present with this dataset paper a multi-temporal dataset featuring high-resolution registered point clouds of maize and tomato plants, which we manually labeled for computer vision tasks, such as for instance segmentation and 3D reconstruction, providing approximately 260 million labeled 3D points. To highlight the usability of the data and to provide baselines for other researchers, we show a variety of applications ranging from point cloud segmentation to non-rigid registration and surface reconstruction. We believe that our dataset will help to develop new algorithms to advance the research for plant phenotyping, 3D reconstruction, non-rigid registration, and deep learning on raw point clouds. The dataset is freely accessible at https://www.ipb.uni-bonn.de/data/pheno4d/.},
    }
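
    As a quick illustration of how the Pheno4D point clouds described in the entry above might be consumed, here is a minimal Python sketch that loads one labeled scan and counts the points per label. It assumes each scan is a plain-text file with one "x y z label" row per point and uses a hypothetical file name; the actual layout of the released dataset may differ, so treat this as a sketch rather than an official loader.

    # Minimal sketch for reading a Pheno4D-style labeled point cloud.
    # Assumption: each scan is a plain-text file with one "x y z label" row
    # per point; the real release layout may differ.
    import numpy as np

    def load_labeled_scan(path):
        """Return an (N, 3) array of points and an (N,) array of integer labels."""
        data = np.loadtxt(path)            # columns assumed to be: x, y, z, label
        points = data[:, :3]
        labels = data[:, 3].astype(int)
        return points, labels

    def per_label_counts(labels):
        """Count points per label, e.g. soil vs. stem vs. individual leaves."""
        values, counts = np.unique(labels, return_counts=True)
        return dict(zip(values.tolist(), counts.tolist()))

    if __name__ == "__main__":
        # "maize_plant1_day1.txt" is a placeholder name, not a file from the dataset.
        points, labels = load_labeled_scan("maize_plant1_day1.txt")
        print(points.shape, per_label_counts(labels))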

  • N. Chebrolu, “Spatio-Temporal Registration Techniques for Agricultural Robots,” PhD Thesis, University of Bonn, 2021.
    [BibTeX] [PDF]
    @PhdThesis{chebrolu2021phd,
    author = {N. Chebrolu},
    title = {Spatio-Temporal Registration Techniques for Agricultural Robots},
    year = 2021,
    school = {University of Bonn},
    URL = {https://hdl.handle.net/20.500.11811/9166},
    }

  • F. Magistri, N. Chebrolu, J. Behley, and C. Stachniss, “Towards In-Field Phenotyping Exploiting Differentiable Rendering with Self-Consistency Loss,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2021.
    [BibTeX] [PDF] [Video]
    @inproceedings{magistri2021icra,
    author = {F. Magistri and N. Chebrolu and J. Behley and C. Stachniss},
    title = {{Towards In-Field Phenotyping Exploiting Differentiable Rendering with Self-Consistency Loss}},
    booktitle = icra,
    year = 2021,
    videourl = {https://youtu.be/MF2A4ihY2lE},
    }

  • I. Vizzo, X. Chen, N. Chebrolu, J. Behley, and C. Stachniss, “Poisson Surface Reconstruction for LiDAR Odometry and Mapping,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2021.
    [BibTeX] [PDF] [Code] [Video]
    @inproceedings{vizzo2021icra,
    author = {I. Vizzo and X. Chen and N. Chebrolu and J. Behley and C. Stachniss},
    title = {{Poisson Surface Reconstruction for LiDAR Odometry and Mapping}},
    booktitle = icra,
    year = 2021,
    url = {https://www.ipb.uni-bonn.de/pdfs/vizzo2021icra.pdf},
    codeurl = {https://github.com/PRBonn/puma},
    videourl = {https://youtu.be/7yWtYWaO5Nk}
    }

  • N. Chebrolu, T. Läbe, O. Vysotska, J. Behley, and C. Stachniss, “Adaptive Robust Kernels for Non-Linear Least Squares Problems,” IEEE Robotics and Automation Letters (RA-L), vol. 6, pp. 2240-2247, 2021. doi:10.1109/LRA.2021.3061331
    [BibTeX] [PDF] [Video]
    @article{chebrolu2021ral,
    author = {N. Chebrolu and T. L\"{a}be and O. Vysotska and J. Behley and C. Stachniss},
    title = {{Adaptive Robust Kernels for Non-Linear Least Squares Problems}},
    journal = ral,
    volume = 6,
    number = 2,
    pages = {2240-2247},
    doi = {10.1109/LRA.2021.3061331},
    year = 2021,
    videourl = {https://youtu.be/34Zp3ZX0Bnk}
    }

  • N. Chebrolu, F. Magistri, T. Läbe, and C. Stachniss, “Registration of Spatio-Temporal Point Clouds of Plants for Phenotyping,” PLOS ONE, vol. 16, iss. 2, 2021.
    [BibTeX] [PDF] [Video]
    @article{chebrolu2021plosone,
    author = {N. Chebrolu and F. Magistri and T. L{\"a}be and C. Stachniss},
    title = {{Registration of Spatio-Temporal Point Clouds of Plants for Phenotyping}},
    journal = plosone,
    year = 2021,
    volume = 16,
    number = 2,
    videourl = {https://youtu.be/OV39kb5Nqg8},
    }

2020

  • F. Magistri, N. Chebrolu, and C. Stachniss, “Segmentation-Based 4D Registration of Plants Point Clouds for Phenotyping,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2020.
    [BibTeX] [PDF] [Video]
    @inproceedings{magistri2020iros,
    author = {F. Magistri and N. Chebrolu and C. Stachniss},
    title = {{Segmentation-Based 4D Registration of Plants Point Clouds for Phenotyping}},
    booktitle = iros,
    year = {2020},
    url={https://www.ipb.uni-bonn.de/pdfs/magistri2020iros.pdf},
    videourl = {https://youtu.be/OV39kb5Nqg8},
    }

  • N. Chebrolu, T. Laebe, O. Vysotska, J. Behley, and C. Stachniss, “Adaptive Robust Kernels for Non-Linear Least Squares Problems,” arXiv preprint, 2020.
    [BibTeX] [PDF]
    @article{chebrolu2020arxiv,
    title={Adaptive Robust Kernels for Non-Linear Least Squares Problems},
    author={N. Chebrolu and T. Laebe and O. Vysotska and J. Behley and C. Stachniss},
    journal = arxiv,
    year=2020,
    eprint={2004.14938},
    keywords={cs.RO},
    url={https://arxiv.org/pdf/2004.14938v2}
    }

  • P. Lottes, J. Behley, N. Chebrolu, A. Milioto, and C. Stachniss, “Robust joint stem detection and crop-weed classification using image sequences for plant-specific treatment in precision farming,” Journal of Field Robotics, vol. 37, iss. 1, pp. 20-34, 2020. doi:10.1002/rob.21901
    [BibTeX] [PDF]
    @Article{lottes2020jfr,
    title = {Robust joint stem detection and crop-weed classification using image sequences for plant-specific treatment in precision farming},
    author = {Lottes, P. and Behley, J. and Chebrolu, N. and Milioto, A. and Stachniss, C.},
    journal = jfr,
    volume = {37},
    number = {1},
    pages = {20-34},
    year = {2020},
    doi = {10.1002/rob.21901},
    url = {https://www.ipb.uni-bonn.de/pdfs/lottes2019jfr.pdf},
    }

  • N. Chebrolu, T. Laebe, and C. Stachniss, “Spatio-Temporal Non-Rigid Registration of 3D Point Clouds of Plants,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2020.
    [BibTeX] [PDF] [Video]
    @InProceedings{chebrolu2020icra,
    title = {Spatio-Temporal Non-Rigid Registration of 3D Point Clouds of Plants},
    author = {N. Chebrolu and T. Laebe and C. Stachniss},
    booktitle = icra,
    year = {2020},
    url = {https://www.ipb.uni-bonn.de/pdfs/chebrolu2020icra.pdf},
    videourl = {https://www.youtube.com/watch?v=uGkep_aelBc},
    }

  • A. Ahmadi, L. Nardi, N. Chebrolu, and C. Stachniss, “Visual Servoing-based Navigation for Monitoring Row-Crop Fields,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2020.
    [BibTeX] [PDF] [Code] [Video]
    @InProceedings{ahmadi2020icra,
    title = {Visual Servoing-based Navigation for Monitoring Row-Crop Fields},
    author = {A. Ahmadi and L. Nardi and N. Chebrolu and C. Stachniss},
    booktitle = icra,
    year = {2020},
    url = {https://arxiv.org/pdf/1909.12754},
    codeurl = {https://github.com/PRBonn/visual-crop-row-navigation},
    videourl = {https://youtu.be/0qg6n4sshHk},
    }

2019

  • A. Pretto, S. Aravecchia, W. Burgard, N. Chebrolu, C. Dornhege, T. Falck, F. Fleckenstein, A. Fontenla, M. Imperoli, R. Khanna, F. Liebisch, P. Lottes, A. Milioto, D. Nardi, S. Nardi, J. Pfeifer, M. Popović, C. Potena, C. Pradalier, E. Rothacker-Feder, I. Sa, A. Schaefer, R. Siegwart, C. Stachniss, A. Walter, W. Winterhalter, X. Wu, and J. Nieto, “Building an Aerial-Ground Robotics System for Precision Farming,” arXiv preprint, 2019.
    [BibTeX] [PDF]
    @article{pretto2019arxiv,
    author = {A. Pretto and S. Aravecchia and W. Burgard and N. Chebrolu and C. Dornhege and T. Falck and F. Fleckenstein and A. Fontenla and M. Imperoli and R. Khanna and F. Liebisch and P. Lottes and A. Milioto and D. Nardi and S. Nardi and J. Pfeifer and M. Popović and C. Potena and C. Pradalier and E. Rothacker-Feder and I. Sa and A. Schaefer and R. Siegwart and C. Stachniss and A. Walter and W. Winterhalter and X. Wu and J. Nieto},
    title = {{Building an Aerial-Ground Robotics System for Precision Farming}},
    journal = arxiv,
    year = 2019,
    eprint = {1911.03098v1},
    url = {https://arxiv.org/pdf/1911.03098v1},
    keywords = {cs.RO},
    }

  • N. Chebrolu, P. Lottes, T. Laebe, and C. Stachniss, “Robot Localization Based on Aerial Images for Precision Agriculture Tasks in Crop Fields,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2019.
    [BibTeX] [PDF] [Video]
    @InProceedings{chebrolu2019icra,
    author = {N. Chebrolu and P. Lottes and T. Laebe and C. Stachniss},
    title = {{Robot Localization Based on Aerial Images for Precision Agriculture Tasks in Crop Fields}},
    booktitle = icra,
    year = 2019,
    url = {https://www.ipb.uni-bonn.de/pdfs/chebrolu2019icra.pdf},
    videourl = {https://youtu.be/TlijLgoRLbc},
    }

  • P. Lottes, N. Chebrolu, F. Liebisch, and C. Stachniss, “UAV-based Field Monitoring for Precision Farming,” in Proc. of the 25th Workshop für Computer-Bildanalyse und unbemannte autonom fliegende Systeme in der Landwirtschaft, 2019.
    [BibTeX] [PDF]
    @InProceedings{lottes2019cbaws,
    title={UAV-based Field Monitoring for Precision Farming},
    author={P. Lottes and N. Chebrolu and F. Liebisch and C. Stachniss},
    booktitle= {Proc. of the 25th Workshop f\"ur Computer-Bildanalyse und unbemannte autonom fliegende Systeme in der Landwirtschaft},
    year= {2019},
    url= {https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/lottes2019cbaws.pdf},
    }

2018

  • N. Chebrolu, T. Läbe, and C. Stachniss, “Robust Long-Term Registration of UAV Images of Crop Fields for Precision Agriculture,” IEEE Robotics and Automation Letters (RA-L), vol. 3, iss. 4, pp. 3097-3104, 2018. doi:10.1109/LRA.2018.2849603
    [BibTeX] [PDF]
    @Article{chebrolu2018ral,
    author={N. Chebrolu and T. L\"abe and C. Stachniss},
    journal=ral,
    title={Robust Long-Term Registration of UAV Images of Crop Fields for Precision Agriculture},
    year={2018},
    volume={3},
    number={4},
    pages={3097-3104},
    keywords={Agriculture;Cameras;Geometry;Monitoring;Robustness;Three-dimensional displays;Visualization;Robotics in agriculture and forestry;SLAM},
    doi={10.1109/LRA.2018.2849603},
    url={https://www.ipb.uni-bonn.de/pdfs/chebrolu2018ral.pdf}
    }

  • P. Lottes, J. Behley, N. Chebrolu, A. Milioto, and C. Stachniss, “Joint Stem Detection and Crop-Weed Classification for Plant-specific Treatment in Precision Farming,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2018.
    [BibTeX] [PDF] [Video]

    Applying agrochemicals is the default procedure for conventional weed control in crop production, but has negative impacts on the environment. Robots have the potential to treat every plant in the field individually and thus can reduce the required use of such chemicals. To achieve that, robots need the ability to identify crops and weeds in the field and must additionally select effective treatments. While certain types of weed can be treated mechanically, other types need to be treated by (selective) spraying. In this paper, we present an approach that provides the necessary information for effective plant-specific treatment. It outputs the stem location for weeds, which allows for mechanical treatments, and the covered area of the weed for selective spraying. Our approach uses an end-to-end trainable fully convolutional network that simultaneously estimates stem positions as well as the covered area of crops and weeds. It jointly learns the class-wise stem detection and the pixel-wise semantic segmentation. Experimental evaluations on different real-world datasets show that our approach is able to reliably solve this problem. Compared to state-of-the-art approaches, our approach not only substantially improves the stem detection accuracy, i.e., distinguishing crop and weed stems, but also provides an improvement in the semantic segmentation performance.

    @InProceedings{lottes2018iros,
    author = {P. Lottes and J. Behley and N. Chebrolu and A. Milioto and C. Stachniss},
    title = {Joint Stem Detection and Crop-Weed Classification for Plant-specific Treatment in Precision Farming},
    booktitle = iros,
    year = 2018,
    url = {https://www.ipb.uni-bonn.de/pdfs/lottes18iros.pdf},
    videourl = {https://www.youtube.com/watch?v=C9mjZxE_Sxg},
    abstract = {Applying agrochemicals is the default procedure for conventional weed control in crop production, but has negative impacts on the environment. Robots have the potential to treat every plant in the field individually and thus can reduce the required use of such chemicals. To achieve that, robots need the ability to identify crops and weeds in the field and must additionally select effective treatments. While certain types of weed can be treated mechanically, other types need to be treated by (selective) spraying. In this paper, we present an approach that provides the necessary information for effective plant-specific treatment. It outputs the stem location for weeds, which allows for mechanical treatments, and the covered area of the weed for selective spraying. Our approach uses an end-to-end trainable fully convolutional network that simultaneously estimates stem positions as well as the covered area of crops and weeds. It jointly learns the class-wise stem detection and the pixel-wise semantic segmentation. Experimental evaluations on different real-world datasets show that our approach is able to reliably solve this problem. Compared to state-of-the-art approaches, our approach not only substantially improves the stem detection accuracy, i.e., distinguishing crop and weed stems, but also provides an improvement in the semantic segmentation performance.}
    }
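
    The abstract above describes a fully convolutional network that jointly predicts pixel-wise crop-weed segmentation and stem locations. The following Python sketch only illustrates the general idea of such a two-head architecture; it is not the authors' network, and the layer sizes, class counts, and the name JointStemSegNet are illustrative assumptions.

    # Sketch of a two-head fully convolutional network for joint pixel-wise
    # segmentation and stem detection. NOT the architecture from the paper;
    # all sizes and class counts are placeholders.
    import torch
    import torch.nn as nn

    class JointStemSegNet(nn.Module):
        def __init__(self, num_seg_classes=3, num_stem_classes=2):
            super().__init__()
            # Shared encoder: two strided convolution blocks (downsampling by 4).
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            )
            # Shared decoder: upsample back to the input resolution.
            self.decoder = nn.Sequential(
                nn.ConvTranspose2d(64, 32, 2, stride=2), nn.ReLU(inplace=True),
                nn.ConvTranspose2d(32, 32, 2, stride=2), nn.ReLU(inplace=True),
            )
            # Task heads: semantic segmentation (e.g. soil/crop/weed) and
            # per-pixel stem scores (e.g. crop stem / weed stem).
            self.seg_head = nn.Conv2d(32, num_seg_classes, 1)
            self.stem_head = nn.Conv2d(32, num_stem_classes, 1)

        def forward(self, x):
            features = self.decoder(self.encoder(x))
            return self.seg_head(features), self.stem_head(features)

    if __name__ == "__main__":
        net = JointStemSegNet()
        image = torch.randn(1, 3, 256, 256)          # dummy RGB input
        seg_logits, stem_logits = net(image)
        print(seg_logits.shape, stem_logits.shape)   # (1, 3, 256, 256), (1, 2, 256, 256)

    Because both heads share the encoder and decoder, a segmentation loss and a stem-detection loss can be optimized together, which mirrors the joint learning idea described in the abstract.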

2017

  • N. Chebrolu, P. Lottes, A. Schaefer, W. Winterhalter, W. Burgard, and C. Stachniss, “Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields,” The Intl. Journal of Robotics Research, 2017. doi:10.1177/0278364917720510
    [BibTeX] [PDF]
    @Article{chebrolu2017ijrr,
    title = {Agricultural robot dataset for plant classification, localization and mapping on sugar beet fields},
    author = {N. Chebrolu and P. Lottes and A. Schaefer and W. Winterhalter and W. Burgard and C. Stachniss},
    journal = ijrr,
    year = {2017},
    doi = {10.1177/0278364917720510},
    url = {https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/chebrolu2017ijrr.pdf},
    }