Yue Pan

Ph.D. Student
Contact:
Email: yue.pan@igg.uni-bonn.de
Tel: +49 228 73 2905
Fax: +49 228 73 2712
Office: Nussallee 15, 1st floor, room 1.003
Address:
University of Bonn
Photogrammetry, IGG
Nussallee 15
53115 Bonn

Profiles: Google Scholar | GitHub | LinkedIn

Short CV

Yue Pan has been a PhD student at the University of Bonn since June 2022. He received his Master's degree in Geomatics from ETH Zurich in 2022 and his Bachelor's degree in Geomatics Engineering in 2019.

Research Interests

  • SLAM
  • 3D Reconstruction
  • Navigation

Projects

  • PhenoRob – Robotics and Phenotyping for Sustainable Crop Production (DFG Cluster of Excellence)

Awards

  • Geosuisse/IGS prize 2022 for an excellent Master's thesis and academic record in Geomatics

Publications

2024

  • X. Zhong, Y. Pan, C. Stachniss, and J. Behley, “3D LiDAR Mapping in Dynamic Environments using a 4D Implicit Neural Representation,” in Proc. of the IEEE/CVF Conf. on Computer Vision and Pattern Recognition (CVPR), 2024.
    [BibTeX] [PDF]
    @inproceedings{zhong2024cvpr,
    author = {X. Zhong and Y. Pan and C. Stachniss and J. Behley},
    title = {{3D LiDAR Mapping in Dynamic Environments using a 4D Implicit Neural Representation}},
    booktitle = cvpr,
    year = 2024,
    }
  • Y. Pan, X. Zhong, L. Wiesmann, T. Posewsky, J. Behley, and C. Stachniss, “PIN-SLAM: LiDAR SLAM Using a Point-Based Implicit Neural Representation for Achieving Global Map Consistency,” arXiv preprint, vol. arXiv:2401.09101, 2024.
    [BibTeX] [PDF] [Code]
    Accurate and robust localization and mapping are essential components for most autonomous robots. In this paper, we propose a SLAM system for building globally consistent maps, called PIN-SLAM, that is based on an elastic and compact point-based implicit neural map representation. Taking range measurements as input, our approach alternates between incremental learning of the local implicit signed distance field and the pose estimation given the current local map using a correspondence-free, point-to-implicit model registration. Our implicit map is based on sparse optimizable neural points, which are inherently elastic and deformable with the global pose adjustment when closing a loop. Loops are also detected using the neural point features. Extensive experiments validate that PIN-SLAM is robust to various environments and versatile to different range sensors such as LiDAR and RGB-D cameras. PIN-SLAM achieves pose estimation accuracy better or on par with the state-of-the-art LiDAR odometry or SLAM systems and outperforms the recent neural implicit SLAM approaches while maintaining a more consistent, and highly compact implicit map that can be reconstructed as accurate and complete meshes. Finally, thanks to the voxel hashing for efficient neural points indexing and the fast implicit map-based registration without closest point association, PIN-SLAM can run at the sensor frame rate on a moderate GPU. Codes will be available at: https://github.com/PRBonn/PIN_SLAM.
    @article{pan2024arxiv,
    author = {Y. Pan and X. Zhong and L. Wiesmann and T. Posewsky and J. Behley and C. Stachniss},
    title = {{PIN-SLAM: LiDAR SLAM Using a Point-Based Implicit Neural Representation for Achieving Global Map Consistency}},
    journal = arxiv,
    year = 2024,
    volume = {arXiv:2401.09101},
    url = {http://arxiv.org/pdf/2401.09101v1},
    abstract = {Accurate and robust localization and mapping are essential components for most autonomous robots. In this paper, we propose a SLAM system for building globally consistent maps, called PIN-SLAM, that is based on an elastic and compact point-based implicit neural map representation. Taking range measurements as input, our approach alternates between incremental learning of the local implicit signed distance field and the pose estimation given the current local map using a correspondence-free, point-to-implicit model registration. Our implicit map is based on sparse optimizable neural points, which are inherently elastic and deformable with the global pose adjustment when closing a loop. Loops are also detected using the neural point features. Extensive experiments validate that PIN-SLAM is robust to various environments and versatile to different range sensors such as LiDAR and RGB-D cameras. PIN-SLAM achieves pose estimation accuracy better or on par with the state-of-the-art LiDAR odometry or SLAM systems and outperforms the recent neural implicit SLAM approaches while maintaining a more consistent, and highly compact implicit map that can be reconstructed as accurate and complete meshes. Finally, thanks to the voxel hashing for efficient neural points indexing and the fast implicit map-based registration without closest point association, PIN-SLAM can run at the sensor frame rate on a moderate GPU. Codes will be available at: https://github.com/PRBonn/PIN_SLAM.},
    codeurl = {https://github.com/PRBonn/PIN_SLAM}
    }
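
The PIN-SLAM abstract above hinges on a correspondence-free, point-to-implicit registration: instead of searching for closest-point pairs as in ICP, each scan point is transformed by the current pose estimate and its signed distance under the implicit map is driven to zero. Below is a minimal PyTorch sketch of that idea, not the actual PIN-SLAM implementation: the sdf callable stands in for the neural-point distance field, the function names are made up for illustration, and a generic first-order optimizer replaces the paper's solver.

    import torch

    def axis_angle_to_matrix(w):
        # Rodrigues' formula: 3-vector axis-angle -> 3x3 rotation matrix.
        theta = torch.sqrt(w.pow(2).sum() + 1e-12)  # smooth at zero rotation
        k = w / theta
        zero = torch.zeros((), dtype=w.dtype)
        K = torch.stack([
            torch.stack([zero, -k[2], k[1]]),
            torch.stack([k[2], zero, -k[0]]),
            torch.stack([-k[1], k[0], zero]),
        ])
        return torch.eye(3) + torch.sin(theta) * K + (1.0 - torch.cos(theta)) * (K @ K)

    def register_point_to_implicit(sdf, scan, xi_init, iters=100, lr=1e-2):
        # Correspondence-free registration: drive every scan point onto the
        # zero level set of the map's signed distance field. The SDF value
        # itself is the residual, so no closest-point association is needed.
        #   sdf     : callable, (N, 3) world points -> (N,) signed distances
        #             (hypothetical stand-in for the neural-point map)
        #   scan    : (N, 3) tensor of points in the sensor frame
        #   xi_init : 6-vector pose guess [axis-angle rotation | translation]
        xi = xi_init.clone().requires_grad_(True)
        opt = torch.optim.Adam([xi], lr=lr)
        for _ in range(iters):
            opt.zero_grad()
            R = axis_angle_to_matrix(xi[:3])
            pts_world = scan @ R.T + xi[3:]       # scan points in the map frame
            loss = sdf(pts_world).pow(2).mean()   # squared distance to surface
            loss.backward()
            opt.step()
        return xi.detach()

    # Toy sanity check with an analytic SDF (unit sphere at the origin):
    # points sampled on the sphere and shifted by 0.1 per axis, so the
    # recovered translation should be roughly (-0.1, -0.1, -0.1).
    sphere_sdf = lambda p: p.norm(dim=1) - 1.0
    scan = torch.randn(500, 3)
    scan = scan / scan.norm(dim=1, keepdim=True) + 0.1
    pose = register_point_to_implicit(sphere_sdf, scan, torch.zeros(6))

In PIN-SLAM itself this registration alternates with incremental learning of the local distance field, and the neural points deform with the pose graph at loop closure; the repository linked above contains the real implementation.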

2023

  • Y. Pan, F. Magistri, T. Läbe, E. Marks, C. Smitt, C. S. McCool, J. Behley, and C. Stachniss, “Panoptic Mapping with Fruit Completion and Pose Estimation for Horticultural Robots,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2023.
    [BibTeX] [PDF] [Code] [Video]
    @inproceedings{pan2023iros,
    author = {Y. Pan and F. Magistri and T. L\"abe and E. Marks and C. Smitt and C.S. McCool and J. Behley and C. Stachniss},
    title = {{Panoptic Mapping with Fruit Completion and Pose Estimation for Horticultural Robots}},
    booktitle = iros,
    year = 2023,
    codeurl = {https://github.com/PRBonn/HortiMapping},
    videourl = {https://youtu.be/fSyHBhskjqA}
    }
  • L. Wiesmann, T. Guadagnino, I. Vizzo, N. Zimmerman, Y. Pan, H. Kuang, J. Behley, and C. Stachniss, “LocNDF: Neural Distance Field Mapping for Robot Localization,” IEEE Robotics and Automation Letters (RA-L), vol. 8, iss. 8, pp. 4999–5006, 2023. doi:10.1109/LRA.2023.3291274
    [BibTeX] [PDF] [Code] [Video]
    @article{wiesmann2023ral-icra,
    author = {L. Wiesmann and T. Guadagnino and I. Vizzo and N. Zimmerman and Y. Pan and H. Kuang and J. Behley and C. Stachniss},
    title = {{LocNDF: Neural Distance Field Mapping for Robot Localization}},
    journal = ral,
    volume = {8},
    number = {8},
    pages = {4999--5006},
    year = 2023,
    url = {https://www.ipb.uni-bonn.de/wp-content/papercite-data/pdf/wiesmann2023ral-icra.pdf},
    issn = {2377-3766},
    doi = {10.1109/LRA.2023.3291274},
    codeurl = {https://github.com/PRBonn/LocNDF},
    videourl = {https://youtu.be/-0idH21BpMI},
    }
  • X. Zhong, Y. Pan, J. Behley, and C. Stachniss, “SHINE-Mapping: Large-Scale 3D Mapping Using Sparse Hierarchical Implicit Neural Representations,” in Proc. of the IEEE Intl. Conf. on Robotics & Automation (ICRA), 2023.
    [BibTeX] [PDF] [Code] [Video]
    @inproceedings{zhong2023icra,
    author = {Zhong, Xingguang and Pan, Yue and Behley, Jens and Stachniss, Cyrill},
    title = {{SHINE-Mapping: Large-Scale 3D Mapping Using Sparse Hierarchical Implicit Neural Representations}},
    booktitle = icra,
    year = 2023,
    codeurl = {https://github.com/PRBonn/SHINE_mapping},
    videourl = {https://youtu.be/jRqIupJgQZE},
    }

2022

  • Y. Pan, Y. Kompis, L. Bartolomei, R. Mascaro, C. Stachniss, and M. Chli, “Voxfield: Non-Projective Signed Distance Fields for Online Planning and 3D Reconstruction,” in Proc. of the IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems (IROS), 2022.
    [BibTeX] [PDF] [Code] [Video]
    @inproceedings{pan2022iros,
    title = {{Voxfield: Non-Projective Signed Distance Fields for Online Planning and 3D Reconstruction}},
    author = {Y. Pan and Y. Kompis and L. Bartolomei and R. Mascaro and C. Stachniss and M. Chli},
    booktitle = iros,
    year = {2022},
    codeurl = {https://github.com/VIS4ROB-lab/voxfield},
    videourl = {https://youtu.be/JS_yeq-GR4A},
    }