TY - JOUR
T1 - GCENet
T2 - A geometric correspondence estimation network for tracking and loop detection in visual–inertial SLAM
AU - Zhou, Jichao
AU - Shen, Jiwei
AU - Lyu, Shujing
AU - Lu, Yue
N1 - Publisher Copyright:
© 2024
PY - 2025/3/1
Y1 - 2025/3/1
N2 - Establishing robust and effective data correlation has been one of the core problems in visual-based SLAM (Simultaneous Localization and Mapping). In this paper, we propose a geometric correspondence estimation network, GCENet, tailored for visual tracking and loop detection in visual–inertial SLAM. GCENet considers both local and global correlation in frames, enabling deep feature matching in scenarios involving noticeable displacement. Building upon this, we introduce a tightly-coupled visual–inertial state estimation system. To address challenges in extreme environments, such as strong illumination and weak texture, where manual feature matching tends to fail, a compensatory deep optical flow tracker is incorporated into our system. In such cases, our approach utilizes GCENet for dense optical flow tracking, replacing manual pipelines to conduct visual tracking. Furthermore, a deep loop detector based on GCENet is constructed, which utilizes the estimated flow to represent scene similarity. Spatial consistency discrimination on candidate loops is conducted with GCENet to establish long-term data association, effectively suppressing false negatives and false positives in loop closure. Dedicated experiments are conducted on the EuRoC drone, TUM-4Seasons, and private robot datasets to evaluate the proposed method. The results demonstrate that our system exhibits superior robustness and accuracy in extreme environments compared to state-of-the-art methods.
AB - Establishing robust and effective data correlation has been one of the core problems in visual-based SLAM (Simultaneous Localization and Mapping). In this paper, we propose a geometric correspondence estimation network, GCENet, tailored for visual tracking and loop detection in visual–inertial SLAM. GCENet considers both local and global correlation in frames, enabling deep feature matching in scenarios involving noticeable displacement. Building upon this, we introduce a tightly-coupled visual–inertial state estimation system. To address challenges in extreme environments, such as strong illumination and weak texture, where manual feature matching tends to fail, a compensatory deep optical flow tracker is incorporated into our system. In such cases, our approach utilizes GCENet for dense optical flow tracking, replacing manual pipelines to conduct visual tracking. Furthermore, a deep loop detector based on GCENet is constructed, which utilizes the estimated flow to represent scene similarity. Spatial consistency discrimination on candidate loops is conducted with GCENet to establish long-term data association, effectively suppressing false negatives and false positives in loop closure. Dedicated experiments are conducted on the EuRoC drone, TUM-4Seasons, and private robot datasets to evaluate the proposed method. The results demonstrate that our system exhibits superior robustness and accuracy in extreme environments compared to state-of-the-art methods.
KW - Geometric correspondence estimation
KW - Loop closure detection
KW - Visual tracking
KW - Visual–inertial SLAM
UR - https://www.scopus.com/pages/publications/85208196934
U2 - 10.1016/j.eswa.2024.125659
DO - 10.1016/j.eswa.2024.125659
M3 - Article
AN - SCOPUS:85208196934
SN - 0957-4174
VL - 262
JO - Expert Systems with Applications
JF - Expert Systems with Applications
M1 - 125659
ER -