Despite recent advances in semantic Simultaneous Localization and Mapping (SLAM) for terrestrial and aerial applications, underwater semantic SLAM remains an open and largely unaddressed research problem due to the unique sensing modalities and object classes found underwater. This paper presents a semantic SLAM method for underwater environments that can identify, localize, classify, and map a wide variety of marine objects without a priori knowledge of the scene's object makeup. The method performs unsupervised object segmentation and object-level feature aggregation, then uses opti-acoustic sensor fusion for object localization, with probabilistic data association and graphical models for back-end inference. Indoor and outdoor underwater datasets containing a wide variety of objects under challenging acoustic and lighting conditions were collected for evaluation and are made publicly available. Quantitative and qualitative results show that the proposed method achieves lower trajectory error than baseline methods, and attains map accuracy comparable to that of a closed-set baseline that requires hand-labeled data for all objects in the scene.
Opti-Acoustic Semantic SLAM with Unknown Objects in Underwater Environments
Kurran Singh and Jungseok Hong and Nicholas R. Rypkema and John J. Leonard
@inproceedings{singh2024opensetslam,
title={Opti-Acoustic Semantic SLAM with Unknown Objects in Underwater Environments},
author={Kurran Singh and Jungseok Hong and Nicholas R. Rypkema and John J. Leonard},
booktitle={arXiv Preprint},
year={2024}
}
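The abstract mentions probabilistic data association for matching observed objects to mapped landmarks. As a minimal illustration (not the paper's implementation; the landmark positions, noise model, and threshold below are hypothetical), one common formulation assigns each observation to the landmark with the highest measurement likelihood, spawning a new landmark when every likelihood falls below a threshold:

```python
import math

def gaussian_likelihood(obs, landmark, sigma):
    """Isotropic 2-D Gaussian likelihood of an observation given a landmark."""
    d2 = (obs[0] - landmark[0]) ** 2 + (obs[1] - landmark[1]) ** 2
    return math.exp(-d2 / (2 * sigma ** 2)) / (2 * math.pi * sigma ** 2)

def associate(obs, landmarks, sigma=0.5, new_landmark_threshold=1e-3):
    """Return the index of the most likely landmark, or None for a new one.

    Illustrative sketch only: real opti-acoustic SLAM back-ends would use the
    full measurement covariance and semantic class likelihoods as well.
    """
    if not landmarks:
        return None
    likelihoods = [gaussian_likelihood(obs, lm, sigma) for lm in landmarks]
    best = max(range(len(landmarks)), key=lambda i: likelihoods[i])
    return best if likelihoods[best] >= new_landmark_threshold else None

landmarks = [(0.0, 0.0), (5.0, 5.0)]
print(associate((0.2, -0.1), landmarks))   # near landmark 0 -> 0
print(associate((20.0, 20.0), landmarks))  # far from all -> None (new landmark)
```

In a full system this decision would typically be made jointly over many observations and hypotheses rather than greedily per observation.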