Online Self-Calibration for Visual-Inertial Navigation: Models, Analysis, and Degeneracy

Author(s): Yang, Yulin
Author(s): Geneva, Patrick
Author(s): Zuo, Xingxing
Author(s): Huang, Guoquan
Date Accessioned: 2023-09-01T15:59:18Z
Date Available: 2023-09-01T15:59:18Z
Publication Date: 2023-06-07
Description: This article was originally published in IEEE Transactions on Robotics. The version of record is available at: https://doi.org/10.1109/TRO.2023.3275878. © 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. This article will be embargoed until 06/07/2025.
Abstract: As sensor calibration plays an important role in visual-inertial sensor fusion, this article performs an in-depth investigation of online self-calibration for robust and accurate visual-inertial state estimation. To this end, we first conduct a complete observability analysis for visual-inertial navigation systems (VINS) with full calibration of sensing parameters, including inertial measurement unit (IMU)/camera intrinsics and IMU-camera spatial-temporal extrinsic calibration, along with the readout time of rolling-shutter (RS) cameras (if used). We study different inertial model variants containing intrinsic parameters that encompass the most commonly used models for low-cost inertial sensors. With these models, the observability analysis of linearized VINS with full sensor calibration is performed. Our analysis theoretically proves the intuition commonly assumed in the literature—that is, VINS with full sensor calibration has four unobservable directions, corresponding to the system's global yaw and position, while all sensor calibration parameters are observable given fully excited motions. Moreover, we, for the first time, identify degenerate motion primitives for IMU and camera intrinsic calibration, which, when combined, may produce complex degenerate motions. We compare the proposed online self-calibration on commonly used IMUs against the state-of-the-art offline calibration toolbox Kalibr, showing that the proposed system achieves better consistency and repeatability. Based on our analysis and experimental evaluations, we also offer practical guidelines for effectively performing online IMU-camera self-calibration in practice.
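The abstract's central claim concerns the nullspace of the linearized system's observability matrix: fully calibrated VINS retains exactly four unobservable directions (global yaw and position). The general idea behind such a check can be sketched numerically with a generic rank test on a stacked observability matrix. The toy Jacobians and transition matrices below are illustrative placeholders, not the paper's actual VINS model.

```python
import numpy as np

def unobservable_dims(H_list, Phi_list, tol=1e-9):
    """Count unobservable directions of a linear(ized) system.

    Stacks the observability matrix M = [H_0; H_1*Phi(1,0); H_2*Phi(2,0); ...]
    from per-step measurement Jacobians H_k and state-transition matrices
    Phi_k, then returns the dimension of M's (numerical) nullspace.
    """
    n = H_list[0].shape[1]
    Phi_prod = np.eye(n)          # Phi(k, 0), accumulated product
    rows = []
    for H, Phi in zip(H_list, Phi_list):
        rows.append(H @ Phi_prod)
        Phi_prod = Phi @ Phi_prod
    M = np.vstack(rows)
    s = np.linalg.svd(M, compute_uv=False)
    rank = int(np.sum(s > tol * s[0]))
    return n - rank

# Toy 4-state example (hypothetical, for illustration only): states 1 and 2
# are measured directly; state 3 leaks into state 1 through the dynamics and
# so becomes observable over time; state 4 never affects any measurement.
n = 4
H = np.zeros((2, n)); H[0, 0] = H[1, 1] = 1.0
Phi = np.eye(n); Phi[0, 2] = 0.1
H_list, Phi_list = [H] * 5, [Phi] * 5
print(unobservable_dims(H_list, Phi_list))  # → 1 (state 4 is unobservable)
```

In the paper's setting the same kind of test, applied to the linearized VINS with all calibration states, would report a four-dimensional nullspace under fully excited motion, and a larger one under the degenerate motion primitives the article identifies.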
Sponsor: This work was supported in part by the University of Delaware (UD) College of Engineering and the NSF (IIS-1924897, MRI-2018905, SCH-2014264). The work of Yang was supported by the University Doctoral Fellowship, while Geneva was partially supported by the Delaware Space Grant Fellowship.
Citation: Y. Yang, P. Geneva, X. Zuo and G. Huang, "Online Self-Calibration for Visual-Inertial Navigation: Models, Analysis, and Degeneracy," in IEEE Transactions on Robotics, doi: 10.1109/TRO.2023.3275878.
ISSN: 1941-0468
URL: https://udspace.udel.edu/handle/19716/33284
Language: en_US
Publisher: IEEE Transactions on Robotics
Keywords: degenerate motions
Keywords: observability analysis
Keywords: sensor self-calibration
Keywords: state estimation
Keywords: visual-inertial systems
Title: Online Self-Calibration for Visual-Inertial Navigation: Models, Analysis, and Degeneracy
Type: Article
Files:
Original bundle:
Name: Online Self-Calibration for Visual-Inertial Navigation.pdf
Size: 6.84 MB
Format: Adobe Portable Document Format
Description: Main article
License bundle:
Name: license.txt
Size: 2.22 KB
Description: Item-specific license agreed upon at submission