Aided inertial navigation system: algorithm and analysis
Date
2024
Publisher
University of Delaware
Abstract
Inertial navigation systems (INS) are widely used in mobile devices, self-driving cars, and autonomous vehicles. Exteroceptive range and/or bearing sensors (cameras, LiDARs, and sonars) are commonly leveraged to improve INS performance. We first perform a thorough observability analysis for linearized aided inertial navigation systems with different geometric features (points, lines, planes, or their combinations). In particular, by reviewing common representations of geometric features, we introduce two sets of unified feature representations, i.e., the quaternion and closest-point (CP) parameterizations. We further prove that the linearized aided INS with a single line (plane) feature has at least 5 (7) unobservable directions, and, for the first time, we analytically derive the unobservable subspace for the case of multiple lines or planes. Building upon this analysis for homogeneous features, we examine the observability of the same system with combinations of heterogeneous features and show that, in general, the system preserves at least 4 unobservable directions. In addition, we revisit online full-parameter IMU-Camera calibration for aided INS. We prove that, even with a monocular rolling-shutter camera, all the calibration parameters, including the IMU intrinsics, the camera intrinsics, and the IMU-Camera spatial-temporal calibration, are observable given fully excited motion. For the first time, we theoretically and experimentally identify the degenerate motions that can cause these calibration parameters to become unobservable. A self-built visual-inertial sensor rig is also used to collect data for comparison against a state-of-the-art visual-inertial calibration toolbox. We extend this work to the multi-visual-inertial system (MVIS), which can calibrate all related (intrinsic, spatial, and temporal) parameters for multiple asynchronous IMUs, gyroscopes, and global/rolling-shutter cameras. We also, for the first time, study the degenerate motions for spatial-temporal IMU-IMU calibration, which are missing from the current literature. Extensive Monte Carlo simulations based on realistic trajectories and commonly used sensors are performed to verify our analysis. Our proposed system is also evaluated on several real-world benchmark datasets. The results show that, by adding combined geometric features (points/lines/planes), the aided INS achieves lower localization errors. Online sensor calibration for aided INS improves the system's accuracy and robustness, especially when low-end, noisy sensors are used. The proposed MVIS successfully calibrates multiple inertial and visual sensors with better repeatability than the state-of-the-art calibration toolbox, and calibration convergence further improves when more sensors (e.g., cameras) are used.
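For reference, a minimal sketch of the closest-point (CP) parameterization for planes mentioned above; the notation here is assumed for illustration, not quoted from the thesis. A plane with unit normal \mathbf{n} and distance d from the origin,

\Pi = \{\, \mathbf{x} \in \mathbb{R}^3 : \mathbf{n}^\top \mathbf{x} = d \,\}, \qquad \mathbf{p}_{CP} = d\,\mathbf{n},

is represented by its closest point to the origin, \mathbf{p}_{CP}: a single 3-vector that encodes both the plane's orientation and its offset, avoiding the unit-norm constraint carried by the (\mathbf{n}, d) pair. The 4 generic unobservable directions cited above correspond to the standard ones for vision-aided inertial navigation: global translation (3 degrees of freedom) and rotation about the gravity direction (1 degree of freedom).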
Keywords
Degenerate motion, Kalman filter, Observability analysis, Sensor calibration, Visual-inertial navigation