Browsing by Author "Huang, Guoquan"
Now showing 1 - 3 of 3
Item
Decoupled Right Invariant Error States for Consistent Visual-Inertial Navigation
(IEEE Robotics and Automation Letters, 2022-01-04) Yang, Yulin; Chen, Chuchu; Lee, Woosik; Huang, Guoquan
The invariant extended Kalman filter (IEKF) is proven to preserve the observability properties of visual-inertial navigation systems (VINS) and is suitable for consistent estimator design. However, if features are maintained in the state vector, IEKF propagation becomes more computationally expensive because these features are involved in the covariance propagation. To address this issue, we propose two novel algorithms that preserve system consistency by leveraging the invariant state representation and ensure efficiency by decoupling features from the covariance propagation. The first algorithm combines right-invariant error states with the first-estimates Jacobian (FEJ) technique, decoupling the features from the Lie group representation and applying FEJ for consistent estimation. The second algorithm is designed specifically for sliding-window filter-based VINS, as it associates the features with an active cloned pose, instead of the current IMU state, for the Lie group representation. A new pseudo-anchor change algorithm is also proposed to keep features in the state vector longer than the window span. Both decoupled right- and left-invariant error-based VINS methods are implemented for a complete comparison.
Extensive Monte-Carlo simulations on three simulated trajectories and real-world evaluations on the TUM-VI datasets are provided to verify our analysis and demonstrate that the proposed algorithms achieve better accuracy than a state-of-the-art filter-based VINS algorithm using FEJ.

Item
Online Self-Calibration for Visual-Inertial Navigation: Models, Analysis, and Degeneracy
(IEEE Transactions on Robotics, 2023-06-07) Yang, Yulin; Geneva, Patrick; Zuo, Xingxing; Huang, Guoquan
As sensor calibration plays an important role in visual-inertial sensor fusion, this article performs an in-depth investigation of online self-calibration for robust and accurate visual-inertial state estimation. To this end, we first conduct a complete observability analysis for visual-inertial navigation systems (VINS) with full calibration of sensing parameters, including inertial measurement unit (IMU)/camera intrinsics and IMU-camera spatial-temporal extrinsic calibration, along with the readout time of rolling-shutter (RS) cameras (if used). We study different inertial model variants containing intrinsic parameters that encompass the most commonly used models for low-cost inertial sensors. With these models, the observability analysis of linearized VINS with full sensor calibration is performed. Our analysis theoretically proves the intuition commonly assumed in the literature: VINS with full sensor calibration has four unobservable directions, corresponding to the system's global yaw and position, while all sensor calibration parameters are observable given fully excited motions. Moreover, we, for the first time, identify degenerate motion primitives for IMU and camera intrinsic calibration, which, when combined, may produce complex degenerate motions. We compare the proposed online self-calibration on commonly used IMUs against the state-of-the-art offline calibration toolbox Kalibr, showing that the proposed system achieves better consistency and repeatability.
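As a sketch of the four unobservable directions mentioned above, standard VINS observability analyses (for a linearized state of orientation error, gyro bias, velocity, accelerometer bias, position, and a feature, with gravity vector g in the global frame) express the right null space of the observability matrix as four columns: three for global translation and one for rotation about gravity. This illustrative basis follows the common literature form and is not necessarily the exact parameterization used in the paper:

\[
\mathbf{N} =
\begin{bmatrix}
\mathbf{0}_{3} & {}^{I}\mathbf{R}_{G}\,{}^{G}\mathbf{g} \\
\mathbf{0}_{3} & \mathbf{0}_{3\times1} \\
\mathbf{0}_{3} & -\lfloor {}^{G}\mathbf{v}_{I} \rfloor\, {}^{G}\mathbf{g} \\
\mathbf{0}_{3} & \mathbf{0}_{3\times1} \\
\mathbf{I}_{3} & -\lfloor {}^{G}\mathbf{p}_{I} \rfloor\, {}^{G}\mathbf{g} \\
\mathbf{I}_{3} & -\lfloor {}^{G}\mathbf{f} \rfloor\, {}^{G}\mathbf{g}
\end{bmatrix}
\]

The first three columns correspond to global position and the last column to global yaw (rotation about gravity), giving the four unobservable directions; the bias rows are zero, consistent with sensor calibration parameters being observable under fully excited motion.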
Based on our analysis and experimental evaluations, we also offer practical guidelines for performing online IMU-camera self-calibration effectively in practice.

Item
Resilient Ground Vehicle Autonomous Navigation in GPS-denied Environments
(Guidance, Navigation and Control, 2022-11-23) Baxevani, Kleio; Yadav, Indrajeet; Yang, Yulin; Sebok, Michael; Tanner, Herbert G.; Huang, Guoquan
Co-design and integration of vehicle navigation and control with state estimation is key to enabling field deployment of mobile robots in GPS-denied, cluttered environments, and sensor calibration is critical for the successful operation of both subsystems. This paper demonstrates the potential of this co-design approach with field tests integrating a reactive receding-horizon motion planner and controller with an inertial-aided multi-sensor calibration scheme. The reported method provides accurate calibration parameters that improve the performance of the state estimator and enable the motion controller to generate smooth, continuous minimal-jerk trajectories based on local LiDAR data. Numerical simulations in Unity and real-world experimental results from the field corroborate the claims of efficacy for the reported autonomous navigation computational pipeline.
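The minimal-jerk trajectories mentioned above minimize the time integral of squared jerk. For a rest-to-rest segment (zero velocity and acceleration at both endpoints), this yields the classic quintic polynomial; the sketch below is a generic 1-D illustration of that well-known result, not the paper's actual planner, and all function names are hypothetical:

```python
def minimal_jerk(p0: float, pT: float, T: float, t: float) -> float:
    """Position at time t along a rest-to-rest minimal-jerk segment from p0 to pT.

    Minimizing the integral of squared jerk with zero boundary velocity and
    acceleration gives p(t) = p0 + (pT - p0) * (10 s^3 - 15 s^4 + 6 s^5),
    where s = t / T is normalized time in [0, 1].
    """
    s = t / T
    return p0 + (pT - p0) * (10 * s**3 - 15 * s**4 + 6 * s**5)


def minimal_jerk_velocity(p0: float, pT: float, T: float, t: float) -> float:
    """Velocity at time t (time derivative of the quintic above)."""
    s = t / T
    return (pT - p0) / T * (30 * s**2 - 60 * s**3 + 30 * s**4)
```

Evaluating the polynomial confirms it starts and ends at rest, which is what makes such segments smooth enough for direct tracking by the motion controller.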