Vision Sensor Aided Navigation for Ground Vehicle Applications

Date
2019-01-11
Abstract
Manned and unmanned ground vehicles with autonomous capability have attracted great attention in recent decades. As a result, there is increasing demand for improved performance from low-cost navigation systems. The integration of an INS with GNSS receivers is well known and commonly used in ground vehicle applications, not only because the two sensors have complementary characteristics, but also because their integration provides position and orientation on a global scale. However, GNSS signals suffer from obstruction and multipath errors in urban canyons, tunnels, woodlands, and mountainous regions, and are also vulnerable to jamming and spoofing. Navigation in GNSS-denied environments has therefore attracted considerable research interest. It is important to study methods that mitigate error drift using low-cost navigation and aiding sensors, as well as new integration schemes and techniques, especially those drawing on knowledge from multiple disciplines. The content of this thesis is as follows.
1. In the non-holonomic constraint (NHC) and odometer (OD) aided navigation system, the system model fully accounts for the inter-sensor calibration parameters, such as the boresight error and the lever arm of the IMU with respect to the vehicle frame. Considering the characteristics of low-cost IMU sensors, the observability of the INS/NHC/OD integration is analyzed theoretically, which differs from the existing high-end INS case. To handle large boresight errors and achieve higher inter-sensor calibration accuracy, we propose the unscented Kalman filter (UKF) as the fusion scheme, with special treatment of the unscented transform for the quaternion. Simulation tests show that the UKF outperforms the EKF in estimating the calibration parameters, especially when the boresight error is relatively large. A new attitude-velocity constraint aided INS is developed, which is theoretically equivalent to the NHC.
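The NHC idea above can be illustrated as a standard Kalman pseudo-measurement update: a land vehicle normally neither slips sideways nor leaves the ground, so its lateral and vertical velocities in the vehicle frame are near zero. The following is a minimal sketch under simplifying assumptions (a reduced 3-state velocity-error model, a forward-right-down vehicle frame, and hypothetical names such as `nhc_update` and `C_bv`), not the thesis implementation.

```python
# Minimal NHC pseudo-measurement update (illustrative sketch, not thesis code).
# Assumption: vehicle frame is forward-right-down, so components 1 and 2 of the
# vehicle-frame velocity (lateral, vertical) should be ~zero for a land vehicle.
import numpy as np

def nhc_update(x, P, v_b, C_bv, sigma_nhc=0.1):
    """One Kalman update with the NHC pseudo-measurement.

    x, P  : velocity-error state (3,) and its covariance (3, 3)
    v_b   : INS-indicated velocity in the IMU body frame (3,)
    C_bv  : DCM rotating body frame to vehicle frame (boresight calibration)
    """
    v_v = C_bv @ v_b                        # velocity expressed in the vehicle frame
    z = -v_v[1:]                            # residual: lateral/vertical velocity should be 0
    H = C_bv[1:, :]                         # measurement matrix selects those components
    R = (sigma_nhc ** 2) * np.eye(2)        # pseudo-measurement noise
    S = H @ P @ H.T + R                     # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + K @ (z - H @ x)                 # state correction
    P = (np.eye(len(x)) - K @ H) @ P        # covariance update
    return x, P
```

In a full implementation the state would also carry attitude, position, sensor-bias, and calibration errors (boresight, lever arm), with `H` extended accordingly; the sketch only shows the structure of the constraint.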
The vehicle experiments demonstrate that, with the help of this constraint, the positioning RMS error stays within 0.7 m during 60 s GNSS outages for an IMU with $1^\circ/h$ gyro bias.
2. We propose a vanishing point (VP) aided INS method based on parallel lane-marking observations from a forward-looking camera. Two cases are considered: unknown lane orientation, and lane orientation known from digital maps. We develop the mathematical relationship between the vanishing point coordinates and the relative attitude of the camera with respect to the road, from which the relative heading formula is derived. The complete VP aiding scheme is proposed, including straight-lane detection, uncertainty analysis, sequential Kalman filtering, and sensitivity analysis of the INS/VP integration. The AIME (Autonomous Integrity Monitored Extrapolation) soft-failure scheme is adopted to detect small lane curvature. The algorithm is tested in simulations and experiments: with the additional help of the VP, a 33\% improvement in positioning accuracy is achieved over INS/NHC alone, reaching 0.32\% of distance travelled (DT).
3. We propose to use the relative pose from a monocular camera to aid the INS. The frame-to-frame relative pose is calculated from the epipolar constraint. An uncertainty estimation method for the relative attitude from the vision system is developed, which is essential for sensor fusion; simulations and experiments show the validity of this covariance estimation method. A simple but effective failure detection method for the visual odometry (VO) system is proposed based on the translation vector from the VO. Finally, the loosely coupled INS/NHC/VO integration is developed, and the observability analysis proves the complementary properties of the INS/NHC and INS/VO integrations. The experiments show that in INS/NHC/VO integrated navigation, the average horizontal positioning RMS error over 4 experiments is within 0.30\% DT.
4. Line features observed by a camera are extracted and parameterized to further improve the accuracy of existing visual-inertial navigation systems (VINS). The first approach extracts image lines corresponding to the vertical 3D lines of buildings and uses them to calculate the roll angle of the vehicle, aiding the existing point-feature-based VINS using the Multi-State Constraint Kalman Filter (MSCKF). Furthermore, a new straight-line parameterization, the Anchored Inverse-Depth Pl\"{u}cker Line (AIDPL), is proposed for the undelayed initialization of 3D space lines in line-based VINS under the EKF-SLAM framework. Monte Carlo simulations demonstrate that the positioning accuracy is significantly improved with the proposed tightly coupled VINS, while the 3D lines in the environment are estimated effectively and quickly.
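The vanishing-point relationship underlying item 2 can be sketched in a few lines. For a pinhole camera, parallel straight lane markings meet at a vanishing point whose image coordinates encode the camera's relative heading and pitch with respect to the lane direction. The sketch below is my own simplification (negligible relative roll, a y-down image convention, hypothetical function and parameter names), not the thesis derivation; sign conventions depend on the chosen frames.

```python
# Hedged illustration: relative heading/pitch of a forward-looking pinhole
# camera w.r.t. a straight lane, from the lane-marking vanishing point (u, v).
# Assumes negligible relative roll and a y-down image coordinate convention.
import math

def relative_heading_pitch(u, v, fx, fy, cx, cy):
    """Relative heading psi and pitch theta (rad) from VP pixel coordinates.

    (fx, fy, cx, cy) are the pinhole intrinsics. With zero relative attitude
    the VP sits at the principal point (cx, cy); a horizontal offset signals
    heading, a vertical offset signals pitch.
    """
    psi = math.atan2(u - cx, fx)        # heading from horizontal VP offset
    theta = math.atan2(-(v - cy), fy)   # pitch from vertical offset (y-down image)
    return psi, theta
```

With lane orientation known from a digital map, such a relative heading would bound the absolute heading drift; with unknown lane orientation it still constrains the relative attitude, as described above.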
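The epipolar constraint used for the frame-to-frame relative pose in item 3 can likewise be written compactly: for calibrated rays $x_1$, $x_2$ of the same 3D point seen from two camera poses related by rotation $R$ and translation $t$, the essential matrix $E = [t]_\times R$ satisfies $x_2^\top E x_1 = 0$. The sketch below only verifies this identity with hypothetical helper names; it is not the thesis pipeline (which additionally estimates $E$ from feature correspondences and its uncertainty).

```python
# Hedged sketch of the epipolar constraint behind frame-to-frame relative pose.
# Convention assumed here: a point P1 in frame 1 maps to P2 = R @ P1 + t in frame 2.
import numpy as np

def skew(t):
    """Cross-product (skew-symmetric) matrix [t]_x of a 3-vector t."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def essential(R, t):
    """Essential matrix E = [t]_x R from a relative pose (t is scale-free)."""
    return skew(t) @ R

def epipolar_residual(E, x1, x2):
    """x2^T E x1; ~0 for a correct correspondence under the correct pose."""
    return float(x2 @ E @ x1)
```

Because a monocular camera cannot observe the translation scale, only the direction of `t` is meaningful here, which is consistent with using the relative pose as an aiding measurement rather than a standalone position fix.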
Keywords
inertial navigation system, vision aided inertial navigation, vanishing point, simultaneous localization and mapping, line feature
Citation
Liu, Z. (2019). Vision Sensor Aided Navigation for Ground Vehicle Applications (Doctoral thesis, University of Calgary, Calgary, Canada). Retrieved from https://prism.ucalgary.ca.