Integration and Calibration of an Unmanned Aerial Imaging and Ranging System
Date
2022-06
Abstract
In recent years, mapping using unmanned aerial vehicles (UAVs) has gained considerable attention in the research community and has become popular in industrial fields such as terrain mapping, forestry, precision agriculture, mining, and construction. Most UAV mapping platforms use cameras to collect data. However, thanks to advances in sensor manufacturing, lightweight and affordable light detection and ranging (LiDAR) sensors are now designed for UAV platforms as well. Consequently, there has been growing interest in integrating active and passive sensors to combine the strengths of both technologies in one mapping system. Fusing the data collected by a camera and a LiDAR and generating accurate topographic products require the sensors to be calibrated both intrinsically and extrinsically with respect to one another. In addition, the data need to be geo-referenced with respect to a mapping coordinate reference system. While images can be indirectly geo-referenced using ground control points, scanned points require direct geo-referencing, which is usually achieved with an inertial navigation system (INS) integrated with a global navigation satellite system (GNSS) receiver. The main objective of this thesis is to integrate a Velodyne VLP-16 LiDAR sensor, a Sony A6000 camera, and a MEMS inertial measurement unit (IMU) into a compact, low-cost mapping system. This system uses a single-frequency GNSS receiver for time-synchronization purposes only. Hence, the main sub-objective of this thesis is applying camera-INS integration to geo-reference the LiDAR point clouds.
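For context, the direct geo-referencing described above is conventionally written as the standard LiDAR geo-referencing equation; the notation below is the common photogrammetric convention, not necessarily the symbols used in the thesis itself:

```latex
r^{m}_{P} = r^{m}_{b}(t) + R^{m}_{b}(t)\left( r^{b}_{lu} + R^{b}_{lu}\, r^{lu}_{P}(t) \right)
```

Here $r^{m}_{b}(t)$ and $R^{m}_{b}(t)$ are the time-dependent position and attitude of the IMU body frame $b$ in the mapping frame $m$ (the trajectory), $r^{b}_{lu}$ and $R^{b}_{lu}$ are the lever-arm and boresight mounting parameters relating the LiDAR unit frame $lu$ to the body frame, and $r^{lu}_{P}(t)$ is the raw LiDAR measurement to point $P$. The mounting parameters come from the extrinsic calibration, while the trajectory terms come from the camera-aided INS described below.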
The main contributions of this thesis include the development and implementation of: 1) a rigorous and user-friendly calibration approach that uses a custom test field with several planar features for intrinsic calibration of the camera and the LiDAR as well as estimation of their relative orientation parameters (ROPs); 2) a two-step approach to solving for the boresight calibration parameters between the INS and the camera, which involves, as a first step, an approximate estimate obtained from visual observations of the INS body frame, followed by refinement within an error-state extended Kalman filter (ESEKF); and 3) direct geo-referencing and texturing of the LiDAR point cloud, where an accurate trajectory is estimated by integrating the camera pose with INS observations using the ESEKF.
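The camera-aided INS fusion in contribution 3 can be illustrated with a deliberately stripped-down error-state update. The sketch below is hypothetical and not the thesis's implementation: the error state holds only a 3-vector position error, whereas a full ESEKF would also track velocity, attitude, and IMU-bias errors and propagate the state between camera observations.

```python
import numpy as np

def esekf_update(P, x_err, z_cam, x_ins, R_meas):
    """Fuse a camera-derived position with the INS-predicted position.

    P      : 3x3 error-state covariance
    x_err  : 3-vector error-state estimate
    z_cam  : 3-vector camera-derived position (e.g., from bundle adjustment)
    x_ins  : 3-vector INS-predicted (nominal) position
    R_meas : 3x3 camera measurement-noise covariance
    """
    H = np.eye(3)                       # camera observes the position error directly
    y = z_cam - (x_ins + x_err)         # innovation
    S = H @ P @ H.T + R_meas            # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_err = x_err + K @ y               # corrected error state
    P = (np.eye(3) - K @ H) @ P         # updated covariance
    x_corr = x_ins + x_err              # inject error into the nominal state
    return x_corr, np.zeros(3), P       # error state is reset after injection

# Illustrative run: the INS has drifted 0.5 m east; a camera pose pulls it back.
P0 = np.eye(3) * 0.25                   # prior: 0.5 m standard deviation
x_ins = np.array([100.5, 200.0, 50.0])  # drifted INS position
z_cam = np.array([100.0, 200.0, 50.0])  # camera-derived position
R_cam = np.eye(3) * 0.01                # camera noise: 0.1 m standard deviation
x_corr, x_err, P1 = esekf_update(P0, np.zeros(3), z_cam, x_ins, R_cam)
```

Because the camera measurement is far more precise than the drifted prior, the corrected position lands close to the camera-derived value and the covariance shrinks accordingly, which is the mechanism that lets the thesis avoid a dual-frequency GNSS/INS trajectory.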
The experimental results showed that the camera was calibrated with an accuracy of 0.119±0.079 pixels. It was also shown that calibrating the intrinsic parameters of the LiDAR sensor helped reduce the mapping errors by up to 13%. The proposed ROP-constrained bundle adjustment reduced the uncertainty in estimating the ROPs between the LiDAR and the camera by 48%. Simultaneous calibration of the LiDAR intrinsic parameters and the LiDAR-camera ROPs further reduced the mapping errors by 6.5% and the ROP uncertainties by 16.1%. In summary, the ROPs were estimated with a precision of 3.99 mm and 0.0587 degrees, resulting in a 3D mapping accuracy of 5.87 mm indoors. The trajectory-estimation and geo-referencing approaches were tested in two different flights, both over agricultural sites, using two different hexacopters. The root mean square errors of the camera-aided INS trajectory estimates were 1.59 cm in position and 0.0022 degrees in orientation. The directly geo-referenced LiDAR point clouds had an average distance of 0.04 m from the indirectly geo-referenced photogrammetric point clouds. Given the precision of the LiDAR ranging measurements (3 cm), these results are acceptable and comparable to the trajectory-estimation accuracies achievable by an INS integrated with a dual-frequency, real-time kinematic GNSS receiver (3-5 cm and 0.05-0.2 degrees).
Citation
Cortes Rubio, C. E. (2022). Integration and calibration of an unmanned aerial imaging and ranging system (Master's thesis, University of Calgary, Calgary, Canada). Retrieved from https://prism.ucalgary.ca.