Tightly-Coupled GNSS/Vision Using a Sky-Pointing Camera for Vehicle Navigation in Urban Areas
dc.contributor.author | Gakne, Paul Verlaine | |
dc.contributor.author | O'Keefe, Kyle P.G. | |
dc.date.accessioned | 2018-04-18T20:26:20Z | |
dc.date.available | 2018-04-18T20:26:20Z | |
dc.date.issued | 2018-04-17 | |
dc.description.abstract | This paper presents a method of fusing the ego-motion of a robot or land vehicle, estimated from an upward-facing camera, with Global Navigation Satellite System (GNSS) signals for navigation in urban environments. A sky-pointing camera is mounted on the roof of a car and synchronized with a GNSS receiver. The advantages of this configuration are two-fold. First, for the GNSS signals, the upward-facing camera is used to classify (segment) the acquired images into sky and non-sky regions; a satellite falling into a non-sky area (e.g., buildings, trees) is rejected and not used in the final position solution. Second, the sky-pointing camera (with a field of view of about 90 degrees) is well suited to ego-motion estimation in urban areas because it sees few moving objects (e.g., pedestrians, cars) and can therefore estimate the ego-motion with fewer outliers than a typical forward-facing camera. The GNSS and visual measurements are tightly coupled in a Kalman filter to compute the final position solution. Experimental results in a deep urban canyon demonstrate that the system provides satisfactory navigation solutions and better accuracy than GNSS-only and loosely-coupled GNSS/vision solutions, by 20 percent and 82 percent respectively in the worst case, even when fewer than four GNSS satellites are visible. | en_US |
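The satellite-rejection idea in the abstract (discard satellites whose line of sight falls on non-sky pixels of the segmented upward-facing image) can be illustrated with a minimal Python sketch. It assumes an idealized equidistant fisheye projection with the optical axis at the zenith and a boolean sky mask from the segmentation step; the function names, projection model, and parameters are illustrative assumptions, not the paper's actual implementation.

    import numpy as np

    def satellite_pixel(az_deg, el_deg, img_size, fov_deg=90.0):
        """Project a satellite (azimuth/elevation in degrees) into an
        upward-facing camera image, assuming an idealized equidistant
        fisheye whose optical axis points at the zenith (simplification)."""
        cx, cy = img_size[0] / 2.0, img_size[1] / 2.0
        zen = 90.0 - el_deg                      # zenith angle: 0 at zenith
        r = (zen / (fov_deg / 2.0)) * min(cx, cy)  # radial distance in pixels
        az = np.deg2rad(az_deg)
        u = cx + r * np.sin(az)                  # east maps to image +x (assumed)
        v = cy - r * np.cos(az)                  # north maps to image -y (assumed)
        return int(round(u)), int(round(v))

    def usable_satellites(sats, sky_mask, fov_deg=90.0):
        """Keep only satellites whose projection lands on a sky pixel.
        sats: iterable of (prn, az_deg, el_deg); sky_mask: bool array, True = sky."""
        h, w = sky_mask.shape
        kept = []
        for prn, az, el in sats:
            u, v = satellite_pixel(az, el, (w, h), fov_deg)
            if 0 <= u < w and 0 <= v < h and sky_mask[v, u]:
                kept.append(prn)
        return kept

In this sketch, satellites behind buildings or trees project onto masked (non-sky) pixels and are dropped before the measurements enter the tightly-coupled Kalman filter update.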
dc.description.grantingagency | Natural Sciences and Engineering Research Council - Discovery Grant | en_US |
dc.identifier.citation | Gakne, P. V., & O’Keefe, K. P. G. (2018). Tightly-Coupled GNSS/Vision Using a Sky-Pointing Camera for Vehicle Navigation in Urban Areas. "Sensors," 18(4), 1–32. https://doi.org/10.3390/s18041244 | en_US |
dc.identifier.doi | http://dx.doi.org/10.3390/s18041244 | en_US |
dc.identifier.issn | 1424-8220 | |
dc.identifier.uri | http://hdl.handle.net/1880/111486 | |
dc.identifier.uri | https://dx.doi.org/10.11575/PRISM/37444 | |
dc.language.iso | en | en_US |
dc.publisher | Multidisciplinary Digital Publishing Institute | en_US |
dc.publisher.department | Geomatics Engineering | en_US |
dc.publisher.faculty | Schulich School of Engineering | en_US |
dc.publisher.institution | University of Calgary | en_US |
dc.publisher.policy | https://www.mdpi.com/authors | en_US |
dc.rights | ©2018 by the authors. | |
dc.rights.uri | https://creativecommons.org/licenses/by/4.0 | en_US |
dc.subject | visual odometry | en_US |
dc.subject | upward-facing camera | en_US |
dc.subject | motion estimation | en_US |
dc.subject | satellites | en_US |
dc.subject | GNSS | en_US |
dc.subject | tightly-coupled integration | en_US |
dc.subject | vehicle navigation | en_US |
dc.subject | image segmentation | en_US |
dc.subject | clustering algorithms | en_US |
dc.title | Tightly-Coupled GNSS/Vision Using a Sky-Pointing Camera for Vehicle Navigation in Urban Areas | en_US |
dc.type | publishedVersion | en_US |
dc.type | journal article | en_US |
ucalgary.item.requestcopy | true | en_US |
Files
Original bundle
- Name: sensors-18-01244-v2.pdf
- Size: 11.36 MB
- Format: Adobe Portable Document Format