Tightly-Coupled GNSS/Vision Using a Sky-Pointing Camera for Vehicle Navigation in Urban Areas

dc.contributor.author: Gakne, Paul Verlaine
dc.contributor.author: O'Keefe, Kyle P.G.
dc.date.accessioned: 2018-04-18T20:26:20Z
dc.date.available: 2018-04-18T20:26:20Z
dc.date.issued: 2018-04-17
dc.description.abstract: This paper presents a method for fusing the ego-motion of a robot or land vehicle, estimated from an upward-facing camera, with Global Navigation Satellite System (GNSS) signals for navigation in urban environments. A sky-pointing camera is mounted on the roof of a car and synchronized with a GNSS receiver. The advantages of this configuration are two-fold. First, the upward-facing camera is used to classify the acquired images into sky and non-sky regions (i.e., segmentation); a satellite falling into a non-sky area (e.g., buildings, trees) is rejected and excluded from the final position solution. Second, the sky-pointing camera (with a field of view of about 90 degrees) is well suited to ego-motion estimation in urban areas because it sees few moving objects (e.g., pedestrians, cars) and can therefore estimate the ego-motion with fewer outliers than is typical with a forward-facing camera. The GNSS and visual measurements are tightly coupled in a Kalman filter to compute the final position solution. Experimental results in a deep urban canyon demonstrate that the system provides satisfactory navigation solutions with better accuracy than the GNSS-only and loosely-coupled GNSS/vision solutions, by 20 percent and 82 percent (in the worst case) respectively, even in conditions with fewer than four GNSS satellites.
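The satellite-rejection step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a precomputed binary sky mask from the segmentation step, and satellite positions already projected into image pixel coordinates (the camera projection model is not shown). All names here are hypothetical.

```python
import numpy as np

def reject_obstructed_satellites(sky_mask, sat_pixels):
    """Keep only satellites whose projected image position falls on sky.

    sky_mask   : 2D bool array, True where the segmented image is sky.
    sat_pixels : dict mapping satellite id -> (row, col), the satellite's
                 line of sight projected into the upward-facing camera image.
    Returns the ids of satellites with an unobstructed line of sight;
    only these would feed the position solution.
    """
    h, w = sky_mask.shape
    visible = []
    for sat_id, (r, c) in sat_pixels.items():
        # Satellites projecting outside the image (beyond the ~90-degree
        # field of view) or onto non-sky pixels (buildings, trees) are
        # rejected, since their signals are likely blocked or reflected.
        if 0 <= r < h and 0 <= c < w and sky_mask[r, c]:
            visible.append(sat_id)
    return visible

# Toy example: left half of the image is sky, right half is a building.
mask = np.zeros((4, 4), dtype=bool)
mask[:, :2] = True
sats = {"G01": (1, 0), "G07": (2, 3), "G12": (0, 1)}
print(reject_obstructed_satellites(mask, sats))  # ['G01', 'G12']
```

In the paper's tightly-coupled scheme, the surviving satellites' raw measurements are then combined with the visual odometry in the Kalman filter, which is what allows a useful solution even with fewer than four satellites in view.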
dc.description.grantingagency: Natural Sciences and Engineering Research Council - Discovery Grant
dc.identifier.citation: Gakne, P. V., & O'Keefe, K. P. G. (2018). Tightly-Coupled GNSS/Vision Using a Sky-Pointing Camera for Vehicle Navigation in Urban Areas. Sensors, 18(4), 1–32. https://doi.org/10.3390/s18041244
dc.identifier.doi: http://dx.doi.org/10.3390/s18041244
dc.identifier.issn: 1424-8220
dc.identifier.uri: http://hdl.handle.net/1880/111486
dc.identifier.uri: https://dx.doi.org/10.11575/PRISM/37444
dc.language.iso: en
dc.publisher: Multidisciplinary Digital Publishing Institute
dc.publisher.department: Geomatics Engineering
dc.publisher.faculty: Schulich School of Engineering
dc.publisher.institution: University of Calgary
dc.publisher.policy: https://www.mdpi.com/authors
dc.rights: © 2018 by the authors.
dc.rights.uri: https://creativecommons.org/licenses/by/4.0/
dc.subject: visual odometry
dc.subject: upward-facing camera
dc.subject: motion estimation
dc.subject: satellites
dc.subject: GNSS
dc.subject: tightly-coupled integration
dc.subject: vehicle navigation
dc.subject: image segmentation
dc.subject: clustering algorithms
dc.title: Tightly-Coupled GNSS/Vision Using a Sky-Pointing Camera for Vehicle Navigation in Urban Areas
dc.type: publishedVersion
dc.type: journal article
ucalgary.item.requestcopy: true
Files
Original bundle
Name: sensors-18-01244-v2.pdf
Size: 11.36 MB
Format: Adobe Portable Document Format