3D Image-based Navigation in Collaborative Navigation Environment

Zaydak, A; Toth, C K; Grejner-Brzezinska, D A; Molnár, B (Department of Photogrammetry and Geoinformatics, BME Faculty of Civil Engineering); Yi, Y; Markiel, J N

Conference paper in English (book chapter)
A multi-sensor navigation approach offers an effective GPS augmentation, where sensors such as IMUs, barometers, magnetometers, odometers and digital compasses have been used for applications ranging from pedestrian navigation to the georegistration of remote sensing sensors on land-based and airborne platforms. Recently, research in multi-sensor navigation has focused on terrain-based or image-based navigation, where imaging sensory data, acquired by, for example, optical digital cameras, flash LiDARs (Light Detection and Ranging) or laser scanners, together with digital elevation models (DEMs) and/or satellite imagery, are used to recover the user location through image matching or image-to-DEM matching. In addition, cooperative navigation, an emerging field in which a group of users navigates together by exchanging navigation and ranging information, has been considered a viable alternative for GPS-challenged environments. However, most of these systems and approaches assume fixed types and numbers of sensors per user/platform (a restricted sensor configuration), which ultimately limits navigation capability, particularly in mixed/transition environments.

Flash LiDAR, also called Flash LADAR, is a substantially different sensing technology from pulsed and CW LiDAR techniques: because it is based on a sensor array, it can capture a whole 3D (also called depth or range) image, with intensity data, in a single step. Flash LiDAR can use either of the two basic laser emission schemes: a single pulse with a large aperture can "flash" the area for a short time, or, in CW mode, a continuous laser "light" can provide steady illumination of the area. The first-generation products had range limitations and were quite sensitive to ambient light conditions, which essentially restricted their use to indoor applications; in addition, their high noise level resulted in very low accuracy and stability.
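To illustrate the image-matching idea behind image-based navigation, the sketch below recovers a sensor's planar pose change from matched point pairs between two scans using the closed-form 2D rigid alignment (the 2D analogue of the Kabsch solution). This is a generic, simplified illustration, not the method used in the paper; the function name and the assumption of known, noise-free correspondences are ours.

```python
import math

def register_2d(src, dst):
    """Estimate rotation theta and translation (tx, ty) such that
    rotating src by theta and translating it best aligns it with dst,
    given matched 2D point pairs (closed-form least-squares solution)."""
    n = len(src)
    # Centroids of both point sets.
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    # Closed-form rotation: theta = atan2(sum of cross products,
    # sum of dot products) over the centered point pairs.
    s_cross = s_dot = 0.0
    for (sx, sy), (dx, dy) in zip(src, dst):
        ax, ay = sx - csx, sy - csy
        bx, by = dx - cdx, dy - cdy
        s_cross += ax * by - ay * bx
        s_dot += ax * bx + ay * by
    theta = math.atan2(s_cross, s_dot)
    c, s = math.cos(theta), math.sin(theta)
    # Translation maps the rotated source centroid onto the destination centroid.
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty
```

In an actual navigation pipeline the correspondences would come from feature matching (or from iterative closest-point association) between consecutive range images, and the per-frame pose increments would be chained to track the platform trajectory.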
As the technology advanced, Flash LiDAR sensors have shown gradual improvements in performance, resulting in a growing number of applications. One of the most recently introduced inexpensive sensors is the Microsoft Kinect, originally designed to support gaming. Despite its low cost, the Kinect is quite powerful, providing a relatively dense point cloud and high-definition video at a high data rate. This paper reports on our investigation of this sensor, including calibration experiences and an indoor navigation performance evaluation.

In any application where image-to-image or image-to-DEM matching is performed, such as image-based navigation, the imaging sensors must be adequately characterized through a calibration process in order to achieve optimal performance. During this process, the sensor model parameters are estimated and the stochastic behavior of the sensed signal is analyzed. Based on the estimated calibration parameters, biases are removed, and the error budget of the sensor can be determined in statistical terms. The calibration of 2D and 3D imaging sensors differs little: in both cases, data are collected over reference scenes, based on a structured object space of known geometry and additional information, and the sensor model parameters are then adjusted. Our investigation focuses mainly on range calibration, as depth accuracy is the critical aspect of our applications, including indoor mapping and navigation; the dependence of ranging accuracy on range is of particular interest. In this paper, indoor testing and calibration are presented together with an evaluation of the image-based navigation performance.
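As a minimal example of the bias-removal step in range calibration, the sketch below fits a linear correction (a scale error and a constant offset) from reference measurements against known target distances, and then applies it to new readings. The linear error model and the function names are assumptions for illustration; the paper's actual calibration model is not reproduced here.

```python
def fit_range_bias(measured, true):
    """Ordinary least-squares fit of a linear range correction
    true ~= a * measured + b, capturing a scale error (a)
    and a constant bias (b) of the range sensor."""
    n = len(measured)
    mx = sum(measured) / n
    my = sum(true) / n
    sxx = sum((m - mx) ** 2 for m in measured)
    sxy = sum((m - mx) * (t - my) for m, t in zip(measured, true))
    a = sxy / sxx          # estimated scale factor
    b = my - a * mx        # estimated constant bias
    return a, b

def correct_range(r, a, b):
    """Apply the estimated calibration to a raw range reading."""
    return a * r + b
```

With real sensor data one would also examine the residuals of this fit as a function of range, which is exactly the range-dependent accuracy behavior the paper highlights as being of particular interest.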
In addition, a collaborative navigation experiment is discussed, in which the personal navigation device (including the Kinect sensor) is positioned with respect to the deployment vehicle outside the building, before entering and after exiting the indoor environment, and its performance evaluation is presented.