Lightweight spatial attentive network for vehicular visual odometry estimation in urban environments


Bibliographic Details
Main Authors: Gadipudi, N., Elamvazuthi, I., Lu, C.-K., Paramasivam, S., Su, S.
Format: Article
Published: Springer Science and Business Media Deutschland GmbH 2022
Online Access:https://www.scopus.com/inward/record.uri?eid=2-s2.0-85132752517&doi=10.1007%2fs00521-022-07484-y&partnerID=40&md5=a322a75842dfdb74b4705a78a37c97ce
http://eprints.utp.edu.my/33164/
Description
Summary: Visual odometry is the process of estimating the motion between two consecutive images. Traditional visual odometry algorithms require the careful fabrication of state-of-the-art building blocks based on geometry. These algorithms are highly sensitive to noise, and performance degradation in a single subprocess compromises the performance of the entire system. Learning-based methods, on the other hand, automatically learn the required features through motion mapping. However, current learning-based methods are computationally expensive and require a significant amount of time to estimate the pose from a video sequence. This paper proposes a lightweight deep neural network architecture that estimates odometry by exploiting features refined through spatial attention. Three different training and test splits of the KITTI benchmark are used to evaluate the proposed approach. The execution time of the proposed approach is ≈ 1 ms, a 47-fold speedup over [1]. The experiments performed demonstrate the promising performance of the proposed method relative to the methods used in the comparison. © 2022, The Author(s), under exclusive licence to Springer-Verlag London Ltd., part of Springer Nature.
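The abstract describes refining feature maps through spatial attention before pose estimation. As a rough illustration of the general idea (not the paper's actual module, whose details are not given here), the sketch below applies a CBAM-style spatial attention mask: channel-wise average and max pooling summarize each spatial location, a sigmoid turns the summary into a per-location weight, and the feature map is re-weighted accordingly. The pooling combination stands in for the learned convolution a real module would use; all names are illustrative.

```python
import numpy as np

def spatial_attention(features):
    """Sketch of CBAM-style spatial attention (an assumption; the paper's
    exact architecture is not specified in this abstract).

    features: array of shape (C, H, W).
    Returns the feature map re-weighted by an (H, W) attention mask.
    """
    # Aggregate channel information at each spatial location.
    avg_pool = features.mean(axis=0)   # (H, W)
    max_pool = features.max(axis=0)    # (H, W)
    # A trained module would pass the stacked pools through a learned
    # convolution; averaging them is a simple stand-in for that projection.
    logits = 0.5 * (avg_pool + max_pool)
    mask = 1.0 / (1.0 + np.exp(-logits))   # sigmoid -> weights in (0, 1)
    # Broadcast the spatial mask across all channels.
    return features * mask[None, :, :]

# Toy feature map: 4 channels over an 8x8 spatial grid.
rng = np.random.default_rng(0)
feats = rng.standard_normal((4, 8, 8))
refined = spatial_attention(feats)
print(refined.shape)  # (4, 8, 8)
```

Because the mask lies in (0, 1), each activation is scaled down in proportion to how little channel-wise evidence its location carries, which is the "refinement" effect the abstract alludes to.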