Robust spacecraft relative pose estimation via CNN-aided line segments detection in monocular images

DOI: 10.1016/j.actaastro.2023.11.049
Publication Date: 2023-11-29
ABSTRACT
Autonomous spacecraft relative navigation via monocular images has become a hot topic in the past few years and has recently received a further push thanks to the constantly growing field of artificial neural networks and the publication of several spaceborne image datasets. Despite the proliferation of relative-state initialization algorithms, most architectures adopt computationally expensive solutions relying on convolutional neural networks (CNNs) that provide accurate output at the cost of a high computational burden, which seems unfeasible for current spaceborne hardware. This paper addresses the issue by proposing a novel pose initialization algorithm based on lightweight CNNs. Inspired by previous state-of-the-art algorithms, the developed architecture leverages a fast target detection CNN followed by a line segment detection network capable of running with low inference time on mobile devices. The line segments and their junctions are grouped into more complex geometrical groups, reducing the solution search space, and are subsequently adopted to extract the final pose estimate. As the main outcome, the analyses demonstrate high accuracy in the pose estimation task, with a mean error of less than 10 cm in translation and 2.5° in rotation. The baseline architecture achieves a mean SLAB score of 0.04552 with a standard deviation of 0.22972 on the test dataset. Detailed analyses of the uncertainties show that the overall score is driven mainly by attitude errors, which give the highest contribution to the adopted metric. The error distributions point out that the estimated position error is higher along the camera boresight axis direction. Concerning the attitude, the proposed algorithm has difficulties in estimating the directions of the x and y axes due to ambiguities related to the target geometry. Notably, the network trained in this work outperforms the top benchmark performances, which have been investigated by analyzing the effects of target distance and of the presence of background in the images. Lastly, the paper delves into the possibility of adopting a sub-portion of the 2D-to-3D match matrix made of the identified perceptual groups, which positively affects run-time; both reduced versions are compared against the full architecture in terms of pose estimates and attitude availability, highlighting the trade-off between run-time and availability across the architectures.
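The abstract describes a multi-stage pipeline: target detection CNN, line segment detection, perceptual grouping of segments and junctions, and pose extraction from 2D-to-3D matches. The sketch below illustrates only the final stage, grouped 2D-to-3D matching followed by a RANSAC PnP solve, on synthetic data in Python with OpenCV. The CNN stages, the wireframe model, and all names (model_pts, group, etc.) are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch of the final pose-extraction stage outlined in the abstract.
# The detection and line-segment CNNs are replaced by synthetic projections;
# restricting correspondences to one perceptual group stands in for using a
# sub-portion of the 2D-to-3D match matrix.
import numpy as np
import cv2

# Hypothetical 3D wireframe model of the target (corners of a 1 m box body).
model_pts = np.array([
    [-0.5, -0.5, -0.5], [0.5, -0.5, -0.5], [0.5, 0.5, -0.5], [-0.5, 0.5, -0.5],
    [-0.5, -0.5,  0.5], [0.5, -0.5,  0.5], [0.5, 0.5,  0.5], [-0.5, 0.5,  0.5],
], dtype=np.float64)

# Assumed pinhole intrinsics, no lens distortion in this toy example.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0,   0.0,   1.0]])
dist = np.zeros(5)

# Ground-truth relative pose, used only to synthesize "detected" junctions.
rvec_gt = np.array([0.1, -0.2, 0.05])
tvec_gt = np.array([0.2, -0.1, 6.0])          # target ~6 m along the boresight
img_pts, _ = cv2.projectPoints(model_pts, rvec_gt, tvec_gt, K, dist)
img_pts = img_pts.reshape(-1, 2) + np.random.normal(0, 0.5, (8, 2))  # px noise

# Perceptual grouping: keep only junctions assigned to one geometric group,
# i.e. a sub-portion of the full 2D-to-3D match matrix.
group = [0, 1, 2, 3, 4, 5]

# Pose extraction via RANSAC PnP on the reduced correspondence set.
ok, rvec, tvec, inliers = cv2.solvePnPRansac(
    model_pts[group], img_pts[group], K, dist,
    flags=cv2.SOLVEPNP_EPNP, reprojectionError=3.0)

# Translation and rotation errors against the synthetic ground truth.
R_est, _ = cv2.Rodrigues(rvec)
R_gt, _ = cv2.Rodrigues(rvec_gt)
cos_ang = np.clip((np.trace(R_gt.T @ R_est) - 1.0) / 2.0, -1.0, 1.0)
print("converged:", ok)
print("translation error [m]:", np.linalg.norm(tvec.ravel() - tvec_gt))
print("rotation error [deg]:", np.degrees(np.arccos(cos_ang)))
```

Using fewer correspondences shrinks the RANSAC hypothesis space, which is the run-time benefit the abstract attributes to the reduced match matrix, at the possible cost of availability when a group is occluded or misdetected.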
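The SLAB score quoted above presumably refers to the pose error metric of the Satellite Pose Estimation Challenge organized by ESA and Stanford's Space Rendezvous Laboratory (SLAB). Assuming that convention, the score sums a normalized translation error and a quaternion geodesic rotation error:

```latex
e_{\text{pose}} = e_t + e_q,
\qquad
e_t = \frac{\lVert \mathbf{t}_{\text{gt}} - \mathbf{t}_{\text{est}} \rVert_2}{\lVert \mathbf{t}_{\text{gt}} \rVert_2},
\qquad
e_q = 2 \arccos\!\left( \left| \langle \mathbf{q}_{\text{est}}, \mathbf{q}_{\text{gt}} \rangle \right| \right)
```

Under this convention, a mean score of 0.04552 indicates a small combined error, consistent with the sub-10 cm translation and 2.5° rotation figures, and the rotation term's radian scale explains why attitude errors dominate the metric.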