Patrick Geneva

ORCID: 0000-0002-2179-3447
Publications
Research Areas
  • Robotics and Sensor-Based Localization
  • Indoor and Outdoor Localization Technologies
  • Advanced Vision and Imaging
  • 3D Surveying and Cultural Heritage
  • Inertial Sensor and Navigation
  • Underwater Vehicles and Communication Systems
  • Advanced Image and Video Retrieval Techniques
  • Optical Measurement and Interference Techniques
  • Target Tracking and Data Fusion in Sensor Networks
  • Remote Sensing and LiDAR Applications
  • Robotic Mechanisms and Dynamics
  • Robotic Path Planning Algorithms
  • Human Motion and Animation
  • Satellite Image Processing and Photogrammetry
  • Hand Gesture Recognition Systems

University of Delaware
2017-2024

Zhejiang University
2019

In this paper, we present an open platform, termed OpenVINS, for visual-inertial estimation research for both the academic community and practitioners from industry. The open-sourced codebase provides a foundation for researchers and engineers to quickly start developing new capabilities for their systems. It has out-of-the-box support for commonly desired features, which include: (i) an on-manifold sliding window Kalman filter, (ii) online camera intrinsic and extrinsic calibration, (iii) online inertial sensor time offset calibration, (iv)...

10.1109/icra40945.2020.9196524 article EN 2020-05-01
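
The sliding-window filters used throughout these works maintain a history of cloned poses via stochastic cloning. As a minimal illustrative sketch (the function name and dimensions here are assumed, not the OpenVINS implementation), covariance augmentation for a new clone looks like:

```python
import numpy as np

def clone_state(P, pose_idx, pose_dim=6):
    """Stochastic cloning: augment the covariance with a copy of the
    current pose block, as done when a sliding-window filter inserts
    a new cloned pose (simplified illustration)."""
    n = P.shape[0]
    # Jacobian of the clone w.r.t. the full state: identity rows
    # selecting the pose sub-block.
    J = np.zeros((pose_dim, n))
    J[:, pose_idx:pose_idx + pose_dim] = np.eye(pose_dim)
    # Augmented covariance [[P, P J^T], [J P, J P J^T]].
    top = np.hstack([P, P @ J.T])
    bottom = np.hstack([J @ P, J @ P @ J.T])
    return np.vstack([top, bottom])

P = np.eye(15) * 0.01          # e.g. a 15-dof IMU state covariance
P_aug = clone_state(P, 0)      # clone the first 6-dof pose block
print(P_aug.shape)             # (21, 21)
```

The cloned block carries the same uncertainty and full correlation with the original pose, so later camera measurements can update both consistently.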

This paper presents a tightly-coupled multi-sensor fusion algorithm termed LiDAR-inertial-camera fusion (LIC-Fusion), which efficiently fuses IMU measurements, sparse visual features, and extracted LiDAR points. In particular, the proposed LIC-Fusion performs online spatial and temporal sensor calibration between all three asynchronous sensors, in order to compensate for possible calibration variations. The key contribution is the optimal (up to linearization errors) multi-modal fusion of the detected and tracked edge/surf feature...

10.1109/iros40897.2019.8967746 article EN 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2019-11-01

This paper presents the formalization of the closest point plane representation and an analysis of its incorporation in 3D indoor simultaneous localization and mapping (SLAM). We present a singularity-free plane factor leveraging this representation, and demonstrate its fusion with inertial preintegration measurements in a graph-based optimization framework. The resulting LiDAR-inertial SLAM (LIPS) system is validated both on a custom-made LiDAR simulator and in a real-world experiment.

10.1109/iros.2018.8594463 article EN 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2018-10-01
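
The closest point (CP) representation encodes a plane as the 3D point on it nearest the origin, i.e. d·n for unit normal n and origin distance d, which is singular only when the plane passes through the origin. A minimal illustrative sketch (not the LIPS code):

```python
import numpy as np

def plane_to_cp(n, d):
    """Closest point (CP) plane parameterization: the 3D point on the
    plane closest to the origin, cp = d * n for a unit normal n and
    origin distance d (singular when the plane contains the origin)."""
    n = np.asarray(n, float)
    return d * n / np.linalg.norm(n)

def cp_to_plane(cp):
    """Recover (unit normal, distance) from a CP vector."""
    cp = np.asarray(cp, float)
    d = np.linalg.norm(cp)
    return cp / d, d

cp = plane_to_cp([0.0, 0.0, 1.0], 2.5)   # horizontal plane z = 2.5
n, d = cp_to_plane(cp)                    # recovers n = [0, 0, 1], d = 2.5
```

Because the three CP coordinates are unconstrained, they can sit directly in a factor-graph state vector without the unit-norm constraint a (n, d) parameterization would need.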

Multi-sensor fusion of multi-modal measurements from commodity inertial, visual and LiDAR sensors to provide robust and accurate 6DOF pose estimation holds great potential in robotics and beyond. In this paper, building upon our prior work (i.e., LIC-Fusion), we develop a sliding-window filter based LiDAR-inertial-camera odometry with online spatiotemporal calibration (LIC-Fusion 2.0), which introduces a novel plane-feature tracking algorithm for efficiently processing 3D point clouds. In particular, after motion...

10.1109/iros45743.2020.9340704 article EN 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2020-10-24

In this paper, we propose a new analytical preintegration theory for graph-based sensor fusion with an inertial measurement unit (IMU) and a camera (or other aiding sensors). Rather than using discrete sampling of the measurement dynamics as in current methods, we derive closed-form solutions to the preintegration equations, yielding improved accuracy in state estimation. We advocate two different inertial models for preintegration: (i) the model that assumes piecewise constant measurements; and (ii) the model that assumes piecewise constant local true acceleration. Through extensive Monte...

10.1177/0278364919835021 article EN The International Journal of Robotics Research 2019-04-01
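
Preintegration summarizes many IMU samples between two keyframes into a single relative-motion factor. The following is a discrete, piecewise-constant-measurement illustration in the spirit of model (i) above, not the paper's closed-form solution:

```python
import numpy as np

def so3_exp(w):
    """Matrix exponential of a rotation vector (Rodrigues formula)."""
    th = np.linalg.norm(w)
    if th < 1e-9:
        return np.eye(3)
    k = w / th
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * (K @ K)

def preintegrate(gyro, accel, dt):
    """Accumulate relative-motion preintegrated terms (dR, dv, dp) from
    bias-compensated IMU samples under a piecewise-constant-measurement
    assumption; a discrete illustration of the idea."""
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for w, a in zip(gyro, accel):
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt**2
        dv = dv + (dR @ a) * dt
        dR = dR @ so3_exp(w * dt)
    return dR, dv, dp

# constant 1 m/s^2 forward acceleration, no rotation, 1 s at 100 Hz
gyro = [np.zeros(3)] * 100
accel = [np.array([1.0, 0.0, 0.0])] * 100
dR, dv, dp = preintegrate(gyro, accel, 0.01)
print(dv[0], dp[0])   # ~1.0 m/s and ~0.5 m, as kinematics predict
```

The closed-form theory in the paper replaces this per-sample summation with analytical solutions of the continuous-time equations, which is where the accuracy gain comes from.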

In this letter, we perform an in-depth observability analysis for both the spatial and temporal calibration parameters of an aided inertial navigation system (INS) with global and/or local sensing modalities. In particular, we analytically show that these calibration parameters are observable if the sensor platform undergoes random motion. More importantly, we identify four degenerate motion primitives that harm calibration accuracy and thus should be avoided in reality whenever possible. Interestingly, we also prove that these degenerate motions would still hold even in the case...

10.1109/lra.2019.2893803 article EN IEEE Robotics and Automation Letters 2019-01-17

In this paper, we present an efficient and robust GPS-aided visual inertial odometry (GPS-VIO) system that fuses IMU-camera data with intermittent GPS measurements. To perform sensor fusion, spatiotemporal calibration and initialization of the transform between the sensor reference frames are required. We propose an online calibration method for both the GPS-IMU extrinsics and time offset, as well as a reference frame initialization procedure that is robust to GPS sensor noise. In addition, we prove the existence of four unobservable directions of GPS-VIO when estimating in the VIO frame, and advocate...

10.1109/icra40945.2020.9197029 article EN 2020-05-01
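
Initializing the transform between the GPS and VIO reference frames can be illustrated, under simplifying assumptions, by rigidly aligning matched position samples. The sketch below uses a standard Kabsch/Umeyama alignment as a stand-in for, not a reproduction of, the paper's initialization procedure:

```python
import numpy as np

def align_trajectories(vio_pts, gps_pts):
    """Kabsch-style rigid alignment of VIO positions to GPS positions:
    least-squares rotation R and translation t with gps = R @ vio + t.
    Hypothetical illustration (the paper's method differs, e.g. a
    gravity-aligned 4-DoF transform)."""
    A = np.asarray(vio_pts); B = np.asarray(gps_pts)
    ca, cb = A.mean(0), B.mean(0)
    H = (A - ca).T @ (B - cb)             # cross-covariance of points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T                    # proper rotation (det = +1)
    t = cb - R @ ca
    return R, t

vio = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 1.0]])
Rz = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1.0]])  # 90 deg yaw
gps = vio @ Rz.T + np.array([5.0, 2.0, 0.0])
R, t = align_trajectories(vio, gps)
print(np.allclose(R, Rz))   # True: the yaw offset is recovered
```

In practice such an initialization must also contend with GPS noise and the time offset between the sensor streams, which is exactly what the paper's robust procedure addresses.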

As cameras and inertial sensors are becoming ubiquitous in mobile devices and robots, it holds great potential to design visual-inertial navigation systems (VINS) for efficient and versatile 3-D motion tracking which utilize any (multiple) available cameras and inertial measurement units (IMUs) and are resilient to sensor failures or measurement depletion. To this end, rather than the standard VINS paradigm using a minimal sensing suite of a single camera and IMU, in this article we design a real-time consistent multi-IMU multi-camera (MIMC) VINS estimator that is...

10.1109/tro.2021.3049445 article EN IEEE Transactions on Robotics 2021-02-25

In this paper, we present a tightly-coupled monocular visual-inertial navigation system (VINS) using points and lines, with degenerate motion analysis for 3D line triangulation. Based on line segment measurements from images, we propose two sliding window based 3D line triangulation algorithms and compare their performance. Analysis of the proposed algorithms reveals 3 degenerate camera motions that cause triangulation failures. Both geometrical interpretation and Monte-Carlo simulations are provided to verify these degenerate motions. In addition, commonly...

10.1109/iros40897.2019.8967905 article EN 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2019-11-01

This paper presents a tightly-coupled aided inertial navigation system (INS) with point and plane features, a general sensor fusion framework applicable to any visual and depth sensor (e.g., RGBD, LiDAR) configuration, in which the camera is used for feature tracking and the depth sensor for plane extraction. The proposed system exploits the geometrical structures (planes) of the environments and adopts the closest point (CP) parameterization. Moreover, we distinguish planar point features from non-planar ones in order to enforce point-on-plane constraints which are used in our state...

10.1109/icra.2019.8794078 article EN 2019 International Conference on Robotics and Automation (ICRA) 2019-05-01

This paper presents a general multi-camera visual-inertial navigation system (mc-VINS) with online intrinsic and extrinsic calibration, which is able to utilize all the information from an arbitrary number of asynchronous cameras. In particular, within the standard multi-state constraint Kalman filter (MSCKF) framework, we only clone the IMU poses related to a single "base camera" (rather than all cameras) in the state vector, while the poses corresponding to the other camera images are represented via interpolation of the bounding...

10.1109/icra.2019.8793886 article EN 2019 International Conference on Robotics and Automation (ICRA) 2019-05-01
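
Representing a measurement taken at an asynchronous camera time via interpolation between two cloned poses can be sketched as follows (geodesic interpolation on SO(3) with linear interpolation of position; an illustration of the idea, not the mc-VINS formulation):

```python
import numpy as np

def so3_exp(w):
    """Rodrigues formula: rotation vector -> rotation matrix."""
    th = np.linalg.norm(w)
    if th < 1e-9:
        return np.eye(3)
    k = w / th
    K = np.array([[0, -k[2], k[1]], [k[2], 0, -k[0]], [-k[1], k[0], 0]])
    return np.eye(3) + np.sin(th) * K + (1 - np.cos(th)) * (K @ K)

def so3_log(R):
    """Inverse of so3_exp: rotation matrix -> rotation vector."""
    th = np.arccos(np.clip((np.trace(R) - 1) / 2, -1, 1))
    if th < 1e-9:
        return np.zeros(3)
    return th / (2 * np.sin(th)) * np.array(
        [R[2, 1] - R[1, 2], R[0, 2] - R[2, 0], R[1, 0] - R[0, 1]])

def interpolate_pose(R0, p0, R1, p1, lam):
    """Pose at a camera timestamp between two bounding cloned poses,
    lam in [0, 1]: geodesic on SO(3), linear on R^3."""
    R = R0 @ so3_exp(lam * so3_log(R0.T @ R1))
    p = (1 - lam) * p0 + lam * p1
    return R, p

R0, p0 = np.eye(3), np.zeros(3)
R1 = so3_exp(np.array([0.0, 0.0, np.pi / 2]))   # 90 deg yaw
p1 = np.array([2.0, 0.0, 0.0])
R, p = interpolate_pose(R0, p0, R1, p1, 0.5)    # halfway: 45 deg yaw, x = 1
```

Because the interpolated pose is a deterministic function of the two bounding clones, its measurement Jacobians chain back onto those clones, so no extra states are needed per camera.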

In this letter, we develop a low-cost stereo visual-inertial localization system, which leverages efficient multi-state constraint Kalman filter (MSCKF)-based visual-inertial odometry (VIO) while utilizing an a priori LiDAR map to provide bounded-error three-dimensional navigation. Besides the standard sparse visual feature measurements used in VIO, the global registrations of semi-dense point clouds to the prior map are also...

10.1109/lra.2019.2927123 article EN IEEE Robotics and Automation Letters 2019-07-07

In this letter, we present a novel method to perform target tracking of a moving rigid body utilizing an inertial measurement unit with cameras. A key contribution is the tight coupling of the target motion estimation within the visual-inertial navigation system (VINS), allowing for improved performance of both processes. In particular, we build upon the standard multi-state constraint Kalman filter (MSCKF)-based VINS and generalize it to incorporate three-dimensional (3-D) target tracking. Rather than representing the object as a point particle...

10.1109/lra.2019.2896472 article EN IEEE Robotics and Automation Letters 2019-01-30

In this paper, we introduce a novel visual-inertial-wheel odometry (VIWO) system for ground vehicles, which efficiently fuses multi-modal visual, inertial and 2D wheel measurements in a sliding-window filtering fashion. As multi-sensor fusion requires both intrinsic and extrinsic (spatiotemporal) calibration parameters, which may vary over time during terrain navigation, we propose to perform VIWO along with online sensor calibration of the wheel encoders' intrinsic parameters. To this end, we analytically derive the wheel odometry measurement model from the raw...

10.1109/iros45743.2020.9341161 article EN 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2020-10-24
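
A simple differential-drive odometry model shows the kind of intrinsic parameters (wheel radii, baseline) such a system would calibrate online; this is an illustrative textbook model, not the measurement model derived in the paper:

```python
def wheel_odometry(omega_l, omega_r, r_l, r_r, b):
    """Differential-drive odometry: map raw left/right wheel angular
    rates [rad/s] to body-frame forward and yaw velocity. The wheel
    radii (r_l, r_r) and baseline b are the intrinsic parameters an
    online calibrator would refine."""
    v_l, v_r = r_l * omega_l, r_r * omega_r   # wheel contact speeds [m/s]
    v = 0.5 * (v_l + v_r)                     # forward velocity [m/s]
    w = (v_r - v_l) / b                       # yaw rate [rad/s]
    return v, w

v, w = wheel_odometry(10.0, 10.0, 0.1, 0.1, 0.5)
print(v, w)   # 1.0 0.0 -> equal wheel rates give straight-line motion
```

Small errors in the radii or baseline bias v and w systematically, which is why treating them as time-varying calibration states pays off on rough terrain.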

As sensor calibration plays an important role in visual-inertial fusion, this article performs an in-depth investigation of online self-calibration for robust and accurate state estimation. To this end, we first conduct a complete observability analysis for visual-inertial navigation systems (VINS) with full sensing parameters, including inertial measurement unit (IMU)/camera intrinsics and IMU-camera spatial-temporal extrinsic calibration, along with the readout time of rolling shutter (RS) cameras (if used). We study different model...

10.1109/tro.2023.3275878 article EN IEEE Transactions on Robotics 2023-06-07

It holds great implications for practical applications to enable centimeter-accuracy positioning for mobile and wearable sensor systems. In this paper, we propose a novel, high-precision, efficient visual-inertial (VI)-SLAM algorithm, termed Schmidt-EKF VI-SLAM (SEVIS), which optimally fuses IMU measurements and monocular images in a tightly-coupled manner to provide 3D motion tracking with bounded error. In particular, we adapt the Schmidt Kalman filter formulation to selectively include informative features...

10.1109/cvpr.2019.01238 article EN 2019 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 2019-06-01
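
The Schmidt-Kalman idea is to compute the standard EKF gain but zero the rows belonging to "nuisance" states, so those states are never corrected yet their cross-covariances stay consistent. A generic sketch (the state layout [active | nuisance] and dimensions are assumed, not SEVIS code):

```python
import numpy as np

def schmidt_update(x, P, H, R, z, n_active):
    """Schmidt-Kalman ("consider") update: compute the standard Kalman
    gain, then zero its rows for the nuisance tail of the state so those
    entries are never corrected, while keeping their cross-covariances
    consistently tracked."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    K[n_active:, :] = 0.0                  # do not update nuisance states
    x = x + K @ (z - H @ x)
    I_KH = np.eye(len(x)) - K @ H
    P = I_KH @ P @ I_KH.T + K @ R @ K.T    # valid for any suboptimal gain
    return x, P

x = np.zeros(2)
P = np.array([[1.0, 0.5], [0.5, 1.0]])
H = np.array([[1.0, 0.0]])                 # measurement of the active state
x, P = schmidt_update(x, P, H, np.array([[0.1]]), np.array([0.0]), 1)
print(np.round(P, 4))   # nuisance variance P[1, 1] stays at 1.0
```

Skipping the nuisance correction turns the quadratic-cost covariance update into a linear-cost one in the number of considered states, which is what makes keeping many map features affordable.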

This paper addresses the problem of visual-inertial self-calibration while focusing on the necessity of online IMU intrinsic calibration. To this end, we perform observability analysis for visual-inertial navigation systems (VINS) with four different inertial model variants containing intrinsic parameters that encompass those commonly used for low-cost sensors. The analysis theoretically confirms what is intuitively believed in the literature, that is, the IMU intrinsics are observable given fully-excited 6-axis motion. Moreover, we, for the first...

10.15607/rss.2020.xvi.026 article EN 2020-06-30

A navigation system which can output drift-free global trajectory estimation with local consistency holds great potential for autonomous vehicles and mobile devices. We propose a tightly-coupled GNSS-aided visual-inertial navigation system (GAINS) which is able to leverage the complementary sensing modalities of a visual-inertial sensor pair, which provides high-frequency local information, and a Global Navigation Satellite System (GNSS) receiver with low-frequency global observations. Specifically, the raw GNSS measurements (including pseudorange, carrier phase changes,...

10.1109/icra46639.2022.9811362 article EN 2022 International Conference on Robotics and Automation (ICRA) 2022-05-23

In this paper, we present a real-time multi-IMU visual-inertial navigation system (mi-VINS) that utilizes the information from multiple inertial measurement units (IMUs) and thus is resilient to IMU sensor failures. In particular, in the proposed mi-VINS formulation, one of the IMUs serves as the "base" of the system, while the rest act as auxiliary sensors aiding the state estimation. A key advantage of this architecture is the ability to seamlessly "promote" an auxiliary IMU to be the new base, for example, upon detection of a base IMU failure, thus avoiding a single point of failure...

10.1109/icra.2019.8794295 article EN 2019 International Conference on Robotics and Automation (ICRA) 2019-05-01

Enabling real-time visual-inertial navigation in unknown environments while achieving bounded-error performance holds great potential for robotic applications. To this end, in this paper, we propose a novel linear-complexity EKF for visual-inertial localization, which can efficiently utilize loop closure constraints, thus allowing for long-term persistent navigation. The key idea is to adapt the Schmidt-Kalman formulation within the multi-state constraint Kalman filter (MSCKF) framework, to selectively include keyframes as...

10.1109/icra.2019.8793836 article EN 2019 International Conference on Robotics and Automation (ICRA) 2019-05-01

Recent advancements in the performance and affordability of cameras and inertial measurement units (IMUs) have caused demand for efficient, accurate visual-inertial navigation solutions. In this paper, we present a system for the fusion of preintegrated inertial measurements with highly informative direct alignment of images. In particular, our preintegration theory is based on closed-form solutions of the continuous-time IMU kinematic model, instead of discrete time approximations. This allows for more accurate computation of the preintegrated measurements and their uncertainty, as well as the bias...

10.1109/icra.2017.7989171 article EN 2017-05-01

System modeling and parameter identification of micro aerial vehicles (MAV) are crucial for robust autonomy, especially under highly dynamic motions. Visual-inertial-aided online identification has recently seen research attention due to the demand for adaptation to platform configuration changes with minimal onboard sensor requirements. To this end, we design an MAV system identification algorithm to tightly fuse visual, inertial and aerodynamic information within a lightweight multi-state constraint Kalman filter (MSCKF)...

10.1109/iros47612.2022.9982263 article EN 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) 2022-10-23

State-of-the-art monocular visual-inertial odometry (VIO) approaches rely on sparse point features, in part due to their efficiency, robustness, and prevalence, while ignoring high-level structural regularities such as planes that are common in man-made environments and can be exploited to further constrain motion. Generally, planes are observed by a camera for significant periods of time due to their large spatial presence and are, thus, amenable to long-term navigation. Therefore, in this paper, we design a novel real-time VIO system which is...

10.1109/icra48891.2023.10160620 article EN 2023-05-29