EgoLocate: Real-time Motion Capture, Localization, and Mapping with Sparse Body-mounted Sensors
Motion Capture
Monocular
Feature (computer vision)
DOI:
10.1145/3592099
Publication Date:
2023-07-26T15:47:45Z
AUTHORS (7)
ABSTRACT
Human and environment sensing are two important topics in Computer Vision and Graphics. Human motion is often captured by inertial sensors, while the environment is mostly reconstructed using cameras. We integrate the two techniques together in EgoLocate, a system that simultaneously performs human motion capture (mocap), localization, and mapping in real time from sparse body-mounted sensors, including 6 inertial measurement units (IMUs) and a monocular phone camera. On one hand, inertial mocap suffers from large translation drift due to the lack of a global positioning signal. EgoLocate leverages image-based simultaneous localization and mapping (SLAM) techniques to locate the human in the reconstructed scene. On the other hand, SLAM often fails when the visual features are poor. EgoLocate involves mocap to provide a strong prior for the camera motion. Experiments show that localization, a key challenge in both fields, is largely improved by our technique compared with the state of the art of the two fields. Our codes are available for research at https://xinyu-yi.github.io/EgoLocate/.
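The complementarity the abstract describes can be illustrated with a toy 1-D sketch: pure inertial dead reckoning accumulates translation drift, while occasional absolute position fixes (standing in for SLAM localization) keep the error bounded. This is not the paper's algorithm; the drift magnitude, fix rate, and blending weight below are illustrative assumptions.

```python
# Toy illustration (not EgoLocate's actual method): fusing a drifting
# inertial position estimate with occasional absolute fixes.

def dead_reckon(true_positions, drift_per_step):
    """Integrate per-step motion with a constant bias, mimicking how
    pure IMU translation estimation accumulates error over time."""
    est = [true_positions[0]]
    for i in range(1, len(true_positions)):
        step = true_positions[i] - true_positions[i - 1]
        est.append(est[-1] + step + drift_per_step)
    return est

def fuse_with_fixes(true_positions, drift_per_step, fix_every, alpha=0.8):
    """Same integration, but every `fix_every` steps blend in an
    absolute position fix (a stand-in for SLAM localization)."""
    est = [true_positions[0]]
    for i in range(1, len(true_positions)):
        step = true_positions[i] - true_positions[i - 1]
        pred = est[-1] + step + drift_per_step
        if i % fix_every == 0:
            # Complementary-filter-style pull toward the absolute fix.
            pred = (1 - alpha) * pred + alpha * true_positions[i]
        est.append(pred)
    return est

truth = [0.1 * i for i in range(101)]  # 1-D ground-truth trajectory
imu_only = dead_reckon(truth, drift_per_step=0.02)
fused = fuse_with_fixes(truth, drift_per_step=0.02, fix_every=10)

err_imu = abs(imu_only[-1] - truth[-1])
err_fused = abs(fused[-1] - truth[-1])
print(f"final error, IMU only:   {err_imu:.2f} m")
print(f"final error, with fixes: {err_fused:.2f} m")
```

The drift-only estimate ends roughly 2 m off after 100 steps, while the periodically corrected one stays within a few centimeters, mirroring the paper's point that image-based localization bounds inertial translation drift.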
REFERENCES (70)
CITATIONS (34)