SurgPose: a Dataset for Articulated Robotic Surgical Tool Pose Estimation and Tracking

DOI: 10.48550/arxiv.2502.11534 Publication Date: 2025-02-17
ABSTRACT
Accurate and efficient surgical robotic tool pose estimation is of fundamental significance to downstream applications such as augmented reality (AR) in surgical training and learning-based autonomous manipulation. While significant advancements have been made in pose estimation for humans and animals, it remains a challenge in surgical robotics due to the scarcity of published data. The relatively large absolute error of the da Vinci end-effector kinematics and the arduous calibration procedure make calibrated data collection expensive. Driven by this limitation, we collected a dataset, dubbed SurgPose, providing instance-aware semantic keypoints and skeletons for visual tracking. By marking keypoints using ultraviolet (UV) reactive paint, which is invisible under white light but fluorescent under UV light, we execute the same trajectory under different lighting conditions to collect raw videos and keypoint annotations, respectively. The SurgPose dataset consists of approximately 120k instrument instances (80k for training and 40k for validation) across 6 categories. Each instance is labeled with 7 keypoints. Since the videos are stereo pairs, the 2D keypoints can be lifted to 3D based on stereo-matching depth. In addition to releasing the dataset, we test a few baseline approaches for tracking to demonstrate the utility of SurgPose. More details can be found at surgpose.github.io.
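The abstract notes that 2D keypoints can be lifted to 3D using stereo-matching depth. As a minimal sketch of that step, the snippet below back-projects 2D keypoints in the left image to 3D camera coordinates from per-keypoint disparity, assuming a rectified stereo pair with a standard pinhole model. All names and parameter values here (focal lengths, principal point, baseline) are illustrative assumptions, not values taken from the dataset.

```python
import numpy as np

def lift_keypoints_to_3d(keypoints_2d, disparity, fx, fy, cx, cy, baseline):
    """Lift 2D keypoints (u, v) in the left rectified image to 3D camera
    coordinates using stereo disparity:
        Z = fx * baseline / d
        X = (u - cx) * Z / fx
        Y = (v - cy) * Z / fy
    Keypoints with non-positive disparity are mapped to NaN.
    (Illustrative sketch; intrinsics and baseline are assumed inputs.)
    """
    kps = np.asarray(keypoints_2d, dtype=float)  # shape (N, 2)
    d = np.asarray(disparity, dtype=float)       # shape (N,)
    safe_d = np.where(d > 0, d, 1.0)             # avoid division by zero
    z = np.where(d > 0, fx * baseline / safe_d, np.nan)
    x = (kps[:, 0] - cx) * z / fx
    y = (kps[:, 1] - cy) * z / fy
    return np.stack([x, y, z], axis=1)           # shape (N, 3)

# Example with made-up calibration values: a keypoint at the principal
# point with 50 px disparity, fx = 1000 px, baseline = 5 mm.
pts = lift_keypoints_to_3d([[320.0, 240.0]], [50.0],
                           fx=1000.0, fy=1000.0,
                           cx=320.0, cy=240.0, baseline=0.005)
print(pts)  # the keypoint lies on the optical axis at Z = 0.1 m
```

In practice the per-keypoint disparity would come from a stereo-matching algorithm applied to the released stereo pairs; the formula itself is the standard rectified-stereo triangulation.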