IOSG: Image-driven Object Searching and Grasping
DOI:
10.48550/arXiv.2308.05821
Publication Date:
2023-01-01
AUTHORS (4)
ABSTRACT
When robots retrieve specific objects from cluttered scenes, such as home and warehouse environments, the target objects are often partially occluded or completely hidden. Robots are thus required to search for, identify, and successfully grasp a target object. Preceding works have relied on pre-trained object recognition or segmentation models to find the target object. However, such methods require laborious manual annotations to train and can even fail on novel objects. In this paper, we propose an Image-driven Object Searching and Grasping (IOSG) approach in which the robot is provided with a reference image of the object it is tasked to retrieve. We design a Target Similarity Network that generates a probability map to infer the location of the target. IOSG learns a hierarchical policy: the high-level policy predicts the subtask type, whereas the low-level policies, explorer and coordinator, generate effective push actions. The explorer is responsible for searching for the target when it is hidden by other objects. Once the target is found, the coordinator conducts target-oriented pushing and grasping in clutter. The proposed pipeline is trained with full self-supervision in simulation and applied to a real environment. Our model achieves 96.0% and 94.5% task success rates on the coordination and exploration tasks respectively, and 85.0% on the search-and-grasp task.
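The abstract describes a hierarchical structure: a Target Similarity Network produces a probability map locating the target from a reference image, a high-level policy selects the subtask, and the low-level explorer or coordinator policy generates push (and grasp) actions. Below is a minimal sketch of how such a structure could be wired together, assuming PyTorch and hypothetical module names (TargetSimilarityNet, PushPolicy, select_subtask), input shapes, and a similarity threshold; it illustrates the described architecture and is not the authors' implementation.

import torch
import torch.nn as nn


class TargetSimilarityNet(nn.Module):
    """Compares the scene observation with the reference target image and
    outputs a per-pixel probability map of where the target may be located."""

    def __init__(self, feat_dim: int = 32):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, feat_dim, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, feat_dim, 3, padding=1), nn.ReLU(),
        )

    def forward(self, scene: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        scene_f = self.encoder(scene)                       # (B, C, H, W)
        target_f = self.encoder(target).mean(dim=(2, 3))    # (B, C) global descriptor
        sim = (scene_f * target_f[..., None, None]).sum(1)  # dot-product similarity
        return torch.sigmoid(sim)                           # (B, H, W) probability map


class PushPolicy(nn.Module):
    """Low-level policy (explorer or coordinator): maps the scene plus the
    similarity map to a dense action-value map over push/grasp locations."""

    def __init__(self):
        super().__init__()
        self.head = nn.Sequential(
            nn.Conv2d(4, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 1),
        )

    def forward(self, scene: torch.Tensor, sim_map: torch.Tensor) -> torch.Tensor:
        x = torch.cat([scene, sim_map.unsqueeze(1)], dim=1)  # stack RGB + similarity
        return self.head(x).squeeze(1)                       # (B, H, W) action values


def select_subtask(sim_map: torch.Tensor, threshold: float = 0.5) -> str:
    """Stand-in for the high-level policy: if no confident target location is
    found, explore (target likely hidden); otherwise coordinate push-and-grasp."""
    return "coordinator" if sim_map.max() > threshold else "explorer"


if __name__ == "__main__":
    scene = torch.rand(1, 3, 224, 224)    # scene observation (shape assumed)
    target = torch.rand(1, 3, 224, 224)   # reference image of the target object

    sim_net, explorer, coordinator = TargetSimilarityNet(), PushPolicy(), PushPolicy()
    sim_map = sim_net(scene, target)
    subtask = select_subtask(sim_map)
    policy = coordinator if subtask == "coordinator" else explorer
    action_values = policy(scene, sim_map)
    print(subtask, action_values.shape)

In this sketch the subtask decision is a simple threshold on the similarity map; in the paper the high-level policy is itself learned, and both low-level policies are trained with self-supervision in simulation.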