Haixin Yu

ORCID: 0000-0002-7244-6031
Research Areas
  • Soft Robotics and Applications
  • Robot Manipulation and Learning
  • Tactile and Sensory Interactions
  • Robotics and Sensor-Based Localization
  • Face Recognition and Analysis
  • Advanced Sensor and Energy Harvesting Materials
  • Augmented Reality Applications
  • 3D Shape Modeling and Analysis
  • Teleoperation and Haptic Systems
  • 3D Surveying and Cultural Heritage

Tsinghua–Berkeley Shenzhen Institute
2022-2024

University Town of Shenzhen
2024

Tsinghua University
2023-2024

Toronto Metropolitan University
2022

Shanghai University
2022

The grasping of transparent objects is challenging but significant to robots. In this article, a visual–tactile fusion framework for transparent object grasping in complex backgrounds is proposed, which synergizes the advantages of vision and touch and greatly improves the grasping efficiency for transparent objects. First, we propose a multiscene synthetic dataset named SimTrans12K together with a Gaussian-mask annotation method. Next, based on the TaTa gripper, a convolutional neural network for grasping-position detection is proposed, which shows good performance in both...
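The Gaussian-mask annotation mentioned in the abstract can be illustrated with a minimal sketch: instead of a single hard pixel label, the grasp position is encoded as a 2D Gaussian heatmap that peaks at the annotated point. The function name, image size, and sigma below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def gaussian_mask(h, w, cx, cy, sigma=5.0):
    """Soft grasp-point label: a 2D Gaussian heatmap of shape (h, w)
    peaking at pixel (cx, cy), values in [0, 1]."""
    ys, xs = np.mgrid[0:h, 0:w]
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2.0 * sigma ** 2))

# Example: 64x64 label centered at column 32, row 20.
mask = gaussian_mask(64, 64, 32, 20)
```

A soft label of this kind tolerates small annotation errors and gives a detection network a smooth regression target rather than a sparse one.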

10.1109/tro.2023.3286071 article EN IEEE Transactions on Robotics 2023-07-06

Transparent objects are a common part of daily life, but their unique optical properties make estimating their 6D pose a challenging task. In this letter, we propose TGF-Net, a monocular instance-level pose estimation method for transparent objects based on geometric fusion. TGF-Net learns edge features and surface fragments as intermediate representations and reduces the influence of appearance changes by fusing rich geometric features in the network. Moreover, an approach for generating high-fidelity, large-scale synthetic datasets using Blender is proposed, which we use to generate...

10.1109/lra.2023.3268041 article EN IEEE Robotics and Automation Letters 2023-04-17

Humans can feel and grasp efficiently in the dark through tactile feedback, whereas this remains a challenging task for robots. In this research, we create a novel soft gripper named JamTac, which has high-resolution perception, a large detection surface, and an integrated sensing-grasping capability that can search low-visibility environments. The gripper combines granular jamming and visuotactile perception technologies. Using the principle of refractive index matching, a refraction-free liquid-particle rationing scheme...
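The refractive-index-matching principle above can be checked numerically with Snell's law: when the liquid and the jamming particles share the same refractive index, a ray crosses their interface without bending, so the particles become optically invisible to the internal camera. A minimal sketch (the index values are illustrative, not taken from the paper):

```python
import math

def refraction_angle(theta_in_deg, n1, n2):
    """Snell's law, n1*sin(theta1) = n2*sin(theta2): return the
    refraction angle (degrees) for a ray hitting the interface at
    theta_in_deg, going from index n1 into index n2."""
    s = n1 * math.sin(math.radians(theta_in_deg)) / n2
    return math.degrees(math.asin(s))

# Matched indices: the ray passes through undeviated.
matched = refraction_angle(30.0, 1.47, 1.47)
# Mismatched indices (e.g. air into glass): the ray bends toward the normal.
bent = refraction_angle(30.0, 1.0, 1.5)
```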

10.1089/soro.2022.0134 article EN Soft Robotics 2023-06-05

Transparent objects are common in daily life, while their optical properties pose challenges for RGB-D cameras to capture accurate depth information. This issue is further amplified when these objects are hand-held, as hand occlusions complicate depth estimation. For assistant robots, however, accurately perceiving hand-held transparent objects is critical for effective human-robot interaction. This paper presents a Hand-Aware Depth Restoration (HADR) method based on creating an implicit neural representation function from...
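The abstract is truncated before the details of the implicit neural representation, so as a general illustration only: such representations typically map pixel coordinates through a Fourier-feature encoding before a small network regresses the quantity of interest (here, depth). A sketch of that standard ingredient, with the band count chosen arbitrarily:

```python
import numpy as np

def fourier_features(coords, num_bands=4):
    """Encode coordinates in [0, 1] with sin/cos at octave frequencies,
    the usual input encoding for implicit neural representations.
    Input (..., 2) -> output (..., 2 * 2 * num_bands)."""
    coords = np.asarray(coords, dtype=np.float64)
    freqs = (2.0 ** np.arange(num_bands)) * np.pi   # (num_bands,)
    ang = coords[..., None] * freqs                  # (..., 2, num_bands)
    enc = np.concatenate([np.sin(ang), np.cos(ang)], axis=-1)
    return enc.reshape(*coords.shape[:-1], -1)

# A 2D pixel coordinate becomes a 16-dim feature vector (4 bands).
feat = fourier_features([0.25, 0.75])
```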

10.48550/arxiv.2408.14997 preprint EN arXiv (Cornell University) 2024-08-27

The accurate detection and grasping of transparent objects are challenging but significant to robots. Here, a visual-tactile fusion framework for transparent object grasping under complex backgrounds and variant light conditions is proposed, including grasping position detection, tactile calibration, and visual-tactile based classification. First, a multi-scene synthetic dataset generation method with Gaussian distribution based data annotation is proposed. Besides, a novel network named TGCNN is proposed, showing good results in both simulation and real scenes. In...

10.48550/arxiv.2211.16693 preprint EN other-oa arXiv (Cornell University) 2022-01-01