Grasping accuracy is a critical prerequisite for precise object manipulation, which often requires careful alignment between the robot hand and the object. Neural Descriptor Fields (NDFs) offer a promising vision-based method for generating grasping poses that generalize across object categories. However, NDFs alone can produce inaccurate poses due to imperfect camera calibration, incomplete point clouds, and object variability. In contrast, tactile sensing enables more precise contact, but existing approaches limit policies to simple, predefined contact geometries.
In this work, we introduce NeuralTouch, a multimodal framework that integrates NDFs and tactile sensing to enable accurate, generalizable grasping through gentle physical interaction. Our approach leverages NDFs to implicitly represent the target contact geometry, from which a deep reinforcement learning policy is trained to refine the grasp using tactile feedback. The policy is conditioned on the neural descriptors and requires no explicit specification of contact types. We validate NeuralTouch through ablation studies in simulation and through zero-shot transfer to real-world manipulation tasks such as peg-out/in-hole and bottle-lid opening. Our results show that NeuralTouch significantly improves grasping accuracy and robustness over baseline methods.
Overview of NeuralTouch: In simulation, we first pre-train an occupancy network, the core component of the Neural Pose Descriptor Field (NPDF). Second, we collect human demonstrations along with object point clouds and robot target grasping-pose descriptors for each manipulation task. Third, we train an RL policy with tactile and proprioceptive feedback to achieve fine grasping poses implicitly specified by these collected descriptors. With the NPDF and a well-trained policy, our system is deployed directly in the real world with a real-to-sim tactile transfer to accurately grasp unseen objects, executing manipulation tasks such as unplugging a bolt-like USB and inserting it into a socket.
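The descriptor-conditioned policy in the third stage can be sketched as a network whose observation concatenates tactile features, proprioception, and the target-pose descriptor, so no explicit contact-type label is needed. The sketch below is illustrative only: the dimensions, network sizes, and the `mlp_policy` helper are assumptions, not the paper's actual architecture.

```python
import numpy as np

# Hypothetical dimensions; the paper does not specify these.
TACTILE_DIM, PROPRIO_DIM, DESCRIPTOR_DIM, ACTION_DIM = 64, 7, 32, 6

rng = np.random.default_rng(0)

def init_weights(sizes):
    """Small random weights and zero biases for each layer."""
    return [(rng.normal(0, 0.1, (m, n)), np.zeros(n))
            for m, n in zip(sizes[:-1], sizes[1:])]

def mlp_policy(tactile, proprio, descriptor, weights):
    """One forward pass of a descriptor-conditioned MLP policy.

    The observation is the concatenation of tactile features,
    proprioception, and the target grasping-pose descriptor;
    the output is a bounded pose-correction action.
    """
    x = np.concatenate([tactile, proprio, descriptor])
    for w, b in weights[:-1]:
        x = np.tanh(x @ w + b)      # hidden layers
    w, b = weights[-1]
    return np.tanh(x @ w + b)       # tanh keeps the action in (-1, 1)

obs_dim = TACTILE_DIM + PROPRIO_DIM + DESCRIPTOR_DIM
weights = init_weights([obs_dim, 128, 128, ACTION_DIM])

action = mlp_policy(rng.normal(size=TACTILE_DIM),
                    rng.normal(size=PROPRIO_DIM),
                    rng.normal(size=DESCRIPTOR_DIM),
                    weights)
print(action.shape)  # (6,)
```

In an actual RL loop, the descriptor would stay fixed per episode while tactile and proprioceptive inputs update each step, and the weights would be trained by the RL algorithm rather than initialized randomly.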
Simulation-only overview of NeuralTouch across the paper's object categories and target features, illustrating how the descriptor-conditioned tactile policy operates before transferring to real-world tasks.
Real-world experiments with different objects, including a ketchup bottle, a green bolt, and an adaptor plug, demonstrating that NeuralTouch maintains accurate grasp acquisition and robust manipulation across varied geometries.
A diverse set of bottle lids with varying geometries and surface textures used to evaluate the generalization ability of NeuralTouch in real-world bottle-cap opening.
Real-world NeuralTouch demos on varied bottle-cap geometries, demonstrating generalization across cap shapes and surface textures with robust contact correction and accurate lid grasping. Sample numbers refer to the lid numbers listed in the top figure in this section.
Comparison videos for variously shaped bottle lids showing the performance of NeuralTouch (ours), NDF, NDF+TOUCH, C2FIL, and C2FIL+TOUCH on the same real-world task setup.
The real-time performance and instant recovery of the tactile robot under perturbations demonstrate the robustness of NeuralTouch and its applicability to tasks with persistent disturbances. Real tactile image streams are shown on the PC in the first and third columns, while the simulated tactile image streams are shown in the second and fourth columns.
SimShear: Sim-to-Real Shear-based Tactile Servoing
Kipp McAdam Freud*, Yijiong Lin*, Nathan F. Lepora
Conference on Robot Learning (CoRL) 2025
Text2Touch: Tactile In-Hand Manipulation with LLM-Designed Reward Functions
Harrison Field, Max Yang, Yijiong Lin, Efi Psomopoulou, David Barton, Nathan F. Lepora
Conference on Robot Learning (CoRL) 2025
Tactile SoftHand-A: 3D-Printed, Tactile, Highly-Underactuated, Anthropomorphic Robot Hand with an Antagonistic Tendon Mechanism
Haoran Li, Christopher J Ford, Chenghua Lu, Yijiong Lin, Matteo Bianchi, Manuel G Catalano, Efi Psomopoulou, Nathan F Lepora
The International Journal of Robotics Research (IJRR) 2024
AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch
Max Yang, Chenghua Lu, Alex Church, Yijiong Lin, Chris Ford, Haoran Li, Efi Psomopoulou, David A.W. Barton, Nathan F. Lepora
Conference on Robot Learning (CoRL) 2024
Bi-Touch: Bimanual Tactile Manipulation With Sim-to-Real Deep Reinforcement Learning
Yijiong Lin, Alex Church, Max Yang, Haoran Li, John Lloyd, Dandan Zhang, Nathan F. Lepora
IEEE Robotics and Automation Letters (RA-L) 2023
Tactile Gym 2.0: Sim-to-real Deep Reinforcement Learning for Comparing Low-cost High-Resolution Robot Touch
Yijiong Lin, John Lloyd, Alex Church, Nathan F. Lepora
IEEE Robotics and Automation Letters (RA-L) 2022
Neural Descriptor Fields: SE(3)-Equivariant Object Representations for Manipulation
Anthony Simeonov, Yilun Du, Andrea Tagliasacchi, Joshua B. Tenenbaum, Alberto Rodriguez, Pulkit Agrawal, Vincent Sitzmann
International Conference on Robotics and Automation (ICRA) 2022
Coarse-to-Fine Imitation Learning: Robot Manipulation from a Single Demonstration
Edward Johns
International Conference on Robotics and Automation (ICRA) 2021
@article{lin2026neuraltouch,
title = {NeuralTouch: Neural Descriptors for Precise Sim-to-Real Tactile Robot Control},
author = {Lin, Yijiong and Deng, Bowen and Pu, Keju and Lu, Chenghua and Yang, Max and Psomopoulou, Efi and Lepora, Nathan F.},
journal = {To be added},
year = {2026}
}