SimShear: Sim-to-Real Shear-based Tactile Servoing

University of Bristol
(*These authors contributed equally.)
9th Conference on Robot Learning (CoRL) 2025

Boosting real-world shear-oriented robot tasks via synthetic shear-conditioned tactile images.

Abstract

We present SimShear, a sim-to-real pipeline for tactile control that enables the use of shear information without explicitly modeling shear dynamics in simulation. Shear, which arises from lateral movement across contact surfaces, is critical for tasks involving dynamic object interactions but is challenging to simulate. We introduce shPix2pix, a shear-conditioned U-Net GAN that transforms simulated tactile images lacking shear, together with a vector encoding the shear information, into realistic equivalents that contain shear deformations, and show that it outperforms baseline pix2pix methods on tactile image generation and pose/shear prediction. We apply this pipeline to two control tasks using a pair of low-cost desktop robotic arms equipped with vision-based tactile sensors: first, a tactile tracking task, where a follower arm tracks a surface moved by a leader arm; second, a collaborative co-lift task, where both arms jointly hold an object while the leader arm moves along a prescribed trajectory. Our method maintains contact errors within 1-2 mm across varied trajectories in which shear sensing is essential to task performance. This work validates sim-to-real shear modeling with rigid-body simulators, opening new possibilities for simulation in tactile robotics.

Shear-based Sim-to-Real Pipeline for Tactile Robotics


We introduce shPix2pix: a conditional U-Net GAN architecture that incorporates shear information into simulated tactile sensor images for image-to-image translation. By modeling deformations due to lateral displacements, shPix2pix enables the generation of realistic tactile images that contain shear deformations not modeled by our rigid-body simulator.
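The page does not spell out how the shear vector is injected into the generator; a common conditioning scheme for image-to-image GANs, sketched below under that assumption, is to tile each shear component into a full-resolution plane and concatenate those planes to the simulated image as extra input channels before the U-Net's first convolution. The function name, image size, and 2-D shear vector here are illustrative, not from the paper.

```python
import numpy as np

def condition_on_shear(sim_image, shear_vec):
    """Concatenate a shear vector to a simulated tactile image as
    constant feature channels (illustrative sketch of vector
    conditioning for an image-to-image generator)."""
    h, w, _ = sim_image.shape
    # Tile each shear component into a full-resolution plane.
    shear_planes = np.broadcast_to(
        shear_vec.reshape(1, 1, -1), (h, w, shear_vec.size)
    )
    # Stack image channels and shear channels along the channel axis;
    # the result would feed the U-Net generator's first convolution.
    return np.concatenate([sim_image, shear_planes], axis=-1)

# Example: a 128x128 single-channel simulated tactile image plus a
# hypothetical 2-D lateral shear displacement vector.
img = np.zeros((128, 128, 1), dtype=np.float32)
shear = np.array([0.8, -0.3], dtype=np.float32)
conditioned = condition_on_shear(img, shear)
print(conditioned.shape)  # (128, 128, 3)
```

Channel-wise concatenation keeps the conditioning signal spatially uniform, so the generator can modulate its predicted deformation everywhere in the image based on the applied shear.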

Exp. 1: Object Tracking (Circle and Square Traj.)

Exp. 1: Object Tracking (Spiral and Star Loop Traj.)

Exp. 2: Rigid-Object (Prism and Egg) Co-Lifting (Flower Traj.)

Exp. 2: Soft-Object (Brain and Duck) Co-Lifting (Flower Traj.)

Exp. 2: Rigid-Object (Prism and Egg) Co-Lifting (Star Traj.)

Exp. 2: Soft-Object (Brain and Duck) Co-Lifting (Star Traj.)

Related Articles

Text2Touch: Tactile In-Hand Manipulation with LLM-Designed Reward Functions

Harrison Field, Max Yang, Yijiong Lin, Efi Psomopoulou, David Barton, Nathan F. Lepora

Conference on Robot Learning (CoRL) 2025

Project Page

Bi-Touch: Bimanual Tactile Manipulation With Sim-to-Real Deep Reinforcement Learning

Yijiong Lin, Alex Church, Max Yang, Haoran Li, John Lloyd, Dandan Zhang, and Nathan F. Lepora

IEEE Robotics and Automation Letters (RA-L) 2024

Paper / arXiv / Project Page

AnyRotate: Gravity Invariant In-Hand Object Rotation with Sim-to-Real Touch

Max Yang, Chenghua Lu, Alex Church, Yijiong Lin, Chris Ford, Haoran Li, Efi Psomopoulou, David A.W. Barton, Nathan F. Lepora

Conference on Robot Learning (CoRL) 2024

Paper / arXiv / Project Page

Tactile Gym 2.0: Sim-to-Real Deep Reinforcement Learning for Comparing Low-Cost High-Resolution Robot Touch

Yijiong Lin, John Lloyd, Alex Church, Nathan F. Lepora

IEEE Robotics and Automation Letters (RA-L) 2022

Paper / arXiv / Project Page