FeelAnyForce:
Estimating Contact Force Feedback
from Tactile Sensation for Vision-Based Tactile Sensors

Amir-Hossein Shahidzadeh*
Gabriele Caddeo*
Koushik Alapati
Lorenzo Natale
Cornelia Fermüller
Yiannis Aloimonos

University of Maryland, College Park
Istituto Italiano di Tecnologia



TL;DR: We accurately estimate 3D contact force along with contact geometry using tactile sensors

Paper
arXiv
Demo Video
Code (Coming Soon)
BibTeX


Abstract

In this paper, we tackle the problem of estimating 3D contact forces using vision-based tactile sensors. In particular, our goal is to estimate contact forces over a large range (up to 15 N) on arbitrary objects while generalizing across different vision-based tactile sensors. To this end, we collected a dataset of over 200K indentations using a robotic arm that pressed various indenters onto a GelSight Mini sensor mounted on a force sensor, and then used the data to train a multi-head transformer for force regression. Strong generalization is achieved through accurate data collection and multi-objective optimization that leverages depth contact images. Despite being trained only on primitive shapes and textures, the regressor achieves a mean absolute error of 4% on a dataset of unseen real-world objects. We further evaluate our approach's generalization capability on other GelSight Mini and DIGIT sensors, and propose a reproducible calibration procedure for adapting the pre-trained model to other vision-based sensors. Finally, we evaluate the method on real-world tasks, including weighing objects and controlling the deformation of delicate objects, both of which rely on accurate force feedback.



Full Video


Dataset

We have meticulously collected a dataset comprising tactile images, depth images, and 3D force vectors. The data was acquired during interactions with the primitive indenters shown below, using a robotic arm to ensure precise control over the applied force.


To minimize noise and enhance prediction accuracy, we recorded the 3D force vector only after the force reached a steady state following indentation.
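A minimal sketch of how such steady-state gating might look in practice (the window size and tolerance below are illustrative assumptions, not the values used in the paper):

```python
import numpy as np

def wait_for_steady_state(readings, window=50, tol=0.02):
    """Return the index at which a force reading stream is considered steady.

    The signal is treated as steady once the standard deviation over the
    trailing `window` samples drops below `tol` newtons. Window size and
    tolerance are illustrative, not the paper's actual parameters.
    """
    readings = np.asarray(readings, dtype=float)
    for i in range(window, len(readings) + 1):
        if np.std(readings[i - window:i]) < tol:
            return i - 1  # last index of the first steady window
    return None  # signal never settled

# Example: an exponentially settling normal-force signal
t = np.arange(200)
signal = 10.0 * (1.0 - np.exp(-t / 20.0))  # settles toward 10 N
steady_idx = wait_for_steady_state(signal)
```

Only the 3D force vector sampled at (or after) `steady_idx` would then be paired with the corresponding tactile image.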


Force distribution of the training dataset after balancing along each force axis. The slight difference between the ranges of the X and Y components arises from the rectangular shape of the sensor. Additionally, the placement of the gel on the sensor can lead to a slightly asymmetric distribution along the Y axis.
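One simple way to balance a force dataset along an axis is to undersample each histogram bin down to the size of the smallest non-empty bin. This is an illustrative scheme, not necessarily the exact balancing procedure used for the dataset:

```python
import numpy as np

def balance_along_axis(forces, axis=0, n_bins=20, rng=None):
    """Undersample indentations so the force histogram along one axis is flat.

    `forces` is an (N, 3) array of [Fx, Fy, Fz] vectors. Each bin along the
    chosen axis is downsampled to the size of the smallest non-empty bin.
    Returns the sorted indices of the samples to keep.
    """
    rng = np.random.default_rng(rng)
    values = forces[:, axis]
    edges = np.linspace(values.min(), values.max(), n_bins + 1)
    bin_ids = np.clip(np.digitize(values, edges) - 1, 0, n_bins - 1)
    counts = np.bincount(bin_ids, minlength=n_bins)
    target = counts[counts > 0].min()  # smallest non-empty bin
    keep = []
    for b in range(n_bins):
        idx = np.flatnonzero(bin_ids == b)
        if len(idx) > 0:
            keep.append(rng.choice(idx, size=target, replace=False))
    return np.sort(np.concatenate(keep))
```

Applying this per axis trades dataset size for a flatter force distribution, which reduces regression bias toward the most frequent force magnitudes.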

To access our dataset, please fill out this form.


Sensor Calibration

To compensate for sensor-specific manufacturing variations, we propose a user-friendly calibration procedure using a 3D-printable setup and a set of off-the-shelf weights. The testbed and calibration indenter are accessible on OnShape. Further instructions will be posted in the code repository.
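As a rough sketch of what fitting such a correction could look like: with the calibration indenter loaded by known weights, one can fit a least-squares linear map from the model's predicted normal force to the ground-truth gravity force. The linear correction model here is an assumption for illustration, not the paper's published procedure:

```python
import numpy as np

def fit_calibration(predicted_fz, weights_g, g=9.81):
    """Fit a linear correction f_true ~= a * f_pred + b from known weights.

    `predicted_fz` holds the pre-trained model's normal-force estimates (N)
    for each calibration weight; `weights_g` holds the weight masses in
    grams. Returns the least-squares scale `a` and offset `b`.
    """
    true_fz = np.asarray(weights_g, dtype=float) * 1e-3 * g  # grams -> newtons
    predicted_fz = np.asarray(predicted_fz, dtype=float)
    A = np.stack([predicted_fz, np.ones_like(predicted_fz)], axis=1)
    (a, b), *_ = np.linalg.lstsq(A, true_fz, rcond=None)
    return a, b
```

The fitted scale and offset can then be applied to the pre-trained model's outputs on the new sensor without any retraining.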


Paper


Amir-Hossein Shahidzadeh*, Gabriele Caddeo*,
Koushik Alapati, Lorenzo Natale, Cornelia Fermüller, Yiannis Aloimonos