In this paper, we tackle the problem of estimating 3D contact forces using vision-based tactile sensors. In particular, our goal is to estimate contact forces over a large range (up to 15 N) on arbitrary objects while generalizing across different vision-based tactile sensors. To this end, we collected a dataset of over 200K indentations using a robotic arm that pressed various indenters onto a GelSight Mini sensor mounted on a force sensor, and then used the data to train a multi-head transformer for force regression. Strong generalization is achieved through accurate data collection and multi-objective optimization that leverages depth contact images.
Despite being trained only on primitive shapes and textures, the regressor achieves a mean absolute error of 4% on a dataset of unseen real-world objects. We further evaluate our approach's ability to generalize to other GelSight Mini and DIGIT sensors, and propose a reproducible calibration procedure for adapting the pre-trained model to other vision-based sensors. Furthermore, the method was evaluated on real-world tasks, including weighing objects and controlling the deformation of delicate objects, which rely on accurate force feedback.
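To make the multi-objective idea concrete, the sketch below shows a single-head self-attention layer over image-patch tokens feeding two output heads: a pooled 3D-force head and a per-patch depth head, whose reconstruction error is added to the force-regression loss. This is a minimal illustrative stand-in, not the paper's architecture; all dimensions, weight names, and the loss weighting are assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, Wq, Wk, Wv):
    # single-head scaled dot-product attention over patch tokens
    q, k, v = tokens @ Wq, tokens @ Wk, tokens @ Wv
    scores = q @ k.T / np.sqrt(k.shape[-1])
    return softmax(scores, axis=-1) @ v

rng = np.random.default_rng(0)
d = 16                               # token dimension (illustrative)
tokens = rng.normal(size=(9, d))     # e.g. a 3x3 grid of tactile-image patches
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
W_force = rng.normal(size=(d, 3))    # force head: (Fx, Fy, Fz)
W_depth = rng.normal(size=(d, 1))    # depth head: per-patch contact depth

feats = self_attention(tokens, Wq, Wk, Wv)
force = feats.mean(axis=0) @ W_force   # pooled features -> 3D force vector
depth = feats @ W_depth                # per-token depth reconstruction

# multi-objective loss: force regression + depth reconstruction (weight assumed)
force_gt, depth_gt = np.zeros(3), np.zeros((9, 1))
loss = np.mean((force - force_gt) ** 2) + 0.1 * np.mean((depth - depth_gt) ** 2)
```

In training, the depth-reconstruction term acts as an auxiliary objective that encourages sensor-agnostic contact geometry features, which is one plausible reading of why depth supervision helps cross-sensor generalization.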
Full Video
Dataset
We have meticulously collected a dataset comprising tactile images, depth images, and 3D force vectors. The data was acquired during interactions with the primitive indenters shown below, using a robotic arm to ensure precise control over the applied force.
To minimize noise and enhance prediction accuracy, we recorded the 3D force vector only after the force reached a steady state following indentation.
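A steady-state check of this kind can be sketched as a sliding window over the raw force stream that accepts a reading only once the spread within the window falls below a tolerance. The window size and tolerance below are illustrative assumptions, not the paper's values.

```python
import numpy as np

def steady_state_reading(samples, window=10, tol=0.05):
    """Return the mean force over the first window whose per-axis spread
    falls below `tol` (in Newtons), or None if the signal never settles.
    `samples` is an (N, 3) array of raw (Fx, Fy, Fz) readings."""
    samples = np.asarray(samples, dtype=float)
    for i in range(len(samples) - window + 1):
        w = samples[i:i + window]
        if np.all(w.max(axis=0) - w.min(axis=0) < tol):
            return w.mean(axis=0)
    return None

# synthetic indentation: force ramps up, then holds near (0, 0, -5) N
ramp = np.linspace([0.0, 0.0, 0.0], [0.0, 0.0, -5.0], 20)
hold = np.tile([0.0, 0.0, -5.0], (15, 1))
reading = steady_state_reading(np.vstack([ramp, hold]))
```

The ramp windows are rejected because consecutive samples still differ by far more than the tolerance, so only the settled plateau is averaged into the recorded label.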
Force distribution of the training dataset after balancing along each force axis. The slight difference between the ranges of the X and Y components arises from the rectangular shape of the sensor. Additionally, the placement of the gel on the sensor can lead to a slightly asymmetric distribution along the Y axis.
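One simple way to balance a dataset along a force axis is to bin samples by that component and cap each bin at a common count. The function below is a hypothetical sketch of such a strategy; the bin count, cap rule, and function name are assumptions, not the paper's exact procedure.

```python
import numpy as np

def balance_axis(forces, axis=2, n_bins=10, cap=None, seed=0):
    """Subsample so no force bin along one axis dominates.
    `forces` is (N, 3); returns sorted indices of kept samples."""
    vals = forces[:, axis]
    edges = np.linspace(vals.min(), vals.max(), n_bins + 1)[1:-1]
    bins = np.digitize(vals, edges)              # bin index in 0..n_bins-1
    counts = np.bincount(bins, minlength=n_bins)
    if cap is None:
        cap = int(np.median(counts[counts > 0]))  # cap at the median bin size
    rng = np.random.default_rng(seed)
    keep = []
    for b in range(n_bins):
        idx = np.flatnonzero(bins == b)
        if len(idx) == 0:
            continue
        keep.extend(rng.choice(idx, size=min(len(idx), cap), replace=False))
    return np.sort(np.array(keep))

rng = np.random.default_rng(1)
forces = rng.normal(size=(1000, 3)) * [1.0, 1.0, 5.0]  # Z-heavy toy distribution
kept = balance_axis(forces, axis=2)
```

Applying the same capping independently per axis (X, then Y, then Z) would yield the flattened per-axis distributions shown in the figure above.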
For accessing our dataset, please fill out this form.
Sensor Calibration
To compensate for sensor-specific manufacturing variations, we propose a user-friendly calibration procedure that uses a 3D-printable setup and a set of off-the-shelf weights. The testbed and calibration indenter are available on OnShape; further instructions will be posted in the repo.
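As a concrete illustration, a calibration of this kind could fit a per-axis affine correction (scale and offset) by least squares, mapping the pre-trained model's predictions onto the forces implied by the known weights. The affine form, function name, and numbers below are assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np

def fit_calibration(predicted, ground_truth):
    """Fit per-axis (scale, offset) by least squares so that
    scale * predicted + offset best matches the ground truth."""
    scales, offsets = np.empty(3), np.empty(3)
    for ax in range(3):
        A = np.stack([predicted[:, ax], np.ones(len(predicted))], axis=1)
        scales[ax], offsets[ax] = np.linalg.lstsq(A, ground_truth[:, ax], rcond=None)[0]
    return scales, offsets

# e.g. 100 g and 500 g weights -> known normal forces of ~0.98 N and ~4.90 N,
# against slightly biased model predictions on the new sensor
pred = np.array([[0.0, 0.0, 1.10], [0.0, 0.0, 5.30]])
gt   = np.array([[0.0, 0.0, 0.98], [0.0, 0.0, 4.90]])
s, o = fit_calibration(pred, gt)
calibrated = pred * s + o
```

With the 3D-printed testbed holding the indenter vertical, each weight gives one known normal-force point, and a handful of weights suffices to fit the two parameters per axis.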
@misc{shahidzadeh2024feelanyforceestimatingcontactforce,
title={FeelAnyForce: Estimating Contact Force Feedback from Tactile Sensation for Vision-Based Tactile Sensors},
author={Amir-Hossein Shahidzadeh and Gabriele Caddeo and Koushik Alapati and Lorenzo Natale and Cornelia Fermuller and Yiannis Aloimonos},
year={2024},
eprint={2410.02048},
archivePrefix={arXiv},
primaryClass={cs.RO},
url={https://arxiv.org/abs/2410.02048},
}