Tactile exploration plays a crucial role in understanding object structures for fundamental robotics tasks such
as grasping and manipulation. However, efficiently exploring
objects using tactile sensors is challenging, primarily due
to the large-scale unknown environments and limited sensing
coverage of these sensors. To this end, we present AcTExplore, an active tactile exploration method driven by reinforcement learning that automatically explores object surfaces within a limited number of steps, enabling object reconstruction at scale.
Through sufficient exploration, our algorithm incrementally collects tactile data and reconstructs the 3D shapes of the objects, which can serve as a representation for higher-level downstream tasks. Our method achieves an average IoU coverage of 95.97% on unseen YCB objects while being trained only on primitive shapes.
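The IoU coverage metric compares the reconstructed surface against the ground-truth surface. A minimal sketch, assuming both shapes have already been voxelized into same-shape boolean occupancy grids (the paper's exact voxelization procedure may differ):

```python
import numpy as np

def iou_coverage(recon_occ: np.ndarray, gt_occ: np.ndarray) -> float:
    """Intersection-over-union between two boolean voxel occupancy grids.

    recon_occ, gt_occ: same-shape boolean arrays marking voxels occupied
    by the reconstructed and ground-truth surfaces, respectively.
    (Hypothetical helper; illustrative of the metric, not the paper's code.)
    """
    inter = np.logical_and(recon_occ, gt_occ).sum()
    union = np.logical_or(recon_occ, gt_occ).sum()
    # Empty union means neither grid marks any voxel; define IoU as 0.
    return float(inter / union) if union > 0 else 0.0
```

A perfect reconstruction yields an IoU of 1.0; partial surface coverage lowers the intersection term and hence the score.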
Fig. 1: Reconstruction of a hammer: (a) demonstrates the tactile sensor trajectory in 3D. (b) illustrates the corresponding
intermediate tactile readings on the hammer's surface. Note
that we use the yellow-to-red gradient to denote the time.
After thorough tactile exploration, we obtain a complete reconstruction of the object (c), indicating that the tactile sensor can cover the entire object's surface through our active exploration.
Overview: This figure illustrates the key steps and components of AcTExplore in a scenario where the sensor moves upward along the jar's edge. We employ Temporal Tactile Averaging for the state representation f, which encodes consecutive observations and enables the perception of sensor movement, vital for learning dexterous actions. We also incorporate an Upper Confidence Bound (UCB) exploration bonus to encourage effective exploration.
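The two components named above can be sketched as follows. This is an illustrative reading, not the paper's implementation: the state encoder averages a short window of consecutive tactile depth frames so that motion between readings is reflected in the state, and a count-based UCB-style bonus rewards visiting rarely seen states. The exact weighting and bonus schedule are assumptions.

```python
import numpy as np

def temporal_tactile_average(frames: np.ndarray) -> np.ndarray:
    """Collapse a window of consecutive tactile depth frames (shape [T, H, W])
    into one state image by averaging; the resulting blur encodes how the
    contact patch moved across the sensor. (Sketch of the state encoding f;
    the paper may weight frames differently.)
    """
    return frames.mean(axis=0)

def ucb_bonus(visit_counts: np.ndarray, total_steps: int, c: float = 1.0) -> np.ndarray:
    """Count-based UCB-style exploration bonus per discretized state:
    states visited fewer times receive a larger bonus. (Assumed form.)
    """
    # +1 avoids division by zero for unvisited states; max(..., 2) keeps log positive.
    return c * np.sqrt(np.log(max(total_steps, 2)) / (visit_counts + 1))
```

In use, the averaged frame would feed the RL policy as its observation, and the bonus would be added to the task reward at each step.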
Qualitative results on unseen YCB objects with different state and reward settings. From active tactile exploration, we obtain point cloud data from the tactile depth readings on the object's surface. To generate a mesh, we apply the Poisson surface reconstruction algorithm.