TactoFind: A Tactile Only System for Object Retrieval

Sameer Pai*,1,2     Tao Chen*,1,2     Megha Tippur*,2     Edward Adelson2     Abhishek Gupta†,1,2,3     Pulkit Agrawal†,1,2
1Improbable AI Lab, 2Massachusetts Institute of Technology, 3University of Washington
*Equal contribution, †Equal advising


Abstract


We study the problem of object retrieval in scenarios where visual sensing is absent, object shapes are unknown beforehand, and objects can move freely, such as grabbing objects out of a drawer. Successful solutions require localizing free objects, identifying specific object instances, and then grasping the identified objects, using only touch feedback. Unlike vision, where cameras can observe the entire scene, touch sensors are local and only observe parts of the scene that are in contact with the manipulator. Moreover, information gathering via touch sensors necessitates applying forces on the touched surface, which may disturb the scene itself. Reasoning with touch therefore requires careful exploration and integration of information over time -- a challenge we tackle in this work. We present a system capable of using sparse tactile feedback from fingertip touch sensors on a dexterous hand to localize, identify, and grasp novel objects without any visual feedback.
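
To make the "integration of information over time" idea concrete, the sketch below is a purely illustrative example (not the authors' implementation): it shows one way sparse fingertip contact points could be accumulated across probes and matched against candidate object models with a one-sided Chamfer distance. All names (accumulate_contacts, identify_object) and the synthetic candidate point clouds are hypothetical.

    # Illustrative sketch only: accumulate sparse touch contacts over time and
    # identify which candidate object best explains them.
    import numpy as np

    def accumulate_contacts(contact_history, new_contacts):
        """Append newly sensed 3D contact points (world frame) to the running set."""
        if new_contacts.size == 0:
            return contact_history
        return np.vstack([contact_history, new_contacts])

    def one_sided_chamfer(partial_pts, candidate_pts):
        """Mean distance from each touched point to its nearest candidate surface point."""
        d = np.linalg.norm(partial_pts[:, None, :] - candidate_pts[None, :, :], axis=-1)
        return d.min(axis=1).mean()

    def identify_object(partial_pts, candidates):
        """Return the candidate name whose point cloud best explains the touches."""
        scores = {name: one_sided_chamfer(partial_pts, pts) for name, pts in candidates.items()}
        return min(scores, key=scores.get), scores

    # Toy usage: two hypothetical candidate shapes and a few simulated touches on a sphere.
    rng = np.random.default_rng(0)
    sphere = rng.normal(size=(500, 3))
    sphere /= np.linalg.norm(sphere, axis=1, keepdims=True)   # points on a unit sphere
    cube = rng.uniform(-1.0, 1.0, size=(500, 3))              # points sampled in a cube
    candidates = {"sphere": sphere, "cube": cube}

    touches = np.empty((0, 3))
    for _ in range(5):  # five probes, each yielding a handful of contacts on the sphere
        probe = rng.normal(size=(3, 3))
        probe /= np.linalg.norm(probe, axis=1, keepdims=True)
        touches = accumulate_contacts(touches, probe)

    best, scores = identify_object(touches, candidates)
    print(best, scores)

In this toy setup the partial contact set grows with each probe, so the identification score becomes more reliable over time; the real system must additionally plan where to probe without disturbing the scene.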


Paper


TactoFind: A Tactile Only System for Object Retrieval
Sameer Pai*, Tao Chen*, Megha Tippur*, Edward Adelson, Abhishek Gupta, Pulkit Agrawal
arXiv / project page / bibtex


Summary



Searching in the Dark



More Demos (real time)