
Grasping Using Tactile And Visual Data



To grasp an object, an agent typically first devises a grasping plan from visual data, then executes this plan, and finally assesses the success of its action. Planning relies on (1) the extraction of object information from vision, and (2) the recovery of memories related to the current visual context, such as previous attempts to grasp a similar object. Because of the uncertainty inherent in these two processes, it is difficult to design grasp plans that are guaranteed to work in an open-loop system. Grasp execution therefore greatly benefits from a closed-loop controller that considers sensory feedback before and while issuing motor commands.

In this project, we study means of monitoring the execution of a grasp plan using vision and touch. By pointing a camera at the robot's workspace, we can track the 6D pose of visible objects in real time. Touch data are captured by sensors placed on the robot's fingers. These two modalities are complementary: during a grasp, the object is partly occluded by the hand and visual cues become uncertain, while tactile data remain informative. Monitoring the execution of a grasp allows the agent to abort grasps that are unlikely to succeed, preventing potential damage to the object or the robot.
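The sketch below illustrates one possible shape for such a monitor. It is a minimal sketch only: the robot, tracker and classifier interfaces are hypothetical stand-ins (none of these names come from the project) for the hand controller, the 6D pose tracker and a trained stability classifier.

    import time

    ABORT_THRESHOLD = 0.3  # assumed cutoff on the predicted success probability

    def monitor_grasp(robot, tracker, classifier, period=0.05):
        """Close the hand while fusing vision and touch; abort early if
        the grasp looks unlikely to succeed. All arguments are hypothetical
        interfaces, not part of any actual API described on this page."""
        robot.start_closing_hand()
        while not robot.hand_fully_closed():
            pose = tracker.current_object_pose()  # 6D object pose from vision
            touch = robot.read_tactile_arrays()   # imprints from the finger pads
            if classifier.success_probability(pose, touch) < ABORT_THRESHOLD:
                robot.open_hand()                 # abort before causing damage
                return False
            time.sleep(period)                    # control-loop period, in seconds
        return True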


Our robot platform is composed of an industrial arm, a three-finger Schunk SDH gripper equipped with tactile sensing arrays, and a camera.

We aim to establish the likelihood of success of a grasp before attempting to lift an object. Our agent learns and memorizes what it feels like to grasp objects from various sides. Tactile data are recorded once the hand is fully closed around the object. As the object often moves while the hand closes around it, we track the object's pose throughout the grasp and record it once the hand is fully closed. The robot then lifts the object and turns it upside-down; if the object stays rigidly bound to the hand during this movement, the grasp is considered successful. During training, the agent encounters both successful and unsuccessful grasps, which provide it with input-output pairs: tactile imprints and relative object-gripper configurations as input, and success/failure labels as output. These data are used to train a classifier, which subsequently decides whether a grasp feels stable enough to proceed to lifting the object.
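The referenced papers below describe the models actually used in this work; the sketch that follows only illustrates the input-output structure of the problem with a generic scikit-learn classifier. The file names and array shapes are assumptions made for the example.

    import numpy as np
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Hypothetical training set, one row per recorded grasp:
    #   tactile : flattened sensor-array imprints, read once the hand is closed
    #   poses   : relative object-gripper configuration at the same instant
    #   labels  : 1 if the object survived the lift-and-flip test, 0 otherwise
    tactile = np.load("tactile_imprints.npy")    # (n_grasps, n_taxels)
    poses = np.load("object_gripper_poses.npy")  # (n_grasps, n_pose_dims)
    labels = np.load("grasp_outcomes.npy")       # (n_grasps,)

    X = np.hstack([tactile, poses])              # joint touch + pose features

    clf = make_pipeline(StandardScaler(), SVC(probability=True))
    clf.fit(X, labels)

    # At execution time, proceed to lifting only if the grasp feels stable.
    p_success = clf.predict_proba(X[:1])[0, 1]
    proceed = p_success > 0.5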

Our experiments demonstrate that joint tactile and pose-based percepts carry valuable grasp-related information: models trained on both hand poses and tactile readings perform better than models trained on either perceptual input alone.
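A cross-validated ablation, continuing the sketch above (same assumed arrays), is a simple way to make this kind of comparison:

    import numpy as np
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # Same classifier, three feature sets: touch only, pose only, both.
    # tactile, poses and labels are the assumed arrays loaded above.
    feature_sets = {
        "tactile only": tactile,
        "pose only": poses,
        "tactile + pose": np.hstack([tactile, poses]),
    }
    for name, features in feature_sets.items():
        model = make_pipeline(StandardScaler(), SVC())
        scores = cross_val_score(model, features, labels, cv=5)
        print(f"{name:>14}: {scores.mean():.2f} mean accuracy")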

Video illustrating pose- and touch-based grasp stability estimation. Download this video in MP4/H.264 or WebM/VP8, or view it on YouTube.

Part of the EU project CogX.

Supported by the Belgian National Fund for Scientific Research (FNRS).

Supported by the Swedish Foundation for Strategic Research (SSF).

Main references:

[bekiroglu2011d] Y. Bekiroglu, R. Detry and D. Kragic, Learning Tactile Characterizations of Object- and Pose-Specific Grasps. In IEEE/RSJ International Conference on Intelligent Robots and Systems, 2011.
[hyttinen2015a] E. Hyttinen, D. Kragic and R. Detry, Learning the Tactile Signatures of Prototypical Object Parts for Robust Part-Based Grasping of Novel Objects. In IEEE International Conference on Robotics and Automation, 2015.

Papers covering this topic:

[bekiroglu2011c] Y. Bekiroglu, R. Detry and D. Kragic, Joint Observation of Object Pose and Tactile Imprints for Online Grasp Stability Assessment. In Manipulation Under Uncertainty (Workshop at IEEE ICRA 2011), 2011.
[bekiroglu2011d] Y. Bekiroglu, R. Detry and D. Kragic, Learning Tactile Characterizations of Object- and Pose-Specific Grasps. In IEEE/RSJ International Conference on Intelligent Robots and Systems, 2011.
[hyttinen2015a] E. Hyttinen, D. Kragic and R. Detry, Learning the Tactile Signatures of Prototypical Object Parts for Robust Part-Based Grasping of Novel Objects. In IEEE International Conference on Robotics and Automation, 2015.

Many of these publications are copyrighted by their respective publishers. Downloadable versions are not necessarily identical to the published versions. They are made available here for personal use only.


Page last modified: November 09, 2017