Learning dexterous grasps that generalise to novel objects by combining hand and contact models

Marek Kopicki, Renaud Detry, Florian Schmidt, Christoph Borst, Jeremy L. Wyatt

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

22 Citations (Scopus)

Abstract

Generalising dexterous grasps to novel objects is an open problem. We show how to learn grasps for high-DoF hands that generalise to novel objects, given as little as one demonstrated grasp. During grasp learning, two types of probability density are learned to model the demonstrated grasp. The first (the contact model) captures the relationship of an individual finger part to local surface features at its contact point. The second (the hand configuration model) captures the whole-hand configuration during the approach to grasp. When presented with a new object, many candidate grasps are generated, and a kinematically feasible grasp is selected that maximises the product of these densities. We demonstrate 31 successful grasps on novel objects (an 86% success rate), transferred from 16 training grasps. The method enables transfer of dexterous grasps within object categories, across object categories, to and from objects for which no complete model is available, and with two different dexterous hands.
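The selection rule described in the abstract (score each candidate grasp by the product of a contact-model density and a hand-configuration density, then take the kinematically feasible maximiser) can be illustrated with a short sketch. The sketch below is not the authors' implementation: it substitutes generic Gaussian kernel density estimates (scipy.stats.gaussian_kde) for the paper's learned models, and the feature dimensions, candidate generator, and feasibility check are all placeholder assumptions made for illustration.

```python
# Illustrative sketch of the density-product grasp selection rule.
# NOT the authors' implementation: the learned densities are replaced by
# generic KDEs, and all data below are synthetic placeholders.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

# --- Training data from one demonstrated grasp (placeholder samples) ---
# Contact model: local surface features observed at a finger contact,
# e.g. (curvature_1, curvature_2, normal_alignment).
contact_features = rng.normal(loc=[0.2, 0.05, 0.9], scale=0.05, size=(50, 3))
# Hand configuration model: joint angles recorded during the approach
# (a toy 4-DoF hand here).
hand_configs = rng.normal(loc=[0.3, 0.6, 0.6, 0.4], scale=0.1, size=(50, 4))

# Kernel density estimates stand in for the paper's learned densities.
contact_density = gaussian_kde(contact_features.T)
config_density = gaussian_kde(hand_configs.T)

def is_kinematically_feasible(config):
    """Placeholder feasibility check: joint limits only."""
    return bool(np.all((config >= 0.0) & (config <= 1.5)))

def generate_candidates(n=200):
    """Placeholder candidate generator for a novel object: each candidate
    pairs a hand configuration with the surface features it would touch."""
    for _ in range(n):
        config = rng.uniform(0.0, 1.5, size=4)
        features = rng.normal(loc=[0.2, 0.05, 0.9], scale=0.2, size=3)
        yield config, features

best_score, best_grasp = -np.inf, None
for config, features in generate_candidates():
    if not is_kinematically_feasible(config):
        continue
    # Selection rule from the abstract: maximise the product of densities.
    score = contact_density(features)[0] * config_density(config)[0]
    if score > best_score:
        best_score, best_grasp = score, config

print("selected grasp configuration:", best_grasp, "score:", best_score)
```

In the paper the densities are learned from demonstrated grasps and candidates are generated from the novel object's surface; here both are simulated purely to show the scoring and selection step.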

Original language: English
Title of host publication: Proceedings of the IEEE International Conference on Robotics and Automation
Subtitle of host publication: ICRA 2014
Place of publication: Piscataway, NJ, USA
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Pages: 5358-5365
Number of pages: 8
Publication status: Published - 22 Sept 2014

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
  • Control and Systems Engineering
  • Electrical and Electronic Engineering
