Task-Informed Grasping of Partially Observed Objects

Cristiana De Farias*, Brahim Tamadazte, Maxime Adjigble, Rustam Stolkin, Naresh Marturi

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

In this letter, we address the problem of task-informed grasping in scenarios where only incomplete or partial object information is available. Existing methods, which focus either on task-aware grasping or on grasping under partial observability, typically require extensive data and long training times. In contrast, we propose a one-shot task-informed methodology that transfers grasps computed for an object model stored in a database to another, partially perceived object of the same category. Our method leverages shapes reconstructed with Gaussian Process Implicit Surfaces (GPIS) and employs the Functional Maps (FM) framework to transfer task-specific grasping functions. By defining task functions on the objects' manifolds and incorporating an uncertainty metric derived from GPIS, our approach provides a robust solution for part-specific and task-oriented grasping. Validated through simulations and real-world experiments with a 7-axis collaborative robotic arm, our methodology achieves a success rate exceeding 90% for task-informed grasps on a variety of objects.
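
The abstract describes completing a partially observed object with a GPIS and using its predictive uncertainty to inform grasping. As a rough illustration of that idea only (not the authors' implementation), the sketch below fits a Gaussian-process implicit surface to a half-observed point cloud with scikit-learn; the RBF kernel, the interior/exterior anchor labelling, the noise level, and the level-set threshold are all assumptions made for this toy example.

```python
# Minimal GPIS sketch: regress an implicit function f(x) from points labelled
# 0 on the observed surface, -1 at an interior anchor, +1 at exterior anchors.
# The zero level set of f approximates the completed surface; the predictive
# std is the kind of uncertainty metric that can down-weight grasp regions
# that were never observed. Illustrative only; kernel and anchors are assumed.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

# Partial observation: points on one half of a unit sphere (half azimuth range).
rng = np.random.default_rng(0)
theta = rng.uniform(0.0, np.pi, 200)
phi = rng.uniform(0.0, np.pi, 200)
surface = np.c_[np.sin(theta) * np.cos(phi),
                np.sin(theta) * np.sin(phi),
                np.cos(theta)]

# Training set: surface points (label 0), one interior anchor (-1),
# and a few exterior anchors (+1) obtained by pushing surface points outward.
X = np.vstack([surface, [[0.0, 0.0, 0.0]], 3.0 * surface[:20]])
y = np.concatenate([np.zeros(len(surface)), [-1.0], np.ones(20)])

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-4)
gp.fit(X, y)

# Query the implicit function on a coarse grid around the object.
grid = np.stack(np.meshgrid(*[np.linspace(-1.5, 1.5, 20)] * 3),
                axis=-1).reshape(-1, 3)
f, std = gp.predict(grid, return_std=True)

# Points near the zero level set approximate the reconstructed surface;
# their predictive std is larger on the unobserved half.
mask = np.abs(f) < 0.05
print(f"{mask.sum()} grid points near the implicit surface, "
      f"mean predictive std there: {std[mask].mean():.3f}")
```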

Original language: English
Pages (from-to): 8394-8401
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 9
Issue number: 10
Early online date: 19 Aug 2024
DOIs
Publication status: Published - 26 Aug 2024

Bibliographical note

Publisher Copyright: © 2024 Crown Copyright

Keywords

  • dexterous manipulation
  • Grasping
  • perception for grasping and manipulation
  • task planning

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Biomedical Engineering
  • Human-Computer Interaction
  • Mechanical Engineering
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
  • Control and Optimization
  • Artificial Intelligence
