Representing and Reasoning with Intentional Actions on a Robot

Rocio Gomez, Mohan Sridharan, Heather Riley

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

This paper describes a general architecture for robots to represent and reason with intentional actions. The architecture reasons with tightly-coupled transition diagrams of the domain at two different resolutions. Non-monotonic logical reasoning with a coarse-resolution transition diagram is used to compute a plan comprising intentional abstract actions for any given goal. Each such abstract action is implemented as a sequence of concrete actions by reasoning over the relevant part of the fine-resolution transition diagram, with the outcomes of probabilistic execution of the concrete actions being added to the coarse-resolution history. The capabilities of this architecture are illustrated in the context of a simulated robot assisting humans in an office domain, on a physical robot (Baxter) manipulating tabletop objects, and on a wheeled robot (Turtlebot) moving objects to particular places or people in an office. We show that this architecture improves reliability and efficiency in comparison with a planning architecture that does not include intentional actions.
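The abstract outlines a plan-refine-execute loop: coarse-resolution reasoning yields a plan of abstract intentional actions, each abstract action is refined into concrete actions over the fine-resolution diagram, and the outcomes of (noisy) execution are fed back into the coarse-resolution history, triggering replanning when needed. The following is a minimal sketch of that control flow only; all function names, the example plan strings, and the success probability are hypothetical placeholders, not the paper's implementation (which uses non-monotonic logical reasoning, e.g. an Answer Set Prolog solver, and probabilistic models of action outcomes).

# Hedged sketch of the coarse-to-fine plan-refine-execute loop; names are illustrative.
import random
from typing import List

def compute_coarse_plan(goal: str, history: List[str]) -> List[str]:
    # Placeholder for non-monotonic logical reasoning over the coarse-resolution
    # transition diagram; returns a hypothetical plan of abstract intentional actions.
    return ["move(robot, table)", "grasp(robot, cup)",
            "move(robot, office)", "putdown(robot, cup)"]

def refine(abstract_action: str) -> List[str]:
    # Placeholder for zooming to the relevant part of the fine-resolution
    # transition diagram and planning a sequence of concrete actions.
    return [f"{abstract_action}::step{i}" for i in range(3)]

def execute_concrete(concrete_action: str) -> bool:
    # Placeholder for probabilistic execution; success is stochastic to
    # reflect noisy actuation and perception (assumed 90% success rate).
    return random.random() > 0.1

def run(goal: str, max_replans: int = 3) -> bool:
    coarse_history: List[str] = []  # coarse-resolution history of observed outcomes
    for _ in range(max_replans):
        plan = compute_coarse_plan(goal, coarse_history)
        success = True
        for abstract_action in plan:
            # Implement each abstract intentional action as concrete actions.
            for concrete_action in refine(abstract_action):
                if not execute_concrete(concrete_action):
                    success = False
                    break
            # Record the abstracted outcome in the coarse-resolution history.
            coarse_history.append(
                f"{abstract_action}: {'succeeded' if success else 'failed'}")
            if not success:
                break  # replan at the coarse resolution
        if success:
            return True
    return False

if __name__ == "__main__":
    print("goal achieved:", run("cup_in_office"))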
Original language: English
Title of host publication: Proceedings of the 6th Workshop on Planning and Robotics (PlanRob 2018)
Editors: Alberto Finzi, Erez Karpas, Goldie Nejat, Andrea Orlandini, Siddharth Srivastava
Publisher: International Conference on Automated Planning and Scheduling
Pages: 133-142
Number of pages: 10
Publication status: Published - 24 Jun 2018
Event: Workshop on Planning and Robotics (PlanRob) at ICAPS 2018 - Delft, Netherlands
Duration: 26 Jun 2018 → 26 Jun 2018

Conference

Conference: Workshop on Planning and Robotics (PlanRob) at ICAPS 2018
Country/Territory: Netherlands
City: Delft
Period: 26/06/18 → 26/06/18
