User modelling for personalised dressing assistance by humanoid robots

Yixing Gao, Hyung Jin Chang, Yiannis Demiris

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Abstract

Assistive robots can improve the well-being of disabled or frail human users by reducing the burden that activities of daily living impose on them. To enable personalised assistance, such robots benefit from building a user-specific model, so that the assistance is customised to that user's particular abilities. In this paper, we present an end-to-end approach for home-environment assistive humanoid robots to provide personalised assistance through a dressing application for users who have upper-body movement limitations. We use randomised decision forests to estimate the upper-body pose of users from a top-view depth camera, and model the movement space of the upper-body joints using Gaussian mixture models. The movement space of each upper-body joint consists of regions with different reaching capabilities. We propose a method, based on the real-time upper-body pose and the user models, to plan robot motions for assistive dressing. We validate each part of our approach and test the whole system, allowing a Baxter humanoid robot to assist a human in putting on a sleeveless jacket.
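As a rough illustration of the per-joint movement-space modelling described in the abstract (not the authors' implementation), the sketch below fits a Gaussian mixture model to recorded 3D positions of a single upper-body joint and queries which region a target position falls in. It assumes scikit-learn's GaussianMixture and synthetic data standing in for depth-camera pose estimates; the region counts, thresholds, and function names are hypothetical.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical sketch: model the movement space of one upper-body joint
# (e.g. a wrist) from recorded 3D positions, and partition it into regions
# with different reaching capabilities. Illustration only, under assumed
# data formats; not the paper's implementation.

def fit_joint_movement_model(joint_positions, n_regions=3):
    """Fit a GMM to an (N, 3) array of recorded joint positions."""
    gmm = GaussianMixture(n_components=n_regions,
                          covariance_type="full",
                          random_state=0)
    gmm.fit(joint_positions)
    return gmm

def reaching_region(gmm, query_position):
    """Return the mixture component (movement region) the queried 3D
    position most likely belongs to, plus its log-likelihood under the model."""
    query = np.asarray(query_position).reshape(1, -1)
    region = int(gmm.predict(query)[0])
    log_likelihood = float(gmm.score_samples(query)[0])
    return region, log_likelihood

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Pretend these are wrist positions (metres) recorded while the user moves.
    recorded = np.vstack([
        rng.normal([0.3, 0.0, 1.0], 0.05, size=(200, 3)),  # comfortable reach
        rng.normal([0.5, 0.2, 1.2], 0.08, size=(120, 3)),  # extended reach
        rng.normal([0.6, 0.4, 1.4], 0.10, size=(40, 3)),   # rarely reached
    ])
    model = fit_joint_movement_model(recorded, n_regions=3)
    region, ll = reaching_region(model, [0.55, 0.25, 1.25])
    print(f"target falls in region {region}, log-likelihood {ll:.2f}")
```

A dressing planner could, for example, prefer garment-path waypoints that lie in high-likelihood regions of each joint's model; how regions are actually mapped to robot motions is described in the paper itself.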
Original language: English
Title of host publication: Proceedings of 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Publisher: IEEE Computer Society
Pages: 1840-1845
ISBN (Electronic): 978-1-4799-9994-1
DOIs
Publication status: Published - 28 Sept 2015
Event: 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) - Hamburg, Germany
Duration: 28 Sept 2015 - 2 Oct 2015

Conference

Conference: 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)
Country/Territory: Germany
City: Hamburg
Period: 28/09/15 - 2/10/15
