Learning maps of indoor environments based on human activity

Slawomir Grzonka*, Frederic Dijoux, Andreas Karwath, Wolfram Burgard

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We present a novel approach to building approximate maps of structured environments from human motion and activity. Our approach uses data recorded with a data suit equipped with several inertial measurement units (IMUs) to detect a person's movements as well as door opening and closing events. We interpret the movements as motion constraints and the door handling events as landmark detections in a graph-based SLAM framework. Since we cannot distinguish between individual doors, we employ a multi-hypothesis approach on top of the SLAM system to deal with the high data-association uncertainty. As a result, our approach is able to accurately and robustly recover the trajectory of the person. After the graph optimization, we additionally exploit the facts that people traverse free space and that doors separate rooms to recover the geometric structure of the environment. We evaluate our approach in several experiments carried out with different users and in environments of different types.
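The abstract describes the core idea: human motion yields odometry-like constraints between consecutive poses, and door handling events act as landmark observations in a pose graph. The following sketch is purely illustrative and is not the authors' implementation; it assumes toy 2D odometry increments, a single door observed twice with the data association already resolved (the paper resolves this ambiguity with a multi-hypothesis approach), and hypothetical names such as `odometry`, `door_events`, and `residuals`.

```python
# Illustrative pose-graph sketch (assumption, not the paper's system):
# 2D poses constrained by step-like odometry, plus a landmark constraint
# whenever a door event is associated with an already known door.
import numpy as np
from scipy.optimize import least_squares

# Toy odometry: (dx, dy, dtheta) increments between consecutive poses,
# e.g. derived from the motion of the data suit (hypothetical values).
odometry = [(1.0, 0.0, np.pi / 2)] * 4

# Door events: (pose_index, door_id). Data association is assumed known here;
# the paper handles its ambiguity with multiple hypotheses.
door_events = [(0, "door_A"), (4, "door_A")]  # same door seen twice -> loop closure

n_poses = len(odometry) + 1
door_ids = sorted({d for _, d in door_events})

def unpack(x):
    poses = x[: 3 * n_poses].reshape(n_poses, 3)        # (x, y, theta) per pose
    doors = x[3 * n_poses :].reshape(len(door_ids), 2)  # (x, y) per door landmark
    return poses, doors

def residuals(x):
    poses, doors = unpack(x)
    res = list(poses[0])  # anchor the first pose at the origin
    # Odometry constraints: next pose vs. motion predicted from the increment.
    for i, (dx, dy, dth) in enumerate(odometry):
        c, s = np.cos(poses[i, 2]), np.sin(poses[i, 2])
        pred = poses[i] + np.array([c * dx - s * dy, s * dx + c * dy, dth])
        res.extend(poses[i + 1] - pred)
    # Door constraints: the person is assumed to stand at the door when handling it.
    for pose_idx, door in door_events:
        j = door_ids.index(door)
        res.extend(doors[j] - poses[pose_idx, :2])
    return np.array(res)

x0 = np.zeros(3 * n_poses + 2 * len(door_ids))
sol = least_squares(residuals, x0)
poses, doors = unpack(sol.x)
print("optimized poses:\n", np.round(poses, 2))
print("optimized door position:\n", np.round(doors, 2))
```

In this toy setup, the shared door landmark pulls the last pose back toward the first one during optimization, which is the loop-closure effect the SLAM system exploits at a larger scale once the door identities have been disambiguated.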

Original language: English
Title of host publication: Embedded Reasoning
Subtitle of host publication: Intelligence in Embedded Systems - Papers from the AAAI Spring Symposium, Technical Report
Publisher: AI Access Foundation
Pages: 52-58
Number of pages: 7
ISBN (Print): 9781577354581
Publication status: Published - 2010
Event: 2010 AAAI Spring Symposium - Stanford, United States
Duration: 22 Mar 2010 - 24 Mar 2010

Publication series

Name: AAAI Spring Symposium - Technical Report
Volume: SS-10-04

Conference

Conference: 2010 AAAI Spring Symposium
Country/Territory: United States
City: Stanford
Period: 22/03/10 - 24/03/10

ASJC Scopus subject areas

  • Artificial Intelligence
