Authors:
Ming Gao, Ralf Kohlhaas and J. Marius Zöllner
Affiliation:
FZI Research Center for Information Technology, Germany
Keyword(s):
Shared Autonomy, Assisted Teleoperation, Mobile Robot, Unsupervised Learning from Demonstration.
Related Ontology Subjects/Areas/Topics:
Agents; Artificial Intelligence; Cognitive Robotics; Human-Robots Interfaces; Informatics in Control, Automation and Robotics; Mobile Robots and Autonomous Systems; Robotics and Automation
Abstract:
We focus on the problem of learning and recognizing contextual tasks from human demonstrations, with the aim of efficiently assisting mobile robot teleoperation through shared autonomy. In this study we present a novel unsupervised approach to contextual task learning and recognition, consisting of two phases. First, we use a Dirichlet Process Gaussian Mixture Model (DPGMM) to cluster the human motion patterns of task executions from unannotated demonstrations; the number of possible motion components is inferred from the data itself rather than being specified manually a priori or determined through model selection. After clustering, we employ a Sparse Online Gaussian Process (SOGP) to classify query points against the learned motion patterns, owing to its superior introspective capability and scalability to large datasets. The effectiveness of the proposed approach is confirmed by extensive evaluations on real data.
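The two-phase pipeline described in the abstract can be sketched with off-the-shelf components. The following is a minimal illustration, not the authors' implementation: it uses scikit-learn's `BayesianGaussianMixture` with a Dirichlet-process prior as the DPGMM, and `GaussianProcessClassifier` as a stand-in for SOGP (scikit-learn has no sparse online GP). The 2-D "motion features" are synthetic and purely illustrative.

```python
# Hedged sketch of the two-phase approach: (1) DPGMM clustering of
# unannotated motion data, (2) GP classification of query points.
# BayesianGaussianMixture with a Dirichlet-process prior stands in for
# the DPGMM; GaussianProcessClassifier stands in for SOGP (assumption:
# scikit-learn carries no sparse online GP).
import numpy as np
from sklearn.mixture import BayesianGaussianMixture
from sklearn.gaussian_process import GaussianProcessClassifier

rng = np.random.default_rng(0)
# Two synthetic "motion patterns" (e.g. distinct velocity profiles).
pattern_a = rng.normal([1.0, 0.0], 0.1, size=(100, 2))
pattern_b = rng.normal([0.0, 1.0], 0.1, size=(100, 2))
X = np.vstack([pattern_a, pattern_b])

# Phase 1: the DP prior lets the data decide how many components are
# active; n_components is only an upper bound (truncation level), not
# a manually chosen cluster count.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(X)
labels = dpgmm.predict(X)

# Phase 2: classify a query point against the discovered patterns.
clf = GaussianProcessClassifier(random_state=0).fit(X, labels)
query = np.array([[0.95, 0.05]])            # near pattern_a
pred = clf.predict(query)
proba = clf.predict_proba(query)            # introspective: class probabilities
```

The probabilistic output `predict_proba` is what the abstract refers to as introspective capability: the classifier reports how confident it is that a query belongs to each learned pattern, which a shared-autonomy system can use to decide when to assist.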