Amanote Research


Probabilistic Multimodal Modeling for Human-Robot Interaction Tasks

doi 10.15607/rss.2019.xv.047
Abstract

Available in full text

Date

June 22, 2019

Authors

Joseph Campbell, Simon Stepputtis, Heni Ben Amor

Publisher

Robotics: Science and Systems Foundation

Related search

Predicting Human Intent for Cooperative Physical Human-Robot Interaction Tasks

2019, English

Probabilistic Detection of Pointing Directions for Human-Robot Interaction

2015, English

Dual Track Multimodal Automatic Learning Through Human-Robot Interaction

2017, English

A Multimodal Dataset for Object Model Learning From Natural Human-Robot Interaction

2017, English

Learning of Robot Navigation Tasks by Probabilistic Neural Network

2013, English

Multimodal Affect Modeling and Recognition for Empathic Robot Companions

International Journal of Humanoid Robotics
Mechanical Engineering, Artificial Intelligence
2013, English

Measuring Multimodal Synchrony for Human-Computer Interaction

2010, English

User, Gesture and Robot Behaviour Adaptation for Human-Robot Interaction

2012, English

Human-Robot Interaction Requires More Than Slot Filling - Multi-Threaded Dialogue for Collaborative Tasks and Social Conversation

2018, English
