Amanote Research


Toward an Affect-Sensitive Multimodal Human-Computer Interaction

Proceedings of the IEEE - United States
DOI: 10.1109/jproc.2003.817122
Abstract

Available in the full text

Categories
Electronic Engineering, Electrical, Computer Science
Date

September 1, 2003

Authors
M. Pantic, L.J.M. Rothkrantz
Publisher

Institute of Electrical and Electronics Engineers (IEEE)


Related search

Measuring Multimodal Synchrony for Human-Computer Interaction

2010, English

Emotion Modelling and Facial Affect Recognition in Human-Computer and Human-Robot Interaction

2009, English

Human-Computer Interaction for Alert Warning and Attention Allocation Systems of the Multimodal Watchstation

2000, English

An Approach to Context in Human-Computer Interaction

Semiotica
Linguistics, Literature, Literary Theory, Language
2008, English

Evaluation of an Eye-Pointer Interaction Device for Human-Computer Interaction

Heliyon
Multidisciplinary
2018, English

Computer Vision in Human-Computer Interaction

Lecture Notes in Computer Science
Computer Science, Theoretical Computer Science
2004, English

Overview: Human-Computer Interaction an Globally Used Technique in Society

2017, English

Probabilistic Multimodal Modeling for Human-Robot Interaction Tasks

2019, English

Radar Sensing in Human-Computer Interaction

Interactions
Human-Computer Interaction
2017, English
