Amanote Research

GestUI: A Model-Driven Method and Tool for Including Gesture-Based Interaction in User Interfaces

Complex Systems Informatics and Modeling Quarterly
DOI: 10.7250/csimq.2016-6.05
Abstract

Available in full text

Date

April 29, 2016

Authors
Otto Parra, Sergio España, Oscar Pastor
Publisher

Riga Technical University


Related search

Including Multi-Stroke Gesture-Based Interaction in User Interfaces Using a Model-Driven Method

2015, English

Engineering Adaptive Model-Driven User Interfaces

IEEE Transactions on Software Engineering
Software
2016, English

User, Gesture and Robot Behaviour Adaptation for Human-Robot Interaction

2012, English

Climb, Fly, Stack: Design of Tangible and Gesture-Based Interfaces for Natural and Efficient Interaction

2018, English

Context and Interaction in Zoomable User Interfaces

2000, English

Finger Gesture-Based Three-Dimension Mobile User Interaction Using a Rear-Facing Camera

International Journal of Multimedia and Ubiquitous Engineering
Computer Science
2013, English

Model-Based Contract Testing of Graphical User Interfaces

IEICE Transactions on Information and Systems
Electronic Engineering, Pattern Recognition, Hardware, Computer Vision, Electrical, Architecture, Artificial Intelligence, Software
2015, English

Variability in Wrist-Tilt Accelerometer Based Gesture Interfaces

Lecture Notes in Computer Science
Computer Science, Theoretical Computer Science
2004, English

Evaluation of Gesture Based Interfaces for Medical Volume Visualization Tasks

2011, English
