Amanote Research


How Saccadic Models Help Predict Where We Look During a Visual Task? Application to Visual Quality Assessment.

Electronic Imaging
DOI: 10.2352/issn.2470-1173.2016.13.iqsp-216
Abstract

Available in full text

Date

February 14, 2016

Authors
Olivier Le Meur, Antoine Coutrot
Publisher

Society for Imaging Science & Technology


Related search

How Task Demands Shape Brain Responses to Visual Food Cues

Human Brain Mapping
Nuclear Medicine, Radiology, Ultrasound Technology, Anatomy, Radiological, Neurology, Imaging
2017, English

Pre-Saccadic Shifts of Visual Attention

PLoS ONE
Multidisciplinary
2012, English

Timing of Saccadic Eye Movements During Visual Search for Multiple Targets

Journal of Vision
Ophthalmology, Sensory Systems
2013, English

Visual Task Identification and Characterization Using Polynomial Models

Robotics and Autonomous Systems
Control, Systems Engineering, Software, Computer Science Applications, Mathematics
2007, English

Where Do We Look While Sleeping?

2009, English

Inhibition of Visual Discrimination During a Memory-Guided Saccade Task

Journal of Neurophysiology
Neuroscience, Physiology
2004, English

Visual Compliance: Task-Directed Visual Servo Control

IEEE Transactions on Robotics and Automation
1994, English

How a Neuron Perceives Visual Motion During Self-Motion

The Journal of Physical Fitness and Sports Medicine
2014, English

Lighting for a Visual Inspection Task

Proceedings of the Human Factors Society Annual Meeting
1981, English

© 2025 Amaplex Software S.P.R.L. All rights reserved.
