
Bottos et al., 2019 - Google Patents

An approach to track reading progression using eye-gaze fixation points

Document ID
5720668317437347935
Author
Bottos S
Balasingam B
Publication year
2019
Publication venue
arXiv preprint arXiv:1902.03322

External Links

Snippet

In this paper, we consider the problem of tracking the eye-gaze of individuals while they engage in reading. Particularly, we develop ways to accurately track the line being read by an individual using commercially available eye tracking devices. Such an approach will …
Continue reading at arxiv.org (PDF)

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING; COUNTING
    • G06F — ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 — Eye tracking input arrangements
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING; COUNTING
    • G06F — ELECTRICAL DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041 — Digitisers, e.g. for touch screens or touch pads, characterized by the transducing means

Similar Documents

Publication Title
Pampouchidou et al. Automatic assessment of depression based on visual cues: A systematic review
Sümer et al. Multimodal engagement analysis from facial videos in the classroom
Zhang et al. Data-driven online learning engagement detection via facial expression and mouse behavior recognition technology
Schulte-Mecklenbeck et al. Process-tracing methods in decision making: On growing up in the 70s
Fu et al. Eye tracking the user experience–an evaluation of ontology visualization techniques
Zhang et al. RCEA: Real-time, continuous emotion annotation for collecting precise mobile video ground truth labels
Bottos et al. Tracking the progression of reading using eye-gaze point measurements and hidden Markov models
Ruensuk et al. How do you feel online: Exploiting smartphone sensors to detect transitory emotions during social media use
Davis III et al. Brainsourcing: Crowdsourcing recognition tasks via collaborative brain-computer interfacing
Tong et al. Measuring designers’ cognitive load for timely knowledge push via eye tracking
Baray et al. EOG-based reading detection in the wild using spectrograms and nested classification approach
Jyotsna et al. Intelligent gaze tracking approach for trail making test
Bottos et al. An approach to track reading progression using eye-gaze fixation points
Qi et al. Cases: A cognition-aware smart eyewear system for understanding how people read
Prova Explainable AI-Powered Multimodal Fusion Framework for EEG-Based Autism Spectrum Disorder Classification
Santhosh et al. Multimodal assessment of interest levels in reading: Integrating eye-tracking and physiological sensing
Krishnamoorthy et al. StimulEye: An intelligent tool for feature extraction and event detection from raw eye gaze data
Eraslan et al. Eye-tracking scanpath trend analysis for autism detection
Liu et al. Academic stress detection based on multisource data: a systematic review from 2012 to 2024
Benabderrahmane et al. A novel multi-modal model to assist the diagnosis of autism spectrum disorder using eye-tracking data
Cheekaty et al. Enhanced multilevel autism classification for children using eye-tracking and hybrid CNN-RNN deep learning models
Ekiz et al. Long short-term memory network based unobtrusive workload monitoring with consumer grade smartwatches
WO2024249609A2 (en) Detecting reading state of a user based on visual attention information and semantic features of text
Panwar et al. Detecting negative emotion for mixed initiative visual analytics
Hollenstein Leveraging cognitive processing signals for natural language understanding