Zhang, 2015 - Google Patents
Eye tracking and gaze interface design for pervasive displays
- Document ID
- 6459189661393699947
- Author
- Zhang Y
- Publication year
- 2015
- Publication venue
- PQDT-UK & Ireland
Snippet
Eye tracking for pervasive displays in everyday computing is an emerging area in research. There is an increasing number of pervasive displays in our surroundings, such as large displays in public spaces, digital boards in offices and smart televisions at home. Gaze is an …
Concepts
- Eye (title, abstract, description; 630 occurrences)
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
            - G06F3/013—Eye tracking input arrangements
          - G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
          - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
            - G06F3/0304—Detection arrangements using opto-electronic means
            - G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
            - G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterized by the transducing means
          - G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
            - G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
            - G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object or an image, setting a parameter value or selecting a range
            - G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
              - G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, using a touch-screen or digitiser, e.g. input of commands through traced gestures
                - G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for entering handwritten data, e.g. gestures, text
      - G06F19/00—Digital computing or data processing equipment or methods, specially adapted for specific applications
        - G06F19/30—Medical informatics, i.e. computer-based analysis or dissemination of patient or disease data
          - G06F19/34—Computer-assisted medical diagnosis or treatment, e.g. computerised prescription or delivery of medication or diets, computerised local control of medical devices, medical expert systems or telemedicine
      - G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
    - G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
      - G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
        - G06K9/00335—Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
          - G06K9/00355—Recognition of hand or arm movements, e.g. recognition of deaf sign language
Similar Documents
| Publication | Title |
|---|---|
| US9953214B2 (en) | Real time eye tracking for human computer interaction |
| Kar et al. | A review and analysis of eye-gaze estimation systems, algorithms and performance evaluation methods in consumer platforms |
| Drewes | Eye gaze tracking for human computer interaction |
| Rozado et al. | Controlling a smartphone using gaze gestures as the input mechanism |
| Majaranta et al. | Eye tracking and eye-based human–computer interaction |
| US9823744B2 (en) | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
| Mokatren et al. | Exploring the potential of a mobile eye tracker as an intuitive indoor pointing device: A case study in cultural heritage |
| Edughele et al. | Eye-tracking assistive technologies for individuals with amyotrophic lateral sclerosis |
| Mardanbegi et al. | Eye-based head gestures |
| Lee et al. | Designing socially acceptable hand-to-face input |
| Bernardos et al. | A comparison of head pose and deictic pointing interaction methods for smart environments |
| Lei et al. | An end-to-end review of gaze estimation and its interactive applications on handheld mobile devices |
| Kumar et al. | Eye-controlled interfaces for multimedia interaction |
| Wang et al. | BlyncSync: Enabling multimodal smartwatch gestures with synchronous touch and blink |
| CN107562186A (en) | 3D campus guide method based on attention recognition for emotion computing |
| Masai et al. | Eye-based interaction using embedded optical sensors on an eyewear device for facial expression recognition |
| Manresa-Yee et al. | Design recommendations for camera-based head-controlled interfaces that replace the mouse for motion-impaired users |
| Jalaliniya et al. | EyeGrip: Detecting targets in a series of uni-directional moving objects using optokinetic nystagmus eye movements |
| Liu et al. | CamType: Assistive text entry using gaze with an off-the-shelf webcam |
| Jota et al. | Palpebrae superioris: Exploring the design space of eyelid gestures |
| Lai et al. | GazePuffer: Hands-free input method leveraging puff cheeks for VR |
| Chinyere Onyemauche et al. | Towards the use of eye gaze tracking technology: Human computer interaction (HCI) research |
| Deng | Multimodal interactions in virtual environments using eye tracking and gesture control |
| Zhang | Eye tracking and gaze interface design for pervasive displays |
| Shehu et al. | Paradigm Shift in Remote Eye Gaze Tracking Research: Highlights on Past and Recent Progress |