
WO2018164636A1 - Visual performance assessment - Google Patents


Info

Publication number
WO2018164636A1
WO2018164636A1 (PCT/SG2017/050407)
Authority
WO
WIPO (PCT)
Prior art keywords
game
subject
visual
user
performance
Prior art date
Application number
PCT/SG2017/050407
Other languages
English (en)
Inventor
Dinesh Visva GUNASEKERAN
Original Assignee
Gunasekeran Dinesh Visva
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Gunasekeran Dinesh Visva
Priority to AU2017402745A (published as AU2017402745B2)
Priority to SG11201907590RA
Priority to CN201780087961.2A (published as CN110381811A)
Publication of WO2018164636A1


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/02: Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B 3/028: Subjective types, for testing visual acuity; for determination of refraction, e.g. phoropters
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20: Input arrangements for video game devices
    • A63F 13/21: Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/211: Input arrangements using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F 13/212: Input arrangements using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F 13/213: Input arrangements comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/45: Controlling the progress of the video game
    • A63F 13/46: Computing the game score
    • A63F 13/60: Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F 13/67: Generating or modifying game content adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use

Definitions

  • the invention relates to the general field of determining, assessing or monitoring the visual performance of a subject or of subjects.
  • Embodiments relate to a method, a system, a game, a computer program product (or computer program) and a device employed in this field.
  • Non-limiting embodiments relate to utilising customised gamification and design of immersive games on platforms such as virtual reality, augmented reality, and/or mixed reality.
  • Assessments of visual function may include a large number of tests and assessment processes. Amongst the many checks that are currently used are visual acuity, peripheral visual field function, macular function, central visual field function, and contrast sensitivity.
  • Test-retest variability has been described in numerous clinic-based assessments.
  • Manual assessment of peripheral vision is achieved by the technical expert (for example an ophthalmologist, or an eye specialist doctor) instructing the patient to focus their vision on the technical expert's nose, while the technical expert presents a wiggling finger in each of the four quadrants of the patient's peripheral visual fields one eye at a time, moving inwards towards the centre of the patient's vision.
  • the patient then has to tell the technical expert once they are able to see the wiggling finger, and the expert makes a note of his rough gauge of the patient's visual field function. Each eye is tested in turn.
  • This provides a gross assessment of the scope of a patient's peripheral vision that may be done in a clinic setting without much spatial or equipment requirement, but is manpower intensive, low fidelity and difficult to reproduce or standardize.
  • Some existing approaches assess neuropsychological functions such as attention or reaction time of patients with eye diseases, based on assessment of their eye movements when presented with virtual re-enactments of everyday activities.
  • Gamification indicates incorporation of at least one of the characterizing features of a game into a process.
  • Such features may include one or more of scoring, competitiveness, providing rewards and providing rules. The function of this is to enhance test accuracy/fidelity, to better engage the user so as to improve participation, and to increase the user's cooperation with the rules of visual function assessment to facilitate valid results and assessments.
  • Embodiments embed visual function assessment processes and their validity requirements in games to enhance test fidelity and patient compliance with testing instructions. This can standardise the quality of testing procedures and improve reproducibility.
  • the games are immersive games to create assessments that are embedded in dynamic situations or tasks to provide novel measures of visual function.
  • the novel measures are based on users' in-game performance (virtual user information) as well as physical responses (physical user information) to in-game events/tasks, whereby these games can be deployed on platforms such as VR/AR/Mixed reality.
  • Embodiments of the invention may replace or augment the measures available and in-use today.
  • Gamification methods are used to facilitate patient engagement and encourage users to perform as best they can, to be compliant with the recommended frequency of eye screening, and to provide a control for remote assessments of patient visual function.
  • This may be achieved through immediate/overall goals and the deployment of social gaming elements, giving improved patient compliance with eye screening frequency recommendations (e.g. screening once a month), improved convenience of more frequent eye screening or visual function assessment (e.g. weekly) by addressing logistical constraints, and natural controls for remote visual function assessments to identify aberrations in performance that arise from non-organic causes (such as lack of familiarity after a prolonged break, technical problems, lag, etc.).
  • In one aspect, there is provided a method of assessing the visual performance of a subject, comprising providing a game to a subject whereby the subject interacts with the game, and using the results of the interaction to determine at least one aspect of visual performance.
  • In another aspect, there is provided a system for assessing the visual performance of a subject, the system being configured to present at least the video output of a game to the subject, whereby the subject interacts with the game, the system being configured to use the results of the interaction to determine at least one aspect of visual performance of the subject.
  • In a further aspect, a system for assessing the visual performance of a subject is configured to present at least a video output of a game to the subject, whereby the subject interacts with the game, and to provide data indicative of the user's reaction to game play whereby at least one aspect of visual performance of the subject may be determined.
  • In a further aspect, there is provided a device configured to provide signals representing at least the video of a game for viewing by a subject, the device further configured to receive information concerning interaction of a subject with the game, whereby the subject's visual performance may be assessed.
  • In a further aspect, there is provided a game configured, when a video from the game is presented to a subject who interacts with the video, to allow assessment of the visual performance of the subject.
  • the assessment may be performed by a device connected to receive information concerning the subject's interaction with the game.
  • the assessment may be performed by a device receiving information concerning one or more of the movements and the positions adopted by a subject playing the game.
  • the game may be a purpose-written game.
  • the game play may be focused on or written specifically for the assessment.
  • existing tests are gamified by inclusion of game-like features, for example features including one or more of scoring, competitiveness, providing rewards and providing rules.
  • a pre-existing game is adapted to have additional features enabling optical or visual assessments to be made on a player.
  • the method may include gathering one or more of physical and virtual user information and interpretation of the user information.
  • Physical user information may include multiple domains such as gaze and pupillary tracking.
  • Physical user information may include one or more of positional and rotational tracking of the user's entire body and individual body parts such as the head.
  • the method may comprise gathering physical user information using position or orientation sensors.
  • the method may comprise gathering user information using brain-computer interfaces.
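  • Purely as an illustration of what such gathered physical user information might look like (the PhysicalSample structure, its field names and the gaze_deviation_deg helper below are assumptions made for this sketch, not details recited in the embodiments), each sample could be time-stamped and carry gaze, pupil and head-pose readings:

```python
import math
from dataclasses import dataclass
from typing import Tuple

@dataclass
class PhysicalSample:
    """One time-stamped sample of physical user information (illustrative)."""
    t: float                                    # seconds since game start
    gaze_dir: Tuple[float, float, float]        # gaze unit vector, headset coordinates
    pupil_diameter_mm: float
    head_position: Tuple[float, float, float]   # metres in tracking space
    head_rotation: Tuple[float, float, float]   # yaw, pitch, roll in degrees

def gaze_deviation_deg(sample: PhysicalSample) -> float:
    """Angle between the gaze direction and the headset's forward axis (0, 0, 1)."""
    gx, gy, gz = sample.gaze_dir
    norm = math.sqrt(gx * gx + gy * gy + gz * gz) or 1.0
    return math.degrees(math.acos(max(-1.0, min(1.0, gz / norm))))
```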
  • the game may include an aim and shoot mechanic at a centrally located visual stimulus. This may maintain a user's vision in the central visual axis.
  • the game may include a dodging or defending game mechanic based on secondary stimuli from various spatial regions, for example in the peripheral visual field.
  • The game may include one or more of performance tracking, performance registration, score tracking, score interpretation and correlation to the user's functional capacity for specific visual function domains.
  • the game may include automated gradual increments in game difficulty or difficulty tailored to individual user performance. This may be achieved through edge analytics with machine learning. Immediate experiences and goals that are incorporated include winning or completing a game module and achieving a high score. These provide immediate engagement and motivation for users to perform as best they can during assessment.
  • the game may include overall experiences and goals. These may include progressing through multiple stages and collecting enhancements to upgrade a personal gaming profile or avatar. Such techniques involve the user in the game itself and thus facilitate long-term compliance and engagement.
  • The game may be made available as, say, a computer program or a computer program product, for example for download from an online store/marketplace.
  • Fig. 1 shows a block schematic view of a visual/optical assessment system embodying the invention.
  • Referring to Fig. 1, a visual/optical assessment system 100 has a computer device 101, a virtual-reality type headset device 103 and a user-actuated game control device 105.
  • One example of a suitable headset is the HTC Vive.
  • the headset device 103 is designed to be worn by a user and has a display system 109 with a left display 109a for the user's left eye and a right display 109b for the user's right eye, and first and second sensors 111, 113 connected to supply signals 112, 114 to the computer device 101.
  • the display system is designed to place video information in the visual field of a user when wearing the headset, as discussed below.
  • the computer device 101 is configured to provide signals representing at least the video of the game for viewing by a subject, e.g. a user wearing a headset.
  • the computer device is further configured to receive information concerning interaction of a subject with the game, whereby the subject's visual performance may be assessed.
  • the game is configured, when a video from the game is presented to a subject who interacts with the video, to allow assessment of the visual performance of the subject.
  • the assessment may be performed by a device, for example the computer device 101 connected to receive information concerning the interaction with the game.
  • the assessment may be performed by a device, e.g. the computer device 101 or a separate connected device, receiving information concerning the movements and/or the positions adopted by a subject playing the game.
  • the assessment may be performed by a technical expert.
  • First sensor(s) 111 are configured to supply data 112 allowing the user's pupils or gaze to be tracked.
  • Second sensor(s) 113 are configured to supply data 114 allowing head movements of the user to be tracked.
  • the headset device 103 also has an external camera 119 to supply signals to the computer device 101 to enable mixed or enhanced reality experience to be displayed.
  • virtual images are overlaid onto a virtual re-enactment of the user's actual surroundings on an opaque display screen 109.
  • virtual images are overlaid onto the user's actual surroundings as seen by the user through a transparent or translucent display screen 109.
  • the display is configurable and controllable by signals 122 from the computer device 101 to the headset 103, to provide a three-dimensional image by using both displays 109a and 109b to present one version of a virtual image perceived by each eye independently.
  • the computer device 101 may also cause the headset 103 to provide a different image separately to each eye of a user. It can supply no image via display 109b to the right eye, and an image via left display 109a to the left eye or vice-versa, or provide different images to each eye. It may also provide identical images to each eye.
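  • As a minimal sketch of this per-eye presentation logic (the EyeMode enum and frames_for_mode function are illustrative names invented here, not part of the described system), the choice of which image reaches displays 109a and 109b could be expressed as:

```python
from enum import Enum

class EyeMode(Enum):
    BOTH_SAME = "both_same"         # identical image to each eye
    LEFT_ONLY = "left_only"         # image to left display 109a only
    RIGHT_ONLY = "right_only"       # image to right display 109b only
    DICHOPTIC = "different_images"  # a different image to each eye

def frames_for_mode(mode, image, alt_image=None):
    """Return the (left, right) images to route to displays 109a and 109b."""
    if mode is EyeMode.BOTH_SAME:
        return image, image
    if mode is EyeMode.LEFT_ONLY:
        return image, None
    if mode is EyeMode.RIGHT_ONLY:
        return None, image
    return image, alt_image   # DICHOPTIC: independent images per eye
```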
  • displays 109a and 109b are combined in a single display system 109 which provides a single combined image perceived by both eyes.
  • the display system 109 is a transparent screen on which virtual images are displayed and overlaid over the actual surroundings of the user that he sees through the transparent display system 109.
  • In some embodiments, the display system 109 is not contained within a headset but is presented on a hand-held mobile device or a standing/wall-mounted screen, with sensors 111 and 113 embedded or provided separately.
  • The computer device 101 outputs the audio of a game via the headset. Some embodiments have audio output via a separate external device, audio output from the computer device 101, or no audio output, as is the case for example with some mixed reality displays.
  • the computer device 101 in an embodiment stores in memory the instructions for displaying the game using the headset 103, and receives back from the headset, from the various sensors and the game control device 105, signals 112, 114, 121 that cause game play to progress in different ways according to the nature and content of those signals.
  • The computer device 101 is arranged, in this embodiment, after receiving the pupil and movement data signals 112, 114 from the headset 103 and the signals 121 from user actuation of the game controller 105, to process those signals and extract from them data that is either directly indicative of the user's visual performance or that can be analysed and interpreted to provide a measure of the user's visual performance.
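  • A minimal sketch of one such processing step is given below. It assumes an illustrative log format (stimulus timestamps with a spatial-region label, plus response timestamps from controller 105) and pairs each stimulus with the first response inside a time window, one simple way of deriving data that is directly indicative of performance (detected/missed) or open to further analysis (reaction time):

```python
def pair_stimuli_with_responses(stimulus_log, response_log, max_latency_s=1.5):
    """Pair each presented stimulus with the first user response in a time window.

    stimulus_log: list of (timestamp_s, region) tuples, region being a label such
                  as "upper-left" for the part of the visual field stimulated.
    response_log: list of response timestamps, e.g. "defend" presses on controller 105.
    Returns one record per stimulus: detected or missed, plus reaction time.
    """
    responses = sorted(response_log)
    records = []
    for t_stim, region in sorted(stimulus_log):
        hit = next((t for t in responses if t_stim <= t <= t_stim + max_latency_s), None)
        records.append({
            "region": region,
            "detected": hit is not None,
            "reaction_time_s": (hit - t_stim) if hit is not None else None,
        })
    return records
```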
  • Data in the computer device 101 is either automatically interpreted by an embedded analytic program within the architecture of the assessment system 100, relayed via the internet and/or cloud architecture to a remote analytic program, relayed via the internet and/or cloud architecture to a remote expert who interprets the data, or some combination of the above.
  • Here, "game" means any set of images intended to involve a user in play.
  • the game in one embodiment is a game written with the purpose of carrying out optical/visual assessments.
  • an existing game is modified with additional features enabling optical or visual assessments to be made on a player.
  • a set of conventional tests/assessments is gamified by adding at least one of the characterizing features of a game, including for example one or more of scoring, competitiveness, providing rewards and providing rules.
  • The game involves manual interaction via the user-actuated control device 105. Also in this embodiment, interaction between a user and the game can be detected in one or more ways: from movements of the user's body or body parts, from body motion extracted by image analysis software from images of the user's real movements recorded by camera 119, or from eye/pupil movement in the gaze/pupil tracking data 112.
  • brain-computer interfacing is used.
  • The system 100, in some embodiments, also has electroencephalograph monitoring devices connected to supply inputs to the computer device 101 to enable involvement of the brain to be determined.
  • Three-dimensional gaming platforms project virtual images that can be accurately varied in terms of distance, size, and appearance from the user's eye. These enable reproducible assessments of visual functions such as visual acuity testing when integrated within games. This helps to overcome forms of test-retest variability that can be attributed to operator dependency in existing manual assessment processes, such as accurate placement of visual acuity charts at required fixed distances from individuals being assessed.
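  • As a hedged illustration of why exact virtual placement matters for acuity testing (the 5-arcminute Snellen convention used here is standard optometry rather than a detail recited in the embodiments), the optotype height required at a given virtual viewing distance follows directly from geometry:

```python
import math

def optotype_height_m(virtual_distance_m, snellen_denominator):
    """Physical height a Snellen optotype must have at a given viewing distance.

    A 6/6 letter subtends 5 arcminutes overall; a 6/x letter subtends 5 * x / 6.
    A 3D platform can place the virtual target at an exact, repeatable distance,
    so the letter height follows directly from this geometry.
    """
    subtense_arcmin = 5.0 * snellen_denominator / 6.0
    theta_rad = math.radians(subtense_arcmin / 60.0)
    return 2.0 * virtual_distance_m * math.tan(theta_rad / 2.0)

# Example: a 6/6 optotype presented at a virtual distance of 6 m is about 8.7 mm tall.
print(round(optotype_height_m(6.0, 6.0) * 1000, 2), "mm")
```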
  • each visual assessment is carried out by a respective single game, so one game per test.
  • plural assessments are carried out by a single game, for example a game having different "levels", each "level" including a different test, or a multifaceted game that integrates assessment of plural visual functions.
  • all visual assessments are carried out by means of a single game.
  • the computer device 101 displays on the user's headset 103 a game in which the user's main objective is scoring points by using two controllers 105a and 105b.
  • Controller 105a is used to toss a virtual object through a virtual basketball hoop presented in the central visual axis, while controller 105b is simultaneously used to defend the user from virtual objects thrown at him, such as balls in a game of dodgeball, projected from the various spatial regions of the visual field by virtual "members of an opposing team".
  • Peripheral visual field function is assessed by the user appropriately pressing a "defend" button on 105b whenever a stimulus is presented.
  • A persistent failure to respond to stimuli presented in a particular spatial region might represent gradual development or progression of an ocular condition affecting the integrity of the user's visual function in that area of the peripheral visual field, such as glaucoma.
  • Sensor 111 and/or 113 acts as additional stop-guard(s), with output(s) 112 and/or 114 interpreted by the computer device 101, which modifies the game to promote compliance with instructions. In one embodiment, this may involve pausing the game or deducting from the user's score whenever pupil movement or head movement corresponding to a shift in the patient's central visual axis is detected; a minimal sketch follows.
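  • The sketch below assumes illustrative tolerance values (a 3-degree gaze limit and a 5-degree head limit) and simply chooses between pausing and deducting a score penalty; it is one possible realisation, not the embodiment's specific logic:

```python
def compliance_guard(gaze_deviation_deg, head_yaw_deg, head_pitch_deg, score,
                     gaze_limit_deg=3.0, head_limit_deg=5.0, penalty=10):
    """Fixation stop-guard: detect drift away from the central visual axis.

    If the tracked gaze or head pose exceeds the tolerances, either pause the
    game or deduct from the score (both options are described above). The
    thresholds and penalty value here are illustrative assumptions.
    """
    drifted = (abs(gaze_deviation_deg) > gaze_limit_deg
               or abs(head_yaw_deg) > head_limit_deg
               or abs(head_pitch_deg) > head_limit_deg)
    if not drifted:
        return {"action": "continue", "score": score}
    return {"action": "pause_or_penalise", "score": max(0, score - penalty)}
```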
  • In another embodiment, a user traverses a minefield by hopping through a plain grass patch, in order to assess the visual function of contrast sensitivity.
  • If the virtual user is standing in the middle of a 3x3 grid of imaginary squares, the user has eight movement options at any one time (front-left, front, front-right, left, right, back-left, back, back-right) to progress through the field.
  • the "safe path" is represented by square blocks of land area that have a differing contrast from the other blocks. If the user chooses the right movement option corresponding to the square with the contrast difference, he survives and progresses one step. If he chooses the wrong option, a mine explodes and he dies, for example.
  • the user's visual function or contrast sensitivity can be tracked or assessed based on how far they manage to progress through the minefield each time they play this game.
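  • One possible sketch of such a minefield step generator is shown below, assuming a Weber-contrast offset on the single safe square that shrinks with every step survived (the base gray level, starting contrast and decay factor are illustrative values, not taken from the description):

```python
import random

DIRECTIONS = ["front-left", "front", "front-right", "left",
              "right", "back-left", "back", "back-right"]

def next_minefield_step(step_index, base_gray=0.5, start_contrast=0.30,
                        contrast_decay=0.85):
    """Generate one step of the minefield contrast-sensitivity game.

    All eight candidate squares share a base gray level; only the safe square
    differs, by a Weber contrast that shrinks with each step survived, so the
    furthest step reached reflects the smallest contrast the user can still see.
    """
    contrast = start_contrast * (contrast_decay ** step_index)
    safe = random.choice(DIRECTIONS)
    squares = {d: base_gray for d in DIRECTIONS}
    squares[safe] = base_gray * (1.0 + contrast)   # safe square is slightly lighter
    return squares, safe, contrast

def play_until_mistake(choose, max_steps=50):
    """Run steps until the chosen square is not the safe one.
    `choose` maps the dict of square gray levels to a direction string."""
    for step in range(max_steps):
        squares, safe, contrast = next_minefield_step(step)
        if choose(squares) != safe:
            return step, contrast   # steps survived and the last contrast shown
    return max_steps, None
```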
  • Gaming platforms providing augmented or mixed reality offer an enhanced capacity to assess visual performance through gamification.
  • the computer device 101 causes the headset 103 to project to the user mixed reality or augmented reality images on a single opaque display 109 or opaque independent displays 109a and 109b, with game elements layered over the real world, as picked up by the camera 119.
  • the computer device 101 causes the headset 103 to project to the user mixed reality or augmented reality images on a transparent display 109, with game elements layered over the real world, as viewed by the user through the transparent display 109.
  • Augmented and mixed realities can provide added convenience and ease of home-based assessments.
  • The data of the game may be local to the headset 103.
  • Augmented and mixed reality games enable new kinds of visual assessment, such as assessing visual attentiveness to virtual images projected into the user's field of vision through augmented reality, while users continue to carry out their regular daily activities and remain productive during the time spent on assessments.
  • Playing a game/interacting with a virtual interface during everyday tasks for visual function assessment is achieved by augmented reality. This may involve a headset such as Google Glass, allowing a user to wear it as he goes about his daily activities. During rest periods or waiting time, he can select an option to play a game that provides assessment of at least one visual function.
  • Virtual user information includes multiple domains such as user interaction with the virtual space, user response to virtual stimuli, and user performance in modular tasks in the virtual space. This information is gathered into the computer device 101 from the sensors 111, 113 and the controller 105. Virtual user information is gathered based on performance in gamified assessments or based on performance in straightforward virtual tasks and exercises that integrate existing eye screening and/or visual assessment exercises/stations into overall tasks within games. Gathering, processing, and interpretation of certain virtual user information by the computer device 101 may require interpretation of results, play progress and outcome in concert with physical user information.
  • User information is interpreted as raw independent data or in correlation with in-game user performance and interaction data. Data can be both cross-sectional and collected over time from multiple gaming experiences and modules.
  • Cross-sectional refers to study of users at a specific point in time, i.e. analysing the data of the user drawn at a specific point in time (such as with machine learning edge analytics), as opposed to analysing the data by trending the user's performance in the various games over time.
  • Data is collected in terms of a single-user assessment and/or performance in a single-user setting, single-user assessment and/or performance in a multi-user setting, or multi-user assessment or performance in a multi-user setting.
  • data is compared between users in multi-player collaborative games to exclude technical interference causing any change in performance.
  • This also provides a dedicated control subject for any assessment of a user's visual performance.
  • Performing the assessments in the game environment can help improve the convenience and fidelity of eye screening and vision assessment and overcome technical difficulties in certain existing methods of visual assessments.
  • One example pertains to technical difficulties in performing visual field assessments.
  • The user's visual axis and attention are focused where required through the primary objective or mission of a game, with the necessary assessments incorporated as concurrent side objectives or secondary goals to be achieved within the same game.
  • This process can be further enhanced by the integration of gaze, pupillary, and/or head movement tracking as an added measure to re-confirm and track user compliance to the requirements of the specific visual assessment.
  • One embodiment is a virtual/augmented/mixed reality game with a combination of mechanics deployed.
  • an aim and shoot mechanic at a centrally located visual stimulus can be employed to maintain the user's vision in the central visual axis.
  • Performance in this objective can be used to track the user's visual axis and thereby the interpretability/validity of the peripheral vision assessment. This will also help discern variations in user performance that may not be due to changes in visual function but might instead be attributed to increasing or decreasing familiarity with the game from having played it more or less often, respectively.
  • a dodging or defending game mechanic can be deployed based on secondary stimuli from various spatial regions of the peripheral visual field.
  • Performance in this objective can be used to assess the peripheral visual fields of a user, by tracking the score stratified to the user's responsiveness and/or perception of individual segments of the visual field spatial regions.
  • "Stratified" means that the user's in-game performance (scores for various tasks or progress towards various in-game goals) is independently analysed to provide data on the user's visual function, broken down into performance measures for each of the individual facets of visual function; a brief sketch of such stratification follows.
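  • The sketch below (reusing the illustrative record format from the earlier pairing example, which is an assumption of this sketch rather than the document's data model) shows one way such stratification could be computed, grouping responses by visual field region and reporting a detection rate and mean reaction time per region:

```python
from collections import defaultdict

def stratify_by_region(records):
    """Break overall in-game performance down into per-region measures.

    `records` is a list of dicts such as
    {"region": "upper-left", "detected": True, "reaction_time_s": 0.42}.
    """
    grouped = defaultdict(list)
    for r in records:
        grouped[r["region"]].append(r)
    summary = {}
    for region, items in grouped.items():
        detected = [r for r in items if r["detected"]]
        rts = [r["reaction_time_s"] for r in detected]
        summary[region] = {
            "stimuli": len(items),
            "detection_rate": len(detected) / len(items),
            "mean_reaction_time_s": sum(rts) / len(rts) if rts else None,
        }
    return summary
```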
  • the gaming experience is further enhanced and personalised through the incorporation of machine learning and artificial intelligence capabilities.
  • machine learning and artificial intelligence capabilities facilitate personalised assessment of visual performance by varying difficulty of a gaming module to find the exact maximum difficulty that the user is able to cope with using their visual function.
  • Machine learning and artificial intelligence can also exclude the confounding effect that a known impairment in visual performance may have on other modules, by automatically varying the difficulty of modules involving the impaired function and assessing the impact on the user's performance in the other modules. This also facilitates assessment of multiple areas of visual performance simultaneously through complex games in the same way.
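  • One generic way to home in on the highest difficulty a user can cope with is a psychophysical staircase; the sketch below uses a simple 1-up/1-down rule and is offered only as an illustration, not as the machine-learning method the embodiments describe:

```python
def staircase_difficulty(trial_outcomes, start_level=1.0, step=0.1,
                         min_level=0.1, max_level=5.0):
    """Simple 1-up/1-down staircase over a sequence of trial outcomes.

    Difficulty rises after a success and falls after a failure; the threshold
    (roughly the highest difficulty the user copes with) is estimated from the
    mean level at the reversal points. trial_outcomes is e.g. [True, True, False, ...].
    """
    level, levels, reversals = start_level, [], []
    direction = None
    for success in trial_outcomes:
        levels.append(level)
        new_direction = "up" if success else "down"
        if direction is not None and new_direction != direction:
            reversals.append(level)
        direction = new_direction
        level += step if success else -step
        level = min(max(level, min_level), max_level)
    threshold = sum(reversals) / len(reversals) if reversals else level
    return threshold, levels
```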
  • In some embodiments a VR headset is not used; instead, a screen is placed at a fixed or measurable distance from the user's eyes.
  • a headset may enable better correlation of stimuli within aspects of the games to functional areas of the user's visual field as well as improved coverage of the peripheral visual field to provide more immersive and engaging gamified assessments of visual function in order to boost compliance.
  • Each performance metric is individually embedded into the or each developed game so that it can be assessed by analysing the data collected in the back-end about the user's performance in the or each game.
  • Performance can be monitored by the user's in-game performance (scores for various tasks/in-game goals) being analysed to provide data on the user's visual function, broken down into performance measures for each of the individual facets of visual function.
  • Users may be studied at a specific point in time, i.e. by analysing data drawn at that point (such as with machine learning edge analytics), or through repeated, trended assessments of their performance in the various games over time.
  • a gradual sustained decrease in a user's score for a test in the peripheral visual fields may serve as a marker to prompt referral to an ophthalmologist to exclude chronic ocular conditions that affect the peripheral visual field such as glaucoma.
  • Such changes in visual function may often go undetected by users, depending on the visual demands of their daily tasks. For instance, an elderly user whose main daily visual tasks are watching television or reading the newspaper may not place sufficient demand on their peripheral visual field to notice deterioration at the onset of disease.
  • these games can serve as a measure to increase visual demands of users, so as to detect deficiencies in visual function and bring them to the attention of users earlier.
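  • A minimal sketch of such trend detection is given below: it fits an ordinary least-squares slope to dated peripheral-field scores and flags the user for referral when the slope falls below a threshold (the threshold value is an illustrative assumption, not a clinically validated figure):

```python
def peripheral_field_trend(days, scores, slope_threshold=-0.05):
    """Flag a gradual, sustained decrease in a user's peripheral-field score.

    Fits an ordinary least-squares line to (day, score) pairs and recommends
    referral when the slope per day falls below the threshold.
    """
    n = len(scores)
    if n < 2:
        return {"slope_per_day": None, "refer": False}
    mean_x = sum(days) / n
    mean_y = sum(scores) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(days, scores))
    var = sum((x - mean_x) ** 2 for x in days)
    slope = cov / var if var else 0.0
    return {"slope_per_day": slope, "refer": slope <= slope_threshold}
```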
  • Elements of gamification are further incorporated into these tests to facilitate personalized assessment and improved long-term compliance to daily monitoring of vision.
  • These elements include features such as automated gradual increments in game or task difficulty tailored to individual user performance.
  • Immediate experiences and goals that are incorporated include winning/completing a game module and achieving a high score, respectively. These provide immediate engagement and motivation for users to perform as best they can during assessment.
  • Overall experiences and goals are also incorporated such as progressing through multiple stages and collecting enhancements to upgrade a personal gaming profile and/or avatar, respectively. These facilitate long-term compliance and engagement.
  • the computer device 101 includes a backend hardware and/or software system to interpret the raw data from the headset into understandable and actionable information.
  • This is information that the lay-person can make sense of in the monitoring of his condition remotely. In that case, regular involvement of a healthcare worker to interpret and explain the results is not required.
  • The information can be transmitted to an interpreter (such as an ophthalmologist observing a dashboard with the performance of a list of users under his care) who can remotely monitor the user, receive prompts for assessments when a user's visual function decreases, and/or contact the affected user to assess further or instruct them to seek medical attention early.
  • Tests are layered into interactive games to foster intergenerational interaction between users and family members of all ages, reduce depression and/or loneliness amongst the elderly, and improve compliance with visual function monitoring through gamification and social gaming elements.
  • Additional benefits include improved convenience to users due to potential home-based deployment of these gaming platforms, as well as the decreased spatial requirement compared with physical deployment of equipment over distances (such as Snellen charts at a distance of 6 metres in the assessment of visual acuity). Benefits also include time savings, such as utilising patients' clinic waiting time for these tests while they await consultation.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Ophthalmology & Optometry (AREA)
  • Physics & Mathematics (AREA)
  • Cardiology (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Theoretical Computer Science (AREA)
  • Eye Examination Apparatus (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention concerns the gamification of social, remote eye screening and monitoring by integrating modular assessments of visual function into the objectives of interactive games, in order to encourage compliance with testing instructions so as to facilitate improved fidelity, frequent remote re-assessment/trending, and/or automated interpretation. This is developed on any platform, including virtual and/or three-dimensional gaming platforms, which include immersive and deep-dive gaming technology platforms such as virtual reality, augmented reality or mixed reality. This new method will also be deployed on any future platforms that involve headsets or image projections that physically surround users with virtual stimuli.
PCT/SG2017/050407 2017-03-04 2017-08-16 Visual performance assessment WO2018164636A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
AU2017402745A AU2017402745B2 (en) 2017-03-04 2017-08-16 Visual performance assessment
SG11201907590RA SG11201907590RA (en) 2017-03-04 2017-08-16 Visual performance assessment
CN201780087961.2A CN110381811A (zh) 2017-03-04 2017-08-16 Visual performance assessment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
SG10201701755X 2017-03-04
SG10201701755X 2017-03-04

Publications (1)

Publication Number Publication Date
WO2018164636A1 (fr) 2018-09-13

Family

ID=63448544

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2017/050407 WO2018164636A1 (fr) 2017-03-04 2017-08-16 Visual performance assessment

Country Status (4)

Country Link
CN (1) CN110381811A (fr)
AU (1) AU2017402745B2 (fr)
SG (1) SG11201907590RA (fr)
WO (1) WO2018164636A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112842360A (zh) * 2021-01-29 2021-05-28 苏州大学 Method and system for determining the dominant and non-dominant eye
WO2023278926A1 (fr) * 2021-07-01 2023-01-05 Tencent America LLC Qualification test in subject scoring

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111045520B (zh) * 2019-12-27 2021-08-17 电子科技大学 Method for regulating a user's time perception and sense of presence in virtual reality
CN111000524A (zh) * 2019-12-30 2020-04-14 珠海广目锐视医疗科技有限公司 Vision testing system based on an artificial neural network

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006047369A2 (fr) * 2004-10-22 2006-05-04 Vimetrics Llc System and method for screening for visual function disorders
US20100195051A1 (en) * 2007-05-16 2010-08-05 University Court Of The University Of Edinburgh Testing Vision
US20130155376A1 (en) * 2011-12-20 2013-06-20 Icheck Health Connection, Inc. Video game to monitor visual field loss in glaucoma
US20140211167A1 (en) * 2013-01-25 2014-07-31 James Waller Lambuth Lewis Binocular Measurement Method and Device
US20160262608A1 (en) * 2014-07-08 2016-09-15 Krueger Wesley W O Systems and methods using virtual reality or augmented reality environments for the measurement and/or improvement of human vestibulo-ocular performance

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8083354B2 (en) * 2007-10-03 2011-12-27 Diopsys, Inc. Simultaneously multi-temporal visual test and method and apparatus therefor
JP5921533B2 (ja) * 2010-06-11 2016-05-24 Back In Focus System and method for rendering a display to compensate for a viewer's visual impairment, and storage medium storing a program therefor
US20140362110A1 (en) * 2013-06-08 2014-12-11 Sony Computer Entertainment Inc. Systems and methods for customizing optical representation of views provided by a head mounted display based on optical prescription of a user
CN106137112A (zh) * 2016-07-21 2016-11-23 浙江理工大学 Optotype display system based on brainwave detection and optotype display optimisation method


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112842360A (zh) * 2021-01-29 2021-05-28 苏州大学 Method and system for determining the dominant and non-dominant eye
CN112842360B (zh) * 2021-01-29 2022-08-30 苏州大学 Method and system for determining the dominant and non-dominant eye
WO2023278926A1 (fr) * 2021-07-01 2023-01-05 Tencent America LLC Qualification test in subject scoring

Also Published As

Publication number Publication date
AU2017402745A1 (en) 2019-09-12
AU2017402745B2 (en) 2024-01-04
SG11201907590RA (en) 2019-09-27
CN110381811A (zh) 2019-10-25

Similar Documents

Publication Publication Date Title
US12161410B2 (en) Systems and methods for vision assessment
JP5367693B2 (ja) Testing and training of visual cognitive ability and coordination
US11291362B2 (en) Systems and methods for eye evaluation and treatment
JP7125390B2 (ja) Cognitive platform configured as a biomarker or other type of marker
JP5654595B2 (ja) System for measuring and/or training a subject's visual ability, and measurement method
CN101742957B (zh) Testing vision
AU2017402745B2 (en) Visual performance assessment
EP3223678B1 Dynamic computer images for improving visual perception
US20210045628A1 (en) Methods, systems, and computer readable media for testing visual function using virtual mobility tests
CN114190891B (zh) Unilateral neglect assessment system based on eye tracking and an immersive driving platform
US20200085297A1 (en) Simulator for the evaluation of a concussion from signs displayed during a visual cranial nerve assessment
Ahmed et al. Democratizing health care in the Metaverse: How video games can monitor eye conditions using the vision performance index: A pilot study
US20230293004A1 (en) Mixed reality methods and systems for efficient measurement of eye function
Khaleghi et al. Toward using effective elements in adults’ amblyopia treatment in a virtual reality-based gamified binocular application
US20250176822A1 (en) Methods, systems, and computer readable media for assessing visual function using virtual mobility tests
US12440096B2 (en) Systems and methods for eye evaluation and treatment
US20220225873A1 (en) Systems and methods for eye evaluation and treatment
Kini et al. Realization of game mechanics in Virtual Reality for amblyopia treatment
EP4401614B1 Determining a visual performance of an eye of a person
Arnoldi Orthoptic evaluation and treatment
HK40035072A (en) Systems and methods for visual field analysis
Kiviranta MAPPING THE VISUAL FIELD–AN EMPIRICAL STUDY ON THE USER EXPERIENCE BENEFITS OF GAZE-BASED INTERACTION IN VISUAL FIELD TESTING
Siong Relationship between vision and balance in static and dynamic manners

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17900216

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017402745

Country of ref document: AU

Date of ref document: 20170816

Kind code of ref document: A

122 Ep: pct application non-entry in european phase

Ref document number: 17900216

Country of ref document: EP

Kind code of ref document: A1