CN114201031B - Time statistics method and device for eye tracking - Google Patents
- Publication number: CN114201031B (application CN202010979396.4A)
- Authority: CN (China)
- Prior art keywords: time; interest; target; condition; time period
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/30—Monitoring
- G06F11/34—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment
- G06F11/3438—Recording or statistical evaluation of computer activity, e.g. of down time, of input/output operation ; Recording or statistical evaluation of user activity, e.g. usability assessment monitoring of user actions
Abstract
The application discloses a time statistics method and device for eye movement tracking. The method comprises: acquiring application requirement information; determining a time interception condition based on the application requirement information; and, in response to a user's gaze on target information, acquiring a target time period based on the time interception condition. Because the time interception condition is determined from the application requirement, the time period of interest is intercepted automatically while the user gazes at the information; the processing is simple and efficient and meets the requirements of automated batch processing.
Description
Technical Field
The application relates to the technical field of information processing, in particular to a time counting method and device for eye movement tracking.
Background
Eye movement tracking is a technique for acquiring the gaze information of a subject by means of electronic, optical, and other detection hardware.
In eye tracking tests, statistics often need to be computed over data from only part of a time period, for example data within a user's fixation period. In the prior art, however, an observer must watch the transfer of the user's gaze point with the naked eye and intercept the time period manually, which easily introduces human error and affects the accuracy of the time statistics. Moreover, manual marking requires reviewing a large number of experimental recordings one by one, so automated batch processing is impossible and time is wasted.
Disclosure of Invention
In view of this, the present application provides the following technical solutions:
a time statistics method for eye movement tracking, the method comprising:
acquiring application requirement information;
determining a time interception condition based on the application requirement information;
in response to a user's gaze on target information, acquiring a target time period based on the time interception condition.
Optionally, the application requirement information includes a requirement for acquiring a space of interest, and the determining the time interception condition based on the application requirement information includes:
determining the transfer condition of the interest space as the time interception condition.
Optionally, the acquiring, in response to the user's gaze on the target information, a target period based on the time intercept condition includes:
in response to the user's fixation on the target information, judging whether the user's fixation point jumps from the region of interest to a non-interest region;
if so, determining the time during which the user gazes at the region of interest as the target time period.
Optionally, the application requirement information includes a requirement for acquiring a time of interest, and the determining the time interception condition based on the application requirement information includes:
generating the time interception condition according to the time point of the time of interest.
Optionally, the generating the time interception condition according to the time point of the interest time includes:
determining the time interception condition based on the time point of the time of interest and a preset floating time.
Optionally, the acquiring the target time period based on the time interception condition includes:
if the time point of interest is an end condition, determining the period spanned by recursing the floating time forward from that time point as the target time period;
if the time point of interest is a start condition, determining the period spanned by recursing the floating time backward from that time point as the target time period;
if the time point of interest includes a start time and an end time, determining the period between the start time and the end time as the target time period.
Optionally, the method further comprises:
acquiring user gaze data within the target time period, and analyzing the user gaze data to obtain a gaze analysis result.
A time statistics apparatus for eye movement tracking, the apparatus comprising:
a first acquisition unit, configured to acquire application requirement information;
a determining unit, configured to determine a time interception condition based on the application requirement information;
a second acquisition unit, configured to acquire a target time period based on the time interception condition in response to the user's gaze on target information.
A computer-readable storage medium having stored thereon executable instructions which, when executed by a processor, implement the time statistics method for eye movement tracking according to any one of the above.
An electronic device, comprising:
A memory for storing a program;
A processor, configured to execute the program, where the program is specifically configured to:
acquiring application requirement information;
determining a time interception condition based on the application requirement information;
in response to a user's gaze on target information, acquiring a target time period based on the time interception condition.
As can be seen from the above technical solution, the present application discloses a time statistics method and device for eye movement tracking. The method comprises: acquiring application requirement information; determining a time interception condition based on the application requirement information; and, in response to a user's gaze on target information, acquiring a target time period based on the time interception condition. Because the time interception condition is determined from the application requirement, the time period of interest is intercepted automatically while the user gazes at the information; the processing is simple and efficient and meets the requirements of automated batch processing.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only embodiments of the present application, and other drawings may be obtained according to the provided drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a time statistics method for eye tracking according to an embodiment of the present application;
Fig. 2 is a flowchart of a time statistics method for a time point of interest according to an embodiment of the present application;
Fig. 3 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 4 is a schematic structural diagram of a time statistics device for eye tracking according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
Example 1
This embodiment provides a time statistics method for eye movement tracking, used to obtain a statistics period (represented below as the target time period) that satisfies a given requirement. For example, in an eye tracking experiment one may need to count what information a user gazes at during the first 3 seconds of watching a certain object; those 3 seconds are the target time period required for the statistics. At present, a person usually observes the transfer of the gaze point with the naked eye and manually intercepts the time period for statistics, which easily introduces personal error, affects the experimental statistics, and cannot be batch processed. In the present application, the time period of interest is divided and counted intelligently through the interception trigger condition and the time span of the period of interest, so the division and the statistics are more accurate and batch processing is supported.
Referring to fig. 1, a flowchart of a time statistics method for eye movement tracking according to an embodiment of the present application is shown, where the method may include the following steps:
S101, acquiring application demand information.
The application requirement information refers to the requirement of the time statistics, i.e. the application purpose of the target time period obtained by the statistics. The requirement information in this embodiment may include, but is not limited to, one or more of the following: a requirement for acquiring a space of interest, a requirement for acquiring a time of interest, a requirement for acquiring information meeting a specific statistical condition, and the like. For example, if the requirement information is to obtain the user's space of interest, the user's gaze duration on each area can be counted, and an area with a longer gaze duration is a region of interest of the user.
S102, determining time interception conditions based on the application demand information.
In this embodiment, the time interception condition can be generated automatically from the application requirement information. The time interception condition refers to a condition that determines the start and end points of the timing during statistics. It may be specific condition information, for example that timing starts after a transition of the user's gaze point is detected, or the detected transition plus a floating time. Since the time statistics method is applied in the field of eye movement tracking, the time interception condition may also be that the gaze duration, gaze count, pupil size, and the like reach a certain threshold, for example when the requirement information is to acquire the user's interest. The time interception condition may also define key time points, for example a first condition that the fraction of time the user's attention is on the screen falls below a first threshold, and a second condition that the user's attention is diverted to a region of interest on the screen for the first time.
S103, responding to the fixation of the user on the target information, and acquiring a target time period based on the time interception condition.
In the time statistics method for eye movement tracking, the user's gaze on target information serves as the trigger condition of the method: when the user gazes at the target information, the user's related information is detected, and when this information is detected to satisfy the time interception condition, the corresponding time period is acquired as the target time period. The target information may be a target point or a target area, such as the logo of a certain product. The user's related information may be transfer information of the user's gaze point, gaze frequency information for the same target point, eye feature data of the user, and the like. The target time period obtained based on the time interception condition satisfies the user's application requirement information and can serve as the data basis for subsequent statistics, making it convenient to unify the user's interest information; for example, the user's region of interest and points of interest can be obtained.
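The three steps S101 to S103 can be sketched as a small program. The patent provides no code, so the sample format, the predicate-based `InterceptCondition`, and the function name `acquire_target_period` below are purely illustrative assumptions:

```python
from dataclasses import dataclass
from typing import Callable, Optional, Tuple

# Illustrative only: a time interception condition modeled as a pair of
# predicates over timestamped gaze samples.
@dataclass
class InterceptCondition:
    start_pred: Callable[[dict], bool]  # opens the target time period
    end_pred: Callable[[dict], bool]    # closes the target time period

def acquire_target_period(samples, cond) -> Optional[Tuple[float, float]]:
    """Scan timestamped gaze samples and return (start, end) of the first
    period bounded by the time interception condition, or None."""
    start = None
    for s in samples:
        if start is None and cond.start_pred(s):
            start = s["t"]
        elif start is not None and cond.end_pred(s):
            return (start, s["t"])
    return None

# Example: the period opens when the gaze enters the target information
# and closes when it leaves (one possible interception condition).
cond = InterceptCondition(start_pred=lambda s: s["in_target"],
                          end_pred=lambda s: not s["in_target"])
samples = [{"t": 0.0, "in_target": False},
           {"t": 0.5, "in_target": True},
           {"t": 2.5, "in_target": True},
           {"t": 3.0, "in_target": False}]
print(acquire_target_period(samples, cond))  # (0.5, 3.0)
```

Encoding the condition as predicates keeps step S102 (condition generation) separate from step S103 (period acquisition), mirroring the structure of the method.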
This embodiment thus discloses a time statistics method for eye movement tracking comprising: acquiring application requirement information; determining a time interception condition based on the application requirement information; and, in response to a user's gaze on target information, acquiring a target time period based on the time interception condition. Because the time interception condition is determined from the application requirement, the time period of interest is intercepted automatically while the user gazes at the information; the processing is simple and efficient and meets the requirements of automated batch processing.
Example two
In the second embodiment of the present application, the time statistics method for eye tracking is described by taking different kinds of application requirement information as examples.
Related terms that may be used in the present embodiment will be described first:
The time statistics period means that only data from part of a time period is counted in an experiment or practical application. For example, in an eye tracking experiment one may need to count what a user watches during the first 3 seconds of gazing at an object (such as an escape sign); this period is also called the time period of interest (Time of Interest, TOI).
Eye tracking refers to a technique for monitoring a person's gaze point. Eye movement analysis refers to the analysis of gaze points, including obtaining indicators such as gaze duration, gaze count, pupil diameter, and blink frequency, and may involve producing visual forms such as gaze trajectory maps and gaze heat maps.
The region of interest (Area of Interest, AOI) is a region in an image or video that deserves focused attention or requires operation and analysis.
In one possible implementation of the second embodiment, the application requirement information includes a requirement for acquiring the space of interest. The space of interest refers to a space the user is interested in. It may be user-defined, i.e. a spatial region set by the user itself, or a space obtained from analysis of the user's behavioral characteristics. For example, in the field of eye tracking, a user's region of interest may be determined by the duration of the user's gaze on each region: the gaze time on each region is counted, and a region with a longer gaze time is generally defined as a region of interest.
Determining the time interception condition based on the application requirement information comprises: determining the transfer condition of the interest space as the time interception condition.
The transfer condition of the interest space may refer to the transfer of the user's gaze point with respect to the interest space, i.e. the transfer of the gaze point; whether the gaze point is in the interest space is determined by analyzing the gaze point.
Correspondingly, in response to the user's gaze on the target information, acquiring the target time period based on the time intercept condition includes:
in response to the user's fixation on the target information, judging whether the user's fixation point jumps from the region of interest to a non-interest region;
if so, determining the time during which the user gazes at the region of interest as the target time period.
The target information may be information indicating that the user has started to look, i.e. the trigger information for acquiring the target time period. When the user starts to watch the screen, the user's gaze information is acquired to judge whether the gaze point is in the region of interest. The gaze time in each region is recorded, and when the gaze point jumps from the region of interest to a non-interest region, the time during which the user gazed at the region of interest is determined as the target time period. In this embodiment, based on the determined time interception condition, i.e. the gaze point jumping from the region of interest to a non-interest region, the time the user gazes at the region of interest can be counted automatically as the target time. No statistician needs to intercept the time period manually, so the statistical result is more accurate.
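As a sketch of this implementation, the jump of the gaze point from the region of interest to a non-interest region can be detected by scanning timestamped gaze coordinates against a region-membership test. The function and data below are hypothetical illustrations, not taken from the patent:

```python
def intercept_aoi_period(gaze_points, in_aoi):
    """Return (t_enter, t_exit) for the first span in which the gaze point
    stays inside the region of interest, closing when it jumps to a
    non-interest region (the transfer condition described above)."""
    t_enter = None
    for t, (x, y) in gaze_points:
        if in_aoi(x, y):
            if t_enter is None:
                t_enter = t        # gaze entered the AOI
        elif t_enter is not None:
            return (t_enter, t)    # gaze jumped AOI -> non-AOI
    return None

# Hypothetical rectangular AOI and a short gaze trace (time, (x, y)).
aoi = lambda x, y: 0 <= x <= 100 and 0 <= y <= 50
trace = [(0.0, (150, 10)), (0.2, (40, 20)), (0.4, (60, 30)), (0.6, (200, 10))]
print(intercept_aoi_period(trace, aoi))  # (0.2, 0.6)
```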
In another possible implementation, the application requirement information includes a requirement for acquiring a time of interest. This can be applied in fields studying which things the user is interested in; the corresponding time is acquired as the analysis requirement through analysis of the user's gaze points. Generating the time interception condition according to the time point of the time of interest includes: determining the time interception condition based on the time point of the time of interest and a preset floating time. The time point of interest may be the start time point of the user's gaze at a target point or the end time point of that gaze.
Referring to fig. 2, a flowchart of a time statistics method for a time point of interest according to a second embodiment of the present invention is shown, where the method may include the following steps:
S201, acquiring the requirement for the time of interest.
S202, determining the time interception condition based on the time point of the time of interest and a preset floating time.
S203, if the time point of interest is an end condition, determining the period spanned by recursing the floating time forward from that time point as the target time period.
S204, if the time point of interest is a start condition, determining the period spanned by recursing the floating time backward from that time point as the target time period.
S205, if the time point of interest includes a start time and an end time, determining the period between the start time and the end time as the target time period.
In this embodiment, the time point of interest may be the point at which the user gazes at a certain target point, such as a first area. It may be the point at which the user first gazes at the first area, or the point at which the gaze point jumps from the first area to a second area, i.e. the gaze end condition of the first area. Setting the floating time allows the statistical result of the time period to match the user's actual gazing characteristics.
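The three cases S203 to S205 amount to simple interval arithmetic around the time point of interest. A minimal sketch follows; the tuple encoding of the time point is an assumption of this illustration, not the patent's:

```python
def target_period(time_point, float_time=3.0):
    """time_point is encoded as ("end", t), ("start", t) or ("both", t0, t1)."""
    kind = time_point[0]
    if kind == "end":    # S203: recurse the floating time forward from t
        return (time_point[1] - float_time, time_point[1])
    if kind == "start":  # S204: recurse the floating time backward from t
        return (time_point[1], time_point[1] + float_time)
    if kind == "both":   # S205: the period between start and end time
        return (time_point[1], time_point[2])
    raise ValueError(f"unknown time point kind: {kind}")

print(target_period(("end", 10.0)))       # (7.0, 10.0)
print(target_period(("start", 10.0)))     # (10.0, 13.0)
print(target_period(("both", 4.0, 9.0)))  # (4.0, 9.0)
```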
After the target time period is acquired, user gaze data within the target time period can be acquired and analyzed to obtain gaze analysis results, such as the user's gaze trajectory map and gaze heat map.
In the second embodiment, the interception condition and the time span of the time period of interest can be input, the time period of interest (TOI) can be divided intelligently, and areas of interest (AOI) can be further divided within the TOI, reducing the labor cost of dividing AOIs and making the division, and thus the statistics, more accurate. Note that an AOI targets a space of interest while a TOI targets a time of interest. For example, a study may focus only on a person's gaze during the 3rd to 5th seconds of a target video advertisement: that is the TOI. An AOI only concerns whether the user focuses on, say, a model on the screen; to analyze gaze on the model from the 3rd to the 5th second of the advertisement, the TOI and the AOI must be combined. Thus TOI and AOI can be analyzed jointly or separately, depending on the study.
Specifically, the region of interest is the region emphasized in the research analysis. For example, to compare how much attention a model and a LOGO in an advertisement receive, the model and the LOGO can be framed as regions of interest while the other parts of the advertisement are excluded from data analysis. In this embodiment, the interception conditions for key time points can be predefined: for example, the first condition is that the time of attention on the screen is less than 10%, and the second condition is that attention is diverted to the AOI position on the screen for the first time. The conditions may of course also be determined from eye feature information, such as blink frequency or a pupil diameter larger than a certain threshold.
In this embodiment, the floating time (for example, 3 seconds) may be set before or after the time interception condition by a preset execution program, and the TOI can be intercepted intelligently according to the settings of the start and end conditions, which may include the following cases:
(1) The end condition is the interception condition, and the intercepted period recurses forward from the end time, for example intercepting the 3 s before attention meets the time interception condition.
(2) The start condition is the interception condition, and the intercepted period recurses backward from the start time, for example intercepting the 3 s after attention meets the time interception condition.
(3) Both the start and end conditions are interception conditions, for example intercepting the period from when attention first meets the second condition until the first condition is met.
Statistical analysis of the intercepted TOI is then performed, including but not limited to gaze duration, first gaze time, gaze count, pupil diameter, blink frequency, and the like.
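Such statistics over the intercepted TOI can be illustrated as follows. The sample fields (`t`, `fixating`, `pupil_d`) are invented for the sketch; real eye trackers expose richer records:

```python
def toi_statistics(samples, toi):
    """Aggregate gaze metrics over the intercepted TOI.  Each sample is a
    dict with hypothetical keys t (seconds), fixating (bool), pupil_d (mm)."""
    t0, t1 = toi
    fixations = [s for s in samples if t0 <= s["t"] < t1 and s["fixating"]]
    if not fixations:
        return {"first_gaze_time": None, "gaze_sample_count": 0,
                "mean_pupil_diameter": None}
    return {
        "first_gaze_time": fixations[0]["t"],  # first gaze time in the TOI
        "gaze_sample_count": len(fixations),   # proxy for gaze count
        "mean_pupil_diameter": sum(s["pupil_d"] for s in fixations) / len(fixations),
    }

samples = [{"t": 0.0, "fixating": False, "pupil_d": 3.0},
           {"t": 1.0, "fixating": True,  "pupil_d": 3.2},
           {"t": 2.0, "fixating": True,  "pupil_d": 3.4},
           {"t": 5.0, "fixating": True,  "pupil_d": 4.0}]  # outside the TOI
stats = toi_statistics(samples, (0.0, 3.0))
print(stats["first_gaze_time"], stats["gaze_sample_count"])  # 1.0 2
```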
Example III
An eye movement device calculates the user's gaze point from eye features including, but not limited to, the user's eye corner points, eyelid points, pupil center position, pupil radius, and the Purkinje spots formed by corneal reflection. Information such as pupil diameter and degree of eye closure is computed alongside the gaze point. In this embodiment, the time interception condition may be determined based on eye features, and the eye features of the user during the obtained target time period can be analyzed.
The AOI dividing tool may frame targets such as the model or LOGO in an advertisement with circular, square, polygonal, and other tools. That is, the shape of the region of interest is not limited in this embodiment and may be any of the above; if there are multiple regions of interest, each may be framed with the same shape or with different shapes.
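Region-membership tests for the circular, rectangular, and polygonal AOI shapes mentioned above can be sketched as follows; the polygon test uses standard ray casting, and all function names are illustrative:

```python
def in_circle(p, center, radius):
    # Inside iff the squared distance to the center is at most radius^2.
    return (p[0] - center[0]) ** 2 + (p[1] - center[1]) ** 2 <= radius ** 2

def in_rect(p, lo, hi):
    # Axis-aligned rectangle given by its lower-left and upper-right corners.
    return lo[0] <= p[0] <= hi[0] and lo[1] <= p[1] <= hi[1]

def in_polygon(p, verts):
    """Ray casting: p is inside iff a horizontal ray from p crosses the
    polygon boundary an odd number of times."""
    x, y = p
    inside = False
    for i in range(len(verts)):
        x1, y1 = verts[i]
        x2, y2 = verts[(i + 1) % len(verts)]
        if (y1 > y) != (y2 > y) and x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
            inside = not inside
    return inside

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
print(in_polygon((5, 5), square), in_polygon((15, 5), square))  # True False
```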
Referring to fig. 3, a schematic diagram of an application scenario provided by the third embodiment of the present invention is shown: the display of the electronic device in fig. 3 plays an animated film containing a scene in which two animals contend for a small ball. If the animation is used in a child test scenario, the child's attention needs to be judged. With two animals contending for the ball, normal reasoning would hold that the stronger animal captures it: for example, if the two animals are a tiger and a puppy and the tiger wins, a child with normal reasoning ability will look at the tiger more. However, children sometimes become distracted, and if the distracted period is included in the calculation the analysis result is inaccurate.
Correspondingly, the 3 seconds before the child becomes distracted can be intercepted as the TOI for analysis, and the other times are not counted; the AOIs (regions of interest) are the tiger and the puppy. In current practice, a researcher must visually observe whether the child's attention has shifted and then mark the time manually, which introduces error because human observation requires reaction time. And if 1000 children are tested, each recorded video must be played back and checked for attention shifts by the relevant personnel, so the labor cost is high.
With the time statistics method, the TOI end condition is defined as "the gaze point is on the screen less than 10% of the time within 2 s", and the TOI start condition as "the 3 s before the TOI end point". The period corresponding to the start and end conditions is the target time period; the video requiring analysis within the target time period can be intercepted automatically and the related data extracted. For example, with the tiger and the puppy framed as AOIs, the comparison between the child's gaze durations on the puppy and on the tiger during this period can be computed directly.
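The end condition "less than 10% of gaze samples on the screen within 2 s" combined with the start condition "the 3 s before the end point" can be illustrated with a sliding-window scan. The thresholds, field names, and sampling rate below are assumptions made for the sketch:

```python
def find_distraction_toi(samples, window=2.0, threshold=0.10, lookback=3.0):
    """Return (start, end) of the TOI: end is the first time at which fewer
    than `threshold` of the samples in the preceding `window` seconds are
    on-screen; start is `lookback` seconds before that point."""
    for s in samples:
        t = s["t"]
        win = [w for w in samples if t - window <= w["t"] <= t]
        if win and sum(w["on_screen"] for w in win) / len(win) < threshold:
            return (t - lookback, t)
    return None

# Simulated child who looks at the screen until t = 5 s, then looks away;
# samples every 0.5 s for 8 s.
samples = [{"t": i * 0.5, "on_screen": i * 0.5 < 5.0} for i in range(17)]
print(find_distraction_toi(samples))  # (4.0, 7.0)
```

The end point lands at 7.0 s rather than 5.0 s because the 2 s window must empty of on-screen samples before the 10% criterion fires, which matches the intent of smoothing over momentary glances away.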
From this scenario embodiment it can be seen that the time statistics method for eye tracking provided herein simplifies the TOI division process, saves a great deal of time and labor cost, and lets the computer identify and divide the TOI intelligently and accurately, avoiding the deviation inherent in manual division.
Example IV
Referring to fig. 4, an embodiment of the present invention also provides a time statistics apparatus for eye movement tracking, the apparatus comprising:
a first acquisition unit 10, configured to acquire application requirement information;
a determining unit 20, configured to determine a time interception condition based on the application requirement information;
a second acquisition unit 30, configured to acquire a target time period based on the time interception condition in response to the user's gaze on target information.
On the basis of the above embodiment, when the application requirement information in the first acquisition unit includes a requirement for acquiring the space of interest, the determining unit includes:
a first determining subunit, configured to determine the transfer condition of the interest space as the time interception condition.
Optionally, the second acquisition unit includes:
a judging subunit, configured to judge, in response to the user's gaze on the target information, whether the user's gaze point jumps from the region of interest to a non-interest region;
a second determining subunit, configured to determine, if so, the time during which the user gazes at the region of interest as the target time period.
On the basis of the above embodiment, when the application requirement information in the first acquisition unit includes a requirement for acquiring the time of interest, the determining unit includes:
a generating subunit, configured to generate the time interception condition according to the time point of the time of interest.
Optionally, the generating subunit is specifically configured to:
determining the time interception condition based on the time point of the time of interest and a preset floating time.
Optionally, the second acquisition unit includes:
a third determining subunit, configured to determine, if the time point of interest is an end condition, the period spanned by recursing the floating time forward from that time point as the target time period;
a fourth determining subunit, configured to determine, if the time point of interest is a start condition, the period spanned by recursing the floating time backward from that time point as the target time period;
a fifth determining subunit, configured to determine, if the time point of interest includes a start time and an end time, the period between the start time and the end time as the target time period.
On the basis of the above embodiment, the apparatus further includes:
an analysis unit, configured to acquire the user gaze data of the target time period and analyze the gaze data to obtain a gaze analysis result.
Example five
A fifth embodiment of the present invention provides a computer storage medium storing a plurality of instructions adapted to be loaded by a processor to perform the steps of the time statistics method for eye movement tracking according to any one of the first to third embodiments described above.
Example six
An embodiment of the present invention provides an electronic device, including:
A memory for storing a program;
A processor, configured to execute the program, where the program is specifically configured to:
acquiring application requirement information;
determining a time interception condition based on the application requirement information;
in response to a user's gaze on target information, acquiring a target time period based on the time interception condition.
Wherein the program is further adapted to be loaded by the processor and to perform the time statistics method for eye tracking according to any one of embodiments one to three.
The device herein may be a server, a PC, a tablet (PAD), a mobile phone, etc.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In one typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, Random Access Memory (RAM) and/or nonvolatile memory, etc., such as Read Only Memory (ROM) or flash RAM. Memory is an example of a computer-readable medium.
Computer readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of storage media for a computer include, but are not limited to, Phase-change Random Access Memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, Compact Disc Read Only Memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium which can be used to store information that can be accessed by a computing device. Computer-readable media, as defined herein, does not include transitory computer-readable media (transitory media), such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
The foregoing is merely exemplary of the present application and is not intended to limit the present application. Various modifications and variations of the present application will be apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. which come within the spirit and principles of the application are to be included in the scope of the claims of the present application.
Claims (7)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010979396.4A CN114201031B (en) | 2020-09-17 | 2020-09-17 | Time statistics method and device for eye tracking |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202010979396.4A CN114201031B (en) | 2020-09-17 | 2020-09-17 | Time statistics method and device for eye tracking |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN114201031A CN114201031A (en) | 2022-03-18 |
| CN114201031B true CN114201031B (en) | 2024-11-12 |
Family
ID=80644705
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202010979396.4A Active CN114201031B (en) | 2020-09-17 | 2020-09-17 | Time statistics method and device for eye tracking |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN114201031B (en) |
Families Citing this family (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115033874B (en) * | 2022-05-09 | 2023-02-10 | 北京数美时代科技有限公司 | Account intercepting method and device, electronic equipment and computer storage medium |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20130088645A (en) * | 2012-01-31 | 2013-08-08 | 한국전자통신연구원 | Method for providing advertising using eye-gaze |
Family Cites Families (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2014178005A1 (en) * | 2013-04-29 | 2014-11-06 | The West Pomeranian University Of Technology | System and method for probabilistic object tracking over time |
| CN103823849A (en) * | 2014-02-11 | 2014-05-28 | 百度在线网络技术(北京)有限公司 | Method and device for acquiring entries |
| CN106169063B (en) * | 2016-06-22 | 2019-11-26 | 江苏大学 | A kind of method in automatic identification user reading interest area |
| CN108763394B (en) * | 2018-05-21 | 2021-11-23 | 浙江工业大学 | Multi-user eye movement tracking data visualization method and system for collaborative interaction |
| CN108960937B (en) * | 2018-08-10 | 2019-07-30 | 陈涛 | Advertisement sending method of the application based on eye movement tracer technique of AR intelligent glasses |
| CN110362775A (en) * | 2019-07-23 | 2019-10-22 | 秒针信息技术有限公司 | Page appraisal procedure, device, electronic equipment and computer readable storage medium |
| CN110638471A (en) * | 2019-08-30 | 2020-01-03 | 杭州海飘科技有限公司 | Somatosensory technical method based on visual perception of eye tracker |
- 2020-09-17 CN CN202010979396.4A patent/CN114201031B/en active Active
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR20130088645A (en) * | 2012-01-31 | 2013-08-08 | 한국전자통신연구원 | Method for providing advertising using eye-gaze |
Non-Patent Citations (1)
| Title |
|---|
| Analysis of Signage Using Eye-Tracking Technology; Ming Tang; Interdisciplinary Journal of Signage and Wayfinding; 2020-02-03; Vol. 4, No. 1; page 63, left column, paragraph 2 to page 65, paragraph 3, and Table 1 * |
Also Published As
| Publication number | Publication date |
|---|---|
| CN114201031A (en) | 2022-03-18 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11199899B2 (en) | System and method for dynamic content delivery based on gaze analytics | |
| Martinez et al. | Contributions of facial expressions and body language to the rapid perception of dynamic emotions | |
| CN111726689A (en) | Video playing control method and device | |
| KR20180036463A (en) | Method for Processing Image and the Electronic Device supporting the same | |
| CN109271929B (en) | Detection method and device | |
| CN114201031B (en) | Time statistics method and device for eye tracking | |
| CN111860121A (en) | Reading ability auxiliary evaluation method and system based on AI vision | |
| WO2024152820A1 (en) | End-to-end time delay measuring method and apparatus, and device | |
| CN108304312B (en) | Method and device for testing webpage loading speed | |
| CN112364765B (en) | A method and device for detecting gaze point of human eye movement | |
| RU2019132178A (en) | Method and device for determining the direction of rotation of a target object, a computer-readable medium and an electronic device | |
| CN114202522A (en) | Red blood capillary non-contact measurement method, storage medium and processor | |
| Marquart | Eye-tracking methodology in research on visual politics | |
| JP6883083B2 (en) | How to broadcast an informational message about the evaluation of the quality of life of a wristwatch wearer by a wristwatch | |
| RU2738322C1 (en) | Method for transmitting by means of wristwatch information message associated with user's sleep quality assessment of said wrist watch | |
| CN110348900B (en) | Data processing method, system and device | |
| CN111506488B (en) | Application page response testing method and device | |
| CN111714140A (en) | Method, device, system and storage medium for acquiring response information of detected lie person | |
| CN109561350B (en) | User interest degree evaluation method and system | |
| CN114187347A (en) | Skin wrinkle non-contact measurement method, storage medium and processor | |
| CN110324694B (en) | Video playing method and storage medium | |
| CN112158692B (en) | Method and device for acquiring flow of target object in elevator | |
| US20170160720A1 (en) | Computer-implemented method for monitoring machine tool based on user behavior | |
| CN116627789B (en) | Model detection method and device, electronic equipment and storage medium | |
| CN107846612B (en) | Audience rating analysis method and device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||
| GR01 | Patent grant |