
CN101807110A - Pupil positioning method and system - Google Patents

Pupil positioning method and system

Info

Publication number
CN101807110A
Authority
CN
China
Prior art keywords
angle
screen
distance
value
pupil
Prior art date
Legal status
Granted
Application number
CN200910006769A
Other languages
Chinese (zh)
Other versions
CN101807110B (en)
Inventor
赵新民
Current Assignee
Utechzone Co Ltd
Original Assignee
Utechzone Co Ltd
Priority date
Filing date
Publication date
Application filed by Utechzone Co Ltd filed Critical Utechzone Co Ltd
Priority to CN2009100067693A
Publication of CN101807110A
Application granted
Publication of CN101807110B
Status: Active

Landscapes

  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

A pupil positioning method and system. The method comprises the steps of: capturing an image of a user's eye and locating a pupil center and a reflective point in that image; defining a distance value as the distance from the pupil center to the reflective point, and defining an angle value between a reference line through the pupil center and the vector from the pupil center to the reflective point; and using an angle-distance distribution map, built from the distance and angle values obtained while gazing at different positions, to convert the values measured while gazing at any point into the corresponding screen position. The invention requires no extra equipment worn on the body and no complex computation, and can control the positioning of a computer pointer more accurately and effectively.

Description

Pupil positioning method and system

Technical field

The present invention relates to a pupil positioning method and system, and in particular to a pupil positioning method and system that allows a computer to be controlled with the eyes.

Background art

Current eye-tracking technologies are divided into contact and non-contact types according to whether they touch the eye. Contact eye-tracking methods include the search-coil method and electrooculography; non-contact methods are vision-based and can be divided into head-mounted (head-mount) and free-head systems.

Among contact methods, the search-coil method has the user wear a soft contact lens containing an induction coil. When the eyeball rotates and moves the lens, the changing magnetic flux induces an electromotive force in the coil whose magnitude represents the deflection angle of the eye. The drawbacks of this method are that it is easily affected by the condition of the user's eye, such as eye secretions, and that the double-layer structure of the soft lens impairs the user's vision. The electrooculogram (EOG) method attaches multiple electrodes around the eyes and uses the voltage differences produced by eye rotation to determine the vertical and horizontal gaze angles. Its drawbacks are that the skin resistance where the electrodes are attached varies with sebum secretion, making the acquired electrical signals unstable, and that only large eye movements can be recorded; small angular changes cannot.

With head-mounted eye tracking, the user must wear glasses fitted with a small camera. Because the relative distance between the eye and the camera is fixed, the measurement is not disturbed by head movement or changes in that distance; however, the glasses must be fixed to the head to keep the camera and eye in a constant relative position, which is both inconvenient and uncomfortable for the user.

Among free-head eye-tracking technologies, eye trackers combining a screen with dual CCD cameras are available abroad; in Taiwan, the best-known work is the research of Lin Chen-sheng et al. using a screen and a single CCD camera. However, these free-head technologies require relatively complex computation, and although dual-CCD eye trackers can position the pointer precisely, they are very expensive and require two CCD cameras.

The shortcoming of previous eye-control technology, whether contact or non-contact, is that precise positioning is required, and precise positioning demands expensive software and hardware; as a result, eye-control technology has not become affordable enough for the general public.

Summary of the invention

To overcome the problem that existing eye-control technology achieves high accuracy only with expensive equipment, the present invention provides a pupil positioning method that can position accurately using software judgment alone.

The technical solution adopted by the pupil positioning method of the present invention is a method realizable in software: while a user gazes at a screen, an image of the user's eye is captured and used to locate a pupil center and a reflective point; a distance value is defined as the distance from the pupil center to the reflective point, and an angle value is defined between a reference line through the pupil center and the vector from the pupil center to the reflective point; an angle-distance distribution map is built from the distance and angle values obtained while gazing at different positions; thereafter, the distance and angle values obtained while gazing at any position are mapped, in a predetermined way, to a coordinate point in the angle-distance distribution map, which is converted into the corresponding screen position, finally locating where the user is looking on the screen. The user only needs to install computer software implementing the method to achieve high accuracy.

To overcome the problem that existing eye-control technology requires expensive equipment, the present invention also provides a pupil positioning system that can position accurately with simple equipment.

The pupil positioning system of the present invention comprises a light source that emits a beam onto a user's eye, a camera that captures an image of the user's eye, and a computing module; the computing module has a feature positioning unit, a distance-angle processing unit, and a coordinate conversion unit.

The feature positioning unit obtains the eye image and locates a pupil center and a reflective point. The distance-angle processing unit defines a distance value as the distance from the pupil center to the reflective point, and defines an angle value between a reference line through the pupil center and the vector from the pupil center to the reflective point. The coordinate conversion unit builds an angle-distance distribution map from the distance and angle values obtained while gazing at different positions, and then maps the distance and angle values for any gazed position to a coordinate point in that map according to a predetermined way, thereby converting the gaze into the corresponding screen position.

Thus the user needs only the aforementioned light source, camera, and computing module, with no other expensive equipment, so the hardware cost is reduced.

The overall benefit of the invention is that cost is reduced because no extra equipment must be worn on the body, and that the angle-distance distribution map obtained by gazing at different positions is used to convert any gaze into the corresponding screen position, so the positioning function can be controlled accurately and effectively without complex computation.

Brief description of the drawings

Fig. 1 is a schematic diagram of a preferred embodiment of the pupil positioning system of the present invention;

Fig. 2 is a schematic diagram showing how a distance value and an angle value are defined from the pupil center and the reflective point;

Fig. 3 is a flowchart of a preferred embodiment of the pupil positioning method of the present invention;

Fig. 4 is a flowchart of the training stage in the embodiment of the grouping correspondence method used by the pupil positioning system of the present invention;

Fig. 5 is a schematic diagram showing the screen divided into groups of regions, only one group being displayed at a time;

Fig. 6 is a schematic diagram of the method for locating the pupil center and the reflective point;

Fig. 7 is a schematic diagram of the distance-angle differences between the distance and angle values and the groups of screen regions;

Fig. 8 is a schematic diagram of the distance-angle distribution map for regions "1" to "16", in which the distance value Li and angle value θi correspond to a coordinate point;

Fig. 9 is a flowchart of the application stage in the embodiment of the grouping correspondence method used by the pupil positioning system of the present invention;

Fig. 10 is a schematic diagram showing that the set of distance-angle points corresponding to the regions forms an irregular fan-shaped distribution;

Fig. 11 is a schematic diagram showing that a fan shape drawn from a few distance-angle points can be converted into a more regular rectangle by an un-warping operation, after which the rectangle's positions are mapped to screen coordinates;

Fig. 12 is a flowchart of the training stage in the embodiment of the interpolation correspondence method used by the pupil positioning system of the present invention;

Fig. 13 is a flowchart of the application stage in the embodiment of the interpolation correspondence method used by the pupil positioning system of the present invention.

Detailed description of the embodiments

The present invention is described in detail below with reference to the accompanying drawings and embodiments:

Before the present invention is described in detail, it should be noted that in the following description, similar components are denoted by the same reference numerals.

1. System architecture and method concept

Referring to Fig. 1, the preferred embodiment of the pupil positioning system 1 of the present invention comprises a light source 11, a camera 12, a computing module 13, and a screen 14; the computing module 13 comprises a training unit 130, a feature positioning unit 131, a distance-angle processing unit 132, and a coordinate conversion unit 133.

In this preferred embodiment, the computing module 13 is a computer program recorded on a storage medium that causes a computer to execute the pupil positioning method of the present invention, which proceeds as follows:

Referring to Figs. 1 to 3, the training unit 130 displays one target region at a time on the screen 14 for the user to gaze at, performs steps 301 and 302, and then displays the next target region, until a predetermined number of target regions have all been displayed. The feature positioning unit 131 obtains an eye image 401 of a user 4 and locates a pupil center 41 and a reflective point 42 (step 301); the eye image 401 is captured by the camera 12, and the reflective point 42 is the reflection, on the eye, of a beam emitted by the light source 11.

Next, the distance-angle processing unit 132 defines a distance value L as the distance from the pupil center 41 to the reflective point 42, and defines an angle value θ between a reference line through the pupil center 41 (i.e., the horizontal axis) and a vector 43 from the pupil center 41 to the reflective point 42 (step 302).

Then the coordinate conversion unit 133 builds an angle-distance distribution map from the distance values L and angle values θ obtained while gazing at different positions (step 303). After steps 301 to 303 of the training stage are complete, the application stage begins: when the user gazes at any position on the screen 14, the distance and angle values for that position are mapped to a coordinate point in the angle-distance distribution map according to a predetermined way (step 304), yielding the screen position corresponding to any gazed point on the screen 14.

It should be noted that when the angle value θ is negative, the present invention converts it into an angle between 0 and 180 degrees according to the following formula:

if Angle < 0 then
    Angle = 180 + Angle;
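For illustration only (not part of the patent text), a minimal Python sketch of this distance and angle computation, including the normalization above, might look as follows; the function name and the image-coordinate convention are assumptions:

    import math

    def distance_and_angle(pupil_center, glint):
        # Illustrative sketch: distance value L and angle value θ between the
        # horizontal reference line and the glint-to-pupil-center vector.
        x1, y1 = pupil_center          # pupil center in image coordinates (assumed x right, y down)
        x2, y2 = glint                 # reflective point (glint) in image coordinates
        dist = math.hypot(x1 - x2, y1 - y2)
        if x1 == x2:
            angle = 90.0               # vertical vector; tan^-1 of the slope is undefined here
        else:
            angle = math.degrees(math.atan((y1 - y2) / (x1 - x2)))
        if angle < 0:                  # the patent's rule: if Angle < 0 then Angle = 180 + Angle
            angle = 180 + angle
        return dist, angle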

Two preferred embodiments of how the distance value L and angle value θ are converted into the position of the pointer 101 on the screen 14 are described below: one predetermined way is the grouping correspondence method, the other the interpolation correspondence method.

2. Embodiment of the grouping correspondence method

The pupil positioning method of the present invention can be divided into a training stage (Fig. 4) and an application stage (Fig. 9); the flow of each stage is detailed below:

Referring to Fig. 4, the training-stage flow comprises the following steps:

In conjunction with Fig. 1, the training unit 130 divides the screen 14 into groups of regions in advance and displays only one group at a time. Each time a group is displayed for the user to gaze at, it directs the feature positioning unit 131 to perform localization and the distance-angle processing unit 132 to obtain the distance-angle of that group; when it judges that all regions have been displayed, the distance-angle distribution map of the groups of regions is obtained.

As shown in Fig. 5, the screen 14 is divided into 4x4 = 16 groups of regions, but only one group is displayed at a time. When the training-stage flow begins, the count value is set to I = 1 (step 201), and region I is displayed for the user to gaze at (step 202).

Next, an eye image 401 of the user is captured (step 203), and a pupil center 41 and a reflective point 42 are located in it (step 204). As shown in Fig. 6, the pupil center 41 is located by processing the eye image 401 with adaptive thresholding and a morphological close operation, after which the position of the pupil center 41 can be computed.

Then a distance value L and an angle value θ are defined from the pupil center and the reflective point (step 205); the count value is incremented, I = I + 1 (step 206); and whether I = 16 is judged (step 207). If not, steps 202 to 207 are repeated until I = 16 is judged, indicating that the distance-angle distribution map of the groups of regions has been obtained (step 208).
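The training loop of steps 202 to 207 can be sketched compactly as follows; show_region and capture_eye_image are hypothetical helpers standing in for the display and camera steps, while locate_pupil_and_glint and distance_and_angle are the sketches given above:

    def train_distribution(num_regions=16):
        # Hypothetical loop over steps 202-207: one (distance, angle) sample per region.
        distribution = []
        for i in range(1, num_regions + 1):
            show_region(i)                                       # step 202 (display helper assumed)
            eye = capture_eye_image()                            # step 203 (camera helper assumed)
            pupil, glint = locate_pupil_and_glint(eye)           # step 204
            distribution.append(distance_and_angle(pupil, glint))  # step 205
        return distribution  # step 208: the distance-angle distribution map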

For how the distance value L and angle value θ are defined, refer to the description of Fig. 2 above. The distance-angle differences between these values and the groups of regions of the screen 14 are explained as follows:

Referring to Fig. 7(a), the screen 14 first displays region "1", yielding distance value L1 and angle value θ1; in Fig. 7(b) it displays region "2", yielding L2 and θ2; in Fig. 7(c) region "3", yielding L3 and θ3; and in Fig. 7(d) region "4", yielding L4 and θ4. Note that because the gaze moves horizontally across this row, the distance values L1, L2, L3, and L4 differ little, while the angle values θ1, θ2, θ3, and θ4 change considerably.

However, when the gaze moves down from "1", "2", "3", and "4" to the next row of numbers, "5", "6", "7", and "8", the pupil shifts down a short distance, so the distance values obtained while gazing at "5", "6", "7", and "8" become smaller, while the corresponding angle values remain close to θ1, θ2, θ3, and θ4.

Referring to Fig. 8, following the above principle, and because the eyeball is a spherical surface rather than a plane, a distance-angle distribution map such as that for regions "1" to "16" is obtained.

Referring to Fig. 9 together with Fig. 6, the flow of the grouping correspondence method in the application stage comprises the following steps:

An eye image 401 of the user is captured (step 211), and a pupil center 41 and a reflective point 42 are located in it (step 212); a distance value Li and an angle value θi are defined from the pupil center and the reflective point (step 213). For these steps, refer to the description of Fig. 2, not repeated here.

Referring again to Fig. 8, the distance value Li and angle value θi correspond to a coordinate point 5 in the angle-distance distribution map. A minimum objective function is used to find the region nearest to coordinate point 5 (step 214), which completes the positioning of the screen pointer (step 215).

最小目标函式的公式如下:The formula of the minimum objective function is as follows:

Min Obj. = W1|Dist_t - Dist_i| + W2|Angle_t - Angle_i|   (i = 1~16)

/* Dist_i and Angle_i are the target distance and angle values of region i; Dist_t and Angle_t are the current distance and angle values; W1 and W2 are weights assigned to balance the two contributions. */

Dist = √[(x1 - x2)² + (y1 - y2)²]   /* distance */   (Formula 1)

Angle = tan⁻¹[(y1 - y2)/(x1 - x2)]   /* angle */   (Formula 2)
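As a hedged illustration of the region lookup in step 214 (not the patent's own code), the minimum objective function can be evaluated over the trained regions as follows; the default weights and the data layout are assumptions:

    def nearest_region(dist_t, angle_t, regions, w1=1.0, w2=1.0):
        # regions: list of (dist_i, angle_i) pairs trained per screen region;
        # w1, w2 balance the distance and angle contributions (values assumed).
        best_i, best_obj = None, float("inf")
        for i, (dist_i, angle_i) in enumerate(regions):
            obj = w1 * abs(dist_t - dist_i) + w2 * abs(angle_t - angle_i)
            if obj < best_obj:
                best_i, best_obj = i, obj
        return best_i  # index of the screen region the user is gazing at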

As this preferred embodiment shows, the method uses the pupil-center-to-reflective-point distances and angles obtained for multiple groups of regions to establish a correspondence with the groups of screen regions, thereby achieving the positioning function. However, the grouping correspondence method positions better only as more groups of regions are gazed at during training, so the interpolation correspondence method described below is introduced to remedy this drawback.

3. Embodiment of the interpolation correspondence method

Referring to Fig. 10, if the screen is divided into 8x8 = 64 groups of regions, the set of distance-angle points corresponding to the regions forms an irregular fan-shaped distribution; this makes it difficult to obtain the corresponding screen coordinates directly by interpolation.

Referring to Fig. 11, a fan shape 51 drawn from a few distance-angle points can be converted into a more regular rectangle 52 by an un-warping operation, and the positions within rectangle 52 are then mapped to the coordinates of the screen 14, giving a close and satisfactory correspondence.

The pupil positioning method of this embodiment is likewise divided into a training stage and an application stage; the flow of each stage is detailed below:

Referring to Fig. 12, the training-stage flow is similar to that of Fig. 4 and also comprises the following steps:

First, the count value is set to I = 1 (step 601); region I is then displayed for the user to gaze at (step 602); next, an eye image 401 of the user is captured (step 603), and a pupil center 41 and a reflective point 42 are located in it (step 604); then a distance value L and an angle value θ are defined from the pupil center and the reflective point (step 605); the count value is incremented, I = I + 1 (step 606); and whether I = 16 is judged (step 607). If not, steps 602 to 607 are repeated until I = 16 is judged, indicating that the distance-angle distribution map of the groups of regions has been obtained (step 608).

The difference is that this method additionally passes the distance-angle distribution map through an un-warping operation to obtain a normalized distribution map (step 609), and corrects the normalized distribution map using an affine transform (step 610).

The normalized distance-angle distribution map found in step 609 follows these formulas:

u = a0 + a1·x + a2·y + a3·x² + a4·xy + a5·y²
v = b0 + b1·x + b2·y + b3·x² + b4·xy + b5·y²   (Formula 3)

(u, v)ᵀ = [a0 a1 a2 a3 a4 a5; b0 b1 b2 b3 b4 b5] · (1, x, y, x², xy, y²)ᵀ   (Formula 4)

The un-warping operation maps the sample points before un-warping, (X, Y) = {(x1, y1), ..., (xn, yn)}, onto target points (XT, YT) = {(xT1, yT1), ..., (xTn, yTn)}. For example, at least six sample points (x1, y1), (x2, y2), (x3, y3), (x4, y4), (x5, y5), (x6, y6) are input and mapped onto the corresponding six target points (u1, v1), (u2, v2), (u3, v3), (u4, v4), (u5, v5), (u6, v6). Here X and Y denote the distance value Li and the angle value θi, and n is the number of samples.

To obtain the best parameters of this second-order function, the un-warping operation of step 609 solves for the best values of the 12 coefficients a0~a5 and b0~b5 by inverse-matrix computation; once these 12 coefficients are found, substituting the current unknown point (X, Y) into Formula 4 yields the new coordinate point (u, v).
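As an illustrative sketch (an assumption about implementation, not the patent's code), the twelve coefficients can be solved by least squares, which reduces to the matrix-inverse solution when exactly six point pairs are supplied:

    import numpy as np

    def fit_unwarp(samples, targets):
        # samples: n >= 6 (distance, angle) points; targets: the n (u, v) points they map to.
        X = np.array([[1, x, y, x*x, x*y, y*y] for x, y in samples])  # design matrix of Formula 4
        U = np.array(targets)                                         # shape (n, 2)
        coeffs, *_ = np.linalg.lstsq(X, U, rcond=None)                # columns: a0..a5 and b0..b5
        return coeffs                                                 # shape (6, 2)

    def unwarp_point(coeffs, x, y):
        # Substitute an unknown (X, Y) into Formula 4 to get the new point (u, v).
        u, v = np.array([1, x, y, x*x, x*y, y*y]) @ coeffs
        return u, v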

Step 610 uses an affine transform to perform moving calibration of the pupil position; moving calibration removes the effects of image scaling, translation, and rotation from the coordinates, yielding normalized coordinates.

(x′, y′)ᵀ = [a b; c d] · (x, y)ᵀ + (e, f)ᵀ   (Formula 5)

where (x′, y′) are the new coordinates and a, b, c, d, e, f are the affine coefficients. The affine transform is computed by inputting three pairs of coordinate points at any three corners of the uncorrected screen, e.g. (x1, y1), (x2, y2), (x3, y3), together with the three corresponding corner points of the corrected screen, (x′1, y′1), (x′2, y′2), (x′3, y′3); this solves for the six affine coefficients a~f, after which substituting the current unknown point (X, Y) into Formula 5 yields the new coordinate point (x′, y′).
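A sketch of solving the six affine coefficients from three corner-point pairs; the corner values shown are placeholders, and the use of OpenCV's affine estimator is an assumption (a direct 6x6 linear solve would serve equally well):

    import numpy as np
    import cv2

    # Three corner points before and after correction (example values are placeholders).
    src = np.float32([[0, 0], [640, 0], [0, 480]])
    dst = np.float32([[12, 8], [655, 3], [-5, 492]])

    M = cv2.getAffineTransform(src, dst)   # 2x3 matrix [[a, b, e], [c, d, f]]

    def apply_affine(M, x, y):
        # Formula 5: (x', y') = [a b; c d]·(x, y) + (e, f)
        xp, yp = M @ np.array([x, y, 1.0])
        return xp, yp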

Referring to Fig. 13, the flow of the interpolation correspondence method in the application stage comprises the following steps:

An eye image 401 of the user is captured (step 611), and a pupil center 41 and a reflective point 42 are located in it (step 612); a distance value Li and an angle value θi are defined from the pupil center and the reflective point (step 613). For these steps, refer to the description of Fig. 2, not repeated here.

Next, the distance value Li and angle value θi (i.e., X and Y) are mapped to a coordinate point in the corrected normalized distribution map, and the coordinate point is then converted to the corresponding screen position by interpolation (step 614), completing the screen positioning function (step 615).
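A minimal sketch of this final interpolation step, assuming the un-warped map is a rectangle whose extent is known from training; the function name, the rectangle representation, and the linear mapping are illustrative assumptions:

    def to_screen(u, v, rect, screen_w, screen_h):
        # Linearly interpolate a point in the un-warped rectangle to screen pixels.
        # rect = (u_min, v_min, u_max, v_max): extent of rectangle 52 (assumed known).
        u_min, v_min, u_max, v_max = rect
        sx = (u - u_min) / (u_max - u_min) * screen_w
        sy = (v - v_min) / (v_max - v_min) * screen_h
        return sx, sy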

4. Conclusion

As the above description shows, the pupil positioning method and system of the present invention use non-contact technology and have the following advantages:

1. No extra equipment needs to be worn on the body, and the user incurs no expensive cost.

2. The grouping correspondence method uses the pupil-center-to-reflective-point distances and angles obtained for multiple groups of regions to establish a correspondence with the groups of screen regions, so the positioning function can be controlled accurately and effectively without complex computation.

3. The interpolation correspondence method converts a fan shape drawn from a few distance-angle points into a more regular rectangle by an un-warping operation and then maps the rectangle to the screen, giving a close and satisfactory correspondence; the user needs to gaze at only a few positions during training to achieve good positioning, which is both accurate and less laborious.

Claims (12)

1. A pupil positioning method for locating the position on a screen at which a user is gazing while the user watches the screen, characterized in that the method comprises the following steps:
(1) capturing an image of the user's eye, and locating a pupil center and a reflective point in the eye image;
(2) defining a distance value as the distance from the pupil center to the reflective point, and defining an angle value between a reference line through the pupil center and a vector from the pupil center to the reflective point;
(3) building an angle-distance distribution map from the distance values and angle values obtained while gazing at different positions; and
(4) mapping the distance value and angle value obtained while gazing at any position to a coordinate point in the angle-distance distribution map according to a predetermined way, thereby obtaining the position on the screen corresponding to any gazed point.

2. The pupil positioning method as claimed in claim 1, characterized in that: in step (3), the screen is divided into groups of regions in advance, only one group being displayed at a time, and the following steps are performed:
(a1) each time a group of regions is displayed for the user to gaze at, performing step (1) and step (2); and
(a2) when all regions are judged to have been displayed, obtaining the distance-angle distribution map corresponding to the groups of regions.

3. The pupil positioning method as claimed in claim 2, characterized in that the predetermined way of step (4) is a grouping correspondence method comprising the following steps:
(a3) when the user gazes at any position on the screen, performing step (1) and step (2) to obtain a distance value and an angle value for the position;
(a4) obtaining a coordinate point in the angle-distance distribution map from the distance value and the angle value; and
(a5) using a minimum objective function to find the region nearest to the coordinate point, and converting that region into the corresponding screen position.

4. The pupil positioning method as claimed in claim 3, characterized in that the minimum objective function is: Min Obj. = W1|Dist_t - Dist_i| + W2|Angle_t - Angle_i|, where Dist_i and Angle_i are the target distance and angle values, Dist_t and Angle_t are the current distance and angle values, and W1 and W2 are assigned weights.

5. The pupil positioning method as claimed in claim 1, characterized in that: in step (3), the screen is divided into groups of regions in advance, only one group being displayed at a time, and the following steps are performed:
(b1) each time a group of regions is displayed for the user to gaze at, performing step (1) and step (2);
(b2) when all regions are judged to have been displayed, obtaining a distance-angle distribution map covering the groups of regions;
(b3) passing the distance-angle distribution map through an un-warping operation to obtain a normalized distribution map; and
(b4) correcting the normalized distribution map using an affine transform.

6. The pupil positioning method as claimed in claim 5, characterized in that the predetermined way of step (4) is an interpolation correspondence method comprising the following steps:
(b5) when the user gazes at any position on the screen, performing step (1) and step (2) to obtain a distance value and an angle value for the position;
(b6) obtaining a coordinate point in the corrected normalized distribution map from the distance value and the angle value; and
(b7) converting the coordinate point into the corresponding screen position by interpolation.

7. A pupil positioning system, characterized in that the pupil positioning system comprises:
a light source, emitting a beam onto a user's eye;
a camera, capturing an image of the user's eye; and
a computing module, having:
a feature positioning unit, which obtains the eye image and locates a pupil center and a reflective point,
a distance-angle processing unit, which defines a distance value as the distance from the pupil center to the reflective point and defines an angle value between a reference line through the pupil center and a vector from the pupil center to the reflective point, and
a coordinate conversion unit, which builds an angle-distance distribution map from the distance values and angle values obtained while gazing at different positions, and then maps the distance value and angle value obtained while gazing at any position to a coordinate point in the angle-distance distribution map according to a predetermined way, thereby obtaining the position on the screen corresponding to any gazed point.

8. The pupil positioning system as claimed in claim 7, characterized in that the computing module further has a training unit for dividing the screen into groups of regions in advance, only one group being displayed at a time, the training unit performing the following step:
each time a group of regions is displayed for the user to gaze at, controlling the feature positioning unit to perform localization and directing the distance-angle processing unit to obtain the distance-angle of that group of regions.

9. The pupil positioning system as claimed in claim 8, characterized in that the predetermined way performed by the coordinate conversion unit is a grouping correspondence method comprising the following steps:
when the user gazes at any position on the screen, controlling the feature positioning unit to perform localization and directing the distance-angle processing unit to obtain a distance value and an angle value;
obtaining a coordinate point in the angle-distance distribution map from the distance value and the angle value; and
using a minimum objective function to find the region nearest to the coordinate point, and converting that region into the corresponding screen position.

10. The pupil positioning system as claimed in claim 9, characterized in that the minimum objective function is:
Min Obj. = W1|Dist_t - Dist_i| + W2|Angle_t - Angle_i|, where Dist_i and Angle_i are the target distance and angle values, Dist_t and Angle_t are the current distance and angle values, and W1 and W2 are assigned weights.

11. The pupil positioning system as claimed in claim 7, characterized in that the computing module further has a training unit for dividing the screen into groups of regions in advance, only one group being displayed at a time, the training unit performing the following steps:
each time a group of regions is displayed for the user to gaze at, controlling the feature positioning unit to perform localization and directing the distance-angle processing unit to obtain the distance-angle of that group of regions;
when all regions are judged to have been displayed, obtaining a distance-angle distribution map covering the groups of regions; passing the distance-angle distribution map through an un-warping operation to obtain a normalized distribution map; and
correcting the normalized distribution map using an affine transform.

12. The pupil positioning system as claimed in claim 11, characterized in that the predetermined way of the coordinate conversion unit is an interpolation correspondence method comprising the following steps:
when the user gazes at any position on the screen, controlling the feature positioning unit to perform localization and directing the distance-angle processing unit to obtain a distance value and an angle value;
obtaining a coordinate point in the corrected normalized distribution map from the distance value and the angle value; and
converting the coordinate point into the corresponding screen position by interpolation.
CN2009100067693A 2009-02-17 2009-02-17 Pupil positioning method and system Active CN101807110B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009100067693A CN101807110B (en) 2009-02-17 2009-02-17 Pupil positioning method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2009100067693A CN101807110B (en) 2009-02-17 2009-02-17 Pupil positioning method and system

Publications (2)

Publication Number Publication Date
CN101807110A true CN101807110A (en) 2010-08-18
CN101807110B CN101807110B (en) 2012-07-04

Family

ID=42608924

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2009100067693A Active CN101807110B (en) 2009-02-17 2009-02-17 Pupil positioning method and system

Country Status (1)

Country Link
CN (1) CN101807110B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8885877B2 (en) 2011-05-20 2014-11-11 Eyefluence, Inc. Systems and methods for identifying gaze tracking scene reference locations

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
SE524003C2 (en) * 2002-11-21 2004-06-15 Tobii Technology Ab Procedure and facility for detecting and following an eye and its angle of view
CN101026777B (en) * 2006-02-21 2011-01-26 台湾薄膜电晶体液晶显示器产业协会 Display dynamic image color shift detection system and detection method
CN1889016A (en) * 2006-07-25 2007-01-03 周辰 Eye-to-computer cursor automatic positioning controlling method and system
CN101344919B (en) * 2008-08-05 2012-08-22 华南理工大学 Sight tracing method and disabled assisting system using the same

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10039445B1 (en) 2004-04-01 2018-08-07 Google Llc Biosensors, communicators, and controllers monitoring eye movement and methods for using them
US8929589B2 (en) 2011-11-07 2015-01-06 Eyefluence, Inc. Systems and methods for high-resolution gaze tracking
US9740281B2 (en) 2012-11-13 2017-08-22 Huawei Technologies Co., Ltd. Human-machine interaction method and apparatus
CN103809737A (en) * 2012-11-13 2014-05-21 华为技术有限公司 Method and device for human-computer interaction
CN103850582A (en) * 2012-11-30 2014-06-11 由田新技股份有限公司 Eye-movement operation password input method and safe using same
CN103784112A (en) * 2013-10-10 2014-05-14 杨松 Eye movement sensing method, flexible contact, external sensing coil and system
CN104656885A (en) * 2013-11-15 2015-05-27 由田新技股份有限公司 Handheld eye-controlled eye contact device, password input device and method thereof, and computer readable recording medium
CN104656885B (en) * 2013-11-15 2018-05-15 由田新技股份有限公司 Handheld eye-controlled eye contact device, password input device and method thereof, and computer readable recording medium
CN104915013B (en) * 2015-07-03 2018-05-11 山东管理学院 A kind of eye tracking calibrating method based on usage history
CN104915013A (en) * 2015-07-03 2015-09-16 孙建德 Eye tracking and calibrating method based on usage history
CN106371565A (en) * 2015-07-24 2017-02-01 由田新技股份有限公司 Control method based on eye movement and device applied by control method
CN106371566A (en) * 2015-07-24 2017-02-01 由田新技股份有限公司 Correction module, method and computer readable recording medium for eye tracking
WO2019196133A1 (en) * 2018-04-09 2019-10-17 杭州瑞杰珑科技有限公司 Head-mounted visual aid device
WO2020042542A1 (en) * 2018-08-31 2020-03-05 深圳市沃特沃德股份有限公司 Method and apparatus for acquiring eye movement control calibration data
CN113095182A (en) * 2021-03-31 2021-07-09 广东奥珀智慧家居股份有限公司 Iris feature extraction method and system for human eye image

Also Published As

Publication number Publication date
CN101807110B (en) 2012-07-04

Similar Documents

Publication Publication Date Title
CN101807110B (en) Pupil positioning method and system
TWI432172B (en) Pupil location method, pupil positioning system and storage media
Grubert et al. A survey of calibration methods for optical see-through head-mounted displays
CN104113680B (en) Gaze tracking system and method
US11556012B2 (en) Spectacles with electrically-tunable lenses controllable by an external system
US10330566B2 (en) Methods and apparatus for small aperture lensometer
US9465237B2 (en) Automatic focus prescription lens eyeglasses
US9285872B1 (en) Using head gesture and eye position to wake a head mounted device
JP2019091051A (en) Display device, and display method using focus display and context display
US9727130B2 (en) Video analysis device, video analysis method, and point-of-gaze display system
EP3987355B1 (en) Imaging device with field-of-view shift control
CN103838378A (en) Head wearing type eye control system based on pupil recognition positioning
Bohme et al. Remote eye tracking: State of the art and directions for future development
JP2010259605A (en) Gaze measurement apparatus and gaze measurement program
CN102125422A (en) Pupil center-corneal reflection (PCCR) based sight line evaluation method in sight line tracking system
CN101872237A (en) Pupil Tracking Method and System and Correction Method and Module for Pupil Tracking
CN113196141A (en) Optical system using segmented phase profile liquid crystal lens
JP4500992B2 (en) 3D viewpoint measuring device
CN103782131A (en) Measuring device that can be operated without contact and control method for such a measuring device
WO2020253949A1 (en) Systems and methods for determining one or more parameters of a user's eye
CN112782854B (en) Head-mounted display device and distance measurer
US11243448B2 (en) Varifocal optical assembly providing astigmatism compensation
US20240248527A1 (en) Mixed reality interaction with eye-tracking techniques
US11253149B2 (en) Holographic real space refractive sequence
CN112578564A (en) Virtual reality display equipment and display method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant