Cornea Radius Calibration for Remote 3D Gaze Tracking Systems

Cornea radius estimation is a key technique for 3D gaze estimation in single-camera 3D gaze tracking systems. Traditional methods for one-camera-one-light-source or one-camera-two-light-source systems cannot achieve 3D gaze estimation on their own: the 3D line-of-sight can be estimated only when the cornea radius has been pre-calibrated for the user. A cornea radius calibration method based on the iris radius is proposed in this paper for 3D gaze estimation in remote one-camera-two-light-source systems. We first calibrate the iris radius based on a binocular strategy, estimate the spatial iris center using the calibrated iris radius, and then calibrate the cornea radius via a set of non-linear equations under the constraint of equal distances from the cornea center to the iris edge points. The calibrated cornea radius is verified by binocular optimization constraints. Simulations and physical experiments validate the effectiveness of the proposed method. The iris-based cornea radius calibration approach is novel; it can be used to obtain the cornea radius and 3D gaze in remote one-camera-one-light-source or one-camera-multi-light-source systems.


I. INTRODUCTION
Gaze tracking technology involves the use of electronic, mechanical, optical, and other detection methods to obtain a given subject's current ''visual attention''. It is widely used in human-computer interaction (HCI), virtual reality (VR), driver assistance, human factor analysis, and psychological analysis applications [1]-[3]. Eye gaze tracking methods are generally feature-based or appearance-based. Feature-based gaze tracking methods detect the visual features of the eye (such as the pupil center, the long and short axes of the iris ellipse, or the corneal reflection point) in captured images, extract relevant gaze feature parameters, and then employ a mapping function to estimate the point-of-regard (PoR) or the real 3D line-of-sight (LoS) via a spatial geometric model. Feature-based gaze tracking methods, unlike appearance-based methods, have relatively high accuracy and allow free head movement [4]. Appearance-based methods perform eye gaze estimation based on the eye appearance rather than its distinct features: a regression function that maps the eye appearance to the PoR on the screen is constructed directly from a large set of training samples. Due to the large amount of statistical sampling information used, appearance-based methods are more robust than feature-based methods [4], [5]. (The associate editor coordinating the review of this manuscript and approving it for publication was Michele Nappi.)
Feature-based gaze tracking methods are generally divided into two categories: 2D mapping-based [5]-[9] and 3D model-based [10]-[13]. A 2D mapping-based system normally consists of a single camera and a single light source. The camera captures face and eye images, the system extracts gaze feature parameters, and a mapping model that accounts for individual differences is determined through user calibration; the intersection of the LoS and the computer screen is then calculated as the PoR. The 2D gaze tracking system has a simple configuration but is greatly affected by head movements: if the user's head deviates from the calibration position, the PoR estimation accuracy rapidly degrades. 2D gaze tracking methods usually use a second-order non-linear mapping model to calculate the PoR from the gaze parameters; however, the mapping models often have multiple parameters with individual differences. The user is generally required to stare at more than nine calibration points on the screen to determine these model parameters during the user calibration process, which is rather tedious and time-consuming.
By comparison, 3D gaze estimation methods reconstruct the spatial 3D optical axis (OA) and visual axis (VA) of the eye with spatial geometric and imaging models of the human eye. 3D gaze tracking systems are generally multi-camera-multi-light-source (MCMLS) or one-camera-multi-light-source (OCMLS). Guestrin and Eizenman [13] proved that without knowing the individual difference parameters of the eyeball, 3D gaze tracking systems need at least two cameras or more than two light sources to calculate the 3D coordinates of the cornea center and reconstruct the OA. 3D gaze tracking methods use the reflection of light sources on the corneal surface to determine the invariant parameters (e.g., cornea radius) of the human eyeball during user calibration.
During 3D gaze estimation, variable parameters related to the LoS (e.g., the cornea center or pupil center) are calculated in real time according to the structure of the eyeball and the imaging model so as to reconstruct the OA. Next, according to the kappa angle between the OA and the VA calibrated through user calibration, the LoS can be estimated based on the reconstructed OA. Finally, the intersection point between the LoS and the screen plane is obtained as the PoR. 3D gaze estimation calculates the real direction of the 3D LoS in real time through the eyeball structure and the imaging model, so it is not affected by head movements and makes single-point calibration possible [11], [14].
In summary, 2D gaze tracking technology has a simple system configuration, but can only detect the PoR on the screen, requires complex user calibration processes, and is affected by head movements. 3D gaze tracking technology can detect the 3D gaze in real time, is not affected by head movements, and uses a relatively simple user calibration process. Therefore, 3D gaze tracking technology is superior to 2D gaze tracking technology.
As described above, only OCMLS and MCMLS systems can realize 3D gaze estimation; 3D gaze estimation requires a more complicated system than 2D. A simplified system configuration is necessary in some cases to achieve 3D gaze estimation. In principle, systems with one camera and two light sources can achieve 3D gaze estimation provided that the light sources are not collinear with the camera optical center; otherwise, the traditional method cannot be used to calculate the corneal parameters [15]. However, the general configuration of typical remote gaze tracking systems is a single camera with dual light sources located on either side of the camera; the light sources and camera are basically on the same straight line. It is therefore impossible to estimate the 3D gaze using traditional methods on the typical remote gaze tracking system.
The cornea radius must be user-calibrated to achieve 3D gaze estimation on typical remote systems; the cornea center can then be calculated in real time, and the pupil center can be further calculated to construct the OA of the eyeball during gaze estimation. Effective cornea radius calibration is essential for estimating the 3D LoS in a remote one-camera-two-light-source (OCTLS) system. This paper proposes a cornea radius calibration method based on the iris radius and binocular optimization for 3D gaze estimation via a remote OCTLS system. The method is divided into two steps: (1) calibrating the iris radius through the equal kappa angles of both eyes to determine the spatial iris center, and (2) calibrating the cornea radius based on the iris features once the iris radius and iris center are known.
The contributions of this paper are twofold: (1) It is proved that in a typical remote OCTLS system, where the camera and the two light sources are basically collinear, 3D gaze estimation can be achieved only if the cornea radius is known; that is, this paper gives the necessary condition for such a typical remote OCTLS system to achieve 3D gaze estimation. (2) A cornea radius calibration method based on iris features is proposed, which resolves the cornea radius calibration problem of the typical remote OCTLS gaze tracking system and makes 3D gaze estimation with this system possible.
The rest of this paper is organized as follows. Section II provides a brief introduction to the significance of cornea radius calibration in 3D gaze tracking systems. Cornea radius calibration problems in OCTLS systems are described in Section III. Section IV presents a cornea radius calibration method for remote OCTLS systems. Section V reports the experiments we performed to test the proposed method. Section VI gives a brief summary and conclusion.

II. SIGNIFICANCE OF CORNEA RADIUS CALIBRATION
The general process of 3D gaze estimation is essentially a two-step process. The first step is user calibration, where eyeball invariant parameters (e.g., cornea radius, iris radius, distance between the cornea center and pupil center, and kappa angle) are calibrated. The second step is 3D gaze estimation. According to the calibrated eyeball invariant parameters, eyeball-related variables (e.g., cornea center, pupil center, iris center, and iris normal vector) are calculated. These variable parameters are used to construct the OA of the eyeball and finally to estimate the LoS. The cornea radius is an important eyeball invariant parameter of user calibration, and cornea radius estimation is the key to effective 3D gaze tracking. Many existing 3D gaze tracking systems depend on accurate calibration of the cornea radius.
(1) MCMLS systems are generally used in wearable gaze tracking applications. Because multiple light sources reflect on the corneal surface, the 3D gaze estimation algorithm needs to know the correspondence between the light sources and glints in the image. The relationships between the light sources and glints can only be determined during gaze estimation once the cornea radius is properly calibrated [11], [13], [16], [17].
(2) OCMLS systems can be applied to both the remote gaze tracking systems and wearable gaze tracking systems. According to the principle of the MCMLS gaze estimation algorithm [12], [15], [16], only when the cornea radius is calibrated during user calibration can the cornea center be quickly calculated by the nonlinear equations set based on corneal reflections during gaze estimation. The pupil center can be further calculated using the estimated cornea center. The calculated cornea center and pupil center are used to reconstruct the OA and estimate the 3D LoS.
(3) When the two light sources in a remote OCTLS system are nearly collinear with the camera optical center, the system degenerates into a special OCMLS system. Traditional 3D gaze estimation methods of OCMLS systems cannot be used to estimate the 3D LoS, and generally can only be used as 2D gaze trackers. However, 3D gaze estimation can be achieved after accurately calibrating the cornea radius.
(4) The remote one-camera-one-light-source (OCOLS) system is generally only applicable for estimating the 2D PoR. Without knowing any eyeball invariant parameters, 3D gaze estimation is not possible; it becomes possible after accurately calibrating the invariant parameters of the eyeball [15], [16], [18]. 3D gaze estimation using OCOLS systems is an important avenue for HCI on mobile devices.
The cornea radius cannot be calibrated in existing special OCMLS and OCOLS systems; the proposed method was designed to remedy this. The cornea radius is calibrated from the calibrated iris radius, which is currently the only available route to cornea radius calibration in such OCOLS or special OCMLS systems.

III. PROBLEM STATEMENT
Gaze tracking systems can be generally divided into remote and head-mounted categories. Multi-light-source configurations are often used in helmet-mounted (or glasses-mounted) systems, as shown in Fig. 1(a). The helmet-mounted gaze tracking system consists of two cameras and multiple light sources forming a near-eye light ring that reflects on the cornea and generates the necessary glints; the OA and VA of the eyeball can be constructed accordingly [15]. The structure of a universal remote gaze tracking system with OCMLS is shown in Fig. 1(b): a camera and two visible light sources placed basically collinearly on both sides of the camera. This remote gaze tracking system can actually be degraded to an OCOLS system; it retains a certain particularity, however, because it contains two light sources. Villanueva and Cabeza [15] described an OA reconstruction method for the OCTLS system as follows. As shown in Fig. 2, the light source L_i (i = 1, 2) reflects on the cornea surface at the reflection point G_i; the reflected ray passes through the camera optical center O and intersects the camera image plane at the glint g_i, and the surface normal at G_i is the line connecting the cornea center C and G_i. As the reflection point G_i, the camera optical center O, and the glint g_i are collinear, the reflection point can be expressed as

G_i = O + u_i (O − g_i),

where u_i is a ratio factor. If the cornea radius is r_c, then

‖G_i − C‖ = r_c.   (1)

Since the four points L_i, G_i, O, and C are coplanar, the reflection plane can be expressed as

[(L_i − O) × (g_i − O)] · (C − O) = 0.   (2)

Since the law of reflection states that the angles of incidence and reflection are equal, the following holds:

(L_i − G_i) · (G_i − C) · ‖O − G_i‖ = (O − G_i) · (G_i − C) · ‖L_i − G_i‖.   (3)

Six equations can be written from Eqs. (1)-(3) for the six unknowns: the scale factors u_i (i = 1, 2), the coordinates of the cornea center C = (c_x, c_y, c_z), and the cornea radius r_c. The cornea radius r_c can be calculated accordingly.
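As a sanity check on the constraints in Eqs. (1)-(3), one can build a synthetic configuration in which the reflection relations hold by construction and verify each residual numerically. The sketch below (NumPy; all coordinates are illustrative values, not the paper's data) constructs a light source consistent with the law of reflection at a chosen point on the corneal sphere:

```python
import numpy as np

# Forward check of the corneal-reflection model (Eqs. (1)-(3)):
# for an assumed cornea center C and radius r_c, construct a light
# source L consistent with the law of reflection at a point G on
# the corneal sphere, then verify each constraint.
O = np.zeros(3)                      # camera optical center (origin)
C = np.array([10.0, 5.0, 500.0])     # assumed cornea center (mm)
r_c = 7.8                            # assumed cornea radius (mm)

# Pick a reflection point G on the camera-facing side of the sphere.
n = np.array([-0.1, -0.05, -1.0])
n /= np.linalg.norm(n)               # outward surface normal at G
G = C + r_c * n

r_hat = (O - G) / np.linalg.norm(O - G)   # unit reflected ray (G -> O)
# Law of reflection: incident unit vector mirrored about the normal.
l_hat = 2.0 * np.dot(n, r_hat) * n - r_hat
L = G + 120.0 * l_hat                # place the light source on that ray

# Eq. (1): G lies on the corneal sphere.
assert abs(np.linalg.norm(G - C) - r_c) < 1e-9
# Eq. (2): L, G, O, C are coplanar (mixed product vanishes).
coplanarity = np.dot(np.cross(L - O, G - O), C - O)
assert abs(coplanarity) < 1e-6
# Eq. (3): angle of incidence equals angle of reflection.
inc = np.dot((L - G) / np.linalg.norm(L - G), n)
ref = np.dot((O - G) / np.linalg.norm(O - G), n)
assert abs(inc - ref) < 1e-9
```

Because the incident ray is built by mirroring the reflected ray about the surface normal, all three constraints are satisfied exactly, which is what the assertions confirm.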
However, when the two light sources are collinear with the camera optical center, the two reflection planes are merged into one plane and only one equation can be obtained by Eq. (2). As there are five equations and six unknowns, it is not possible to obtain the cornea radius r c .
To solve this problem, we further analyzed the geometric relations involved in cornea radius calibration. In this special case, the traditional problem of finding the cornea center in 3D space can be transformed into finding it in a 2D plane. As shown in Fig. 3, the corneal sphere is cut by the common reflection plane to obtain a circle with center C and radius r_c; the geometric model is thus simplified from the reflection of light on a sphere to the reflection of light on a circle.
According to the corneal reflection model shown in Fig. 3, the two light sources can be expressed as L_1 = [l_x1, l_y1, l_z1]^T and L_2 = [l_x2, l_y2, l_z2]^T, and the two reflection points as G_1 = [g_x1, g_y1, g_z1]^T and G_2 = [g_x2, g_y2, g_z2]^T. On the reflection plane, both L_1, G_1, O, g_1 and L_2, G_2, O, g_2 satisfy the law of reflection. Since G_i lies on the reflection plane, which passes through the origin O and has normal vector n = [n_x, n_y, n_z]^T, combining with the reflection-plane equation gives the Y-coordinate of G_i:

g_yi = −(n_x g_xi + n_z g_zi)/n_y.   (4)

The reflected light passes through the reflection point G_i and the camera optical center, so its unit vector can be expressed as

r_i = (O − G_i)/‖O − G_i‖.   (5)

The line connecting the cornea center C and the reflection point G_i is the normal of the light reflection model, so the unit vector of the normal is

n_i = (G_i − C)/‖G_i − C‖.   (6)

Denoting the cornea center by C = [c_x, c_y, c_z]^T, the unit vector of the incident light can be expressed as

l_i = (L_i − G_i)/‖L_i − G_i‖.   (7)

According to the law of reflection, r_i, n_i, and l_i satisfy

r_i + l_i − 2(n_i · r_i) n_i = 0.   (8)

The reflection point G_i falls on the cornea circle, so the distance between the cornea center C and the reflection point G_i equals the cornea radius, as in Eq. (1). When solving for G_i there are two intersection points on the circle; G_i should be the one near the camera optical center, i.e., the direction of CG_i should face the camera:

(G_i − C) · (O − C) > 0.   (9)

A set of non-linear equations can be obtained from Eqs. (1), (8), and (9) by substituting Eqs. (5)-(7) into Eq. (8) to determine the cornea center. There are five unknowns: the X- and Z-coordinates of the cornea center (c_x, c_z), the ratio factors of the reflection points (u_i), and the cornea radius (r_c).
However, there is no unique solution for the equations since each cornea radius corresponds to a cornea center.
We conducted a simulation and found that the cornea centers calculated from the above equations are all located on the line connecting the cornea center C and the camera optical center O. Therefore, although the cornea radius cannot be solved from the equations, the line connecting C and O can be determined; that is, the unit vector of OC can be calculated, and the cornea center can be expressed as C = τ·OC/‖OC‖, where τ is an unknown scale factor. By substituting Eqs. (5)-(7) into Eq. (8), a set of non-linear equations can be obtained from Eqs. (1) and (8) containing four equations and the unknowns τ, u_i, and r_c. To solve these equations in our simulation, we used the following step-wise algorithm.

Algorithm 1 Possible Value Calibration of Cornea Radius
Input: glints g_1, g_2, light sources L_1, L_2, unit vector of OC;
Output: cornea radius r_c, cornea center C;
1: for τ = τ_1 to τ_n do
2:   C ← τ · OC/‖OC‖;
3:   solve the ratio factors u_1, u_2 from Eq. (3);
4:   G_i ← O + u_i(O − g_i), i = 1, 2;
5:   compute r_i, n_i, l_i by Eqs. (5)-(7);
6:   r_c ← (‖G_1 − C‖ + ‖G_2 − C‖)/2;
7:   error_1 = r_1 + l_1 − 2(n_1 · r_1)n_1;
8:   error_2 = r_2 + l_2 − 2(n_2 · r_2)n_2;
9:   error_3 = ‖G_1 − C‖ − r_c;
10:  error_4 = ‖G_2 − C‖ − r_c;
11:  e(τ) = error_1² + error_2² + error_3² + error_4²;
12: end for
13: τ = Index(min(e));
14: return C = τ · OC/‖OC‖ and the corresponding r_c;

According to the current value of τ and the direction of OC, a candidate cornea center C is obtained; the ratio factor u_i can then be calculated by Eq. (3), since it is the only remaining unknown. The vectors of the incident light, the reflected light, and the normal follow. By substituting these parameters into Eq. (1) and Eq. (8), the sum of squared errors of each equation is obtained, and the ratio factor τ that minimizes this sum is taken as the optimal τ. The cornea center C and the cornea radius r_c are calculated accordingly.
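The grid search over τ can be sketched as follows (a simplified NumPy reconstruction under stated assumptions: the synthetic glint rays are generated numerically, the OC direction is taken as known, and u_i is found by a dense 1D search along each glint ray rather than a closed-form solve; all scene values are illustrative, not the paper's data):

```python
import numpy as np

O = np.zeros(3)  # camera optical center (coordinate origin)

def specular_point(L, C, r, n_grid=20001):
    """Numerically locate the reflection point G on the corneal sphere
    (center C, radius r) for light source L, by a dense 1D search in
    the plane through O, C, and L."""
    e1 = (O - C) / np.linalg.norm(O - C)
    w = L - C
    e2 = w - np.dot(w, e1) * e1
    e2 /= np.linalg.norm(e2)
    phi = np.linspace(-np.pi / 2, np.pi / 2, n_grid)
    G = C + r * (np.outer(np.cos(phi), e1) + np.outer(np.sin(phi), e2))
    N = (G - C) / r
    Lh = L - G; Lh /= np.linalg.norm(Lh, axis=1, keepdims=True)
    Rh = O - G; Rh /= np.linalg.norm(Rh, axis=1, keepdims=True)
    res = np.abs(np.einsum('ij,ij->i', N, Lh) - np.einsum('ij,ij->i', N, Rh))
    return G[np.argmin(res)]

def ray_residual(C, d, L, us):
    """For candidate cornea center C, pick G = u*d on the glint ray that
    best satisfies the law of reflection; return the angular residual
    there and the distance ||G - C|| (the implied cornea radius)."""
    G = np.outer(us, d)
    N = G - C
    N /= np.linalg.norm(N, axis=1, keepdims=True)
    Lh = L - G
    Lh /= np.linalg.norm(Lh, axis=1, keepdims=True)
    res = np.abs(np.einsum('ij,ij->i', N, Lh) - N @ (-d))
    k = int(np.argmin(res))
    return res[k], float(np.linalg.norm(G[k] - C))

# Synthetic scene (illustrative values): lights collinear with O.
C_true = np.array([5.0, 2.0, 500.0])
r_true = 6.8
lights = [np.array([100.0, 0.0, 0.0]), np.array([-100.0, 0.0, 0.0])]

# Unit directions of the two glint rays the camera would observe.
rays = [specular_point(L, C_true, r_true) for L in lights]
rays = [G / np.linalg.norm(G) for G in rays]

# Grid search over tau along the (known) OC direction.
c_hat = C_true / np.linalg.norm(C_true)
us = np.arange(450.0, 550.0, 0.01)
taus = np.arange(490.0, 510.0, 0.25)
errors = []
for tau in taus:
    C = tau * c_hat
    res, dists = zip(*[ray_residual(C, d, L, us)
                       for d, L in zip(rays, lights)])
    # Reflection residuals (Eq. (8)) plus radius consistency (Eq. (1)).
    errors.append(res[0] ** 2 + res[1] ** 2 + (dists[0] - dists[1]) ** 2)
errors = np.asarray(errors)
tau_best = taus[int(np.argmin(errors))]
```

The residual e(τ) typically exhibits a flat valley rather than a single sharp minimum, which mirrors the ambiguity analyzed in this section: many (τ, r_c) pairs fit the reflection constraints almost equally well.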
Our simulation results indicated that the cornea center and cornea radius cannot be uniquely determined simultaneously. As shown in Fig. 4, when the cornea center C is determined, the cornea radius can be uniquely determined; correspondingly, when the cornea radius is determined, the cornea center can be uniquely determined. Therefore, when the cornea radius is known, there is a unique solution for the cornea center, which can be obtained from the non-linear Eqs. (1), (8), and (9). That is to say, in practice, once the cornea radius is calibrated during user calibration, the cornea center can be determined definitively during the gaze estimation process. However, as shown in Fig. 4(b), the cornea center and radius are both unknown during user calibration, so there is no unique solution for cornea center estimation and cornea radius calibration: many circles with different radii and centers all satisfy the corneal reflection imaging model.
In summary, the traditional method can be used to estimate the cornea radius and the cornea center in an OCTLS system provided that the two light sources and the camera optical center are not collinear; that is, the reflection planes formed by the two light sources should intersect at the line connecting the cornea center and the camera optical center. Traditional gaze estimation methods applied to the special OCTLS system shown in Fig. 1(b) cannot calibrate the cornea radius during user calibration [4], [5]. During 3D gaze estimation, the cornea center then cannot be estimated, the OA of the eyeball cannot be reconstructed, and thus the 3D LoS cannot be estimated.

IV. CORNEA RADIUS ESTIMATION

A. OVERVIEW OF PROPOSED METHOD
As discussed in Section III, the cornea radius cannot be calibrated through the light sources and pupil information in a universal remote OCTLS system in which a camera and two light sources are placed basically collinearly, with the light sources on both sides of the camera. To remedy the shortcomings of the traditional cornea radius calibration method [15], we established a cornea radius calibration method based on iris radius calibration for this remote OCTLS system. The procedure of the proposed method is shown in Fig. 5. System calibration includes camera parameter calibration and light source position calibration. This paper uses a small ball that reflects the light sources to simulate the eyeball: as the ball radius is known, the positions of the light sources in the camera coordinate system can be calculated [19], which solves the problem that the light sources are not in the field of view of the system camera. Once system calibration and feature detection are completed, the iris radius of the human eye is calibrated according to the strategy of equal kappa angles of both eyes, and then the cornea radius is calibrated. Finally, the calibrated cornea radius is verified based on the binocular optimization strategy.

B. CORNEA RADIUS CALIBRATION BASED ON IRIS RADIUS 1) IRIS RADIUS CALIBRATION
We established an iris radius calibration method based on the binocular strategy for OCOLS systems in a previous study conducted in our laboratory [20]. The iris of the user's eye can be regarded as a circular target, so the iris center and its normal vector can be expressed through the feature parameters of the iris ellipse extracted from images of the user's face. Assuming that the iris radius is r, the 3D position of the iris center I and its normal vector D are given by Eq. (10), where λ_1, λ_2, λ_3 and e_1, e_2, e_3 are the eigenvalues and the corresponding normalized eigenvectors of a real symmetric matrix obtained from the coefficients of the cone equation, which is determined by the iris image ellipse in the camera coordinate system and the focal length. For simplicity, only the two feasible solutions of the iris center and the corresponding normal vector are used here. The OA of the user's eye is perpendicular to the plane determined by the iris edge and passes through the iris center, so it can be expressed as

X = I + tD.   (11)

By the law of reflection, both the incident and reflected lights are contained in one plane together with the light source L, the cornea center C, the camera optical center O, and the glint g. Therefore,

[(L − O) × (g − O)] · (C − O) = 0.   (12)

Thus, the 3D position of the cornea center C can be obtained by solving Eqs. (11) and (12). With L = [l_1, l_2, l_3]^T and g = [g_1, g_2, g_3]^T, and taking O as the origin, the cornea center is the point of the OA lying in the plane spanned by L and g:

C = I − [(L × g) · I / ((L × g) · D)] D.   (13)

Since the iris center I is proportional to the iris radius r, the coordinates of the cornea center C are also proportional to r, which can be expressed as C = r[c_1, c_2, c_3]^T.
As Fig. 6 shows, when the user gazes at the screen point S, the unit vector of the OA can be expressed by the iris normal vector D, and the unit vector of the VA can be expressed as V = (S − C)/‖S − C‖. The angles between the visual and optical axes of the left and right eyes have the same magnitude, so the equality of these angles is expressed as

arccos(V_l · D_l) = arccos(V_r · D_r).   (14)

The iris radius r, which is the only unknown, can then be estimated. We used an optimization process to obtain the most accurate possible iris radius. The theoretical range of the iris radius is 5-6.5 mm; this range was traversed at a step length of 0.0001 mm, and the traversal terminated only when the difference in kappa angle between the left and right eyes fell below the set accuracy threshold. The corresponding value of the iris radius was then taken as the calibration result.
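The iris-radius traversal described above can be sketched as follows (a toy NumPy example: the per-eye quantities C = r·c_0 and the optical-axis vectors are synthetic stand-ins constructed so that the binocular kappa equality holds at an assumed true radius of 5.8 mm; they are not the paper's calibration data):

```python
import numpy as np

def kappa(r, S, c0, D):
    """Kappa angle between the visual axis V = (S - C)/||S - C|| and
    the optical axis D, with the cornea center scaling as C = r*c0."""
    C = r * c0
    V = (S - C) / np.linalg.norm(S - C)
    return np.arccos(np.clip(np.dot(V, D), -1.0, 1.0))

# Synthetic two-eye setup (illustrative numbers).
S = np.array([50.0, -80.0, 0.0])        # calibration point on the screen
c0_l = np.array([-1.0, 0.5, 86.0])      # left eye: C_l = r * c0_l
c0_r = np.array([8.0, 0.5, 86.0])       # right eye: C_r = r * c0_r
D_l = np.array([0.05, -0.21, -0.97])
D_l /= np.linalg.norm(D_l)              # left optical axis (toward camera)
r_true = 5.8                            # assumed true iris radius (mm)

# Construct a right-eye optical axis whose kappa matches the left
# eye's at r_true, so the binocular equality holds at the true radius.
k_target = kappa(r_true, S, c0_l, D_l)
V_r = S - r_true * c0_r
V_r /= np.linalg.norm(V_r)
p = np.cross(V_r, np.array([0.0, 0.0, 1.0]))
p /= np.linalg.norm(p)
D_r = np.cos(k_target) * V_r + np.sin(k_target) * p

# Traverse candidate radii (5-6.5 mm, step 0.0001 mm) and keep the
# one minimizing the left/right kappa difference (Eq.-style equality).
radii = np.arange(5.0, 6.5, 1e-4)
diff = np.array([abs(kappa(r, S, c0_l, D_l) - kappa(r, S, c0_r, D_r))
                 for r in radii])
r_est = radii[int(np.argmin(diff))]
```

The returned radius is the grid point at which the binocular kappa difference is smallest, mirroring the traversal-with-threshold procedure in the text.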

2) CORNEA RADIUS CALIBRATION
As shown in Fig. 7, each iris edge point i_k in the image corresponds to a spatial iris edge point I_k, and Oi_k is the projection line of I_k. T_k is the intersection point of Oi_k and the cornea surface, i_k (as a unit vector) denotes the direction of the incident light along Oi_k, and f_k is the unit vector of the refracted light inside the cornea. Let l be the distance between the iris center I and the cornea center C. The iris is regarded as a circular target, so the line connecting the iris center I and the cornea center C is perpendicular to the iris plane, and the lines connecting the cornea center C with the spatial iris edge points I_k form a right circular cone whose generatrices CI_k all have equal length. The vector T_kI can be decomposed as T_kI = T_kI_k + I_kI, where T_kI_k can be written as T_kI_k = t·f_k, with t the distance between T_k and I_k; any chord of the cone's base circle is perpendicular to the cone's axis CI. The constraints are therefore that the generatrices have equal length,

‖CI_k‖ = ‖CI_j‖ for all k, j;   (20)

that the axis CI is perpendicular to the iris plane,

CI · (I_k − I) = 0;   (21)

and that the distance between each iris edge point and the iris center equals the iris radius,

‖I_k − I‖ = r.   (22)

The iris radius can be calibrated by the method presented in Section IV-B-1), and the iris center can then be calculated according to Eq. (10). As discussed in Section III, the line connecting the camera optical center and the cornea center can be obtained for a typical OCTLS system, so the cornea center can be expressed as C = τ·OC/‖OC‖, where τ is a scale factor. The intersection point T_k of Oi_k and the cornea surface can be set as T_k = O + µ_k(O − i_k), where µ_k can be expressed in terms of the unknown cornea center and cornea radius. Assuming that there is no refraction of the spatial iris edge points at the cornea and that the edge of the spatial iris intersects the outer surface of the cornea, T_k in the non-linear equations composed of Eqs. (20)-(22) can be directly replaced by I_k. The cornea radius r_c and the cornea center C can then be solved.
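Under the no-refraction assumption (T_k = I_k), the equidistance constraint ‖C − I_k‖ = r_c with C = τ·OC/‖OC‖ reduces, after subtracting pairs of equations, to a linear least-squares problem in τ. The sketch below (NumPy, with illustrative synthetic iris edge points, not the paper's data) demonstrates this reduction; note that the optical axis must be tilted away from the OC line, otherwise the system is degenerate:

```python
import numpy as np

# Cornea center constrained to the known OC line: C = tau * c_hat.
C_true = np.array([3.0, 2.0, 500.0])
c_hat = C_true / np.linalg.norm(C_true)
r_c_true, r_iris = 6.8, 5.9
l_dist = np.sqrt(r_c_true**2 - r_iris**2)   # iris center to cornea center

# Optical axis tilted away from the OC line (off-axis gaze); with the
# axis exactly along OC, every point of that line is equidistant from
# all iris edge points and the system degenerates.
axis = -c_hat + np.array([0.15, -0.10, 0.0])
axis /= np.linalg.norm(axis)
e1 = np.cross(axis, np.array([0.0, 1.0, 0.0]))
e1 /= np.linalg.norm(e1)
e2 = np.cross(axis, e1)

# Spatial iris edge points I_k lying on the corneal sphere by construction.
I_center = C_true + l_dist * axis
thetas = np.linspace(0.0, 2.0 * np.pi, 12, endpoint=False)
I_k = np.array([I_center + r_iris * (np.cos(t) * e1 + np.sin(t) * e2)
                for t in thetas])

# ||tau*c_hat - I_k|| = r_c for all k; subtracting pairs eliminates
# r_c and gives 2*tau*c_hat.(I_j - I_0) = ||I_j||^2 - ||I_0||^2.
A = 2.0 * (I_k[1:] - I_k[0]) @ c_hat
b = np.sum(I_k[1:] ** 2, axis=1) - np.sum(I_k[0] ** 2)
tau = float(np.linalg.lstsq(A[:, None], b, rcond=None)[0])

C_est = tau * c_hat
r_c_est = float(np.mean(np.linalg.norm(I_k - C_est, axis=1)))
```

With noise-free edge points the recovered τ and r_c match the constructed values exactly, illustrating why the equidistance constraint pins down both the cornea center and the cornea radius once the iris center is known.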

3) VERIFICATION OF CORNEA RADIUS BASED ON BINOCULAR OPTIMIZATION CONSTRAINTS
After calibrating the cornea radius (Fig. 3), according to the equal cornea radius (Eq. (1)) and the law of light reflection at the cornea surface (Eq. (3)), a non-linear equation set (Eq. (23)) with four expressions can be written in the four unknowns u_1, u_2, τ, and r_c, where θ_1-θ_6 are known constants. This set has many different solutions, so the cornea radius cannot be estimated from it directly. Assuming that the cornea radius is known (i.e., after adding the cornea radius as a constraint), Eq. (23) has a unique solution. There is thus a unique correspondence between the unknown coefficients u_1, u_2, τ and the cornea radius r_c, which can be represented as

(u_1, u_2, τ) = f(r_c).   (24)

For the left and right eyes,

(u_1l, u_2l, τ_l) = f_l(r_c), (u_1r, u_2r, τ_r) = f_r(r_c).   (25)

The cornea centers of the two eyes are

C_l = τ_l · OC_l/‖OC_l‖, C_r = τ_r · OC_r/‖OC_r‖.   (26)

The distance between the cornea centers of the left and right eyes can be expressed as

D = ‖C_l − C_r‖.   (27)

The cornea radii of both eyes are taken to be the same. As the human eye rotates, D varies very little and can be regarded as approximately fixed. The user stares at different calibration points during the user calibration process, and his or her eyes rotate freely while the system camera captures images. After processing the eye image in each frame and obtaining the line connecting the cornea center and the camera optical center, different values of D can be estimated by Eq. (27) for different assumed values of the cornea radius r_c. The relationship between D and r_c as established in our experiment is shown in Table 1, where r_c1, r_c2, ..., r_cn are the independent variables and D_11, D_12, ..., D_1n the dependent variables (Curve 1, from Image 1). Different relationship curves are determined from different images in the same way. In theory, curves 1 to n are approximately coincident: when r_c is fixed, D varies very little.
The variance of D across images can be expressed as

J(r_c) = Var(D_1(r_c), D_2(r_c), ..., D_n(r_c)).   (28)

As the assumed value of the cornea radius varies around the true value, the objective function is minimized as

r_c = argmin J(r_c).   (29)

In this way, the candidate values of the cornea radius can be obtained. When the calibrated cornea radius r_c (Section IV-B) falls among the candidate values, it can be considered reliable and fully usable. The cornea radius is validated during user calibration by Algorithm 2.

Algorithm 2 Verification of Cornea Radius
Input: per-image lines OC and corneal reflection parameters of both eyes;
Output: cornea radius r_c;
1: for r_c = r_c1 to r_cn do
2:   for each image, compute the cornea centers of both eyes (Eqs. (24)-(26)) and the distance D (Eq. (27));
3:   compute the variance J of D over all images (Eq. (28));
4: end for
5: r_c = Index(min(J));
6: return r_c;
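The variance-minimizing traversal can be sketched as follows (a toy NumPy example: the per-image mapping from r_c to τ in Eq. (24) is replaced by an assumed locally linear family τ = τ_true + k(r_c − r_true) with illustrative slopes, since the full reflection solve is outside the scope of this sketch; all centers and slopes are synthetic):

```python
import numpy as np

r_true, d_true = 6.8, 53.0     # cornea radius (mm), inter-eye distance (mm)
n_img = 5

# Per-image true left cornea centers (head fixed, small eye rotations)
# and right centers at a fixed 53 mm offset, so D is constant at r_true.
C_l = np.array([[-24.0 + m, 5.0 + 0.5 * m, 500.0 + 2.0 * m]
                for m in range(n_img)])
C_r = C_l + np.array([d_true, 0.0, 0.0])

# Assumed locally linear solution family tau(r_c) along each OC line
# (illustrative slopes standing in for the per-image mapping of Eq. (24)).
k_l = np.array([4.0, 5.1, 6.3, 4.7, 5.9])
k_r = np.array([5.5, 4.2, 6.8, 5.0, 4.4])

def centers(C, k, r_c):
    """Cornea centers implied by candidate radius r_c for all images."""
    tau = np.linalg.norm(C, axis=1) + k * (r_c - r_true)
    return tau[:, None] * C / np.linalg.norm(C, axis=1, keepdims=True)

# Traverse r_c and keep the value minimizing the variance of D (Eq. (28)).
grid = np.arange(5.5, 8.5, 0.01)
J = np.array([np.var(np.linalg.norm(centers(C_l, k_l, r) -
                                    centers(C_r, k_r, r), axis=1))
              for r in grid])
r_est = grid[int(np.argmin(J))]
```

Because D is identical across images only at the true radius in this construction, the variance J dips to (numerically) zero there, which is the behavior Table 1 and Eq. (29) rely on.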

V. EXPERIMENTAL RESULTS AND ANALYSIS
The effectiveness of the proposed method was tested in this study by a series of simulations and physical experiments on an OCTLS gaze tracking system.

A. SIMULATION EXPERIMENTS 1) ESTABLISHMENT OF SIMULATION ENVIRONMENT
The simulation was developed in the 3D modeling software Rhinoceros 5.0 and compiled in Python 3.7. The remote OCTLS gaze tracking system was built in Rhinoceros 5.0, as Fig. 8 shows, based on Le Grand's human eye model [21]. The positional relationships between the light sources, screen, and camera were set with reference to the actual system settings. In the simulation model, all the parameters of the gaze tracking system are known (including the positions of the light sources and the calibration points, the feature parameters of the eyeball, and the intrinsic parameters of the camera). The method presented in Section IV-B was used to calibrate the invariant parameters of the eyeball, and a method from the literature [15] was used to calculate the PoR on the screen. As shown in Fig. 8, the coordinate system O−XYZ was defined with the camera optical center as the origin. The X-axis and Y-axis were established in a plane that contains the origin and is parallel to the image plane; the Z-axis was set perpendicular to the image plane with its positive direction pointing toward the eyeball. The focal length of the camera was 6 mm and the pixel size was 2.2 µm. The distance between the two eyes was set to 53 mm and the eyeball radius to 12 mm. Since the cornea radius of the human eye is generally believed to be 6-8 mm, the cornea radius was set to 6.8 mm in our simulation; the distance between the cornea center and the eyeball center was 5.84 mm, the iris radius was 6.57 mm, and the distance between the pupil center and the cornea center was 4.2 mm. The light source L_1 was located at [100, 0, 0]^T and L_2 at [−100, 0, 0]^T (unit: mm).
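For reference, the intrinsic parameters above correspond to a pinhole focal length of f/pixel-size ≈ 2727.3 pixels; the principal point in the sketch below is an assumed value, as the simulated sensor resolution is not specified:

```python
import numpy as np

f_mm, pixel_mm = 6.0, 0.0022       # focal length and pixel size (mm)
f_px = f_mm / pixel_mm             # focal length in pixels (~2727.3)
cx, cy = 384.0, 288.0              # assumed principal point (pixels)
K = np.array([[f_px, 0.0, cx],
              [0.0, f_px, cy],
              [0.0, 0.0, 1.0]])

def project(P):
    """Pinhole projection of a 3D point given in camera coordinates."""
    p = K @ P
    return p[:2] / p[2]

# A point 500 mm in front of the camera and 10 mm off-axis lands
# roughly 54.5 pixels from the principal point.
uv = project(np.array([10.0, 0.0, 500.0]))
```

This conversion from millimeters to pixels is what links the simulated eye geometry to the image-plane feature coordinates used throughout the experiments.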
The effectiveness of the proposed algorithm was verified by comparing the calculated PoRs with the preset calibration points. The influence of iris feature parameters on the calibration results was also analyzed, as discussed below.

2) ALGORITHM FEASIBILITY SIMULATION
A program for calculating the cornea radius in the above remote gaze tracking system was written in Python 3.7. Taking the eye imaging parameters and light source parameters as inputs, the simulation results for the cornea radius were obtained, as listed in Table 2. The iris-radius-based method appears able to accurately recover the cornea radius; the estimated value approaches the true value in every trial, which shows that the proposed method is feasible.

3) ALGORITHM PERFORMANCE SIMULATION
The cornea radius calibration depends on the iris radius calibration. The calibrated iris radius and the estimated spatial iris center may affect the accuracy of the calibrated cornea radius. We evaluated the effects of iris feature parameters on cornea radius calibration to test the proposed method.

a: IRIS FEATURE PARAMETERS AFFECTING CORNEA RADIUS CALIBRATION
The spatial iris center error originates from iris radius errors and iris imaging ellipse errors; the iris imaging ellipse error is attributable to errors in the edge points of the iris imaging ellipse. We therefore centered our simulation on the influence of the iris edge error and the iris radius error on the cornea radius calibration.
We first added Gaussian noise with a mean value of 0 and a variance of 0.1 pixels to the coordinates of the iris edge points. The average values of 100 calculation results were taken as the simulation results under each noise condition, as shown in Table 3. The effect of the iris edge error on the cornea radius calibration is shown in Fig. 9. The cornea radius error reached about 2 mm when the iris edge error was 1 pixel, but when the iris edge error was within 0.6 pixels, it had little effect on the cornea radius. The iris edge detection method is thus very important in any practical application of the proposed method. To consider the influence of the iris radius on the cornea radius calibration, we added Gaussian noise with a mean value of 0 and a variance of 0.005 mm to the iris radius. The average values of 100 calculation results were again taken as the simulation results under each noise condition. As shown in Table 4, if the influence of the iris center error is not considered (in this simulation, the iris radius error also affects the spatial iris center), the cornea radius error caused by the iris radius error is always less than 0.00092 mm and can thus be ignored. The cornea radius error is therefore primarily attributable to the spatial iris center error; high-precision iris edge detection is necessary for the proposed method to operate effectively.
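The style of this Monte Carlo experiment can be sketched as follows (a simplified NumPy example: an algebraic circle fit stands in for the paper's ellipse-based iris detection, and the image-circle parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_circle(x, y):
    """Algebraic (Kasa) circle fit: x^2 + y^2 = a*x + b*y + c."""
    A = np.column_stack([x, y, np.ones_like(x)])
    a, b, c = np.linalg.lstsq(A, x**2 + y**2, rcond=None)[0]
    cx, cy = a / 2.0, b / 2.0
    r = np.sqrt(c + cx**2 + cy**2)
    return cx, cy, r

# True iris image circle (illustrative: center (400, 300), r = 60 px).
t = np.linspace(0.0, 2.0 * np.pi, 72, endpoint=False)
x0, y0, r0 = 400.0, 300.0, 60.0

sigma = 0.1   # edge-point noise level (pixels)
errs = []
for _ in range(100):           # average over 100 noisy trials
    x = x0 + r0 * np.cos(t) + rng.normal(0.0, sigma, t.size)
    y = y0 + r0 * np.sin(t) + rng.normal(0.0, sigma, t.size)
    cx, cy, r = fit_circle(x, y)
    errs.append(np.hypot(cx - x0, cy - y0))
mean_err = float(np.mean(errs))
```

Averaging the fitted-center error over 100 trials, exactly as in the tables above, shows how sub-pixel edge noise propagates into the recovered iris center, which in turn drives the cornea radius error.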

b: INFLUENCE OF COMPREHENSIVE FACTORS OF IRIS FEATURE PARAMETERS ON CORNEA RADIUS CALIBRATION
To analyze the comprehensive effects of the iris feature parameters on the cornea radius calibration, we simultaneously added the above errors to the iris edge and the iris radius in an additional simulation. The average values of 100 calculation results were taken as the simulation result under each noise condition, as shown in Table 5. We also transformed the data in Table 5 into the line chart shown in Fig. 10. We found that error in the iris radius has little effect on the cornea radius; the relationship curve fluctuates only slightly. When the iris radius error was held constant and error was added to the iris edge, the cornea radius fluctuated substantially with a step-wise growth trend. When error was added to the iris radius and the iris edge simultaneously, the errors did not cancel each other out; it was still the iris edge points that had a significant influence on the cornea radius. It is therefore necessary to keep the iris edge error within 0.6 pixels to preserve the accuracy of the final calibration results.

B. PHYSICAL EXPERIMENTS
We also conducted physical experiments to test the proposed method. In actual systems, our method can be implemented based on infrared illumination or active infrared light sources. During user calibration, the irises are segmented to perform the cornea radius calibration. The 3D LoS is then estimated by segmenting the pupil during gaze estimation.
In this paper, we sought to verify the feasibility of the iris-based cornea radius calibration algorithm, so we used a gaze estimation system based on visible illumination to calibrate the cornea radius and determine the PoRs. Our system consists of one camera and two visible light sources located on both sides of the camera lens and distributed symmetrically around the camera. The iris is bordered by the white sclera, which gives it starker contrast under a visible light source than under an infrared light source, so these visible light sources can be used to capture clear iris images. The focal length of the camera lens is 3.66 mm, the pixel size is 2.2 μm, and the size of the imaging plane is 576 × 768 pixels. To facilitate screen calibration and PoR estimation, the OCTLS system was placed at the center of the screen. A diagram of the physical experimental system is shown in Fig. 11.
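For reference, the stated intrinsics imply a focal length of roughly 1664 pixels, so each pixel corresponds to a viewing ray in the camera frame. The sketch below assumes the principal point lies at the image center and that lens distortion is negligible, neither of which is stated above.

```python
import numpy as np

# Stated intrinsics: focal length 3.66 mm, pixel pitch 2.2 um, 576 x 768 image.
f_mm, pitch_mm = 3.66, 0.0022
h, w = 576, 768
f_px = f_mm / pitch_mm              # focal length in pixels, ~1663.6
cx, cy = w / 2.0, h / 2.0           # assumed principal point (image center)

def pixel_to_ray(u, v):
    """Unit viewing ray in the camera frame for pixel (u, v),
    assuming zero lens distortion."""
    r = np.array([(u - cx) / f_px, (v - cy) / f_px, 1.0])
    return r / np.linalg.norm(r)

print(pixel_to_ray(cx, cy))         # the optical axis: [0, 0, 1]
```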

1) CORNEA RADIUS CALIBRATION EXPERIMENT
As shown in Fig. 11(a), five user calibration points were set at the center and four corners of the computer screen. Each user was asked to sit in front of the screen at a distance between 350 mm and 600 mm and gaze at each calibration point for a few seconds. During this staring process, they were allowed to pan, pitch, and yaw their heads within a certain range, as long as the iris could be clearly imaged by the system camera. However, head rolling was not allowed for the following reason: 3D gaze estimation usually requires first reconstructing the OA direction of the eyeball and then converting it to the VA direction according to the fixed spatial angle (i.e., the kappa angle) between the OA and the VA. If the head rolls, the kappa angle cannot be applied, and the 3D LoS cannot be estimated.
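The OA-to-VA conversion mentioned above can be sketched as a rotation of the OA by two kappa angles in an eye frame anchored by an "up" vector. The frame construction below is a hypothetical simplification, not the paper's model, but it makes the roll problem concrete: head roll changes the "up" vector, so a kappa angle calibrated in one frame no longer applies.

```python
import numpy as np

def oa_to_va(oa, kappa_h_deg, kappa_v_deg, up=np.array([0.0, 1.0, 0.0])):
    """Rotate the optic-axis (OA) direction by horizontal/vertical kappa
    angles to obtain the visual-axis (VA) direction. The 'up' vector fixes
    the eye frame; head roll changes 'up', which is why roll invalidates a
    pre-calibrated kappa angle."""
    oa = np.asarray(oa, float) / np.linalg.norm(oa)
    right = np.cross(up, oa); right /= np.linalg.norm(right)
    true_up = np.cross(oa, right)
    kh, kv = np.radians(kappa_h_deg), np.radians(kappa_v_deg)
    va = (np.cos(kv) * (np.cos(kh) * oa + np.sin(kh) * right)
          + np.sin(kv) * true_up)
    return va / np.linalg.norm(va)
```

With kappa angles of zero the VA coincides with the OA; the kappa values themselves are per-user quantities obtained by calibration, not assumed here.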

a: IRIS RADIUS CALIBRATION
The iris radius was calibrated (Section IV-B-1)) with seven subjects participating. Ten images of each subject's face were captured by the system camera as he or she stared at each calibration point; feature parameters of the iris and the glints of each eye in each image were then extracted during subsequent image processing [22], [23]. Five sets of iris radii were obtained for each subject. The average values of the five sets of calibration results for each subject are listed in Table 6.

b: CORNEA RADIUS CALIBRATION
Cornea radius calibration was performed next based on the iris radius calibration results. The cornea radii were estimated (Section IV-B-2)) as shown in Table 6, where each cornea radius is the average of the radii measured as the subject stared at each calibration point. The iris and cornea radius calibration results across different calibration points are listed for comparison in Table 7.
Each subject's calibrated cornea radii differed across calibration points. When the subject stared at the calibration point in the middle of the screen, the calibrated cornea radii were more stable and closer to the true values. This is because the iris image is distorted when the iris deviates from the camera's optical axis. In principle, this image distortion does not affect the calibration of the iris radius or the cornea radius; in practice, however, larger iris imaging distortion had a substantial impact on our calibration results. Moreover, the outer surface of the cornea is not a perfect sphere, so the cornea radii we obtained also differed as the subject gazed at different calibration points.
The iris radii and cornea radii calculated from different frame images when Subject 2 stared at the calibration point in the middle of the screen are shown in Fig. 12. The cornea radii are relatively stable and distributed around the typical values.

2) 3D GAZE ESTIMATION EXPERIMENT
We conducted 3D gaze estimation experiments with the same subjects to further verify the accuracy of the cornea radius calibration. Using each subject's cornea radius (Section V-B-1)), the OA of the eyeball was reconstructed by a method described in the literature [16], with the pupil feature replaced by the iris feature. The kappa angle was then calibrated according to another previously proposed method [17]. Finally, the intersection point of the LoS and the screen (that is, the subject's PoR) was calculated by optimization [20]. Sixteen test points were distributed on the screen, which each subject gazed at in sequence, as shown in Fig. 11(b). Each subject was again asked to sit in front of the screen at a distance between 350 mm and 600 mm and look at the 16 known test points in sequence; as long as the iris could be clearly imaged by the system camera, they were allowed to pan, pitch, and yaw their heads within a certain range, but they were not allowed to roll their heads. The gaze estimation accuracy for each subject was determined by comparing the estimated PoRs with the real test points, as shown in Table 8. The PoR results are plotted in Fig. 13, and the real and estimated PoRs of Subject 3 are shown in Table 9. The X- and Y-direction errors for this subject were within 8 mm, and all estimated PoRs were located near their respective real PoRs. The estimation accuracy was within 2° in the 3D gaze tracking system, which meets the relevant requirements for accurate gaze estimation. This suggests that the proposed cornea radius calibration method can be applied in OCTLS 3D gaze tracking systems.
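The final step above reduces to intersecting the estimated LoS with the screen plane (the optimization in [20] refines this; the closed-form intersection below is a minimal sketch under the assumption that the screen pose is known in the camera frame).

```python
import numpy as np

def por_on_screen(eye_point, gaze_dir, screen_point, screen_normal):
    """Intersect the line of sight (eye_point + t * gaze_dir) with the
    screen plane given by a point on it and its normal; returns the PoR."""
    eye_point = np.asarray(eye_point, float)
    gaze_dir = np.asarray(gaze_dir, float)
    denom = np.dot(screen_normal, gaze_dir)
    if abs(denom) < 1e-9:
        raise ValueError("line of sight is parallel to the screen")
    t = np.dot(screen_normal, np.asarray(screen_point) - eye_point) / denom
    return eye_point + t * gaze_dir

# Example: an eye 500 mm from a screen at z = 0, gazing toward the origin.
g = np.array([-10.0, -20.0, -500.0]); g /= np.linalg.norm(g)
por = por_on_screen([10.0, 20.0, 500.0], g, [0.0, 0.0, 0.0], [0.0, 0.0, 1.0])
print(por)   # [0, 0, 0]
```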

C. EXPERIMENTAL ANALYSIS REVIEW
Accurate cornea radius calibration is the key to effective 3D gaze estimation in remote gaze tracking systems with OCTLS. The proposed method may be applied with passive or active infrared light sources. The purpose of the experiments was to verify the effectiveness of the iris-based cornea radius calibration method. Therefore, to simplify iris image processing, we designed the experiments as follows: a single-camera, two-visible-light-source system was used to calibrate the iris radius and cornea radius, and the traditional method [16] was then used to reconstruct the OA during LoS estimation, with the pupil feature replaced by the iris feature.

1) ANALYSIS OF IRIS-BASED CORNEA RADIUS CALIBRATION METHOD
The cornea radius calibration method proposed in this paper is based on iris features. The iris is a circular target attached to the front of the corneal sphere, and the iris edge is directly connected with the outer surface of the cornea. The imaging of the iris in the system camera is therefore not affected by corneal refraction; it is equivalent to directly imaging a spatial circular target as an ellipse in the camera. As mentioned in Section III, a unique cornea center cannot be determined using traditional methods, but the candidate cornea centers all lie on the line determined by the cornea center C and the camera optical center O. Since the cornea center is on the line OC, the cornea radius is solved by a statistical optimization method under the condition that the generatrices extending from the cornea center to the iris edge are equal in length.
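A minimal numerical sketch of this idea, with a brute-force line search standing in for the statistical optimization: the cornea center is constrained to the line OC, and the candidate minimizing the variance of the cornea-center-to-iris-edge distances is selected. The synthetic geometry below (7.8 mm cornea radius, 6 mm iris radius) is illustrative only.

```python
import numpy as np

def cornea_center_on_line(O, d, iris_edge_pts, t_range=(300.0, 700.0), n=4000):
    """Search along the line C(t) = O + t*d for the point whose distances to
    the 3D iris edge points are most nearly equal (minimum variance of the
    generatrix lengths); return that center and the mean generatrix length."""
    d = np.asarray(d, float) / np.linalg.norm(d)
    ts = np.linspace(*t_range, n)
    C = np.asarray(O, float) + ts[:, None] * d            # candidate centers
    dists = np.linalg.norm(C[:, None, :] - iris_edge_pts[None, :, :], axis=2)
    best = int(np.argmin(dists.var(axis=1)))
    return C[best], float(dists[best].mean())

# Synthetic check: a cornea center ~500 mm from the camera, an iris circle of
# radius 6 mm whose edge points all lie 7.8 mm from that center.
Cc = np.array([20.0, 10.0, 500.0])                  # true cornea center
oa = np.array([0.2, -0.1, -1.0]); oa /= np.linalg.norm(oa)   # optic axis
L = np.sqrt(7.8**2 - 6.0**2)                        # center-to-iris-plane
Ic = Cc + L * oa                                    # iris center
u = np.cross(oa, [0.0, 0.0, 1.0]); u /= np.linalg.norm(u)
v = np.cross(oa, u)
th = np.linspace(0, 2 * np.pi, 36, endpoint=False)
edges = Ic + 6.0 * (np.outer(np.cos(th), u) + np.outer(np.sin(th), v))
center, radius = cornea_center_on_line(np.zeros(3), Cc, edges)
print(radius)   # close to 7.8 mm
```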
Our results show that the accuracy of the cornea radius calibration depends on the accuracy of the iris radius calibration and is statistical in nature. The calibration is an optimization process in which the error range can be controlled: the error range of the cornea radius can be determined from the iris radius calibration result, thereby ensuring that the cornea radius calibration error is kept within a certain range. The iris radius calibration method we used in this study is one we built in a previous study, and it still needs further improvement. In practice, a more accurate iris detection method would yield a more robust iris radius calibration. As can be seen from Section V-A-3), when a certain range of error is added to the iris radius, the resulting cornea radius calibration error is also within a very small range.

2) ANALYSIS OF THE INFLUENCE OF CORNEA RADIUS ERROR ON OA RECONSTRUCTION
According to Section III, for an OCTLS system, the cornea radius must be calibrated through the user calibration process before the OA of the eyeball can be reconstructed. The cornea radius is first used to estimate the cornea center, which lies on the line OC determined by the cornea center C and the camera optical center O. Since the line OC is determined by the reflection planes of the light sources, its accuracy depends mainly on the position calibration of the light sources, regardless of the eye parameters. An accurate line OC can thus be obtained as long as the position of each light source is accurate; in our previous work [19], accurate light source position calibration was achieved. An accurate line OC therefore lays the foundation for determining an accurate cornea center, and the cornea center error is reflected in the deviation of the cornea center from its correct position on the line OC. The cornea radius error affects the position of the cornea center on the line OC.
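The construction of the line OC from the two reflection planes can be sketched as a plane-plane intersection: both planes contain the camera optical center O, so only the direction needs to be computed. This is a geometric sketch under the assumption that each plane's normal has already been recovered from the calibrated light source positions and the glint observations.

```python
import numpy as np

def line_oc_direction(n1, n2):
    """Both reflection planes contain the camera optical center O and the
    cornea center C, so the line OC is their intersection; its direction is
    the cross product of the two plane normals (non-parallel planes)."""
    d = np.cross(np.asarray(n1, float), np.asarray(n2, float))
    norm = np.linalg.norm(d)
    if norm < 1e-12:
        raise ValueError("reflection planes are (nearly) parallel")
    return d / norm

# Demo with a known OC direction t: build two plane normals perpendicular
# to t (as the reflection-plane normals would be) and recover t.
t = np.array([0.1, 0.05, 1.0]); t /= np.linalg.norm(t)
n1 = np.cross(t, [1.0, 0.0, 0.0])
n2 = np.cross(t, [0.0, 1.0, 0.0])
d = line_oc_direction(n1, n2)
print(d)   # equals t up to sign
```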
Since the OA of the eyeball passes through the cornea center and the iris center, the OA always lies in the plane determined by the line OC and the iris center; let this plane be π. Based on the calibrated cornea radius and the cornea center determined above, we take the cornea center as the vertex and the lines connecting the cornea center to the iris edge as generatrices to establish a cone. The iris center is then estimated from the equal lengths of the generatrices. We found that even if the cornea center deviates from its correct position on the line OC, the estimated iris center is always such that the line connecting the iris center and the camera optical center lies on the plane π. By traversing the cornea radius within ±0.5 mm of its theoretical value of 7.8 mm, we obtained the cornea center error and iris center error shown in Fig. 14. The cornea radius error causes a cornea center error along the line OC, but the cornea center error and the iris center error are essentially synchronized. From the estimated cornea center and iris center, the unit vector of the OA is determined; the trend of the OA error with the cornea radius is shown in Fig. 15. The cornea radius error has little effect on the OA of the eyeball: even when the cornea radius deviates from the true value by 0.5 mm, the OA error remains less than 0.1°. Thus, although the cornea radius error causes both a cornea center error and an iris center error, these two errors inhibit each other in the OA reconstruction, so the cornea radius error has little effect on the accuracy of the OA.
The main reason is as follows: the line connecting the iris center and the camera optical center always lies on the plane π, and the projection lines of the iris edge points surround this line; therefore, determining the OA under the condition that the generatrices connecting the cornea center to the iris edge are equal in length weakens the influence of the cornea center error along the line OC.
The OA of the eyeball is determined by the cornea center and the unit vector of the OA, and the unit vector is more important than the cornea center for OA reconstruction. Given the relatively accurate OA direction obtained above, even a cornea center error of several millimeters has little effect on the accuracy of the OA, and thus little effect on the accuracy of gaze estimation.
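The inhibition effect described above can be illustrated numerically: when the cornea center and the estimated iris center shift by nearly the same offset along the line OC, the OA direction (their difference, normalized) barely rotates. The offsets and geometry below are illustrative, not taken from our experimental data.

```python
import numpy as np

def oa_angle_error_deg(cornea_c, iris_c, cornea_c_p, iris_c_p):
    """Angle in degrees between the nominal OA (iris_c - cornea_c) and the
    OA recomputed from perturbed cornea and iris centers."""
    a = iris_c - cornea_c
    b = iris_c_p - cornea_c_p
    cosang = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

# Nearly synchronized errors: both centers shift by almost the same offset
# along the line OC, so the OA direction barely changes.
oc_dir = np.array([0.04, 0.02, 1.0]); oc_dir /= np.linalg.norm(oc_dir)
Cc = np.array([0.0, 0.0, 500.0])                 # nominal cornea center
oa = np.array([0.2, -0.1, -1.0]); oa /= np.linalg.norm(oa)
Ic = Cc + 4.98 * oa                              # iris center ~5 mm in front
err = oa_angle_error_deg(Cc, Ic, Cc + 0.5 * oc_dir, Ic + 0.495 * oc_dir)
print(round(err, 4))   # a small fraction of a degree
```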
In summary, the cornea radius calibration method proposed in this paper can meet the accuracy requirements of OA reconstruction and LoS estimation, and is of great significance for the 3D gaze tracking of this typical OCTLS system.

D. COMPARISON WITH STATE-OF-THE-ART METHODS
Eye gaze tracking methods are usually divided into appearance-based methods and feature-based methods. Feature-based methods are further divided into 2D mapping-based methods and 3D model-based methods. Among them, 2D mapping-based methods include iris-corner technique (ICT)-based methods, iris center-cornea reflection (ICCR)-based methods, cross-ratio (CR)-based methods, and homography normalization (HN)-based methods.
To further evaluate the quality of the proposed cornea radius calibration method for gaze tracking, we compared it with several state-of-the-art gaze tracking methods. These methods are analyzed from four aspects: the required system configuration (i.e., the number of cameras and light sources), whether head movement is allowed, the gaze estimation accuracy, and any special features, as shown in Table 10.
As Table 10 shows, appearance-based and Kinect-based methods have lower accuracy than the other state-of-the-art methods. ICT-based, ICCR-based, and HN-based methods can achieve comparable accuracy without head movement. CR-based methods can currently reach an accuracy of 0.7°, but they require four or more light sources. Most 3D model-based methods use OCMLS or MCMLS systems. When our proposed cornea radius calibration method is used for gaze estimation, comparable accuracy can be obtained with a simpler system configuration such as a typical OCTLS system in which the camera and the two light sources are essentially collinear. Therefore, the cornea radius calibration method proposed in this paper not only solves the problem that traditional methods cannot achieve 3D gaze estimation in this typical gaze tracking system, but also achieves comparable gaze accuracy in such a system, which meets the needs of the application.

VI. CONCLUSION
In 3D OCMLS gaze tracking systems, especially remote OCTLS gaze trackers, the cornea radius needs to be calibrated by the user to estimate the cornea center, construct the OA of the eye, and estimate the user's gaze. Cornea radius calibration plays an important role in 3D gaze estimation for OCMLS systems.
A cornea radius calibration method based on iris radius calibration was established for 3D remote gaze tracking systems with OCTLS in this study. Under the constraint that the distance from the cornea center to the iris edge points is equal, a nonlinear system of equations was constructed to calibrate the cornea radius. The calibrated cornea radius was then verified under binocular optimization constraints. Simulations and physical experiments were conducted to verify the feasibility and effectiveness of the method. The proposed method is novel and resolves the cornea radius calibration problem of the typical remote OCTLS gaze tracking system. It makes the 3D gaze estimation of a typical remote OCTLS system possible, and is currently the only workable approach to estimating the 3D LoS on a typical OCTLS system.
There are many issues that merit further research, including better iris radius calibration methods and effective determination of the influence of various error factors. We plan to address these problems in the future in order to realize real 3D gaze estimation in actual systems.
DENG WANG received the bachelor's degree from the Taiyuan University of Technology, in 2017, and the master's degree from the University of Science and Technology Beijing, in 2020. Her main research interests include eye tracking and Alzheimer's disease.
NING LU received the B.S. degree from the University of Science and Technology Beijing, Beijing, China, in 2016, where he is currently pursuing the M.S. degree in Instrument Science and Technology. His main work includes the system calibration of gaze tracking systems and image processing.
ZHILIANG WANG received the M.S. and Ph.D. degrees in control theory and control engineering from Yanshan University and the Harbin Institute of Technology, in 1982 and 1989, respectively. He is currently a Professor and Doctoral Supervisor with the School of Computer and Communication Engineering, University of Science and Technology Beijing, Beijing, China. His research interests include the security control of cyber-physical systems and active industrial control systems, the theory and application of networked control systems, and artificial intelligence. He is a Senior Board Member of the Chinese Association of Artificial Intelligence and the Director of the Beijing Society of Internet of Things.