Human Gait Estimation Using Multiple 2D LiDARs

This paper proposes a human gait estimation algorithm using a system of multiple 360-degree 2D LiDARs. The system is fixed on the ground at shin height to scan the human legs during gait. The multiple-LiDAR system is designed to overcome the drawbacks of a single-LiDAR system, which can lose data due to occlusion between the legs during walking and has a short tracking range. The performance of a sensor fusion system strongly depends on its calibration. In this paper, we propose a calibration method using a cylinder of known radius as a specific marker. In contrast to other methods, the calibration parameters and the cylinder center points at the different positions are estimated by a proposed iterative algorithm. The measurement noises in the LiDAR output are taken into account to increase the accuracy of both the calibration and the estimation of the human leg center points. Instead of using a least-squares circle fitting algorithm to estimate the leg center point, a new iterative algorithm that includes the measurement noises is proposed. Although multiple LiDARs are used, discontinuities of the leg center points can still occur. Therefore, a quadratic optimization based eighth-order splines algorithm is derived to interpolate and smooth the data. Two configurations of three LiDARs are tested in the experiments. The first is the triangle configuration, in which the whole walking path is covered by all three LiDARs; this configuration minimizes the occlusion between the legs. The maximum RMSE of its step length estimation compared with the optical camera system is 0.03 m. The second is the line configuration, in which each LiDAR covers a certain part of the walking path sequentially; this configuration maximizes the tracking range. The experiment with 20 m of straight walking has an RMSE of about 0.10 m.


I. INTRODUCTION
Human gait estimation is helpful in various applications and has received extensive attention for decades. It can be used for human identification [1], [2], clinical diagnosis [3], human-robot interaction [4], [5], sport [6] and human navigation [7]. Various techniques have been presented to estimate the human gait. They can be divided into two groups: body-attachment and non-body-attachment. In the body-attachment group, the Vicon optical camera system [8], which captures infrared markers attached to the body, is a gold-standard system with high precision. Another method estimates the body movement using inertial measurement units (IMU) attached to different parts of the body (foot, waist or wrist) [9], [10]. In the non-body-attachment group, a camera system without markers can capture and estimate human gait using digital image processing methods [11], [12]. A light detection and ranging (LiDAR) sensor, which is another type of optical sensor, can also be used [13]. The GAITRite system, based on floor force plate sensors, is known as a gold standard of this group [14]. Although the gold-standard methods provide highly accurate gait parameters, they require high operating cost and complicated installation efforts. A body-worn wearable sensor system is lightweight and low-cost. However, a major issue of IMU-based human gait estimation is the accumulation error, which can significantly affect the position estimation.
A 2D LiDAR, which has the advantages of requiring no external markers and simple installation, has been used in autonomous vehicles [15], [16], mobile robot navigation [17], [18] and human gait tracking [19]-[24]. Compared to a 3D LiDAR, a 2D LiDAR is less expensive in both price and computational cost. The LiDAR can be placed on a moving platform such as a mobile robot or smart walker to track human gait during walking [19], [20]. Additional sensors such as IMUs and encoders need to be installed in these systems to estimate the orientation of the moving platform. Although these systems can be used in a large space, they are complicated and expensive.
In [21]-[23] and our previous research [24], a 360-degree 2D LiDAR is placed on the ground to track the human shin during walking. The system requires only a single LiDAR, so it is simple to install anywhere within a minute. However, a single-LiDAR system has the disadvantage of a limited tracking range. Because of the small human leg radius and the coarse angular resolution of a low-cost LiDAR, the human leg tracking range is only a few meters. In this paper, we propose a human gait tracking method using a multiple 2D LiDAR system. These LiDARs are placed at the same height on the ground to scan the human legs during walking.
The use of multiple LiDARs raises the issue of calibration to find the relative information between them; thus, many efforts have been made to accurately calibrate such systems. Unlike a multiple 3D LiDAR system, for which the corresponding points of the scan data can be found in 3D space and the extrinsic calibration parameters determined directly, a multiple 2D LiDAR system is more difficult to calibrate. One of the most popular methods uses an additional sensor such as a camera [25], [26] or an odometer [27] as a bridge. Schenk et al. [28] calibrate a stationary network of multiple 2D LiDARs by matching the trajectories of moving people. Although their method does not require an additional sensor or external marker, the accuracy is affected by the errors of the human trajectory estimation and by the placement of the LiDARs (they have to scan in a common plane). Another approach calibrates the multiple LiDAR system on a mobile robot by using specific markers such as orthogonal planes [29], a trihedron [30] or a cuboid-shaped corridor [31]. A straight line is fitted to the scan data on the marker plane for each LiDAR, and then the intersecting points are used to estimate the calibration parameters. These systems therefore require the LiDARs to be placed in different planes and are usually used on mobile robots.
In the proposed method, we use a cylinder to calibrate the multiple 2D LiDAR system. In the calibration process, the cylinder is placed at different positions to capture the scan data. Unlike other methods, our proposed calibration method estimates the cylinder center points at each position together with the calibration parameters. Also, the LiDAR measurement noises, which were neglected in almost all previous studies, are considered in this method. We investigate a linear model relating the measurement errors to the scanning distance of the LiDAR, which is then used to reduce the error in the calibration process and in the human leg tracking algorithm. The human leg is modeled as a circle with a known radius as in [24]; however, a new iterative algorithm is proposed to estimate the leg center point from the LiDAR scan data with the measurement errors included. After that, a quadratic optimization based eighth-order splines algorithm is derived to estimate the human leg trajectories by combining the leg center points data.

II. SYSTEM OVERVIEW
The proposed tracking system consists of three 2D LiDARs placed on the ground at the same height of 0.3 m to track the human legs. Each LiDAR has a sampling rate of 10 Hz. As shown in Fig. 1, the system consists of four modules: single LiDAR processing, calibration, data combination and the quadratic optimization based splines algorithm. The single LiDAR processing module is applied to each LiDAR's data separately. This part is similar to our previous single LiDAR study [24] except for the human leg center point estimation. In contrast with the previous study, the measurement errors of the LiDAR are explicitly considered here and a new center point estimation algorithm is proposed. This part outputs the human leg center points, their estimation error covariances and the time stamps of the left and right sides. The calibration module estimates the relative information between the reference LiDAR and the others. The data combination module transforms the center point data from each LiDAR frame into the reference coordinate frame using the calibration parameters. Finally, the quadratic optimization based eighth-order splines algorithm outputs the left and right leg trajectories.

III. HUMAN GAIT ESTIMATION
In this paper, a_e denotes the error between the estimated value â and the true value ã, i.e., ã = â + a_e.

A. 2D LiDAR MEASUREMENT MODEL
A sample data point s of the 2D LiDAR consists of (d, φ, t_s), where d is the distance from the LiDAR rotating core to the object, φ is the heading angle and t_s is the sample time stamp (see Fig. 2). This polar coordinate data can be transformed to Cartesian coordinates (x_s, y_s) as follows: x_s = d sin φ, y_s = d cos φ. Assume that the LiDAR output s consists of the true values and the measurement noise v_s ∈ R². Let δd and δφ be the measurement noises of the distance and heading angle, respectively. We can model the LiDAR measurement output as follows:

s = s̃ + v_s,  v_s = D [δd δφ]ᵀ,

where

D = [ sin φ    d cos φ
      cos φ   −d sin φ ].

We assume that δd and δφ are independent zero-mean white Gaussian noises, whose covariances are given by:

E{δd²} = R_d,  E{δφ²} = R_φ.

The covariance of the LiDAR measurement noises can be computed as follows:

C = E{v_s v_sᵀ} = D diag(R_d, R_φ) Dᵀ,     (4)

where the superscript ᵀ denotes the transpose.
In this paper, R_φ is considered constant, while R_d is assumed to depend on the measured distance. An experiment is given in Section IV to find the model of R_d.
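As an illustration, the covariance C of a Cartesian scan point can be computed directly from the model above. This is a minimal sketch under our own assumptions: the default noise values follow the linear model found later in Section IV (a distance noise standard deviation of 0.001·D and a small constant angle noise); the function name and defaults are ours, not the paper's.

```python
import numpy as np

def lidar_point_covariance(d, phi, k_d=0.001, sigma_phi=np.deg2rad(0.001)):
    """Covariance of a Cartesian scan point (x, y) = (d sin(phi), d cos(phi)).

    Assumes independent zero-mean Gaussian noises on distance and angle,
    with the distance noise std growing linearly with range (std = k_d * d).
    """
    # Jacobian of (x, y) with respect to (d, phi)
    D = np.array([[np.sin(phi),  d * np.cos(phi)],
                  [np.cos(phi), -d * np.sin(phi)]])
    R = np.diag([(k_d * d) ** 2, sigma_phi ** 2])  # covariances of (dd, dphi)
    return D @ R @ D.T  # C = D diag(R_d, R_phi) D^T
```

The resulting 2×2 covariance grows with range, which is what makes distant leg clusters less reliable than close ones.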

B. SINGLE LiDAR DATA PROCESSING
For each LiDAR, we manually assign the boundary of the walking area (polygon in Fig. 3). Each scan segment is then divided into clusters consisting of leg scan data. After estimating the center point of each cluster, a simple algorithm is used to classify the left and right leg center points. Except for the center point estimation, the details of the other parts can be found in the previous study [24]. In this paper, we propose a new human leg center point estimation algorithm that includes the measurement noises in the LiDAR output model.
Human leg center point estimation algorithm: The human leg is modeled as a circle with a fixed radius. Let r̃_h ∈ R and c̃ ∈ R² denote the true values of the human leg radius and center point coordinates, respectively (see the right side of Fig. 3). The true radius r̃_h consists of the estimated value r̂_h, which is manually measured by a tape, and a small error r_{h,e}. The algorithm estimates the error of the center point estimate, c_e, and then uses it to correct the center point coordinates.
Assume that the leg cluster consists of n scan data points s_i. The circle fitting equation for a sample data point is as follows:

‖s̃_i − c̃‖² = r̃_h².

Substituting s̃_i = s_i − v_{s_i}, c̃ = ĉ + c_e and r̃_h = r̂_h + r_{h,e}, and keeping only the first-order error terms, yields the measurement equation for a sample scan data point s_i in the cluster:

z_i = H_i c_e + w_i,     (5)

where z_i = ‖s_i − ĉ‖² − r̂_h², H_i = 2 (s_i − ĉ)ᵀ and w_i = 2 r̂_h r_{h,e} + 2 (s_i − ĉ)ᵀ v_{s_i}. Then, for a cluster, the measurement equation becomes:

z = H c_e + w,     (6)

where z = [z_1 … z_n]ᵀ, H = [H_1ᵀ … H_nᵀ]ᵀ and w = [w_1 … w_n]ᵀ. The covariance matrix of the measurement noises is computed as follows:

Q = E{w wᵀ},  Q_ij = 4 r̂_h² R_{r_h,e} + 4 (s_i − ĉ)ᵀ C_i (s_i − ĉ) δ_ij,     (8)

where R_{r_h,e} = E{r_{h,e}²}, δ_ij is the Kronecker delta and C_i (i = 1, …, n) is computed as in (4).
The estimated error of the center point is then obtained as in (9):

ĉ_e = P Hᵀ Q⁻¹ z,     (9)

where P = (Hᵀ Q⁻¹ H)⁻¹ is the center point estimation error covariance. After that, we can update the estimated center point:

ĉ ← ĉ + ĉ_e.     (10)

This process is iterated until a stop condition is satisfied: ‖ĉ_e‖ < γ. The initial estimate of the leg center point is computed by the least-squares circle fitting algorithm. The time stamp of the estimated center point, t_c, is the time stamp of the middle point of the cluster.
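The iterative estimator above can be sketched as follows. This is a minimal illustration under our own assumptions: the center is initialized at the cluster centroid rather than by a circle fit, and the per-point noise covariances enter only through scalar weights derived from the linearized residual.

```python
import numpy as np

def estimate_leg_center(points, r_hat, covs=None, gamma=1e-6, max_iter=50):
    """Iteratively fit a circle of known radius r_hat to 2D scan points.

    points: (n, 2) array of scan points of one leg cluster.
    covs: optional (n, 2, 2) per-point noise covariances used as weights.
    The update solves z = H c_e in the weighted least-squares sense, where
    z_i = ||s_i - c||^2 - r_hat^2 and H_i = 2 (s_i - c)^T.
    """
    points = np.asarray(points, float)
    c = points.mean(axis=0)  # simple initialization at the cluster centroid
    for _ in range(max_iter):
        diff = points - c
        z = (diff ** 2).sum(axis=1) - r_hat ** 2  # circle-equation residuals
        H = 2.0 * diff
        if covs is None:
            w = np.ones(len(points))
        else:
            # variance of z_i is approximately 4 (s_i-c)^T C_i (s_i-c)
            var = 4.0 * np.einsum('ni,nij,nj->n', diff, np.asarray(covs), diff)
            w = 1.0 / np.maximum(var, 1e-12)
        A = H.T @ (H * w[:, None])
        b = H.T @ (z * w)
        c_e = np.linalg.solve(A, b)  # estimated center error
        c = c + c_e                  # update the center estimate
        if np.linalg.norm(c_e) < gamma:
            break
    return c
```

Because the leg is only partially visible to the LiDAR, the scan covers less than half the circle; fixing the radius, as done here, keeps the fit well conditioned in that case.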
Left/right leg classification: After estimating the center points of the human leg clusters, we classify these center points into left and right sides. First, a center line is formed from the midpoints between the two leg center points of each scan segment. This center line divides the center points into two sides, from which the left and right legs can be classified easily. The details of this part can be found in [24].
For each LiDAR, the center points, estimation error covariances and time stamps are now classified into left (l) and right (r) sides: {c_{l,L}, t_{c,l,L}, P_{l,L}} and {c_{r,L}, t_{c,r,L}, P_{r,L}}, where L = 1, 2, 3 indexes the LiDARs. Due to occlusion, which happens when the laser signal to the farther leg is blocked by the closer leg, the center points and their time stamps can be missing in some scans. In the proposed method, the missing center points are estimated using the eighth-order splines algorithm. Therefore, the time stamps of the missing center points are required.
Let τ_{l,L} and τ_{r,L} denote the full time stamps of the left and right legs of LiDAR L, which consist of the classified time stamps and the estimated missing time stamps. We assume that the time interval between a missing time stamp and the previous adjacent available time stamp equals the sampling period T_s. The equations to estimate τ are given in Algorithm 1.

Algorithm 1 Estimation of Full Time Stamps
Require: t_{c,l(r),L}, T_s
Ensure: τ_{l(r),L}
1: τ_{l(r),L,1} ← t_{c,l(r),L,1}
2: for 2 ≤ i ≤ N_seg do
3:   if scan i is missing then
4:     τ_{l(r),L,i} ← τ_{l(r),L,i−1} + T_s
5:   else
6:     τ_{l(r),L,i} ← the next available time stamp in t_{c,l(r),L}
7:   end if
8: end for

Finally, the outputs of the single LiDAR processing part are {c_{l,L}; t_{c,l,L}; τ_{l,L}; P_{l,L}} and {c_{r,L}; t_{c,r,L}; τ_{r,L}; P_{r,L}}. Note that these center points are expressed in the Cartesian coordinate frame.
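Algorithm 1 amounts to a single pass over the scan indices. A minimal sketch in Python, where missing scans are marked as None (the container layout is our assumption):

```python
def fill_missing_timestamps(stamps, Ts):
    """Build the full time-stamp list tau from per-scan stamps.

    stamps: list where a missing scan is represented by None.
    A missing stamp is set to the previous stamp plus the sampling
    period Ts, following the assumption stated in the text.
    """
    tau = [stamps[0]]  # the first scan is assumed available
    for s in stamps[1:]:
        tau.append(tau[-1] + Ts if s is None else s)
    return tau
```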

C. CALIBRATION ALGORITHM
The data of each LiDAR are expressed in its own coordinate frame. To fuse these data, they need to be transformed into the same coordinate system. In this paper, we propose a calibration algorithm for multiple 2D LiDARs using a cylinder with a known radius r_c. The cylinder is placed at m different positions in the LiDARs' tracking range (Fig. 6 and Fig. 12). At each position, we capture one second of scan data from the three LiDARs. In contrast with other methods, the center points of the cylinder at all positions and the calibration parameters are estimated at the same time in our proposed method. Also, the measurement errors of the LiDARs are considered to reduce the estimation errors.
Suppose that we choose one LiDAR coordinate system as the reference coordinate frame (called LiDAR 1). The aim of the calibration process is to find the transformation [R(θ) T] ∈ R^{2×3} between LiDAR 1 and each of the others, where θ ∈ R and T ∈ R² represent the rotation angle and translation vector, respectively.
Let x_c denote the calibration state vector, which consists of the estimates of the cylinder's center points at the m positions and the calibration parameters between LiDAR 1 and the others:

x_c = [c_1ᵀ … c_mᵀ θ_12 T_12ᵀ θ_13 T_13ᵀ]ᵀ.

Let s_{i_{L,k}} denote a sample measurement of a LiDAR, where L = 1, 2, 3 indexes the LiDARs, k = 1, …, m denotes the position of the cylinder, and i_{L,k} = 1, …, N_{L,k} is the discrete index of the scan data of LiDAR L at position k of the cylinder. N_{L,k} is the number of cylinder scan data points of LiDAR L at position k.

1) INITIALIZATION
To estimate the initial state vector x̂_{c,0}, the cylinder's center points at each position k, expressed in each LiDAR coordinate frame, ᴸĉ_{L,k,0}, are first estimated using the least-squares circle fitting algorithm:

[2 x_{i_{L,k}}  2 y_{i_{L,k}}  1] [c_x  c_y  ρ]ᵀ = x_{i_{L,k}}² + y_{i_{L,k}}²,  ρ = r_c² − c_x² − c_y²,     (12)

which is linear in (c_x, c_y, ρ) and is solved in the least-squares sense over all points of the cluster. The m cylinder center points estimated from the LiDAR 1 scan data are chosen as the m initial estimates of the cylinder center points in the state vector. Then the initial calibration parameters (θ̂_{1L,0}, T̂_{1L,0}), L = 2, 3, are estimated using the least-squares method to solve the following linear model:

ĉ_{1,k,0} = R(θ̂_{1L,0}) ᴸĉ_{L,k,0} + T̂_{1L,0},  k = 1, …, m.     (13)
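The two initialization steps can be sketched as follows: a linear (Kasa-style) circle fit for each cylinder cluster, and a closed-form least-squares rigid transform (the 2D Procrustes solution) between the matched center points. Function names are ours, and the paper's exact least-squares formulation may differ in detail.

```python
import numpy as np

def fit_circle_center(pts):
    """Linear least-squares circle fit from x^2 + y^2 + a x + b y + c = 0.

    Returns the estimated center (-a/2, -b/2) only.
    """
    pts = np.asarray(pts, float)
    A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(len(pts))])
    rhs = -(pts ** 2).sum(axis=1)
    a, b, _ = np.linalg.lstsq(A, rhs, rcond=None)[0]
    return np.array([-a / 2.0, -b / 2.0])

def initial_rigid_transform(c_ref, c_L):
    """Least-squares 2D rigid transform with c_ref ~ R(theta) c_L + T.

    c_ref, c_L: (m, 2) arrays of matched cylinder centers in the
    reference (LiDAR 1) and LiDAR L frames, respectively.
    """
    c_ref, c_L = np.asarray(c_ref, float), np.asarray(c_L, float)
    mu_r, mu_L = c_ref.mean(axis=0), c_L.mean(axis=0)
    H = (c_L - mu_L).T @ (c_ref - mu_r)        # cross-covariance of centered sets
    theta = np.arctan2(H[0, 1] - H[1, 0], H[0, 0] + H[1, 1])
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return theta, mu_r - R @ mu_L
```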

2) THE SMALL ERRORS ESTIMATION
Due to the measurement noises in the LiDAR output, the estimated calibration state vector always has errors. In the proposed algorithm, these errors are estimated and used to correct the calibration state vector. The error state vector x_{c,e} consists of the estimation errors of the cylinder center points at the m positions, c_{k,e} ∈ R², k = 1, …, m (expressed in the LiDAR 1 coordinate frame), and the estimation errors of the rotation angles and translation vectors from LiDAR 1 to the others, (θ_{12,e}, T_{12,e}) and (θ_{13,e}, T_{13,e}):

x_{c,e} = [c_{1,e}ᵀ … c_{m,e}ᵀ θ_{12,e} T_{12,e}ᵀ θ_{13,e} T_{13,e}ᵀ]ᵀ.
Since the estimated cylinder center points are expressed in the LiDAR 1 coordinate frame, the circle fitting equations for each LiDAR at a position k are as follows:

‖s̃_{i_{1,k}} − c̃_k‖² = r̃_c²,     (15)
‖R(θ̃_{1L}) s̃_{i_{L,k}} + T̃_{1L} − c̃_k‖² = r̃_c²,  L = 2, 3.     (16)

For LiDAR 1, the measurement equation is similar to (5) and (6), where v_{s_{i_{1,k}}} ∈ R² is the measurement noise of the sample data s_{i_{1,k}}.
For LiDAR L = 2, 3, using the assumption θ̃ = θ̂ + θ_e, we approximate the rotation matrix as follows:

R(θ̃_{1L}) ≈ R(θ̂_{1L}) (I + θ_{1L,e} J),     (18)

where

J = [ 0  −1
      1   0 ].

By inserting (18) into (15) and (16) and replacing the true values with the estimated and error values, the measurement equation for LiDAR L = 2, 3 is given, to first order, as follows:

z_{i_{L,k}} = p_{i_{L,k}} c_{k,e} − o_{i_{L,k}} J s_{i_{L,k}} θ_{1L,e} − p_{i_{L,k}} T_{1L,e} + w_{i_{L,k}},

where z_{i_{L,k}} = ‖R(θ̂_{1L}) s_{i_{L,k}} + T̂_{1L} − ĉ_k‖² − r̂_c², p_{i_{L,k}} = 2 (R(θ̂_{1L}) s_{i_{L,k}} + T̂_{1L} − ĉ_k)ᵀ, o_{i_{L,k}} = p_{i_{L,k}} R(θ̂_{1L}), and w_{i_{L,k}} collects the noise terms in v_{s_{i_{L,k}}} and r_{c,e}. The measurement equation for all m positions of the cylinder is as follows:

z_c = H_c x_{c,e} + w_c,

where z_c and w_c stack the scalar measurements and noises of all LiDARs and positions, and each row of H_c places the coefficients above at the entries of x_{c,e} corresponding to c_{k,e}, θ_{1L,e} and T_{1L,e}. The computation of the covariance matrix of the measurement noise Q_c is similar to (8). Similarly to (9) and (10), we can estimate the error state vector and update the calibration state vector as follows:

x̂_{c,e} = P_c H_cᵀ Q_c⁻¹ z_c,  P_c = (H_cᵀ Q_c⁻¹ H_c)⁻¹,
x̂_c ← x̂_c + x̂_{c,e}.

A stop condition is also used to terminate the iteration: ‖x̂_{c,e}‖ < γ_c. The final x̂_c is the estimation result of the cylinder's center points and the calibration parameters (θ̂_12, T̂_12), (θ̂_13, T̂_13).
D. DATA COMBINATION
Using the estimated calibration parameters, the center points from LiDARs 2 and 3 are transformed into the LiDAR 1 reference coordinate frame and combined with those of LiDAR 1. The combined data are then sorted in ascending order of time. The sorted data are denoted as S*_{l(r)} = {c*_{l(r)}, t*_{c,l(r)}, τ*_{l(r)}, P*_{l(r)}}. As explained in the single LiDAR processing, the occlusion between the legs can result in missing center points in c_{l(r)}, which leads to discontinuities in the leg trajectory. To solve this problem, an eighth-order splines algorithm is used to interpolate and smooth the center point data [32].
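The combination step is a rigid-body transform of each LiDAR's center points followed by a time sort. A minimal sketch under an assumed container layout (lists of point arrays and stamp arrays, one per LiDAR):

```python
import numpy as np

def combine_lidar_data(datasets, transforms):
    """Transform per-LiDAR center points into the reference (LiDAR 1) frame
    and sort them by time stamp.

    datasets: list of (points (n, 2), stamps (n,)) tuples, one per LiDAR.
    transforms: list of (theta, T) per LiDAR, identity for LiDAR 1.
    """
    all_pts, all_t = [], []
    for (pts, t), (theta, T) in zip(datasets, transforms):
        R = np.array([[np.cos(theta), -np.sin(theta)],
                      [np.sin(theta),  np.cos(theta)]])
        all_pts.append(np.asarray(pts, float) @ R.T + np.asarray(T, float))
        all_t.append(np.asarray(t, float))
    pts, t = np.vstack(all_pts), np.concatenate(all_t)
    order = np.argsort(t)  # ascending order of time
    return pts[order], t[order]
```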

E. QUADRATIC OPTIMIZATION-BASED EIGHTH-ORDER SPLINES ALGORITHM
In this section, the spline functions ζ_{l(r)} of the human leg center points are derived using the data set S*_{l(r)}. For simplicity, we drop the subscript l(r) in the following equations.
Given the center point data set c*_i = f(t*_{c,i}), i ∈ [0, N], the i-th spline segment of ζ is given by:

ζ_i(t) = Σ_{k=1}^{8} a_{i,k} ((t − t_i)/T̄)^{k−1},  t ∈ [t_i, t_{i+1}],     (26)

where T̄ is the normalized length of each spline segment and a_{i,k} ∈ R^{2×1} are the coefficients of the i-th spline segment. The derivatives of (26) follow directly:

ζ_i^{(j)}(t) = Σ_{k=j+1}^{8} a_{i,k} [(k−1)(k−2)⋯(k−j)/T̄^j] ((t − t_i)/T̄)^{k−1−j}.     (29)

The requirement of continuity of ζ(t) and its first three derivatives at the (N−1) interior knots results in a linear relation between the stacked coefficient vectors ā_i ∈ R^{16} of adjacent segments:

ā_{i+1} = A ā_i,     (27)

where A ∈ R^{16×16} is given in (28). Among all piecewise continuous polynomials that satisfy the above criteria, we find the one that minimizes the function:

J = Σ_{i=0}^{N} (c*_i − ζ(t*_{c,i}))ᵀ (P*_i)⁻¹ (c*_i − ζ(t*_{c,i})) + Σ_{j=1}^{3} α_j ∫ ‖ζ^{(j)}(t)‖² dt,     (30)

where T̄ is the knot spacing and P*_i is the estimation error covariance of c*_i. The parameter α_j (j = 1, 2, 3) controls the magnitude of each of the derivatives of the spline.
If we use T̄ = 1 as the normalized spline segment length, the cost function (30) becomes a function of the coefficients ā_i alone. The integrals of the three derivatives can be computed using (29); each integral is a quadratic form in ā_i with constant Gram matrices, collected into block matrices U, V and W with blocks U_k, V_k, W_k ∈ R^{8×8} (k = 1, …, 4). From (27), all coefficient vectors ā_i can be expressed linearly in terms of an optimization variable X̄ that collects the free parameters of the spline. Substituting these expressions into the cost function, J becomes a quadratic function of X̄:

J(X̄) = X̄ᵀ M_1 X̄ + M_2ᵀ X̄ + M_3,     (38)

where M_1, M_2 and M_3 can be computed from the quadratic forms above. The M_1 and M_2 matrices are given in the Appendix (the M_3 matrix is omitted, since it is irrelevant to the optimization). The minimizing solution of (38) can be found by solving the following equation:

2 M_1 X̄* + M_2 = 0,     (39)

where X̄* is the minimizing solution. Once these optimal values are obtained, they can be used in (34) to obtain the optimal spline coefficients ā_k (k = 1, …, N). After that, we can estimate the spline leg center point trajectory and its velocity (first derivative) at the full time stamps τ* using (26) and (29).
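Once the cost has been reduced to the quadratic form (38), the optimization itself is a single linear solve. A minimal sketch for a symmetric positive definite M_1:

```python
import numpy as np

def minimize_quadratic(M1, M2):
    """Minimizer of J(X) = X^T M1 X + M2^T X + M3 for symmetric positive
    definite M1: the gradient 2 M1 X + M2 vanishes at the optimum."""
    return np.linalg.solve(2.0 * np.asarray(M1, float), -np.asarray(M2, float))
```

For example, J(x, y) = x² + y² − 2x − 4y corresponds to M_1 = I and M_2 = (−2, −4)ᵀ and is minimized at (1, 2).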

IV. EXPERIMENTAL RESULTS

A. DEPENDENCE OF MEASUREMENT NOISE ON RANGE
Since the accuracy of the RPLIDAR used in this paper is not listed in its data sheet, we performed a simple experiment to investigate the relationship between R_d and the measured range. This experiment was introduced in [33]. We placed a white sheet board at distances D = 1 m, 2 m, 3 m, 4 m and 5 m from the LiDAR core. At each distance, data were captured for about 10 minutes. Fig. 4a shows the experimental setup and Fig. 4b shows the scan data at each distance. We can see that the farther the distance, the larger the measurement errors. The results for the mean distance D̄ and the standard deviation R_d are given in Table 1. By fitting the computed data, we find the following relationship between the LiDAR range measurement standard deviation and the measured distance: R_d = 0.001·D. In this paper, we use R_φ = 0.001°.
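The zero-intercept linear fit R_d = k·D used above has a one-line closed form. A sketch (the data values in the test are illustrative, not the paper's measurements):

```python
import numpy as np

def fit_noise_model(distances, stds):
    """Least-squares slope k of the zero-intercept model std = k * distance."""
    D = np.asarray(distances, float)
    s = np.asarray(stds, float)
    return float(D @ s / (D @ D))  # argmin_k sum_i (s_i - k * D_i)^2
```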

B. HUMAN GAIT TRACKING PERFORMANCE
The proposed human gait tracking method was verified in two experiments with two healthy male subjects. The information about the two subjects is given in Table 2, and the parameters used in this paper are given in Table 3. Two configurations of the three-LiDAR system are tested: the triangle and the line configuration. In the first experiment, to provide the ground truth of the leg trajectories, we used an OptiTrack optical motion capture system composed of six Flex 13 cameras with a resolution of 1280 × 1024 at 120 Hz. The system can track passive spherical markers with sub-millimeter accuracy.

1) TRIANGLE CONFIGURATION EXPERIMENT
The first experiment used the triangle configuration of the three LiDARs as in Fig. 6. Each LiDAR covers the whole walking path. This configuration minimizes the occlusion between the legs during walking, so that more data of the human legs can be obtained. In this experiment, each subject walked 6 straight steps 15 times, divided into three speed levels: slow, normal and fast. Two optical markers were attached to the front side of the subject's shanks to obtain the true step lengths. Fig. 5 shows the average speed of each walk of the two subjects, computed from the optical data. To calibrate the three-LiDAR system, the cylinder was placed at 5 different positions in this experiment. The calibration parameters between the three LiDARs are estimated using the proposed calibration algorithm.
The initial estimates of the calibration parameters obtained from (12) and (13) are taken as the results of the conventional method. Using the estimated calibration parameters, we transform the measurement data into the LiDAR 1 reference coordinate system. Fig. 7 shows the results of the conventional method and the proposed method. The LiDAR positions are represented by circles and their measurement data by dots; blue, red and black denote LiDARs 1, 2 and 3, respectively. We can see that the transformation using the calibration parameters of the proposed method performs better than the conventional method.
By attaching an optical marker on top of each LiDAR, we can estimate the true position of each LiDAR in the camera system. However, we cannot know the true heading angle of each LiDAR. Therefore, we compute the relative distances d_12, d_13 and the angle α formed by these lines (right side of Fig. 6). The errors of this relative information compared with the camera system are given in Table 4. The relative distance errors were significantly reduced using the proposed method.
The calibration parameters are then used in the human gait tracking algorithm. Fig. 8a and Fig. 8b show the combined leg scan data and their center point estimates from the three LiDARs after transformation to the LiDAR 1 coordinate frame, respectively. Fig. 9 plots three center point estimates of the left leg from the three LiDARs and their corresponding estimation error covariance ellipses. The distances from the center points to the LiDARs are also calculated. We can see that as the distance increases, the error ellipse also becomes larger. This estimation error covariance is used as an adaptive weight in the spline algorithm (P*_i in (30)); a bigger weight is given to the closer center point, and vice versa.
Based on the assumption that the leg velocity is minimal when the foot is flat on the ground, we can apply a simple threshold-based method to the estimated spline velocity to detect the walking steps, as in [24]. Fig. 8c shows the estimated spline trajectories of the leg center points (dots) and the detected walking steps (diamonds). The left and right legs are classified and represented by black and magenta dots, respectively.
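The step detection can be sketched as picking local minima of the spline speed profile that fall below a threshold. The sampled-speed interface and the threshold value are our assumptions; the detector in [24] may differ in detail.

```python
import numpy as np

def detect_steps(speed, stamps, v_thresh):
    """Threshold-based step detection on a sampled leg-speed profile.

    A step event is taken at each local minimum of the speed that lies
    below v_thresh (foot approximately flat on the ground).
    """
    speed = np.asarray(speed, float)
    idx = [i for i in range(1, len(speed) - 1)
           if speed[i] < v_thresh
           and speed[i] <= speed[i - 1] and speed[i] <= speed[i + 1]]
    return [stamps[i] for i in idx]
```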
We compare the estimation results of the single LiDAR system and the three-LiDAR system with the optical camera system. The total number of walking steps of the two subjects is 90 for each side. Fig. 10 shows the step length estimation errors of the left and right legs. We can see that the mean and the maximum of the step length estimation errors of the multiple LiDAR system are smaller than those of a single LiDAR system. After removing the outliers, the root mean square errors (RMSEs) of the step length estimation for each speed level are given in Fig. 11. The RMSEs increase with the walking speed for both the single and the multiple system. The multiple LiDAR system has equal or better results compared with a single LiDAR system. As the walking speed increases, the amount of leg scan segment data decreases, which degrades the accuracy of the walking step length estimation. For the slow and normal walking speeds, the maximum RMSEs are about 0.01 m and 0.02 m, respectively. For the fast walking speed, the maximum RMSEs of the multiple LiDAR system are about 0.030 m and 0.015 m for the left and right legs, respectively.

2) LINE CONFIGURATION EXPERIMENT
The second experiment used the line configuration of the three LiDARs as in Fig. 12. Each subject was asked to walk straight from the start to the end footprints 10 times at the normal and fast speed levels. The distance between the two footprints, measured by a tape, is 20 m. In this configuration, each LiDAR covers a certain part of the path sequentially, which maximizes the tracking range of the system. An example of the available leg scan data segments of each LiDAR is shown in Fig. 13.
In this configuration, the calibration process was repeated for each pair of LiDARs due to the large distance between them. The cylinder was placed at different positions between LiDARs 1-2 and LiDARs 1-3. Fig. 14 shows the calibration results of this configuration. Using the estimated calibration parameters, we transformed the leg scan data from the LiDAR 2 and 3 frames into the LiDAR 1 frame. The transformation results are plotted in Fig. 15a. Fig. 15b shows the estimation results of the leg center points. The occlusion between the legs during walking is visible, since all the LiDARs are on one side of the walking path. The estimated spline trajectories of the left and right legs are plotted in Fig. 15c. Table 5 shows the estimated results of the total walking distance. The estimated walking distances have a maximum error of 0.155 m (0.78%) and a maximum RMSE of 0.106 m.
The experimental results show that our proposed method can estimate human gait parameters from multiple 2D LiDARs. However, the walking movement is limited to a horizontal plane because the 2D LiDAR only captures depth data in a two-dimensional plane. In this study, only young and healthy people were included. The performance is better at lower walking speeds, which indicates that the system could also be applied to older persons. However, many characteristics of older persons could affect the system performance, so an experiment with older persons is necessary in the future.

V. CONCLUSION AND FUTURE WORK
In this paper, a system of multiple 360-degree 2D LiDARs is used to estimate the human gait. The system is portable, low-cost and can be deployed easily. With the use of multiple LiDARs, the data loss caused by the occlusion between the legs is compensated and the tracking range can be increased. In a multiple-sensor system, the performance critically depends on the calibration between the sensors. To solve this problem, we proposed an iterative algorithm in which a cylinder with a known radius is used as a specific marker. The estimation of the cylinder's center points in each LiDAR coordinate system yields the optimal values of the calibration parameters. After combining the data, a quadratic optimization based splines algorithm is derived to interpolate the missing data and smooth the trajectory.
The measurement noises, which are neglected in most studies, are also considered in the LiDAR output. The scan data at different distances show the effect of distance on the measurement noises. Therefore, an assumption on the measurement noises is made in which the standard deviation of the heading angle measurement noise is a small constant value and the standard deviation of the distance measurement noise is a linear function of the measured distance. This measurement model is used in the calibration process and in the human leg center point estimation to enhance the accuracy.
Two configurations of the three-LiDAR system are given to verify the proposed method. The triangle configuration can observe the human leg scans from both sides of the walking path, so that it reduces the occlusion between the legs. The line configuration extends the tracking range of the system by arranging the LiDARs sequentially. The experimental results confirm the tracking performance of our proposed system. By using 2D LiDARs, the computational cost is lower than that of a 3D LiDAR system, which makes it possible for our system to work in real time; this will be considered in future work. In addition, the leg scan data are currently fitted by a circle with a fixed radius; however, when the leg is moving, the shape of the leg scan data is usually larger. Another model of the moving leg could be used to improve the system performance.