Relationship Between Long-Term Variability of Facial Hue Information in Physiological and Psychological ROIs and Health Condition

In this study, we evaluate the relationship between the long-term variability of facial hue information in physiological and psychological regions of interest (ROIs) and health condition. Long-term facial hue information in the RGB and L*a*b* color spaces was obtained from 13 physiological and psychological ROIs in visible facial images measured over approximately six months. The evaluation was performed under four conditions (absolute RGB values, absolute L*a*b* values, relative RGB values, and relative L*a*b* values). These four types of long-term variability of facial hue information were subjected to principal component analysis to clarify the relationship between facial hue information and health condition. The results suggested that the long-term variability of chromaticity information around the periorbital region may be related to health condition. Furthermore, physiological and psychological states related to health condition were estimated with an accuracy of approximately 80% using relative chromaticity information around the periorbital region.


I. INTRODUCTION
The spread of COVID-19 in recent months has led to an increasing number of situations in which non-contact sensors are employed to remotely detect people with fever and measure their body temperature. Moreover, there is a growing demand for remote vital sign measurement, as evidenced by the widespread use of applications that estimate heart rate [1], [2], respiratory rate [3], [4], and oxygen saturation [5], [6] from facial images captured by smartphone front cameras.
The face is an organ that is exposed at all times and conveys a variety of information, such as facial expressions, facial skin temperature, and facial color. Based on this information, various physiological and psychological states, such as emotion, mental stress, and drowsiness, have been evaluated. Work on emotion recognition based on facial expression has been ongoing for a long time; recently, deep learning algorithms have been employed and high recognition accuracy has been achieved [7], [8]. Facial skin temperature and facial color are known to fluctuate with skin blood flow, which is controlled by autonomic nervous system activity, and thus they have been used in many studies as remotely measurable indicators of autonomic nervous system activity. Studies dealing with facial skin temperature include the estimation of mental stress based on the time-series variation of nasal skin temperature [9], [10]. In addition, our previous study established a causal relationship between skin temperature in the forehead, cheeks, lips, and nasal region and physiological and psychological states [11]. Studies dealing with facial color include the assessment of emotion [12]-[14], melanin and hemoglobin distribution [15]-[17], skin pigmentation [18], and the relationship between facial expression and facial color [19]. However, most research on assessing physiological and psychological states from facial color has focused on short-term emotions.
In daily life, we habitually judge a person's health condition, such as fatigue or lack of sleep, from facial skin color cues such as dark rings under the eyes. Recently, the relationship between color variation in facial skin and health condition has also been investigated. Jones et al. evaluated color variation in the facial skin of the forehead, cheek, and periorbital areas [20]. They found that although there was little correlation between skin color and health overall, the luminance of the periorbital region in particular fluctuated depending on the health condition. However, this study has a major limitation in that long-term measurements of visible facial images were not performed and the evaluated facial areas were limited. The previous study by Jones et al. measured facial images of a large group of women at a single time, divided the images into good and poor physical condition groups, and then compared the hue information in the forehead, periorbital, and cheek regions between the two groups. To realize the assessment of health and other conditions based on facial color, it is necessary to evaluate long-term changes in facial color in various regions.
In this study, we evaluated the long-term variability of facial hue information in 13 physiological and psychological ROIs. Furthermore, we estimated physiological and psychological states related to health condition based on the long-term variability of facial hue information.

II. EXPERIMENT
To achieve the purpose of this study, we measured the facial visible images and subjective sensations related to health condition over a long period of time and clarified the relationship between long-term variability of facial hue information and subjective sensations. In this section, we describe the experiment conducted to measure facial visible images and subjective sensations related to health condition over a long period of time.

A. PROTOCOL OF THE EXPERIMENT
In this study, an experiment measuring visible facial images and subjective sensations related to health condition was performed for approximately six months (from November 2020 to May 2021). Both measurements were performed using a self-made application on an iPad Air (3rd generation, Apple, California, USA). The subjects answered a four-point-scale questionnaire regarding their health condition and then captured their own visible facial images using the iPad camera.
The questionnaire has four question items inquiring about the awake condition, physical condition, comfort, and energy. The four-point scales for these items are listed in Table 1. A lower score indicates a good health condition, while a higher score indicates a poor health condition. In this study, the total score of the four subjective sensations was used as the indicator of health condition.
After answering the questionnaire, the subjects captured their own visible facial image using the iPad camera. The measured images were saved in JPG format at 2894 × 2316 pixels.

B. EXPERIMENTAL CONDITIONS
The experiment was performed in a room where the temperature was 25 ± 1 °C and the illumination was approximately 900-1000 lux. The illumination of the experimental room was not affected by sunlight or other factors. The subjects were 24 healthy adults (18 males and 6 females, aged 21-50 years). The subjects participated in the experiment for approximately six months, and 506 data sets (questionnaire results and visible facial images) were obtained. Before the analysis, we visually confirmed that the color of the obtained images was not saturated. The study obtained official approval from the Life Science Committee of the Department of Science and Engineering, Aoyama Gakuin University (Approval number: H17-M13-3).

III. ANALYSIS OF FACIAL HUE INFORMATION IN PHYSIOLOGICAL AND PSYCHOLOGICAL ROIs
In this section, the methods for analyzing the long-term variability of facial hue information in 13 physiological and psychological ROIs are described. The evaluation of long-term facial hue information in the 13 ROIs and the estimation of health condition based on this information were performed according to the flow shown in Fig. 1. First, the absolute and relative values of the hue extracted from the 13 facial ROIs, represented in the RGB and L*a*b* color spaces, were obtained by applying image processing algorithms to the visible facial images. Next, principal component analysis (PCA) was applied to the absolute and relative hue information together with the indicator of health condition to identify hue information related to health condition. Finally, classification of health condition based on the hue information related to health condition was performed using linear discriminant (LD), support vector machine (SVM), and neural network (NN) classifiers. The details are described below.

A. CONSTRUCTION OF PHYSIOLOGICAL AND PSYCHOLOGICAL ROIs
Figure 2 shows an overview of the construction of the physiological and psychological ROIs, which were built from facial landmarks. In a previous study, a causal relationship was found between skin temperature fluctuations in facial regions such as the nose, cheeks, upper orbit, lower orbit, forehead, and chin and physiological and psychological states related to emotion [21]. Based on this, 13 physiological and psychological ROIs were constructed; their locations are shown in Fig. 2(b). The ROIs were constructed as follows. First, a total of 68 facial landmarks were extracted from the saved visible facial images using a constrained local model, which is also used in facial expression recognition [22], [23]; the landmarks were obtained using the method proposed by Yan [24]. Next, the position, width, and height of the 13 ROIs were determined based on the coordinates of the landmarks.
Table 2 shows the top-left corner coordinates, width, and height of each ROI described in Fig. 2(b). x_n and y_n indicate the x and y coordinates of the n-th landmark shown in Fig. 2(a). The sizes of the ROIs differed slightly among subjects, but as described in Section III-B, the average of the color information within each ROI was analyzed, so the difference in ROI size had no effect.
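The per-ROI averaging described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the landmark coordinates and the corner/size formulas below are hypothetical stand-ins (the actual formulas for each of the 13 ROIs are given in Table 2 in terms of the 68 landmarks).

```python
import numpy as np

def roi_mean_rgb(image, x, y, w, h):
    """Average RGB color within a rectangular ROI of a facial image."""
    patch = image[int(y):int(y + h), int(x):int(x + w)]
    return patch.reshape(-1, 3).mean(axis=0)

# Hypothetical landmark coordinates; real values come from the
# constrained-local-model fit of the 68-point landmark set.
landmarks = np.zeros((68, 2))
landmarks[2] = (60, 110)    # e.g. a jawline landmark (illustrative)
landmarks[31] = (95, 80)    # e.g. a nose landmark (illustrative)
x = min(landmarks[2][0], landmarks[31][0])
y = min(landmarks[2][1], landmarks[31][1])
w = abs(landmarks[31][0] - landmarks[2][0])
h = abs(landmarks[31][1] - landmarks[2][1])

# Uniform synthetic image standing in for a measured facial image
image = np.full((200, 200, 3), (180, 140, 120), dtype=np.uint8)
mean_rgb = roi_mean_rgb(image, x, y, w, h)
```

Because only the within-ROI mean enters the analysis, small per-subject differences in ROI size do not affect the resulting hue features.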

B. EVALUATION OF LONG-TERM VARIABILITY OF FACIAL HUE INFORMATION IN PHYSIOLOGICAL AND PSYCHOLOGICAL ROIs
In general, hue information is expressed using the RGB color space. Most studies using facial visible images have evaluated the absolute value of RGB.
However, evaluating absolute RGB values poses two major problems. First, because the RGB color space is a color model intended for output to displays, it is not perceptually uniform: equal changes in RGB values do not correspond to equal changes in perceived color. Considering that health condition may be judged from facial color in a variety of situations, an evaluation based on a color space with perceptual uniformity in response to changes in color values is necessary. Second, absolute hue values can be greatly affected by disturbances such as outside light. Furthermore, there are individual differences in skin color.
Therefore, we adopted the following two approaches. The first is to use the CIE 1976 L*a*b* color space as the hue information. This color space represents color by lightness L* and chromaticities a* and b*, and is designed to approximate human vision, so that color differences can be handled equally and perceptually.
The second is to evaluate relative hue values. A conceptual diagram of the relative evaluation of facial hue information is shown in Fig. 3. Facial color consists of the original skin color (''Skin color'' in Fig. 3) and a color component related to skin blood flow (''Blood flow component'' in Fig. 3). The blood flow component fluctuates depending on physiological and psychological conditions such as physical condition and emotion, fluctuating greatly in areas with many blood vessels and little in areas with few blood vessels. By taking the difference between an area with many blood vessels and one with few, the evaluation becomes largely unaffected by the underlying skin color component. In other words, we considered that evaluating the difference in hue information between two ROIs makes it possible to evaluate the variation in blood flow associated with variations in physiological and psychological states. Therefore, two ROIs were selected from the 13 ROIs, and the differences in their hue information were evaluated.
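The pairwise-difference construction above can be sketched as follows; the per-ROI values here are toy numbers, not measured data.

```python
from itertools import combinations

import numpy as np

def relative_hue(absolute):
    """Pairwise differences of per-ROI hue values.

    absolute : dict mapping ROI index (1..13) -> 3-vector
               (R, G, B or L*, a*, b*)
    Returns a dict mapping the ROI pair (i, j) -> absolute[i] - absolute[j].
    """
    return {(i, j): np.asarray(absolute[i]) - np.asarray(absolute[j])
            for i, j in combinations(sorted(absolute), 2)}

# Toy per-ROI hue values standing in for the measured averages
rois = {k: np.array([k * 1.0, k * 2.0, k * 3.0]) for k in range(1, 14)}
rel = relative_hue(rois)
```

With 13 ROIs, this yields 13C2 = 78 relative hue vectors per image, matching the count stated in the paper.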
Therefore, the long-term variations of the absolute RGB values, absolute L*a*b* values, relative RGB values, and relative L*a*b* values were evaluated. The methods for obtaining the hue information are described below. The absolute RGB value for each ROI was obtained by averaging the RGB values within the ROI. The relative RGB value was obtained by selecting two of the 13 ROIs and taking the difference of their absolute RGB values; the number of relative RGB values was 78 (i.e., 13C2). The absolute and relative L*a*b* values were obtained by the following standard method [25]. First, the nonlinear sRGB values, normalized to [0, 1], were linearized:

C_linear = C / 12.92 (C ≤ 0.04045), ((C + 0.055) / 1.055)^2.4 (C > 0.04045), for C ∈ {R, G, B}.

Next, the linear values were converted into the CIE XYZ color space:

X = 0.4124 R_linear + 0.3576 G_linear + 0.1805 B_linear
Y = 0.2126 R_linear + 0.7152 G_linear + 0.0722 B_linear
Z = 0.0193 R_linear + 0.1192 G_linear + 0.9505 B_linear

Finally, the CIE XYZ values were transformed into the CIE 1976 L*a*b* color space:

L* = 116 f(Y/Y_n) - 16
a* = 500 (f(X/X_n) - f(Y/Y_n))
b* = 200 (f(Y/Y_n) - f(Z/Z_n))
f(q) = q^(1/3) (q > (6/29)^3), (1/3)(29/6)^2 q + 4/29 (otherwise)

where q ∈ {X/X_n, Y/Y_n, Z/Z_n}, and X_n, Y_n, and Z_n are the tristimulus values of the reference illuminant. MATLAB 2021a (MathWorks, Massachusetts, USA) was used to convert from the RGB color space to the L*a*b* color space.
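As a sanity check on this conversion pipeline, the following is a minimal Python sketch of the standard sRGB → XYZ → L*a*b* transformation. The paper used MATLAB's built-in conversion; the D65 reference white used here is an assumption.

```python
import math

def srgb_to_lab(r, g, b, white=(95.047, 100.0, 108.883)):
    """Convert an 8-bit sRGB color to CIE 1976 L*a*b*.

    Pipeline: sRGB -> linear RGB -> CIE XYZ -> L*a*b*.
    `white` is the reference-white tristimulus (D65, 2-degree observer,
    assumed here).
    """
    def linearize(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    rl, gl, bl = linearize(r), linearize(g), linearize(b)
    # sRGB-to-XYZ matrix, scaled so that Y of the white point is 100
    x = (0.4124 * rl + 0.3576 * gl + 0.1805 * bl) * 100
    y = (0.2126 * rl + 0.7152 * gl + 0.0722 * bl) * 100
    z = (0.0193 * rl + 0.1192 * gl + 0.9505 * bl) * 100

    def f(q):
        # cube root above the (6/29)^3 threshold, linear segment below
        return q ** (1 / 3) if q > (6 / 29) ** 3 else q / (3 * (6 / 29) ** 2) + 4 / 29

    fx, fy, fz = (f(v / w) for v, w in zip((x, y, z), white))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

lab_white = srgb_to_lab(255, 255, 255)   # should be near (100, 0, 0)
lab_black = srgb_to_lab(0, 0, 0)         # should be near (0, 0, 0)
```

The relative L*a*b* values are then obtained exactly as for RGB, by differencing the per-ROI L*a*b* averages of ROI pairs.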

C. CAUSALITY ASSESSMENT BETWEEN FACIAL HUE INFORMATION AND SUBJECTIVE SENSATIONS
Among the facial hue information measured over the long term, some is expected to vary greatly while some is not. Furthermore, some of the hue information is expected to fluctuate due to health condition, while the rest fluctuates due to other factors. To identify the facial hue information that varies due to health condition, principal component analysis (PCA) was applied to the long-term variations of the absolute RGB, absolute L*a*b*, relative RGB, and relative L*a*b* values together with the indicator of health condition. PCA was performed separately for each of the four conditions. The input data for PCA were the 506 sets of standardized hue information and the indicator of health condition. The principal components up to a cumulative contribution rate exceeding 80% were included in the analysis. The numbers of principal components analyzed are shown in Table 3; they ranged from 4 to 10.
As mentioned above, some of the hue information is expected to fluctuate due to health condition, while the rest fluctuates due to other factors. To identify the principal components related to health condition, we selected those with large absolute factor loadings for the indicator of health condition. For absolute RGB, absolute L*a*b*, and relative RGB, one such principal component was found, while the factor loadings of the remaining components were low. For relative L*a*b*, two such principal components were found (defined as PC_A and PC_B), while the factor loadings of the remaining components were low. Therefore, these two principal components were analyzed in the case of relative L*a*b*.
In addition, to identify the hue information related to health condition, the 10 hue information variables with the largest loadings on the principal components related to health condition (hereinafter referred to as ''hue information related to health condition'') were selected.
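The selection procedure in this subsection can be sketched as follows. This is a numpy-only illustration on synthetic data: the feature count, the placement of the health indicator as the last column, and all values are assumptions, not the paper's data.

```python
import numpy as np

# Toy stand-in for the 506 measurement sets: hue features plus the
# health-condition indicator (here 39 features + 1 indicator = 40 columns).
rng = np.random.default_rng(1)
X = rng.normal(size=(506, 40))
X = (X - X.mean(axis=0)) / X.std(axis=0)        # standardize, as in the paper

# PCA via SVD of the standardized data matrix
U, S, Vt = np.linalg.svd(X, full_matrices=False)
explained = S**2 / np.sum(S**2)                 # contribution rate per component
cum = np.cumsum(explained)
n_keep = int(np.searchsorted(cum, 0.80) + 1)    # components up to 80% cumulative

# Per-feature loadings of the kept components; the component with the
# largest absolute loading on the health indicator (last column) is
# treated as "related to health condition".
loadings = Vt[:n_keep]
health_pc = int(np.argmax(np.abs(loadings[:, -1])))

# Top 10 hue features by absolute loading on that component
top10 = np.argsort(np.abs(loadings[health_pc, :-1]))[::-1][:10]
```

On the real data this yields one health-related component for absolute RGB, absolute L*a*b*, and relative RGB, and two (PC_A, PC_B) for relative L*a*b*.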

D. CLASSIFICATION OF HEALTH CONDITION BASED ON FACIAL HUE INFORMATION
In this section, the details of the evaluation for estimating health condition based on hue information related to health condition are described.
In this study, the accuracy of classification into a good health group (low health indicator) and a poor health group (high health indicator) was evaluated. The main problem was that the amount of data for the poor health group was small. Therefore, the synthetic minority oversampling technique (SMOTE), an oversampling algorithm, was applied to the facial hue information and the health indicator. In this study, the number of data with a health indicator of 6 or more was made equal to the number of data with an indicator of less than 6. The number of data in each score range before and after applying SMOTE is shown in Table 4.
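The interpolation idea behind SMOTE can be sketched as follows. This is a minimal illustration, not the implementation used in the paper; the neighbor count, sample sizes, and data are all illustrative.

```python
import numpy as np

def smote(X, n_new, k=5, rng=None):
    """Minimal SMOTE sketch: synthesize n_new minority-class samples by
    interpolating between each chosen sample and one of its k nearest
    neighbors within the minority class."""
    if rng is None:
        rng = np.random.default_rng(0)
    out = []
    for _ in range(n_new):
        i = rng.integers(len(X))
        d = np.linalg.norm(X - X[i], axis=1)
        nn = np.argsort(d)[1:k + 1]          # k nearest neighbors, excluding self
        j = rng.choice(nn)
        out.append(X[i] + rng.random() * (X[j] - X[i]))  # random point on the segment
    return np.array(out)

# Toy minority-class (poor health) feature vectors
minority = np.random.default_rng(2).normal(size=(30, 10))
synthetic = smote(minority, n_new=70)        # oversample toward class balance
```

Each synthetic sample lies on a line segment between two real minority samples, so the oversampled set stays within the minority class's feature range rather than duplicating existing points.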
The aim of this study was to detect poor health conditions, but the number of data with a very high health indicator was small even after oversampling. Considering the number of data in the poor health group, the threshold dividing the good and poor health groups was set to 9: data with a health indicator of less than 9 were designated as the good health group, and data with an indicator of 9 or greater were designated as the poor health group.
Classification of health condition based on the oversampled data was performed using linear discriminant (LD) analysis, a quadratic support vector machine (SVM), and a 2-layer neural network (NN) created in MATLAB 2021a (MathWorks Inc., Massachusetts, USA). The classification was evaluated based on the confusion matrix obtained from 5-fold cross-validation; accuracy and precision were evaluated.
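The evaluation loop can be sketched as follows. This numpy-only sketch uses a simple two-class linear discriminant in place of the MATLAB classifiers (LD, quadratic SVM, 2-layer NN) used in the paper, and runs on synthetic, well-separated data; it illustrates the 5-fold cross-validation and confusion-matrix accounting, not the paper's results.

```python
import numpy as np

def lda_fit_predict(Xtr, ytr, Xte):
    """Two-class linear discriminant: project onto the direction separating
    the class means under a pooled covariance estimate."""
    m0, m1 = Xtr[ytr == 0].mean(axis=0), Xtr[ytr == 1].mean(axis=0)
    cov = np.cov(Xtr.T) + 1e-6 * np.eye(Xtr.shape[1])   # regularized covariance
    w = np.linalg.solve(cov, m1 - m0)
    thr = w @ (m0 + m1) / 2                             # midpoint threshold
    return (Xte @ w > thr).astype(int)

def cross_validate(X, y, k=5, rng=None):
    """k-fold CV returning the summed 2x2 confusion matrix [[TN, FP], [FN, TP]]."""
    if rng is None:
        rng = np.random.default_rng(3)
    idx = rng.permutation(len(X))
    cm = np.zeros((2, 2), dtype=int)
    for fold in np.array_split(idx, k):
        tr = np.setdiff1d(idx, fold)
        pred = lda_fit_predict(X[tr], y[tr], X[fold])
        for t, p in zip(y[fold], pred):
            cm[t, p] += 1                               # rows: true, cols: predicted
    return cm

# Toy separable data standing in for the oversampled hue features:
# class 0 = good health, class 1 = poor health
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 1, (100, 10)), rng.normal(2, 1, (100, 10))])
y = np.array([0] * 100 + [1] * 100)
cm = cross_validate(X, y)
accuracy = np.trace(cm) / cm.sum()
```

Summing the per-fold confusion matrices, as here, is one common way to report a single confusion matrix over a full cross-validation run.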

IV. RESULTS AND DISCUSSION
A. PRINCIPAL COMPONENT ANALYSIS
The selected absolute and relative hue information of RGB and L*a*b* related to health condition is shown in Tables 5 (absolute RGB and L*a*b*) and 6 (relative RGB and L*a*b*), respectively. The principal components with large absolute factor loadings for the indicator of health condition were the 2nd (absolute RGB), 4th (absolute L*a*b*), 5th (relative RGB), 7th (relative L*a*b*, PC_A), and 5th (relative L*a*b*, PC_B) principal components. From Table 5, five hue information variables in the periorbital area were selected in the case of absolute RGB and L*a*b*. In addition, 9 out of 10 variables in the case of absolute L*a*b* were a* and b*, which represent chromaticity information. Further, from Table 6, 10 relative hue variables in the periorbital area were selected in the case of relative RGB, and 6 in the case of relative L*a*b* (PC_B). Furthermore, 9 out of 10 variables in the case of relative L*a*b* (PC_B) were a* and b*. For relative L*a*b* (PC_A), almost all the selected variables were L*, which represents lightness information.

Tables 7 and 8 present the accuracy and precision of the classification of health condition based on facial hue information. It is evident that the accuracies and precisions in the case of relative L*a*b* (PC_B) were the highest for all three classification methods. In contrast, for relative L*a*b* (PC_A), the accuracies and precisions were low for LD and the quadratic SVM. The confusion matrix for PC_B with the 2-layer NN is shown in Fig. 4, where the good health group and the poor health group were treated as negative and positive, respectively. From Fig. 4, the accuracy, recall, precision, and F-measure were determined to be 86.9%, 89.3%, 85.1%, and 87.2%, respectively.
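The four metrics reported above follow directly from the confusion matrix. The sketch below shows the standard definitions with poor health as the positive class; the counts are illustrative only, not the values in Fig. 4.

```python
def metrics(tn, fp, fn, tp):
    """Accuracy, recall, precision, and F-measure from a 2x2 confusion
    matrix (poor health treated as the positive class, as in the paper)."""
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    recall = tp / (tp + fn)                    # sensitivity to poor health
    precision = tp / (tp + fp)
    f_measure = 2 * precision * recall / (precision + recall)
    return accuracy, recall, precision, f_measure

# Illustrative counts only (the actual confusion matrix is given in Fig. 4)
acc, rec, prec, f1 = metrics(tn=40, fp=10, fn=5, tp=45)
```

With these toy counts, accuracy is 0.85, recall 0.90, precision 9/11, and F-measure 6/7.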

B. CLASSIFICATION OF HEALTH CONDITION BASED ON FACIAL HUE INFORMATION
The accuracies for the classification of health condition were higher when using L*a*b* values compared to RGB values, and when using relative rather than absolute hue information. Here, we performed a detailed analysis focusing on the relative L*a*b* values. The factor loadings of PC_A and PC_B and the scatter plot of the principal component scores of PC_A and PC_B are shown in Figs. 5 and 6, respectively. In Fig. 5, L, a, and b indicate the respective relative values, and ''Health'' indicates the factor loading related to health condition. The numbers in the figure indicate the physiological and psychological ROIs shown in Fig. 2(b); for example, '(1-7)_L' indicates the difference between the L* values of the right cheek and the philtrum. In Fig. 6, the blue and red points indicate the data of the poor health group and the good health group, respectively, where data with a health indicator of less than 9 were designated as the good health group and data with an indicator of 9 or greater as the poor health group. As shown in Figs. 5 and 6, the worse the health condition, the larger the values of the relative hue information for pairs involving ROIs 10-12, around the periorbital area, became.
As explained in Section III-B, evaluating RGB values presents major problems, which is consistent with the higher accuracies obtained with L*a*b* values. In the case of relative L*a*b* (PC_A), almost all the selected hue information was L*, which represents lightness, and the corresponding classification accuracies were relatively low; in contrast, PC_B, dominated by the chromaticities a* and b*, achieved the highest accuracies. A previous study also showed that hue information around the periorbital region in particular fluctuates depending on health condition [20]. Furthermore, Axelsson et al. showed that darker coloration around the periorbital region increased with lack of sleep, which had a negative impact on perceived health [26]. Thus, the results indicate that the long-term variability of chromaticity information around the periorbital region can be related to health condition.

V. LIMITATIONS
There are four major limitations in this study.
The first limitation is the measurement location. In this study, the analyzed visible facial images were measured in a place with a constant lighting environment. However, visible images can be affected by disturbances such as outside light, and it is necessary to confirm whether the proposed method can analyze facial hue information in images acquired in different environments. Thus, future measurements of visible facial images need to be performed in places with varied lighting environments.
The second limitation is the small number of subjects. In particular, the number of female subjects was six. In addition, females often wear makeup, and their facial skin color changes depending on the makeup. In future studies, the number of female subjects should be increased to further assess the long-term variability of facial skin color in women and its relationship to health condition.
The third limitation is that the analysis area was limited. In this study, 13 physiological and psychological ROIs were selected as the analysis areas, and it was confirmed that the long-term variability of chromaticity information around the periorbital region could be related to health condition. Using the skin color information of the entire face is expected to further improve the accuracy of health condition estimation. In future studies, an evaluation of the long-term variability of hue information over the entire face is needed.
The fourth limitation is that facial hue information can be changed by factors unrelated to health condition. In this study, we focused on health condition related to the awake condition, physical condition, comfort, and energy. However, facial hue can also change due to other factors, such as stress, shame, fear, exercise, anger, cold, and heat. The experiment was conducted in a room with a constant temperature, suggesting no effects of cold or heat, but the relationship between facial hue information and factors unrelated to health condition needs to be evaluated in our future studies.

VI. CONCLUSION
In this study, an evaluation of the long-term variability of facial hue information in physiological and psychological ROIs was performed, together with an estimation of physiological and psychological states related to health condition based on this variability. Based on the results, it was concluded that the long-term variability of chromaticity information around the periorbital region could be related to health condition, and that physiological and psychological states related to health condition could be estimated with an accuracy of around 80% using relative chromaticity information around the periorbital region. However, future studies should analyze visible facial images measured under varied lighting environments, increase the number of subjects, evaluate the long-term variability of hue information over the entire face, and evaluate the relationship between facial hue information and factors unrelated to health condition in order to gain better insights.