Uncertainty Measure of Basic Probability Assignment Based on Renyi Entropy and Its Application in Decision-Making

Since the Dempster-Shafer evidence theory was developed, it has received extensive attention from researchers. Compared with Bayesian probability theory, Dempster-Shafer evidence theory satisfies weaker constraints and is better able to represent uncertainty, so it is widely used in information systems. However, how to measure the uncertainty of a basic probability assignment in Dempster-Shafer evidence theory is still a problem worthy of attention. Therefore, based on Renyi entropy, this paper proposes a novel method to measure the uncertainty of basic probability assignments in Dempster-Shafer evidence theory. In addition, after proving that this method is compatible with Shannon entropy, a large number of comparative experiments are carried out to illustrate its effectiveness. Finally, through an application in decision-making, it is shown that a combination rule that accounts for uncertainty produces more reasonable results.


I. INTRODUCTION
The real world is full of uncertainty, and the existence of uncertainty will interfere with our correct judgment of things. In order to solve this issue, a number of theories have been proposed, such as Bayesian probability theory [1], Dempster-Shafer evidence theory [2]- [4], generalized evidence theory [5], [6], soft set [7], rough set [8], [9], D-number [10], Z-number [11], [12], Pythagorean fuzzy set [13], and other theories. In addition, based on these theories, a number of researchers have developed many methods [14]- [19]. Among these theories and methods, Dempster-Shafer evidence theory has been widely developed by researchers because of its weak constraints [20]. Because of this advantage, Dempster-Shafer evidence theory is applied in a number of fields, such as DS-VIKOR [21], Dempster-Shafer Electre [22], measuring divergence [23], multi-sensor data fusion [24], decision-making [25], and many other fields. However, how to measure the uncertainty of basic probability assignment (BPA) in Dempster-Shafer evidence theory is still a problem worthy of attention.
In 1865, the concept of entropy [26] was first proposed by German physicist Clausius. In the beginning, he only applied
it to the study of chemistry and thermodynamics. In 1948, Shannon proposed Shannon entropy [27] and introduced it to information theory for the first time. In the following decades, many researchers continued to improve Shannon entropy and used entropy theory to measure the uncertainty of BPA in Dempster-Shafer evidence theory [28]- [32]. When measuring uncertainty, researchers usually start from inconsistency and non-specificity. Accordingly, there are three famous entropy models based on these two aspects: (1) Dubois and Prade proposed weighted Hartley entropy [33], which measures the uncertainty of BPA from the non-specificity point of view. (2) Yager investigated the inconsistency of BPA based on the plausibility function [34]. (3) Based on Shannon entropy, Klir considered the disorder and conflict measures of BPA [35], [36] and proposed five axioms for entropy models.
Recently, Deng entropy [28] and Gao's method based on Tsallis entropy [31] were proposed to measure the uncertainty of BPA. Experiments show that these two kinds of entropy perform well in uncertainty measurement and satisfy the five axioms proposed by Klir [36].
Another entropy widely used in image segmentation is Renyi entropy [37], which is also an extension of Shannon entropy. Renyi entropy introduces an adjustable parameter q, which makes the measurement of information more general and flexible. However, a key question in applying Renyi entropy is how to define the parameter q. Therefore, we propose a new BPA uncertainty measurement model based on Renyi entropy and determine the value of q from the size of the frame of discernment and the cardinality of each BPA. It is proved that the model is compatible with Shannon entropy.
The rest of this paper is organized as follows. In Section II, we review classical Dempster-Shafer evidence theory and introduce three methods of uncertainty measurement. In Section III, the mathematical formula of Renyi entropy is derived and rewritten; after that, a novel method of measuring BPA uncertainty is proposed, and we summarize several properties of the Renyi entropy model through mathematical proof. In Section IV, several numerical experiments are carried out to verify the effectiveness of the proposed model and to compare it with other methods. In Section V, we apply the proposed method to the decision-making field; the experimental results show that the proposed method is reasonable. In Section VI, this paper summarizes the advantages of the model and discusses further research.

II. PRELIMINARIES
In this section, the traditional Dempster-Shafer evidence theory will be reviewed, and several important entropy models will be introduced.

A. EVIDENCE THEORY
The treatment of uncertainty problems promoted the birth of Dempster-Shafer evidence theory. Dempster-Shafer evidence theory [38]- [42] has become a useful tool for researchers to measure uncertainty because its constraints are weaker than those of Bayesian probability theory [20]. Therefore, this section will introduce several important definitions.
Definition 1: Suppose that Θ is a nonempty finite set whose elements are mutually exclusive, each corresponding to a possible proposition. The set Θ is known as the frame of discernment.
Definition 2: Suppose there are n elements in the set Θ; then the length of Θ is n, and its power set, with 2^n elements, can be defined as:

2^Θ = {∅, {A_1}, . . . , {A_n}, {A_1, A_2}, . . . , Θ}

Definition 3: Denote a frame of discernment Θ = {A_1, A_2, . . . , A_n}. For any subset A of the frame of discernment, let it correspond to a function m: 2^Θ → [0, 1], which satisfies the following definitions [43]:

m(∅) = 0

with

∑_{A⊆Θ} m(A) = 1

Such a function m is named a mass function or basic probability assignment (BPA).
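For illustration, Definitions 2 and 3 can be sketched in Python. This sketch is not part of the original paper; the function names are our own, and subsets of the frame are represented as `frozenset`s:

```python
from itertools import combinations

def powerset(frame):
    """All non-empty subsets of a frame of discernment (2^n - 1 of them)."""
    items = sorted(frame)
    return [frozenset(c) for r in range(1, len(items) + 1)
            for c in combinations(items, r)]

def is_valid_bpa(m, tol=1e-9):
    """Check Definition 3: m(emptyset) = 0 and the masses sum to 1."""
    if m.get(frozenset(), 0.0) != 0.0:
        return False
    return abs(sum(m.values()) - 1.0) < tol

# A BPA over the frame {a, b, c}: mass on a singleton and a 2-element subset.
frame = {"a", "b", "c"}
m = {frozenset({"a"}): 0.6, frozenset({"b", "c"}): 0.4}
```

For a 3-element frame, `powerset` returns the 2^3 − 1 = 7 non-empty subsets over which a BPA may distribute its mass.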

Definition 4: For any subset A of Θ, the plausibility function and belief function can be defined as follows [43]:

Bel(A) = ∑_{B⊆A} m(B)

Pl(A) = ∑_{B∩A≠∅} m(B)

According to the above definitions, it is easy to see that the plausibility function and belief function satisfy Pl(A) = 1 − Bel(Ā), where Ā is the complement of A. In addition, for any A ⊆ Θ, the numerical relationship between them is Bel(A) ≤ Pl(A). Therefore, if we regard Bel(A) as the lower bound and Pl(A) as the upper bound, we obtain the belief interval of A, that is, [Bel(A), Pl(A)]. The belief interval represents the support degree of the evidence [43].
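A minimal sketch of the belief and plausibility functions (our own illustration, not from the paper), which also checks the complement relation numerically:

```python
def bel(m, A):
    """Belief: total mass committed to subsets of A (lower bound)."""
    return sum(v for B, v in m.items() if B <= A)

def pl(m, A):
    """Plausibility: total mass of focal elements intersecting A (upper bound)."""
    return sum(v for B, v in m.items() if B & A)

m = {frozenset({"a"}): 0.6, frozenset({"b", "c"}): 0.4}
frame = frozenset({"a", "b", "c"})
A = frozenset({"a"})
# Pl(A) equals 1 - Bel(complement of A), and Bel(A) <= Pl(A).
assert abs(pl(m, A) - (1.0 - bel(m, frame - A))) < 1e-9
assert bel(m, A) <= pl(m, A)
```

Here the belief interval of {a} is [0.6, 0.6], since no multi-element focal set overlaps {a}.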
Definition 5: The reason why Dempster-Shafer evidence theory could be developed is that Dempster proposed a combination rule for pieces of evidence. Suppose the system has generated two BPAs, m_1 and m_2; the fusion rule between them can be represented as:

m(A) = (1 / (1 − K)) ∑_{B∩C=A} m_1(B) m_2(C), m(∅) = 0

where K ∈ [0, 1) is the conflict coefficient between the evidences. The greater K is, the stronger the conflict; the smaller K is, the weaker the conflict. The definition of K is represented as [43]:

K = ∑_{B∩C=∅} m_1(B) m_2(C)

These are the main definitions of Dempster-Shafer evidence theory.
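Dempster's rule can be sketched as follows (an illustrative implementation, not the paper's code): compute the conflict K over pairs of focal elements with empty intersection, then renormalize the conjunctive products by 1 − K:

```python
def dempster_combine(m1, m2):
    """Dempster's rule: conflict K, then normalized conjunctive combination."""
    K = sum(v1 * v2
            for B, v1 in m1.items() for C, v2 in m2.items() if not (B & C))
    combined = {}
    for B, v1 in m1.items():
        for C, v2 in m2.items():
            A = B & C
            if A:
                combined[A] = combined.get(A, 0.0) + v1 * v2 / (1.0 - K)
    return combined, K

m1 = {frozenset({"a"}): 0.7, frozenset({"b"}): 0.3}
m2 = {frozenset({"a"}): 0.6, frozenset({"b"}): 0.4}
m12, K = dempster_combine(m1, m2)
# K = 0.7*0.4 + 0.3*0.6 = 0.46; the combined masses renormalize to sum to 1.
```

Note that the rule is undefined for K = 1 (total conflict), which is exactly the pathological case discussed in Section V.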

B. SHANNON ENTROPY
Entropy in chemistry and thermodynamics is a measure of the total amount of energy in a system that cannot do work [26]. In the beginning, it was used to represent disorder in a system. Shannon introduced the concept of entropy into the information field in 1948 [27]. He proposed Shannon entropy to express the uncertainty of a piece of information. Let P = (p_1, p_2, . . . , p_n) be a finite discrete probability distribution, where p_i is the probability corresponding to event i. Therefore, for every p_i ∈ P, p_i ≥ 0, with [27]

∑_{i=1}^{n} p_i = 1 (9)

Definition 6: On the basis of the above conditions, the Shannon entropy formula is represented as [27]:

H(P) = −∑_{i=1}^{n} p_i log_2 p_i

Through the formula, it is not difficult to find that the greater the probability of an event, the less uncertainty the information contains.

C. DENG ENTROPY
Recently, to measure the uncertainty of BPA, Deng proposed Deng entropy based on Shannon entropy [28]. He verified the superiority of Deng entropy in many aspects through examples.
Definition 7: Assuming that the system has generated a BPA, Deng entropy can be defined as [28]:

E_d(m) = −∑_{A⊆Θ} m(A) log_2 ( m(A) / (2^{|A|} − 1) )

where |A| is the number of elements of A. In order to simplify the operation, Deng rewrote Deng entropy from the perspective of nonspecificity and discord, which is represented as [28]:

E_d(m) = ∑_{A⊆Θ} m(A) log_2 (2^{|A|} − 1) − ∑_{A⊆Θ} m(A) log_2 m(A)

where ∑_{A⊆Θ} m(A) log_2 (2^{|A|} − 1) is the total nonspecificity of the BPA, and −∑_{A⊆Θ} m(A) log_2 m(A) is the measure of discord of the BPA.
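A short illustrative implementation of Deng entropy (not from the paper): when every focal element is a singleton, the 2^|A| − 1 term equals 1, and the measure reduces to Shannon entropy:

```python
import math

def deng_entropy(m):
    """Deng entropy: -sum over A of m(A) * log2( m(A) / (2^|A| - 1) )."""
    return -sum(v * math.log2(v / (2 ** len(A) - 1))
                for A, v in m.items() if v > 0)

# Singleton-only BPA: Deng entropy coincides with Shannon entropy (1 bit here).
m_singletons = {frozenset({"a"}): 0.5, frozenset({"b"}): 0.5}
# Full ignorance over {a, b}: Deng entropy is log2(2^2 - 1) = log2(3).
m_vacuous = {frozenset({"a", "b"}): 1.0}
```

The vacuous BPA m({a, b}) = 1 carries no mass assigned to singletons, yet its Deng entropy log2(3) exceeds the 1 bit of a uniform Bayesian distribution, reflecting nonspecificity.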

D. TSALLIS ENTROPY
In 1988, the Brazilian physicist Constantino Tsallis proposed an entropy expression with exponent q, the non-extensive entropy S_q [44]. Gao et al. used Tsallis entropy to measure uncertainty from the perspective of information theory [31].
Definition 8: The definition of Tsallis entropy can be represented as [44]:

S_q = (k / (q − 1)) (1 − ∑_{i=1}^{n} p_i^q)

where k is the Boltzmann constant. When q → 1, Tsallis entropy degenerates to Shannon entropy. Definition 9: The Tsallis-based method for measuring uncertainty proposed by Gao is defined in [31].
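Definition 8 can be checked numerically with a small sketch (ours, not the paper's; here k is set to 1, so the q → 1 limit is the Boltzmann-Gibbs form −∑ p_i ln p_i):

```python
import math

def tsallis_entropy(p, q, k=1.0):
    """Tsallis entropy S_q = k * (1 - sum of p_i^q) / (q - 1), q != 1."""
    return k * (1.0 - sum(x ** q for x in p)) / (q - 1.0)

p = [0.5, 0.5]
# As q -> 1, S_q approaches -k * sum p_i * ln p_i (= ln 2 for a fair coin).
near_shannon = tsallis_entropy(p, q=1.0 + 1e-8)
```

Taking q numerically close to 1 recovers ln 2 ≈ 0.6931 for a fair coin, illustrating the degeneration to the Shannon form.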

E. RENYI ENTROPY
Renyi entropy is an entropy model proposed by Alfred Renyi [37]. Renyi extended the probability distribution of Shannon entropy and introduced an exponential parameter θ (to distinguish it from the exponential parameter in Tsallis entropy, we use θ to replace the parameter q in Renyi entropy). Definition 10: The basic formula of Renyi entropy can be represented as [37]:

H_θ = (1 / (1 − θ)) log_2 ( ∑_{i=1}^{n} p_i^θ )

where θ is the exponential coefficient of Renyi entropy. Similar to Tsallis entropy, if θ → 1, Renyi entropy degenerates to Shannon entropy. To simplify the operation, we can rewrite Renyi entropy as Eq.16 [31].
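Definition 10 can likewise be sketched in code (our illustration). For a uniform distribution, H_θ = log2(n) for every θ, and taking θ numerically close to 1 approximates Shannon entropy:

```python
import math

def renyi_entropy(p, theta):
    """Renyi entropy H_theta = log2( sum of p_i^theta ) / (1 - theta), theta != 1."""
    return math.log2(sum(x ** theta for x in p)) / (1.0 - theta)

# Uniform over 4 outcomes: H_theta = log2(4) = 2 bits for any theta.
h2 = renyi_entropy([0.25, 0.25, 0.25, 0.25], theta=2.0)
# As theta -> 1, Renyi entropy approaches Shannon entropy of the distribution.
h_near_1 = renyi_entropy([0.9, 0.1], theta=1.0 + 1e-8)
```

For the skewed distribution (0.9, 0.1), the θ → 1 value matches its Shannon entropy of about 0.469 bits.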

III. A NEW METHOD FOR MEASURING UNCERTAINTY

A. MEASURE UNCERTAINTY BASED ON RENYI ENTROPY
Shannon introduced entropy theory into the field of information, which broadened the research on uncertainty measurement [27]. Since Shannon entropy was proposed, entropy theory has been widely developed and improved [45]- [49]. However, in Dempster-Shafer evidence theory, how to calculate the uncertainty of BPA is still an open problem. Therefore, based on Renyi entropy, we propose a novel approach to handle the uncertainty of BPA. Suppose there is a frame of discernment Θ of size n, and each subset is assigned a value by a BPA. In Dempster-Shafer evidence theory, the probability assignment of the empty set is not considered, so each BPA has at most 2^n − 1 possible potential states. Therefore, we divide each BPA by 2^n − 1 to represent its uncertainty measure. In addition, we replace the value of the exponent parameter θ in Renyi entropy with the cardinality of each BPA. Thus, the method can be defined as in Eq.17. It should be noted that when n = 1, R(m) degenerates into Shannon entropy. The proof of this property is as follows:

B. THE PROPERTY OF PROPOSED MODEL
In this section, we will discuss the specific properties of Renyi entropy model.
Proof: Analyzing Eq.17, it is easy to see that when Θ = {A_1, A_2, . . . , A_n}, that is to say, when all the elements in Θ are single-proposition sets, the Renyi entropy model degenerates to Shannon entropy.
Proof: The form of the degenerate model is the same as that of Shannon entropy. Therefore, in Dempster-Shafer evidence theory, the uncertainty model we propose based on Renyi entropy is compatible with Shannon entropy. Proof: Let x = |A|. When m(A) = 1, the model reduces to a function g(x) of x alone. Differentiating g(x), it is easy to see that g'(x) > 0. Therefore, the function increases monotonically on the interval [2, +∞), and the minimum value of g(x) is g(2).
Therefore, R(m) does not satisfy set consistency. Theorem 4 (Nonadditivity): Let X and Y be two different frames of discernment. If the measure satisfies additivity, then R(m(X × Y)) = R(m(X)) + R(m(Y)). Example: Suppose there are two frames of discernment X = {x_1, x_2}, Y = {y_1, y_2, y_3} and the system has generated four BPAs: m(X) = 1, m(Y) = 0.7, m(y_1) = 0.2, m(y_2, y_3) = 0.1. Computing m(X × Y) and using the Renyi entropy model to calculate the uncertainty of the BPAs, it is easy to see that R(m(X)) + R(m(Y)) ≠ R(m(X × Y)). Therefore, the model proposed in this paper does not satisfy additivity.
Example: The example in Theorem 4 also applies to this theorem. Therefore, the model based on Renyi entropy does not satisfy subadditivity.

IV. EXAMPLES AND DISCUSSION
In this section, we will use a large number of calculation examples to verify the effectiveness of Renyi entropy model. In addition, at the end of this section, we will compare the proposed method with other methods and existing entropy.
From the results, we can find that for BPA of a single subset proposition, if its probability of occurrence is 1, then the amount of information it contains is 0. The uncertainty of BPA measurement based on Renyi entropy produces a common-sense result, and this example also verifies the non-negativity of our method.
By analyzing Examples 4.1 and 4.2, it can be found that when each BPA is a single-subset proposition, Deng entropy, Gao's method, and the method proposed in this paper are equal to Shannon entropy. In addition, these examples verify the probability consistency of R(m) and also show that Dempster-Shafer evidence theory is an extension of Bayesian probability theory.
We can find that under the same frame of discernment, the result of this example is greater than that of Example 4.2, yet our method remains reasonable. This is because when a BPA is not a single-subset proposition, the probability distribution no longer conforms to Bayesian probability theory. On the one hand, our proposed method considers the cardinality of BPA, and the more information a BPA contains, the greater its uncertainty. On the other hand, the elements in a BPA may interact with each other, which also increases the uncertainty. Therefore, the results obtained by our method are reasonable. Since the elements are mutually exclusive, the uncertainty should vary linearly. In Fig. 1, it is easy to see that Gao's method changes exponentially, because it has a coefficient 2^x − 1, while Deng entropy changes linearly, and our method is very close to Deng entropy. It should be noted that R(m) increases linearly with the cardinality of the uncertain information sources; in other words, the uncertainty of BPA increases linearly with the parameter θ. However, Gao's method changes exponentially, which is inconsistent with reality. Therefore, from this point of view, it is reasonable to use the cardinality of BPA to replace the parameter θ in the Renyi entropy model.
From the results, we can find that when a BPA assigns probability to multi-subset propositions, R(m) can also effectively measure their uncertainty. Since information in the real world is usually not a single proposition, this shows that R(m) is general. It should be noted that the value of R(m) is large in some cases. Table 2 lists all the calculation results. The analysis shows that Gao's method increases exponentially, which may cause a problem: the amount of information contained is not very large, but the measured uncertainty is very great. However, Deng entropy and our method grow linearly, which accords with intuition. It should also be noted that the value of R(m) is greater than the value of Deng entropy. This is because we use the cardinality of BPA to replace the parameter θ in the Renyi entropy model, which is more reasonable.

B. COMPARATIVE ANALYSIS
In this section, we will compare the proposed method with other methods and existing entropies. Table 3 lists some of the existing uncertainty measures of BPA.
Since Shannon introduced entropy into the field of information, using information entropy to calculate uncertainty has been extensively considered by researchers. In the past decades, many researchers have proposed different measurement methods to measure uncertainty, but not all of them are effective.
For example, Dubois and Prade's method cannot satisfy probability consistency. However, in Dempster-Shafer evidence theory, when each BPA is a single-subset proposition, the theory degenerates into Bayesian probability theory; therefore, a proposed method should also satisfy probability consistency. In addition, uncertainty measures are used to calculate the amount of information. Common sense tells us that the results obtained from a well-founded model should be non-negative, but some methods do not perform well in this respect.
From the previous analysis results, although Gao's method has excellent properties, when the size of BPA is large, the result of this model increases exponentially, which will lead to a huge increase in uncertainty.
Through the above discussion, we know that the proposed method is compatible with Shannon entropy and also satisfies probability consistency and non-negativity. In terms of measuring uncertainty, the model grows linearly and produces results that do not violate common sense. Therefore, compared with some other methods, the method based on Renyi entropy shows superiority [50].

C. ADDRESS THE INCOMPLETE INFORMATION
In the real world, the information we collect is usually incomplete. In this case, if we use evidence theory for information fusion, we may get results contrary to common sense. Therefore, researchers put forward the generalized evidence theory [5] to solve this problem. Generalized evidence theory allows the allocation of probability to empty sets, which is also an important standard to measure the integrity of information. However, how to measure the uncertainty of incomplete information is still an open problem. Therefore, in this subsection, we will discuss this issue.
Example 4.7: Suppose there is a frame of discernment and a BPA.
where 0 ≤ a ≤ 0.5. We do not know whether the information contains element d, so it is incomplete. When calculating the uncertainty of m(∅), since the information may or may not contain d, the parameter n in Eq.17 should be set to the maximum length of the information source, which is equal to the length of the frame of discernment. Fig. 3 shows the variation of uncertainty with the parameter a: the uncertainty decreases significantly as a increases, which is consistent with common sense. When the value of a is very small, the probability assigned to the empty set is very large; the greater the value of m(∅), the greater the degree of incompleteness of the information, and therefore the greater the uncertainty. On the contrary, when the value of a is large, the probability assigned to the empty set is very small, the completeness of the information increases, and the uncertainty gradually decreases.
Example 4.8: In this example, we consider a completely incomplete situation, in which we do not know the information source and cannot determine the probability distribution. Suppose there is a frame of discernment and a BPA, defined as follows, where X is defined in Table 1. Fig. 4 shows this situation intuitively. When the information source contains more information and the value of a is very small, the uncertainty of the information is very large: the greater the information volume, the greater the probability that the information is incomplete, and the smaller the value of a, the higher the degree of incompleteness. On the contrary, when the information source contains less information and the value of a is large, the uncertainty of the information is small: the less information there is, the lower the possibility of it being incomplete, and the greater the value of a, the more complete the information, thus reducing its uncertainty.
Examples 4.7 and 4.8 discuss measuring the uncertainty of incomplete information with the proposed method R(m). We obtain satisfactory answers in both examples, which shows that our method can not only measure the uncertainty of complete information but also effectively measure the uncertainty of incomplete information, demonstrating the superiority of R(m).

V. APPLICATION
In this section, we will illustrate the effectiveness of the proposed method through an application of R(m) in decision-making. The flowchart of the model is shown in Fig. 5.

Step 1: Determine the frame of discernment by sensors, and then generate the basic probability assignments to obtain the evidence sources.
Step 2: Calculate the weight of each piece of evidence. First, the uncertainty of each basic probability assignment is calculated by R(m); then the weight of each piece of evidence can be obtained. The weight calculation formula is as follows.
Step 3: Modify the basic probability assignments according to the weight of each piece of evidence. The formula is as follows.

Step 4: Use Eq.7 to combine the new basic probability assignment t − 1 times, where t is the number of pieces of evidence.
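Steps 1 to 4 can be sketched in code. This is only an illustration of the pipeline's structure: the paper's exact weight and modification formulas and its R(m) measure are not reproduced here, so the sketch uses Deng entropy as a loudly-labeled stand-in uncertainty measure and normalized-uncertainty weights as an assumed weighting scheme:

```python
import math

def deng_entropy(m):
    """Stand-in uncertainty measure (the paper uses its Renyi-based R(m))."""
    return -sum(v * math.log2(v / (2 ** len(A) - 1))
                for A, v in m.items() if v > 0)

def dempster_combine(m1, m2):
    """Dempster's rule (Eq.7): normalized conjunctive combination."""
    K = sum(v1 * v2
            for B, v1 in m1.items() for C, v2 in m2.items() if not (B & C))
    out = {}
    for B, v1 in m1.items():
        for C, v2 in m2.items():
            A = B & C
            if A:
                out[A] = out.get(A, 0.0) + v1 * v2 / (1.0 - K)
    return out

def fuse(evidences, uncertainty=deng_entropy):
    # Step 2: weight each piece of evidence by its normalized uncertainty
    # (an assumed scheme; the paper's own weight formula is not shown here).
    u = [uncertainty(m) for m in evidences]
    w = [x / sum(u) for x in u]
    # Step 3: modify the BPAs into a single weighted-average BPA.
    keys = set().union(*evidences)
    avg = {A: sum(wi * m.get(A, 0.0) for wi, m in zip(w, evidences))
           for A in keys}
    # Step 4: combine the modified BPA with itself t - 1 times.
    result = avg
    for _ in range(len(evidences) - 1):
        result = dempster_combine(result, avg)
    return result

# Hypothetical evidences over the frame {a, b, c}; the second conflicts with
# the others. These numbers are ours, not the paper's example.
a, b, c = frozenset({"a"}), frozenset({"b"}), frozenset({"c"})
evidences = [{a: 0.6, b: 0.2, c: 0.2},
             {b: 0.8, a: 0.1, c: 0.1},
             {a: 0.7, b: 0.2, c: 0.1}]
fused = fuse(evidences)
```

Averaging before repeated combination follows the weighted-average fusion structure described in Steps 3 and 4; the fused masses always renormalize to 1.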
Example: Suppose that the sensors have generated a frame of discernment Θ = {a, b, c}. The detailed calculation process is as follows. Step 1: Based on the collected evidence, we obtain the basic probability assignments. To better reflect the advantages of our proposed method, we compare it with several other data fusion methods; the results are shown in Table 4. In this case, m_2 is a highly conflicting piece of evidence. By analyzing the experimental results, it is easy to see that when the evidences are highly conflicting, the traditional Dempster-Shafer fusion method produces results that violate common sense.
Yager, Zhang, and Deng considered the weight of evidence in their methods, which overcomes the problem that the Dempster-Shafer fusion rule cannot effectively combine conflicting evidence. However, these three methods do not take the uncertainty of evidence into account; they combine the weighted evidence into a single new piece of evidence, which is unreasonable.
Therefore, in the process of information fusion, we use the uncertainty of evidence to determine its weight. Highly conflicting evidence receives a greater weight due to its great uncertainty, which also reflects the impact of conflicting evidence on the fusion results. In addition, information in the real world is always full of uncertainty. Hence, our method is universal and reasonable.

VI. CONCLUSION
Because of its weaker constraints, Dempster-Shafer evidence theory has been extensively developed in a number of fields. Nevertheless, how to measure the uncertainty of BPA in Dempster-Shafer evidence theory is still an open issue. Shannon first developed Shannon entropy to measure uncertainty. After that, many researchers proposed different methods based on Shannon entropy to deal with uncertainty, such as Tsallis entropy proposed by Tsallis, Dubois and Prade's weighted Hartley entropy, Deng entropy proposed by Deng, and many other methods. However, not all of these methods are reasonable. Renyi entropy is an entropy proposed by Alfred Renyi on the basis of Shannon entropy. Renyi added a parameter θ to the model, which makes Renyi entropy more flexible and better able to measure the uncertainty of general information. Because of its flexibility and generality, Renyi entropy is widely used in image segmentation and in the analysis of probability distributions.
Therefore, based on Renyi entropy, a new measurement uncertainty method is proposed. Through strict mathematical deduction, it can be proved that our method based on Renyi entropy is compatible with Shannon entropy and broadens the value boundary of Shannon entropy. Furthermore, this method can satisfy the non-negativity and probability consistency, so it will not produce abnormal results. In addition, we also apply the proposed method to decision-making. The example shows that the result is more reasonable after considering the uncertainty of evidence.
However, the proposed method also has some limitations. The form of the formula is complex, and the readability is poor. In addition, we do not consider the differences between evidences, which may produce errors in the process of decision-making.
This paper develops a novel approach to measuring uncertainty based on Renyi entropy. Renyi proposed Renyi entropy on the basis of Shannon entropy; due to the introduction of the order parameter θ, Renyi entropy has very flexible properties. In addition, Renyi also proposed Renyi divergence based on Kullback-Leibler divergence, which can represent the difference between probability distributions. At present, there has been little work applying Renyi entropy and Renyi divergence to Dempster-Shafer evidence theory. In addition, the Pythagorean fuzzy set is an effective way to deal with uncertainty and an extension of the intuitionistic fuzzy set. Pythagorean fuzzy sets can be applied well in the field of decision-making; however, they lack tools to express randomness and probability information. Therefore, in future work, the combination of Renyi entropy and Pythagorean fuzzy sets is a project worthy of research. Moreover, we will continue to explore the properties of these models, combine their advantages with Dempster-Shafer evidence theory, and make more contributions to the field of information fusion.