A New Bound for the Jensen Gap With Applications in Information Theory

In this manuscript, we adopt a novel approach to derive a new bound for the Jensen gap for functions whose second derivatives, in absolute value, are convex. We present two numerical experiments that verify the main result and discuss the tightness of the bound. We then utilize the bound to derive two new converses of the Hölder inequality and a bound for the Hermite-Hadamard gap. Finally, we demonstrate applications of the main result to various divergences in information theory, and we present a numerical example that verifies the bound for the Shannon entropy.


I. INTRODUCTION AND PRELIMINARIES
The field of mathematical inequalities and their applications has grown significantly over the last three decades, with considerable impact on various areas of science such as engineering [12], economics [25], mathematical statistics [24], the qualitative theory of integral and differential equations [21], and information theory and coding [16], [18]. Notably, many innovative ideas about mathematical inequalities and their applications in various areas of science can be developed from convexity [3], [4], [9], [11], [14], [27], [29], [32], [33]. One of the most important inequalities for convex functions is the Jensen inequality, which generalizes the defining inequality of classical convexity. This inequality is of pivotal importance because other classical inequalities, for example the Hermite-Hadamard, Hölder, Ky Fan, Beckenbach-Dresher, Minkowski, arithmetic-geometric mean and Young inequalities, can be deduced from it. An extensive literature exists on estimates for the Jensen gap and their applications in many branches of science [1]-[8], [12], [15], [20], [24]-[26], [28]. In this manuscript, we present a new bound as an estimate for the Jensen gap.
In the following theorem, the Jensen integral inequality is presented [17], stated as (1). For deriving the main result, we need the Green function defined on [ω₁, ω₂] × [ω₁, ω₂] given in [22], stated as (2); it is convex with respect to both variables s and x. Moreover, the following identity, related to the Green function (2), holds for any function T ∈ C²[ω₁, ω₂] [22], stated as (3).

We organize the remainder of the paper as follows. In Section II, we present the main result, followed by a remark, two numerical experiments, a proposition, two corollaries and another remark, which completes the section. The numerical experiments confirm the tightness of the bound presented in the main result, Proposition 1 presents a converse of the Hölder inequality, and the corollaries give a further converse of the Hölder inequality and a bound for the Hermite-Hadamard gap. In Section III, we present applications of the main result to various divergences in information theory, and Section IV contains concluding remarks.
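For the reader's convenience, we record the classical forms that (1)-(3) are expected to take; these are the standard statements of the Jensen integral inequality and the Green-function identity found in the literature (cf. [17], [22]), and the precise weighted variants used in the paper may differ slightly:

```latex
% Jensen integral inequality (standard form, cf. (1)): for a probability
% measure \mu on [a,b], an integrable h with values in [\omega_1,\omega_2],
% and a convex T on [\omega_1,\omega_2],
\[
  T\!\left(\int_a^b h(y)\,d\mu(y)\right)
  \;\le\;
  \int_a^b T\!\bigl(h(y)\bigr)\,d\mu(y).
\]

% Green function on [\omega_1,\omega_2]^2 (cf. (2)):
\[
  G(s,x) =
  \begin{cases}
    \dfrac{(s-\omega_2)(x-\omega_1)}{\omega_2-\omega_1}, & \omega_1 \le x \le s,\\[2mm]
    \dfrac{(x-\omega_2)(s-\omega_1)}{\omega_2-\omega_1}, & s \le x \le \omega_2.
  \end{cases}
\]

% Identity for T \in C^2[\omega_1,\omega_2] (cf. (3)):
\[
  T(x) = \frac{\omega_2-x}{\omega_2-\omega_1}\,T(\omega_1)
       + \frac{x-\omega_1}{\omega_2-\omega_1}\,T(\omega_2)
       + \int_{\omega_1}^{\omega_2} G(x,s)\,T''(s)\,ds.
\]
```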

II. MAIN RESULT
In the following theorem, we present a new bound (4) for the Jensen gap for functions whose second derivatives, in absolute value, are convex.

Proof: Applying the identity (3), at the function values h(y) and at the mean h̄, yields (5) and (6). Subtracting (6) from (5), we obtain (7). Taking the absolute value of (7), we have (8). Using a change of variable, (8) becomes (9). Since |T″| is a convex function, (9) takes the form (10). Now, by using another change of variable, we obtain (11). Replacing h(y) by h̄ in (11), we get (12). Also, we have (13), and replacing h(y) by h̄ in (13), we get (14). Substituting the values from (11)-(14) into (10) and simplifying, we get the result (4).

Remark 1: If we use the Green functions G₂-G₅ given in [22] instead of G in Theorem 2, we obtain the same result (4).

Now we demonstrate some numerical experiments to show the tightness of the bound (4).
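As a quick illustration of the quantity being bounded (not of the bound (4) itself, whose explicit form is given in the theorem), the following sketch numerically evaluates the Jensen gap for a sample pair T, h; the function name jensen_gap and the choices T(x) = x⁴ (whose second derivative 12x² is convex in absolute value) and h(y) = y are our own illustrative assumptions:

```python
import numpy as np

def jensen_gap(T, h, a=0.0, b=1.0, n=100_001):
    """Approximate the Jensen gap  mean(T(h)) - T(mean(h))
    for the uniform probability measure on [a, b], via a dense grid."""
    y = np.linspace(a, b, n)
    h_vals = h(y)
    return np.mean(T(h_vals)) - T(np.mean(h_vals))

# Example: T(x) = x**4 has T''(x) = 12*x**2, so |T''| is convex on [0, 1].
gap = jensen_gap(T=lambda x: x**4, h=lambda y: y)
print(f"Jensen gap for T(x) = x^4, h(y) = y on [0, 1]: {gap:.4f}")
# Expected ~ 0.1375, i.e. exactly 1/5 - (1/2)**4.
```

Any candidate upper bound for the Jensen gap can be checked against such direct evaluations, which is the spirit of the experiments below.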
Evaluating the bound (4) for the first experiment yields (15). Now, taking the right-hand side of inequality (5) in [13], we get (16). It is important to note that g(c) attains its minimum value at c = 0.5, namely g(0.5) ≈ 0.1458; thus from (16) we deduce (17), the estimate furnished by inequality (5) in [13]. Similarly, taking the right-hand side of inequality (8) in [13], we get (18). Now l(c) attains its minimum value at c = 0.5, namely l(0.5) ≈ 0.0833; thus from (18) we deduce (19), the estimate furnished by inequality (8) in [13]. From (15), (17) and (19), it can easily be concluded that the bound in (4) for the Jensen gap is better than the bounds (5) and (8) from [13]. Also, inequality (15) verifies the tightness of the bound in (4).

For the second experiment, evaluating the bound (4) yields (20). Taking the right-hand side of inequality (5) in [13], we get (21). Now g(c) attains its minimum value at c ≈ 0.31, namely g(0.31) ≈ 0.1536; thus from (21) we get (22), the estimate furnished by inequality (5) in [13]. Similarly, taking the right-hand side of inequality (8) in [13], we obtain (23). Now l(c) attains its minimum value at c ≈ 0.33, namely l(0.33) ≈ 0.1208; thus from (23) we get (24). Hence, from inequality (8) in [13], we get 0.0671 < 0.1208.
The inequalities in (20), (22) and (24) show that the bound in (4) for the Jensen gap is better than the bounds (5) and (8) from [13]. Also, inequality (20) confirms the tightness of the bound in (4) with respect to the Jensen gap.
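The minimizing values of c quoted above can be reproduced numerically. Since the explicit forms of g(c) and l(c) from [13] are not restated here, the sketch below uses a placeholder bound function; g_example, its quadratic form, and its minimum are illustrative assumptions, not the functions from [13]:

```python
from scipy.optimize import minimize_scalar

# Placeholder for the parameterized bound c -> g(c); the explicit g(c)
# from inequality (5) of [13] should be substituted here. This quadratic
# is purely illustrative, chosen to attain its minimum at c = 0.5 with
# value ~0.1458, matching the numbers quoted above.
def g_example(c):
    return (c - 0.5) ** 2 + 0.1458

res = minimize_scalar(g_example, bounds=(0.0, 1.0), method="bounded")
print(f"minimizing c ~ {res.x:.2f}, minimum value ~ {res.fun:.4f}")
```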
In the following proposition, we present a converse of the Hölder inequality, stated as (25), as an application of the above theorem.

Proof: Utilizing the inequality (27) in (26), we get (25).

In the following corollary, we demonstrate another converse of the Hölder inequality as an application of Theorem 2.

(iii): If p₁ < 0, we have 0 < p₂ < 1, which shows that this case is the reflection of case (ii). Therefore, replacing p₁, p₂, ζ₁, ζ₂ by p₂, p₁, ζ₂, ζ₁ in (29) leads to the result (30).

The following corollary proposes a bound for the Hermite-Hadamard gap as an application of Theorem 2.
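For context, the Hermite-Hadamard gap referred to above is the difference between the sides of the classical Hermite-Hadamard inequality, which for a convex function T on [ω₁, ω₂] reads as follows (this is the standard statement, not the corollary's bound itself):

```latex
\[
  T\!\left(\frac{\omega_1+\omega_2}{2}\right)
  \;\le\;
  \frac{1}{\omega_2-\omega_1}\int_{\omega_1}^{\omega_2} T(x)\,dx
  \;\le\;
  \frac{T(\omega_1)+T(\omega_2)}{2}.
\]
```

Taking h to be the identity map and the measure uniform on [ω₁, ω₂] turns the Jensen gap into the difference between the middle and left-hand expressions, which is presumably how the corollary follows from Theorem 2.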

III. APPLICATIONS IN INFORMATION THEORY
Information theory is the branch of science that deals with the quantification, storage and communication of information. Information is an abstract entity, so it cannot be quantified easily; a probability density function can be used to quantify the information about a certain event, and a divergence can measure the difference between two probability densities. Csiszár [22] introduced a divergence, now known as the Csiszár divergence, which is the basis for other divergences, for example the Kullback-Leibler divergence, the χ²-divergence and Jeffrey's divergence. Divergences have many applications in various fields of science and technology, for example pattern recognition [23], genetics [10], applied statistics [19], and signal processing and coding [31]. The Jensen inequality plays a vital role in deducing estimates for various divergences [16], [18], [20], [28]. In this section, we present some applications of our main result to various divergences.
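For reference, the standard definition of the Csiszár f-divergence and the generating functions of the special cases mentioned above are as follows (these are the usual forms from the literature; the paper's exact normalization may differ):

```latex
% Csiszar f-divergence of densities p, q, for a convex f with f(1) = 0:
\[
  D_f(p \,\|\, q) \;=\; \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx.
\]
% Special cases:
%   f(t) = t \log t        -> Kullback--Leibler divergence,
%   f(t) = (t-1)^2         -> \chi^2-divergence,
%   f(t) = (t-1)\log t     -> Jeffrey's divergence.
```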
Remark 3: It is important to note that discrete versions of the results presented in this manuscript can also be given.
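As a discrete illustration, in the spirit of the Shannon entropy example mentioned in the abstract (the concrete distribution below is our own choice, not taken from the paper), the following sketch computes the Shannon entropy of a distribution and the corresponding discrete Jensen gap log n − H(p), which a discrete version of (4) would bound from above:

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i) in nats, skipping zeros."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# An illustrative probability distribution on n = 4 points.
p = np.array([0.5, 0.25, 0.15, 0.10])
H = shannon_entropy(p)
gap = np.log(len(p)) - H  # discrete Jensen gap for the concave log; always >= 0
print(f"H(p) = {H:.4f} nats, Jensen gap log n - H(p) = {gap:.4f}")
```

By the concavity of the logarithm, H(p) ≤ log n with equality for the uniform distribution, so the printed gap is nonnegative and vanishes exactly when p is uniform.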

IV. CONCLUDING REMARKS
A growing interest in applying the notion of convexity to various fields of science has been recorded in the last few decades. Convex functions possess useful properties related to differentiability, monotonicity and continuity, which facilitate their applications. The Jensen inequality generalizes the defining inequality of classical convexity, and this inequality and the results around its gap resolve some difficulties in the modeling of physical phenomena. In this work, we have derived a new bound for the integral version of the Jensen gap involving functions whose second derivative is convex in absolute value. Based on this bound, we have deduced converses of the Hölder inequality, and a bound for the Hermite-Hadamard gap has also been obtained. Finally, we have demonstrated bounds for the Csiszár, Jeffrey's and Kullback-Leibler divergences, among others, as applications of the main result in information theory. The numerical experiments demonstrated in Section II not only confirm the sharpness of the Jensen inequality but also the tightness of the bound in (4) with respect to the Jensen gap. An application of the main result to the Shannon entropy has also been discussed through a numerical example, which verifies the bound for the Shannon entropy. It is also worth noting that bounds on such divergences can be applied in signal processing, magnetic resonance image analysis, pattern recognition and image segmentation. We hope the proposed idea will stimulate further research in the area of mathematical inequalities.