Burbea, J.; Rao, C. (1982) On the convexity of higher order Jensen differences based on entropy functions. IEEE Transactions on Information Theory, 28(6), pp. 961-963. ISSN 0018-9448
Full text not available from this repository.
Official URL: http://ieeexplore.ieee.org/xpls/abs_all.jsp?arnumb...
Related URL: http://dx.doi.org/10.1109/TIT.1982.1056573
Abstract
In an earlier work, the authors introduced a divergence measure, called the first-order Jensen difference or, in short, the 𝒥-divergence, which is based on entropy functions of degree α. This provided a generalization of the measure of mutual information based on Shannon's entropy (corresponding to α = 1). It was shown that the first-order 𝒥-divergence is a convex function only when α is restricted to some range. We define higher order Jensen differences and show that they are convex functions only when the underlying entropy function is of degree two. A statistical application requiring the convexity of higher order Jensen differences is indicated.
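For reference, a minimal sketch of the quantities named in the abstract, assuming the Havrda–Charvát normalization of the degree-α entropy; the notation here is illustrative and may differ from the authors' own conventions.

$$
H_\alpha(p) \;=\; \frac{1}{\alpha - 1}\Bigl(1 - \sum_{i} p_i^{\alpha}\Bigr), \qquad \alpha > 0,\ \alpha \neq 1,
$$

which reduces to Shannon's entropy $H_1(p) = -\sum_i p_i \log p_i$ as $\alpha \to 1$. The first-order Jensen difference of distributions $p^1, \dots, p^n$ with weights $\lambda_1, \dots, \lambda_n$ ($\lambda_i \ge 0$, $\sum_i \lambda_i = 1$) is the concavity gap

$$
\mathcal{J}_\alpha(p^1, \dots, p^n; \lambda) \;=\; H_\alpha\Bigl(\sum_{i} \lambda_i p^i\Bigr) \;-\; \sum_{i} \lambda_i H_\alpha(p^i).
$$

With Shannon's entropy and $\lambda$ taken as the distribution of a mixing variable, this gap equals the mutual information between the mixing variable and the outcome, which is the generalization referred to in the abstract.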
| Field | Value |
|---|---|
| Item Type | Article |
| Source | Copyright of this article belongs to IEEE. |
| ID Code | 71852 |
| Deposited On | 28 Nov 2011 04:10 |
| Last Modified | 28 Nov 2011 04:10 |