Gelfand–Yaglom–Perez theorem for generalized relative entropy functionals

Dukkipati, Ambedkar; Bhatnagar, Shalabh; Narasimha Murty, M. (2007) Gelfand–Yaglom–Perez theorem for generalized relative entropy functionals. Information Sciences, 177 (24), pp. 5707–5714. ISSN 0020-0255.

Full text not available from this repository.

Official URL: http://doi.org/10.1016/j.ins.2007.07.017

Abstract

The measure-theoretic definition of Kullback–Leibler relative-entropy (or simply KL-entropy) plays a basic role in defining various classical information measures on general spaces. Entropy, mutual information and conditional forms of entropy can be expressed in terms of KL-entropy, and hence the properties of their measure-theoretic analogs follow from those of measure-theoretic KL-entropy. These measure-theoretic definitions are key to extending the ergodic theorems of information theory to non-discrete cases. A fundamental theorem in this respect is the Gelfand–Yaglom–Perez (GYP) theorem [M.S. Pinsker, Information and Information Stability of Random Variables and Processes, 1960, Holden-Day, San Francisco, CA (English ed., 1964, translated and edited by Amiel Feinstein), Theorem 2.4.2], which states that the measure-theoretic relative-entropy equals the supremum of relative-entropies over all measurable partitions. This paper states and proves the GYP-theorem for Rényi relative-entropy of order greater than one. Consequently, the result extends easily to Tsallis relative-entropy.
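For orientation, the identity in question can be written out explicitly; the formulation below uses standard notation assumed for illustration and is not quoted from the article. For probability measures $\mu \ll \nu$ on a measurable space $(X, \mathfrak{M})$, the classical GYP theorem states

$$
D(\mu \,\|\, \nu) \;=\; \int_X \ln\frac{d\mu}{d\nu}\, d\mu \;=\; \sup_{\{E_1,\dots,E_n\}} \sum_{k=1}^{n} \mu(E_k)\, \ln\frac{\mu(E_k)}{\nu(E_k)},
$$

where the supremum runs over all finite measurable partitions of $X$. In this notation, the extension proved in the paper is the analogous identity for the Rényi relative-entropy of order $\alpha > 1$,

$$
D_\alpha(\mu \,\|\, \nu) \;=\; \frac{1}{\alpha-1} \ln \int_X \left(\frac{d\mu}{d\nu}\right)^{\alpha} d\nu \;=\; \sup_{\{E_1,\dots,E_n\}} \frac{1}{\alpha-1} \ln \sum_{k=1}^{n} \mu(E_k)^{\alpha}\, \nu(E_k)^{1-\alpha},
$$

which, as the abstract notes, carries over to the Tsallis relative-entropy, since the Tsallis and Rényi forms are related by a monotone transformation.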

Item Type: Article
Source: Copyright of this article belongs to Elsevier B.V.
ID Code: 116561
Deposited On: 12 Apr 2021 06:50
Last Modified: 12 Apr 2021 06:50