The Australian Journal of Mathematical Analysis and Applications


ISSN 1449-5910  


Paper Information

Paper Title:

Refinement Inequalities Among Symmetric Divergence Measures


Inder Jeet Taneja

Departamento de Matemática,
Universidade Federal de Santa Catarina, 88.040-900
Florianópolis, SC, Brazil



There are three classical divergence measures in the literature on information theory and statistics: the Jeffreys-Kullback-Leibler J-divergence, the Sibson-Burbea-Rao Jensen-Shannon divergence, and Taneja's arithmetic-geometric mean divergence. These measures bear an interesting relationship to one another and are based on logarithmic expressions. Divergence measures such as the Hellinger discrimination, the symmetric χ²-divergence, and the triangular discrimination are not based on logarithmic expressions. All six divergence measures are symmetric with respect to the probability distributions. In this paper some interesting inequalities among these symmetric divergence measures are studied. Refinements of these inequalities are also given. Some inequalities due to Dragomir et al. [6] are also improved.
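For readers who wish to experiment, the six symmetric measures admit short closed-form definitions, sketched below from their standard forms in the information-theory literature. The script also checks numerically an inequality chain of the kind refined in the paper, (1/4)Δ(P,Q) ≤ I(P,Q) ≤ h(P,Q) ≤ (1/8)J(P,Q) ≤ T(P,Q) ≤ (1/16)Ψ(P,Q); the exact constants, the function names, and the random-trial harness are illustrative assumptions, not code from the paper.

```python
import math
import random

def j_divergence(p, q):
    """Jeffreys-Kullback-Leibler J-divergence: sum of (p_i - q_i) ln(p_i / q_i)."""
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in zip(p, q))

def jensen_shannon(p, q):
    """Sibson-Burbea-Rao Jensen-Shannon divergence I(P, Q)."""
    return 0.5 * sum(
        pi * math.log(2 * pi / (pi + qi)) + qi * math.log(2 * qi / (pi + qi))
        for pi, qi in zip(p, q)
    )

def ag_mean_divergence(p, q):
    """Taneja's arithmetic-geometric mean divergence T(P, Q)."""
    return sum(
        ((pi + qi) / 2) * math.log((pi + qi) / (2 * math.sqrt(pi * qi)))
        for pi, qi in zip(p, q)
    )

def hellinger(p, q):
    """Hellinger discrimination h(P, Q)."""
    return 0.5 * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q))

def symmetric_chi2(p, q):
    """Symmetric chi-square divergence Psi(P, Q)."""
    return sum((pi - qi) ** 2 * (pi + qi) / (pi * qi) for pi, qi in zip(p, q))

def triangular(p, q):
    """Triangular discrimination Delta(P, Q)."""
    return sum((pi - qi) ** 2 / (pi + qi) for pi, qi in zip(p, q))

def random_distribution(n, rng):
    """A random probability distribution with strictly positive entries."""
    w = [rng.uniform(0.05, 1.0) for _ in range(n)]
    s = sum(w)
    return [wi / s for wi in w]

if __name__ == "__main__":
    rng = random.Random(0)
    for _ in range(1000):
        p = random_distribution(4, rng)
        q = random_distribution(4, rng)
        chain = [
            triangular(p, q) / 4,
            jensen_shannon(p, q),
            hellinger(p, q),
            j_divergence(p, q) / 8,
            ag_mean_divergence(p, q),
            symmetric_chi2(p, q) / 16,
        ]
        # Each member of the chain should not exceed the next one.
        assert all(a <= b + 1e-12 for a, b in zip(chain, chain[1:]))
    print("inequality chain held on all trials")
```

Note that all six functions assume strictly positive probability masses; zero entries would make the logarithmic and χ²-type terms undefined, which is why the random sampler bounds entries away from zero.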


© 2004-2021 Austral Internet Publishing