Paper Title:
Refinement Inequalities Among Symmetric Divergence Measures
Author(s):
Inder Jeet Taneja
Departamento de Matemática,
Universidade Federal de Santa Catarina, 88.040-900
Florianópolis, SC, Brazil
taneja@mtm.ufsc.br
URL: http://www.mtm.ufsc.br/~taneja
Abstract:
There are three classical divergence measures in the literature
on information theory and statistics, namely, Jeffreys-Kullback-Leibler's
J-divergence, Sibson-Burbea-Rao's Jensen-Shannon divergence, and
Taneja's arithmetic-geometric mean divergence. These bear an
interesting relationship with one another and are based on logarithmic
expressions. The divergence measures like Hellinger discrimination,
symmetric χ²-divergence, and triangular discrimination
are not based on logarithmic expressions. These six divergence measures are
symmetric with respect to probability distributions. In this paper some
interesting inequalities among these symmetric divergence measures are studied.
Refinements of these inequalities are also given. Some inequalities due to
Dragomir et al. [6]
are also improved.
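For orientation, a brief sketch of the standard definitions of the six symmetric measures named above, in notation common in the literature (the symbols $J$, $I$, $T$, $h$, $\Psi$, $\Delta$ are the customary ones and may differ from the paper's own): for probability distributions $P=(p_1,\ldots,p_n)$ and $Q=(q_1,\ldots,q_n)$,

$$J(P\|Q)=\sum_{i=1}^{n}(p_i-q_i)\ln\frac{p_i}{q_i} \qquad \text{(J-divergence)}$$

$$I(P\|Q)=\frac{1}{2}\left[\sum_{i=1}^{n}p_i\ln\frac{2p_i}{p_i+q_i}+\sum_{i=1}^{n}q_i\ln\frac{2q_i}{p_i+q_i}\right] \qquad \text{(Jensen-Shannon divergence)}$$

$$T(P\|Q)=\sum_{i=1}^{n}\frac{p_i+q_i}{2}\ln\frac{p_i+q_i}{2\sqrt{p_iq_i}} \qquad \text{(arithmetic-geometric mean divergence)}$$

$$h(P\|Q)=\frac{1}{2}\sum_{i=1}^{n}\left(\sqrt{p_i}-\sqrt{q_i}\right)^2 \qquad \text{(Hellinger discrimination)}$$

$$\Psi(P\|Q)=\sum_{i=1}^{n}\frac{(p_i-q_i)^2(p_i+q_i)}{p_iq_i} \qquad \text{(symmetric } \chi^2\text{-divergence)}$$

$$\Delta(P\|Q)=\sum_{i=1}^{n}\frac{(p_i-q_i)^2}{p_i+q_i} \qquad \text{(triangular discrimination)}$$

The first three are the logarithmic measures referred to in the abstract; the last three do not involve logarithms, and all six are symmetric in $P$ and $Q$.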
Full Text PDF: