Paper's Title:
Refinement Inequalities Among Symmetric Divergence Measures
Author(s):
Inder Jeet Taneja
Departamento de Matemática,
Universidade Federal de Santa Catarina, 88.040-900
Florianópolis, Sc, Brazil
taneja@mtm.ufsc.br
URL: http://www.mtm.ufsc.br/~taneja
Abstract:
There are three classical divergence measures in the literature
on information theory and statistics, namely, Jeffreys-Kullback-Leibler's
J-divergence, Sibson-Burbea-Rao's Jensen-Shannon divergence and
Taneja's arithmetic-geometric mean divergence. These bear an
interesting relationship to one another and are based on logarithmic
expressions. Divergence measures such as the Hellinger discrimination,
the symmetric χ²-divergence, and the triangular discrimination
are not based on logarithmic expressions. These six divergence measures are
symmetric with respect to probability distributions. In this paper some
interesting inequalities among these symmetric divergence measures are studied.
Refinements of these inequalities are also given. Some inequalities due to
Dragomir et al. [6]
are also improved.
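All six measures named in the abstract have simple closed forms for discrete probability distributions. As an illustration only (these are the standard textbook definitions, not code from the paper), the following Python sketch computes each measure and checks, numerically for one pair of distributions, an inequality chain of the kind the paper refines; the constants 1/4, 1/8 and 1/16 are recalled from the literature and should be verified against the paper itself:

```python
import math

def _terms(p, q):
    # Pair up probabilities, skipping indices where either is zero
    # to avoid log(0) and division by zero.
    return [(pi, qi) for pi, qi in zip(p, q) if pi > 0 and qi > 0]

def j_divergence(p, q):
    # Jeffreys-Kullback-Leibler J-divergence: sum (p_i - q_i) ln(p_i / q_i)
    return sum((pi - qi) * math.log(pi / qi) for pi, qi in _terms(p, q))

def jensen_shannon(p, q):
    # Sibson-Burbea-Rao Jensen-Shannon divergence
    return 0.5 * sum(pi * math.log(2 * pi / (pi + qi))
                     + qi * math.log(2 * qi / (pi + qi))
                     for pi, qi in _terms(p, q))

def ag_mean(p, q):
    # Taneja arithmetic-geometric mean divergence
    return sum(((pi + qi) / 2) * math.log((pi + qi) / (2 * math.sqrt(pi * qi)))
               for pi, qi in _terms(p, q))

def hellinger(p, q):
    # Hellinger discrimination: (1/2) sum (sqrt(p_i) - sqrt(q_i))^2
    return 0.5 * sum((math.sqrt(pi) - math.sqrt(qi)) ** 2 for pi, qi in zip(p, q))

def sym_chi2(p, q):
    # Symmetric chi-square divergence: sum (p_i - q_i)^2 (p_i + q_i) / (p_i q_i)
    return sum((pi - qi) ** 2 * (pi + qi) / (pi * qi) for pi, qi in _terms(p, q))

def triangular(p, q):
    # Triangular discrimination: sum (p_i - q_i)^2 / (p_i + q_i)
    return sum((pi - qi) ** 2 / (pi + qi) for pi, qi in zip(p, q) if pi + qi > 0)

p = [0.1, 0.2, 0.3, 0.4]
q = [0.25, 0.25, 0.25, 0.25]

# Inequality chain as recalled from the literature (an assumption here;
# verify the exact constants against the paper):
#   (1/4) Delta <= I <= h <= (1/8) J <= T <= (1/16) Psi
chain = [triangular(p, q) / 4, jensen_shannon(p, q), hellinger(p, q),
         j_divergence(p, q) / 8, ag_mean(p, q), sym_chi2(p, q) / 16]
print(all(a <= b for a, b in zip(chain, chain[1:])))
```

All six functions are symmetric in their arguments and vanish when the two distributions coincide, which is the symmetry property the abstract refers to.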
Paper's Title:
Inequalities for Discrete F-Divergence Measures: A Survey of Recent Results
Author(s):
Sever S. Dragomir1,2
1Mathematics, School of Engineering
& Science
Victoria University, PO Box 14428
Melbourne City, MC 8001,
Australia
E-mail: sever.dragomir@vu.edu.au
2DST-NRF Centre of Excellence in the Mathematical and Statistical Sciences,
School of Computer Science & Applied Mathematics,
University of the Witwatersrand,
Private Bag 3, Johannesburg 2050,
South Africa
URL:
http://rgmia.org/dragomir
Abstract:
In this paper we survey some recent results obtained by the author in providing various bounds for the celebrated f-divergence measure for various classes of functions f. Several techniques, including inequalities of Jensen and Slater types for convex functions, are employed. Bounds in terms of the Kullback-Leibler distance, Hellinger discrimination and variation distance are provided. Approximations of the f-divergence measure by use of the celebrated Ostrowski and trapezoid inequalities are obtained. More accurate approximation formulae that make use of Taylor's expansion with integral remainder are also surveyed. A comprehensive list of recent papers by several authors related to this important concept in information theory is also included as an appendix to the main text.
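The f-divergence underlying these results has, for discrete distributions, the form D_f(P,Q) = Σ_i q_i f(p_i/q_i) for a convex f with f(1) = 0, and the measures named in the abstract arise from particular choices of f. A minimal Python sketch using the standard generators (not code from the paper):

```python
import math

def f_divergence(p, q, f):
    # Discrete f-divergence: D_f(P,Q) = sum_i q_i * f(p_i / q_i),
    # taken over indices with q_i > 0.
    return sum(qi * f(pi / qi) for pi, qi in zip(p, q) if qi > 0)

# Standard generators (each convex with f(1) = 0):
kl_f        = lambda t: t * math.log(t) if t > 0 else 0.0   # Kullback-Leibler distance
hellinger_f = lambda t: 0.5 * (math.sqrt(t) - 1.0) ** 2     # Hellinger discrimination
variation_f = lambda t: abs(t - 1.0)                        # variation distance

p = [0.1, 0.2, 0.3, 0.4]
q = [0.25, 0.25, 0.25, 0.25]

print(f_divergence(p, q, kl_f))         # Kullback-Leibler distance
print(f_divergence(p, q, variation_f))  # equals sum_i |p_i - q_i|
```

Choosing f convex with f(1) = 0 guarantees, by Jensen's inequality, that D_f(P,Q) ≥ 0 with equality when P = Q, which is the starting point for the bounds surveyed here.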
Paper's Title:
A Sum Form Functional Equation and Its Relevance in Information Theory
Author(s):
Prem Nath and Dhiraj Kumar Singh
Department of Mathematics
University of Delhi
Delhi - 110007
India
pnathmaths@gmail.com
dksingh@maths.du.ac.in
Abstract:
The general solutions of a sum form functional equation containing four unknown mappings have been investigated. The importance of these solutions in relation to various entropies in information theory has been emphasised.
Paper's Title:
Some Inequalities for a Certain Class of Multivalent Functions
Using Multiplier Transformation
Author(s):
K. Suchithra, B. Adolf Stephen, A. Gangadharan and S. Sivasubramanian
Department of Applied Mathematics
Sri Venkateswara College of Engineering
Sriperumbudur, Chennai - 602105,
India.
suchithravenkat@yahoo.co.in
Department of Mathematics,
Madras Christian College
Chennai - 600059,
India.
adolfmcc2003@yahoo.co.in
Department of Applied Mathematics
Sri Venkateswara College of Engineering
Sriperumbudur, Chennai - 602105,
India.
ganga@svce.ac.in
Department of Mathematics,
Easwari Engineering College
Ramapuram, Chennai - 600089,
India.
ganga@svce.ac.in
Abstract:
The object of the present paper is to derive several inequalities
associated with differential subordinations between analytic functions
and a linear operator defined for a certain family of p-valent
functions, which is introduced here by means of a family of extended
multiplier transformations. Some special cases and consequences of
the main results are also considered.