Record Details

Minimization Problems Based On A Parametric Family Of Relative Entropies

Electronic Theses of Indian Institute of Science

Title Minimization Problems Based On A Parametric Family Of Relative Entropies
 
Creator Ashok Kumar, M
 
Subject Information Theory
Information Geometry
Kullback-Leibler Divergence
Linear Entropy
Power-law Family
Tsallis Entropy
Pythagorean Property
Relative Entropy
Renyi Entropy
Exponential Family
Relative Entropy Minimization
Robust Statistics
Information Projection
Parametric Family
Relative Entropies
Computer Science
 
Description We study minimization problems with respect to a one-parameter family of generalized relative entropies. These relative entropies, which we call relative α-entropies (denoted I_α(P, Q)), arise as redundancies under mismatched compression when cumulants of compression lengths are considered instead of expected compression lengths. These parametric relative entropies are a generalization of the usual relative entropy (Kullback-Leibler divergence). Just like relative entropy, relative α-entropies behave like squared Euclidean distance and satisfy the Pythagorean property. We explore the geometry underlying various statistical models and its relevance to information theory and to robust statistics. The thesis consists of three parts.
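For concreteness, a commonly used parametrization of the relative α-entropy (the thesis's exact normalization may differ) is, for α > 0, α ≠ 1,
I_\alpha(P, Q) = \frac{\alpha}{1-\alpha} \log \sum_x p(x)\, q(x)^{\alpha-1} \;-\; \frac{1}{1-\alpha} \log \sum_x p(x)^{\alpha} \;+\; \log \sum_x q(x)^{\alpha},
which recovers the Kullback-Leibler divergence D(P \| Q) in the limit α → 1.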
In the first part, we study minimization of I (P; Q) as the first argument varies over a convex set E of probability distributions. We show the existence of a unique minimizer when the set E is closed in an appropriate topology. We then study minimization of I on a particular convex set, a linear family, which is one that arises from linear statistical constraints. This minimization problem generalizes the maximum Renyi or Tsallis entropy principle of statistical physics. The structure of the minimizing probability distribution naturally suggests a statistical model of power-law probability distributions, which we call an -power-law family. Such a family is analogous to the exponential family that arises when relative entropy is minimized subject to the same linear statistical constraints.
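As a numerical illustration only (not the algorithm developed in the thesis), the Python sketch below finds the I_α-projection of a reference distribution Q onto a linear family defined by a single moment constraint, using the parametrization displayed above; the value of α, the statistic f, the constraint level c, and Q itself are assumptions made for the example.

# Minimal numerical sketch: minimize I_alpha(P, Q) over the linear family
# {P : E_P[f] = c} by generic constrained optimization.  The alpha value,
# the statistic f, the level c, and the reference Q are illustrative only.
import numpy as np
from scipy.optimize import minimize

def relative_alpha_entropy(p, q, alpha):
    """I_alpha(P, Q) for strictly positive p, q on a finite alphabet."""
    a = alpha
    return ((a / (1.0 - a)) * np.log(np.sum(p * q ** (a - 1.0)))
            - (1.0 / (1.0 - a)) * np.log(np.sum(p ** a))
            + np.log(np.sum(q ** a)))

n = 6
alpha = 0.7                        # assumed value of the parameter alpha
q = np.full(n, 1.0 / n)            # assumed reference distribution Q (uniform)
f = np.arange(n, dtype=float)      # assumed statistic defining the linear family
c = 3.2                            # assumed constraint level E_P[f] = c

constraints = [
    {"type": "eq", "fun": lambda p: np.sum(p) - 1.0},  # P sums to one
    {"type": "eq", "fun": lambda p: p @ f - c},        # linear statistical constraint
]
result = minimize(relative_alpha_entropy, np.full(n, 1.0 / n),
                  args=(q, alpha), method="SLSQP",
                  bounds=[(1e-9, 1.0)] * n, constraints=constraints)

print("projection of Q onto the linear family:", np.round(result.x, 4))
print("I_alpha at the minimizer:", relative_alpha_entropy(result.x, q, alpha))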
In the second part, we study minimization of I_α(P, Q) over the second argument Q. This minimization is generally over parametric families such as the exponential family or the α-power-law family, and is of interest in robust statistics (α > 1) and in constrained compression settings (α < 1).
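In symbols, with the first argument P held fixed (in statistical applications, typically an empirical or data-generating distribution), this reverse projection is the problem min_{Q ∈ M} I_α(P, Q) over a parametric model M = {Q_θ}; for α > 1 this reading as an estimation criterion is what connects it to robust statistics.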
In the third part, we show an orthogonality relationship between the α-power-law family and an associated linear family. As a consequence, the minimization of I_α(P, ·), when the second argument comes from an α-power-law family, can be shown to be equivalent to a minimization of I_α(·, R), for a suitable R, where the first argument comes from a linear family. The latter turns out to be the simpler problem of minimizing a quasi-convex objective function subject to linear constraints. Standard techniques are available to solve such problems, for example via a sequence of convex feasibility problems, or via a sequence of such minimizations over simpler single-constraint linear families.
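The reduction to convex feasibility is the standard one for quasi-convex programs: because each sublevel set {P : I_α(P, R) ≤ t}, intersected with the linear family, is convex, one can bisect on the level t and solve a convex feasibility problem at each step until the optimal value is bracketed to the desired accuracy.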
 
Contributor Sundaresan, Rajesh
 
Date 2017-08-21T14:50:59Z
2017-08-21T14:50:59Z
2017-08-21
2015-05
 
Type Thesis
 
Identifier http://etd.iisc.ernet.in/handle/2005/2649
http://etd.ncsi.iisc.ernet.in/abstracts/3459/G26742-Abs.pdf
 
Language en_US
 
Relation G26742