Jensen's inequality

By definition LaTeX: \,\phi\, is convex if and only if

LaTeX: \phi(ta + (1-t)b) \leq t \phi(a) + (1-t) \phi(b)

whenever LaTeX: \,0 \leq t \leq 1\, and LaTeX: \,a, b\, are in the domain of LaTeX: \,\phi\,.
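
For example, a minimal numerical sanity check of this definition (assuming, purely for illustration, LaTeX: \,\phi = \exp\,, which is convex):

    # Check the convexity inequality for phi = exp on random inputs.
    # This is only a numerical illustration, not a proof.
    import numpy as np

    rng = np.random.default_rng(0)
    phi = np.exp
    for _ in range(1000):
        a, b = rng.normal(size=2)
        t = rng.uniform()
        # phi(t a + (1-t) b) <= t phi(a) + (1-t) phi(b), up to rounding
        assert phi(t*a + (1 - t)*b) <= t*phi(a) + (1 - t)*phi(b) + 1e-12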

It follows by induction on LaTeX: \,n\, that if LaTeX: \,t_j \geq 0\, for LaTeX: \,j = 1, 2, \ldots, n\, and LaTeX: \,\sum t_j = 1\, then


LaTeX: \phi(\sum t_j a_j) \leq \sum t_j \phi(a_j)           (1)
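
The induction step can be sketched as follows (when LaTeX: \,t_1 < 1\,; if LaTeX: \,t_1 = 1\, there is nothing to prove): the weights LaTeX: \,t_j/(1-t_1)\,, LaTeX: \,j = 2, \ldots, n\,, are nonnegative and sum to 1, so

LaTeX: \begin{array}{rl}\phi(\sum_{j=1}^n t_j a_j) &= \phi\left(t_1 a_1 + (1-t_1)\sum_{j=2}^n \frac{t_j}{1-t_1} a_j\right)\\
&\leq t_1 \phi(a_1) + (1-t_1)\,\phi\left(\sum_{j=2}^n \frac{t_j}{1-t_1} a_j\right)\\
&\leq t_1 \phi(a_1) + (1-t_1)\sum_{j=2}^n \frac{t_j}{1-t_1}\,\phi(a_j)\\
&= \sum_{j=1}^n t_j \phi(a_j)\end{array}

by the definition of convexity and then the inductive hypothesis.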


Jensen's inequality says this:
If LaTeX: \,\mu\, is a probability measure on LaTeX: \,X\,,
LaTeX: \,f\, is a real-valued integrable function on LaTeX: \,X\,, and
LaTeX: \,\phi\, is convex on the range of LaTeX: \,f\,, then


LaTeX: \phi(\int f d \mu) \leq \int \phi \circ f d \mu\qquad          (2)
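
For a concrete sanity check of (2) (assuming, purely for illustration, that LaTeX: \,\mu\, is Lebesgue measure on LaTeX: \,[0,1]\,, LaTeX: \,f(x) = x^2\, and LaTeX: \,\phi = \exp\,):

    # Approximate both sides of (2) on a fine grid of [0,1].
    # mu = Lebesgue measure on [0,1], f(x) = x^2, phi = exp are
    # illustrative choices only.
    import numpy as np

    x = np.linspace(0.0, 1.0, 100001)
    f = x**2
    phi = np.exp

    lhs = phi(f.mean())       # phi( integral of f d mu ), about exp(1/3) ~ 1.396
    rhs = phi(f).mean()       # integral of (phi o f) d mu, about 1.463
    print(lhs, rhs, lhs <= rhs)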


Proof 1: By some limiting argument we can assume that LaTeX: \,f\, is simple (this limiting argument is the missing detail).
That is, LaTeX: \,X\, is the disjoint union of LaTeX: \,X_1, \ldots, X_n\, and LaTeX: \,f\, is constant on each LaTeX: \,X_j\,.

Say LaTeX: \,t_j=\mu(X_j)\, and LaTeX: \,a_j\, is the value of LaTeX: \,f\, on LaTeX: \,X_j\,. Then (1) and (2) say exactly the same thing. QED.
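
A sketch of that reduction in the same illustrative setting as above (LaTeX: \,\mu =\, Lebesgue measure on LaTeX: \,[0,1]\,, LaTeX: \,f(x)=x^2\,, LaTeX: \,\phi=\exp\,): partition LaTeX: \,[0,1]\, into LaTeX: \,n\, equal pieces LaTeX: \,X_j\,, so LaTeX: \,t_j = 1/n\,, and take LaTeX: \,a_j\, to be the value of LaTeX: \,f\, at the midpoint of LaTeX: \,X_j\,. Inequality (1) holds for every LaTeX: \,n\,, and both sides tend to the corresponding sides of (2):

    # Simple-function approximation of f on n equal pieces of [0,1];
    # (1) holds for each n and both sides converge to those of (2).
    import numpy as np

    phi = np.exp
    f = lambda x: x**2
    for n in (4, 16, 64, 256):
        mid = (np.arange(n) + 0.5) / n   # midpoint of X_j
        t = np.full(n, 1.0 / n)          # t_j = mu(X_j)
        a = f(mid)                       # value of the simple function on X_j
        lhs = phi(np.dot(t, a))          # phi( sum t_j a_j )
        rhs = np.dot(t, phi(a))          # sum t_j phi(a_j)
        print(n, lhs, rhs, lhs <= rhs)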


Proof 2: The lemma shows that LaTeX: \,\phi\, has a right-hand derivative at every point and that the graph of LaTeX: \,\phi\, lies above the "tangent" line through any point on the graph whose slope equals the right-hand derivative at that point.

Say LaTeX: \,a = \int f d \mu\,, let LaTeX: \,m\, be the right-hand derivative of LaTeX: \,\phi\, at LaTeX: \,a\,, and let

LaTeX: \,L(t) = \phi(a) + m(t-a)\,

The comment above says that LaTeX: \,\phi(t) \geq L(t)\, for all LaTeX: \,t\, in the domain of LaTeX: \,\phi\,. So

LaTeX: \begin{array}{rl}\int \phi \circ f &\geq \int L \circ f\\
&= \int (\phi(a) + m(f - a))\\
&= \phi(a) + (m \int f) - ma\\
&= \phi(a)\\
&= \phi(\int f)\end{array}
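
A numerical sketch of this argument, again with the illustrative choices LaTeX: \,\mu =\, Lebesgue measure on LaTeX: \,[0,1]\,, LaTeX: \,f(x)=x^2\, and LaTeX: \,\phi=\exp\, (the right-hand derivative is estimated by a small forward difference):

    # Check that the supporting line L stays below phi and that
    # integrating L o f recovers phi(a), matching the chain above.
    import numpy as np

    phi = np.exp
    f = lambda x: x**2
    x = np.linspace(0.0, 1.0, 100001)

    a = f(x).mean()                        # a = integral of f d mu (about 1/3)
    h = 1e-6
    m = (phi(a + h) - phi(a)) / h          # estimate of the right-hand derivative
    L = lambda t: phi(a) + m * (t - a)     # supporting line at a

    t = np.linspace(-3.0, 3.0, 1001)
    print(np.all(phi(t) >= L(t) - 1e-9))   # graph of phi lies above the line
    print(L(f(x)).mean(), phi(a))          # integral of L o f equals phi(a)
    print(phi(f(x)).mean() >= phi(a))      # which is inequality (2)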

- D. Ullrich
