Jensen's inequality
By definition <math>\,\phi\,</math> is convex if and only if

<math>\phi(ta + (1-t)b) \leq t \phi(a) + (1-t) \phi(b)</math>

whenever <math>\,0 \leq t \leq 1\,</math> and <math>\,a\,, b\,</math> are in the domain of <math>\,\phi\,</math>.

It follows by induction on <math>\,n\,</math> that if <math>\,t_j \geq 0\,</math> for <math>\,j = 1, 2,\ldots, n\,</math> , <math>\,\sum t_j = 1\,</math> , and the <math>\,a_j\,</math> are in the domain of <math>\,\phi\,</math> , then

<br><math>\phi(\sum t_j a_j) \leq \sum t_j \phi(a_j) </math> (1)

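For instance, (1) is easy to test numerically; the short sketch below assumes <math>\,\phi(x)=e^x\,</math> and randomly drawn weights, choices made only for the illustration.

<pre>
import math, random

# Check the finite Jensen inequality (1) for a sample convex phi.
phi = math.exp                                         # convex function (example choice)

random.seed(0)
for _ in range(1000):
    n = random.randint(2, 6)
    a = [random.uniform(-3.0, 3.0) for _ in range(n)]  # points a_j
    w = [random.random() for _ in range(n)]
    t = [wj / sum(w) for wj in w]                      # t_j >= 0, sum t_j = 1
    lhs = phi(sum(tj * aj for tj, aj in zip(t, a)))    # phi(sum t_j a_j)
    rhs = sum(tj * phi(aj) for tj, aj in zip(t, a))    # sum t_j phi(a_j)
    assert lhs <= rhs + 1e-12
</pre>
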
<br>Jensen's inequality says this: <br>If <math>\,\mu\,</math> is a probability measure on <math>\,X\,</math> , <br><math>\,f\,</math> is a real-valued function on <math>\,X\,</math> , <br><math>\,f\,</math> is integrable, and <br><math>\,\phi\,</math> is convex on the range of <math>\,f\,</math> then

<br><math>\phi(\int f d \mu) \leq \int \phi \circ f d \mu\qquad</math> (2)

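As a quick sanity check of (2), take <math>\,\mu\,</math> to be Lebesgue measure on <math>\,X=[0,1]\,</math> , <math>\,f(x)=x\,</math> and <math>\,\phi(y)=y^2\,</math> (these particular choices are only an example); then <math>\,\phi(\int f d \mu)=1/4\,</math> while <math>\,\int \phi \circ f d \mu=1/3\,</math> . A midpoint-rule approximation of the two integrals shows the same thing:

<pre>
# Midpoint-rule approximation of both sides of (2) for the uniform
# probability measure on [0, 1], with f(x) = x and phi(y) = y**2.
N = 100_000
xs = [(k + 0.5) / N for k in range(N)]       # midpoints of a uniform grid

f = lambda x: x
phi = lambda y: y * y

int_f = sum(f(x) for x in xs) / N            # approximately 1/2
int_phi_f = sum(phi(f(x)) for x in xs) / N   # approximately 1/3

assert phi(int_f) <= int_phi_f               # 0.25 <= 0.333..., as (2) asserts
print(phi(int_f), int_phi_f)
</pre>
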
<br>'''Proof 1:''' By a limiting argument (approximating <math>\,f\,</math> by simple functions; the details of that passage to the limit are omitted here) we may assume that <math>\,f\,</math> is simple.
<br>That is, <math>\,X\,</math> is the disjoint union of <math>\,X_1 \,\ldots\, X_n\,</math> and <math>\,f\,</math> is constant on each <math>\,X_j\,</math> .

Say <math>\,t_j=\mu(X_j)\,</math> and <math>\,a_j\,</math> is the value of <math>\,f\,</math> on <math>\,X_j\,</math> .

Then <math>\,\int f d \mu = \sum t_j a_j\,</math> and <math>\,\int \phi \circ f d \mu = \sum t_j \phi(a_j)\,</math> , so (1) and (2) say exactly the same thing. QED.

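To make the reduction concrete, the sketch below discretizes an assumed example ( <math>\,f(x)=x\,</math> on <math>\,X=[0,1]\,</math> with Lebesgue measure and <math>\,\phi(x)=e^x\,</math> ) into a simple function that is constant on each <math>\,X_j=[(j-1)/n,\,j/n)\,</math> ; both sides of (2) then become exactly the finite sums appearing in (1).

<pre>
import math

phi = math.exp            # convex function (example choice)
f = lambda x: x           # integrand on X = [0, 1] (example choice)

n = 1000
# Partition [0, 1) into X_j = [(j-1)/n, j/n).  Under Lebesgue measure each
# piece has t_j = mu(X_j) = 1/n, and the simple approximation of f takes
# the value a_j = f((j-1)/n) on X_j.
t = [1.0 / n] * n
a = [f((j - 1) / n) for j in range(1, n + 1)]

lhs = phi(sum(tj * aj for tj, aj in zip(t, a)))   # phi(integral of the simple f)
rhs = sum(tj * phi(aj) for tj, aj in zip(t, a))   # integral of phi o (simple f)

assert lhs <= rhs        # this is (1), i.e. (2) for the simple function
print(lhs, rhs)          # roughly e**0.5 = 1.648... <= e - 1 = 1.718...
</pre>
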
<br>'''Proof 2:'''

Lemma. If <math>\,a < b,\, \,a' < b',\, \,a \leq a'\,</math> and <math>\,b \leq b'\,</math> then

<math>\,(\phi(a) - \phi(b)) / (a - b) \leq (\phi(a') - \phi(b')) / (a' - b')\quad\diamond</math>

The lemma shows:
*<math>\,\phi\,</math> has a right-hand derivative at every point, and
*the graph of <math>\,\phi\,</math> lies above the "tangent" line through any point on the graph with slope equal to the right derivative.

Say <math>\,a = \int f d \mu\,</math>

Let <math>\,m\,</math> be the right derivative of <math>\,\phi\,</math> at <math>\,a\,</math> , and let

<math>\,L(t) = \phi(a) + m(t-a)\,</math>

The bullets above say <math>\,\phi(t)\geq L(t)\,</math> for all <math>\,t\,</math> in the domain of <math>\,\phi\,</math> . So

<math>\begin{array}{rl}\int \phi \circ f &\geq \int L \circ f\\
&= \int (\phi(a) + m(f - a))\\
&= \phi(a) + (m \int f) - ma\\
&= \phi(a)\\
&= \phi(\int f)\end{array}</math>

(The last two steps use <math>\,\mu(X)=1\,</math> and <math>\,\int f = a\,</math> .)

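For a concrete check of this chain, the sketch below uses a discrete probability measure and <math>\,\phi(x)=|x|\,</math> , a convex function whose derivative jumps at <math>\,0\,</math> ; all of these particular choices are assumptions made only for the illustration.

<pre>
# Walk through the displayed chain for a discrete probability measure:
# X = {1,...,n}, mu({j}) = t_j, f(j) = v_j, phi(x) = |x|.
phi = abs
t = [0.25, 0.25, 0.5]                         # probability weights (example)
v = [-2.0, 1.0, 0.8]                          # values of f (example)

a = sum(tj * vj for tj, vj in zip(t, v))      # a = integral of f
m = 1.0 if a >= 0 else -1.0                   # right derivative of |x| at a
L = lambda s: phi(a) + m * (s - a)            # supporting line at a

int_phi_f = sum(tj * phi(vj) for tj, vj in zip(t, v))   # integral of phi o f
int_L_f = sum(tj * L(vj) for tj, vj in zip(t, v))       # integral of L o f

assert int_phi_f >= int_L_f                   # phi >= L pointwise
assert abs(int_L_f - phi(a)) < 1e-12          # integral of L o f = phi(a) = phi(integral of f)
print(int_phi_f, int_L_f, phi(a))
</pre>
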
<math>\,-\,</math>David C. Ullrich