Jensen's inequality
By definition <math>\phi</math> is convex if and only if

<math>\phi\left(tx+(1-t)y\right)\,\le\,t\phi(x)+(1-t)\phi(y)</math>

whenever <math>0\le t\le 1</math> and <math>x,\,y</math> are in the domain of <math>\phi</math>.
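For instance (an example added here, not part of the original note), <math>\phi(x)=x^2</math> is convex, since

<math>tx^2+(1-t)y^2-\left(tx+(1-t)y\right)^2=t(1-t)(x-y)^2\,\ge\,0.</math>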
It follows by induction on <math>n</math> that if <math>t_j\ge 0</math>, <math>\sum_{j=1}^n t_j=1</math>, and <math>x_j</math> is in the domain of <math>\phi</math> for <math>j=1,\dots,n</math>, then

<math>\phi\left(\sum_{j=1}^n t_jx_j\right)\,\le\,\sum_{j=1}^n t_j\phi(x_j).</math>    (1)
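A sketch of the induction step (added here; the original leaves it to the reader): assuming (1) for <math>n-1</math> terms and <math>t_n<1</math>, write <math>s=1-t_n</math> and apply convexity, then the inductive hypothesis:

<math>\phi\!\left(\sum_{j=1}^n t_jx_j\right)=\phi\!\left(s\sum_{j=1}^{n-1}\tfrac{t_j}{s}x_j+t_nx_n\right)\,\le\,s\,\phi\!\left(\sum_{j=1}^{n-1}\tfrac{t_j}{s}x_j\right)+t_n\phi(x_n)\,\le\,\sum_{j=1}^n t_j\phi(x_j).</math>

(If <math>t_n=1</math> there is nothing to prove.)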
Jensen's inequality says this:
If <math>\mu</math> is a probability measure on <math>X</math>, <math>f</math> is a real-valued function on <math>X</math>, <math>f</math> is integrable, and <math>\phi</math> is convex on the range of <math>f</math>, then

<math>\phi\left(\int f\,d\mu\right)\,\le\,\int\phi\circ f\,d\mu.</math>    (2)
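For example (an illustration added here, not part of the original): taking <math>\phi(x)=x^2</math>, inequality (2) says <math>\left(\int f\,d\mu\right)^2\le\int f^2\,d\mu</math>, i.e. the square of the mean is at most the mean of the square; equivalently, the variance of <math>f</math> is nonnegative.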
Proof 1: By some limiting argument we can assume that <math>f</math> is simple. (This limiting argument is a missing detail to this proof...)

That is, <math>X</math> is the disjoint union of <math>X_1,\dots,X_n</math> and <math>f</math> is constant on each <math>X_j</math>.

Say <math>t_j=\mu(X_j)</math> and <math>x_j</math> is the value of <math>f</math> on <math>X_j</math>.

Then (1) and (2) say exactly the same thing. QED.
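Spelling out the identification (added for clarity): for such a simple <math>f</math>,

<math>\int f\,d\mu=\sum_{j=1}^n t_jx_j\qquad\text{and}\qquad\int\phi\circ f\,d\mu=\sum_{j=1}^n t_j\phi(x_j),</math>

while <math>t_j=\mu(X_j)\ge 0</math> and <math>\sum_{j=1}^n t_j=\mu(X)=1</math>, so (2) for <math>f</math> is precisely (1).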
Proof 2:
Lemma. If <math>a<b<c</math> and <math>a,\,b,\,c</math> are in the domain of <math>\phi</math> then

<math>\frac{\phi(b)-\phi(a)}{b-a}\,\le\,\frac{\phi(c)-\phi(a)}{c-a}\,\le\,\frac{\phi(c)-\phi(b)}{c-b}.</math>

The lemma shows:

- <math>\phi</math> has a right-hand derivative at every point, and
- the graph of <math>\phi</math> lies above the "tangent" line through any point on the graph with slope equal to the right derivative.
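A brief sketch of how the bullets follow from the lemma (added here; the original leaves this to the reader): for a fixed point <math>p</math> the lemma says the chord slope

<math>x\mapsto\frac{\phi(x)-\phi(p)}{x-p}</math>

is nondecreasing in <math>x</math>. Hence as <math>x\downarrow p</math> these slopes decrease to a limit <math>m</math> (bounded below by any chord slope taken from the left of <math>p</math>), which is the right derivative at <math>p</math>; and comparing any chord slope with <math>m</math> gives <math>\phi(x)\ge\phi(p)+m(x-p)</math> for every <math>x</math> in the domain.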
Say <math>a=\int f\,d\mu</math>.

Let <math>m</math> be the right derivative of <math>\phi</math> at <math>a</math>, and let

<math>L(x)=\phi(a)+m(x-a).</math>

The bullets above say <math>\phi(x)\ge L(x)</math> for all <math>x</math> in the domain of <math>\phi</math>. So

<math>\begin{array}{rl}\int\phi\circ f &\ge \int L\circ f\\
&= \int\left(\phi(a)+m(f-a)\right)\\
&= \phi(a) + (m \int f) - ma\\
&= \phi(a)\\
&= \phi(\int f)\end{array}</math>

(The step <math>\int\phi(a)\,d\mu=\phi(a)</math> uses <math>\mu(X)=1</math>, and the last two equalities use <math>\int f=a</math>.)
<math>\,-\,</math>David C. Ullrich
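As an added illustration of how (2) is used (not part of the original note): take <math>X=\{1,\dots,n\}</math> with <math>\mu(\{j\})=1/n</math>, let <math>f(j)=\ln y_j</math> for positive numbers <math>y_j</math>, and let <math>\phi=\exp</math>, which is convex. Then (2) reads

<math>\exp\left(\frac{1}{n}\sum_{j=1}^n\ln y_j\right)\,\le\,\frac{1}{n}\sum_{j=1}^n y_j,</math>

which is the arithmetic mean-geometric mean (AM-GM) inequality <math>\left(y_1\cdots y_n\right)^{1/n}\le\frac{y_1+\cdots+y_n}{n}</math>.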