# Linear matrix inequality


## Revision as of 14:35, 5 December 2008

In convex optimization, a **linear matrix inequality (LMI)** is an expression of the form

<math>LMI(y) \,\triangleq\, A_0 + y_1 A_1 + y_2 A_2 + \cdots + y_m A_m \succeq 0</math>

where

- <math>y = [y_j\,,~ j=1\ldots m]</math> is a real vector,
- <math>A_0\,, A_1\,, \ldots, A_m</math> are symmetric matrices in the subspace of symmetric matrices <math>\mathbb{S}^n</math>,
- <math>\succeq 0</math> is a generalized inequality meaning <math>LMI(y)</math> is a positive semidefinite matrix belonging to the positive semidefinite cone <math>\mathbb{S}^n_+</math> in the subspace of symmetric matrices <math>\mathbb{S}^n</math>.

This linear matrix inequality specifies a convex constraint on *y*.
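A minimal numerical sketch of this convexity, using a hypothetical 2×2 instance with diagonal <math>A_j</math> (only NumPy; `lmi` and `is_psd` are illustrative helpers, not part of any standard API):

```python
import numpy as np

def lmi(y, A):
    """Evaluate LMI(y) = A[0] + y[0]*A[1] + ... + y[m-1]*A[m]."""
    return A[0] + sum(yj * Aj for yj, Aj in zip(y, A[1:]))

def is_psd(M, tol=1e-9):
    """PSD test via the smallest eigenvalue of a symmetric matrix."""
    return float(np.linalg.eigvalsh(M).min()) >= -tol

# Hypothetical instance: LMI(y) = I + y1*diag(1,0) + y2*diag(0,1)
A = [np.eye(2), np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]

ya, yb = np.array([0.5, 0.0]), np.array([0.0, 0.5])
assert is_psd(lmi(ya, A)) and is_psd(lmi(yb, A))

# The feasible set {y | LMI(y) >= 0} is convex: every convex
# combination of feasible points is again feasible.
for t in np.linspace(0.0, 1.0, 5):
    assert is_psd(lmi(t * ya + (1 - t) * yb, A))
```

Checking the smallest eigenvalue is the simplest PSD test; production code would attempt a Cholesky factorization or hand the constraint to an SDP solver instead.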


## Convexity of the LMI constraint

<math>LMI(y)\succeq 0</math> is a convex constraint on *y*; it expresses membership in a dual (convex) cone, as we now explain **(**[Dattorro, Example 2.13.5.1.1](http://meboo.convexoptimization.com/BOOK/convexgeometry.pdf)**)**.

Consider a peculiar vertex-description for a closed convex cone defined over the positive semidefinite cone

**(**instead of the more common nonnegative orthant, <math>x\succeq 0</math>**)**:

for <math>X\in\mathbb{S}^n</math> given <math>A_j\in\mathbb{S}^n</math>, <math>j=1\ldots m</math>

<math>\begin{array}{rl}\mathcal{K}
&=\left\{\left[\begin{array}{c}\langle A_1\,,\,X\rangle\\\vdots\\\langle A_m\,,\,X\rangle\end{array}\right] ~\Big|~ X\succeq 0\right\}\subseteq\mathbb{R}^m\\\\
&=\left\{\left[\begin{array}{c}\textrm{svec}(A_1)^T\\\vdots\\\textrm{svec}(A_m)^T\end{array}\right]\textrm{svec}\,X ~\Big|~ X\succeq 0\right\}\\\\
&:=\{A\,\textrm{svec}\,X ~|~ X\succeq 0\}
\end{array}</math>

where

- <math>A\in\mathbb{R}^{m\times n(n+1)/2}</math>,
- symmetric vectorization svec is a stacking of columns defined in **(**[Dattorro, Ch.2.2.2.1](http://meboo.convexoptimization.com/BOOK/convexgeometry.pdf)**)**,
- <math>A_0=\mathbf{0}</math> is assumed without loss of generality.

<math>\mathcal{K}</math> is a convex cone because

<math>A\,\textrm{svec}\,X_{\mathrm{p}_1},\; A\,\textrm{svec}\,X_{\mathrm{p}_2}\in\mathcal{K} ~\Rightarrow~ A(\zeta\,\textrm{svec}\,X_{\mathrm{p}_1}+\xi\,\textrm{svec}\,X_{\mathrm{p}_2})\in\mathcal{K} \quad\textrm{for all}~\zeta,\xi\geq 0</math>

since a nonnegatively weighted sum of positive semidefinite matrices must be positive semidefinite.
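This closure under conic combination can be verified numerically. The sketch below assumes one common svec convention (lower-triangular stacking with off-diagonals scaled by <math>\sqrt 2</math>); the exact ordering in Dattorro Ch.2.2.2.1 may differ, but the inner-product identity is convention-independent:

```python
import numpy as np

def svec(X):
    """Symmetric vectorization: stack the lower-triangular entries,
    scaling off-diagonals by sqrt(2) so that tr(A X) = svec(A) @ svec(X).
    (One common convention; orderings vary between references.)"""
    i, j = np.tril_indices(X.shape[0])
    return np.where(i == j, 1.0, np.sqrt(2.0)) * X[i, j]

rng = np.random.default_rng(0)
n, m = 3, 2
sym = lambda B: (B + B.T) / 2
A_j = [sym(rng.standard_normal((n, n))) for _ in range(m)]
A = np.vstack([svec(Aj) for Aj in A_j])   # A in R^{m x n(n+1)/2}

rand_psd = lambda: (lambda C: C @ C.T)(rng.standard_normal((n, n)))
X1, X2 = rand_psd(), rand_psd()

# <A_j, X> = svec(A_j) @ svec(X):
assert np.isclose(np.trace(A_j[0] @ X1), svec(A_j[0]) @ svec(X1))

# K is a cone: a conic combination of members is the image of a
# nonnegatively weighted sum of PSD matrices, itself PSD.
zeta, xi = 0.3, 2.0
z = zeta * (A @ svec(X1)) + xi * (A @ svec(X2))
assert np.allclose(z, A @ svec(zeta * X1 + xi * X2))
```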

Now consider the (closed convex) dual cone:

<math>\begin{array}{rl}\mathcal{K}^*
&=\left\{y ~|~ \langle z\,,\,y\rangle\geq 0 ~\textrm{for all}~ z\in\mathcal{K}\right\}\subseteq\mathbb{R}^m\\
&=\left\{y ~|~ \langle z\,,\,y\rangle\geq 0 ~\textrm{for all}~ z=A\,\textrm{svec}\,X\,,~X\succeq 0\right\}\\
&=\left\{y ~|~ \langle A\,\textrm{svec}\,X\,,\,y\rangle\geq 0 ~\textrm{for all}~ X\succeq 0\right\}\\
&=\left\{y ~|~ \langle\textrm{svec}\,X\,,\,A^T y\rangle\geq 0 ~\textrm{for all}~ X\succeq 0\right\}\\
&=\left\{y ~|~ \textrm{svec}^{-1}(A^T y)\succeq 0\right\}
\end{array}</math>

that follows from Fejér's dual generalized inequalities for the positive semidefinite cone:

<math>Y\succeq 0 ~\Leftrightarrow~ \langle Y\,,\,X\rangle\geq 0 ~\textrm{for all}~ X\succeq 0</math>

This leads directly to an equally peculiar halfspace-description

<math>\mathcal{K}^*=\Big\{y\in\mathbb{R}^m ~\Big|~ \sum_{j=1}^m y_j A_j\succeq 0\Big\}</math>

The summation inequality with respect to the positive semidefinite cone
is known as a *linear matrix inequality*.
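The Fejér equivalence underlying the last step can be spot-checked with random PSD matrices; the particular construction of <math>y</math> below (an eigenvalue shift that forces <math>\sum_j y_j A_j\succeq 0</math>) is an illustrative choice, not from the source:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3
A1 = np.eye(n)
A2 = (lambda B: (B + B.T) / 2)(rng.standard_normal((n, n)))

# Pick y = (y1, y2) so that S = y1*A1 + y2*A2 is PSD: shift A2 by
# its most negative eigenvalue.  Then y belongs to K*.
y2 = 1.0
y1 = max(0.0, -np.linalg.eigvalsh(A2).min())
S = y1 * A1 + y2 * A2

# Fejer: S >= 0  <=>  <S, X> >= 0 for every PSD X.
for _ in range(100):
    C = rng.standard_normal((n, n))
    X = C @ C.T                       # random PSD matrix
    assert np.trace(S @ X) >= -1e-9
```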

## LMI Geometry

Although matrix <math>A\in\mathbb{R}^{m\times n(n+1)/2}</math> is finite-dimensional, <math>\mathcal{K}</math> is generally not a polyhedral cone (unless <math>m</math> equals 1 or 2) simply because <math>X\in\mathbb{S}^n_+</math>.

Provided the matrices <math>A_j</math> are linearly independent, then

<math>\textrm{rel\,int}\,\mathcal{K} = \textrm{int}\,\mathcal{K}</math>

meaning, the cone interior is nonempty; implying, the dual cone <math>\mathcal{K}^*</math> is pointed (Dattorro, ch.2).

If matrix <math>A</math> has no nullspace, on the other hand, then <math>A\,\textrm{svec}\,X</math> is an isomorphism in <math>X</math> between the positive semidefinite cone <math>\mathbb{S}^n_+</math> and the range of matrix <math>A</math>.

In that case, convex cone <math>\mathcal{K}</math> has relative interior

<math>\textrm{rel\,int}\,\mathcal{K} = \{A\,\textrm{svec}\,X ~|~ X\succ 0\}</math>

and boundary

<math>\partial\mathcal{K} = \{A\,\textrm{svec}\,X ~|~ X\succeq 0\,,~ X\nsucc 0\}</math>

When the matrices <math>A_j</math> are linearly independent, the function <math>y\mapsto\sum_{j=1}^m y_j A_j</math> on <math>\mathbb{R}^m</math> is a linear bijection onto its range.

Inverse image of the positive semidefinite cone under that function must therefore have dimension <math>m</math>.

In that circumstance, the dual cone interior is nonempty

<math>\textrm{int}\,\mathcal{K}^* = \Big\{y\in\mathbb{R}^m ~\Big|~ \sum_{j=1}^m y_j A_j\succ 0\Big\}</math>

having boundary

<math>\partial\mathcal{K}^* = \Big\{y\in\mathbb{R}^m ~\Big|~ \sum_{j=1}^m y_j A_j\succeq 0\,,~ \sum_{j=1}^m y_j A_j\nsucc 0\Big\}</math>
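As a toy illustration of interior versus boundary membership in <math>\mathcal{K}^*</math>, one can choose hypothetical matrices <math>A_j</math> for which the dual cone is simply the nonnegative orthant:

```python
import numpy as np

# Hypothetical A_j: with A1 = diag(1,0), A2 = diag(0,1), the dual cone
# K* = {y | y1*A1 + y2*A2 >= 0} is exactly the nonnegative orthant R^2_+.
A1, A2 = np.diag([1.0, 0.0]), np.diag([0.0, 1.0])

def min_eig(y):
    return np.linalg.eigvalsh(y[0] * A1 + y[1] * A2).min()

assert min_eig([2.0, 1.0]) > 0            # interior: sum is positive definite
assert abs(min_eig([1.0, 0.0])) < 1e-12   # boundary: PSD but singular
assert min_eig([-1.0, 1.0]) < 0           # outside K*
```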

## Applications

There are efficient numerical methods to determine whether an LMI is feasible (*i.e.*, whether there exists a vector <math>y</math> such that <math>LMI(y)\succeq 0</math>), or to solve a convex optimization problem with LMI constraints.
Many optimization problems in control theory, system identification, and signal processing can be formulated using LMIs. The prototypical primal and dual semidefinite program is a minimization of a real linear function subject, respectively, to the primal and dual convex cones governing this LMI.
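As a rough illustration of feasibility testing, a one-dimensional grid search over <math>y</math> stands in here for the efficient interior-point methods the text alludes to (`feasible_1d` is an illustrative helper, not a library routine):

```python
import numpy as np

def min_eig_lmi(y, A0, As):
    """Smallest eigenvalue of LMI(y) = A0 + sum_j y_j A_j."""
    M = A0 + sum(yj * Aj for yj, Aj in zip(np.atleast_1d(y), As))
    return np.linalg.eigvalsh(M).min()

def feasible_1d(A0, A1, grid=np.linspace(-3.0, 3.0, 601)):
    """Crude 1-D feasibility check: the LMI is feasible iff
    max_y lambda_min(LMI(y)) >= 0.  lambda_min is concave in y, so real
    solvers exploit convexity instead of searching a grid."""
    return max(min_eig_lmi(y, A0, [A1]) for y in grid) >= -1e-9

# [[1, y], [y, 1]] >= 0 holds for |y| <= 1        -> feasible
assert feasible_1d(np.eye(2), np.array([[0.0, 1.0], [1.0, 0.0]]))
# [[y, 1], [1, -y]] has eigenvalues +-sqrt(y^2+1) -> never PSD
assert not feasible_1d(np.array([[0.0, 1.0], [1.0, 0.0]]), np.diag([1.0, -1.0]))
```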

## External links

- S. Boyd, L. El Ghaoui, E. Feron, and V. Balakrishnan, *Linear Matrix Inequalities in System and Control Theory*

- C. Scherer and S. Weiland, *Course on Linear Matrix Inequalities in Control*, Dutch Institute of Systems and Control (DISC)