Euclidean distance matrix completion via semidefinite facial reduction



Nathan Krislock

Henry Wolkowicz


== Euclidean distance matrix completions (EDMC): background ==

EDMC is a ''fundamental problem of distance geometry'' (FPDG). Let <math>\mathcal S^n</math> be the vector space of symmetric matrices equipped with the trace inner product, <math>\langle S, T \rangle = \mathrm{trace}(ST)</math>; let <math>D \in \mathcal E^n</math>, the set of <math>n \times n</math> Euclidean distance matrices, i.e. <math>D_{ij}=|| p_i-p_j||^2</math> for points <math>p_i\in \mathbb{R}^r,\; i=1,\ldots,n</math>, with embedding dimension <math>r</math>; and let <math>\mathcal S^n_+</math> be the set (convex cone) of (symmetric) positive semidefinite matrices. Defining <math>\mathcal K : \mathcal S^n \rightarrow \mathcal S^n</math> by

<center><math>\mathcal K(Y)_{ij} := Y_{ii} + Y_{jj} - 2Y_{ij} ~~\mbox{for all}~ i,j = 1, \ldots, n,</math></center>

we have that <math>\mathcal K(\mathcal S^n_+) = \mathcal E^n</math>. Note that <math>\mathcal K</math> is a one-to-one linear transformation between the ''centered symmetric matrices'' <math>\mathcal S_c</math> and the ''hollow matrices'' <math>\mathcal S_H</math>, where centered means that the row (and column) sums are all zero, and hollow means that the diagonal is zero.
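
To make the action of <math>\mathcal K</math> concrete, here is a small NumPy sketch (the helper name <code>Kop</code> is our own, not notation from the references): it applies the entrywise formula above to a centered Gram matrix <math>Y=PP^{\rm{T}}</math> built from random points and checks that <math>\mathcal K(Y)</math> is the hollow matrix of squared pairwise distances.

<pre>
import numpy as np

def Kop(Y):
    """Apply K entrywise: K(Y)_ij = Y_ii + Y_jj - 2*Y_ij."""
    d = np.diag(Y)
    return d[:, None] + d[None, :] - 2 * Y

rng = np.random.default_rng(0)
n, r = 6, 2
P = rng.standard_normal((n, r))
P -= P.mean(axis=0)                     # center the points, so P^T e = 0
Y = P @ P.T                             # centered Gram matrix: Y e = 0, Y PSD

D = Kop(Y)                              # candidate Euclidean distance matrix
D_check = np.array([[np.sum((P[i] - P[j]) ** 2) for j in range(n)]
                    for i in range(n)])
print(np.allclose(D, D_check))          # True: K(PP^T)_ij = ||p_i - p_j||^2
print(np.allclose(np.diag(D), 0))       # True: K maps into the hollow matrices
print(np.allclose(Y @ np.ones(n), 0))   # True: Y is centered
</pre>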

A matrix <math>D \in \mathcal S^n</math> is a Euclidean distance matrix with embedding dimension <math>r\,</math> if and only if there exists <math>P \in \mathbb{R}^{n \times r}</math> such that <math>D = \mathcal K(PP^{\rm{T}})</math>. Suppose <math>D\,</math> is a partial Euclidean distance matrix with embedding dimension <math>r\,</math>. The ''low-dimensional Euclidean distance matrix completion'' problem is

<center><math>\begin{array}{rl}\mbox{find}&P \in \mathbb{R}^{n \times r} \\ \mbox{s.t.}&H \circ \mathcal K(PP^{\rm{T}}) = H \circ D \\ & P^{\rm{T}}e = 0, \end{array}</math></center>

where <math>e \in \mathbb{R}^n</math> is the vector of all ones, and <math>H\,</math> is the adjacency matrix of the graph <math>G = (V,E)\,</math> associated with the partial Euclidean distance matrix <math>D.</math>
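
As an illustration of the data in this problem, the following sketch (a hedged example; the edge list and sizes are invented for illustration, not taken from the references) builds the 0–1 adjacency matrix <math>H</math> of a graph <math>G=(V,E)</math> and the masked data <math>H \circ D</math>; only the distances on edges of <math>G</math> are treated as known.

<pre>
import numpy as np

n = 6
# A made-up edge set for G = (V, E); in applications these are the pairs
# whose distances have actually been measured.
edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (3, 5), (4, 5)]

H = np.zeros((n, n))
for i, j in edges:
    H[i, j] = H[j, i] = 1.0             # adjacency matrix of G

# Generate a ground-truth EDM from random points (as in the sketch above);
# the completion problem only ever sees the masked data H o D.
rng = np.random.default_rng(1)
P = rng.standard_normal((n, 2))
P -= P.mean(axis=0)
d = np.sum(P ** 2, axis=1)
D = d[:, None] + d[None, :] - 2 * P @ P.T

partial_data = H * D                    # Hadamard product H o D
print(int(H.sum()) // 2, "known distances out of", n * (n - 1) // 2)
</pre>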

=== Properties of <math>\mathcal K</math> [http://orion.math.uwaterloo.ca/~hwolkowi/henry/reports/edmapr04.pdf] ===

Let <math>\mathcal D_e(Y):=\operatorname{diag}(Y)e^{\rm{T}}+e\operatorname{diag}(Y)^{\rm{T}}</math>, where diag denotes the diagonal of the argument. Then alternatively, we can write

<center><math>\mathcal K(Y) = \mathcal D_e(Y) - 2Y.</math></center>

The adjoints of these linear transformations are

<center><math>\mathcal D_e^*(D) = 2 \operatorname{Diag}(De), \qquad \mathcal K^*(D) = 2 (\operatorname{Diag}(De) - D),</math></center>

where Diag is the diagonal matrix formed from its argument and is the adjoint of diag. The Moore–Penrose generalized inverse of <math>\mathcal K</math> is

<center><math>\mathcal K^\dagger(D) = -\frac 12 J \operatorname{offDiag}(D) J,</math></center>

where the matrix <math>J:=I-\frac 1n ee^{\rm{T}}</math> and the linear operator (orthogonal projection) <math>\operatorname{offDiag}(D):=D-\operatorname{Diag}(\operatorname{diag}(D))</math>. The orthogonal projections onto the range of <math>\mathcal K</math> and the range of <math>\mathcal K^*</math> are given by

<center><math>\mathcal K\mathcal K^\dagger(D)=\operatorname{offDiag}(D), \qquad \mathcal K^\dagger \mathcal K(Y)=JYJ,</math></center>

respectively. These ranges are the hollow and centered subspaces of <math>\mathcal S^n</math>, respectively. The nullspaces of <math>\mathcal K</math> and <math>\mathcal K^\dagger</math> are the ranges of <math>\operatorname{Diag}</math> and <math>\mathcal D_e</math>, respectively.
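
The identities above are easy to confirm numerically. A minimal NumPy check (the helper names <code>Kop</code>, <code>Kadj</code>, <code>Kpinv</code> are ours) verifies the adjoint relation <math>\langle \mathcal K(Y), D\rangle = \langle Y, \mathcal K^*(D)\rangle</math>, the alternative form <math>\mathcal K(Y)=\mathcal D_e(Y)-2Y</math>, the Moore–Penrose formula, and the two orthogonal projections:

<pre>
import numpy as np

def Kop(Y):
    d = np.diag(Y)
    return d[:, None] + d[None, :] - 2 * Y              # K(Y)

def Kadj(D):
    return 2 * (np.diag(D @ np.ones(len(D))) - D)        # K^*(D) = 2(Diag(De) - D)

def Kpinv(D):
    n = len(D)
    J = np.eye(n) - np.ones((n, n)) / n                  # J = I - (1/n) e e^T
    offDiag = D - np.diag(np.diag(D))
    return -0.5 * J @ offDiag @ J                        # K^dagger(D)

rng = np.random.default_rng(2)
n = 5
Y = rng.standard_normal((n, n)); Y = (Y + Y.T) / 2       # arbitrary symmetric Y
D = rng.standard_normal((n, n)); D = (D + D.T) / 2       # arbitrary symmetric D
J = np.eye(n) - np.ones((n, n)) / n
De = np.outer(np.diag(Y), np.ones(n)) + np.outer(np.ones(n), np.diag(Y))

print(np.isclose(np.trace(Kop(Y) @ D), np.trace(Y @ Kadj(D))))   # adjoint identity
print(np.allclose(Kop(Y), De - 2 * Y))                           # K(Y) = D_e(Y) - 2Y
print(np.allclose(Kop(Kpinv(D)), D - np.diag(np.diag(D))))       # K K^dagger = offDiag
print(np.allclose(Kpinv(Kop(Y)), J @ Y @ J))                     # K^dagger K (Y) = JYJ
</pre>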

== Semidefinite programming relaxation of the low-dimensional Euclidean distance matrix completion problem ==

Using the substitution <math>Y = PP^{\rm{T}}\,</math>, and relaxing the (NP-hard) condition that <math>\mathrm{rank}(Y) = r\,</math>, we obtain the semidefinite programming relaxation

<center><math>\begin{array}{rl}\mbox{find}& Y \in \mathcal S^n_+ \\ \mbox{s.t.}& H \circ \mathcal K(Y) = H \circ D \\ & Ye = 0. \end{array}</math></center>
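
Since the relaxation is a semidefinite feasibility problem, it can be handed directly to an off-the-shelf SDP solver. Below is a hedged modelling sketch using CVXPY (our choice of tool, not part of the original text); <code>H</code> and <code>D</code> are assumed to be the mask and partial distance data from the earlier sketches, and the trace objective is only a common low-rank heuristic, since any objective gives the relaxation.

<pre>
import numpy as np
import cvxpy as cp

def solve_edm_sdp(H, D):
    """Sketch of the SDP relaxation: find Y PSD with H o K(Y) = H o D, Y e = 0.
    The trace objective is only a tie-breaking heuristic (often used to
    encourage low rank); replacing it by 0 gives the pure feasibility problem."""
    n = H.shape[0]
    Y = cp.Variable((n, n), PSD=True)
    # H o K(Y) = H o D, written entrywise over the edges where H_ij = 1.
    cons = [Y[i, i] + Y[j, j] - 2 * Y[i, j] == D[i, j]
            for i in range(n) for j in range(i + 1, n) if H[i, j] > 0]
    cons.append(Y @ np.ones(n) == 0)                      # Y e = 0
    prob = cp.Problem(cp.Minimize(cp.trace(Y)), cons)
    prob.solve()
    return Y.value

# Example (using H, D from the earlier sketches):
#   Y_opt = solve_edm_sdp(H, D)
#   An embedding P can then be recovered from the top eigenpairs of Y_opt.
</pre>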

== Single Clique Facial Reduction Theorem [http://www.optimization-online.org/DB_HTML/2009/05/2297.html] ==

Let <math>C \subseteq V</math> be a clique in the graph <math>G\,</math> such that the embedding dimension of <math>D[C]\,</math> is <math>r\,.</math> Then there exists <math>U \in \mathbb{R}^{n \times (n-|C|+r)}</math> such that

<center><math>\mathbf{face} \left\{ Y \in \mathcal S^n_+ : \mathcal{K}(Y[C]) = D[C],\; Ye = 0 \right\} = U \mathcal S^{n-|C|+r}_+ U^{\rm{T}}.</math></center>
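
The theorem is stated existentially; the sketch below shows one way such a <math>U</math> can be computed (following the facial-reduction construction in the report cited above, under the simplifying assumption that the clique <math>C</math> occupies the first <math>|C|</math> indices; any matrix with the same column space serves equally well): recover the centered Gram matrix <math>B=\mathcal K^\dagger(D[C])</math> of the clique, take an orthonormal basis <math>U_B</math> of its range, append <math>e/\sqrt{|C|}</math> together with an identity block for the remaining vertices, and finally intersect with the nullspace of <math>e^{\rm{T}}</math> to account for the constraint <math>Ye=0</math>.

<pre>
import numpy as np

def face_basis_single_clique(D_C, n, r, tol=1e-9):
    """Sketch of one way to build U with
       face{ Y in S^n_+ : K(Y[C]) = D[C], Y e = 0 } = U S_+ U^T,
    assuming the clique C is the first k = |C| vertices.  This follows the
    facial-reduction recipe of the cited report; U is not unique."""
    k = D_C.shape[0]
    e_k = np.ones((k, 1))

    # Centered Gram matrix of the clique: B = K^dagger(D[C]) = -1/2 J D[C] J.
    J = np.eye(k) - e_k @ e_k.T / k
    B = -0.5 * J @ D_C @ J

    # U_B: orthonormal basis for range(B); rank r = embedding dimension of D[C].
    w, V = np.linalg.eigh(B)
    U_B = V[:, w > tol * w.max()]
    assert U_B.shape[1] == r

    # U_1 represents the face obtained from the clique constraint alone.
    U_bar = np.hstack([U_B, e_k / np.sqrt(k)])                    # k x (r+1)
    U_1 = np.block([[U_bar, np.zeros((k, n - k))],
                    [np.zeros((n - k, r + 1)), np.eye(n - k)]])   # n x (n-k+r+1)

    # Intersect with {x : e^T x = 0} to account for Y e = 0:
    # U = U_1 W, where the columns of W span the nullspace of e^T U_1.
    c = np.ones((1, n)) @ U_1                                     # 1 x (n-k+r+1)
    _, _, Vt = np.linalg.svd(c)
    W = Vt[1:, :].T                                               # nullspace basis
    U = U_1 @ W                                                   # n x (n-k+r)
    return U
</pre>

The resulting <math>U</math> has <math>n-|C|+r</math> columns; substituting <math>Y = UZU^{\rm{T}}</math> with <math>Z \in \mathcal S^{n-|C|+r}_+</math> then shrinks the semidefinite variable in the relaxation above, which is the point of the facial reduction.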
