Talk:Beginning with CVX
From Wikimization
<pre>
lamda_W=eig(full(W))
</pre>
--------------------------------------------------------------------
Thanks for the idea of <pre>full(W)</pre>; it's great, it works!!! Thank you very much :D

I have a question: how do I calculate the normalized eigenvector?

Maybe? <pre>v_W=eig(full(W))/max(eig(full(W)))</pre>

And... how should I understand the result?

The tutorial doesn't explain anything; or do I just type K in Matlab (that is the variable I want to know), and that's all?

Thanks a lot again.

Here is the new code:

<pre>
clear all;
n=2; m=1;

A_a=3*eye(2*n,2*n)
B_a=4*eye(2*n,2*m)
W=eye(4)
R=(zeros(2,4))

cvx_begin

expression K(2*m,2*n)

variables p1 p2 Epsilon1 Epsilon2 W(4,4) R(2,4)

% H must be formed after W and R are declared as CVX variables,
% otherwise it is built from the numeric W and R defined above
H=W*A_a'+A_a*W-B_a*R-R'*B_a'

minimize (p1+p2)

subject to
for p=1:2
W(1,1)<=p1
W(2,2)<=p1
W(3,3)==W(1,1)
W(4,4)==W(2,2)

for q=1
R(1,1)>=-p2
R(1,1)<=p2
R(2,3)==R(1,1)

R(1,2)>=-p2
R(1,2)<=p2
R(2,4)==R(1,2)
end
end
W>=Epsilon1*eye(2*n,2*n)
H<=-Epsilon2*eye(2*n,2*n)

cvx_end

lamda_W=eig(full(W))
lamda_H=eig(H)
v_W=eig(full(W))/max(eig(full(W))) %% normalized eigenvector :|
v_H=eig(H)/max(eig(H))

para=0 %STOP

while para==0

if ( Epsilon1 - lamda_W )>(lamda_H+Epsilon2)

cvx_begin

variables p1 p2 Epsilon1 Epsilon2 W(4,4) R(2,4)

H=W*A_a'+A_a*W-B_a*R-R'*B_a'

minimize (p1+p2)

subject to
for p=1:2
W(1,1)<=p1
W(2,2)<=p1
W(3,3)==W(1,1)
W(4,4)==W(2,2)

for q=1
R(1,1)>=-p2
R(1,1)<=p2
R(2,3)==R(1,1)

R(1,2)>=-p2
R(1,2)<=p2
R(2,4)==R(1,2)
end
end
W>=Epsilon1*eye(2*n,2*n)
H<=-Epsilon2*eye(2*n,2*n)

v_W'*W*v_W>=Epsilon1 % was v_w; MATLAB is case-sensitive, so v_w is undefined

cvx_end

else

cvx_begin

variables p1 p2 Epsilon1 Epsilon2 W(4,4) R(2,4)

H=W*A_a'+A_a*W-B_a*R-R'*B_a'

minimize (p1+p2)

subject to
for p=1:2
W(1,1)<=p1
W(2,2)<=p1
W(3,3)==W(1,1)
W(4,4)==W(2,2)

for q=1
R(1,1)>=-p2
R(1,1)<=p2
R(2,3)==R(1,1)

R(1,2)>=-p2
R(1,2)<=p2
R(2,4)==R(1,2)
end
end
W>=Epsilon1*eye(2*n,2*n)
H<=-Epsilon2*eye(2*n,2*n)

v_H'*W*v_H<=-Epsilon2

cvx_end
end

lamda_W=eig(full(W))
lamda_H=eig(H)
v_W=eig(full(W))/min(eig(full(W))) %% computation of the normalized eigenvector
v_H=eig(H)/min(eig(H))

%STOP
if(lamda_W>=Epsilon1)
if(lamda_H<=-Epsilon2) para=1
else para = 0
end
else para =0
end

end

R
W
K=R/W
</pre>
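For later readers: since the constraint <pre>W>=Epsilon1*eye(2*n,2*n)</pre> forces W to be positive definite (hence invertible), the last line <pre>K=R/W</pre> computes R*inv(W), which recovers the state-feedback gain from the LMI variables W and R. The result can then be sanity-checked on the closed loop; a hedged sketch, reusing A_a and B_a from the code above:

<pre>
K=R/W;                   % same as R*inv(W); valid because the LMI enforces W > 0
lamda_cl=eig(A_a-B_a*K)  % closed-loop eigenvalues; all real parts negative means stable
</pre>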
Revision as of 23:32, 3 February 2009