Tuesday, 19 January 2021

How to find the common eigenvectors of two matrices with distinct eigenvalues

I am looking to find, or rather build, a matrix X of common eigenvectors between two matrices A and B, such that:

A X = X a, with "a" the diagonal matrix of the eigenvalues of A

B X = X b, with "b" the diagonal matrix of the eigenvalues of B

where A and B are square and diagonalizable matrices.
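
For diagonalizable matrices, a full common eigenbasis X exists exactly when A and B commute (AB = BA). As a sanity check on the setup above, here is a minimal MATLAB sketch (with made-up 4x4 matrices; P, D1, D2 are hypothetical names) that builds such a pair and verifies the relations:

% Build A and B sharing the same eigenbasis P, then check that they
% commute and that P diagonalizes both of them.
rng(0);
P  = orth(randn(4));      % random orthonormal basis, columns = eigenvectors
D1 = diag([1 2 3 4]);     % distinct eigenvalues for A
D2 = diag([5 6 7 8]);     % distinct eigenvalues for B
A  = P*D1/P;              % A = P*D1*P^-1
B  = P*D2/P;              % B = P*D2*P^-1
norm(A*B - B*A)           % ~0 : A and B commute
norm(P\(A*P) - D1)        % ~0 : P diagonalizes A
norm(P\(B*P) - D2)        % ~0 : P diagonalizes B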

I took a look at a similar post but did not manage to reach a conclusion, i.e. to get valid results when building the final desired endomorphism F defined by: F = P D P^-1

I have also read the Wikipedia article and an interesting paper, but could not extract from them a method easy to implement.

In particular, I am interested in MATLAB's eig(A,B) function.

I tried to use it like this:

% Search for common eigenvectors between FISH_sp and FISH_xc
[V, D] = eig(FISH_sp, FISH_xc);
% Diagonalize B^-1*A to compute Lambda, since we have A*X = Lambda*B*X
[eigenv, eigen_final] = eig(FISH_xc \ FISH_sp);   % backslash avoids explicit inv()
% Compute the final endomorphism: F = P*D*P^-1
FISH_final = V * eigen_final / V
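
Note that eig(A,B) solves the generalized eigenproblem A*V = B*V*D, i.e. V diagonalizes inv(B)*A; it coincides with a common eigenbasis of A and B only in special cases, e.g. when the two matrices commute. A quick sanity check on what the call above actually guarantees, reusing the variable names from the snippet (just a sketch, not from the original post):

% What [V,D] = eig(FISH_sp,FISH_xc) guarantees by construction:
norm(FISH_sp*V - FISH_xc*V*D)            % ~0 : generalized problem solved
norm(V\(FISH_xc\FISH_sp)*V - D)          % ~0 : V diagonalizes B^-1*A
% Must be ~0 for a common eigenbasis of the two matrices to exist at all:
norm(FISH_sp*FISH_xc - FISH_xc*FISH_sp)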

But the matrix FISH_final does not give good results: when I carry out further computations from this matrix FISH_final (it is actually a Fisher matrix), the results of those computations are not valid.

So surely I must have made an error in the code snippet above. As a first step, I would rather get this working in MATLAB, as a prototype, and afterwards, if it works, reproduce this synthesis with MKL or with Python functions. Hence I am also tagging python.

How can I build these common eigenvectors and also find the associated eigenvalues? I am a little lost among all the potential methods that exist to carry this out.

The screen capture below shows that the kernel of the commutator has to be different from the null vector:

[Figure: common eigenvectors and the kernel of the commutator]

EDIT 1: On Maths Stack Exchange, someone advised using a Singular Value Decomposition (SVD) on the commutator [A,B]:

"If 𝑣 is a common eigenvector, then ‖(𝐴𝐵−𝐵𝐴)𝑣‖=0. The SVD approach gives you a unit-vector 𝑣 that minimizes ‖(𝐴𝐵−𝐵𝐴)𝑣‖ (with the constraint that ‖𝑣‖=1)"

So, in MATLAB, I extract the approximate eigenvectors V from:

[U,S,V] = svd(A*B-B*A)

Is there a way to increase the accuracy, i.e. to minimize ‖(AB−BA)v‖ as much as possible?
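
One possible refinement, offered here only as a sketch (it is not from the original post): the SVD above already returns the unit vector minimizing ‖(AB−BA)v‖, so instead of pushing that residual further, you can test each eigenvalue pair (a_i, b_j) directly, since v is a common eigenvector iff the stacked matrix [A - a_i*I; B - b_j*I] annihilates it. The smallest singular value of that stacked matrix measures how close each pair comes:

% Hypothetical refinement: scan all eigenvalue pairs and keep the pair
% whose stacked matrix [A - a_i*I; B - b_j*I] has the smallest singular value.
ea = eig(A);  eb = eig(B);  n = size(A,1);
best = inf;
for i = 1:numel(ea)
    for j = 1:numel(eb)
        M = [A - ea(i)*eye(n); B - eb(j)*eye(n)];
        s = svd(M);                % singular values, in descending order
        if s(end) < best
            best = s(end);
            [~, ~, W] = svd(M);
            v = W(:, end);         % candidate common eigenvector
        end
    end
end
best                     % ~0 only if a true common eigenvector exists
norm((A*B - B*A)*v)      % commutator residual for the candidate v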

IMPORTANT REMARK: maybe some of you did not fully understand my goal.

Concerning the common basis of eigenvectors, I am looking for a combination (of vectors or matrices) of V1 and V2, or a direct use of the null-space operator on the two input Fisher matrices, to build this new basis "P" in which, with eigenvalues other than the known D1 and D2 (denoted D1a and D2a), we could have:

F = P (D1a+D2a) P^-1

To compute the new Fisher matrix F, I need to know P, assuming that D1a and D2a are respectively equal to the diagonal matrices D1 and D2 (coming from the diagonalization of the matrices A and B).

If I know the common basis of eigenvectors P, I could deduce D1a and D2a from D1 and D2, couldn't I?
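
For what it's worth, if P really is a common eigenbasis, then D1a and D2a follow directly from similarity transforms, and the off-diagonal residue tells you how far P is from diagonalizing both. A minimal sketch, assuming A, B and a candidate basis P are already in the workspace:

D1a = P \ (A*P);          % = P^-1*A*P, diagonal iff P diagonalizes A
D2a = P \ (B*P);          % = P^-1*B*P, diagonal iff P diagonalizes B
% Off-diagonal residue: ~0 means P is a genuine common eigenbasis
norm(D1a - diag(diag(D1a))) + norm(D2a - diag(diag(D2a)))
% Then the new Fisher matrix as stated above:
F = P * (diag(diag(D1a)) + diag(diag(D2a))) / P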

The two Fisher matrices are available at these links:

Matrix A

Matrix B



from How to find the common eigenvectors of two matrices with distinct eigenvalues
