How to efficiently get eigenvector decomposition with scipy or lapack?
I want to find the eigendecomposition of a dense complex matrix,

A = V diag(λ) V^-1

Only a few of the eigenvectors are needed to reconstruct the matrix adequately for my purposes, although I need to determine which ones to include by some non-trivial filtering on the eigenvalues. A singular value decomposition is not appropriate here, because the eigenvalues and eigenvectors are the physically meaningful results I want to work with.
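For concreteness, the kind of reconstruction I mean looks roughly like this; the magnitude-based filter is only a stand-in for my actual selection criterion:

```python
import numpy as np
import scipy.linalg as la

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# Right eigenvectors are the columns of V.
lam, V = la.eig(A)

# Stand-in filter: keep the three largest-magnitude eigenvalues.
keep = np.argsort(-np.abs(lam))[:3]

# Reconstruction from the kept eigenpairs. Here V^-1 comes from an
# explicit inverse, which is exactly the step I want to avoid.
Vinv = la.inv(V)
A_approx = V[:, keep] @ np.diag(lam[keep]) @ Vinv[keep, :]
```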
I have a convenient wrapper around scipy.linalg.eig, which calls LAPACK's ZGEEV routine and can return the left eigenvectors U alongside the right eigenvectors V. V^-1 should be available from an appropriately scaled version of the left eigenvectors, and I expected that route to be both more efficient and more stable than inverting the matrix of right eigenvectors. However, when I compare the two methods, the reconstruction built from the left eigenvectors is much less accurate than the one using the inverted right eigenvectors; presumably this is a round-off error or a normalization error on my side.
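A minimal sketch of the comparison, using a random test matrix rather than my real one:

```python
import numpy as np
import scipy.linalg as la

rng = np.random.default_rng(1)
n = 100
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

# left=True makes eig return the left eigenvectors as well
# (columns of U), computed by ZGEEV in a single call.
lam, U, V = la.eig(A, left=True, right=True)
D = np.diag(lam)

# Reconstruction via an explicit inverse of V: accurate, but an
# extra O(n^3) step that I would have to repeat many times.
err_inv = la.norm(V @ D @ la.inv(V) - A)

# Reconstruction using the conjugate transpose of U in place of
# V^-1: this is the variant that comes out much less accurate.
err_left = la.norm(V @ D @ U.conj().T - A)

print(err_inv, err_left)
```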
I would prefer not to invert the matrix of right eigenvectors, because the matrix is potentially large (on the order of 1000s by 1000s) and I will need to repeat this operation several times. Is there anything available in standard scipy (or in LAPACK, or some other routine that I could wrap myself) that computes this decomposition efficiently and accurately?

I have now realized what was going wrong. It is possible to get V^-1 from the left eigenvectors U; the trick is that the rows of U^H have to be normalized against the corresponding columns of V. I had been using U unscaled, which is wrong, but close enough that the result looks almost right. So V^-1 can indeed be obtained from U; in my implementation it was simply a bug.
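A sketch of the corrected scaling, again on a random test matrix (this assumes distinct eigenvalues, so that the left and right eigenvectors are biorthogonal):

```python
import numpy as np
import scipy.linalg as la

rng = np.random.default_rng(2)
n = 100
A = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))

lam, U, V = la.eig(A, left=True, right=True)

# ZGEEV returns eigenvectors scaled to unit Euclidean norm, so
# U^H V is diagonal but not the identity. Dividing each row of
# U^H by the corresponding diagonal entry u_i^H v_i fixes that.
d = np.sum(U.conj() * V, axis=0)      # d[i] = u_i^H v_i
Vinv = U.conj().T / d[:, None]

# Both residuals should now be near machine precision.
print(la.norm(Vinv @ V - np.eye(n)))
print(la.norm(V @ np.diag(lam) @ Vinv - A))
```

With this, obtaining V^-1 from U costs only O(n^2) work beyond the eigendecomposition itself, instead of another O(n^3) inversion.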