How accurate is this two-pass approach in general? From my outsider's perspective, it always looked like most of the difficulty in implementing Lanczos was reorthogonalization, which would be hard to do with the two-pass algorithm.

Or is this mostly a problem when you actually want to calculate the eigenvectors themselves, and not just matrix functions?



That's an interesting question. I don't have too much experience, but here's my two cents.

For matrix function approximations, loss of orthogonality matters less than for eigenvalue computations. The three-term recurrence maintains local orthogonality reasonably well for moderate iteration counts. My experiments [1] show that orthogonality loss stays below $10^{-13}$ up to k=1000 for well-conditioned problems, and only becomes significant (jumping to $10^{-6}$ and higher) around k=700-800 for ill-conditioned spectra. Since you're evaluating $f(T_k)$ rather than extracting individual eigenpairs, you care about convergence of $\|f(A)b - x_k\|$, not spectral accuracy. If you need the eigenvectors themselves, or plan to run thousands of iterations, you need the full basis, and the two-pass method won't help. Maybe methods like [2] would be more suitable?
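For reference, here's roughly what the two-pass scheme looks like: pass 1 runs the three-term recurrence keeping only O(n) memory to build the tridiagonal $T_k$, then pass 2 regenerates the Lanczos vectors to assemble $x_k = \|b\|\, V_k f(T_k) e_1$. This is a minimal numpy sketch under my own naming, with no reorthogonalization and no breakdown handling, so don't treat it as production code:

```python
import numpy as np

def lanczos_pass(A, b, k, V_out=None):
    """One Lanczos pass: returns the tridiagonal coefficients (alpha, beta).

    If V_out is given, the Lanczos vectors are also stored column by column
    (that's the second pass of the two-pass scheme; pass 1 omits it)."""
    n = b.shape[0]
    alpha = np.zeros(k)
    beta = np.zeros(k - 1)
    v_prev = np.zeros(n)
    v = b / np.linalg.norm(b)
    for j in range(k):
        if V_out is not None:
            V_out[:, j] = v
        w = A @ v
        alpha[j] = v @ w
        # three-term recurrence: w = A v_j - alpha_j v_j - beta_{j-1} v_{j-1}
        w -= alpha[j] * v + (beta[j - 1] * v_prev if j > 0 else 0.0)
        if j < k - 1:
            beta[j] = np.linalg.norm(w)
            v_prev, v = v, w / beta[j]
    return alpha, beta

def two_pass_lanczos_f(A, b, k, f):
    """Approximate f(A) @ b without storing the Lanczos basis in pass 1."""
    alpha, beta = lanczos_pass(A, b, k)              # pass 1: O(n) memory
    T = np.diag(alpha) + np.diag(beta, 1) + np.diag(beta, -1)
    # f(T_k) e_1 via eigendecomposition of the small tridiagonal matrix
    w, U = np.linalg.eigh(T)
    fT_e1 = U @ (f(w) * U[0, :])
    V = np.empty((b.shape[0], k))
    lanczos_pass(A, b, k, V_out=V)                   # pass 2: regenerate basis
    return np.linalg.norm(b) * (V @ fT_e1)
```

The trick is that both passes run the identical recurrence in the identical floating-point order, so pass 2 reproduces the same vectors pass 1 discarded; the cost is a second set of matvecs in exchange for O(n) instead of O(nk) memory.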

[1] https://github.com/lukefleed/two-pass-lanczos/raw/master/tex...

[2] https://arxiv.org/abs/2403.04390



