BTW, @rth (and anyone else who thinks about matrices; I don't think about them that much, admittedly), I've been wondering if we could use an LU decomposition of the kriging matrix to speed up the solution. Any thoughts? I haven't thought about this much, but since the LHS matrix is the same (only the RHS vector changes), we might be able to leverage this for the looping backend...
Would it be faster to get rid of the matrix inversion and use an LU decomposition instead?
In the cases where linalg.solve is used repeatedly inside a loop, could we precompute the LU decomposition outside of the loop and solve a simpler system inside the loop?
Both sound interesting and would be worth exploring (starting from the pure Python implementations, which are easier to change). In any case we need benchmarks to see how it would impact performance and memory use (cf. PR #36).
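For the second point, a minimal sketch of what that could look like with SciPy (not PyKrige's actual backend; the matrix `a` and the right-hand sides `b_all` are placeholder names): factor the fixed LHS once with `scipy.linalg.lu_factor`, then each solve inside the loop reduces to two cheap triangular solves via `scipy.linalg.lu_solve`.

```python
# Hedged sketch: reuse one LU factorization across many right-hand sides.
# `a` and `b_all` are illustrative stand-ins for the kriging LHS matrix
# and the per-point RHS vectors, not PyKrige's actual arrays.
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(0)
n, n_points = 50, 1000

a = rng.standard_normal((n, n))             # fixed LHS matrix
b_all = rng.standard_normal((n_points, n))  # one RHS vector per prediction point

# Factor once: the O(n^3) work is done a single time ...
lu, piv = lu_factor(a)

# ... so each solve inside the loop costs only O(n^2).
solutions = np.empty_like(b_all)
for i, b in enumerate(b_all):
    solutions[i] = lu_solve((lu, piv), b)

# Equivalent to, but much faster than, np.linalg.solve(a, b) per point:
assert np.allclose(solutions[0], np.linalg.solve(a, b_all[0]))
```

This is exactly the saving the looping backend could exploit: the expensive factorization is amortized over all prediction points, and only the back-substitution runs per point.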
rth changed the title from *Optimization of loop kriging backend with LU matrix decomposition* to *Performance optimization with LU matrix decomposition* on Jan 14, 2017
rth changed the title from *Performance optimization with LU matrix decomposition* to *Performance optimization with LU decomposition* on Jan 14, 2017
We should use the properties of the kriging matrix to optimize this.
The kriging equations can always be reformulated to use the covariance matrix instead of the semivariogram matrix (as done now). Then this part of the matrix is symmetric and positive definite (if we use a valid covariance model), which could be tackled with a Cholesky decomposition. The full inversion could then be computed by block matrix inversion, as stated here: #52 (comment)
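A hedged sketch of that idea, assuming the ordinary kriging system is written with a covariance matrix `C` (symmetric positive definite for a valid covariance model) bordered by a Lagrange-multiplier row for the unbiasedness constraint; the names `C` and `c_all` are illustrative, not PyKrige's internals. `C` is Cholesky-factored once, and the bordered system is solved per point by block elimination, which is the block matrix inversion mentioned above (the Schur complement of `C` is the scalar `-1^T C^{-1} 1`):

```python
# Hedged sketch of Cholesky + block elimination for ordinary kriging,
# assuming the covariance formulation:
#
#     [ C    1 ] [ w  ]   [ c ]
#     [ 1^T  0 ] [ mu ] = [ 1 ]
#
# Names and shapes are illustrative assumptions, not PyKrige's API.
import numpy as np
from scipy.linalg import cho_factor, cho_solve

rng = np.random.default_rng(0)
n, n_points = 50, 1000

# SPD stand-in for the covariance matrix of the sample points.
m = rng.standard_normal((n, n))
C = m @ m.T + n * np.eye(n)
c_all = rng.standard_normal((n_points, n))  # covariance vectors, one per point

ones = np.ones(n)
cf = cho_factor(C)        # Cholesky factorization, done once
s = cho_solve(cf, ones)   # C^{-1} 1, reused for every point
denom = ones @ s          # 1^T C^{-1} 1 (negative of the Schur complement)

weights = np.empty_like(c_all)
for i, c in enumerate(c_all):
    x = cho_solve(cf, c)               # C^{-1} c
    mu = (ones @ x - 1.0) / denom      # Lagrange multiplier via block elimination
    weights[i] = x - mu * s            # kriging weights w

# Check against a direct solve of the full (n+1) x (n+1) bordered system:
A = np.block([[C, ones[:, None]], [ones[None, :], np.zeros((1, 1))]])
full = np.linalg.solve(A, np.append(c_all[0], 1.0))
assert np.allclose(weights[0], full[:n])
```

Compared with the LU variant above, the Cholesky factorization does roughly half the work and is numerically stable for SPD matrices, which is part of the appeal of switching from the semivariogram to the covariance formulation.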
Making this comment from @bsmurphy a separate issue.