
Add lu decomposition for Tensors #94

Merged
10 commits merged into develop from feature/implement-lu on Nov 13, 2023
Conversation

jofrevalles
Member

Summary

This PR adds a new lu function for tensors, extending LinearAlgebra.lu (resolves #26). The new lu function returns the LU decomposition of a Tensor, where the original tensor can be recovered by contracting the permutation tensor P, the tensor L, and the tensor U. The tensors L and U are reshaped versions of the lower and upper triangular matrices obtained during the matrix decomposition, respectively.

This implementation is inspired by the LU decomposition in the SciPy library: it returns the permutation tensor P, allowing the original tensor A to be recovered with the contraction A = P * L * U. This contrasts with LinearAlgebra, where a permutation vector p is returned and the factorization satisfies P' * A = L * U (where P' is the permutation matrix built from p).
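The difference between the two permutation conventions can be checked side by side with a small Python sketch, using scipy.linalg.lu and NumPy in place of Tenet purely for illustration:

```python
import numpy as np
from scipy.linalg import lu

rng = np.random.default_rng(0)
A = rng.random((4, 4))

# SciPy convention: lu returns P, L, U such that A = P @ L @ U,
# so the original matrix is recovered by a plain product.
P, L, U = lu(A)
assert np.allclose(A, P @ L @ U)

# LinearAlgebra.jl-style convention: the factorization satisfies
# P' @ A = L @ U, where P' is the transpose (inverse) of the
# permutation matrix P.
assert np.allclose(P.T @ A, L @ U)
```

Both conventions carry the same information; returning P directly just makes reconstruction a single contraction.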

Please let me know if there are any concerns or issues with extending the LinearAlgebra library in this manner.

We have also added tests for this new function.

Example

A usage example of the lu function:

julia> using Tenet; using LinearAlgebra; using Test

julia> tensor = Tensor(rand(4, 4, 4), (:i, :j, :k))
4×4×4 Tensor{Float64, 3, Array{Float64, 3}}:
...

julia> P, L, U = lu(tensor, left_inds = labels(tensor)[1:2])
...

julia> @test contract(contract(P, L), U) ≈ tensor
Test Passed
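For readers without Tenet at hand, the reshaping described in the summary can be mimicked with NumPy and SciPy. This is a sketch of the general technique (fuse the left indices, decompose the resulting matrix, reshape back), not Tenet's actual implementation:

```python
import numpy as np
from scipy.linalg import lu

# A rank-3 "tensor" with indices (i, j, k); take (i, j) as left indices.
rng = np.random.default_rng(1)
A = rng.random((4, 4, 4))

# Matricize: fuse the left indices (i, j) into rows, keep k as columns.
M = A.reshape(16, 4)

# LU of the matrix: M = P @ L @ U (SciPy convention).
# P is 16x16 and acts on the fused row index; L is 16x4, U is 4x4.
P, L, U = lu(M)

# Reshape L back so it carries the left indices plus a bond index;
# U keeps the bond index and the right index k.
bond = L.shape[1]
L_tensor = L.reshape(4, 4, bond)

# Contracting P, L, U and splitting the fused index recovers A.
recovered = (P @ L @ U).reshape(4, 4, 4)
assert np.allclose(recovered, A)
```

In Tenet the fusing and splitting are handled by the index labels (left_inds above), so the user never touches the matricized form directly.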

@codecov

codecov bot commented Sep 18, 2023

Codecov Report

Attention: 1 line in your changes is missing coverage. Please review.

Comparing base (5b3e11b) at 87.71% to head (525f14e) at 88.18%.
The report is 1 commit behind head on develop.

Additional details and impacted files
@@             Coverage Diff             @@
##           develop      #94      +/-   ##
===========================================
+ Coverage    87.71%   88.18%   +0.46%     
===========================================
  Files           10       10              
  Lines          570      584      +14     
===========================================
+ Hits           500      515      +15     
+ Misses          70       69       -1     
Files                    Coverage            Δ
src/Tensor.jl            88.99% <100.00%>    (ø)
src/Transformations.jl   98.41% <100.00%>    (ø)
src/Numerics.jl          92.20% <97.72%>     (+3.31%) ⬆️


@mofeing mofeing linked an issue Nov 12, 2023 that may be closed by this pull request
@mofeing
Member

mofeing commented Nov 12, 2023

@jofrevalles I've updated the code to the recent changes. Also, I've refactored the SVD and QR factorizations. Would you mind giving a quick review?

@jofrevalles
Member Author

> @jofrevalles I've updated the code to the recent changes. Also, I've refactored the SVD and QR factorizations. Would you mind giving a quick review?

I have checked it and everything looks good to me!

@mofeing mofeing merged commit dd9b6d3 into develop Nov 13, 2023
5 checks passed
@mofeing mofeing deleted the feature/implement-lu branch November 13, 2023 08:44
@mofeing mofeing restored the feature/implement-lu branch September 28, 2024 20:44
Successfully merging this pull request may close these issues.

Implement LU factorization for Tensor
2 participants