
[NDTensors] [BUG] denseblocks on DiagBlockSparse does not conserve QNs #1619

Closed
AFeuerpfeil opened this issue Feb 20, 2025 · 4 comments · Fixed by #1621 or #1622
Labels
bug: Something isn't working
NDTensors: Requires changes to the NDTensors.jl library.

Comments

@AFeuerpfeil

Description of bug

When calling denseblocks on a DiagBlockSparse tensor, the output is a tensor that in general does not conserve QNs and therefore has no well-defined flux.

Minimal code demonstrating the bug or unexpected behavior

julia> using ITensors
julia> i = Index([QN() => 2])
julia> j = Index([QN(("N", 0)) => 1, QN(("N", 1)) => 1])
julia> T = delta(dag(i), j)
julia> x = random_itensor(i)
julia> @show norm(x * T - x * denseblocks(T))

Expected output or behavior

The code should output 0.

Actual output or behavior

The output, however, is nonzero.
The reason is that T conserves the QNs, whereas denseblocks(T) does not, as can be seen from its block structure. ITensors nevertheless still assigns a flux to denseblocks(T) and to x*denseblocks(T), even though their flux is not well defined.

I suspect the problem is related to how NDTensors stores a DiagBlockSparse tensor internally, as it only stores a single diagonal element. When performing denseblocks, it probably does not check which flux was used in the constructor delta([::Type{ElT} = Float64, ][flux::QN = QN(), ]is).

julia> @show norm(x*T-x*denseblocks(T))
1.0470682860833809
julia> @show flux(T)
QN("N",0)
julia> @show flux(denseblocks(T))
QN("N",0)
julia> @show flux(x*denseblocks(T))
QN("N",0)
julia> @show ITensors.matrix(T)
[1.0 0.0; 0.0 1.0]
julia> @show ITensors.matrix(denseblocks(T))
[1.0 0.0; 0.0 1.0]
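
A quick way to inspect this directly is to look at the flux of each nonzero block (a sketch, assuming the nzblocks and flux(::ITensor, ::Block) helpers from ITensors):

julia> D = denseblocks(T)
julia> for b in nzblocks(D)
           # With the bug, the stored blocks carry different fluxes,
           # so D has no single well-defined flux.
           @show b, flux(D, b)
       end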

Version information

  • Output from versioninfo():
julia> versioninfo()
Julia Version 1.11.3
Commit d63adeda50d (2025-01-21 19:42 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: 12 × 13th Gen Intel(R) Core(TM) i7-1360P
  WORD_SIZE: 64
  LLVM: libLLVM-16.0.6 (ORCJIT, goldmont)
Threads: 12 default, 0 interactive, 6 GC (on 12 virtual cores)
Environment:
  JULIA_EDITOR = code
  JULIA_NUM_THREADS = 12
  JULIA_PKG_SERVER = https://juliahub.com/
  • Output from using Pkg; Pkg.status("ITensors"):
julia> using Pkg; Pkg.status("ITensors")
Status `~/.julia/environments/v1.11/Project.toml`
  [9136182c] ITensors v0.8.0
AFeuerpfeil added the bug and NDTensors labels Feb 20, 2025
@mtfishman
Member

Interesting, thanks for the report, we'll look into it.

@mtfishman
Member

Should be fixed now if you update to the latest version of NDTensors.jl.
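
For reference, updating in the package manager should pull in the patched release (assuming it has already been registered):

julia> using Pkg; Pkg.update("NDTensors")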

@AFeuerpfeil
Author

Thanks, that fixed the problem. The only inconsistency I still observed is in the norm function for the same example as above:

julia> @show norm(T)
1.4142135623730951
julia> @show norm(denseblocks(T))
1.0

The norm of denseblocks(T) is correct, while norm(T) gives the result corresponding to a tensor with all ones on the diagonal.
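
For reference, the two values match this quick arithmetic check (assuming the delta stores two diagonal ones, while only the QN-allowed entry should contribute):

julia> sqrt(1.0^2 + 1.0^2)   # both diagonal entries counted, matches norm(T)
1.4142135623730951
julia> sqrt(1.0^2)           # only the flux-conserving entry, matches norm(denseblocks(T))
1.0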

mtfishman reopened this Feb 21, 2025
@mtfishman
Member

Ah that's too bad, I'll look into that one too...
