
OpSum to TTN refactor #166

Merged: 19 commits into ITensor:main, May 10, 2024
Conversation

emstoudenmire (Contributor):

This PR refactors the ttn_svd function into three functions: make_sparse_ttn, svd_bond_coefs, and compress_ttn.

Also, the calc_qn function and its associated cache have been turned into a "function object" called TermQN (the general pattern is sketched below).
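For readers unfamiliar with the pattern, a "function object" here means a callable struct that carries its own cache, replacing a free function plus a separately managed cache variable. A minimal generic sketch of the idea (purely illustrative; TermQN's actual fields, key types, and QN computation differ):

struct CachedCompute{K,V}
  compute::Function
  cache::Dict{K,V}
end

CachedCompute{K,V}(f) where {K,V} = CachedCompute{K,V}(f, Dict{K,V}())

# Calling the object consults the cache before recomputing.
(c::CachedCompute)(key) = get!(() -> c.compute(key), c.cache, key)

qn_of = CachedCompute{String,Int}(length)  # `length` stands in for a real QN computation
qn_of("Sz")  # computed once, then cached
qn_of("Sz")  # served from the cache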

Other changes include:

  • remove some unnecessary helper data structures in opsum_to_ttn.jl
  • move code for MatElem and QNArrElem types into separate files
  • add more ITensors.Ops functions to the using statement in opsum_to_ttn.jl
  • fix a missing keyword argument in src/apply.jl

src/ITensorNetworks.jl: resolved review thread (outdated)
# MatElem
#

struct MatElem{T}
mtfishman (Member):

We could just use FillArrays.OneElement for this and get rid of this type: https://juliaarrays.github.io/FillArrays.jl/stable/#FillArrays.OneElement. Not needed for this PR; I added a note about it to #117.
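For reference, a minimal standalone example of OneElement per the linked FillArrays docs (an editorial illustration, not code from this PR):

using FillArrays: OneElement

# A 3×4 matrix that is zero except for a 2.0 at index (2, 3),
# analogous to the data a MatElem{Float64} represents.
A = OneElement(2.0, (2, 3), (3, 4))
A[2, 3]  # 2.0
A[1, 1]  # 0.0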

emstoudenmire (Contributor, Author):

Yes, for sure. Good to know.

# QNArrElem (sparse array with QNs) #
#####################################

struct QNArrElem{T,N}
mtfishman (Member) commented May 3, 2024:

Seems like this could also just be a FillArrays.OneElement. Added to #117.
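The same constructor generalizes to N dimensions, so the dense part of QNArrElem{T,N} would map over naturally; the QN labels are the open question, as the reply below notes. A small sketch (editorial illustration, same assumed docs API as above):

using FillArrays: OneElement

# A 2×2×2 array that is zero except for a 1.0 at index (1, 2, 1),
# similar in shape to the data in a QNArrElem{Float64,3}, minus the QNs.
B = OneElement(1.0, (1, 2, 1), (2, 2, 2))
B[1, 2, 1]  # 1.0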

emstoudenmire (Contributor, Author):

I will have to think about what happens to the QN data, but I hope it can be a OneElement also.

@@ -32,6 +29,13 @@ end
# Tree adaptations of functionalities in ITensors.jl/src/physics/autompo/opsum_to_mpo.jl
#

function determine_val_type(terms)
mtfishman (Member):

Suggested change:
- function determine_val_type(terms)
+ function promote_coefficient_type(terms)

Also I think we can write this logic in a better way, for example:

function promote_coefficient_type(terms)
  return mapreduce(t -> typeof(coefficient(t)), promote_type, terms)
end
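As a standalone illustration of the mapreduce/promote_type idiom (plain Julia, independent of OpSum):

coeffs = (1, 2.0, 3.0im)
mapreduce(typeof, promote_type, coeffs)  # ComplexF64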

emstoudenmire (Contributor, Author):

Interestingly, this is a code improvement that reveals another possible bug, this time in OpSum or Ops. Benedikt noted this bug in a comment in the code too. Here's the bug:

using ITensors: OpSum
using ITensors.Ops: coefficient  # assuming coefficient comes from the Ops submodule
let
  terms = OpSum()
  terms .+= 1.0, "Sz", 1, "Sz", 2
  @show typeof(coefficient(first(terms)))
  #
  # Prints:
  #   typeof(coefficient(first(terms))) = ComplexF64
  #
  return
end

The typeof(coefficient(t)) for any term in an OpSum is ComplexF64, regardless of whether the original coefficient is real or not.

mtfishman (Member) commented May 4, 2024:

That's by design: OpSum() is defined as OpSum{ComplexF64}().

That was to allow backwards compatibility with the old OpSum/AutoMPO types (which were hardcoded to have ComplexF64 coefficients); otherwise:

  terms = OpSum()
  terms .+= 1.0 + 2.0im, "Sz",1,"Sz",2

would error.

If you want to check whether the coefficients are numerically real (i.e. Float64, or ComplexF64 with zero imaginary part), you would have to use a runtime check like isreal rather than checking the type, or constrain the OpSum to be real with OpSum{Float64}().
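A minimal sketch of both options (an editorial illustration reusing the snippet above; the imports are assumed to work as in that snippet):

using ITensors: OpSum
using ITensors.Ops: coefficient

terms = OpSum()  # defaults to OpSum{ComplexF64}
terms .+= 1.0, "Sz", 1, "Sz", 2
c = coefficient(first(terms))
typeof(c)  # ComplexF64, by design
isreal(c)  # true: numerically real despite the complex storage type

real_terms = OpSum{Float64}()  # constrain the coefficients to be real
real_terms .+= 1.0, "Sz", 1, "Sz", 2
typeof(coefficient(first(real_terms)))  # Float64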

emstoudenmire (Contributor, Author):

Good to know; I had forgotten whether it was by design or not. I'll come up with a function similar to promote_coefficient_type above, then, written in a more generic and modern style (a possible sketch follows).
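One possible shape for such a function (a hypothetical sketch, not the code that was merged): promote the coefficient types as suggested above, then narrow to the corresponding real type when every coefficient is numerically real.

using ITensors.Ops: coefficient

function coefficient_type(terms)
  T = mapreduce(t -> typeof(coefficient(t)), promote_type, terms)
  # Narrow e.g. ComplexF64 to Float64 when no coefficient has an imaginary part.
  return all(t -> isreal(coefficient(t)), terms) ? real(T) : T
end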

function svd_bond_coefs(
  coefficient_type, sites, ordered_verts, ordered_edges, inbond_coefs; kws...
)
  Vs = Dict(e => Dict{QN,Matrix{coefficient_type}}() for e in ordered_edges)
mtfishman (Member):

This could be a DataGraph with that data on the edges of the graph.

I also wonder if Dict{QN,Matrix{coefficient_type}} could be a block diagonal BlockSparseMatrix where those matrices are the diagonal blocks and the QNs are the sector labels of the graded axes.

emstoudenmire (Contributor, Author) commented May 6, 2024:

Using DataGraphs could be really helpful. There are a lot of complicated dictionaries whose keys are things like edge => QN pairs; that makes logical sense and is a carryover from the MPO code, but it makes the code hard to read. So a DataGraph with a dictionary having QN keys on each edge could work and would be much cleaner, as illustrated below.
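A plain-Dict sketch of the two layouts being compared, with toy stand-ins for edges, QNs, and coefficient matrices (no DataGraphs API is used here; (edge, QN) tuples stand in for the edge => QN pair keys):

# Current, MPO-derived layout: one flat Dict keyed by (edge, QN) pairs.
flat = Dict(
  ("v1" => "v2", 0) => [1.0 0.0; 0.0 1.0],
  ("v1" => "v2", 1) => [2.0 0.0; 0.0 2.0],
)

# Proposed layout: per-edge data (what a DataGraph could attach to each edge),
# each entry a Dict keyed only by the QN.
nested = Dict(
  ("v1" => "v2") => Dict(0 => [1.0 0.0; 0.0 1.0], 1 => [2.0 0.0; 0.0 2.0]),
)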

emstoudenmire (Contributor, Author) commented May 10, 2024:

Ok so the status of this PR is that I've made all the changes you recommended.

I have not done the following; these will be for future PRs:

  • have not replaced MatElem with FillArrays.OneElement (turned out to be a little complicated to do, for reasons I can explain)
  • have not started using SparseArraysDOK yet (but that was for later anyway)
  • have not started using DataGraphs yet either (also pending more planning and design considerations for future PRs)

mtfishman (Member):

Sounds like a reasonable place to leave things; I'll merge once tests pass.

emstoudenmire (Contributor, Author):

Great. Yes, this PR was mostly just about splitting up the functions. The excessive argument lists ought to come down once the data structures and graph-analysis patterns are better (in future PRs).

mtfishman merged commit 7ecd931 into ITensor:main on May 10, 2024 (5 of 6 checks passed).
mtfishman changed the title from "OpSum to TTN Refactor" to "OpSum to TTN refactor" on May 10, 2024.