OpSum to TTN refactor #166
Conversation
…eeded data structures.
# MatElem
#

struct MatElem{T}
We could just use FillArrays.OneElement for this and get rid of this type: https://juliaarrays.github.io/FillArrays.jl/stable/#FillArrays.OneElement. Not needed for this PR; I added a note about it to #117.
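For reference, a minimal sketch of what FillArrays.OneElement provides (the values and sizes here are purely illustrative, not taken from this PR):

using FillArrays: OneElement

# A 3x4 array that is zero everywhere except entry (2, 3), which holds 5.0,
# without storing any of the zeros explicitly.
x = OneElement(5.0, (2, 3), (3, 4))
x[2, 3]  # 5.0
x[1, 1]  # 0.0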
Yes, this is for sure. Good to know.
# QNArrElem (sparse array with QNs) #
#####################################

struct QNArrElem{T,N}
Seems like this could also just be a FillArrays.OneElement. Added to #117.
I will have to think about what happens to the QN data, but I hope it can be a OneElement also.
# Tree adaptations of functionalities in ITensors.jl/src/physics/autompo/opsum_to_mpo.jl
#

function determine_val_type(terms)
Suggested change:
- function determine_val_type(terms)
+ function promote_coefficient_type(terms)
Also I think we can write this logic in a better way, for example:
function promote_coefficient_type(terms)
return mapreduce(t -> typeof(coefficient(t)), promote_type, terms)
end
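As a standalone illustration of that mapreduce/promote_type pattern (plain Julia, no ITensors types involved):

# Coefficients of mixed numeric types promote to the smallest common type.
coefficients = (1, 2.0, 3 + 0im)
mapreduce(typeof, promote_type, coefficients)  # ComplexF64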
Interestingly, this is a code improvement that reveals another possible bug, this time in OpSum or Ops. Benedikt noted this bug in a comment in the code too. Here's the bug:
using ITensors: OpSum, coefficient
let
terms = OpSum()
terms .+= 1.0, "Sz",1,"Sz",2
@show typeof(coefficient(first(terms)))
#
# Prints:
# typeof(coefficient(first(terms))) = ComplexF64
#
return
end

The typeof(coefficient(t)) for any term in an OpSum is ComplexF64 regardless of whether the original coefficient is real or not.
That's by design: OpSum() is defined as OpSum{ComplexF64}().
That was to allow backwards compatibility with the old OpSum/AutoMPO types (which were hardcoded to have ComplexF64 coefficients); otherwise:
terms = OpSum()
terms .+= 1.0 + 2.0im, "Sz", 1, "Sz", 2

would error.
If you want to check whether the coefficients are numerically real (i.e. Float64, or ComplexF64 with zero imaginary part), you would have to use a runtime check like isreal rather than checking the type, or constrain the OpSum to be real with OpSum{Float64}().
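A small sketch of those two options, assuming coefficient is importable from ITensors as in the example above:

using ITensors: OpSum, coefficient

# Option 1: runtime check that a stored (ComplexF64) coefficient is numerically real.
terms = OpSum()
terms .+= 1.0, "Sz", 1, "Sz", 2
isreal(coefficient(first(terms)))  # true

# Option 2: constrain the coefficient type up front so only real coefficients are allowed.
real_terms = OpSum{Float64}()
real_terms .+= 0.5, "Sz", 1, "Sz", 2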
Good to know – I had forgotten whether it was by design or not. I'll come up with a function similar to promote_coefficient_type above, then, using a more generic and modern style.
function svd_bond_coefs(
  coefficient_type, sites, ordered_verts, ordered_edges, inbond_coefs; kws...
)
  Vs = Dict(e => Dict{QN,Matrix{coefficient_type}}() for e in ordered_edges)
This could be a DataGraph with that data on the edges of the graph.
I also wonder if Dict{QN,Matrix{coefficient_type}} could be a block diagonal BlockSparseMatrix where those matrices are the diagonal blocks and the QNs are the sector labels of the graded axes.
Using DataGraphs could be really helpful. There are a lot of complicated dictionaries where the keys are things like edge=>QN pairs, which makes logical sense and is a carryover from the MPO code, but makes the code hard to read. So maybe a DataGraph with a dictionary having QN keys on each edge could work and would be much cleaner.
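A very rough sketch of that idea; the exact DataGraphs.jl constructor and edge-indexing forms below are assumptions for illustration, not code from this PR:

using Graphs: path_graph, edges
using DataGraphs: DataGraph
using ITensors: QN

# Hypothetical: hang a Dict{QN,Matrix{Float64}} off each edge of the graph,
# instead of keying one flat dictionary by composite edge => QN style keys.
g = path_graph(4)
Vs = DataGraph(g)
for e in edges(g)
  Vs[e] = Dict{QN,Matrix{Float64}}()
end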
Ok, so the status of this PR is that I've made all the changes you recommended. I have not done the following; these will be for future PRs:
Sounds like a reasonable place to leave things; I'll merge once tests pass.
Great. Yes, this PR was mostly just about splitting the functions. The excessive arguments ought to come down once the data structures and graph analysis patterns are better (in future PRs).
This PR refactors the ttn_svd function into three functions, make_sparse_ttn, svd_bond_coefs, and compress_ttn. Also, the calc_qn function and its associated cache have been turned into a "function object" called TermQN. Other changes include:
- opsum_to_ttn.jl
- moved the MatElem and QNArrElem types into separate files
- added ITensors.Ops functions to the using statement in opsum_to_ttn.jl
- src/apply.jl