
Writing loss and parameters in function reconstruct_points #18

Merged: 6 commits merged into main on Oct 5, 2023

Conversation

@ctroupin (Member) commented on Oct 5, 2023

This uses the code from the function reconstruct()

DINCAE.jl/src/model.jl

Lines 552 to 573 in 06bdb3f

if paramfile !== nothing
    NCDataset(paramfile,"c") do ds_
        defVar(ds_,"losses",losses,("epochs",))
        ds_.attrib["epochs"] = epochs
        ds_.attrib["batch_size"] = batch_size
        ds_.attrib["truth_uncertain"] = Int(truth_uncertain)
        ds_.attrib["enc_nfilter_internal"] = Vector{Int}(collect(enc_nfilter_internal))
        ds_.attrib["skipconnections"] = Vector{Int}(collect(skipconnections))
        ds_.attrib["clip_grad"] = clip_grad
        ds_.attrib["regularization_L1_beta"] = regularization_L1_beta
        ds_.attrib["regularization_L2_beta"] = regularization_L2_beta
        ds_.attrib["save_epochs"] = Vector{Int}(save_epochs)
        ds_.attrib["is3D"] = Int(is3D)
        ds_.attrib["upsampling_method"] = string(upsampling_method)
        ds_.attrib["ntime_win"] = ntime_win
        ds_.attrib["learning_rate"] = learning_rate
        ds_.attrib["learning_rate_decay_epoch"] = learning_rate_decay_epoch
        ds_.attrib["min_std_err"] = min_std_err
        ds_.attrib["loss_weights_refine"] = Vector{Float64}(collect(loss_weights_refine))
        ds_.attrib["cycle_periods"] = Vector{Float64}(collect(cycle_periods))
    end
end

so that reconstruct_points offers the same functionality.
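
For reference, a minimal, self-contained sketch of that pattern (the values below are hypothetical placeholders; the real function writes the full set of attributes shown in the snippet above):

using NCDatasets

# hypothetical stand-ins for the quantities available at the end of training
losses = Float64[0.9, 0.5, 0.3]   # one value per epoch
epochs = 3
batch_size = 32
paramfile = "parameters.nc"

NCDataset(paramfile, "c") do ds_
    # the "epochs" dimension is created from the length of losses
    defVar(ds_, "losses", losses, ("epochs",))
    # scalar hyper-parameters are stored as global attributes
    ds_.attrib["epochs"] = epochs
    ds_.attrib["batch_size"] = batch_size
end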

Comments:

1. Not all the parameters of reconstruct are used in reconstruct_points, and vice versa.
2. I had to change losses = [] to losses = Float64[], otherwise an error appeared (see the sketch after this list):
└ @ Main ~/CPR-DINCAE/src/run_DINCAE_testparams.jl:118
ERROR: LoadError: MethodError: no method matching fillvalue(::Type{Any})

Closest candidates are:
  fillvalue(::Union{NCDatasets.MFCFVariable{T}, NCDatasets.MFVariable{T}}) where T
   @ NCDatasets ~/.julia/packages/NCDatasets/st9Jz/src/multifile.jl:289
  fillvalue(::CommonDataModel.CFVariable)
   @ CommonDataModel ~/.julia/packages/CommonDataModel/RSBF3/src/cfvariable.jl:200
  fillvalue(::NCDatasets.Variable{NetCDFType, N}) where {NetCDFType, N}
   @ NCDatasets ~/.julia/packages/NCDatasets/st9Jz/src/variable.jl:255

probably because the type of losses was Vector{Any}.
3. A few parameters from reconstruct are not written in the file; I'll do that in another PR.

DINCAE.jl/src/model.jl

Lines 351 to 357 in 06bdb3f

cycle_periods = (365.25,), # days
output_ndims = 1,
direction_obs = nothing,
remove_mean = true,
paramfile = nothing,
laplacian_penalty = 0,
laplacian_error_penalty = laplacian_penalty,

4. Shall we consider writing the parameters in the same file as the results?
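
To make point 2 concrete, a small sketch (assuming the per-epoch losses are plain Float64 values):

# concrete element type: defVar can map Float64 to a NetCDF type and pick a fill value
losses = Float64[]
push!(losses, 0.123)   # hypothetical per-epoch loss value

# an untyped literal gives Vector{Any}, and writing it then fails with
# MethodError: no method matching fillvalue(::Type{Any})
# losses = []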

@ctroupin added the enhancement (New feature or request) label on Oct 5, 2023
@ctroupin linked an issue on Oct 5, 2023 that may be closed by this pull request
@Alexander-Barth (Member) commented on Oct 5, 2023

Thanks a lot, this is fine!
For point 4: in fact, it used to be the same file, but I changed this here (b7ed446) because I typically run a lot of tests, and when I need to clean up the files I prefer to just delete the reconstructions and keep the error statistics and the network configuration (so that I could repeat the reconstruction if necessary).

But we can cover both cases if the user sets paramfile = "results.nc" and we are careful not to erase the results with just the parameters :-)

ncmode = (paramfile in fnames_rec ? "a" : "c")   # append if paramfile is one of the result files
NCDataset(paramfile,ncmode) do ds_
    # ... write the losses and parameters as above ...
end

@ctroupin (Member, Author) commented on Oct 5, 2023

Yes, you're right about point 4. In fact, I'm now running with the values written in an extra file; it's really safer.

@ctroupin merged commit 65bee41 into main on Oct 5, 2023
Labels: enhancement (New feature or request)
Status: Done
Successfully merging this pull request may close these issues: Save loss vector when
2 participants