Isensee_JCB2018: fix SBML parameter values #253

Status: Open (wants to merge 1 commit into base: master)

Conversation

dilpath (Collaborator) commented Nov 12, 2024

In #132, this problem was changed to remove model selection parameters (if they were not selected).

All model selection parameters in the SBML model were also set to 0. This changed the value of the xi_i_* model selection parameters, which led to a change in the objective function.

This PR reverses that change. AMICI previously expected a likelihood of -3.9494e+03. With the following code and this PR, I get a posterior of -3953.8289641759175 (without prior: -3949.3759644713145; prior-only: -4.452999704602744):

```python
import amici
import petab
import pypesto.petab

pp = petab.Problem.from_yaml("Isensee_JCB2018.yaml")
pi = pypesto.petab.PetabImporter(pp)
p = pi.create_problem()
p.objective._objectives[0].amici_solver.setAbsoluteTolerance(1e-8)
p.objective._objectives[0].amici_solver.setRelativeTolerance(1e-8)
p.objective._objectives[0].amici_model.setSteadyStateSensitivityMode(
    amici.SteadyStateSensitivityMode.integrationOnly
)
print(p.objective(pp.x_nominal_free_scaled, sensi_orders=(0, 1)))
```
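As a quick consistency check (my addition, not part of the original comment), the three reported numbers should satisfy log-posterior = log-likelihood + log-prior, and they do:

```python
# Sanity check on the values reported above (not part of the PR code).
log_likelihood = -3949.3759644713145  # "without prior" value
log_prior = -4.452999704602744        # "prior-only" value
log_posterior = -3953.8289641759175   # reported posterior

# The decomposition holds to floating-point precision.
assert abs((log_likelihood + log_prior) - log_posterior) < 1e-9
```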

However, it's unclear to me whether this PR is an improvement, because in d2d, the xi_i_* parameters are set to 0:
https://github.com/Data2Dynamics/d2d/blob/f3e55b8600529ee213eb53b3e2a1628d44210b24/arFramework3/Examples/Isensee_JCB2018/Setup.m#L84-L90

FFroehlich (Collaborator) commented Nov 12, 2024

Okay, so the reference value for the Isensee log-likelihood in the AMICI tests is (and always has been) -3949.375966548649; see https://github.com/AMICI-dev/AMICI/blob/master/tests/benchmark-models/benchmark_models.yaml. This value was checked against the reference value from the MATLAB benchmark collection (https://github.com/Benchmarking-Initiative/Benchmark-Models/blob/master/Benchmark-Models/Isensee_JCB2018/General_info.xlsx), which reports "2*log-likelihood" equal to 7950.84 (likely a log-posterior of -3975.42; still a small discrepancy, maybe due to a Bessel correction?) and chi2 equal to 691.13. It looks like I observed this discrepancy back then as well and only checked the chi2 level, which appeared to agree.
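For reference (my addition), the conversion from the spreadsheet's "2*log-likelihood" to the quoted log-posterior is just a sign flip and a division by two:

```python
# Arithmetic behind the numbers quoted above (not part of the comment).
two_times_ll = 7950.84      # "2*log-likelihood" from General_info.xlsx
log_posterior = -two_times_ll / 2
print(log_posterior)        # -3975.42
```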

The parameter table in the MATLAB benchmark collection also indicates that the xi_i_* parameters should be set to 0, but in the corresponding SBML files they are set to 1: https://github.com/Benchmarking-Initiative/Benchmark-Models/blob/master/Benchmark-Models/Isensee_JCB2018/SBML/model1_data100_l2v4.xml. I can't find any special note on this model in the original publication of the MATLAB benchmark.

I ran the following code to evaluate the chi2 values (it would have been a bit easier using AMICI directly, but alas):

```python
import amici
import benchmark_models_petab
import numpy as np
import pypesto.petab

pp = benchmark_models_petab.get_problem("Isensee_JCB2018")
pi = pypesto.petab.PetabImporter(pp)
p = pi.create_problem()
p.objective._objectives[0].amici_solver.setAbsoluteTolerance(1e-8)
p.objective._objectives[0].amici_solver.setRelativeTolerance(1e-8)
p.objective._objectives[0].amici_model.setSteadyStateSensitivityMode(
    amici.SteadyStateSensitivityMode.integrationOnly
)
p.objective._objectives[0].amici_model.setAddSigmaResiduals(True)
rs = p.objective(
    pp.x_nominal_free_scaled, sensi_orders=(0,), return_dict=True
)[pypesto.C.RDATAS]
print(sum(np.sum(np.square(r["res"][: r["y"].size])) for r in rs))
```

- with values set to 0 (`pip install git+https://github.com/Benchmarking-Initiative/Benchmark-Models-PEtab.git@master#subdirectory=src/python`): 5884.859699304807
- with values set to 1 (`pip install git+https://github.com/Benchmarking-Initiative/Benchmark-Models-PEtab.git@a68be27#subdirectory=src/python`): 686.7479370970433

Adding the AMICI-reported chi2 value for xi_i_* parameters set to 1 (686.7479370970433) to the value of the prior (4.452999704602744) puts us suspiciously close to the reported value of 691.13. To me, this suggests that the benchmark model was run (for whatever reason) with H1 and H3 enabled, and I would propose to keep it that way.
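Spelling out that addition (my annotation, not FFroehlich's code), the chi2-plus-prior sum lands at about 691.20, within roughly 0.07 of the MATLAB reference value of 691.13:

```python
# Checking the "suspiciously close" claim above (not part of the comment).
chi2_xi_one = 686.7479370970433  # AMICI chi2 with xi_i_* parameters set to 1
prior = 4.452999704602744        # prior contribution
total = chi2_xi_one + prior
print(total)                     # approximately 691.20, vs. reported 691.13
```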

dilpath (Collaborator, Author) commented Nov 12, 2024

Thanks for looking into this. I would be fine with changing the problem in this collection back to the H1+H3 model, since it's still valid for the benchmark collection. However, it might actually be the H1-only model when all xi_i_* parameters are 1...

I see that in the supplementary material of the original publication, file "Model_description.xlsx", sheet "Model Variants", it is specified that the "off" value for the xi_i_* parameters is 1 (see first image).

However, the MATLAB code "SetupFitting_PKA_cycle__batch.m" in the same supplementary material instead defaults to 0 (see second image)... I will ask @JanHasenauer.

[Image 1: "Model Variants" sheet from Model_description.xlsx]

[Image 2: excerpt from SetupFitting_PKA_cycle__batch.m]
