very different SAEM results and other tutorial issues #29
It was very difficult for me to read this since you pasted your R session as plain output in the comment. I edited the comment so I could read it; I hope you don't mind. In the future, please use ```R on the first line and close with ``` on the last line to enclose R code. Also, I'm unsure exactly what your questions were; I think these are your questions:
- **Why are the likelihoods different between nlme and SAEM?** Each method makes its own likelihood approximation. The nlme approximation is not the same as the FOCEi approximation. The SAEM algorithm stochastically draws from the probability of the model and then maximizes that stochastic probability (see https://www.math.u-bordeaux.fr/MAS10/DOC/PDF-PRES/Lavielle.pdf). Therefore, the conventional wisdom is that likelihoods from two different methods should not be compared, because the approximations are different and the optimizations of the algorithms are different (see the sketch after this list). In fact, for nlme when using REML you cannot really compare two models at all (see https://stats.stackexchange.com/questions/116770/reml-or-ml-to-compare-two-mixed-effects-models-with-differing-fixed-effects-but).
- **Why are the parameter estimates different?** This is because of the different algorithms used to optimize the problem.
- **Why are the objects different?** This is mainly due to a CRAN limitation on running vignettes in a timely manner and having software that is available on the CRAN machines. I don't believe they have a recent python or sympy installation on those systems. The changing of a traditional
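To make the first point concrete, here is a minimal sketch (assuming the vignette's `uif.ode` model and `theo_sd` data are already defined, as they are later in this thread) of fitting the same model with both methods:

```r
# Fit the same model and data with two estimation methods.
fit.nlme <- nlmixr(uif.ode, theo_sd, est = "nlme")
fit.saem <- nlmixr(uif.ode, theo_sd, est = "saem")

# Each printed summary reports a log-likelihood, but each is computed under
# that method's own approximation, so the difference between the two values
# is not a likelihood-ratio statistic.
fit.nlme
fit.saem
```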
Hopefully this answers your questions.
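As a postscript on the formatting question: in the comment source, the fenced form looks like this (the inner line is just an illustration):

````
```r
fit <- nlmixr(uif, theo_sd, est = "saem")
```
````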
Hi Matt,

Thank you for poring over my question; I'm sorry it was hard to decipher. I don't mind in the least that you reworded it for clarity. Your understanding of my questions ("Why do the likelihoods differ between nlme and saem?") was close but not quite right. Below are my questions more clearly laid out:

(1) Why do the likelihoods differ so much between saem and nlme?

(2) Why do the saem likelihoods differ so much between the ODE-based model and the solved (closed-form) model?

Regarding question (1): I understand that different algorithms will give different answers for likelihoods, parameter estimates, and so on, but I am a bit surprised by just how different they are between SAEM, which is an approximate solution to an exact equation, and nlme, which is an "exact" solution (subject to floating-point rounding errors) to an approximate equation for what the model implies. Perhaps I shouldn't be so surprised by a 68% discrepancy, but I am. For such a simple problem I'd expect the discrepancy between algorithms to be maybe 10-15%, but from your example I guess I'm being naive, since calc.resid=TRUE showed both nlme and FOCEi values, the latter being better and closely agreeing with SAEM. If there is a correct answer to the question of what the likelihood is for a given model and data set combination (and there is, barring some strange pathological cases), then I'd expect it to be better approximated by a more or less decent approximation to the desired model than by a more exact answer to an almost-right model. As long as a correct value for the likelihood exists, the conventional wisdom about not comparing values from different algorithms rings hollow to me, because comparing such values would be a measure of algorithmic error when the correct answer is known to very good approximation; and if a highly accurate gold-standard algorithm is not available, then the situation becomes like statistically summarizing inter-rater variability, e.g., among subjective judges of figure skaters. The correct value does exist, so we should be able to say something about algorithmic errors.

Regarding question (2): I am even more surprised that, using ostensibly the same algorithm (which, yes, I realize uses stochastic approximation) and the same data set, one would get answers that differ by a factor of 2.6 depending on whether one is solving ODEs or using the ODE's solution directly. This isn't a difficult ODE, so why a 2.6-fold difference in answers? Using nlme with ODEs vs. the ODE solution, the difference in likelihoods is very small, -179.3253 versus -179.3246, which I can chalk up to minor round-off error (probably more so in the ODE than in the direct solution); but for the SAEM algorithm the values are -58.07448 versus -148.4241, which is so much bigger a difference that I cannot wave it away.

Also, some minor tangential questions:

(3) What is it about saem fitting with ODEs that's different from saem fitting with solved equations? When I tried the latter in a folder whose path contains spaces, it worked, but when trying the same with ODEs in the same folder, it didn't. (I didn't include the output from this, and for the purpose of my question I re-did things first in a folder without spaces; this was my first command in what I did show.) This seems somehow related to the C++ compiler, as this output shows:

```r
> getwd()
[1] "C:/Users/James/Documents"
> setwd("C:\\Users\\James\\Documents\\academic\\conceptual sciences")
> # SAEM fittings
>
> # For SAEM, it's nice to avoid printing each C++ iteration to the monitor.
> sink("saem output")
>
> fit.saem.lin <- nlmixr(uif.lin, theo_sd, est="saem") # works with spaces in path
Compiling SAEM user function...C:/RTools/3.4/mingw_64/bin/g++ -I"C:/R/R-34~1.3/include" -DNDEBUG -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/nlmixr/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/STANHE~1/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/Rcpp/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/RCPPAR~1/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/RCPPEI~1/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/BH/include -O2 -Wall -mtune=generic -c saemf68125d2995x64.cpp -o saemf68125d2995x64.o
C:/RTools/3.4/mingw_64/bin/g++ -shared -s -static-libgcc -o saemf68125d2995x64.dll tmp.def saemf68125d2995x64.o -LC:/R/R-34~1.3/bin/x64 -lRblas -LC:/R/R-34~1.3/bin/x64 -lRlapack -lgfortran -lm -lquadmath -LC:/R/R-34~1.3/bin/x64 -lR
done.
Using sympy via SnakeCharmR
C:/RTools/3.4/mingw_64/bin/gcc -I"C:/R/R-34~1.3/include" -DNDEBUG -O2 -Wall -std=gnu99 -mtune=generic -c rx_920cdb7f8c8c7b5e42b1a8db66863398_x64.c -o rx_920cdb7f8c8c7b5e42b1a8db66863398_x64.o
C:/RTools/3.4/mingw_64/bin/gcc -shared -s -static-libgcc -o rx_920cdb7f8c8c7b5e42b1a8db66863398_x64.dll tmp.def rx_920cdb7f8c8c7b5e42b1a8db66863398_x64.o -LC:/R/R-34~1.3/bin/x64 -lRblas -LC:/R/R-34~1.3/bin/x64 -lRlapack -lgfortran -lm -lquadmath -LC:/R/R-34~1.3/bin/x64 -lR
The model has been setup to calculate residuals.
It will be cached for future runs.
Diagonal form: sqrt
[,1]
[1,] 1
Calculate symbolic inverse...done
Calculate symbolic determinant of inverse...done
Calculate d(Omega)/d(Est) and d(Omega^-1)/d(Est)...
.0
...done.
Calculating Table Variables...
done
nlmixr UI combined dataset and properties
> # Now this shouldn't work, because there are spaces in the current path:
> # NB: The following won't work with spaces in file full file path, don't know why:
> fit.saem.ode <- nlmixr(uif.ode, theo_sd, est="saem") # fails with spaces in path
Compiling RxODE differential equations...C:/RTools/3.4/mingw_64/bin/gcc -I"C:/R/R-34~1.3/include" -DNDEBUG -O2 -Wall -std=gnu99 -mtune=generic -c rx_7cbddd1a680b22377f3ee375ef2ddda8_x64.c -o rx_7cbddd1a680b22377f3ee375ef2ddda8_x64.o
C:/RTools/3.4/mingw_64/bin/gcc -shared -s -static-libgcc -o rx_7cbddd1a680b22377f3ee375ef2ddda8_x64.dll tmp.def rx_7cbddd1a680b22377f3ee375ef2ddda8_x64.o -LC:/R/R-34~1.3/bin/x64 -lRblas -LC:/R/R-34~1.3/bin/x64 -lRlapack -lgfortran -lm -lquadmath -LC:/R/R-34~1.3/bin/x64 -lR
done.
C:/RTools/3.4/mingw_64/bin/g++ -I"C:/R/R-34~1.3/include" -DNDEBUG -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/nlmixr/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/STANHE~1/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/Rcpp/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/RCPPAR~1/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/RCPPEI~1/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/BH/include -O2 -Wall -mtune=generic -c saemf68340822c6x64.cpp -o saemf68340822c6x64.o
C:/RTools/3.4/mingw_64/bin/g++ -shared -s -static-libgcc -o saemf68340822c6x64.dll tmp.def saemf68340822c6x64.o C:/Users/James/Documents/academic/conceptual sciences/rx_7cbddd1a680b22377f3ee375ef2ddda8_x64.dll -LC:/R/R-34~1.3/bin/x64 -lRblas -LC:/R/R-34~1.3/bin/x64 -lRlapack -lgfortran -lm -lquadmath -LC:/R/R-34~1.3/bin/x64 -lR
g++.exe: error: C:/Users/James/Documents/academic/conceptual: No such file or directory
g++.exe: error: sciences/rx_7cbddd1a680b22377f3ee375ef2ddda8_x64.dll: No such file or directory
Error in inDL(x, as.logical(local), as.logical(now), ...) :
unable to load shared object 'C:/Users/James/Documents/academic/conceptual sciences/saemf68340822c6x64.dll':
LoadLibrary failure: The specified module could not be found.
> sink()
> # NB: The following won't work with spaces in file full file path, don't know why:
> fit.saem.ode <- nlmixr(uif.ode, theo_sd, est="saem") # fails with spaces in path
Compiling RxODE differential equations...done.
C:/RTools/3.4/mingw_64/bin/g++ -I"C:/R/R-34~1.3/include" -DNDEBUG -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/nlmixr/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/STANHE~1/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/Rcpp/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/RCPPAR~1/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/RCPPEI~1/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/BH/include -O2 -Wall -mtune=generic -c saemf6832983e6bx64.cpp -o saemf6832983e6bx64.o
C:/RTools/3.4/mingw_64/bin/g++ -shared -s -static-libgcc -o saemf6832983e6bx64.dll tmp.def saemf6832983e6bx64.o C:/Users/James/Documents/academic/conceptual sciences/rx_7cbddd1a680b22377f3ee375ef2ddda8_x64.dll -LC:/R/R-34~1.3/bin/x64 -lRblas -LC:/R/R-34~1.3/bin/x64 -lRlapack -lgfortran -lm -lquadmath -LC:/R/R-34~1.3/bin/x64 -lR
g++.exe: error: C:/Users/James/Documents/academic/conceptual: No such file or directory
g++.exe: error: sciences/rx_7cbddd1a680b22377f3ee375ef2ddda8_x64.dll: No such file or directory
Error in inDL(x, as.logical(local), as.logical(now), ...) :
unable to load shared object 'C:/Users/James/Documents/academic/conceptual sciences/saemf6832983e6bx64.dll':
LoadLibrary failure: The specified module could not be found.
> getwd()
[1] "C:/Users/James/Documents/academic/conceptual sciences"
> # Now to a directory without spaces.
> setwd("C:\\Users\\James\\Documents\\pharmacometrics\\practice\\nlmixr")
> fit.saem.ode <- nlmixr(uif.ode, theo_sd, est="saem")
Compiling RxODE differential equations...done.
C:/RTools/3.4/mingw_64/bin/g++ -I"C:/R/R-34~1.3/include" -DNDEBUG -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/nlmixr/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/STANHE~1/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/Rcpp/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/RCPPAR~1/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/RCPPEI~1/include -IC:/Users/James/DOCUME~1/R/WIN-LI~1/3.4/BH/include -O2 -Wall -mtune=generic -c saemf682e2637edx64.cpp -o saemf682e2637edx64.o
C:/RTools/3.4/mingw_64/bin/g++ -shared -s -static-libgcc -o saemf682e2637edx64.dll tmp.def saemf682e2637edx64.o C:/Users/James/Documents/pharmacometrics/practice/nlmixr/rx_7cbddd1a680b22377f3ee375ef2ddda8_x64.dll -LC:/R/R-34~1.3/bin/x64 -lRblas -LC:/R/R-34~1.3/bin/x64 -lRlapack -lgfortran -lm -lquadmath -LC:/R/R-34~1.3/bin/x64 -lR
done.
1: -3.2788 0.1920 -0.6129 1.9000 0.9500 0.9500 9.8654
2: -3.3025 0.3758 -0.6708 1.8050 0.9025 0.9025 4.3859
... lots of similar lines ...
498: -3.2179 0.4576 -0.7816 0.0719 0.4173 0.0182 0.4813
499: -3.2179 0.4576 -0.7816 0.0719 0.4173 0.0182 0.4812
500: -3.2179 0.4576 -0.7816 0.0719 0.4175 0.0182 0.4811
Diagonal form: sqrt
[,1]
[1,] 1
Calculate symbolic inverse...done
Calculate symbolic determinant of inverse...done
Calculate d(Omega)/d(Est) and d(Omega^-1)/d(Est)...
.0
...done.
Calculating Table Variables...
done
nlmixr UI combined dataset and properties
$ par.hist : Parameter history (if available)
$ par.hist.stacked : Parameter history in stacked form for easy plotting (if available)
$ par.fixed : Fixed Effect Parameter Table
$ eta : Individual Parameter Estimates
$ seed : Seed (if applicable)
$ model.name : Model name (from R function)
$ data.name : Name of R data input
```

(4) Why do the small objects `curi` and `x` appear? Object `curi` appeared with value 0 after executing `fit.saem.lin <- nlmixr(uif.lin, theo_sd, est="saem")`. Object `x` is a simple data frame that appears after calling `fit.saem.ode` or `fit.saem.lin`. (I was wrong in writing that it appeared after calling `fit.saem.ode` only.)

Finally, a new question based on your answer. You wrote, "The 95% confidence intervals between each of the two model's parameters overlap, so they are not statistically different," but in what you pasted (pasted again below), I only see one set of 95% CIs, not two. Does `$par.fixed` pertain to FOCEi-based or to nlme-based values? If I had to guess, I'd guess FOCEi, since it's more accurate.

```r
Parameters ($par.fixed):
Estimate SE CV Untransformed (95%CI)
tka 0.445 0.198 44.4% 1.56 (1.06, 2.30)
tcl -3.21 0.0845 2.6% 0.0403 (0.0341, 0.0475)
tv -0.786 0.0463 5.9% 0.456 (0.416, 0.499)
add.err 0.692 0.692
```

Thanks again.
To compare the SAEM ODE and nlme ODE fits side by side, I have:

```r
> fit
nlmixr SAEM fit (ODE)
OBJF AIC BIC Log-likelihood
116.1369 130.1369 150.3165 -58.06845
Time (sec; $time):
saem setup Likelihood Calculation covariance table
elapsed 31.75 21.55 0.55 0 1.07
Parameters ($par.fixed):
Estimate SE CV Untransformed (95%CI)
tka 0.455 0.197 43.3% 1.58 (1.07, 2.32)
tcl -3.22 0.0802 2.5% 0.0401 (0.0343, 0.0469)
tv -0.784 0.0431 5.5% 0.457 (0.420, 0.497)
add.err 0.691 0.691
Omega ($omega):
eta.ka eta.cl eta.v
eta.ka 0.4367946 0.00000000 0.00000000
eta.cl 0.0000000 0.06929573 0.00000000
eta.v  0.0000000 0.00000000 0.01842363
Fit Data (object is a modified data.frame):
# A tibble: 132 x 15
ID TIME DV IPRED PRED IRES RES IWRES WRES CWRES CPRED
<fct> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 1 0 0.740 0 0 0.740 0.740 1.07 1.07 1.07 0
2 1 0.250 2.84 3.87 2.83 -1.03 0.00529 -1.50 0.00308 0.0761 2.67
3 1 0.570 6.57 6.80 5.07 -0.230 1.50 -0.332 0.659 0.614 4.84
# ... with 129 more rows, and 4 more variables: CRES <dbl>, eta.ka <dbl>,
# eta.cl <dbl>, eta.v <dbl>
> fit.nlme
nlmixr nlme fit by maximum likelihood (ODE)
FOCEi-based goodness of fit metrics:
OBJF AIC BIC Log-likelihood
116.051 130.051 150.2306 -58.0255
nlme-based goodness of fit metrics:
AIC BIC Log-likelihood
372.6507 392.8304 -179.3254
Time (sec; $time):
nlme setup FOCEi Evaulate covariance table
elapsed 6.82 0.81 0.24 0 1.18
Parameters ($par.fixed):
Estimate SE CV Untransformed (95%CI)
tka 0.445 0.198 44.4% 1.56 (1.06, 2.30)
tcl -3.21 0.0845 2.6% 0.0403 (0.0341, 0.0475)
tv -0.786 0.0463 5.9% 0.456 (0.416, 0.499)
add.err 0.692 0.692
Omega ($omega):
eta.ka eta.cl eta.v
eta.ka 0.4126161 0.00000000 0.00000000
eta.cl 0.0000000 0.06994126 0.00000000
eta.v 0.0000000 0.00000000 0.01810577
Fit Data (object is a modified data.frame):
# A tibble: 132 x 15
ID TIME DV IPRED PRED IRES RES IWRES WRES CWRES CPRED CRES
<fct> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 1 0 0.740 0 0 0.740 0.740 1.07 1.07 1.07 0 0.740
2 1 0.250 2.84 3.86 2.82 -1.02 0.0220 -1.47 0.0131 0.0861 2.65 0.185
3 1 0.570 6.57 6.78 5.05 -0.207 1.52 -0.299 0.684 0.637 4.82 1.75
# ... with 129 more rows, and 3 more variables: eta.ka <dbl>, eta.cl <dbl>,
# eta.v <dbl>
```

I personally don't see any big differences between the two models. Note that the standard errors are used to calculate the confidence intervals. These come from nlme and saem directly, not from FOCEi. The only thing FOCEi does is calculate possibly comparable objective functions.
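As a quick arithmetic check of that (my own, assuming the usual normal approximation on the estimation scale), the untransformed 95% CI in `$par.fixed` is just `exp(estimate ± 1.96·SE)`:

```r
# Reproduce the tka row of $par.fixed by hand: tka is estimated on the log
# scale, so the CI is back-transformed with exp().
est <- 0.445   # tka estimate from the nlme ODE fit above
se  <- 0.198   # its standard error
exp(est + c(-1, 0, 1) * qnorm(0.975) * se)
#> [1] 1.059 1.561 2.300   # matches the reported 1.56 (1.06, 2.30)
```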
When looking at the solved systems side by side, I get:

```r
> fit
nlmixr nlme fit by maximum likelihood (Solved)
FOCEi-based goodness of fit metrics:
OBJF AIC BIC Log-likelihood
296.5985 310.5985 330.7781 -148.2992
nlme-based goodness of fit metrics:
AIC BIC Log-likelihood
372.6488 392.8285 -179.3244
Time (sec; $time):
nlme setup FOCEi Evaulate covariance table
elapsed 2.99 1.56 2.1 0 0.85
Parameters ($par.fixed):
Estimate SE CV Untransformed (95%CI)
tka 0.445 0.198 44.4% 1.56 (1.06, 2.30)
tcl -3.21 0.0845 2.6% 0.0403 (0.0341, 0.0475)
tv -0.786 0.0463 5.9% 0.456 (0.416, 0.499)
add.err 0.692 0.692
Omega ($omega):
eta.ka eta.cl eta.v
eta.ka 0.412483 0.0000000 0.0000000
eta.cl 0.000000 0.0699386 0.0000000
eta.v 0.000000 0.0000000 0.0181094
Fit Data (object is a modified data.frame):
# A tibble: 132 x 15
ID TIME DV IPRED PRED IRES RES IWRES WRES CWRES CPRED
<fct> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 1 0 0.740 0 0 0.740 0.740 1.07 0.0000207 -1.00 48128
2 1 0.250 2.84 3.86 2.82 -1.02 0.0218 -1.47 0.0316 -1.47 3.86
3 1 0.570 6.57 6.78 5.05 -0.207 1.52 -0.299 2.19 -0.299 6.78
# ... with 129 more rows, and 4 more variables: CRES <dbl>, eta.ka <dbl>,
# eta.cl <dbl>, eta.v <dbl>
> fit.saem
nlmixr SAEM fit (Solved)
OBJF AIC BIC Log-likelihood
296.8482 310.8482 331.0278 -148.4241
Time (sec; $time):
saem setup Likelihood Calculation covariance table
elapsed 21.04 1.58 0.36 0 1.17
Parameters ($par.fixed):
Estimate SE CV Untransformed (95%CI)
tka 0.450 0.194 43.0% 1.57 (1.07, 2.29)
tcl -3.22 0.0816 2.5% 0.0401 (0.0342, 0.0471)
tv -0.784 0.0435 5.6% 0.457 (0.419, 0.497)
add.err 0.692 0.692
Omega ($omega):
eta.ka eta.cl eta.v
eta.ka 0.4187747 0.0000000 0.00000000
eta.cl 0.0000000 0.0696184 0.00000000
eta.v 0.0000000 0.0000000 0.01819676
Fit Data (object is a modified data.frame):
# A tibble: 132 x 15
ID TIME DV IPRED PRED IRES RES IWRES WRES CWRES CPRED
<fct> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl> <dbl>
1 1 0 0.740 0 0 0.740 0.740 1.07 0.0000206 -0.999 48098
2 1 0.250 2.84 3.85 2.82 -1.01 0.0178 -1.46 0.0257 -1.46 3.85
3 1 0.570 6.57 6.76 5.06 -0.191 1.51 -0.276 2.19 -0.276 6.76
# ... with 129 more rows, and 4 more variables: CRES <dbl>, eta.ka <dbl>,
# eta.cl <dbl>, eta.v <dbl>
```
There is no difference between the solved systems and their results. The difference arises between the solved and ODE objective functions (though the parameters themselves are quite similar). This is because of the increased accuracy of the solved systems. It also shows that objective-function differences may not correspond to substantially different models. As for the fairness of likelihood measures between estimation methods, I think the only way the comparison would be fair is if the likelihood were exact. Otherwise the biases in the likelihood approximations may favor different models under different methods. However, comparing likelihoods from the same method is OK if you believe the assumptions and approximations of the method are similar enough between the models.
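As a sketch of such a within-method comparison (the nested models `uif.full` and `uif.reduced` are hypothetical, and I am assuming the OBJF printed in the fit summary is exposed as `$objective`, so treat that accessor as an assumption):

```r
# Compare two NESTED models fit with the same method and settings via a
# likelihood-ratio test on the difference in OBJF (-2 * log-likelihood).
fit.full    <- nlmixr(uif.full,    theo_sd, est = "saem")  # hypothetical full model
fit.reduced <- nlmixr(uif.reduced, theo_sd, est = "saem")  # hypothetical nested model
dOBJF <- fit.reduced$objective - fit.full$objective        # assumed OBJF accessor
pchisq(dOBJF, df = 1, lower.tail = FALSE)  # df = number of parameters dropped
```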
As for the difference between the solved and ODE systems: the ODE case compiles RxODE models, and there is a bug there that doesn't allow spaces in the path.
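Until that is fixed, a simple guard (my own workaround; the directory name is hypothetical) is to check the working directory before fitting an ODE-based model:

```r
# RxODE compilation can fail when the working directory contains spaces,
# so move to a space-free path before fitting.
if (grepl(" ", getwd(), fixed = TRUE)) {
  setwd("C:/temp/nlmixr-work")  # hypothetical directory without spaces
}
fit.saem.ode <- nlmixr(uif.ode, theo_sd, est = "saem")
```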
You can test whether spaces are now allowed in RxODE-style models for SAEM, and whether the global environment is still polluted, by updating to the latest nlmixr on GitHub: `install_github("nlmixrdevelopment/nlmixr")`.
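A minimal sketch of that update (assuming `devtools` is available; the `remotes` package provides the same function):

```r
# Install the development version of nlmixr from GitHub.
install.packages("devtools")                          # if not already installed
devtools::install_github("nlmixrdevelopment/nlmixr")
```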
Thank you, Matt! For a closed topic, there has sure been a lot of work, questions, and comments. I'm happy to have helped reveal some bugs, and I think there's still at least one bug remaining from all this. With the fits laid side by side for the nlme vs. saem comparisons, I agree that the differences are very minor, but I'm still amazed by a 2.6-fold difference in likelihood between the ODE-based and explicit solutions. I downloaded the latest (today's) nlmixr from GitHub, re-ran the (more explicit) tutorial, and got this:

```r
fit.nlme.lin <- nlmixr(uif.lin, theo_sd, est="nlme")
fit.nlme.lin.crF <- nlmixr(uif.lin, theo_sd, est="nlme", calc.resid=FALSE)
fit.nlme.ode <- nlmixr(uif.ode, theo_sd, est="nlme")
fit.nlme.ode.crF <- nlmixr(uif.ode, theo_sd, est="nlme", calc.resid=FALSE)
fit.saem.lin <- nlmixr(uif.lin, theo_sd, est="saem")
fit.saem.ode <- nlmixr(uif.ode, theo_sd, est="saem")
# Now compare fits for explicit (i.e., linCmt() ) model vs. ODE, and by algorithms:
fit.nlme.lin # Log-likelihood: -1527.988 (FOCEi); -179.3246 (nlme)
fit.nlme.lin.crF # Log-likelihood: -179.3246
fit.nlme.ode # Log-likelihood: -58.02594 (FOCEi); -179.3253 (nlme)
fit.nlme.ode.crF # Log-likelihood: -179.3253
fit.saem.lin # Log-likelihood: -148.4241
fit.saem.ode # Log-likelihood: -58.07448
```

Where did the -1527.988 FOCEi value come from for `fit.nlme.lin`? I re-ran it in the same session and got the value you got earlier:

```r
fit.nlme.lin # Log-likelihood: -148.2993 (FOCEi); -179.3246 (nlme)
```

So this anomaly wasn't repeatable. I attach both the edited (for clarity) and unedited (for completeness) R session outputs in case you want to take a look. Also, I tested saem with the ODE-based solution in a path with spaces, and that worked, so that bug is fixed. I then repeated this tutorial with a clean slate (no .Rhistory in the folder, no history commands in RStudio, a fresh session with no C files or DLL files in the folder). This time the -1527.988 anomaly did not appear; I got these values, which I take as "final":

```r
fit.nlme.lin # Log-likelihood: -148.2993 (FOCEi); -179.3246 (nlme)
fit.nlme.lin.crF # Log-likelihood: -179.3246
fit.nlme.ode # Log-likelihood: -58.02594 (FOCEi); -179.3253 (nlme)
fit.nlme.ode.crF # Log-likelihood: -179.3253
fit.saem.lin # Log-likelihood: -148.4241
fit.saem.ode # Log-likelihood: -58.07448
```

So, using nlme gives ~ -179, and using the default calc.resid=TRUE option gives a FOCEi value that agrees with SAEM depending on whether the exact solution was used (~ -148) or the ODE form of the model was used (~ -58). This exercise has been very informative! Finally, there is still a little bug in that the nuisance, cluttering object `x` still appeared after running this:

```r
# NLME fittings
fit.nlme.lin <- nlmixr(uif.lin, theo_sd, est="nlme")
# The cluttering object x has now appeared!
x
```

tutorial - edited session 2017-02-06.txt
I cannot reproduce the "x" in the output.

```r
uif <- function() {
  ini({
    tka <- .5
    tcl <- -3.2
    tv <- -1
    eta.ka ~ 1
    eta.cl ~ 2
    eta.v ~ 1
    add.err <- 0.1
  })
  model({
    ka <- exp(tka + eta.ka)
    cl <- exp(tcl + eta.cl)
    v <- exp(tv + eta.v)
    linCmt() ~ add(add.err)
  })
}
fit <- nlmixr(uif, theo_sd, est="nlme")
```

After running, I get:

```r
> ls()
[1] "fit" "uif" |
The tutorial (https://github.com/nlmixrdevelopment/nlmixr/blob/master/vignettes/running_nlmixr.Rmd) has code for compiling a model for a closed-form solution using linCmt() and for an ODE solution, and also shows how one might use the nlme and SAEM algorithms. With the single-dose theophylline data at hand, this gives four possibilities. I followed along with the tutorial but made a few changes for clarity and comparison, and found that the nlme results for the closed form and for the ODE agree very well (but not exactly), whereas the SAEM objective function values, AIC, BIC, and log-likelihood are very different for the explicit versus the ODE models and do not agree with the nlme results either. The SAEM results seem to be way off. Below are the code and results (and, by the way, is there a good way to make all of the following pasted text appear in a uniform typewriter font?):
tutorial.txt
tutorial output.txt