
Clean up code flow in doing and returning conventional parameter optimization #13

Open
AdityaSavara opened this issue Dec 5, 2019 · 1 comment
Labels: enhancement (New feature or request), good first issue (Good for newcomers), help wanted (Extra attention is needed)

Comments

AdityaSavara (Owner)

Clean up code flow in doing and returning conventional parameter optimization

We probably don't need to plot the conventional parameter optimization response surface; we can probably just return the final values from BPE versus CPE.

It would be better to also plot the CPE as a point on the same graph as the prior and posterior, as in the perspective, but that is not something we will support at this time.

AdityaSavara (Owner, Author) commented May 6, 2020:

Should include some form of weighted CPE with error bars returned:
https://stackoverflow.com/questions/21469620/how-to-do-linear-regression-taking-errorbars-into-account
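A minimal sketch of what the linked approach looks like in practice: `scipy.optimize.curve_fit` accepts the error bars through its `sigma` argument, and `absolute_sigma=True` makes the returned covariance reflect those error bars directly. The linear model and synthetic data here are purely illustrative, not taken from the project.

```python
import numpy as np
from scipy.optimize import curve_fit

def linear_model(x, slope, intercept):
    # Simple linear model; the project's model would replace this.
    return slope * x + intercept

# Illustrative synthetic data with 1-sigma error bars on y.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])
y_err = np.array([0.1, 0.2, 0.1, 0.3, 0.2])

# Weighted fit: sigma weights each residual by 1/y_err, and
# absolute_sigma=True treats y_err as absolute (not relative) errors
# so the parameter covariance is in the data's units.
popt, pcov = curve_fit(linear_model, x, y, sigma=y_err, absolute_sigma=True)
perr = np.sqrt(np.diag(pcov))  # 1-sigma uncertainties on (slope, intercept)
print("params:", popt, "uncertainties:", perr)
```

The `perr` values are the error bars that could be returned alongside the weighted CPE point estimates.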

Maybe also allow Levenberg-Marquardt through scipy.optimize.root.

https://docs.scipy.org/doc/scipy/reference/generated/scipy.optimize.root.html#scipy.optimize.root

https://mathoverflow.net/questions/257699/gauss-newton-vs-gradient-descent-vs-levenberg-marquadt-for-least-squared-method
"The Levenberg-Marquardt curve-fitting method is actually a combination of the two other minimization methods: the gradient descent method and the Gauss-Newton method. In the gradient descent method, the sum of the squared errors is reduced by updating the parameters in the steepest-descent direction. In the Gauss-Newton method, the sum of the squared errors is reduced by assuming the least squares function is locally quadratic, and finding the minimum of the quadratic. The Levenberg-Marquardt method acts more like a gradient-descent method when the parameters are far from their optimal value, and acts more like the Gauss-Newton method when the parameters are close to their optimal value."
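As a sketch of the scipy route mentioned above: `scipy.optimize.root` with `method='lm'` dispatches to MINPACK's Levenberg-Marquardt and accepts an over-determined residual vector (more residuals than parameters), so a fit can be phrased as root-finding on the residuals. The data and starting guess below are illustrative assumptions, not from the project.

```python
import numpy as np
from scipy.optimize import root

# Illustrative synthetic data (same form as a linear CPE problem).
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1])

def residuals(params):
    # Residual vector to be driven toward zero in a least-squares sense.
    slope, intercept = params
    return y - (slope * x + intercept)

# method='lm' uses Levenberg-Marquardt: gradient-descent-like far from
# the optimum, Gauss-Newton-like near it, per the quote above.
sol = root(residuals, x0=[1.0, 0.0], method='lm')
print("fitted parameters:", sol.x)
```

`scipy.optimize.least_squares(residuals, x0, method='lm')` is an equivalent, more fitting-oriented entry point to the same algorithm.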

AdityaSavara added the enhancement, good first issue, and help wanted labels Oct 15, 2020
AdityaSavara pushed a commit that referenced this issue Jul 6, 2022