Adds Base Likelihoods, Maximum Likelihood Example #218

Merged · 11 commits · Mar 8, 2024
5 changes: 3 additions & 2 deletions CHANGELOG.md
@@ -2,8 +2,9 @@

## Features

- [#218](https://github.com/pybop-team/PyBOP/pull/218) - Adds likelihood base class, `GaussianLogLikelihoodKnownSigma`, `GaussianLogLikelihood`, and the `ProbabilityCost` cost function, as well as a maximum likelihood estimation (MLE) example.
- [#215](https://github.com/pybop-team/PyBOP/pull/215) - Adds `release_workflow.md` and updates `release_action.yaml`.
- [#204](https://github.com/pybop-team/PyBOP/pull/204) - Splits integration, unit, examples, and plots tests; updates workflows. Adds pytest `--examples`, `--integration`, `--plots` args. Adds tests for coverage after removal of examples. Adds examples and integration nox sessions. Adds `pybop.RMSE._evaluateS1()` method.
- [#206](https://github.com/pybop-team/PyBOP/pull/206) - Adds Python 3.12 support with corresponding github actions changes.
- [#18](https://github.com/pybop-team/PyBOP/pull/18) - Adds geometric parameter fitting capability, via `model.rebuild()` with `model.rebuild_parameters`.
- [#203](https://github.com/pybop-team/PyBOP/pull/203) - Adds support for modern Python packaging via a `pyproject.toml` file and configures the `pytest` test runner and `ruff` linter to use their configurations stored as declarative metadata.
1 change: 1 addition & 0 deletions examples/scripts/spm_IRPropMin.py
@@ -17,6 +17,7 @@
),
]

# Generate data
sigma = 0.001
t_eval = np.arange(0, 900, 2)
values = model.predict(t_eval=t_eval)
63 changes: 63 additions & 0 deletions examples/scripts/spm_MLE.py
@@ -0,0 +1,63 @@
import pybop
import numpy as np

# Define model
parameter_set = pybop.ParameterSet.pybamm("Chen2020")
model = pybop.lithium_ion.SPM(parameter_set=parameter_set)

# Fitting parameters
parameters = [
pybop.Parameter(
"Negative electrode active material volume fraction",
prior=pybop.Gaussian(0.6, 0.05),
bounds=[0.5, 0.8],
),
pybop.Parameter(
"Positive electrode active material volume fraction",
prior=pybop.Gaussian(0.48, 0.05),
bounds=[0.4, 0.7],
),
]

# Generate data
sigma = 0.01
t_eval = np.arange(0, 900, 2)
values = model.predict(t_eval=t_eval)
corrupt_values = values["Voltage [V]"].data + np.random.normal(0, sigma, len(t_eval))

# Form dataset
dataset = pybop.Dataset(
{
"Time [s]": t_eval,
"Current function [A]": values["Current [A]"].data,
"Voltage [V]": corrupt_values,
}
)

# Generate problem, cost function, and optimisation class
problem = pybop.FittingProblem(model, parameters, dataset)
likelihood = pybop.GaussianLogLikelihoodKnownSigma(problem, sigma=sigma)
optim = pybop.Optimisation(likelihood, optimiser=pybop.CMAES)
optim.set_max_unchanged_iterations(20)
optim.set_min_iterations(20)
optim.set_max_iterations(100)

# Run the optimisation
x, final_cost = optim.run()
print("Estimated parameters:", x)

# Plot the timeseries output
pybop.quick_plot(x[0:2], likelihood, title="Optimised Comparison")

# Plot convergence
pybop.plot_convergence(optim)

# Plot the parameter traces
pybop.plot_parameters(optim)

# Plot the cost landscape
pybop.plot_cost2d(likelihood, steps=15)

# Plot the cost landscape with optimisation path and updated bounds
bounds = np.array([[0.55, 0.77], [0.48, 0.68]])
pybop.plot_cost2d(likelihood, optim=optim, bounds=bounds, steps=15)
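The script above maximises a Gaussian log-likelihood with a fixed, known sigma. The quantity being maximised can be sketched in plain NumPy as a sanity check (a standalone sketch with synthetic residuals and an illustrative helper name; no pybop dependency):

```python
import numpy as np

def gaussian_loglik_known_sigma(target, prediction, sigma):
    # Gaussian log-likelihood with a fixed noise level sigma:
    #   L = -n/2 * log(2*pi*sigma^2) - sum(e^2) / (2*sigma^2)
    e = target - prediction
    n = len(target)
    return -0.5 * n * np.log(2 * np.pi * sigma**2) - np.sum(e**2) / (2 * sigma**2)

# Synthetic "measurement": a constant signal plus Gaussian noise,
# mirroring the corrupt_values construction in the script
rng = np.random.default_rng(0)
sigma = 0.01
truth = np.full(450, 4.0)
target = truth + rng.normal(0, sigma, truth.size)

# The likelihood is higher for the true signal than for a biased one
ll_true = gaussian_loglik_known_sigma(target, truth, sigma)
ll_biased = gaussian_loglik_known_sigma(target, truth + 0.05, sigma)
```

This mirrors the `_offset`/`_multip` split in `GaussianLogLikelihoodKnownSigma` below: the first term depends only on `sigma`, so it can be precomputed once in the constructor.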
15 changes: 11 additions & 4 deletions pybop/__init__.py
@@ -23,6 +23,16 @@
# Absolute path to the pybop repo
script_path = path.dirname(__file__)

#
# Problem class
#
from ._problem import BaseProblem, FittingProblem, DesignProblem

#
# Likelihood classes
#
from ._likelihoods import BaseLikelihood, GaussianLogLikelihood, GaussianLogLikelihoodKnownSigma

#
# Cost function class
#
@@ -31,6 +41,7 @@
RootMeanSquaredError,
SumSquaredError,
ObserverCost,
ProbabilityCost,
)
from .costs.design_costs import (
DesignCost,
@@ -84,10 +95,6 @@
from .parameters.parameter_set import ParameterSet
from .parameters.priors import Gaussian, Uniform, Exponential

#
# Problem class
#
from ._problem import FittingProblem, DesignProblem

#
# Observer classes
173 changes: 173 additions & 0 deletions pybop/_likelihoods.py
@@ -0,0 +1,173 @@
import numpy as np


class BaseLikelihood:
"""
Base class for likelihoods
"""

def __init__(self, problem, sigma=None):
self.problem = problem
self._n_output = problem.n_outputs
self._n_times = problem.n_time_data
self.sigma0 = sigma if sigma is not None else np.zeros(self._n_output)
self.x0 = problem.x0
self.bounds = problem.bounds
self._n_parameters = problem.n_parameters
self._target = problem._target

def __call__(self, x):
"""
Calls the problem.evaluate method and calculates
the log-likelihood
"""
raise NotImplementedError

def set_sigma(self, sigma):
"""
Setter for sigma parameter
"""
if np.any(sigma <= 0):
raise ValueError("Sigma must be positive")
else:
self.sigma0 = sigma

def get_sigma(self):
"""
Getter for sigma parameter
"""
return self.sigma0

def get_n_parameters(self):
"""
Returns the number of parameters
"""
return self._n_parameters

@property
def n_parameters(self):
return self._n_parameters


class GaussianLogLikelihoodKnownSigma(BaseLikelihood):
"""
This class represents a Gaussian Log Likelihood with a known sigma,
which assumes that the data follows a Gaussian distribution and computes
the log-likelihood of observed data under this assumption.

Attributes:
_logpi (float): Precomputed offset value for the log-likelihood function.
"""

def __init__(self, problem, sigma=None):
super().__init__(problem, sigma=sigma)
if sigma is not None:
self.set_sigma(sigma)
self._offset = -0.5 * self._n_times * np.log(2 * np.pi * self.sigma0**2)
self._multip = -1 / (2.0 * self.sigma0**2)
self.sigma2 = self.sigma0**-2
self._dl = np.ones(self._n_parameters)

def __call__(self, x):
"""
Calls the problem.evaluate method and calculates
the log-likelihood
"""
e = self._target - self.problem.evaluate(x)
return np.sum(self._offset + self._multip * np.sum(e**2, axis=0))

def _evaluateS1(self, x):
"""
Calls the problem.evaluateS1 method and calculates
the log-likelihood
"""

y, dy = self.problem.evaluateS1(x)
if len(y) < len(self._target):
likelihood = -np.float64(np.inf)
dl = self._dl * np.ones(self._n_parameters)

else:
dy = dy.reshape(
(
self._n_times,
self._n_output,
self._n_parameters,
)
)
e = self._target - y
likelihood = np.sum(self._offset + self._multip * np.sum(e**2, axis=0))
dl = np.sum((self.sigma2 * np.sum((e.T * dy.T), axis=2)), axis=1)

return likelihood, dl


class GaussianLogLikelihood(BaseLikelihood):
"""
This class represents a Gaussian Log Likelihood, which assumes that the
data follows a Gaussian distribution and computes the log-likelihood of
observed data under this assumption.

Attributes:
_logpi (float): Precomputed offset value for the log-likelihood function.
"""

def __init__(self, problem):
super().__init__(problem)
self._logpi = -0.5 * self._n_times * np.log(2 * np.pi)
self._dl = np.ones(self._n_parameters + self._n_output)

def __call__(self, x):
"""
Evaluates the Gaussian log-likelihood for the given parameters.

Args:
x (array_like): The parameters for which to evaluate the log-likelihood.
The last `self._n_output` elements are assumed to be the
standard deviations of the Gaussian distributions.

Returns:
float: The log-likelihood value, or -inf if any standard deviation is non-positive.
"""
sigma = np.asarray(x[-self._n_output :])

if np.any(sigma <= 0):
return -np.inf

e = self._target - self.problem.evaluate(x[: -self._n_output])
return np.sum(
self._logpi
- self._n_times * np.log(sigma)
- np.sum(e**2, axis=0) / (2.0 * sigma**2)
)

def _evaluateS1(self, x):
"""
Calls the problem.evaluateS1 method and calculates
the log-likelihood
"""
sigma = np.asarray(x[-self._n_output :])

if np.any(sigma <= 0):
return -np.inf, self._dl

y, dy = self.problem.evaluateS1(x[: -self._n_output])
if len(y) < len(self._target):
likelihood = -np.float64(np.inf)
dl = self._dl

else:
dy = dy.reshape(
(
self._n_times,
self._n_output,
self._n_parameters,
)
)
e = self._target - y
likelihood = self.__call__(x)
dl = np.sum((sigma**-(2.0) * np.sum((e.T * dy.T), axis=2)), axis=1)

# Add sigma gradient to dl
dsigma = -self._n_times / sigma + sigma**-(3.0) * np.sum(e**2, axis=0)
dl = np.concatenate((dl, dsigma))

return likelihood, dl
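The `dsigma` term above is the analytic derivative of the Gaussian log-likelihood with respect to sigma. A quick finite-difference check (a standalone sketch with synthetic residuals and illustrative function names, not pybop code) confirms the formula `-n/sigma + sigma**-3 * sum(e**2)`:

```python
import numpy as np

def loglik(e, sigma):
    # Gaussian log-likelihood of residuals e with standard deviation sigma
    n = len(e)
    return -0.5 * n * np.log(2 * np.pi) - n * np.log(sigma) - np.sum(e**2) / (2 * sigma**2)

def dloglik_dsigma(e, sigma):
    # Analytic sigma gradient, matching the dsigma expression in the diff
    n = len(e)
    return -n / sigma + sigma**-3 * np.sum(e**2)

rng = np.random.default_rng(1)
e = rng.normal(0, 0.05, 200)  # synthetic residuals
sigma = 0.04                  # deliberately off the optimum, so the gradient is non-zero

# Central finite difference of loglik with respect to sigma
h = 1e-5
fd = (loglik(e, sigma + h) - loglik(e, sigma - h)) / (2 * h)
analytic = dloglik_dsigma(e, sigma)
```

The gradient vanishes at `sigma**2 = sum(e**2) / n`, which is exactly the maximum likelihood estimate of the noise variance.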
32 changes: 28 additions & 4 deletions pybop/_optimisation.py
@@ -40,24 +40,27 @@
def __init__(
self,
cost,
x0=None,
optimiser=None,
sigma0=None,
verbose=False,
physical_viability=True,
allow_infeasible_solutions=True,
):
self.cost = cost
self.x0 = x0
self.optimiser = optimiser
self.verbose = verbose
self.x0 = cost.x0
self.bounds = cost.bounds
self.sigma0 = sigma0 or cost.sigma0
self.n_parameters = cost.n_parameters
self.physical_viability = physical_viability
self.allow_infeasible_solutions = allow_infeasible_solutions
self.log = []

# Convert x0 to pints vector
# Catch x0, and convert to pints vector
if x0 is None:
self.x0 = cost.x0
self._x0 = pints.vector(self.x0)

# Set whether to allow infeasible locations
@@ -74,11 +77,11 @@
self._transformation = None

# Check if minimising or maximising
self._minimising = not isinstance(cost, pints.LogPDF)
self._minimising = not isinstance(cost, pybop.BaseLikelihood)
if self._minimising:
self._function = self.cost
else:
self._function = pints.ProbabilityBasedError(cost)
self._function = pybop.ProbabilityCost(cost)

del cost

# Construct Optimiser
@@ -122,6 +125,10 @@
self._max_iterations = None
self.set_max_iterations()

# Minimum iterations
self._min_iterations = None
self.set_min_iterations()

# Maximum unchanged iterations
self._unchanged_threshold = 1 # smallest significant f change
self._unchanged_max_iterations = None
@@ -289,6 +296,7 @@
halt = (
self._unchanged_max_iterations is not None
and unchanged_iterations >= self._unchanged_max_iterations
and iteration >= self._min_iterations
)
if running and halt:
running = False
@@ -448,6 +456,22 @@
raise ValueError("Maximum number of iterations cannot be negative.")
self._max_iterations = iterations

def set_min_iterations(self, iterations=2):
"""
Set the minimum number of iterations as a stopping criterion.

Parameters
----------
iterations : int, optional
The minimum number of iterations to run (default is 2).
Set to `None` to remove this stopping criterion.
"""
if iterations is not None:
iterations = int(iterations)
if iterations < 0:
raise ValueError("Minimum number of iterations cannot be negative.")
self._min_iterations = iterations

def set_max_unchanged_iterations(self, iterations=5, threshold=1e-5):
"""
Set the maximum number of iterations without significant change as a stopping criterion.
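The interplay between `set_min_iterations` and the unchanged-iterations criterion is worth spelling out: flat progress alone no longer stops the run until the minimum iteration count is also reached. A minimal standalone model of the halting test (the function name is illustrative; pybop's actual loop carries more state):

```python
def should_halt(iteration, unchanged_iterations,
                unchanged_max_iterations=20, min_iterations=20):
    # Halt only when the cost has been flat for unchanged_max_iterations
    # AND at least min_iterations have been performed.
    return (
        unchanged_max_iterations is not None
        and unchanged_iterations >= unchanged_max_iterations
        and iteration >= min_iterations
    )
```

With the MLE example's settings (`set_max_unchanged_iterations(20)`, `set_min_iterations(20)`), an optimiser whose cost flatlines immediately still runs at least 20 iterations before halting.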