5/19 meeting #379

Merged: 4 commits, May 24, 2020
4 changes: 2 additions & 2 deletions bet/calculateP/__init__.py
@@ -6,7 +6,7 @@

* :mod:`~bet.calculateP.calculateP` provides methods for approximating probability densities in the measure-theoretic framework.
* :mod:`~bet.calculateP.simpleFunP` provides methods for creating simple function approximations of probability densities for the measure-theoretic framework.
* :mod:`~bet.calculateP.dataConsistent` provides methods for data-consistent stochastic inversion.
* :mod:`~bet.calculateP.calculateR` provides methods for data-consistent stochastic inversion.
* :mod:`~bet.calculateP.calculateError` provides methods for approximating numerical and sampling errors.
"""
__all__ = ['calculateP', 'simpleFunP', 'calculateError', 'dataConsistent']
__all__ = ['calculateP', 'simpleFunP', 'calculateError', 'calculateR']
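
Since this commit renames the module, downstream code that imported the old name needs a one-line update. A minimal sketch of the change (the alias name is illustrative; only the rename itself is taken from the diff above):

# Before this PR the data-consistent routines lived in bet.calculateP.dataConsistent:
# from bet.calculateP import dataConsistent as dci
# After this PR the same routines are imported from the renamed module:
from bet.calculateP import calculateR as dci  # ratio-based, data-consistent inversion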
42 changes: 25 additions & 17 deletions bet/calculateP/dataConsistent.py → bet/calculateP/calculateR.py
@@ -1,14 +1,14 @@
# Copyright (C) 2014-2020 The BET Development Team

r"""
This module contains functions for data-consistent stochastic inversion.
This module contains functions for data-consistent stochastic inversion based on ratios of densities.

* :meth:`~bet.calculateP.dataConsistent.generate_output_kdes` generates KDEs on output sets.
* :meth:`~bet.calculateP.dataConsistent.invert_to_kde` solves SIP for weighted KDEs.
* :meth:`~bet.calculateP.dataConsistent.invert_to_gmm` solves SIP for a Gaussian Mixture Model.
* :meth:`~bet.calculateP.dataConsistent.invert_to_multivariate_gaussian` solves SIP for a multivariate Gaussian.
* :meth:`~bet.calculateP.dataConsistent.invert_to_random_variable` solves SIP for random variables.
* :meth:`~bet.calculateP.dataConsistent.invert_rejection_sampling` solves SIP with rejection sampling.
* :meth:`~bet.calculateP.calculateR.generate_output_kdes` generates KDEs on output sets.
* :meth:`~bet.calculateP.calculateR.invert_to_kde` solves SIP for weighted KDEs.
* :meth:`~bet.calculateP.calculateR.invert_to_gmm` solves SIP for a Gaussian Mixture Model.
* :meth:`~bet.calculateP.calculateR.invert_to_multivariate_gaussian` solves SIP for a multivariate Gaussian.
* :meth:`~bet.calculateP.calculateR.invert_to_random_variable` solves SIP for random variables.
* :meth:`~bet.calculateP.calculateR.invert_rejection_sampling` solves SIP with rejection sampling.

"""
import bet.sample
@@ -33,13 +33,13 @@ def generate_output_kdes(discretization, bw_method=None):
discretization.local_to_global()

predict_set = discretization.get_output_sample_set()
obs_set = discretization.get_output_probability_set()
obs_set = discretization.get_output_observed_set()
if predict_set.get_region() is None or obs_set.get_region() is None:
predict_set.set_region(np.array([0] * predict_set.check_num()))
obs_set.set_region(np.array([0] * obs_set.check_num()))

if predict_set.get_cluster_maps() is None:
num_clusters = int(max(np.max(predict_set.get_region()), np.max(obs_set.get_region())) + 1)
num_clusters = int(np.max(predict_set.get_region()) + 1)
else:
num_clusters = len(predict_set.get_cluster_maps())

@@ -80,7 +80,7 @@ def generate_output_kdes(discretization, bw_method=None):
obs_kdes.append(gaussian_kde(obs_set.get_values()[obs_pointer].T, bw_method=bw_method))
else:
obs_kdes.append(None)
return predict_set, predict_kdes, obs_set, obs_kdes, num_clusters
return predict_kdes, obs_kdes, num_clusters


def invert(discretization, bw_method = None):
@@ -96,7 +96,8 @@ def invert(discretization, bw_method = None):
:return: marginal probabilities and cluster weights
:rtype: list, `np.ndarray`
"""
predict_set, predict_kdes, obs_set, obs_kdes, num_clusters = generate_output_kdes(discretization, bw_method)
predict_kdes, obs_kdes, num_clusters = generate_output_kdes(discretization, bw_method)
predict_set = discretization.get_output_sample_set()

rs = []
r = []
@@ -136,10 +137,12 @@ def invert_to_kde(discretization, bw_method = None):
"""
from scipy.stats import gaussian_kde

predict_set, predict_kdes, obs_set, obs_kdes, num_clusters = generate_output_kdes(discretization, bw_method)
predict_kdes, obs_kdes, num_clusters = generate_output_kdes(discretization, bw_method)

rs, r, lam_ptr = invert(discretization, bw_method)

obs_set = discretization.get_output_observed_set()

# Compute marginal probabilities for each parameter and initial condition.
param_marginals = []
cluster_weights = []
@@ -176,8 +179,7 @@ def invert_rejection_sampling(discretization, bw_method=None):
:return: sample set containing samples
:rtype: :class:`bet.sample.sample_set`
"""
predict_set, predict_kdes, obs_set, obs_kdes, num_clusters = generate_output_kdes(discretization,
bw_method=bw_method)
predict_kdes, obs_kdes, num_clusters = generate_output_kdes(discretization, bw_method)

rs, r, lam_ptr = invert(discretization, bw_method)

@@ -233,10 +235,12 @@ def weighted_mean_and_cov(x, weights):
cov1 = cov1 / sum_weights
return mean1, cov1

predict_set, predict_kdes, obs_set, obs_kdes, num_clusters = generate_output_kdes(discretization, bw_method)
predict_kdes, obs_kdes, num_clusters = generate_output_kdes(discretization, bw_method)

rs, r, lam_ptr = invert(discretization, bw_method)

obs_set = discretization.get_output_observed_set()

# Compute multivariate normal for each cluster
means = []
covariances = []
@@ -289,10 +293,12 @@ def weighted_mean_and_cov(x, weights):
cov1 = cov1 / sum_weights
return mean1, cov1

predict_set, predict_kdes, obs_set, obs_kdes, num_clusters = generate_output_kdes(discretization, bw_method)
predict_kdes, obs_kdes, num_clusters = generate_output_kdes(discretization, bw_method)

rs, r, lam_ptr = invert(discretization, bw_method)

obs_set = discretization.get_output_observed_set()

# Compute multivariate normal
cluster_weights = []
num_obs = obs_set.check_num()
@@ -358,10 +364,12 @@ def invert_to_random_variable(discretization, rv, num_reweighted=10000, bw_method=None):
else:
raise bet.sample.wrong_input("rv must be a string, list, or tuple.")

predict_set, predict_kdes, obs_set, obs_kdes, num_clusters = generate_output_kdes(discretization, bw_method)
predict_kdes, obs_kdes, num_clusters = generate_output_kdes(discretization, bw_method)

rs, r, lam_ptr = invert(discretization, bw_method)

obs_set = discretization.get_output_observed_set()

# Compute multivariate normal
cluster_weights = []
num_obs = obs_set.check_num()
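
With this change generate_output_kdes no longer hands back the sample sets, only the KDEs and the cluster count, and callers read the predicted and observed sets off the discretization itself. A minimal calling sketch under the new signature, assuming disc is an already-populated bet.sample.discretization with an observed output set attached:

from bet.calculateP.calculateR import generate_output_kdes

# New three-value return; the sample sets stay on the discretization.
predict_kdes, obs_kdes, num_clusters = generate_output_kdes(disc, bw_method=None)

# Retrieve the sets through the discretization accessors instead.
predict_set = disc.get_output_sample_set()
obs_set = disc.get_output_observed_set()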
8 changes: 4 additions & 4 deletions bet/postProcess/compareP.py
@@ -132,9 +132,9 @@ def evaluate_pdfs(self):
sup2 = np.equal(self.pdfs2, 0.0)
self.pdfs_zero = np.sum(np.logical_and(sup1, sup2))

def distance(self, functional='tv', normalize=True, **kwargs):
def distance(self, functional='tv', normalize=False, **kwargs):
"""
Compute the discrete statistical distance between the probability measures
Compute the statistical distance between the probability measures
evaluated at the comparison points.

:param functional: functional defining type of statistical distance
@@ -183,10 +183,10 @@ def distance(self, functional='tv', normalize=True, **kwargs):
dist = functional(self.pdfs1, self.pdfs2, **kwargs)
return dist

def distance_marginal(self, i, interval=None, num_points=1000, compare_factor=0.0, normalize=True,
def distance_marginal(self, i, interval=None, num_points=1000, compare_factor=0.0, normalize=False,
functional='tv', **kwargs):
"""
Compute the discrete statistical distance between the marginals of the probability measures
Compute the statistical distance between the marginals of the probability measures
evaluated at equally spaced points on an interval. If the interval is not defined,
one is computed by the maximum and minimum values. This domain is extended by the proportion
set by `compare_factor`.
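
Note that the default for normalize flips from True to False in both distance and distance_marginal, so callers that relied on normalized densities must now opt in explicitly. A hedged sketch, assuming comp is an instance of the comparison class that compareP constructs (not shown in this diff):

# New default: densities are compared without normalization.
d_tv = comp.distance(functional='tv')

# Pass normalize=True explicitly to recover the previous behavior.
d_tv_old = comp.distance(functional='tv', normalize=True)
d_marg_old = comp.distance_marginal(0, functional='tv', normalize=True)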
34 changes: 32 additions & 2 deletions bet/sample.py
@@ -2238,12 +2238,13 @@ class discretization(object):
#: :class:`sample.sample_set_base`
sample_set_names = ['_input_sample_set', '_output_sample_set',
'_emulated_input_sample_set', '_emulated_output_sample_set',
'_output_probability_set']
'_output_probability_set', '_output_observed_set']

def __init__(self, input_sample_set, output_sample_set,
output_probability_set=None,
emulated_input_sample_set=None,
emulated_output_sample_set=None):
emulated_output_sample_set=None,
output_observed_set=None):
"""
Initialize the discretization.

@@ -2257,6 +2258,8 @@ def __init__(self, input_sample_set, output_sample_set,
:type emulated_input_sample_set: :class:`bet.sample.sample_set_base`
:param emulated_output_sample_set: Emulated output set
:type emulated_output_sample_set: :class:`bet.sample.sample_set_base`
:param output_observed_set: Observed output set
:type output_observed_set: :class:`bet.sample.sample_set_base`

"""
#: Input sample set :class:`~bet.sample.sample_set_base`
@@ -2271,6 +2274,8 @@
self._output_probability_set = output_probability_set
#: Pointer from ``self._output_sample_set`` to
#: ``self._output_probability_set``
#: Observed output sample set :class:`~bet.sample.sample_set_base`
self._output_observed_set = output_observed_set
self._io_ptr = None
#: Pointer from ``self._emulated_input_sample_set`` to
#: ``self._input_sample_set``
@@ -2533,6 +2538,31 @@ def set_output_sample_set(self, output_sample_set):
else:
raise AttributeError("Wrong Type: Should be sample_set_base type")

def get_output_observed_set(self):
"""

Returns a reference to the output observed sample set for this discretization.

:rtype: :class:`~bet.sample.sample_set_base`
:returns: output sample set

"""
return self._output_observed_set

def set_output_observed_set(self, output_sample_set):
"""

Sets the output observed sample set for this discretization.

:param output_sample_set: output observed sample set.
:type output_sample_set: :class:`~bet.sample.sample_set_base`

"""
if isinstance(output_sample_set, sample_set_base):
self._output_observed_set = output_sample_set
else:
raise AttributeError("Wrong Type: Should be sample_set_base type")

def get_output_probability_set(self):
"""

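
The new observed-output slot follows the same pattern as the other sample sets: it can be passed to the constructor or attached later through the new accessors. A minimal usage sketch, assuming the usual sample_set(dim)/set_values construction from bet.sample; the values below are placeholders:

import numpy as np
import bet.sample as sample

# Placeholder sets purely for illustration.
input_set = sample.sample_set(2)
input_set.set_values(np.random.rand(10, 2))
predict_output = sample.sample_set(1)
predict_output.set_values(np.random.rand(10, 1))
obs_output = sample.sample_set(1)
obs_output.set_values(np.random.rand(50, 1))

# The observed set can be supplied at construction time ...
disc = sample.discretization(input_sample_set=input_set,
                             output_sample_set=predict_output,
                             output_observed_set=obs_output)

# ... or set and read back through the new accessors.
disc.set_output_observed_set(obs_output)
obs_set = disc.get_output_observed_set()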
2 changes: 1 addition & 1 deletion bet/sampling/useLUQ.py
@@ -143,7 +143,7 @@ def make_disc(self):
# Prediction discretization
disc1 = sample.discretization(input_sample_set=self.predict_set,
output_sample_set=predict_output,
output_probability_set=obs_output)
output_observed_set=obs_output)

# Observation discretization
disc2 = sample.discretization(input_sample_set=self.obs_set,
8 changes: 2 additions & 6 deletions doc/bet.calculateP.rst
@@ -9,7 +9,6 @@ bet.calculateP.calculateError module

.. automodule:: bet.calculateP.calculateError
:members:
:special-members:
:undoc-members:
:show-inheritance:

@@ -18,16 +17,14 @@ bet.calculateP.calculateP module

.. automodule:: bet.calculateP.calculateP
:members:
:special-members:
:undoc-members:
:show-inheritance:

bet.calculateP.dataConsistent module
bet.calculateP.calculateR module
------------------------------------

.. automodule:: bet.calculateP.dataConsistent
.. automodule:: bet.calculateP.calculateR
:members:
:special-members:
:undoc-members:
:show-inheritance:

@@ -36,7 +33,6 @@ bet.calculateP.simpleFunP module

.. automodule:: bet.calculateP.simpleFunP
:members:
:special-members:
:undoc-members:
:show-inheritance:

5 changes: 0 additions & 5 deletions doc/bet.postProcess.rst
@@ -9,7 +9,6 @@ bet.postProcess.compareP module

.. automodule:: bet.postProcess.compareP
:members:
:special-members:
:undoc-members:
:show-inheritance:

@@ -18,7 +17,6 @@ bet.postProcess.plotDomains module

.. automodule:: bet.postProcess.plotDomains
:members:
:special-members:
:undoc-members:
:show-inheritance:

@@ -27,7 +25,6 @@ bet.postProcess.plotP module

.. automodule:: bet.postProcess.plotP
:members:
:special-members:
:undoc-members:
:show-inheritance:

@@ -36,7 +33,6 @@ bet.postProcess.plotVoronoi module

.. automodule:: bet.postProcess.plotVoronoi
:members:
:special-members:
:undoc-members:
:show-inheritance:

@@ -45,7 +41,6 @@ bet.postProcess.postTools module

.. automodule:: bet.postProcess.postTools
:members:
:special-members:
:undoc-members:
:show-inheritance:

4 changes: 0 additions & 4 deletions doc/bet.rst
@@ -19,7 +19,6 @@ bet.Comm module

.. automodule:: bet.Comm
:members:
:special-members:
:undoc-members:
:show-inheritance:

@@ -28,7 +27,6 @@ bet.sample module

.. automodule:: bet.sample
:members:
:special-members:
:undoc-members:
:show-inheritance:

@@ -37,7 +35,6 @@ bet.surrogates module

.. automodule:: bet.surrogates
:members:
:special-members:
:undoc-members:
:show-inheritance:

@@ -46,7 +43,6 @@ bet.util module

.. automodule:: bet.util
:members:
:special-members:
:undoc-members:
:show-inheritance:

3 changes: 0 additions & 3 deletions doc/bet.sampling.rst
@@ -9,7 +9,6 @@ bet.sampling.LpGeneralizedSamples module

.. automodule:: bet.sampling.LpGeneralizedSamples
:members:
:special-members:
:undoc-members:
:show-inheritance:

@@ -18,7 +17,6 @@ bet.sampling.basicSampling module

.. automodule:: bet.sampling.basicSampling
:members:
:special-members:
:undoc-members:
:show-inheritance:

@@ -27,7 +25,6 @@ bet.sampling.useLUQ module

.. automodule:: bet.sampling.useLUQ
:members:
:special-members:
:undoc-members:
:show-inheritance:

2 changes: 0 additions & 2 deletions doc/bet.sensitivity.rst
@@ -9,7 +9,6 @@ bet.sensitivity.chooseQoIs module

.. automodule:: bet.sensitivity.chooseQoIs
:members:
:special-members:
:undoc-members:
:show-inheritance:

@@ -18,7 +17,6 @@ bet.sensitivity.gradients module

.. automodule:: bet.sensitivity.gradients
:members:
:special-members:
:undoc-members:
:show-inheritance:

2 changes: 1 addition & 1 deletion doc/overview.rst
@@ -152,7 +152,7 @@ The package layout is as follows::
calculateP
calculateError
simpleFunP
dataConsistent
calculateR
sampling/
basicSampling
useLUQ