From 4fcbfde8d5ebfba587d40f7caf7299b847bf3742 Mon Sep 17 00:00:00 2001 From: RosalynLP Date: Thu, 25 Feb 2021 10:28:00 +0000 Subject: [PATCH 01/13] update theorycov docs with new specifying of point prescriptions --- doc/sphinx/source/vp/theorycov/examples.rst | 15 ++++++++++----- .../source/vp/theorycov/point_prescrip.rst | 18 +++++++++++++----- doc/sphinx/source/vp/theorycov/summary.rst | 4 +++- 3 files changed, 26 insertions(+), 11 deletions(-) diff --git a/doc/sphinx/source/vp/theorycov/examples.rst b/doc/sphinx/source/vp/theorycov/examples.rst index 2c36055ad6..34319eea66 100644 --- a/doc/sphinx/source/vp/theorycov/examples.rst +++ b/doc/sphinx/source/vp/theorycov/examples.rst @@ -14,6 +14,10 @@ You need to provide the central theory under the ``default_theory`` flag, corresponding to :math:`(\mu_F, \mu_R) = (0,0)`, which for NLO is theory 163. +You need to provide the required point prescription using the flag in +:ref:`this section `, e.g. ``point_prescription: "3 point"`` +in the case below. + ``dataspecs`` associates a chosen label (``speclabel``) with each of the theory choices. This details what scale variation the theory corresponds to. @@ -22,11 +26,12 @@ Here the cuts and PDF are taken from the central NLO scale-varied fit. You must also list all the experiments you wish to include, along with any relevant c-factors. -*IMPORTANT*: In order to ensure backwards compatibility now that the structure -of data in runcards has been updated and ``experiments`` is deprecated, you must -also include ``metadata_group: nnpdf31_process`` in the runcards, so that the -scale variation prescriptions are done by process rather than by experiment. See -:ref:`backwards-compatibility` for more details. +.. warning:: + In order to ensure backwards compatibility now that the structure + of data in runcards has been updated and ``experiments`` is deprecated, you must + also include ``metadata_group: nnpdf31_process`` in the runcards, so that the + scale variation prescriptions are done by process rather than by experiment. See + :ref:`backwards-compatibility` for more details. .. code-block:: yaml :linenos: diff --git a/doc/sphinx/source/vp/theorycov/point_prescrip.rst b/doc/sphinx/source/vp/theorycov/point_prescrip.rst index bf994251e8..1133c23a74 100644 --- a/doc/sphinx/source/vp/theorycov/point_prescrip.rst +++ b/doc/sphinx/source/vp/theorycov/point_prescrip.rst @@ -1,3 +1,4 @@ +.. _pointprescrips: Point prescriptions for theory covariance matrices ================================================== @@ -6,7 +7,8 @@ appear in ``validphys2``. 3 points -------- - +|``theoryids``: 163, 180, 173 +|``point_prescription: '3 point'`` .. math:: s_{11} = \frac{1}{2}\bigg\{ \Delta_1(+,+)^2 + \Delta_1(-,-)^2 \bigg\} .. math:: s_{12} = \frac{1}{4}\bigg\{\bigg(\Delta_1(+,+) + \Delta_1(-,-) \bigg) \bigg(\Delta_2(+,+) + \Delta_2(-,-) \bigg) \bigg\} @@ -14,7 +16,8 @@ appear in ``validphys2``. 5 points --------- - +|``theoryids``: 163, 177, 176, 179, 174 +|``point_prescription: '5 point'`` .. math:: s_{11} = \frac{1}{2}\bigg\{ \Delta_1(+,0)^2 + \Delta_1(-,0)^2 + \Delta_1(0,+)^2 + \Delta_1(0,-)^2 \bigg\} .. math:: @@ -26,7 +29,8 @@ appear in ``validphys2``. :math:`\mathbf{\overline{5}}` points ------------------------------------ - +|``theoryids:`` 163, 180, 173, 175, 178 +|``point_prescription: '5bar point'`` .. math:: s_{11} = \frac{1}{2}\bigg\{ \Delta_1(+,+)^2 + \Delta_1(-,-)^2 + \Delta_1(+,-)^2 + \Delta_1(-,+)^2 \bigg\} .. math:: @@ -38,9 +42,9 @@ appear in ``validphys2``. 
7 points - original
-------------------
-
+| ``theoryids:`` 163, 177, 176, 179, 174, 180, 173
 | Specify in the runcard ``seventheories: original``
-
+|``point_prescription: '7 point'``
   .. math::

     \begin{split}
     s_{11} = \frac{1}{3}\bigg\{ &\Delta_1(+,0)^2 + \Delta_1(-,0)^2 + \Delta_1(0,+)^2 + \Delta_1(0,-)^2 \\
   + &\Delta_1(+,+)^2 + \Delta_1(-,-)^2 \bigg\}
     \end{split}

 .. math::
@@ -57,6 +61,8 @@ appear in ``validphys2``.

 7 points - Gavin (default)
 --------------------------
+``theoryids:`` 163, 177, 176, 179, 174, 180, 173
+|``point_prescription: '7 point'``

 .. math::
@@ -76,6 +82,8 @@ appear in ``validphys2``.

 9 points
 --------
+``theoryids:`` 163, 177, 176, 179, 174, 180, 173, 175, 178
+|``point_prescription: '9 point'``

 .. math::

diff --git a/doc/sphinx/source/vp/theorycov/summary.rst b/doc/sphinx/source/vp/theorycov/summary.rst
index 2df834ef25..e77e5abcad 100644
--- a/doc/sphinx/source/vp/theorycov/summary.rst
+++ b/doc/sphinx/source/vp/theorycov/summary.rst
@@ -41,7 +41,9 @@ See the `short
 #. ``scalevariationtheoryids.yaml``: correspondence between each scale combination and a theoryid for a given central theoryid

-- The prescription must be one of 3 point, 5 point, 7 point or 9 point.
+- The prescription must be one of 3 point, 5 point, 5bar point, 7 point or 9 point. You can specify
+  this using ``point_prescription: "x point"`` in the runcard. The translation of this flag
+  into the relevant ``theoryids`` is handled by the ``scalevariations`` module in ``validphys``.

 - In the case of 5 theories, you must further specify whether the 5 or :math:`\bar{5}` prescription is required. You can do this by

From 5545a19e9724eca562932777fc348c0dc2f18cd2 Mon Sep 17 00:00:00 2001
From: RosalynLP
Date: Thu, 25 Feb 2021 10:43:11 +0000
Subject: [PATCH 02/13] add note/warning boxes

---
 .../source/vp/theorycov/point_prescrip.rst | 50 +++++++++++++------
 1 file changed, 36 insertions(+), 14 deletions(-)

diff --git a/doc/sphinx/source/vp/theorycov/point_prescrip.rst b/doc/sphinx/source/vp/theorycov/point_prescrip.rst
index 1133c23a74..918bdad9a5 100644
--- a/doc/sphinx/source/vp/theorycov/point_prescrip.rst
+++ b/doc/sphinx/source/vp/theorycov/point_prescrip.rst
@@ -1,3 +1,7 @@
+.. |br| raw:: html
+
+
+ .. _pointprescrips: Point prescriptions for theory covariance matrices ================================================== @@ -7,8 +11,11 @@ appear in ``validphys2``. 3 points -------- -|``theoryids``: 163, 180, 173 -|``point_prescription: '3 point'`` +.. note:: + + ``theoryids``: 163, 180, 173 |br| + ``point_prescription: '3 point'`` + .. math:: s_{11} = \frac{1}{2}\bigg\{ \Delta_1(+,+)^2 + \Delta_1(-,-)^2 \bigg\} .. math:: s_{12} = \frac{1}{4}\bigg\{\bigg(\Delta_1(+,+) + \Delta_1(-,-) \bigg) \bigg(\Delta_2(+,+) + \Delta_2(-,-) \bigg) \bigg\} @@ -16,8 +23,10 @@ appear in ``validphys2``. 5 points --------- -|``theoryids``: 163, 177, 176, 179, 174 -|``point_prescription: '5 point'`` +.. note:: + + ``theoryids``: 163, 177, 176, 179, 174 |br| + ``point_prescription: '5 point'`` .. math:: s_{11} = \frac{1}{2}\bigg\{ \Delta_1(+,0)^2 + \Delta_1(-,0)^2 + \Delta_1(0,+)^2 + \Delta_1(0,-)^2 \bigg\} .. math:: @@ -29,8 +38,11 @@ appear in ``validphys2``. :math:`\mathbf{\overline{5}}` points ------------------------------------ -|``theoryids:`` 163, 180, 173, 175, 178 -|``point_prescription: '5bar point'`` +.. note:: + + ``theoryids:`` 163, 180, 173, 175, 178 |br| + ``point_prescription: '5bar point'`` + .. math:: s_{11} = \frac{1}{2}\bigg\{ \Delta_1(+,+)^2 + \Delta_1(-,-)^2 + \Delta_1(+,-)^2 + \Delta_1(-,+)^2 \bigg\} .. math:: @@ -42,10 +54,15 @@ appear in ``validphys2``. 7 points - original ------------------- -| ``theoryids:`` 163, 177, 176, 179, 174, 180, 173 -| Specify in the runcard ``seventheories: original`` -|``point_prescription: '7 point'`` - .. math:: + +.. warning:: + + **Deprecated prescription!** |br| + ``theoryids:`` 163, 177, 176, 179, 174, 180, 173 |br| + Specify in the runcard ``seventheories: original`` |br| + ``point_prescription: '7 point'`` + +.. math:: \begin{split} s_{11} = \frac{1}{3}\bigg\{ &\Delta_1(+,0)^2 + \Delta_1(-,0)^2 + \Delta_1(0,+)^2 + \Delta_1(0,-)^2 \\ + &\Delta_1(+,+)^2 + \Delta_1(-,-)^2 \bigg\} @@ -61,8 +78,10 @@ appear in ``validphys2``. 7 points - Gavin (default) -------------------------- -``theoryids:`` 163, 177, 176, 179, 174, 180, 173 -|``point_prescription: '7 point'`` +.. note:: + + ``theoryids:`` 163, 177, 176, 179, 174, 180, 173 |br| + ``point_prescription: '7 point'`` .. math:: @@ -82,8 +101,11 @@ appear in ``validphys2``. 9 points -------- -``theoryids:`` 163, 177, 176, 179, 174, 180, 173, 175, 178 -|``point_prescription: '9 point'`` + +.. note:: + + ``theoryids:`` 163, 177, 176, 179, 174, 180, 173, 175, 178 |br| + ``point_prescription: '9 point'`` .. 
math:: From 44c2aaa6bd1280ad41e4bc5e441fd9ace0742eb3 Mon Sep 17 00:00:00 2001 From: RosalynLP Date: Thu, 25 Feb 2021 10:43:28 +0000 Subject: [PATCH 03/13] start th cov tutorial --- doc/sphinx/source/tutorials/index.rst | 1 + 1 file changed, 1 insertion(+) diff --git a/doc/sphinx/source/tutorials/index.rst b/doc/sphinx/source/tutorials/index.rst index dba5946283..e71b6ec3c0 100644 --- a/doc/sphinx/source/tutorials/index.rst +++ b/doc/sphinx/source/tutorials/index.rst @@ -7,6 +7,7 @@ Tutorials ./run-fit.md ./run-legacy-fit.rst ./run-iterated-fit.rst + ./thcov_tutorial.rst ./compare-fits.md ./list-resources.md ./report.md From 212ee17369a259ccdbd09246e5268c514a427c41 Mon Sep 17 00:00:00 2001 From: RosalynLP Date: Thu, 25 Feb 2021 11:37:40 +0000 Subject: [PATCH 04/13] add thcovmat fit runcard example --- .../source/tutorials/thcov_tutorial.rst | 250 ++++++++++++++++++ .../theory_covariance/fit_with_thcovmat.yaml | 180 +++++++++++++ 2 files changed, 430 insertions(+) create mode 100644 doc/sphinx/source/tutorials/thcov_tutorial.rst create mode 100644 validphys2/examples/theory_covariance/fit_with_thcovmat.yaml diff --git a/doc/sphinx/source/tutorials/thcov_tutorial.rst b/doc/sphinx/source/tutorials/thcov_tutorial.rst new file mode 100644 index 0000000000..45a53fbac4 --- /dev/null +++ b/doc/sphinx/source/tutorials/thcov_tutorial.rst @@ -0,0 +1,250 @@ +How to include a theory covariance matrix in a fit +================================================== +:Author: Contact Rosalyn (r.l.pearson@ed.ac.uk) for further information. + +This section details how to include scale variation covariance matrices (covmats) +in a PDF fit. At the present time this can only be done at NLO, for which the +central theory is theory 163. + +First, decide which theory covmat you want +------------------------------------------ +- Choose the desired point-prescription listed :ref:`here `. +- Each prescription comes with a ``point_prescription`` flag to include in + the runcard, one of ["3 point", "5 point", "5bar point", "7 point", "9 point"] + +Next, add necessary flags to the runcard +---------------------------------------- +- Remember to list the required datasets using ``dataset_inputs`` (see :ref:`data_specification`). +- Specify ``metadata_group: nnpdf31_process`` in the runcard. This means that + the grouping will be done according to process type specified in the ``plotting`` + files. +.. warning:: + If ``metadata_group`` is not set to ``nnpdf31_process`` it will default to + ``experiment`` and renormalisation scale variation correlations will be + between experiments rather than processes. See :ref:`backwards-compatibility` + for details. +- Add ``theorycovmatconfig`` to the runcard. An example is in the following code snippet: + +.. code:: yaml + + ############################################################ + theory: + theoryid: 163 # database id + + theorycovmatconfig: + point_prescription: "3 point" + theoryids: + from_: scale_variation_theories + pdf: NNPDF31_nlo_as_0118 + use_thcovmat_in_fitting: true + use_thcovmat_in_sampling: true + + sampling_t0: + use_t0: false + + fitting_t0: + use_t0: true + + ############################################################ + +- ``pdf`` is the PDF used to generate the scale varied predictions which + construct the theory covmat. Choose something close to the PDF you are + trying to fit, such as a previous iteration if available. +- ``theoryids`` are necessary for the construction of the theory covmat. 
+ To avoid user error in entering them in the correct configuration and order, + this is handled by the ``produce_scale_variation_theories`` action in + `config `_, + using the information in + `the scalevariations module `_. +- The flags ``use_thcovmat_in_fitting`` and ``use_thcovmat_in_sampling`` specify + where to use the theory covmat in the code. There are two possible places: + the fitting (i.e. \\(\\chi^2\\) minimiser) and the sampling (i.e. pseudodata + generation). The default is ``True`` for both. + +Example runcard +--------------- +The following is an example runcard for an NLO NNPDF3.1 style fit with a 3 point theory covmat. +It can be found `here `_. + +.. code:: yaml + + # + # Configuration file for NNPDF++ + # + ########################################################################################## + description: Example runcard for NLO NNPDF3.1 style fit with 3pt theory covariance matrix + + ########################################################################################## + # frac: training fraction + # ewk: apply ewk k-factors + # sys: systematics treatment (see systypes) + dataset_inputs: + - {dataset: NMCPD, frac: 0.5} + - {dataset: NMC, frac: 0.5} + - {dataset: SLACP, frac: 0.5} + - {dataset: SLACD, frac: 0.5} + - {dataset: BCDMSP, frac: 0.5} + - {dataset: BCDMSD, frac: 0.5} + - {dataset: CHORUSNU, frac: 0.5} + - {dataset: CHORUSNB, frac: 0.5} + - {dataset: NTVNUDMN, frac: 0.5} + - {dataset: NTVNBDMN, frac: 0.5} + - {dataset: HERACOMBNCEM, frac: 0.5} + - {dataset: HERACOMBNCEP460, frac: 0.5} + - {dataset: HERACOMBNCEP575, frac: 0.5} + - {dataset: HERACOMBNCEP820, frac: 0.5} + - {dataset: HERACOMBNCEP920, frac: 0.5} + - {dataset: HERACOMBCCEM, frac: 0.5} + - {dataset: HERACOMBCCEP, frac: 0.5} + - {dataset: HERAF2CHARM, frac: 0.5} + - {dataset: CDFZRAP, frac: 1.0} + - {dataset: D0ZRAP, frac: 1.0} + - {dataset: D0WEASY, frac: 1.0} + - {dataset: D0WMASY, frac: 1.0} + - {dataset: ATLASWZRAP36PB, frac: 1.0} + - {dataset: ATLASZHIGHMASS49FB, frac: 1.0} + - {dataset: ATLASLOMASSDY11EXT, frac: 1.0} + - {dataset: ATLASWZRAP11, frac: 0.5} + - {dataset: ATLAS1JET11, frac: 0.5} + - {dataset: ATLASZPT8TEVMDIST, frac: 0.5} + - {dataset: ATLASZPT8TEVYDIST, frac: 0.5} + - {dataset: ATLASTTBARTOT, frac: 1.0} + - {dataset: ATLASTOPDIFF8TEVTRAPNORM, frac: 1.0} + - {dataset: CMSWEASY840PB, frac: 1.0} + - {dataset: CMSWMASY47FB, frac: 1.0} + - {dataset: CMSDY2D11, frac: 0.5} + - {dataset: CMSWMU8TEV, frac: 1.0} + - {dataset: CMSZDIFF12, frac: 1.0, cfac: [NRM]} + - {dataset: CMSJETS11, frac: 0.5} + - {dataset: CMSTTBARTOT, frac: 1.0} + - {dataset: CMSTOPDIFF8TEVTTRAPNORM, frac: 1.0} + - {dataset: LHCBZ940PB, frac: 1.0} + - {dataset: LHCBZEE2FB, frac: 1.0} + - {dataset: LHCBWZMU7TEV, frac: 1.0, cfac: [NRM]} + - {dataset: LHCBWZMU8TEV, frac: 1.0, cfac: [NRM]} + + ############################################################ + datacuts: + t0pdfset: 190310-tg-nlo-global # PDF set to generate t0 covmat + q2min: 13.96 # Q2 minimum + w2min: 12.5 # W2 minimum + combocuts: NNPDF31 # NNPDF3.0 final kin. cuts + jetptcut_tev: 0 # jet pt cut for tevatron + jetptcut_lhc: 0 # jet pt cut for lhc + wptcut_lhc: 30.0 # Minimum pT for W pT diff distributions + jetycut_tev: 1e30 # jet rap. cut for tevatron + jetycut_lhc: 1e30 # jet rap. cut for lhc + dymasscut_min: 0 # dy inv.mass. min cut + dymasscut_max: 1e30 # dy inv.mass. max cut + jetcfactcut: 1e30 # jet cfact. 
cut + use_cuts: fromintersection + cuts_intersection_spec: + - theoryid: 163 + - theoryid: 53 + + ############################################################ + theory: + theoryid: 163 # database id + + theorycovmatconfig: + point_prescription: "3 point" + theoryids: + from_: scale_variation_theories + fivetheories: None + pdf: NNPDF31_nlo_as_0118 + use_thcovmat_in_fitting: true + use_thcovmat_in_sampling: true + + sampling_t0: + use_t0: false + + fitting_t0: + use_t0: true + + ############################################################ + fitting: + seed: 65532133530 # set the seed for the random generator + genrep: on # on = generate MC replicas, off = use real data + rngalgo: 0 # 0 = ranlux, 1 = cmrg, see randomgenerator.cc + fitmethod: NGA # Minimization algorithm + ngen: 30000 # Maximum number of generations + nmutants: 80 # Number of mutants for replica + paramtype: NN + nnodes: [2, 5, 3, 1] + + # NN23(QED) = sng=0,g=1,v=2,t3=3,ds=4,sp=5,sm=6,(pht=7) + # EVOL(QED) = sng=0,g=1,v=2,v3=3,v8=4,t3=5,t8=6,(pht=7) + # EVOLS(QED)= sng=0,g=1,v=2,v8=4,t3=4,t8=5,ds=6,(pht=7) + # FLVR(QED) = g=0, u=1, ubar=2, d=3, dbar=4, s=5, sbar=6, (pht=7) + fitbasis: NN31IC # EVOL (7), EVOLQED (8), etc. + basis: + # remeber to change the name of PDF accordingly with fitbasis + # pos: on for NN squared + # mutsize: mutation size + # mutprob: mutation probability + # smallx, largex: preprocessing ranges + - {fl: sng, pos: off, mutsize: [15], mutprob: [0.05], smallx: [1.046, 1.188], largex: [ + 1.437, 2.716]} + - {fl: g, pos: off, mutsize: [15], mutprob: [0.05], smallx: [0.9604, 1.23], largex: [ + 0.08459, 6.137]} + - {fl: v, pos: off, mutsize: [15], mutprob: [0.05], smallx: [0.5656, 0.7242], largex: [ + 1.153, 2.838]} + - {fl: v3, pos: off, mutsize: [15], mutprob: [0.05], smallx: [0.1521, 0.5611], largex: [ + 1.236, 2.976]} + - {fl: v8, pos: off, mutsize: [15], mutprob: [0.05], smallx: [0.5264, 0.7246], largex: [ + 0.6919, 3.198]} + - {fl: t3, pos: off, mutsize: [15], mutprob: [0.05], smallx: [-0.3687, 1.459], largex: [ + 1.664, 3.373]} + - {fl: t8, pos: off, mutsize: [15], mutprob: [0.05], smallx: [0.5357, 1.267], largex: [ + 1.433, 2.866]} + - {fl: cp, pos: off, mutsize: [15], mutprob: [0.05], smallx: [-0.09635, 1.204], + largex: [1.654, 7.456]} + + ############################################################ + stopping: + stopmethod: LOOKBACK # Stopping method + lbdelta: 0 # Delta for look-back stopping + mingen: 0 # Minimum number of generations + window: 500 # Window for moving average + minchi2: 3.5 # Minimum chi2 + minchi2exp: 6.0 # Minimum chi2 for experiments + nsmear: 200 # Smear for stopping + deltasm: 200 # Delta smear for stopping + rv: 2 # Ratio for validation stopping + rt: 0.5 # Ratio for training stopping + epsilon: 1e-6 # Gradient epsilon + + ############################################################ + positivity: + posdatasets: + - {dataset: POSF2U, poslambda: 1e6} # Positivity Lagrange Multiplier + - {dataset: POSF2DW, poslambda: 1e6} + - {dataset: POSF2S, poslambda: 1e6} + - {dataset: POSFLL, poslambda: 1e6} + - {dataset: POSDYU, poslambda: 1e10} + - {dataset: POSDYD, poslambda: 1e10} + - {dataset: POSDYS, poslambda: 1e10} + + ############################################################ + closuretest: + filterseed: 0 # Random seed to be used in filtering data partitions + fakedata: off # on = to use FAKEPDF to generate pseudo-data + fakepdf: MSTW2008nlo68cl # Theory input for pseudo-data + errorsize: 1.0 # uncertainties rescaling + fakenoise: off # on = to add random fluctuations to 
pseudo-data + rancutprob: 1.0 # Fraction of data to be included in the fit + rancutmethod: 0 # Method to select rancutprob data fraction + rancuttrnval: off # 0(1) to output training(valiation) chi2 in report + printpdf4gen: off # To print info on PDFs during minimization + + ############################################################ + lhagrid: + nx: 150 + xmin: 1e-9 + xmed: 0.1 + xmax: 1.0 + nq: 50 + qmax: 1e5 + + ############################################################ + debug: off diff --git a/validphys2/examples/theory_covariance/fit_with_thcovmat.yaml b/validphys2/examples/theory_covariance/fit_with_thcovmat.yaml new file mode 100644 index 0000000000..ae4324a48e --- /dev/null +++ b/validphys2/examples/theory_covariance/fit_with_thcovmat.yaml @@ -0,0 +1,180 @@ +# +# Configuration file for NNPDF++ +# +###################################################################################### +description: Example runcard for NNPDF3.1 style fit with 3pt theory covariance matrix + +###################################################################################### +# frac: training fraction +# ewk: apply ewk k-factors +# sys: systematics treatment (see systypes) +dataset_inputs: + - {dataset: NMCPD, frac: 0.5} + - {dataset: NMC, frac: 0.5} + - {dataset: SLACP, frac: 0.5} + - {dataset: SLACD, frac: 0.5} + - {dataset: BCDMSP, frac: 0.5} + - {dataset: BCDMSD, frac: 0.5} + - {dataset: CHORUSNU, frac: 0.5} + - {dataset: CHORUSNB, frac: 0.5} + - {dataset: NTVNUDMN, frac: 0.5} + - {dataset: NTVNBDMN, frac: 0.5} + - {dataset: HERACOMBNCEM, frac: 0.5} + - {dataset: HERACOMBNCEP460, frac: 0.5} + - {dataset: HERACOMBNCEP575, frac: 0.5} + - {dataset: HERACOMBNCEP820, frac: 0.5} + - {dataset: HERACOMBNCEP920, frac: 0.5} + - {dataset: HERACOMBCCEM, frac: 0.5} + - {dataset: HERACOMBCCEP, frac: 0.5} + - {dataset: HERAF2CHARM, frac: 0.5} + - {dataset: CDFZRAP, frac: 1.0} + - {dataset: D0ZRAP, frac: 1.0} + - {dataset: D0WEASY, frac: 1.0} + - {dataset: D0WMASY, frac: 1.0} + - {dataset: ATLASWZRAP36PB, frac: 1.0} + - {dataset: ATLASZHIGHMASS49FB, frac: 1.0} + - {dataset: ATLASLOMASSDY11EXT, frac: 1.0} + - {dataset: ATLASWZRAP11, frac: 0.5} + - {dataset: ATLAS1JET11, frac: 0.5} + - {dataset: ATLASZPT8TEVMDIST, frac: 0.5} + - {dataset: ATLASZPT8TEVYDIST, frac: 0.5} + - {dataset: ATLASTTBARTOT, frac: 1.0} + - {dataset: ATLASTOPDIFF8TEVTRAPNORM, frac: 1.0} + - {dataset: CMSWEASY840PB, frac: 1.0} + - {dataset: CMSWMASY47FB, frac: 1.0} + - {dataset: CMSDY2D11, frac: 0.5} + - {dataset: CMSWMU8TEV, frac: 1.0} + - {dataset: CMSZDIFF12, frac: 1.0, cfac: [NRM]} + - {dataset: CMSJETS11, frac: 0.5} + - {dataset: CMSTTBARTOT, frac: 1.0} + - {dataset: CMSTOPDIFF8TEVTTRAPNORM, frac: 1.0} + - {dataset: LHCBZ940PB, frac: 1.0} + - {dataset: LHCBZEE2FB, frac: 1.0} + - {dataset: LHCBWZMU7TEV, frac: 1.0, cfac: [NRM]} + - {dataset: LHCBWZMU8TEV, frac: 1.0, cfac: [NRM]} + +############################################################ +datacuts: + t0pdfset: 190310-tg-nlo-global # PDF set to generate t0 covmat + q2min: 13.96 # Q2 minimum + w2min: 12.5 # W2 minimum + combocuts: NNPDF31 # NNPDF3.0 final kin. cuts + jetptcut_tev: 0 # jet pt cut for tevatron + jetptcut_lhc: 0 # jet pt cut for lhc + wptcut_lhc: 30.0 # Minimum pT for W pT diff distributions + jetycut_tev: 1e30 # jet rap. cut for tevatron + jetycut_lhc: 1e30 # jet rap. cut for lhc + dymasscut_min: 0 # dy inv.mass. min cut + dymasscut_max: 1e30 # dy inv.mass. max cut + jetcfactcut: 1e30 # jet cfact. 
cut + use_cuts: fromintersection + cuts_intersection_spec: + - theoryid: 163 + - theoryid: 53 + +############################################################ +theory: + theoryid: 163 # database id + +theorycovmatconfig: + point_prescription: "3 point" + theoryids: + from_: scale_variation_theories + fivetheories: None + pdf: NNPDF31_nlo_as_0118 + use_thcovmat_in_fitting: true + use_thcovmat_in_sampling: true + +sampling_t0: + use_t0: false + +fitting_t0: + use_t0: true + +############################################################ +fitting: + seed: 65532133530 # set the seed for the random generator + genrep: on # on = generate MC replicas, off = use real data + rngalgo: 0 # 0 = ranlux, 1 = cmrg, see randomgenerator.cc + fitmethod: NGA # Minimization algorithm + ngen: 30000 # Maximum number of generations + nmutants: 80 # Number of mutants for replica + paramtype: NN + nnodes: [2, 5, 3, 1] + + # NN23(QED) = sng=0,g=1,v=2,t3=3,ds=4,sp=5,sm=6,(pht=7) + # EVOL(QED) = sng=0,g=1,v=2,v3=3,v8=4,t3=5,t8=6,(pht=7) + # EVOLS(QED)= sng=0,g=1,v=2,v8=4,t3=4,t8=5,ds=6,(pht=7) + # FLVR(QED) = g=0, u=1, ubar=2, d=3, dbar=4, s=5, sbar=6, (pht=7) + fitbasis: NN31IC # EVOL (7), EVOLQED (8), etc. + basis: + # remeber to change the name of PDF accordingly with fitbasis + # pos: on for NN squared + # mutsize: mutation size + # mutprob: mutation probability + # smallx, largex: preprocessing ranges + - {fl: sng, pos: off, mutsize: [15], mutprob: [0.05], smallx: [1.046, 1.188], largex: [ + 1.437, 2.716]} + - {fl: g, pos: off, mutsize: [15], mutprob: [0.05], smallx: [0.9604, 1.23], largex: [ + 0.08459, 6.137]} + - {fl: v, pos: off, mutsize: [15], mutprob: [0.05], smallx: [0.5656, 0.7242], largex: [ + 1.153, 2.838]} + - {fl: v3, pos: off, mutsize: [15], mutprob: [0.05], smallx: [0.1521, 0.5611], largex: [ + 1.236, 2.976]} + - {fl: v8, pos: off, mutsize: [15], mutprob: [0.05], smallx: [0.5264, 0.7246], largex: [ + 0.6919, 3.198]} + - {fl: t3, pos: off, mutsize: [15], mutprob: [0.05], smallx: [-0.3687, 1.459], largex: [ + 1.664, 3.373]} + - {fl: t8, pos: off, mutsize: [15], mutprob: [0.05], smallx: [0.5357, 1.267], largex: [ + 1.433, 2.866]} + - {fl: cp, pos: off, mutsize: [15], mutprob: [0.05], smallx: [-0.09635, 1.204], + largex: [1.654, 7.456]} + +############################################################ +stopping: + stopmethod: LOOKBACK # Stopping method + lbdelta: 0 # Delta for look-back stopping + mingen: 0 # Minimum number of generations + window: 500 # Window for moving average + minchi2: 3.5 # Minimum chi2 + minchi2exp: 6.0 # Minimum chi2 for experiments + nsmear: 200 # Smear for stopping + deltasm: 200 # Delta smear for stopping + rv: 2 # Ratio for validation stopping + rt: 0.5 # Ratio for training stopping + epsilon: 1e-6 # Gradient epsilon + +############################################################ +positivity: + posdatasets: + - {dataset: POSF2U, poslambda: 1e6} # Positivity Lagrange Multiplier + - {dataset: POSF2DW, poslambda: 1e6} + - {dataset: POSF2S, poslambda: 1e6} + - {dataset: POSFLL, poslambda: 1e6} + - {dataset: POSDYU, poslambda: 1e10} + - {dataset: POSDYD, poslambda: 1e10} + - {dataset: POSDYS, poslambda: 1e10} + +############################################################ +closuretest: + filterseed: 0 # Random seed to be used in filtering data partitions + fakedata: off # on = to use FAKEPDF to generate pseudo-data + fakepdf: MSTW2008nlo68cl # Theory input for pseudo-data + errorsize: 1.0 # uncertainties rescaling + fakenoise: off # on = to add random fluctuations to pseudo-data + 
rancutprob: 1.0            # Fraction of data to be included in the fit
+  rancutmethod: 0            # Method to select rancutprob data fraction
+  rancuttrnval: off          # 0(1) to output training(validation) chi2 in report
+  printpdf4gen: off          # To print info on PDFs during minimization
+
+############################################################
+lhagrid:
+  nx: 150
+  xmin: 1e-9
+  xmed: 0.1
+  xmax: 1.0
+  nq: 50
+  qmax: 1e5
+
+############################################################
+debug: off

From aadb5d8777bffebd0acbbae148409ca5575b86d0 Mon Sep 17 00:00:00 2001
From: RosalynLP
Date: Thu, 25 Feb 2021 11:55:05 +0000
Subject: [PATCH 05/13] warning for sampling fitting flags

---
 doc/sphinx/source/tutorials/thcov_tutorial.rst | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/doc/sphinx/source/tutorials/thcov_tutorial.rst b/doc/sphinx/source/tutorials/thcov_tutorial.rst
index 45a53fbac4..7b950d07b9 100644
--- a/doc/sphinx/source/tutorials/thcov_tutorial.rst
+++ b/doc/sphinx/source/tutorials/thcov_tutorial.rst
@@ -60,7 +60,10 @@
   where to use the theory covmat in the code. There are two possible places:
   the fitting (i.e. \\(\\chi^2\\) minimiser) and the sampling (i.e. pseudodata
   generation). The default is ``True`` for both.
-
+.. warning::
+   Changing either of these to ``False`` will affect the fit outcome and should
+   be avoided unless you know what you are doing.
+
 Example runcard
 ---------------

From 180e7215fbaf46897a5c1fecbe4d04ce1d2b98f8 Mon Sep 17 00:00:00 2001
From: Rosalyn Pearson <33020850+RosalynLP@users.noreply.github.com>
Date: Tue, 9 Mar 2021 18:04:13 +0000
Subject: [PATCH 06/13] NLO spell out

Co-authored-by: Cameron Voisey <32741139+voisey@users.noreply.github.com>
---
 doc/sphinx/source/tutorials/thcov_tutorial.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/doc/sphinx/source/tutorials/thcov_tutorial.rst b/doc/sphinx/source/tutorials/thcov_tutorial.rst
index 7b950d07b9..252e33a0e0 100644
--- a/doc/sphinx/source/tutorials/thcov_tutorial.rst
+++ b/doc/sphinx/source/tutorials/thcov_tutorial.rst
@@ -3,7 +3,7 @@ How to include a theory covariance matrix in a fit
 ==================================================
 :Author: Contact Rosalyn (r.l.pearson@ed.ac.uk) for further information.

 This section details how to include scale variation covariance matrices (covmats)
-in a PDF fit. At the present time this can only be done at NLO, for which the
+in a PDF fit. At the present time this can only be done at next-to-leading order (NLO), for which the
 central theory is theory 163.

 First, decide which theory covmat you want

From 2b556ac3870ae04517a45a1ac58149b571fef52b Mon Sep 17 00:00:00 2001
From: Rosalyn Pearson <33020850+RosalynLP@users.noreply.github.com>
Date: Tue, 9 Mar 2021 18:04:36 +0000
Subject: [PATCH 07/13] add hyphen

Co-authored-by: Cameron Voisey <32741139+voisey@users.noreply.github.com>
---
 doc/sphinx/source/tutorials/thcov_tutorial.rst | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/doc/sphinx/source/tutorials/thcov_tutorial.rst b/doc/sphinx/source/tutorials/thcov_tutorial.rst
index 252e33a0e0..2836d18df8 100644
--- a/doc/sphinx/source/tutorials/thcov_tutorial.rst
+++ b/doc/sphinx/source/tutorials/thcov_tutorial.rst
@@ -66,7 +66,7 @@

 Example runcard
 ---------------
-The following is an example runcard for an NLO NNPDF3.1 style fit with a 3 point theory covmat.
+The following is an example runcard for an NLO NNPDF3.1-style fit with a 3 point theory covmat. It can be found `here `_. .. code:: yaml From a1415c54987eba789f4b6b2bff6e4a8d2c0bed27 Mon Sep 17 00:00:00 2001 From: RosalynLP Date: Wed, 10 Mar 2021 10:33:26 +0000 Subject: [PATCH 08/13] review comments --- doc/sphinx/source/theory/theoryparamsinfo.md | 3 +++ .../source/tutorials/thcov_tutorial.rst | 27 ++++++++++++++++--- doc/sphinx/source/vp/theorycov/index.rst | 1 + doc/sphinx/source/vp/theorycov/tests.rst | 2 ++ 4 files changed, 29 insertions(+), 4 deletions(-) diff --git a/doc/sphinx/source/theory/theoryparamsinfo.md b/doc/sphinx/source/theory/theoryparamsinfo.md index 01a933c191..bd2d063826 100644 --- a/doc/sphinx/source/theory/theoryparamsinfo.md +++ b/doc/sphinx/source/theory/theoryparamsinfo.md @@ -1,3 +1,6 @@ +```eval_rst +.. _th_parameter_info: +``` # Looking up the parameters of a theory The parameters for all of the theories can be found in the `theory.db` file, diff --git a/doc/sphinx/source/tutorials/thcov_tutorial.rst b/doc/sphinx/source/tutorials/thcov_tutorial.rst index 2836d18df8..6282c7c267 100644 --- a/doc/sphinx/source/tutorials/thcov_tutorial.rst +++ b/doc/sphinx/source/tutorials/thcov_tutorial.rst @@ -2,9 +2,9 @@ How to include a theory covariance matrix in a fit ================================================== :Author: Contact Rosalyn (r.l.pearson@ed.ac.uk) for further information. -This section details how to include scale variation covariance matrices (covmats) +This section details how to include :ref:`scale variation covariance matrices (covmats) ` in a PDF fit. At the present time this can only be done at next-to-leading order (NLO), for which the -central theory is theory 163. +central theory is :ref:`theory 163 `. First, decide which theory covmat you want ------------------------------------------ @@ -63,11 +63,30 @@ Next, add necessary flags to the runcard .. warning:: Changing either of these to ``False`` will affect the fit outcome and should be avoided unless you know what you are doing. - + +If you want to compare data to another fit +------------------------------------------ +- Sometimes we want to compare data to another fit for validation, for example + we might want to compare predictions for the NLO fit with MHOUs to the known + NNLO fit (see :ref:`vptheorycov-tests`). +- To make sure the cuts match between these two fits, edit the ``datacuts`` + section of the runcard to include the following + +.. code:: yaml + + use_cuts: fromintersection + cuts_intersection_spec: + - theoryid: 163 + - theoryid: 53 + +- This ensures that the cuts on the data are the intersection of the cuts in + theory 53 (default NNLO) and theory 163 (central scale variation NLO). See + :ref:`here ` for theory definitions. + Example runcard --------------- The following is an example runcard for an NLO NNPDF3.1-style fit with a 3 point theory covmat. -It can be found `here `_. +It can be found `here `_. .. code:: yaml diff --git a/doc/sphinx/source/vp/theorycov/index.rst b/doc/sphinx/source/vp/theorycov/index.rst index 8868b47e35..fe2b2326fc 100644 --- a/doc/sphinx/source/vp/theorycov/index.rst +++ b/doc/sphinx/source/vp/theorycov/index.rst @@ -1,4 +1,5 @@ .. 
_vptheorycov-index:
+
 The ``theorycovariance`` module
 =============================

diff --git a/doc/sphinx/source/vp/theorycov/tests.rst b/doc/sphinx/source/vp/theorycov/tests.rst
index 74d66f3aab..92dbffd60e 100644
--- a/doc/sphinx/source/vp/theorycov/tests.rst
+++ b/doc/sphinx/source/vp/theorycov/tests.rst
@@ -1,3 +1,5 @@
+ .. _vptheorycov-tests:
+
 Tests
 =====

From 9d6095a33a7082fa4c3e1d6af92a39c1240c6154 Mon Sep 17 00:00:00 2001
From: RosalynLP
Date: Thu, 11 Mar 2021 10:23:57 +0000
Subject: [PATCH 09/13] remove double reference link

---
 doc/sphinx/source/tutorials/thcov_tutorial.rst    | 2 +-
 doc/sphinx/source/vp/theorycov/point_prescrip.rst | 2 --
 2 files changed, 1 insertion(+), 3 deletions(-)

diff --git a/doc/sphinx/source/tutorials/thcov_tutorial.rst b/doc/sphinx/source/tutorials/thcov_tutorial.rst
index 6282c7c267..862fa90461 100644
--- a/doc/sphinx/source/tutorials/thcov_tutorial.rst
+++ b/doc/sphinx/source/tutorials/thcov_tutorial.rst
@@ -8,7 +8,7 @@ central theory is :ref:`theory 163 `.

 First, decide which theory covmat you want
 ------------------------------------------
-- Choose the desired point-prescription listed :ref:`here `.
+- Choose the desired point-prescription listed :ref:`here `.
 - Each prescription comes with a ``point_prescription`` flag to include in
   the runcard, one of ["3 point", "5 point", "5bar point", "7 point", "9 point"]

diff --git a/doc/sphinx/source/vp/theorycov/point_prescrip.rst b/doc/sphinx/source/vp/theorycov/point_prescrip.rst
index 304fa4a88d..8fd24ef5d5 100644
--- a/doc/sphinx/source/vp/theorycov/point_prescrip.rst
+++ b/doc/sphinx/source/vp/theorycov/point_prescrip.rst
@@ -1,7 +1,5 @@
 .. |br| raw:: html

-.. _pointprescrips:
-
 .. _prescrips:

 Point prescriptions for theory covariance matrices

From 8968c5cb5d3344972a72c5c68e5c76a3a4ab33e4 Mon Sep 17 00:00:00 2001
From: RosalynLP
Date: Thu, 11 Mar 2021 10:26:40 +0000
Subject: [PATCH 10/13] add comment about |br|

---
 doc/sphinx/source/vp/theorycov/point_prescrip.rst | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/doc/sphinx/source/vp/theorycov/point_prescrip.rst b/doc/sphinx/source/vp/theorycov/point_prescrip.rst
index 8fd24ef5d5..05ded1791c 100644
--- a/doc/sphinx/source/vp/theorycov/point_prescrip.rst
+++ b/doc/sphinx/source/vp/theorycov/point_prescrip.rst
@@ -1,4 +1,5 @@
-.. |br| raw:: html
+.. The line below allows you to break to a new line by adding |br| at the end of the line
+.. |br| raw:: html

 .. _prescrips:

 Point prescriptions for theory covariance matrices

From ce4854a3694d94e803799e0363542931db20a9cb Mon Sep 17 00:00:00 2001
From: Cameron Voisey
Date: Mon, 15 Mar 2021 12:35:30 +0000
Subject: [PATCH 11/13] Fix maths rendering and tidy up

---
 .../source/tutorials/thcov_tutorial.rst | 42 +++++++++----------
 1 file changed, 21 insertions(+), 21 deletions(-)

diff --git a/doc/sphinx/source/tutorials/thcov_tutorial.rst b/doc/sphinx/source/tutorials/thcov_tutorial.rst
index 862fa90461..f65210d467 100644
--- a/doc/sphinx/source/tutorials/thcov_tutorial.rst
+++ b/doc/sphinx/source/tutorials/thcov_tutorial.rst
@@ -3,7 +3,7 @@ How to include a theory covariance matrix in a fit
 :Author: Contact Rosalyn (r.l.pearson@ed.ac.uk) for further information.

 This section details how to include :ref:`scale variation covariance matrices (covmats) `
-in a PDF fit. At the present time this can only be done at next-to-leading order (NLO), for which the
+in a PDF fit. At the present time this can only be done at next-to-leading order (NLO), for which the
 central theory is :ref:`theory 163 `.
First, decide which theory covmat you want @@ -15,12 +15,12 @@ First, decide which theory covmat you want Next, add necessary flags to the runcard ---------------------------------------- - Remember to list the required datasets using ``dataset_inputs`` (see :ref:`data_specification`). -- Specify ``metadata_group: nnpdf31_process`` in the runcard. This means that +- Specify ``metadata_group: nnpdf31_process`` in the runcard. This means that the grouping will be done according to process type specified in the ``plotting`` - files. + files. .. warning:: If ``metadata_group`` is not set to ``nnpdf31_process`` it will default to - ``experiment`` and renormalisation scale variation correlations will be + ``experiment`` and renormalisation scale variation correlations will be between experiments rather than processes. See :ref:`backwards-compatibility` for details. - Add ``theorycovmatconfig`` to the runcard. An example is in the following code snippet: @@ -46,30 +46,30 @@ Next, add necessary flags to the runcard use_t0: true ############################################################ - -- ``pdf`` is the PDF used to generate the scale varied predictions which - construct the theory covmat. Choose something close to the PDF you are + +- ``pdf`` is the PDF used to generate the scale varied predictions which + construct the theory covmat. Choose something close to the PDF you are trying to fit, such as a previous iteration if available. - ``theoryids`` are necessary for the construction of the theory covmat. To avoid user error in entering them in the correct configuration and order, - this is handled by the ``produce_scale_variation_theories`` action in - `config `_, - using the information in + this is handled by the ``produce_scale_variation_theories`` action in + `config `_, + using the information in `the scalevariations module `_. - The flags ``use_thcovmat_in_fitting`` and ``use_thcovmat_in_sampling`` specify where to use the theory covmat in the code. There are two possible places: - the fitting (i.e. \\(\\chi^2\\) minimiser) and the sampling (i.e. pseudodata + the fitting (i.e. :math:`\chi^2` minimiser) and the sampling (i.e. pseudodata generation). The default is ``True`` for both. .. warning:: Changing either of these to ``False`` will affect the fit outcome and should be avoided unless you know what you are doing. - + If you want to compare data to another fit ------------------------------------------ - Sometimes we want to compare data to another fit for validation, for example we might want to compare predictions for the NLO fit with MHOUs to the known - NNLO fit (see :ref:`vptheorycov-tests`). -- To make sure the cuts match between these two fits, edit the ``datacuts`` + NNLO fit (see :ref:`vptheorycov-tests`). +- To make sure the cuts match between these two fits, edit the ``datacuts`` section of the runcard to include the following .. code:: yaml @@ -77,19 +77,19 @@ If you want to compare data to another fit use_cuts: fromintersection cuts_intersection_spec: - theoryid: 163 - - theoryid: 53 - -- This ensures that the cuts on the data are the intersection of the cuts in + - theoryid: 53 + +- This ensures that the cuts on the data are the intersection of the cuts in theory 53 (default NNLO) and theory 163 (central scale variation NLO). See - :ref:`here ` for theory definitions. - + :ref:`here ` for theory definitions. + Example runcard --------------- The following is an example runcard for an NLO NNPDF3.1-style fit with a 3 point theory covmat. It can be found `here `_. .. 
code:: yaml - + # # Configuration file for NNPDF++ # @@ -162,7 +162,7 @@ It can be found `here Date: Mon, 15 Mar 2021 15:04:53 +0000 Subject: [PATCH 12/13] review comment update --- .../source/tutorials/thcov_tutorial.rst | 14 ---------- doc/sphinx/source/vp/dataspecification.rst | 14 ---------- .../source/vp/theorycov/point_prescrip.rst | 27 +++++++++++-------- 3 files changed, 16 insertions(+), 39 deletions(-) diff --git a/doc/sphinx/source/tutorials/thcov_tutorial.rst b/doc/sphinx/source/tutorials/thcov_tutorial.rst index f65210d467..cc863faf14 100644 --- a/doc/sphinx/source/tutorials/thcov_tutorial.rst +++ b/doc/sphinx/source/tutorials/thcov_tutorial.rst @@ -15,14 +15,6 @@ First, decide which theory covmat you want Next, add necessary flags to the runcard ---------------------------------------- - Remember to list the required datasets using ``dataset_inputs`` (see :ref:`data_specification`). -- Specify ``metadata_group: nnpdf31_process`` in the runcard. This means that - the grouping will be done according to process type specified in the ``plotting`` - files. -.. warning:: - If ``metadata_group`` is not set to ``nnpdf31_process`` it will default to - ``experiment`` and renormalisation scale variation correlations will be - between experiments rather than processes. See :ref:`backwards-compatibility` - for details. - Add ``theorycovmatconfig`` to the runcard. An example is in the following code snippet: .. code:: yaml @@ -39,12 +31,6 @@ Next, add necessary flags to the runcard use_thcovmat_in_fitting: true use_thcovmat_in_sampling: true - sampling_t0: - use_t0: false - - fitting_t0: - use_t0: true - ############################################################ - ``pdf`` is the PDF used to generate the scale varied predictions which diff --git a/doc/sphinx/source/vp/dataspecification.rst b/doc/sphinx/source/vp/dataspecification.rst index 8619f39722..3b50fb9927 100644 --- a/doc/sphinx/source/vp/dataspecification.rst +++ b/doc/sphinx/source/vp/dataspecification.rst @@ -390,8 +390,6 @@ input .. code:: yaml - metadata_group: nnpdf31_process - experiments: - experiment: NMC datasets: @@ -418,18 +416,6 @@ The user should be aware, however, that any grouping introduced in this way is purely superficial and will be ignored in favour of the experiments defined by the metadata of the datasets. -*IMPORTANT*: Note that all theory uncertainties runcards will need to be -updated to explicitly set ``metadata_group: nnpdf31_process``, or else the -prescriptions for scale variations will not vary scales coherently for data -within the same process type, as usually desired, but rather for data within -the same experiment. When running the examples in -:ref:`theory-covmat-examples`, it should be obvious if this has been set -because the outputs will be plots grouped by experiment rather than by process -type. However, care must be taken when using the theory covariance matrix but -not plotting anything, since the aforementioned check is not relevant. For -example, if you only want to produce a 𝞆² you must be careful to set the -``metadata_group`` key as above. - Runcards that request actions that have been renamed will not work anymore. Generally, actions that were previously named ``experiments_*`` have been renamed to highlight the fact that they work with more general groupings. 
diff --git a/doc/sphinx/source/vp/theorycov/point_prescrip.rst b/doc/sphinx/source/vp/theorycov/point_prescrip.rst index 05ded1791c..6ed4d7cb76 100644 --- a/doc/sphinx/source/vp/theorycov/point_prescrip.rst +++ b/doc/sphinx/source/vp/theorycov/point_prescrip.rst @@ -1,6 +1,3 @@ -.. The line below allows you to break to a new line by adding |br| at the end of the line -.. |br| raw:: html - .. _prescrips: Point prescriptions for theory covariance matrices @@ -13,7 +10,8 @@ appear in ``validphys2``. -------- .. note:: - ``theoryids``: 163, 180, 173 |br| + ``theoryids``: 163, 180, 173 + ``point_prescription: '3 point'`` .. math:: s_{11} = \frac{1}{2}\bigg\{ \Delta_1(+,+)^2 + \Delta_1(-,-)^2 \bigg\} @@ -25,7 +23,8 @@ appear in ``validphys2``. --------- .. note:: - ``theoryids``: 163, 177, 176, 179, 174 |br| + ``theoryids``: 163, 177, 176, 179, 174 + ``point_prescription: '5 point'`` .. math:: s_{11} = \frac{1}{2}\bigg\{ \Delta_1(+,0)^2 + \Delta_1(-,0)^2 + \Delta_1(0,+)^2 + \Delta_1(0,-)^2 \bigg\} @@ -40,7 +39,8 @@ appear in ``validphys2``. ------------------------------------ .. note:: - ``theoryids:`` 163, 180, 173, 175, 178 |br| + ``theoryids:`` 163, 180, 173, 175, 178 + ``point_prescription: '5bar point'`` .. math:: s_{11} = \frac{1}{2}\bigg\{ \Delta_1(+,+)^2 + \Delta_1(-,-)^2 + \Delta_1(+,-)^2 + \Delta_1(-,+)^2 \bigg\} @@ -57,9 +57,12 @@ appear in ``validphys2``. .. warning:: - **Deprecated prescription!** |br| - ``theoryids:`` 163, 177, 176, 179, 174, 180, 173 |br| - Specify in the runcard ``seventheories: original`` |br| + **Deprecated prescription!** + + ``theoryids:`` 163, 177, 176, 179, 174, 180, 173 + + Specify in the runcard ``seventheories: original`` + ``point_prescription: '7 point'`` .. math:: @@ -80,7 +83,8 @@ appear in ``validphys2``. -------------------------- .. note:: - ``theoryids:`` 163, 177, 176, 179, 174, 180, 173 |br| + ``theoryids:`` 163, 177, 176, 179, 174, 180, 173 + ``point_prescription: '7 point'`` .. math:: @@ -104,7 +108,8 @@ appear in ``validphys2``. .. note:: - ``theoryids:`` 163, 177, 176, 179, 174, 180, 173, 175, 178 |br| + ``theoryids:`` 163, 177, 176, 179, 174, 180, 173, 175, 178 + ``point_prescription: '9 point'`` .. math:: From 2a67a1f51c8edc708f815ab5e8db0e60357d3a10 Mon Sep 17 00:00:00 2001 From: Cameron Voisey Date: Mon, 15 Mar 2021 18:07:02 +0000 Subject: [PATCH 13/13] Tidy up point prescriptions docs page --- .../source/vp/theorycov/point_prescrip.rst | 50 ++++++++++--------- 1 file changed, 26 insertions(+), 24 deletions(-) diff --git a/doc/sphinx/source/vp/theorycov/point_prescrip.rst b/doc/sphinx/source/vp/theorycov/point_prescrip.rst index 6ed4d7cb76..7a8124715b 100644 --- a/doc/sphinx/source/vp/theorycov/point_prescrip.rst +++ b/doc/sphinx/source/vp/theorycov/point_prescrip.rst @@ -8,12 +8,12 @@ appear in ``validphys2``. 3 points -------- -.. note:: +.. note:: + + ``theoryids``: 163, 180, 173 + + ``point_prescription: '3 point'`` - ``theoryids``: 163, 180, 173 - - ``point_prescription: '3 point'`` - .. math:: s_{11} = \frac{1}{2}\bigg\{ \Delta_1(+,+)^2 + \Delta_1(-,-)^2 \bigg\} .. math:: s_{12} = \frac{1}{4}\bigg\{\bigg(\Delta_1(+,+) + \Delta_1(-,-) \bigg) \bigg(\Delta_2(+,+) + \Delta_2(-,-) \bigg) \bigg\} @@ -23,8 +23,8 @@ appear in ``validphys2``. --------- .. note:: - ``theoryids``: 163, 177, 176, 179, 174 - + ``theoryids``: 163, 177, 176, 179, 174 + ``point_prescription: '5 point'`` .. math:: s_{11} = \frac{1}{2}\bigg\{ \Delta_1(+,0)^2 + \Delta_1(-,0)^2 + \Delta_1(0,+)^2 + \Delta_1(0,-)^2 \bigg\} @@ -39,10 +39,10 @@ appear in ``validphys2``. 
------------------------------------ .. note:: - ``theoryids:`` 163, 180, 173, 175, 178 - + ``theoryids:`` 163, 180, 173, 175, 178 + ``point_prescription: '5bar point'`` - + .. math:: s_{11} = \frac{1}{2}\bigg\{ \Delta_1(+,+)^2 + \Delta_1(-,-)^2 + \Delta_1(+,-)^2 + \Delta_1(-,+)^2 \bigg\} .. math:: @@ -57,18 +57,19 @@ appear in ``validphys2``. .. warning:: - **Deprecated prescription!** - - ``theoryids:`` 163, 177, 176, 179, 174, 180, 173 - - Specify in the runcard ``seventheories: original`` - + **Deprecated prescription!** + + ``theoryids:`` 163, 177, 176, 179, 174, 180, 173 + + Specify in the runcard ``seventheories: original`` + ``point_prescription: '7 point'`` - + .. math:: \begin{split} - s_{11} = \frac{1}{3}\bigg\{ &\Delta_1(+,0)^2 + \Delta_1(-,0)^2 + \Delta_1(0,+)^2 + \Delta_1(0,-)^2 \\ + &\Delta_1(+,+)^2 + \Delta_1(-,-)^2 \bigg\} + s_{11} = \frac{1}{3}\bigg\{ &\Delta_1(+,0)^2 + \Delta_1(-,0)^2 + \Delta_1(0,+)^2 + \Delta_1(0,-)^2 \\ + + &\Delta_1(+,+)^2 + \Delta_1(-,-)^2 \bigg\} \end{split} .. math:: @@ -83,14 +84,15 @@ appear in ``validphys2``. -------------------------- .. note:: - ``theoryids:`` 163, 177, 176, 179, 174, 180, 173 - + ``theoryids:`` 163, 177, 176, 179, 174, 180, 173 + ``point_prescription: '7 point'`` .. math:: \begin{split} - s_{11} = \frac{1}{3}\bigg\{ &\Delta_1(+,0)^2 + \Delta_1(-,0)^2 + \Delta_1(0,+)^2 + \Delta_1(0,-)^2 \\ + &\Delta_1(+,+)^2 + \Delta_1(-,-)^2 \bigg\} + s_{11} = \frac{1}{3}\bigg\{ &\Delta_1(+,0)^2 + \Delta_1(-,0)^2 + \Delta_1(0,+)^2 + \Delta_1(0,-)^2 \\ + + &\Delta_1(+,+)^2 + \Delta_1(-,-)^2 \bigg\} \end{split} .. math:: @@ -108,8 +110,8 @@ appear in ``validphys2``. .. note:: - ``theoryids:`` 163, 177, 176, 179, 174, 180, 173, 175, 178 - + ``theoryids:`` 163, 177, 176, 179, 174, 180, 173, 175, 178 + ``point_prescription: '9 point'`` .. math:: @@ -117,7 +119,7 @@ appear in ``validphys2``. \begin{split} s_{11} = \frac{1}{4}\bigg\{ &\Delta_1(+,0)^2 + \Delta_1(-,0)^2 + \Delta_1(0,+)^2 + \Delta_1(0,-)^2 \\ - + &\Delta_1(+,+)^2 + \Delta_1(+,-)^2 + + &\Delta_1(+,+)^2 + \Delta_1(+,-)^2 + \Delta_1(-,+)^2 + \Delta_1(-,-)^2 \bigg\} \end{split}