\section{LSST System Requirements \& SRD Verification/Validation} \label{sec:srd}
\subsection{Operations Readiness Requirement}
The Project team shall characterize and document the performance of the integrated Rubin Observatory system with respect to the survey performance requirements and specifications enumerated in the LSST System Requirements (LSR; \citeds{LSE-29}) and Science Requirements Document (SRD; \citeds{LPM-17}) Section~3.
% Observatory System Specifications
\subsection{Objectives}
The scope of system-level science verification and validation activities includes:
\begin{itemize}
\item Determining whether the specifications defined in the OSS, LSR, and SRD are being met;
\item Characterizing other system performance metrics in the context of the four primary science drivers;
\item Studying environmental dependencies and technical optimization that inform early operations;
\item Documenting system performance and verifying mechanisms to monitor system performance during operations; and
\item Validating data delivery, derived data products, and data access tools that will be used by the science community.
\end{itemize}
The specific objective of this requirement is to quantify the range of scientific performance of the as-built Rubin Observatory with respect to LSR and SRD requirements through analysis of on-sky commissioning observations and informed simulations, and thereby demonstrate that system performance at delivery is consistent with meeting the primary science goals of the 10-year LSST.
The LSR is a comprehensive definition of the highest level Rubin Observatory system requirements.
The LSR is derived from the SRD, which describes the scientific motivations for the project, the survey capabilities of the Observatory, and the reference science missions used to develop detailed scientific specifications for the LSST.
In nearly all cases, adopted LSR specifications directly correspond to design specification values in the SRD, such that LSR verification will satisfy the intent of the SRD.
\subsubsection{Approach to verification and validation}
For the purpose of evaluating readiness, we define verification, validation, and characterization of Rubin Observatory data and processing as follows.
\begin{itemize}
\item {\it Verification}: Demonstrate that the system as built is consistent with the design. Ensure that the requirements for the system are met using Rubin Observatory and precursor data. Express the requirements in terms of metrics that can be evaluated using LSST and precursor data. Document the system performance for each of the verification metrics and requirements.
\item {\it Validation}: Demonstrate that the system is capable of meeting the scientific objectives of the survey. Ensure that the data products, data access, and science requirements can meet the objectives for LSST's four major science themes. Document the system performance for each of the validation metrics and requirements and verify that there exist mechanisms to monitor the system performance during operations. Validate that the derived data products and access tools meet the science requirements of the community.
\item {\it Characterization}: Determine how the performance of the system degrades as a function of environment and technical performance of the components of the system. Measure how the metrics used in verification change as a function of operational conditions (including weather, site, operations, telescope, instrument, and software).
\end{itemize}
Verification studies include:
\begin{itemize}
\item Generation of all required data products and services;
\item Demonstration that relevant metadata are being collected and archived;
\item Astrometric performance (relative and absolute);
\item Photometric performance (relative and absolute);
\item Data throughput and processing requirements for prompt data products;
\item Completeness and purity of sources detected in Alert Production (AP) and Data Release Production (DRP);
\item Image template generation;
\item Completeness and purity of moving object orbit calculations;
\item The impact of stray light and optical ghosts;
\item Image quality (defined for each subsystem: telescope, camera, data management); and
\item Crosstalk, filter response, and calibration.
\end{itemize}
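Several of the verification studies above reduce to computing a metric from repeated measurements and comparing it against a tabulated specification. A minimal sketch of one such metric, photometric repeatability, is shown below; the threshold value and function name are illustrative placeholders, not the actual LSR specification.

```python
import numpy as np

def photometric_repeatability_mmag(mags_by_source):
    """RMS scatter of repeated magnitude measurements, in mmag.

    mags_by_source: list of arrays, each holding the repeated
    magnitude measurements of one non-variable source.
    """
    scatters = [np.std(m, ddof=1) for m in mags_by_source if len(m) > 1]
    # The median over sources gives a robust ensemble statistic.
    return 1000.0 * float(np.median(scatters))

# Toy data: three non-variable sources, each observed ten times
# with ~5 mmag of measurement noise.
rng = np.random.default_rng(42)
sources = [20.0 + rng.normal(0.0, 0.005, size=10) for _ in range(3)]
metric = photometric_repeatability_mmag(sources)

THRESHOLD_MMAG = 10.0  # hypothetical spec value, not the actual LSR number
print(metric < THRESHOLD_MMAG)
```

In practice such metrics are computed per band and per spatial region, with outlier-robust estimators; the sketch only illustrates the metric-versus-specification pattern.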
The verification will make use of Quality Assessment (QA) and Quality Control (QC) tools developed during Data Management construction, and performance will be compared against the tabular requirements in the LSR.
Each LSR requirement has been decomposed into individual verification tickets. Each verification ticket has a designated verification method and domain of test, and has been associated with one of the CCRs to indicate the phasing of verification. The phasing can be summarized as follows:
\begin{itemize}
\item CCR1: system-level functional capabilities to support on-sky commissioning; no system-level science performance requirements from the LSR are associated with CCR1
\item CCR2: aspects of system-level science performance related to the intrinsic information content of the single-visit images, e.g., optical system throughput, image quality (PSF FWHM, ellipticity), ghosts/scattered light, sensor anomalies
\item CCR3: aspects of system-level science performance that characterize an ensemble of visit images and/or which relate to capability to calibrate visit images, e.g., PSF modeling, astrometric repeatability, photometric repeatability
\item CCR4: aspects of system-level science performance related to the survey performance and associated data products, e.g., photometric uniformity, PSF ellipticity residuals at full survey depth
\item Beyond CCR4: aspects of system-level science performance that require one or more years of survey operations to verify, e.g., cadence of annual LSST Data Releases
\end{itemize}
This verification phasing is designed to establish confidence that the as-built Rubin Observatory is capable of routinely acquiring acceptable science-grade imaging across the LSSTCam full focal plane (i.e., attainment of the System First Light technical milestone \citeds{SITCOMTN-061}) early in the on-sky commissioning period.
Science Validation surveys at the conclusion of the commissioning period (Section~\ref{sec:svs}) are designed to collect a data volume $\gtrsim 1\%$ of the 10-year LSST to enable survey-scale validation and characterization studies.
Allowing for time needed to process and scientifically analyze data from the Science Validation surveys, it is anticipated that Operations will commence prior to the final verification of all system-level science performance requirements to be reported at CCR4.
Most CCR4 requirements are expected to be verified during the course of on-sky commissioning, including final analysis of the Science Validation Surveys, and early operations.
For system-level science performance verification, the majority of test cases described under the LSST Verification and Validation project will be implemented using metrics and/or data visualizations that are generated as part of Science Pipelines execution (e.g., analysis\_tools), as separate test procedures (e.g., Jupyter notebooks on the Rubin Science Platform), or via inspection/demonstration (e.g., to show that a service or data product has been delivered).
In addition to the normative data quality requirements defined in the OSS, LSR, and SRD, there are several science validation and characterization objectives that represent important benchmarks of scientific capability. The optimization of associated algorithms is in many cases an active research topic, and performance is expected to improve throughout Operations. Potential science validation studies include:
\begin{itemize}
\item Characterization of blending effects, e.g., prevalence of unrecognized blends and object photometry in blended scenes;
\item Object classification, e.g., accuracy of star-galaxy separation;
\item Galaxy photometry, e.g., for photometric redshifts;
\item Difference image analysis photometry, e.g., for statistical variability metrics and lightcurves of transient objects;
\item Low surface brightness features;
\item Weak-lensing null tests and shear calibration;
\item Treatment of crowded fields.
\end{itemize}
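Several of these validation studies reduce to crossmatching a detected catalog against a reference or truth catalog and reporting completeness and purity. A minimal sketch, using 1-D positions as a stand-in for sky coordinates and a hypothetical matching radius:

```python
import numpy as np

def completeness_purity(detected, truth, max_sep=1.0):
    """Match 1-D positions between a detected catalog and a truth catalog;
    return (completeness, purity). max_sep is a hypothetical match radius."""
    detected = np.asarray(detected, dtype=float)
    truth = np.asarray(truth, dtype=float)
    # A truth source is recovered if any detection lies within max_sep.
    recovered = np.array([np.min(np.abs(detected - t)) <= max_sep for t in truth])
    # A detection is real if any truth source lies within max_sep.
    real = np.array([np.min(np.abs(truth - d)) <= max_sep for d in detected])
    return recovered.mean(), real.mean()

truth = [10.0, 20.0, 30.0, 40.0]
detected = [10.1, 20.2, 55.0]    # two matched detections plus one spurious
c, p = completeness_purity(detected, truth)
print(round(c, 2), round(p, 2))  # prints: 0.5 0.67
```

A production study would use sky coordinates with a k-d-tree match and report completeness and purity as functions of magnitude; the sketch shows only the counting logic.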
A collection of topical working groups for science verification and validation has been organized to provide coverage of these science validation areas.
In addition, more than 100 individuals in the Rubin science community are making non-financial contributions to the System Integration, Test, and Commissioning effort to facilitate an efficient transition to LSST Operations and increase the overall scientific output of the survey \citeds{SITCOMTN-050}. By sharing their technical and scientific expertise, these individuals enhance and diversify the Project's planned commissioning effort. The named participants will work directly alongside Rubin Observatory staff in completing their assignments and, in exchange, will have access to commissioning data products as they are acquired. The Project will not rely on the contributions from non-Rubin-staff team members to fulfill core construction requirements and operational readiness criteria. However, science validation analyses performed by these individuals will provide a preview of realistic scientific workflows using commissioning data, and are thus complementary to the Early Science Program (\citeds{RTN-011}) for the purpose of validating data access services and science data quality from a science user perspective. No papers presenting novel scientific results based on commissioning data may be posted or submitted by anyone before the associated release as part of the Early Science Program.
\subsection{Criteria for Completeness}
The characterization and documentation of science performance at the conclusion of the Construction project will be considered successfully complete when all requirements in the LSR have been verified.
At a minimum, LSR requirements associated with CCR1, CCR2, and CCR3 must be verified at the end of Construction following the process defined in the Verification and Validation Process document (\citeds{LSE-160}) and associated documentation.
For any requirements that are not verified by that time, a waiver will be sought to enter Operations, and verification will be completed within the first year of Operations.
\subsection{Pre--Operations Interaction}
\begin{itemize}
\item Brief the Operations team on the status of science verification, validation, and characterization; and
\item Hand off QA and QC tools, ensuring that the Operations team can run these tools, interpret the results, and add new metrics and visualizations as needed.
\end{itemize}
\subsection{Artifacts for Completion}
The following artifacts will be provided:
\begin{itemize}
\item Minimum:
\begin{itemize}
\item A verification matrix containing entries for all LSR requirements (\citeds{LSE-29}) and specifications, including verification methods (inspection, demonstration, analysis, or test) for each requirement;
\item Final compliance status, including all non-compliance reports and associated impact studies;
\item Test plans and reports for all test campaigns associated with system-level science performance;
\item Draft of at least one Construction Paper with scope sufficient to demonstrate the attainment of the System First Light technical milestone to support the Early Science Program (not released prior to the Rubin First Look media event -- expected delivery by CCR3);
\item Outline of at least one Construction Paper to provide an overview of the components of the as-built Rubin Observatory and technical performance at the time of delivery (planned to be released around the time of CCR4).
\end{itemize}
\item Baseline:
\begin{itemize}
\item Artifacts above;
\item Technotes published to \url{lsst.io} that describe a small collection of end-to-end studies to demonstrate realistic workflows used for science validation (see examples above). It is envisioned that these studies might mature into full scientific publications during the first year of Operations and might involve collaboration with the broader scientific community (\citeds{SITCOMTN-076});
\item Drafts of additional Construction papers describing individual subsystems in greater detail.
\end{itemize}
\end{itemize}