Fix up Ar39 estimates, add and dismiss noise, sharpen up SNB explanation and general wording.

brettviren committed May 13, 2015
1 parent 5a0d889 commit cce39e6
Showing 3 changed files with 118 additions and 98 deletions.
212 changes: 114 additions & 98 deletions annex-rate/chapter-ref-fd.tex
@@ -17,59 +17,71 @@ \section{Overview}



\section{Thresholds for the LArTPC Data Rate Estimations}

There are three threshold levels considered for the purposes of characterizing the LArTPC data rates.
These thresholds are assumed to be applied ``per-wire'' and on the basis of ADC values (which can be translated
to units like MeV with proper calibration).
Note that they are meant to reflect different energy scales of the physics phenomena being
measured in the TPC and must be considered separately from thresholds and other parameters used in the
DAQ for its internal real-time processing of the data (for example in the anticipated internal ``trigger stream'').

Optimizing these thresholds for physics requires additional study;
they are used here only to provide benchmarks for the data rate
estimations.
The data produced at each threshold are termed:

\begin{description}
\item[full-stream] The full-stream (FS) threshold means there is no threshold at all.
In FS data, every time bin (as defined by the ADC clock) on every channel is read out.
\item[zero-suppressed] The zero-suppression (ZS) threshold is an ADC
level roughly corresponding to what a \chargezsthreshold deposition
would produce in a single time bin on a single wire.
All bins with ADC counts below this threshold are removed from the
data stream.
\item[high-energy] A high-energy (HE) threshold is assumed to be high
enough that all signals from radioactive decays are suppressed but low
enough not to impact activity from beam neutrino interactions or
potential nucleon decay.
Further studies are needed to determine this threshold but currently
it is taken, without rigor, to be \SI{1}{\MeV}--\SI{10}{\MeV}.
\end{description}
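
For orientation only, the ADC equivalent of an energy threshold can be
estimated from the ionization yield of liquid argon; the numbers below
are illustrative assumptions (a recombination factor of roughly 0.7 and
$W_{\mathrm{ion}} \approx \SI{23.6}{\eV}$), not calibrated values, and
the final conversion also depends on the electronics gain and ADC scale.
\[
N_e \approx \mathcal{R}\,\frac{E}{W_{\mathrm{ion}}}
\approx 0.7 \times \frac{\SI{0.5}{\MeV}}{\SI{23.6}{\eV}}
\approx \num{1.5e4} \mbox{ electrons}
\]
for a deposition at the nominal ZS threshold of \chargezsthreshold.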

\section{Assumptions on the DAQ}

The data rate estimates make the following assumptions on the DAQ capabilities.
In discussions with the DAQ experts these are expected to be satisfied.
More information on the DAQ is available in Volume 4 of the CDR.

\begin{itemize}
\item The DAQ trigger farm will be able to identify isolated activity
collected in a single APA consistent with $^{39}$Ar decay in order
to suppress reading it to the DAQ data farm (a sketch of such a
selection is given below).
\item The RCEs will be able to apply a zero-suppression
algorithm with a given threshold to data sent to the trigger farm.
\item The RCEs will be able to apply trigger-specific zero-suppression
algorithms and thresholds to the data sent to the data farm.
\item RCE-local data storage is left open as a possibility.
\end{itemize}
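
The following is a minimal sketch, in Python, of what the per-APA
``isolated $^{39}$Ar-like'' selection in the first assumption could
look like.
The data model (a list of zero-suppressed hits per APA readout), the
function name and all cut values are illustrative assumptions, not the
actual DAQ design.

\begin{verbatim}
# Illustrative sketch only: flag an APA readout as isolated, Ar39-like
# activity so that it can be suppressed before reaching the data farm.
# The hit model and all cut values are assumptions for illustration.

def is_isolated_ar39_like(hits,
                          max_total_energy_mev=1.0,
                          max_wire_span=3,
                          max_time_span_us=10.0):
    """hits: list of (wire, time_us, energy_mev) above the ZS threshold
    collected in one APA readout."""
    if not hits:
        return True  # nothing above threshold at all
    wires = [w for w, _, _ in hits]
    times = [t for _, t, _ in hits]
    total_energy = sum(e for _, _, e in hits)
    return (total_energy <= max_total_energy_mev          # small deposit
            and max(wires) - min(wires) <= max_wire_span  # few nearby wires
            and max(times) - min(times) <= max_time_span_us)  # short in time

# Readouts flagged this way would not be sent to the data farm;
# anything larger or more extended would be kept.
\end{verbatim}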

\section{Sources of Data in the TPC}

The data rate estimates are produced by considering specific sources
of data, each with a different rate depending on the threshold
applied.
The sources are:

\begin{description}
\item[in-spill] Any activity in the detector which is coincident with
the passage of beam neutrinos through the detector.
\item[with-beam-$\nu$] A subset of \textit{in-spill} where activity is
consistent with a beam-$\nu$ interacting in the detector.
\item[cosmic-$\mu$] Activity due to the passing of cosmic-ray muons
through the detector.
\item[radioactivity] Activity due to the decay of radioactive
isotopes.
The dominant source of data is expected to be from $^{39}$Ar and
others are not currently considered.
\item[atm-$\nu$] Activity consistent with interactions from
atmospheric neutrinos.
\item[noise] Fluctuations in electronics noise (common-mode noise is not considered).
\end{description}

\section{Fundamental Parameters of the LArTPC}

This section provides a selection of fundamental parameters used as
input to the data rate estimations.
The parameters are summarized in
table~\ref{tab:fundamental-parameters}.

@@ -93,7 +105,8 @@ \section{Full-stream Data}
The third contains two numbers that characterize data volume relevant to a strategy which aims to record FS data
for Supernova Burst candidates.
The final row shows the total annual data volume that the DUNE DAQ is capable of producing (in theory).

These numbers are not meant to imply ongoing recording of full-stream
data to permanent storage.
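
For orientation, the FS rate is simply the product of the number of
channels, the sampling frequency, and the bytes per sample.
With illustrative round numbers (not the official detector
parameters), a detector with $4\times10^{5}$ channels sampled at
\SI{2}{\mega\hertz} with \SI{2}{\byte} samples would produce
\[
R_{\mathrm{FS}} \approx 4\times10^{5} \times \SI{2}{\mega\hertz} \times \SI{2}{\byte}
\approx \SI{1.6}{\tera\byte\per\second},
\]
and the annual capability follows by multiplying by the roughly
$3.2\times10^{7}$ seconds in a year.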

\input{annex-rate/generated/full-stream-volume}

@@ -102,36 +115,25 @@ \section{Zero-suppressed Data}
There are options in choosing the exact zero-suppression (ZS) procedure,
and the final choice has not been made.
For these data rate estimates a very simple procedure is assumed: in each
channel, all digitized time bins in which the ADC values are below
the given threshold are removed.
More discussion on possible alternative ZS methods and their impact on
the data rate is given below.
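
As a concrete illustration, the following Python sketch implements
this simple per-channel suppression; the function name, data layout
and the use of a pedestal-subtracted waveform are assumptions for
illustration only and do not describe the actual RCE implementation.

\begin{verbatim}
# Minimal sketch of the simple zero suppression assumed for these
# estimates: drop every time bin whose ADC value is below threshold.

import numpy as np

def zero_suppress(waveform_adc, threshold_adc):
    """Return (bin_indices, adc_values) for bins at or above threshold.

    waveform_adc : 1-D array of pedestal-subtracted ADC counts, one channel
    threshold_adc: ADC equivalent of the chosen energy threshold
    """
    waveform_adc = np.asarray(waveform_adc)
    keep = waveform_adc >= threshold_adc   # bins below threshold are removed
    return np.nonzero(keep)[0], waveform_adc[keep]

# Example: a six-bin waveform with one small pulse.
bins, adcs = zero_suppress([0, 1, 12, 30, 9, 0], threshold_adc=10)
# bins -> [2, 3], adcs -> [12, 30]; only these (index, value) pairs
# remain in the data stream.
\end{verbatim}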

The nominal ZS threshold is assumed to be the ADC counts produced when
the equivalent of \chargezsthreshold energy is deposited and the
resulting ionization is collected by a single wire.
It is assumed that the application of zero-suppression at this
threshold completely removes ADC counts due to noise alone, although
an estimate of the data rate due to noise is given below.

Estimations of the different sources of ZS data are summarized in
table~\ref{tab:zs-volume}.

\input{annex-rate/generated/zs-volume-table}

\subsubsection{$^{39}$Ar Decays}

The $^{39}$Ar decay could potentially dominate the data volume.
The end point of the $^{39}$Ar beta decay is at \SI{565}{\keV} and about
25\% of the beta spectrum is above the ZS threshold~\cite{docdb3018}.
The expected total decay rate is
@@ -151,57 +153,71 @@ \section{Zero-suppressed Data}
As small as these events are, they are numerous enough that their data
volume is not justified given their relative lack of physics importance.
Some mitigation is required and will be developed.
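
For orientation, the annual volume quoted in table~\ref{tab:zs-volume}
is consistent with simple arithmetic on the tabulated rate and event
size (taking one year as roughly $3.2\times10^{7}$ seconds):
\[
\SI{11.16}{\mega\hertz} \times \SI{150}{\byte} \approx \SI{1.7}{\giga\byte\per\second},
\qquad
\SI{1.7}{\giga\byte\per\second} \times \SI{3.2e7}{\second} \approx \SI{53}{\peta\byte}.
\]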

The simplest mitigation is to increase the ZS threshold to be above
the decay endpoint.
This, however, also removes small energy depositions associated with
larger events and thus will only be considered if absolutely required
to mitigate the rate.

Another approach is to raise this threshold for all events except
those coincident with beam neutrinos passing through the far detector.
The beam rate is averaged over one full year assuming a run fraction of
\beamrunfraction, a rep rate of \beamreprate and a beam spill detector
event occupancy of \beameventoccupancy.
The annual data volume from selecting only those $^{39}$Ar decays which
are coincident with a beam spill and above the nominal ZS threshold is
estimated to be \betainspillyear.
If a more sophisticated selection were to apply the nominal ZS
threshold only to those events which have significant activity
consistent with neutrino interactions, then the $^{39}$Ar decays would
add an additional annual data volume of \betainbeamyear.
If, further, only the single drift cell which contains the
higher-energy neutrino interaction activity is allowed to accept any
$^{39}$Ar activity, then their contribution to the data rate becomes
negligible.
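
Schematically, with $f_{\mathrm{run}}$ the run fraction,
$r_{\mathrm{rep}}$ the repetition rate, $T_{\mathrm{readout}}$ the
per-spill readout window and $\epsilon_{\mathrm{occ}}$ the spill
occupancy, these in-spill and with-beam-$\nu$ volumes scale from the
total $^{39}$Ar volume roughly as
\[
V(\mbox{in-spill}) \approx V(\mbox{total}) \times
f_{\mathrm{run}}\, r_{\mathrm{rep}}\, T_{\mathrm{readout}},
\qquad
V(\mbox{with-beam-}\nu) \approx V(\mbox{in-spill}) \times \epsilon_{\mathrm{occ}}.
\]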

A better approach is the one that drives the requirement on the DAQ
that the trigger farm be capable of identifying isolated activity
consistent with $^{39}$Ar on a per-APA basis in order to veto its
recording.
The DAQ is expected to be able to provide this functionality.
This then leaves those $^{39}$Ar decays which are accidentally
coincident in the same APA with readouts from other activity such as
beam-$\nu$ interactions and cosmic-ray muons.
The annual data volume from above-ZS-threshold $^{39}$Ar decays
coincident anywhere in the DUNE detector with beam-$\nu$ activity is
given in table~\ref{tab:zs-volume} as \betainbeamyear.
Of that, only 3\% are coincident in the same APA, bringing the added
data rate to about 10\% of that of the beam-$\nu$ activity.

\subsubsection{Supernova Burst}

The Supernova Burst (SNB) data is estimated assuming a
\textit{false-positive} SNB rate of \snbcandrate and a readout time of
\snbreadouttime.
It should be emphasized that both these parameters are subject to
modification and are used simply to provide benchmark examples.
The SNB event size is what must be acquired promptly through that
readout time, and the data rate is averaged over the year.
The annual data volume is what would be saved to disk if all
false-positive events are kept.
The table also includes an estimate assuming full-stream data is
kept for the SNB candidates.

One possible source of false-positive SNB triggers may be an upward
fluctuation in the rate of $^{39}$Ar decays.
Whatever initiates it, it is assumed that the data volume of a
false-positive SNB trigger is dominated by $^{39}$Ar decays.

In the case of an actual SNB, its neutrino interactions will produce
on the order of 1000 events across the far detector modules over a
time of around ten seconds and with a neutrino spectrum up to
\SI{100}{\MeV}.
This corresponds to less than a single neutrino interaction per APA
readout on average, with the rest of the readout time filled with
signals from radiological background (if not mitigated).
Given the importance of collecting SNB neutrinos, the same
trigger-based reduction of $^{39}$Ar will not be employed and thus it
will dominate the data volume.
Due to the difference in energy scale, each SNB neutrino is expected
to contribute only about 10\% of the data of a \si{\GeV}-scale beam
event, or about \SI{1}{\mega\byte}, so the additional data volume due
to the SNB events themselves is not significant.

Further, a lower ZS threshold may be considered for saving such
candidate SNB occurrences.
The possible data rate of SNB candidates is thus bounded by the nominal
ZS $^{39}$Ar rate and the FS rate.
The exact strategy for saving SNB candidates requires additional study
and may have implications on DAQ hardware.
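
In terms of the quantities above, the volume of a single SNB candidate
readout is then bounded roughly by
\[
R_{\mathrm{ZS\,Ar}} \times T_{\mathrm{readout}}
\;\le\; V(\mbox{SNB candidate}) \;\le\;
R_{\mathrm{FS}} \times T_{\mathrm{readout}},
\]
where $R_{\mathrm{ZS\,Ar}} \approx \SI{1.7}{\giga\byte\per\second}$ is
the zero-suppressed $^{39}$Ar rate from table~\ref{tab:zs-volume} and
$R_{\mathrm{FS}}$ is the full-stream rate; the annual volume follows by
multiplying by the assumed false-positive rate \snbcandrate.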

\subsection{Noise}

% fixme: is this remotely true?
The nominal ZS threshold of \chargezsthreshold is based on an older
requirement of a 9:1 signal-to-noise ratio for 1 MIP.
The most probable MIP energy loss is \SI{1.8}{\MeV/\cm}, or \SI{0.9}{\MeV} per wire pitch.
This implies an RMS noise requirement equivalent to \SI{0.1}{\MeV},
and the \SI{0.5}{\MeV}-equivalent ZS threshold then represents a $5\sigma$ cut.
Across the \dunenumberchannels channels and the
\daqreadoutchannelsamples per readout, such fluctuations give about
\num{5000} samples above the nominal ZS threshold from noise alone.
Their distributed nature and isolated appearance make them subject to
the same rejection criteria as isolated $^{39}$Ar decays and thus they
can be ignored for the purposes of data volume estimates.
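
As a rough consistency check (assuming purely Gaussian noise), the
one-sided probability of a $5\sigma$ upward fluctuation is about
$2.9\times10^{-7}$ per sample, so
\[
N_{\mathrm{noise}} \approx 2.9\times10^{-7} \times N_{\mathrm{chan}} N_{\mathrm{samp}}
\approx \num{5000}
\]
corresponds to $N_{\mathrm{chan}} N_{\mathrm{samp}} \approx \num{1.7e10}$
samples per readout, i.e.\ of order $10^{6}$ channels times of order
$10^{4}$ samples per channel.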


\section{High-energy Threshold}

2 changes: 2 additions & 0 deletions annex-rate/generated/zs-volume-table.tex
@@ -5,6 +5,8 @@
Source & Event Rate & Event Size & Data Rate & Annual Data Volume \\ \toprowrule
\colhline
all $^{39}Ar$ & \SI[round-mode=places,round-precision=1]{11.16}{\mega\hertz} & \SI[round-mode=places,round-precision=0]{150.0}{\byte} & \SI[round-mode=places,round-precision=1]{1.674}{\giga\byte\per\second} & \SI[round-mode=places,round-precision=0]{52.8262940816}{\peta\byte}\\
all in-spill & & & & \SI[round-mode=places,round-precision=0]{158.558121686}{\tera\byte} \\
with-beam-$\nu$ & & & & \SI[round-mode=places,round-precision=0]{79.279060843}{\giga\byte} \\
\colhline
cosmic-$\mu$ & \SI[round-mode=places,round-precision=3]{0.258947264}{\hertz} &
\SI[round-mode=places,round-precision=1]{2.5}{\mega\byte} & \SI[round-mode=places,round-precision=1]{647.36816}{\kilo\byte\per\second} &
2 changes: 2 additions & 0 deletions annex-rate/templates/zs-volume-table.tex.j2
@@ -5,6 +5,8 @@
Source & Event Rate & Event Size & Data Rate & Annual Data Volume \\ \toprowrule
\colhline
all $^{39}Ar$ & ~{{beta_rate.sicmd}}~ & ~{{beta_readout_size.sicmd}}~ & ~{{beta_data_rate.sicmd}}~ & ~{{beta_data_year.sicmd}}~\\
all in-spill & & & & ~{{beta_in_spill_year.sicmd}}~ \\
with-beam-$\nu$ & & & & ~{{beta_in_beam_year.sicmd}}~ \\
\colhline
cosmic-$\mu$ & ~{{cosmic_muon_rate.sicmd}}~ &
~{{cosmic_muon_event_size.sicmd}}~ & ~{{cosmic_muon_data_rate.sicmd}}~ &
