QuickStart

↑ Go to the Table of Contents ↑ | Continue to Modules Development →

Requirements

A Linux machine (CC7 or Ubuntu) or a Mac. See the O2 instructions below for the exact supported versions.

Setup

  1. Set up the O2 environment and tools
    We use aliBuild; see the complete instructions here (prefer the second option, not alidock). In particular, make sure to follow these steps:

    1. Install GLFW to have GUIs in the DPL (optional; DPL GUIs work neither in containers nor over SSH).
      • CC7 : sudo yum install -y glfw-devel --enablerepo=epel
      • Mac : brew install glfw
    2. Prerequisites
    3. Install aliBuild
    4. Check setup and build O2
  2. Prepare the QualityControl development package

    • aliBuild init QualityControl@master --defaults o2
  3. Build/install the QualityControl, its GUI (qcg) and the Readout. The simplest way is to use the meta-package O2Suite.

    • aliBuild build O2Suite --defaults o2
    • At this point you might encounter a message about missing system requirements. Run aliDoctor O2Suite to get full information about what is missing and how to install it.

Note: on non-CC7 systems, you can also use the aliBuild "defaults" called o2-dataflow to avoid building simulation-related packages.
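For example, the build command from the previous step then becomes:

aliBuild build O2Suite --defaults o2-dataflow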

Environment loading

Whenever you want to work with O2 and QualityControl, do either alienv enter O2Suite/latest or alienv load O2Suite/latest.

Execution

To make sure that your system is correctly set up, we are going to run a basic QC workflow attached to a simple data producer. We will use central services for the repository and the GUI. If you want to set them up on your computer or in your lab, please have a look here and here.

Basic workflow

We are going to run a basic workflow whose various processes are shown in the following schema.

[Figure: schema of the basic QC workflow]

The Producer is a random data generator. In a more realistic setup it would be a processing device or the Readout. The Data Sampling is the system in charge of dispatching data samples from the main data flow to the QC tasks. It can be configured to dispatch different proportions or different types of data. The tasks are in charge of analyzing the data and preparing QC objects, often histograms, that are then pushed forward every cycle. A cycle is 10 seconds in this example; in production it is closer to 1 minute. The Checker is in charge of evaluating the MonitorObjects produced by the QC tasks. It runs Checks defined by the users, for example checking that the mean is above a certain limit. It can also modify the appearance of the histogram, e.g. by changing the background color or adding a PaveText. Finally, the Checker is also in charge of storing the resulting MonitorObject in the repository, where it will be accessible from the web GUI. It also pushes it to a Printer for the sake of this tutorial.
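To give an idea of what such a user-defined Check might look like, here is a minimal sketch based on the CheckInterface of the QC framework. The class name, the "example" object it inspects and the limit value are illustrative only, and the exact virtual method signatures may differ between framework versions (other methods, e.g. configure, are omitted).

#include "QualityControl/CheckInterface.h"
#include "QualityControl/MonitorObject.h"
#include "QualityControl/Quality.h"
#include <TH1.h>
#include <map>
#include <memory>
#include <string>

namespace o2::quality_control_modules::skeleton
{

using o2::quality_control::core::MonitorObject;
using o2::quality_control::core::Quality;

// Hypothetical Check: the histogram is Good only if its mean is above a limit.
class MeanIsAboveLimitCheck : public o2::quality_control::checker::CheckInterface
{
 public:
  Quality check(std::map<std::string, std::shared_ptr<MonitorObject>>* moMap) override
  {
    auto* histo = dynamic_cast<TH1*>(moMap->at("example")->getObject());
    return (histo != nullptr && histo->GetMean() > 1.0) ? Quality::Good : Quality::Bad; // illustrative limit
  }

  void beautify(std::shared_ptr<MonitorObject> mo, Quality checkResult) override
  {
    // e.g. change the background colour or add a TPaveText depending on checkResult
  }

  std::string getAcceptedType() override { return "TH1"; }
};

} // namespace o2::quality_control_modules::skeleton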

To run it simply do:

o2-qc-run-basic

Thanks to the Data Processing Layer (DPL, more details later), this is a single process that steers all the devices, i.e. the processes making up the workflow. A window should appear showing a graphical representation of the workflow. The output of any of the processes is available by double-clicking its box. If a box is red, it means that the process has stopped, probably abnormally.

[Figure: the DPL debug GUI showing the basic workflow]

The example above consists of one DPL workflow which has both the main processing and the QC infrastructure declared inside. In a real case, we would usually prefer to attach the QC without modifying the original topology. This can be done by merging two (or more) workflows, as shown below:

o2-qc-run-producer | o2-qc --config json://${QUALITYCONTROL_ROOT}/etc/basic.json

[Figure: schema of the basic workflow split into two executables]

This command uses two executables. The first one contains only the Producer (see the figure above), which represents the data flow to which we want to apply the QC. The second executable generates the QC infrastructure based on the given configuration file (more details in a few sections). These two workflows are joined together using the pipe character |. This example illustrates how to add QC to any DPL workflow by piping it into o2-qc and passing a configuration file.
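The same pattern applies to attaching QC to any other DPL workflow; in the line below, o2-my-workflow and the configuration path are placeholders for your own executable and file:

o2-my-workflow | o2-qc --config json:///path/to/my-config.json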

Repository and GUI

The data is stored in the ccdb-test repository at CERN. If everything works fine you should see the objects being published in the QC web GUI (QCG) at this address: https://qcg-test.cern.ch/?page=objectTree. The link brings you to the hierarchy of objects (see the screenshot below). Open "/qc/TST/QcTask" (the task you are running) and click on "example", which is the name of your histogram.

[Screenshot: the QCG object tree]

TODO add a link to the user documentation of the QCG when it is written.

Configuration file

In the example above, the devices are configured in the config file named basic.json. It is installed in $QUALITYCONTROL_ROOT/etc. Each time you rebuild the code, $QUALITYCONTROL_ROOT/etc/basic.json is overwritten by the file in the source directory (~/alice/QualityControl/Framework/basic.json).

The configuration for the QC is made of many parameters described in an advanced section of the documentation. For now, we can just look at the definition of a task below. moduleName and className specify, respectively, the library to load and the class to instantiate to do the actual job of the task.

(...)
"tasks": {
  "QcTask": {
    "active": "true",
    "className": "o2::quality_control_modules::skeleton::SkeletonTask",
    "moduleName": "QcSkeleton",
    "cycleDurationSeconds": "10",
(...)

Try to change the name of the task by replacing QcTask with a name of your choice. Relaunch the workflows. You should now see the object published under a different directory in the QCG.
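For example, renaming it to MyFirstTask (an arbitrary name chosen for illustration) would look as follows; the object would then appear in the QCG under a directory named after MyFirstTask instead of QcTask:

(...)
"tasks": {
  "MyFirstTask": {
    "active": "true",
    "className": "o2::quality_control_modules::skeleton::SkeletonTask",
    "moduleName": "QcSkeleton",
    "cycleDurationSeconds": "10",
(...)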

Readout chain

In this second example, we are going to use the Readout as our data source.

[Figure: schema of the Readout chain workflow]

This workflow is a bit different from the basic one. The Readout is neither a DPL nor a FairMQ device, and thus we need a proxy to get data from it. This is the extra box feeding the Data Sampling, which then injects data into the task. This is handled by the Readout as long as you enable the corresponding configuration flag.

To do so, open the Readout config file located at $READOUT_ROOT/etc/readout.cfg and make sure that the following properties are correct:

# First make sure we never exit
[readout]
(...)
exitTimeout=-1
(...)
# And enable the data sampling
[consumer-data-sampling]
consumerType=DataSampling
enabled=1
(...)

Start Readout:

readout.exe file://$READOUT_ROOT/etc/readout.cfg

Start the proxy, Data Sampling and QC workflows:

o2-qc-run-readout | o2-qc --config json://${QUALITYCONTROL_ROOT}/etc/readout.json

The Data Sampling is configured to sample 1% of the data, as the Readout runs at full speed by default.

Getting real data from readout

The first option is to configure readout.exe to connect to a CRU. Please refer to the Readout documentation.

A more practical approach is to record a data file with Readout and then replay it on your development setup to develop and test your QC. The configuration options are described here, in particular:

| Section | Parameter | Type | Default | Description |
| --- | --- | --- | --- | --- |
| equipment-player-* | filePath | string | | Path of the file containing the data to be injected in readout. |
| equipment-player-* | preLoad | int | 1 | If 1, data pages are preloaded with the file content on startup. If 0, data is copied at runtime. |
| equipment-player-* | fillPage | int | 1 | If 1, the content of the data file is copied multiple times into each data page until the page is full (or almost full: on the last iteration, there is no partial copy if the remaining space is smaller than the full file size). If 0, the data file is copied exactly once into each data page. |
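As an illustration, a file-replay equipment section in readout.cfg could look roughly like the sketch below. The section name, the equipmentType value and the file path are assumptions to be checked against the Readout documentation:

# hypothetical file-replay equipment (adapt names and values to your setup)
[equipment-player-0]
equipmentType=player
enabled=1
filePath=/tmp/recorded-data.raw
preLoad=1
fillPage=1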

Readout data format as received by the Task

The header is an O2 header populated with data from the header built by the Readout. The received payload is a 2 MB (configurable) data page made of 8 kB CRU pages.
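In practice, a QC Task can access this header and payload in its monitorData method, for example along these lines (a sketch only; MyTask stands for your own TaskInterface implementation):

#include <Framework/ProcessingContext.h>
#include <Headers/DataHeader.h>

void MyTask::monitorData(o2::framework::ProcessingContext& ctx)
{
  using o2::header::DataHeader;
  for (auto&& input : ctx.inputs()) {
    if (input.header == nullptr || input.payload == nullptr) {
      continue;
    }
    // the O2 header describing this data page; payloadSize gives its size in bytes
    const auto* header = o2::header::get<DataHeader*>(input.header);
    // the payload is the (by default 2 MB) data page made of 8 kB CRU pages
    const char* payload = input.payload;
    // ... walk through the CRU pages inside the data page and fill your histograms ...
  }
}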

Configuration file

The configuration file is installed in $QUALITYCONTROL_ROOT/etc. Each time you rebuild the code, $QUALITYCONTROL_ROOT/etc/readout.json is overwritten by the file in the source directory (~/alice/QualityControl/Framework/readout.json). To avoid this behaviour and preserve the changes you make to the configuration, copy the file and specify its path with the parameter --config when launching o2-qc.

To change the fraction of the data being monitored, change the option fraction.

"fraction": "0.01",

Post-processing example

Now we will run an additional application that performs further processing of the data generated by the basic workflow. Run the basic workflow again in one terminal window:

o2-qc-run-basic

In another terminal window run the ExampleTrend post-processing task, as follows:

o2-qc-run-postprocessing --config json://${QUALITYCONTROL_ROOT}/etc/postprocessing.json --name ExampleTrend --rate 10

On the QCG website you will see a TTree and additional plots under the path /qc/TST/ExampleTrend. They show how different properties of the Example histogram change over time. The longer the applications run, the more data will be visible.

The post-processing framework and its convenience classes allow one to trend and correlate various characteristics of histograms and other data structures generated by QC tasks and checks. One can create one's own post-processing tasks or use the ones included in the framework, configuring them for one's own needs.
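As an indication of what writing one involves, here is a bare-bones sketch based on the PostProcessingInterface of the QC framework; the class name is hypothetical and the exact method signatures (in particular the ServiceRegistry argument) may differ between framework versions:

#include "QualityControl/PostProcessingInterface.h"

namespace o2::quality_control::postprocessing
{

// Hypothetical task that could, for instance, trend a quantity extracted from a MonitorObject.
class MyTrendingTask : public PostProcessingInterface
{
 public:
  void initialize(Trigger, framework::ServiceRegistry&) override
  {
    // create the output objects (e.g. a TTree or histograms) once, when the task starts
  }
  void update(Trigger, framework::ServiceRegistry&) override
  {
    // fetch the latest MonitorObjects from the repository and update the trend
  }
  void finalize(Trigger, framework::ServiceRegistry&) override
  {
    // store the final results in the repository
  }
};

} // namespace o2::quality_control::postprocessing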

↑ Go to the Table of Contents ↑ | Continue to Modules Development →