ARTIQ Experiment

Mingyu Fan edited this page Oct 9, 2021 · 4 revisions

ARTIQ experiments perform programmed sequences of control, analysis, and data saving. They are almost the sole method of interacting with ARTIQ-controlled hardware (e.g., Urukuls, TTLs), aside from MonInj, which supports limited access to TTLs. Therefore, even basic ARTIQ control functionality, such as turning a DDS channel on or off, needs to be done through an experiment.

Basics of ARTIQ experiments

Much of the information in this section can be found in the manual; some important parts are summarized here. Check the ARTIQ manual for the details.

Experiment structure

ARTIQ experiments derive from the artiq.experiment.EnvExperiment class. In general, three functions should be implemented in an experiment class: build, prepare, and run. The build function sets the experiment arguments (we typically don't use this), and it can set the devices required (we typically don't use this either). The prepare function runs when the experiment is scheduled; computationally heavy pre-run work should be done here. The run function actually conducts the experiment and may interact with hardware.
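A minimal sketch of this three-stage lifecycle is below. Note that EnvExperiment here is a bare stand-in class so the example runs without an ARTIQ installation; in a real experiment you would import it from artiq.experiment, and the scheduler (not your code) calls the three stages in order.

```python
class EnvExperiment:
    """Stand-in for artiq.experiment.EnvExperiment (illustration only)."""
    pass

class MyExperiment(EnvExperiment):
    def build(self):
        # Runs at instantiation: declare arguments and devices here.
        self.log = []
        self.log.append("build")

    def prepare(self):
        # Runs when the experiment is scheduled: do heavy precomputation.
        self.log.append("prepare")
        self.precomputed = [x ** 2 for x in range(5)]

    def run(self):
        # The actual experiment: may interact with hardware (via kernels).
        self.log.append("run")
        return sum(self.precomputed)

# The scheduler invokes the stages in this order:
exp = MyExperiment()
exp.build()
exp.prepare()
result = exp.run()
```

The point of pushing work into prepare is that it runs at scheduling time, before the experiment reaches the front of its pipeline queue.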

Experiment pipelines

When prepare is called, the experiment is scheduled in one of the pipelines. Each pipeline defines a queue of experiments, and only the first experiment in each queue runs. Experiments in different pipelines may run at the same time, but only one of them can access ARTIQ hardware. Therefore, it is recommended to schedule all experiments that use ARTIQ hardware in the same pipeline to prevent hardware access collisions. Experiments can be scheduled with different priorities, so the highest-priority experiment sits at the top of the queue (even if it was scheduled later). By default, the currently running experiment is not stopped by a higher-priority experiment, but it can yield control to the higher-priority experiment and pause or stop itself. Experiment priorities allow for interleaved checks, spectroscopy, or a background experiment that runs when no other experiment is scheduled.
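The queue ordering can be modeled as a priority queue where higher priority wins and ties fall back to submission order. This is a toy illustration of the ordering behavior only; the class and method names are made up and are not ARTIQ's scheduler API.

```python
import heapq
from itertools import count

class Pipeline:
    """Toy model of one pipeline: a priority queue of experiments."""

    def __init__(self):
        self._heap = []
        self._counter = count()  # tie-breaker: earlier submission runs first

    def submit(self, name, priority=0):
        # heapq is a min-heap, so negate the priority to pop highest first.
        heapq.heappush(self._heap, (-priority, next(self._counter), name))

    def pop_next(self):
        return heapq.heappop(self._heap)[2]

pipe = Pipeline()
pipe.submit("background_cooling", priority=-10)  # runs when nothing else is queued
pipe.submit("spectroscopy", priority=0)
pipe.submit("interleaved_check", priority=10)    # jumps the queue

order = [pipe.pop_next() for _ in range(3)]
```

Even though the high-priority check was submitted last, it is dequeued first, matching the scheduling behavior described above.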

Kernel and host

In the run function, the experiment may talk to the hardware. All control of the hardware needs to happen in @kernel functions. Kernel functions use "ARTIQ Python", a subset of the Python language; check the manual for the supported syntax and keywords, and see ARTIQ python for some additional notes. Kernel functions may make remote procedure calls (RPCs), marked @rpc, to run code on the host (computer). Such functions may return values from the host to the device, as long as the values are ARTIQ-Python compatible.
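The kernel/host split looks roughly like the sketch below. The decorators here are no-op stand-ins so the example runs without ARTIQ; on real hardware, the @kernel body is compiled and runs on the core device, while the @rpc body executes on the host.

```python
def kernel(fn):   # stand-in for artiq's @kernel decorator
    return fn

def rpc(fn):      # stand-in for artiq's @rpc decorator
    return fn

class Experiment:
    @rpc
    def fetch_amplitude(self) -> float:
        # Runs on the host: full Python is available (files, network, ...).
        return 0.5

    @kernel
    def run_pulse(self):
        # Runs on the core device: restricted "ARTIQ Python" subset.
        amplitude = self.fetch_amplitude()  # RPC back to the host
        return amplitude * 2.0

result = Experiment().run_pulse()
```

The return-type annotation on the RPC matters in real ARTIQ code, since the kernel compiler needs to know the type of the value coming back from the host.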

Each time the host calls a kernel function, compiling the kernel code takes on the order of 1 to 10 seconds. Therefore, time-critical loops should run in the kernel rather than in host code.

Also, the overhead of an RPC is typically much less than the compile time of a kernel function, so prefer RPCs over repeatedly switching between kernel and host code.

Real-time input and output

All kernel code runs on the processor on the Kasli board. It may control the ARTIQ hardware using real-time input and output (RTIO) to produce a pulse sequence. RTIO means that the output of the pulse sequence is not fully determined when the kernel starts to run: the output parameters may change depending on input events (e.g., TTL high/low, PMT counts). Because it takes time for the processor to compute the output parameters, output events must be scheduled at a time in the future. While the output sequence runs, the processor continuously programs more events into it. If an output event is programmed at a time before the current time, an RTIOUnderflow error occurs, and the user must add more delay between events to prevent it. In that case, the experiment stops with an exception raised.
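The timeline-cursor idea can be illustrated with a small host-side simulation. The names (now_mu, delay_mu, RTIOUnderflow) mimic ARTIQ's vocabulary, but this is only a toy model of the rule "events must be scheduled in the future", not the real RTIO core.

```python
class RTIOUnderflow(Exception):
    """Raised when an event is scheduled at or before the current time."""

class Timeline:
    def __init__(self):
        self.now_mu = 0     # timeline cursor, in machine units
        self.wall_mu = 0    # simulated "current" hardware time
        self.events = []

    def delay_mu(self, duration):
        # Push the cursor into the future to leave slack for the processor.
        self.now_mu += duration

    def pulse(self, channel):
        # Scheduling an event that is not in the future is an underflow.
        if self.now_mu <= self.wall_mu:
            raise RTIOUnderflow(
                f"event on {channel} at {self.now_mu} mu is not ahead of "
                f"wall time {self.wall_mu} mu")
        self.events.append((self.now_mu, channel))

t = Timeline()
t.delay_mu(1000)    # schedule 1000 machine units ahead
t.pulse("ttl0")     # fine: the event is in the future

late = Timeline()
try:
    late.pulse("ttl0")  # no delay added: cursor not ahead of wall time
    underflow = False
except RTIOUnderflow:
    underflow = True
```

The fix in the failing case is exactly what the text prescribes: insert more delay before the event so the cursor stays ahead of the hardware clock.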

In some cases the experiment may not stop on an error (but the output may be incorrect), and no exception is raised (see this discussion). Collisions are among the most common of these errors. To catch them, run artiq_coremgmt log from the command line and check the log manually.

Experiment dataset

Datasets are by default saved in the computer's volatile memory (lost when artiq_master restarts), but they can also be saved to disk so they persist across artiq_master restarts (use persist=True when setting those datasets). Datasets can be used for plotting, saving parameters, and more.
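The volatile/persistent distinction can be sketched as below. The set_dataset name and its persist flag follow the ARTIQ call described above, but the storage here is just two dictionaries standing in for the master's dataset database, and the dataset keys are made-up examples.

```python
class DatasetDB:
    """Toy stand-in for the master's dataset store."""

    def __init__(self):
        self.volatile = {}
        self.persistent = {}

    def set_dataset(self, key, value, persist=False):
        (self.persistent if persist else self.volatile)[key] = value

    def restart_master(self):
        # Simulate an artiq_master restart: volatile datasets are lost.
        self.volatile.clear()

db = DatasetDB()
db.set_dataset("pmt_counts", [3, 5, 4])               # volatile: lost on restart
db.set_dataset("cooling_freq", 200e6, persist=True)   # survives restarts
db.restart_master()
```

After the simulated restart, only the dataset written with persist=True remains, which is why calibration-style parameters are typically stored persistently while per-run data is not.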

Saving data in HDF5 files

Data from individual experiments are saved in corresponding HDF5 files, identified by their run ID (a unique ID for each experiment). HDF5 is a versatile format that allows archiving complex data structures (e.g., data arrays of unequal length) and attributes (e.g., metadata associated with the data).

Experiment loading

When a repository is used to store experiments, only committed changes run from the dashboard experiment scheduler. All classes inheriting from EnvExperiment are discovered as experiments, with the exception of experiment classes whose names start with an underscore _ (a good technique for hiding utility experiments not needed in the dashboard). Another way to run an experiment is artiq_run file_name.py. artiq_run uses the latest saved version instead of the latest commit (double check whether this is correct).
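The underscore-hiding convention amounts to a name filter during class discovery. The scanning code below is illustrative only (ARTIQ's real discovery lives in its master/worker code), with a stand-in EnvExperiment so it runs standalone.

```python
import inspect

class EnvExperiment:  # stand-in for artiq.experiment.EnvExperiment
    pass

class Spectroscopy(EnvExperiment):
    pass

class _UtilityExperiment(EnvExperiment):
    # Leading underscore: hidden from the dashboard's experiment list.
    pass

def discover(namespace):
    """Return EnvExperiment subclasses not prefixed with an underscore."""
    return sorted(
        name for name, obj in namespace.items()
        if inspect.isclass(obj)
        and issubclass(obj, EnvExperiment)
        and obj is not EnvExperiment
        and not name.startswith("_")
    )

found = discover(globals())
```

Only Spectroscopy is discovered; _UtilityExperiment stays importable and runnable from other code but never appears in the experiment list.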

Our experiment structure

Misc/utility experiments

We have a few utility experiments in artiq_exps.experiments.misc. These experiments are for special purposes and may not follow the experiment structure discussed here.

Base experiments

We have a few base experiments that handle different layers of shared code.

artiq_exps.experiments.experiment.Experiment is the lowest experiment level, inheriting from the HasEnvironment class. The Experiment class includes code shared by all experiments: it sets the parameter lists used by the experiment in the build function, and reads and saves the parameters in the prepare function. All experiments, including those that don't need to access the ARTIQ hardware, should inherit from this class.

artiq_exps.experiments.ion_experiment.IonExperiment is the base experiment for all experiments that use the ARTIQ hardware. It defines host_initialize, which should be called at the beginning of the run function, and kernel_initialize, which should be called at the beginning of the first kernel function invoked.

artiq_exps.experiments.enterprise_experiment.EnterpriseExperiment is a base experiment for all enterprise experiments (similar base experiments should be written for other experiments). It should handle code specific to the enterprise experiments (trap voltages, etc.).

A typical experiment should inherit from both EnterpriseExperiment (or equivalent class for other experiments) and EnvExperiment. EnvExperiment allows automatic experiment discovery in the dashboard.

Pulse sequences

An experiment that uses the ARTIQ hardware has a corresponding pulse sequence defined. The pulse sequence defines the RTIO control code and can contain nested subsequences. All pulse sequences are based on the artiq_exps.pulse_sequences.PulseSequence class.

A pulse sequence should define the required_parameters and required_subsequences class attributes to declare the experiment parameters it requires. In the __init__ function, there should be a _get_parameters call, along with calls to add_ttl_input_ch, add_ttl_output_ch, and add_dds_ch for all channels used. All pulse sequence parameters should be converted to machine units in host code where possible.
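A sketch of these conventions is below, modeled loosely on the PulseSequence structure described above. The validation logic, channel bookkeeping, and the DopplerCooling/Sideband sequences and their parameter names are all hypothetical illustrations, not the actual artiq_exps code.

```python
class PulseSequence:
    """Toy base class: declares and checks required parameters."""

    required_parameters = []
    required_subsequences = []

    def __init__(self, parameters):
        # Collect requirements from this sequence and its subsequences.
        required = set(self.required_parameters)
        for sub in self.required_subsequences:
            required |= set(sub.required_parameters)
        missing = required - parameters.keys()
        if missing:
            raise KeyError(f"missing parameters: {sorted(missing)}")
        self.p = parameters
        self.channels = []

    def add_dds_ch(self, name):
        # Real code would also offer add_ttl_input_ch / add_ttl_output_ch.
        self.channels.append(("dds", name))

class DopplerCooling(PulseSequence):
    required_parameters = ["cooling_freq", "cooling_time"]

class Sideband(PulseSequence):
    required_parameters = ["sideband_freq"]
    required_subsequences = [DopplerCooling]

seq = Sideband({"sideband_freq": 1e6, "cooling_freq": 200e6,
                "cooling_time": 1e-3})
seq.add_dds_ch("urukul0_ch0")
```

Declaring requirements as class attributes lets a parent sequence verify, before any kernel runs, that every nested subsequence has the parameters it needs.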

Parameters

We use the LabRAD parameter vault for supplying parameters to the experiment. Parameters are read during the prepare stage in the experiment, and saved in an artiq_exps.experiments.parameters.Parameters object. The parameters are also saved in the HDF5 file.