This repository has been archived by the owner on Nov 14, 2022. It is now read-only.

Code installation

Samuel edited this page Apr 12, 2022 · 49 revisions

Prerequisites

It is recommended to use either Linux or macOS for development. The Windows WSL 2 backend should work, but can run out of memory. Alternatively, one can develop natively on Windows instead.

Use Miniconda for environment management. Quickly install by running the following:

wget https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh -O /tmp/Miniconda3-latest-Linux-x86_64.sh
bash /tmp/Miniconda3-latest-Linux-x86_64.sh -b -p ~/miniconda
source ~/miniconda/bin/activate ~/miniconda

Alternatively, you can use mamba or venv instead.
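As an illustration, a venv-based setup looks like the following. This is only a sketch: it assumes python3 with the venv module is available, and the environment location ~/autoreduce-env is an arbitrary choice.

```shell
# Sketch of a venv-based alternative to Miniconda.
# The environment location ~/autoreduce-env is arbitrary.
python3 -m venv ~/autoreduce-env
. ~/autoreduce-env/bin/activate
python --version   # confirm the environment's interpreter is active
```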

You will also need to install Docker for containerisation.

Getting the code

Run the following:

git clone https://github.com/autoreduction/autoreduce-workspace
cd autoreduce-workspace
bash runme.sh  # checks out all the repositories

It is recommended to keep all Autoreduction repositories in a subfolder, as this makes building container images faster (due to the context sent to the daemon being smaller).

Installing Autoreduction

  1. Make a new Python environment via virtualenv, venv, conda/mamba, or any other virtual environment tool.
  2. Go into the folder where the repositories were checked out, and install them as editable:
    • Installing them as editable (-e) means you can test development changes without re-publishing and reinstalling the base packages.
    • Note that if you check out some of them on a different branch than the one you are working on, it may break your other work. To work around that, see PR testing.
pip install -e autoreduce-frontend
pip install -e autoreduce-rest-api
pip install -e autoreduce
pip install -e autoreduce-scripts
pip install -e autoreduce-db
pip install -e autoreduce-utils
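The six installs above can also be done in a loop. This is just a convenience sketch: run it from the autoreduce-workspace folder with your environment active; it skips any repository that has not been checked out.

```shell
# Install every checked-out Autoreduction repository in editable mode.
repos="autoreduce-frontend autoreduce-rest-api autoreduce autoreduce-scripts autoreduce-db autoreduce-utils"
for repo in $repos; do
    if [ -d "$repo" ]; then
        pip install -e "$repo"
    else
        echo "Skipping $repo (not checked out here)"
    fi
done
```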
  3. To check that all repositories have been installed as editable, run pip freeze | grep auto and compare with the following:
-e git+https://github.com/autoreduction/autoreduce-db@7a11d1ee210836f1b358baf727ef52ccec7882b4#egg=autoreduce_db
-e git+https://github.com/autoreduction/autoreduce-frontend@1606644e35b2650b821e1b19dc429c01a1cd9dc9#egg=autoreduce_frontend
-e git+https://github.com/autoreduction/autoreduce@bef2219813cfcb8c08352f9be10f67625e66e42b#egg=autoreduce_qp
-e git+https://github.com/autoreduction/autoreduce-rest-api@b432701242a575e96f1804bdc00178723e1d1261#egg=autoreduce_rest_api
-e git+https://github.com/autoreduction/autoreduce-scripts@734ba68b576a968fb250d5f8fa8e793dcea9b2dc#egg=autoreduce_scripts
-e git+https://github.com/autoreduction/autoreduce-utils@3499829d90822e545c5834d953fcb96c9de2055a#egg=autoreduce_utils
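As a quick alternative to eyeballing that list, a hypothetical check that counts the editable installs (the expected count of 6 matches the six packages above):

```shell
# Count packages installed in editable mode (-e) whose source URL
# mentions autoreduce, and compare against the six expected packages.
expected=6
found=$(pip freeze 2>/dev/null | grep -c '^-e .*autoreduce' || true)
found=${found:-0}
if [ "$found" -eq "$expected" ]; then
    echo "All $expected autoreduce packages are editable installs"
else
    echo "Found $found editable autoreduce packages, expected $expected"
fi
```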
  4. Development of Autoreduce uses direnv for environment variables, which populate the application with development settings. Run direnv to see if it is already installed. If you receive a message similar to "command not found", you will need to install it by running:
curl -sfL https://direnv.net/install.sh | bash

Then add the following line at the end of the ~/.bashrc file

eval "$(direnv hook bash)"

Restart your shell. Finally, in the autoreduce-workspace root folder, run

direnv allow
  5. Run make in autoreduce-frontend. This will create and apply the database migrations, creating a file at ~/.autoreduce/dev/sqlite3.db
  6. Run the server with autoreduce-webapp-manage runserver. By default it will run in development mode, which puts logs and the SQLite3 database file in ~/.autoreduce

Development / testing packages

Most of these are now shared among the repositories, but some frontend-specific ones (e.g. Selenium) are described only in that repository. To install them all, run the following in the autoreduce-workspace directory:

pip install -r autoreduce-actions/requirements.txt
pip install -r autoreduce-frontend/requirements.txt

PR testing quickstart

Avoid breaking your current work!

To avoid headaches, it is advised to do PR testing in a separate clone of the repositories, installed in a different virtual environment. Essentially, repeat Getting the code and Installing Autoreduction, but in a different folder!


Before testing anything, make sure you have migrated the DB with fixtures: run the Migrate database (with PR fixtures) task, or run make migrate-with-fixtures in autoreduce-frontend/

Running the webapp is enough to test simple UI changes and webapp-only features. For bigger features that require interaction with the backend (e.g. reductions, reruns), you'll have to run Kafka and the Consumer.

Migrating a clean database

In some cases you may want a clean database, without any fixtures (e.g. if you are cloning the production DB). For that, run the VSCode task Migrate database (clean) or make migrate.

Resetting the database

Because the whole database is stored in the sqlite3.db file, resetting everything is as simple as deleting or renaming ~/.autoreduce/dev/sqlite3.db and migrating again.
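A sketch of a reset that keeps a timestamped backup instead of deleting outright. Paths follow the defaults described above; the migration command is left commented out so nothing runs by accident.

```shell
# Reset the development database by renaming it; re-run migrations afterwards.
DB="$HOME/.autoreduce/dev/sqlite3.db"
if [ -f "$DB" ]; then
    mv "$DB" "$DB.bak.$(date +%Y%m%d%H%M%S)"
    echo "Moved old database aside"
else
    echo "No database at $DB, nothing to reset"
fi
# make -C autoreduce-frontend migrate-with-fixtures   # or 'migrate' for a clean DB
```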

Are you going to be submitting runs / reruns?

None of the rerun/configure-new-runs functionality is enabled without a reduce.py and reduce_vars.py for the instrument. With the test fixtures the instrument name is TestInstrument; to make the required files with some sample code, run:

mkdir -p ~/.autoreduce/dev/data-archive/NDXTESTINSTRUMENT/user/scripts/autoreduction
echo 'standard_vars = {"variable1":"value1"}' > ~/.autoreduce/dev/data-archive/NDXTESTINSTRUMENT/user/scripts/autoreduction/reduce_vars.py
echo 'def main(input_file, output_dir): print("Hello")' > ~/.autoreduce/dev/data-archive/NDXTESTINSTRUMENT/user/scripts/autoreduction/reduce.py
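The same files can be written with heredocs, which is easier to extend with real content later. This sketch is equivalent to the echo commands above:

```shell
# Create the data archive layout and the two scripts for TestInstrument.
DIR="$HOME/.autoreduce/dev/data-archive/NDXTESTINSTRUMENT/user/scripts/autoreduction"
mkdir -p "$DIR"

# Variables exposed to the web UI for editing before a rerun.
cat > "$DIR/reduce_vars.py" <<'EOF'
standard_vars = {"variable1": "value1"}
EOF

# Minimal reduction entry point; a real script would process input_file
# and write results into output_dir.
cat > "$DIR/reduce.py" <<'EOF'
def main(input_file, output_dir):
    print("Hello")
EOF
```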

Are you going to need the runs processed?

You will need the Kafka broker running locally, so that the producer and consumer can connect to it. Environment variables are used for credentials.

With make:

  • In autoreduce/ run make kafka

With VSCode:

  • From the tasks run Run Kafka Docker daemon

Once you've got the DB and fixtures installed, you can run the Queue Processor and Web App by:

  • Going to the Run & Debug tab (CTRL+SHIFT+D by default) in VSCode and starting the Run Kafka Consumer and Run webapp tasks. Afterwards, all you need to do is CTRL-click the http://127.0.0.1:8000/ link in the terminal

If not using VSCode and running from a terminal:

  • Make sure to activate your environment
  • Start the Queue Processor with python autoreduce_qp/queue_processor/confluent_consumer.py; you will need a second terminal for the webapp
  • Start webapp with python autoreduce_frontend/manage.py runserver

Using Mantid

  • The recommended approach is to run the autoreduction/qp Docker container locally. The hosted autoreduction/qp image may be out of date, since updates only get published before a new cycle, so it's recommended to build the autoreduction/qp image locally from autoreduce-containers
  • Using the system Python and installing Mantid following the relevant installation instructions
    • Note: you won't be able to use a virtualenv out of the box, since installing Mantid through the system package manager will use /usr/bin/python3. To use a virtualenv you will have to build Mantid from source
  • Using a conda environment. Create a new environment based on either scipp's mantid-framework or mantid's mantid-framework; use whichever is newer
    • Once Mantid is installed, just install the autoreduce packages into the same environment and they should be picked up. Note that you won't have access to any GUI features, including the easier workspace plotting.