The new and improved DDW Analyst UI
-
Make sure you're starting with a clean DB volume, so Docker knows to create the new User
docker-compose down
docker volume rm metadata2
-
Create a persistent dev volume
docker volume create --name=metadata2
-
Create a self-signed certificate
mkdir -p ssl && openssl req -newkey rsa:2048 -new -nodes -x509 -days 3650 -keyout ssl/privkey.pem -out ssl/fullchain.pem
-
Build your app
docker-compose up --build -d
-
Fetch CSV files from GitHub
docker-compose exec web python manage.py update_csv_files
-
Migrate the database.
docker-compose exec web python manage.py migrate
-
Load test data
docker-compose exec web python manage.py loaddata test_data
docker-compose exec web python manage.py loaddata --database=datasets test_datasets
-
Alternatively, load the real data
export FTSUSER=X
export FTSPASS=Y
docker-compose exec web data_updates/completed_scripts.sh
-
Create a superuser.
docker-compose exec web python manage.py createsuperuser
-
Add the bit registry to npm config to install bit dependencies
npm config set @bit:registry https://node.bitsrc.io
-
Install frontend dependencies
npm install
-
Bundle frontend code and collect static files
npm run build
-
Restart the app.
docker-compose restart
-
Manually run the management command to download CSV files:
docker-compose exec web python manage.py update_csv_files
-
Create a scheduled event to periodically download updates from the git repo. The bash script is
update_csv_files.sh
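For example, a crontab entry along these lines would fetch updates hourly (the schedule, script path, and log location are assumptions; adjust them to your deployment):
0 * * * * /root/ddw-analyst-ui/update_csv_files.sh >/root/csv-update-logs.txt 2>&1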
-
The GitHub repo with the CSV files can be found here
To create a test DB for local development (e.g. for the virtualenv steps below):
-
Ensure the line in pg_hba.conf that normally appears as
local all postgres peer
instead reads
local all postgres trust
-
Run script
./dev-db-setup.sh
-
A database analyst_ui will be created in your local postgres instance.
-
Access the sample database through the default postgres user using:
psql -d postgres
\c analyst_ui
-
For additional users, edit the script analyst_ui_users.sql, adding the usernames that you need
-
Run the script to grant permissions to all the schemas and tables of analyst_ui
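As a rough illustration only (the actual contents of analyst_ui_users.sql are not shown here; the role name, password, and schema below are placeholders), granting access to an extra user from the shell would look something like:
psql -d analyst_ui -c "CREATE USER example_analyst WITH PASSWORD 'change-me';"
psql -d analyst_ui -c "GRANT USAGE ON SCHEMA public TO example_analyst;"
psql -d analyst_ui -c "GRANT SELECT ON ALL TABLES IN SCHEMA public TO example_analyst;"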
-
Follow the steps under the virtualenv section below to integrate with your local environment.
Prerequisites
- Python
- virtualenv
- NodeJS
- PostgreSQL
- Initialise a virtual environment
virtualenv env
- Activate & enter virtualenv environment
source env/bin/activate
- Install python dependencies
pip install -r requirements.txt
- Fetch CSV files from GitHub
python manage.py update_csv_files
- Migrate the database.
python manage.py migrate
- Load test data from a fixture like so
python manage.py loaddata test_data
python manage.py loaddata --database=datasets test_datasets
- Create a superuser.
python manage.py createsuperuser
- Add the bit registry to npm config to install bit dependencies
npm config set @bit:registry https://node.bitsrc.io
- Install frontend dependencies
npm install
- Bundle frontend code and collect static files
npm run dev
NB: this is set to watch for changes and recompile.
- Run the app.
export DJANGO_DEV='True' && python manage.py runserver
-
Make sure you're starting with a clean DB volume, so Docker knows to create the new User:
docker-compose down
docker volume rm metadata2
-
Create a persistent dev volume:
docker volume create --name=metadata2
-
Create a self-signed certificate:
mkdir -p ssl && openssl req -newkey rsa:2048 -new -nodes -x509 -days 3650 -keyout ssl/privkey.pem -out ssl/fullchain.pem
-
Build & run your app with the dev docker config:
docker-compose -f docker-compose.dev.yml up --build
-
Fetch CSV files from GitHub
docker-compose -f docker-compose.dev.yml exec web python manage.py update_csv_files
-
Migrate the database:
docker-compose -f docker-compose.dev.yml exec web python manage.py migrate
-
Load test data:
docker-compose -f docker-compose.dev.yml exec web python manage.py loaddata test_data
docker-compose -f docker-compose.dev.yml exec web python manage.py loaddata --database=datasets test_datasets
-
Alternatively, you can acquire a db dump of the live data (binary) and import it into your database:
docker-compose -f docker-compose.dev.yml exec db psql -U analyst_ui_user -d analyst_ui -c 'drop schema public CASCADE;'
docker-compose -f docker-compose.dev.yml exec db psql -U analyst_ui_user -d analyst_ui -c 'create schema public;'
docker cp [DB DUMP FILE NAME].backup ddw-analyst-ui_db_1:/var/lib/postgresql/data
docker exec ddw-analyst-ui_db_1 pg_restore -U analyst_ui_user -d analyst_ui /var/lib/postgresql/data/[DB DUMP FILE NAME].backup
docker-compose -f docker-compose.dev.yml exec web python manage.py update_csv_files
docker-compose -f docker-compose.dev.yml exec web python3 manage.py migrate
-
Create a superuser:
docker-compose -f docker-compose.dev.yml exec web python manage.py createsuperuser
-
Add the bit registry to npm config to install bit dependencies:
npm config set @bit:registry https://node.bitsrc.io
-
Install frontend dependencies:
npm install
-
Dynamic API base URL setting
Add an API_BASE_URL in the .env file and assign it either a localhost, staging, or production URL. If not set, this defaults to the URL of the current environment in which the application is running.
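For example, a .env entry might look like the following (the URL shown is just an assumption for a local setup; point it at staging or production as needed):
API_BASE_URL=http://localhost:8000
-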
Start frontend dev environment which watches and collects static files:
npm start
Configure a cronjob to run the run-schedules.sh script, which in turn runs the command that checks for scheduled events every minute:
* * * * * /root/run-schedules.sh >/root/cron-logs.txt 2>&1
Make sure the following schemas are created:
archives, i.e. CREATE SCHEMA archives;
dataset, i.e. CREATE SCHEMA dataset;
On first run (i.e. if you have used a database dump without the FTS precoded tables included, or a clean DB setup), run the following scripts:
docker-compose exec web data_updates/manual_data.sh
docker-compose exec web data_updates/manual_data_fts.sh
which add the FTS metadata to the DB.
The above should only be run once, on initial deployment of the feature, and only if using a clean and fresh DB setup with no data, or a DB dump that does not include the FTS metadata.
To pull the latest FTS updates from the APIs, run:
3. docker-compose exec web data_updates/fts.sh
Note that at this point the analyst may download the updated codelists and edit them, then re-upload them using the https://ddw.devinit.org/update/ feature.
To precode and join the dependency tables, run:
4. docker-compose exec web data_updates/fts_precode.sh
This should be run every time there is a change made to the codelist entries, or every time the script in 3 above is run.
To update the manual FTS tables with missing codelist items, finally run the script below:
5. docker-compose exec web data_updates/fts_diff.sh
We can run 4 and 5 above in one step by using:
docker-compose exec web data_updates/finalise_precode.sh
This will be the preferred way of running them from the front end as a scheduled event.
NOTE:
- All the above scripts should be run in that exact order on first run / deployment.
- After any update is made to the codelists (i.e. after running the script in 3 above, or after an analyst uses the https://ddw.devinit.org/update/ feature), run
docker-compose exec web data_updates/finalise_precode.sh
which combines 4 and 5 into one step. This can be run from Scheduled Events on the front end.
This is set up to run with Cypress.
To test locally:
- Update the baseUrl option in the cypress.json file to one that suits your current need
- Set up test users as specified in the frontend/cypress/fixtures/users.json file
- Run npm run cy:run for headless tests and npm run cy:open for interactive tests in a browser.
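For instance, pointing the tests at a locally served app could look like this in cypress.json (the port is an assumption; use whatever your environment serves the app on):
{
  "baseUrl": "http://localhost:8000"
}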
If you're using Postman for testing the REST API, you can use the following setup:
- Make sure you have an environment set for your collection.
- POST to http://localhost:8000/api/auth/login/ with Basic Auth and the Username and Password.
- Paste this code in Tests, which will save the token to the environment:
var jsonData = JSON.parse(responseBody);
postman.setEnvironmentVariable("token", jsonData.token);
- In the Headers of your subsequent requests, send the header:
Authorization: Token {{token}}
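The same flow can be exercised without Postman, e.g. with curl (a sketch; the response shape is assumed to be {"token": "..."} as implied by the Tests snippet above, and the second URL is a placeholder):
# log in with Basic Auth to obtain a token
curl -u <username>:<password> -X POST http://localhost:8000/api/auth/login/
# then send the token on subsequent requests
curl -H "Authorization: Token <token>" http://localhost:8000/api/<endpoint>/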
- If certbot has not been installed already, install it with the following commands:
sudo add-apt-repository ppa:certbot/certbot
sudo apt-get install certbot
-
Run the script below to generate certificates:
certbot renew --dry-run --webroot -w /root/ddw-analyst-ui/static/letsencrypt
-
If the command above runs successfully, copy the certificates to the ssl folder of the DDW app:
cp -f /etc/letsencrypt/live/ddw.devinit.org/privkey.pem /root/ddw-analyst-ui/ssl/
cp -f /etc/letsencrypt/live/ddw.devinit.org/fullchain.pem /root/ddw-analyst-ui/ssl/
From the ddw-analyst-ui root folder, reload nginx so that the certificates are picked up:
docker exec ddw-analyst-ui_nginx_1 nginx -s reload
- Check if there is a cron job set to renew certificates. If there is none, add the cron task below. This will try to renew the certificate twice a day, every day.
0 */12 * * * /root/ddw-analyst-ui/certbot.sh >/dev/null 2>&1
Read more here
- Create a release branch with the name format release/v[VERSION NUMBER] (see the example after this list)
- Update the version number in package.json
- Create a GitHub release with the name format v[VERSION NUMBER] - this will deploy to production
- Create a PR from the release branch to master/main
- Merge the release branch into master/main
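As a quick illustration of the first step (the version number is just an example):
# create and push the release branch; replace 1.2.3 with the actual version
git checkout -b release/v1.2.3
git push -u origin release/v1.2.3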