A list of staff email domains, which the API will treat as single-validation (email verification only), can be found here.
- docker and docker-compose
$ docker-compose build
$ docker-compose run --rm migrate
$ docker-compose run --rm loaddata
$ docker-compose run --rm test
$ docker-compose run --rm makemigrations
$ docker-compose run --rm makemigrations_merge
$ docker-compose run --rm migrate
$ docker-compose run --rm shell
$ docker-compose run --rm createsuperuser
$ docker-compose up serve celery
To attach to a container with stdin, e.g. for debugging: docker attach <container_name>
Or, without celery, the old pdb-friendly way: docker-compose run --rm --service-ports serve
Access the site at http://localhost:8000
$ docker-compose build
Using poetry (python package manager)
- Install the required Python version using pyenv
pyenv install $(cat .python-version)
- Install poetry packages on the host
poetry install
- Add new package
poetry add <package-name>
- Update lock file
poetry update --lock
# Creation and upkeep language po files (for eg: fr)
python3 manage.py makemessages -l fr
# Creation and upkeep language po files (for eg: multiple languages)
python3 manage.py makemessages -l en -l es -l ar -l fr
# Updating current language po files
python3 manage.py makemessages -a
# Translate empty string of po files using AWS Translate (Requires valid AWS_TRANSLATE_* env variables)
python3 manage.py translate_po
# Compile po files
python3 manage.py compilemessages
# Import/Export django static translation
python3 manage.py static-translation-export path-to-export.csv
# For specific languages, or only new (empty) strings:
python3 manage.py static-translation-export path-to-export.csv --only-new --languages en es
python3 manage.py static-translation-import path-to-import.csv
# Use this to copy the data from the original field to its default-language field.
# For eg: if the field `name` is registered for translation, then
# this command will copy the value from `name` to `name_en` if en is the default language.
python manage.py update_translation_fields
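As a rough illustration of what this copy does (a minimal sketch, not the actual command implementation; the field and record names are hypothetical), assuming `en` is the default language:

```python
# Minimal sketch of the copy performed by update_translation_fields
# (illustrative only; the real command works on Django model instances).
DEFAULT_LANGUAGE = "en"

def copy_to_default_language(record, translated_fields):
    """Copy each original field value into its default-language variant,
    e.g. record['name'] -> record['name_en'], if the latter is empty."""
    for field in translated_fields:
        default_field = f"{field}_{DEFAULT_LANGUAGE}"
        if not record.get(default_field):
            record[default_field] = record.get(field)
    return record

record = {"name": "Nepal", "name_en": None}
copy_to_default_language(record, ["name"])
# record["name_en"] is now "Nepal"
```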
# Auto translate values from default lang to other language – to be used in the future (AWS Translate)
python manage.py translate_model
$ docker-compose run --rm coverage
Please read TESTING.md for guidance on writing and executing tests.
Identify the function/class to modify from main/urls.py.
Use docstrings to add action-specific descriptions:
class CustomViewset(viewsets.ReadOnlyModelViewSet):
    """
    list:
    Description for list action of Custom.

    read:
    Description for read action of Custom.
    """
Look for the field definition in the CustomViewset class or its attributes, like CustomFilter.
Add a help_text attribute to the field definition.
variable_name = filters.NumberFilter(field_name='variable_name', lookup_expr='exact', help_text='Description string for variable name.')
Django automatically generates description strings for standard fields like id or ordering.
CircleCI handles continuous integration.
To release a new version to docker hub do the following:
- Update the version value in main/__init__.py
- Create a new git tag with the same version
- Commit and make a PR against master
- The tagged version of the code is used to build a new docker image and is pushed to docker hub
main/runserver.sh is the entrypoint for deploying this API to a new environment. It is also the default command specified in the Dockerfile.
main/runserver.sh requires that environment variables corresponding to database connection strings, FTP settings, and email settings, among others, be set. Check the script for the specific variables in your environment.
docker-compose up serve celery
or (just the base serve command):
docker-compose run --rm --service-ports serve
In main/runserver.sh, the line containing the loaddata command is only necessary when creating a new database. In other cases it can cause conflicts, so it is commented out.
For the initial creation of an index
docker-compose exec serve python manage.py rebuild_index
For updating the index
docker-compose exec serve python manage.py update_index
For updating the cron monitored tasks
docker-compose exec serve ./manage.py cron_job_monitor
There are a few different ways to see logs in the new Kubernetes-based stack. Both options require kubectl and access to the cluster. Once the cluster is added to your local kubernetes context, follow the steps below:
Grafana is an open source log analytics software. We use Grafana along with Loki to interactively fetch logs, run custom queries and analysis. To access the Grafana dashboard and view logs:
- Proxy the Grafana server to your localhost:
kubectl port-forward --namespace loki-stack service/loki-stack-grafana 3000:80
- Now visit http://localhost:3000 to see the login page
- Get the password using:
kubectl get secret --namespace loki-stack loki-stack-grafana -o jsonpath="{.data.admin-password}" | base64 -d ; echo
The username is admin.
- Now go to Dashboard > Logs. Currently we have added API and Elasticsearch logs.
- If you want to run custom queries, use the Explore tab.
Using kubectl is a more direct way of looking at logs. Once you find the pod name (using kubectl get pods), run kubectl logs podname.
There are two Django management commands that help to work with ICRC admin0 and admin1 shapefiles. These commands should be used only when you want to update geometries or import new ones from a shapefile. The structure of the shapefile is not very flexible, but it can be adjusted easily in the scripts.
This management command is used for updating and importing admin0 shapefile. To run:
python manage.py import-admin0-data <filename.shp>
The above command will generate a list of countries missing from the database, based on the iso2 code, to a file called missing-countries.txt. If the script comes across any countries with duplicate iso codes, these will be stored in duplicate-countries.txt.
- --update-geom -- updates the geometry for all countries matched in the shapefile using the iso2 code.
- --update-bbox -- updates the bbox for all countries matched in the shapefile using the iso2 code.
- --update-centroid -- updates the centroid for all countries from a CSV. The CSV should have iso code, latitude and longitude columns. If a country is missing from the CSV, the geometric centroid will be used.
- --import-missing missing-countries.txt -- imports countries for the iso2 codes listed in missing-countries.txt to the database. The file has the same format as generated by the default command.
- --update-iso3 iso3.csv -- imports iso3 codes for all countries from a CSV file. The file should have iso2, iso3 columns.
- --update-independent -- updates the independence status for each country from the shapefile.
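The centroid fallback described above can be sketched as follows (a hypothetical helper, not the actual command code; the exact CSV column names are an assumption based on the description):

```python
import csv
import io

def load_centroids(csv_text):
    """Parse a centroid CSV with iso, latitude, longitude columns
    (column names assumed for illustration)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return {row["iso"]: (float(row["latitude"]), float(row["longitude"]))
            for row in reader}

def centroid_for(iso, csv_centroids, geometric_centroid):
    """Prefer the centroid from the CSV; fall back to the
    geometric centroid when the country is missing from the CSV."""
    return csv_centroids.get(iso, geometric_centroid)

csv_text = "iso,latitude,longitude\nNP,28.39,84.12\n"  # hypothetical row
centroids = load_centroids(csv_text)
centroid_for("NP", centroids, (0.0, 0.0))   # taken from the CSV
centroid_for("FR", centroids, (46.2, 2.2))  # falls back to geometric centroid
```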
This management command is used for updating and importing admin1 shapefile. To run:
python manage.py import-admin1-data <filename.shp>
The above command will generate a list of districts missing from the database, based on the district code and name (in case there is more than one district with the same code), to a file called missing-district.txt.
- --update-geom -- updates the geometry for all districts matched in the shapefile using the iso2 code.
- --update-bbox -- updates the bbox for all districts matched in the shapefile using the iso2 code.
- --update-centroid -- updates the centroid for all districts matched in the shapefile using the iso2 code.
- --import-missing missing-districts.txt -- imports districts for the iso2 codes listed in missing-districts.txt to the database. The file has the same format as generated by the default command.
- --import-all -- imports all districts in the shapefile, including those without a code we can match against in the database.
This management command is used for updating and importing admin2 shapefile. To run:
python manage.py import-admin2-data <filename.shp>
The shapefile should have the following mandatory fields:
- name or shapeName
- code or pcode
- admin1_id (this is the ID of the GO district this admin2 belongs to)
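A quick sketch of checking those mandatory fields on a shapefile record (a hypothetical validation helper; the real command's field handling may differ):

```python
def validate_admin2_record(props):
    """Check an admin2 shapefile record has the mandatory attributes:
    a name (name or shapeName), a code (code or pcode), and admin1_id
    (the ID of the GO district this admin2 belongs to)."""
    errors = []
    if not (props.get("name") or props.get("shapeName")):
        errors.append("missing name/shapeName")
    if not (props.get("code") or props.get("pcode")):
        errors.append("missing code/pcode")
    if props.get("admin1_id") is None:
        errors.append("missing admin1_id")
    return errors

# Hypothetical records:
validate_admin2_record({"shapeName": "Kathmandu", "pcode": "NP-3-27", "admin1_id": 42})  # -> []
validate_admin2_record({"name": "X"})  # -> ['missing code/pcode', 'missing admin1_id']
```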
See this ticket for a full workflow of preparing the admin2 shapefiles.
The above command will generate a list of admin2 units missing from the database, based on the code (we use pcodes), to a file called missing-admin2.txt.
- --update-geom -- updates the geometry for all admin2 units matched in the shapefile.
- --import-missing missing-admin2.txt -- imports the admin2 units listed in missing-admin2.txt to the database. The file has the same format as generated by the default command.
- --import-all -- imports all admin2 units in the shapefile.
Run python manage.py update-region-bbox to update the bbox for each region in the database.
Run python manage.py import-fdrs iso-fdrs.csv to update the countries table with FDRS codes. The CSV should have an iso, fdrs structure.
Run python manage.py update-sovereign-and-disputed new_fields.csv to update the countries table with sovereign states and disputed status. The CSV should have the id,iso,name,sovereign_state,disputed columns. The matching is based on iso and name; if iso is null, we fall back to name.
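The matching rule (iso first, name as a fallback when iso is null) can be sketched like this (hypothetical helper and lookup tables, not the actual command code):

```python
def match_country(row, countries_by_iso, countries_by_name):
    """Match a CSV row to a country record: by iso when present,
    otherwise fall back to matching by name."""
    iso = (row.get("iso") or "").strip()
    if iso:
        return countries_by_iso.get(iso)
    return countries_by_name.get(row.get("name"))

# Hypothetical lookup tables built from the countries table:
by_iso = {"NP": {"id": 1, "name": "Nepal"}}
by_name = {"Nepal": {"id": 1, "name": "Nepal"}}

match_country({"iso": "NP", "name": "whatever"}, by_iso, by_name)  # matched by iso
match_country({"iso": "", "name": "Nepal"}, by_iso, by_name)       # falls back to name
```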
To update GO countries and districts Mapbox tilesets, run the management command python manage.py update-mapbox-tilesets. This will export all country and district geometries to a GeoJSON file and then upload them to Mapbox. The tilesets will take a while to process; the updated status can be viewed in Mapbox Studio under tilesets. To run this management command, MAPBOX_ACCESS_TOKEN should be set in the environment. The referenced files are in ./mapbox/..., so you should not run this command from an arbitrary location in the VM's filesystem (e.g. from the location of the shapefiles), but from the Django root.
- --production — update production tilesets. If this flag is not set, the script will only update staging tiles.
- --update-countries — update the tileset for countries, including labels.
- --update-districts — update the tileset for districts, including labels.
- --update-all — update all countries and districts tilesets.
- --create-and-update-admin2 <ISO3> — use this argument if a new admin2 tileset should be created. It will create a new source on Mapbox and then register a tileset. Ensure that recipes are created in the mapbox/admin2/ directory; for example, see mapbox/admin2/COL.json and mapbox/admin2/COL-centroids.json. Recipes for both polygons and centroids are required. For centroids, we don't need to create a staging recipe. To run: python manage.py update-mapbox-tilesets --create-and-update-admin2 COL
- --update-admin2 <ISO3> — use this to update an existing admin2 tileset. For example: python manage.py update-mapbox-tilesets --update-admin2 COL
To import GEC codes along with country ids, run python manage.py import-gec-code appeal_ingest_match.csv. The CSV should have the columns 'GST_code', 'GST_name', 'GO ID', 'ISO'.
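A minimal check of that CSV header might look like this (the sample values are hypothetical; only the column names come from the description above):

```python
import csv
import io

EXPECTED_COLUMNS = ["GST_code", "GST_name", "GO ID", "ISO"]

def has_expected_columns(csv_text):
    """Verify the CSV header contains the columns import-gec-code expects."""
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)
    return all(col in header for col in EXPECTED_COLUMNS)

# Hypothetical sample row:
sample = "GST_code,GST_name,GO ID,ISO\nNP,Nepal,123,NPL\n"
has_expected_columns(sample)  # -> True
```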