This repository contains basemodels, the data models for the job manifests consumed by hmt-escrow.
To use the Docker-based workflow below, you need Docker installed on your computer.
First build the image that contains all the code and dependencies:
docker compose build
Run the tests:
docker compose run basemodels ./bin/test
Lint the python files:
docker compose run basemodels ./bin/lint
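docker compose run can also execute ad-hoc commands against the same service, which is handy as a quick sanity check. This assumes the image built above ships a Python interpreter with the package importable, which the test command implies but is not guaranteed:
docker compose run basemodels python -c "import basemodels"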
Set up the environment:
virtualenv -p python3 venv
source venv/bin/activate
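The publishing steps below use pipenv, so assuming the test dependencies are declared in the Pipfile (an assumption about this repo's layout), install them into the virtualenv first:
pipenv install --dev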
Run the tests:
pytest tests
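pytest can also run a subset of the suite while you iterate; the selection expression below is only an example, not an actual test name from this repo:
pytest tests -k "manifest" -v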
Lint the python files:
yapf --diff ./basemodels/__init__.py ./test.py
mypy ./basemodels/__init__.py ./test.py --ignore-missing-imports
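The yapf invocation above only reports differences; to apply the formatting, use yapf's in-place mode:
yapf --in-place ./basemodels/__init__.py ./test.py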
Using the new model (based on the pydantic library)
import basemodels
from pydantic.v1 import ValidationError
model = {
    'job_mode': 'batch',
    'request_type': 'image_label_area_select',
    'unsafe_content': False,
    'task_bid_price': 1,
    ...
}
# Validate model on creation
try:
    manifest = basemodels.Manifest(**model)
except ValidationError as e:
    print(e.json())
# Or create the model without validation
manifest = basemodels.Manifest.construct(**model)
# See https://pydantic-docs.helpmanual.io/usage/models/#creating-models-without-validation
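Continuing the snippet above, a validated manifest behaves like a normal pydantic model: its fields are available as attributes and it can be exported back to plain Python data. The field name here is taken from the example model, and the .dict()/.json() methods assume the pydantic v1 API used in the import above:
manifest = basemodels.Manifest(**model)
print(manifest.job_mode)   # validated fields are regular attributes
data = manifest.dict()     # export back to a plain dict
payload = manifest.json()  # or serialize straight to JSON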
Tags must be pushed to master by a user with the proper privileges (see the contributors of this repo).
Versioning should follow semver; breaking changes must not be introduced in minor or patch releases.
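A release tag is an ordinary annotated git tag pushed alongside master; the version number below is only a placeholder:
git tag -a v1.2.3 -m "Release v1.2.3"
git push origin master --tags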
In the root folder:
virtualenv -p python3 venv
source venv/bin/activate
pipenv install --dev
pip install twine
python3 setup.py sdist bdist_wheel
twine upload dist/*
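If you want to verify the package before publishing it for real, twine can upload to TestPyPI first. This assumes you have a TestPyPI account; the flag and URL are standard twine/TestPyPI usage, not something specific to this repo:
twine upload --repository-url https://test.pypi.org/legacy/ dist/*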
Open an issue!
MIT © HUMAN Protocol 2020