Hadoop Jobs is a collection of Jenkins Pipelines and Ansible Playbooks for building a specific version of Hadoop from source. The pipeline walks through the following steps:
- Ordering a VM from a cloud provider (DigitalOcean); see the first sketch after this list
- Updating the OS and installing Docker (second sketch below)
- Building Hadoop from the patched sources of the chosen version (third sketch below)
- Running the built images as LXC/LXD containers, with a network set up between them (fourth sketch below)
- Putting a file there (through an HTTP request) and getting it back, as a smoke test (fifth sketch below)
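A minimal Ansible sketch of the provisioning step, assuming the `community.digitalocean` collection is installed and an API token is available in the `DO_API_TOKEN` environment variable. The droplet name, size, image, and region are illustrative placeholders, not the values the actual playbooks use:

```yaml
# Sketch: order a build VM from DigitalOcean.
# Requires: ansible-galaxy collection install community.digitalocean
- name: Provision a build droplet
  hosts: localhost
  gather_facts: false
  tasks:
    - name: Order a droplet (name/size/image/region are illustrative)
      community.digitalocean.digital_ocean_droplet:
        state: present
        name: hadoop-build-01
        size: s-4vcpu-8gb
        image: ubuntu-22-04-x64
        region: fra1
        ssh_keys: ["{{ do_ssh_key_id }}"]
        oauth_token: "{{ lookup('env', 'DO_API_TOKEN') }}"
        unique_name: true
      register: droplet

    - name: Show the new VM's public addresses
      ansible.builtin.debug:
        msg: "{{ droplet.data.droplet.networks.v4 }}"
```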
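The OS preparation step could look like the following, using Ubuntu's distro `docker.io` package for brevity (the real playbooks may pull Docker from the upstream repository instead). The `build_vm` host group is an assumed inventory name:

```yaml
# Sketch: bring the fresh droplet up to date and install Docker.
- name: Prepare the build VM
  hosts: build_vm
  become: true
  tasks:
    - name: Update the package cache and upgrade everything
      ansible.builtin.apt:
        update_cache: true
        upgrade: dist

    - name: Install Docker (Ubuntu's distro package, for simplicity)
      ansible.builtin.apt:
        name: docker.io
        state: present

    - name: Make sure the Docker daemon is running and enabled
      ansible.builtin.service:
        name: docker
        state: started
        enabled: true
```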
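For the build itself, a sketch assuming the sources are checked out at a release tag and the local patches live in `/opt/patches/` (both hypothetical), with the build running inside a stock Maven/JDK 8 container. `mvn package -Pdist -DskipTests -Dtar` produces a Hadoop distribution tarball:

```yaml
# Sketch: build a patched Hadoop distribution inside Docker.
- name: Build Hadoop from patched sources
  hosts: build_vm
  become: true
  vars:
    hadoop_version: rel/release-3.3.6   # assumed tag; "the specific version"
    src_dir: /opt/hadoop-src
  tasks:
    - name: Check out the Hadoop sources at the release tag
      ansible.builtin.git:
        repo: https://github.com/apache/hadoop.git
        dest: "{{ src_dir }}"
        version: "{{ hadoop_version }}"

    - name: Apply the local patches (hypothetical /opt/patches/ directory)
      ansible.builtin.shell: git apply /opt/patches/*.patch
      args:
        chdir: "{{ src_dir }}"

    - name: Run the Maven build in a throwaway container
      ansible.builtin.command: >
        docker run --rm -v {{ src_dir }}:/src -w /src
        maven:3.8-openjdk-8
        mvn package -Pdist -DskipTests -Dtar
```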
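Running the result as LXD containers could be sketched as below. The bridge name, image alias, and container roles are assumptions, and the step of exporting the built image from Docker and importing it into LXD is elided:

```yaml
# Sketch: a bridge plus one LXD container per Hadoop role.
# Requires: ansible-galaxy collection install community.general
- name: Start the Hadoop cluster as LXD containers
  hosts: build_vm
  become: true
  tasks:
    - name: Create a dedicated bridge for the cluster
      ansible.builtin.command: lxc network create hadoopbr0
      register: net
      changed_when: net.rc == 0
      failed_when: net.rc != 0 and 'already exists' not in net.stderr

    - name: Launch one container per role, attached to the bridge
      community.general.lxd_container:
        name: "{{ item }}"
        state: started
        source:
          type: image
          mode: local
          alias: hadoop-node   # assumed alias of the imported image
        devices:
          eth0:
            type: nic
            nictype: bridged
            parent: hadoopbr0
        wait_for_ipv4_addresses: true
      loop:
        - namenode
        - datanode1
        - datanode2
```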
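The smoke test could be expressed with Ansible's `uri` module against WebHDFS. The hostname, port 9870 (the Hadoop 3.x NameNode HTTP port), and the `/tmp/smoke.txt` path are assumptions; note that WebHDFS file creation is a two-step HTTP PUT, where the NameNode answers with a redirect to a DataNode:

```yaml
# Sketch: WebHDFS round-trip smoke test.
- name: Smoke-test the cluster over WebHDFS
  hosts: localhost
  gather_facts: false
  vars:
    namenode: http://namenode:9870   # assumed host and Hadoop 3.x HTTP port
  tasks:
    - name: Ask the NameNode where to write (it answers with a 307 redirect)
      ansible.builtin.uri:
        url: "{{ namenode }}/webhdfs/v1/tmp/smoke.txt?op=CREATE&overwrite=true&user.name=hdfs"
        method: PUT
        follow_redirects: none
        status_code: 307
      register: create_step

    - name: Send the file body to the DataNode from the redirect
      ansible.builtin.uri:
        url: "{{ create_step.location }}"
        method: PUT
        body: hello hadoop
        status_code: 201

    - name: Read the file back
      ansible.builtin.uri:
        url: "{{ namenode }}/webhdfs/v1/tmp/smoke.txt?op=OPEN&user.name=hdfs"
        follow_redirects: all
        return_content: true
      register: readback

    - name: Verify the round-trip
      ansible.builtin.assert:
        that: readback.content == 'hello hadoop'
```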