With DevSkiller.com you can assess your candidates' programming skills as part of your recruitment process. We have found that programming tasks are the best way to do this and have built our tests accordingly. In our tests, the candidate is asked to modify the source code of an existing project.
During the test, your candidates have the option of using our browser-based code editor and can build the project inside the browser at any time. If they would prefer to use an IDE they are more comfortable with, they can also download the project code or clone the project’s Git repository and work locally.
You can check out this short video to see the test from the candidate's perspective.
This repo contains a sample project for Terraform, and below you can find a detailed guide for creating your own programming project.
Please make sure to read our Getting started with programming projects guide first.
It is possible to automatically assess the solution provided by the candidate. Automatic assessment is based on the plan generated by Terraform and on code quality measurements from the tflint linter.
Verification tests are performed with Open Policy Agent and rego rules that are included in the task and hidden from the candidate during the test. Files containing verification tests are added to the project after the candidate finishes the test and are executed during the verification phase. The verification test results are used to calculate the final score.
Once the candidate submits their solution, the platform executes the verification: it creates a plan from the files provided by the candidate and then reads the output of the bats tests included in the task.
Terraform tasks can be configured with the DevSkiller project descriptor file:
- Create a `devskiller.json` file.
- Place it in the root directory of your project.
Here is an example project descriptor:
```json
{
  "verification": {
    "testNamePatterns": [
      ".*Verification.*"
    ],
    "pathPatterns": [
      "**verification/verify**"
    ]
  },
  "terraform": {
    "configureProvider": [
      "aws"
    ]
  }
}
```
You can find more details about the `devskiller.json` descriptor in our documentation.
To define the Terraform-specific variables in the descriptor, use the following pattern:

- `terraform.configureProvider` - a list of providers to configure for the task. Here is the list of supported cloud providers:
A Terraform task must include a `terraform` directory with an empty `empty.tf` file to ensure that Terraform returns no errors during the initialization of the task on the platform. This directory is also the place where candidates provide their solution code to be evaluated.
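For illustration only, a candidate's solution placed in the `terraform` directory is a plain Terraform configuration file; the file name (`main.tf`), resource type, and values in the sketch below are hypothetical and depend entirely on the requirements of your task:

```hcl
# Hypothetical sketch of a candidate solution file, e.g. terraform/main.tf.
# The actual resources to create are defined by your task requirements.
resource "aws_s3_bucket" "task_bucket" {
  bucket = "devskiller-task-example-bucket" # hypothetical bucket name

  tags = {
    Purpose = "recruitment-task"
  }
}
```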
Unless a task or a cloud provider requires additional configuration (e.g. `features {}` for azurerm), there is no need to put a configuration stanza for providers in a task; all configuration is provided by the platform, and our Terraform runtime uses read-only credentials with a limited scope and privileges.
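As a sketch of the azurerm case mentioned above, such a minimal provider stanza could look like the following; it is only needed because the provider requires an (empty) `features` block, and no credentials, subscriptions, or regions should be hard-coded, since the platform supplies them:

```hcl
# Minimal azurerm provider stanza, required only because the provider
# mandates a features block. All credentials are injected by the platform.
provider "azurerm" {
  features {}
}
```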
Upon completion of the exam, the files in the `verification/` directory are moved to the `terraform/` folder, where the bats tests are executed to evaluate the candidate's solution.