This document supplements the Exercism contributing guide; all contributors should read that document before proceeding.
Exercism tracks inherit exercise definitions from the problem-specifications repository in the form of description files, from which exercise READMEs are generated.
```
exercises/[EXERCISE]/
├── [EXERCISE].py
├── [EXERCISE]_test.py
├── example.py
├── .meta
│   ├── template.j2
│   ├── additional_tests.json
│   └── hints.md
└── README.md
```
Files:

| File | Description | Source |
|---|---|---|
| `[EXERCISE].py` | Solution stub | Manually created by the implementer |
| `[EXERCISE]_test.py` | Exercise test suite | Automatically generated if `.meta/template.j2` is present; otherwise manually created by the implementer |
| `example.py` | Example solution used to automatically verify the `[EXERCISE]_test.py` suite | Manually created by the implementer |
| `.meta/template.j2` | Test generation template; if present, used to automatically generate `[EXERCISE]_test.py` (see the generator documentation) | Manually created by the implementer |
| `.meta/additional_tests.json` | Defines additional track-specific test cases; if `.meta/template.j2` is also present, these tests will be incorporated into the automatically generated `[EXERCISE]_test.py` | Manually created by the implementer |
| `.meta/hints.md` | Contains track-specific hints that are automatically included in the generated `README.md` | Manually created by the implementer |
| `README.md` | Exercise README | Generated by the configlet tool |
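For illustration, a track-specific case file might look like the sketch below. This assumes `.meta/additional_tests.json` mirrors the case structure used in `canonical-data.json` (the exercise, property name, and case shown are hypothetical, not taken from any real exercise):

```json
{
  "cases": [
    {
      "description": "year divisible by 100 but not by 400 is not a leap year",
      "property": "leapYear",
      "input": { "year": 1900 },
      "expected": false
    }
  ]
}
```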
- A local clone of the problem-specifications repository.
- configlet: may be obtained either by
  - (Recommended) Following the installation instructions at the above link
  - Running `bin/fetch-configlet` (the `configlet` binary will be downloaded to the repository's `bin/` directory)
```shell
configlet generate <path/to/track> --spec-path path/to/problem/specifications
configlet generate <path/to/track> --spec-path path/to/problem/specifications --only example-exercise
```
If an unimplemented exercise has a `canonical-data.json` file in the problem-specifications repository, a generation template must be created. See the test generator documentation for more information.
If an unimplemented exercise does not have a `canonical-data.json` file, the test file must be written manually (use existing test files for examples).
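A hand-written suite might look like the sketch below. It is a minimal illustration only, using a hypothetical `leap` exercise; a real test file would import the student's stub (e.g. `from leap import leap_year`) rather than defining the function inline, which is done here only so the sketch is self-contained:

```python
import unittest


# Inlined reference implementation; in a real test file this would be
# imported from the student's solution stub instead.
def leap_year(year):
    """Return True if the given year is a leap year."""
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)


class LeapTest(unittest.TestCase):
    def test_year_not_divisible_by_4(self):
        self.assertIs(leap_year(2015), False)

    def test_year_divisible_by_100_but_not_400(self):
        self.assertIs(leap_year(1900), False)

    def test_year_divisible_by_400(self):
        self.assertIs(leap_year(2000), True)


if __name__ == "__main__":
    unittest.main()
```

Note the `assertIs` calls: existing track suites tend to check for the actual `True`/`False` objects rather than merely truthy values.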
Example solution files serve two purposes:
- Verification of the tests
- Example implementation for mentor/student reference
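Because it serves both purposes, `example.py` should be correct and readable. A minimal sketch, using a hypothetical Hamming-distance exercise (the exercise choice and function signature are illustrative, not prescriptive):

```python
# example.py: reference solution for a hypothetical "hamming" exercise.
# It verifies the test suite and doubles as a readable reference
# implementation for mentors and students.
def distance(strand_a, strand_b):
    """Return the Hamming distance between two equal-length DNA strands."""
    if len(strand_a) != len(strand_b):
        raise ValueError("Strands must be of equal length.")
    # Count the positions at which the two strands differ.
    return sum(a != b for a, b in zip(strand_a, strand_b))
```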
`config.json` is used by the website to determine which exercises to load and in what order. It also contains some exercise metadata, such as difficulty, labels, and whether the exercise is a core exercise. New entries should be placed just before the first exercise that is marked `"deprecated": true`:
```json
{
  "slug": "current-exercise",
  "uuid": "aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa",
  "core": false,
  "unlocked_by": null,
  "difficulty": 1,
  "topics": [
    "strings"
  ]
},
<<< HERE
{
  "slug": "old-exercise",
  "uuid": "aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa",
  "core": false,
  "unlocked_by": null,
  "difficulty": 2,
  "topics": null,
  "deprecated": true
},
```
Fields:

| Field | Notes |
|---|---|
| `slug` | Hyphenated lowercase exercise name |
| `uuid` | Generate using `configlet uuid` |
| `core` | Set to `false`; core exercises are decided by track maintainers |
| `unlocked_by` | Slug of the core exercise that unlocks the new one |
| `difficulty` | `1` through `10`; discuss with your reviewer if uncertain |
| `topics` | Array of relevant topics from the topics list |
Similar to implementing a canonical exercise that has no `canonical-data.json`, but the exercise README will also need to be written manually. Carefully follow the structure of generated exercise READMEs.
Before committing:

- Run `configlet fmt` and `configlet lint` if `config.json` has been modified
- Run `flake8` to ensure all Python code conforms to style standards
- Run `test/check-exercises.py [EXERCISE]` to check that your test changes function correctly
- If you modified or created a `hints.md` file, regenerate the README
- If your changes affect multiple exercises, try to break them up into a separate PR for each exercise