[Recurring] Keep exercises up to date with problem specifications #1027
Labels:
- `x:action/sync`: Sync content with its latest version
- `x:knowledge/intermediate`: Quite a bit of Exercism knowledge required
- `x:module/practice-exercise`: Work on Practice Exercises
- `x:size/medium`: Medium amount of work
- `x:type/content`: Work on content (e.g. exercises, concepts)
This is a recurring task that has to be done from time to time. Do not reference this issue by writing `resolves/fixes #XXX` in your PR because that will close it.

We want to keep the track up to date with problem specifications, which means regularly checking if anything changes and needs updating.
How to do this task:
Step 1: fetch configlet
Run:
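The command itself was not preserved here; assuming the standard Exercism track repository layout, fetching configlet is typically done with the bundled script:

```shell
# Downloads the latest configlet release into bin/ (standard Exercism track script).
bin/fetch-configlet
```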
Step 2: check which exercises need updating and in which way
Run:
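A sketch, assuming the standard configlet CLI; without `--update`, `configlet sync` only reports what is out of sync and changes nothing:

```shell
# Check all exercises for outdated docs, metadata, and tests.
bin/configlet sync
```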
The output will list all Elixir exercises that are out of sync in some way.
Choose one or more problems and attempt to fix them. Metadata and doc updates can all be done together for many exercises in a single PR, but please create separate PRs per exercise when changing tests.
Step 3.1: Sync all docs
Run this command to update all exercise docs (introductions):
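Assuming the standard configlet CLI, the docs sync command would be along these lines:

```shell
# Interactively update outdated docs for all exercises.
bin/configlet sync --update --docs
```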
Press `y` when asked `sync the above docs ([y]es/[n]o)?`.

Step 3.2: Sync all metadata
Run this command to update all exercise metadata:
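Assuming the same configlet CLI conventions as for docs, the metadata variant would look like:

```shell
# Interactively update outdated metadata for all exercises.
bin/configlet sync --update --metadata
```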
Press `y` when asked `sync the above metadata ([y]es/[n]o)?`.

Step 3.3: Update tests for an exercise
The general goal is that all Elixir practice exercises follow problem specifications, except for when it doesn't make any sense in Elixir.
Important:
- Familiarize yourself with the problem specifications repository and the `canonical-data.json` format it uses for describing test cases.
- Read `CONTRIBUTING.md` for the general rules of this repository.
- Check `exercises/practice/<exercise>/.meta/design.md` if it exists to learn about design decisions that we took in the past for this exercise (e.g. to drop a required function from `resistor-color`). Create this file if you're making unexpected decisions yourself!

Step 3.3.1 Update `tests.toml`

Update `tests.toml` by running this command for a chosen exercise:

This command assumes that we want to implement all the new tests, and that is our default approach, but it still needs verifying on a test-by-test basis.
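Assuming the standard configlet CLI (flag names per configlet's documented `sync` options), the command referenced above would be a sketch like:

```shell
# Interactively update tests.toml for one exercise, defaulting to including new cases.
bin/configlet sync --update --tests include --exercise <exercise-slug>
```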
Step 3.3.2 Analyze the new test cases and update the Elixir test files accordingly
Find all the test cases affected by the previous step in the exercise's `canonical-data.json` file in the problem specifications repository. Analyze what the changes actually are. You might need to delete a test. You might need to add a new test. You might also need to replace an existing test with a new one (you'll see in `tests.toml` that it "reimplements" another).

If you believe a new test case shouldn't be implemented, add `include = false` to it in `tests.toml` and document why in `exercises/practice/<exercise>/.meta/design.md`.
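For illustration, an excluded entry in `tests.toml` might look like the following sketch (the UUID, description, and comment are made up):

```toml
# Hypothetical test case entry in tests.toml
[11111111-2222-3333-4444-555555555555]
description = "some test case that does not fit Elixir"
include = false
comment = "see .meta/design.md for the reasoning"
```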
When editing the Elixir tests, make sure to:
- reuse the test case descriptions from the `tests.toml` file as texts of `describe` and `test` blocks (unless that would invalidate the first rule).
- keep the order of the tests the same as in the `tests.toml` file (unless that would invalidate the first rule).
- have functions that need to report errors use the `{:ok, result} | {:error, error}` return type (not raise errors!).

Step 3.3.3 Check that the example solution works with new test cases
After modifying the Elixir tests, run this command:
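The exact command is not preserved here; one hedged sketch (file names are assumptions, and the track may provide its own script for this) is to run the exercise's test suite against the example solution instead of the stub:

```shell
cd exercises/practice/<exercise>
# Temporarily replace the stub with the example solution, then run the tests.
# (Adjust the target file name per exercise; this is only a sketch.)
cp .meta/example.ex lib/<exercise_module>.ex
mix test
```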
If it fails, double-check if the new tests are correct. If they are, that means our example solution for this exercise is no longer valid for the new tests and needs to be updated!
When adding a completely new function that students need to implement, or when changing the return types of an existing one, make sure to also update the stub file.
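As a sketch of the expected stub shape (module, function, and types here are entirely hypothetical, not from any specific exercise): the stub declares the function signature and typespec, including the `{:ok, result} | {:error, error}` return type where applicable, but leaves the body empty for the student.

```elixir
defmodule SomeExercise do
  # Illustrative stub only; names and types are hypothetical.
  @doc """
  Parses the input as described by the exercise instructions.
  """
  @spec parse(String.t()) :: {:ok, integer()} | {:error, String.t()}
  def parse(_input) do
  end
end
```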
Need help?