
Implement a new exercise generator #29

Open

gvrooyen opened this issue Aug 28, 2024 · 0 comments

There is a basic exercise generation script (`bin/gen-exercise.sh`), but it does not yet use `configlet create` to correctly populate the new exercise's configuration files or to add blank files for the stub, tests, and example.

A more comprehensive exercise generator would do the following (steps 1 and 2 are sketched after the list):

  1. Run configlet create to create the skeleton structure of the exercise.
  2. Add a very basic solution stub, and replicate it as the initial example in the .meta folder.
  3. Generate the stubs for the tests from the exercise's canonical-data.json (and possibly a similar track-specific metadata file; see below).
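A driver for the first two steps might look something like the sketch below. The directory layout, the stub file names, and the `--practice-exercise` flag are assumptions to verify against `configlet create --help`, not a description of the current tooling:

```python
#!/usr/bin/env python3
"""Sketch of a generator driver for steps 1 and 2 (not the actual tool)."""
import shutil
import subprocess
import sys
from pathlib import Path


def generate(slug: str) -> None:
    # Step 1: let configlet create the skeleton and configuration files.
    # The --practice-exercise flag is an assumption; check `configlet create --help`.
    subprocess.run(["configlet", "create", f"--practice-exercise={slug}"], check=True)

    exercise = Path("exercises") / "practice" / slug
    name = slug.replace("-", "_")

    # Step 2: write a very basic solution stub...
    stub = exercise / f"{name}.odin"
    stub.write_text(f"package {name}\n")

    # ...and replicate it as the initial example in the .meta folder.
    (exercise / ".meta").mkdir(exist_ok=True)
    shutil.copy(stub, exercise / ".meta" / "example.odin")


if __name__ == "__main__":
    generate(sys.argv[1])
```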

For the last step, each object in the `cases` array would correspond to a test: `property` would name the solution procedure to call (converted from camelCase to snake_case), `input` would specify the procedure's parameters and their test values, and `expected` would give the expected result for those inputs.
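As an illustration, a generator could walk the canonical cases along these lines; `snake_case` and `iter_cases` are hypothetical helpers, and the traversal accounts for the nested case groups that `canonical-data.json` files can contain:

```python
import json
import re
from pathlib import Path


def snake_case(name: str) -> str:
    """Convert a camelCase property name to snake_case."""
    return re.sub(r"(?<=[a-z0-9])([A-Z])", r"_\1", name).lower()


def iter_cases(node: dict):
    """Yield leaf test cases, descending into nested case groups."""
    for case in node.get("cases", []):
        if "cases" in case:  # a group of cases, not a test itself
            yield from iter_cases(case)
        else:
            yield case


data = json.loads(Path("canonical-data.json").read_text())
for case in iter_cases(data):
    proc = snake_case(case["property"])  # procedure to call in the solution
    args = case["input"]                 # parameter names and test values
    expected = case["expected"]          # expected result for those inputs
    print(f'{case["description"]}: {proc}({args}) -> {expected}')
```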

The value of `property` and the keys of `input` could also be used to generate the procedure signatures in the solution stub.
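Continuing the sketch, those same two fields could drive stub generation. Since canonical data carries no type information, this example falls back to `string` everywhere; a track-specific metadata file would have to supply the real parameter and return types:

```python
import re


def snake_case(name: str) -> str:
    return re.sub(r"(?<=[a-z0-9])([A-Z])", r"_\1", name).lower()


def odin_stub(property_name: str, input_keys: list[str]) -> str:
    """Emit a placeholder Odin procedure for one canonical property.

    Types are guesses (string everywhere); a track-specific metadata
    file would refine them.
    """
    proc = snake_case(property_name)
    params = ", ".join(f"{key}: string" for key in input_keys)
    return (
        f"{proc} :: proc({params}) -> string {{\n"
        f"\t// TODO: implement {proc}\n"
        f'\treturn ""\n'
        f"}}\n"
    )


# odin_stub("twoFer", ["name"]) produces:
#
# two_fer :: proc(name: string) -> string {
#     // TODO: implement two_fer
#     return ""
# }
```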

@BNAndras notes that we probably want to be able to define track-specific tests, keeping them in the `.meta` metadata folder as static definitions. It would then be possible to regenerate the `.odin` files from scratch, or just add new tests, whenever either the canonical tests or the track-specific definitions are updated:

The main thing to watch out for would be not to clobber the track-specific tests. My spur-of-the-moment thought would be to perhaps have an additional file in the .meta folder that contains JSON for the extra tests. Then the generator can append the generated test cases from that file to the end of the upstream test cases.
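A minimal version of that merge, assuming a hypothetical `.meta/additional-tests.json` that follows the same `cases` schema as the canonical data, could be:

```python
import json
from pathlib import Path


def load_cases(exercise: Path) -> dict:
    """Merge upstream canonical data with track-specific extras.

    The .meta/additional-tests.json name and schema are assumptions.
    Extras are appended after the upstream cases, so regenerating the
    .odin test file never clobbers the track-specific tests.
    """
    data = json.loads((exercise / "canonical-data.json").read_text())
    extra_file = exercise / ".meta" / "additional-tests.json"
    if extra_file.exists():
        extras = json.loads(extra_file.read_text())
        data["cases"].extend(extras["cases"])
    return data
```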
