
Add _tests folder with a set of test files to run against the application #178

Open
paoloricciuti opened this issue Jul 26, 2024 · 4 comments
Labels
enhancement New feature or request

Comments

@paoloricciuti

Is your feature request related to a problem?

It would be cool to be able to provide a set of test files that will run against the application to validate whether the solution actually passes.

Describe the solution you'd like.

Add the ability to create a _test folder with tests for that specific lesson.

Describe alternatives you've considered.

Additional context

No response

@AriPerkkio
Member

AriPerkkio commented Jul 26, 2024

You can add test cases in the lesson's _files and in src/templates, add a script to package.json that runs your test command, and then point mainCommand at that script.

Something like this should work:

src/
├── content
│   └── tutorial
│       ├── meta.md
│       └── math
│           └── exercises
│               ├── meta.md
│               └── sum
│                   ├── meta.md
│                   ├── _files
│                   │   └── sum.js
│                   └── _solution
│                       └── sum.js
└── templates
    └── default
        ├── sum.test.js
        └── package.json
// _files/sum.js
export function sum(a, b) {
  return 0; // TODO Fix me
}

// _solution/sum.js
export function sum (a, b) {
  return a + b;
}

// templates/default/sum.test.js
// sum.js comes from the lesson's _files, which are merged with the template at runtime;
// `test` and `expect` come from the chosen test runner (as globals or explicit imports).
import { sum } from './sum.js';

test('sum adds numbers', () => {
  expect(sum(2, 3)).toBe(5);
});
// package.json
{
  "scripts": {
    "test": "<your test runner command here>"
  }
}
// meta.md (lesson frontmatter)
---
mainCommand: ["npm run test", "Running tests"]
---
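
For a concrete illustration, the template's package.json could use Vitest as the test runner (an assumption for this sketch, not something the example above prescribes; any runner that works inside WebContainers should do, and the package name is arbitrary):

// templates/default/package.json (illustrative, assuming Vitest)
{
  "name": "sum-lesson-template",
  "type": "module",
  "scripts": {
    "test": "vitest run"
  },
  "devDependencies": {
    "vitest": "^2.0.0"
  }
}

Using `vitest run` rather than plain `vitest` avoids watch mode, so `npm run test` finishes on its own when invoked through mainCommand.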

@EricSimons
Member

@AriPerkkio would that gate users from proceeding to the next lesson (or provide some other indication that they "successfully completed" the challenge)? If not, I think that sort of functionality would be super useful

@AriPerkkio
Member

AriPerkkio commented Jul 30, 2024

We don't yet provide any way to prevent users from proceeding through the tutorials without completing previous lessons.

I think for this feature we could add support for:

  1. Run test cases against the code editor's current code and the preview's DOM
  2. Show the results of these test cases in the UI
  3. Optionally hide the "Next lesson" link when the test cases do not pass. Note that this doesn't prevent navigating via URL, as TutorialKit's Astro builds are just static files.

For the test cases we could provide our own test() / it() function that renders ✅ in the UI when the given callback passes, or ❌ when it throws.
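
A minimal sketch of what such a helper could look like (this is not an existing TutorialKit API; the report() wiring below is a hypothetical stand-in for rendering into the lesson UI):

// test-ui.js: hypothetical sketch, not an existing TutorialKit API
const results = [];

// Run the callback, record the outcome, and report ✅ or ❌.
export async function test(name, callback) {
  try {
    await callback();
    results.push({ name, status: 'pass' });
    report(`✅ ${name}`);
  } catch (error) {
    results.push({ name, status: 'fail', error });
    report(`❌ ${name}: ${error instanceof Error ? error.message : String(error)}`);
  }
}

// Alias so lesson authors can use either test() or it().
export const it = test;

// Placeholder reporter: logs to the console here; a real integration would
// render the results in the lesson UI instead.
function report(line) {
  console.log(line);
}

// Example usage in a lesson's test file:
//   test('sum adds numbers', () => {
//     if (sum(2, 3) !== 5) throw new Error('expected sum(2, 3) to be 5');
//   });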

AriPerkkio added the enhancement label on Jul 30, 2024
@paoloricciuti
Author

@AriPerkkio would that gate users from proceeding to the next lesson (or provide some other indication that they "successfully completed" the challenge)? If not, I think that sort of functionality would be super useful

Yeah, that was my idea, and I like this suggestion from @AriPerkkio:

For the test cases we could provide our own test() / it() function that renders ✅ in the UI when the given callback passes, or ❌ when it throws.

AriPerkkio mentioned this issue on Oct 2, 2024