Description
The Problem
The raw output in the "Test Results" panel (and the associated terminals) has room for improvement:
- AI-Assisted Debugging is Hindered: when sharing test failures with AI tools, the fragmented output format creates friction: expected and actual values appear in separate boxes, and the same applies to error stack traces. Because of this, I need to manually copy from 3-4 different places and reconstruct the context before pasting into an AI chat. This breaks the flow and introduces errors.
- Manual Reproducibility Takes Time: currently, there is no straightforward way to get a one-click command that replicates the exact test run in a standalone terminal for deeper debugging.
- Comparing with CLI Behavior Creates Confusion: developers are used to running vitest in the terminal and seeing formatted, comprehensive output. When the same test fails in VS Code, the output looks completely different and provides less context, which forces a re-run in the terminal just to get better output.
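To make the reproducibility point concrete, here is a minimal sketch of how the extension could assemble the copyable reproduce command. The file path and test name are hypothetical placeholders; `vitest run` and the `-t` test-name filter are existing vitest CLI options.

```shell
# Sketch: build the one-click reproduce command the panel could surface
# for a failing test. FILE and NAME are hypothetical placeholders for the
# failing spec file and the failing test's title.
FILE="src/math.test.ts"
NAME="adds two numbers"
echo "npx vitest run $FILE -t \"$NAME\""
```

Running the test file directly with `-t` scopes the CLI run to just the failing test, so the terminal output matches exactly what failed in the panel.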
Example Workflow
Current
Screen.Recording.2026-01-09.at.22.55.28.mov
I exaggerated a bit with the copying and pasting, but the value to transcribe is usually not just a simple number.
Suggested
Screen.Recording.2026-01-09.at.23.00.45.mov
Note that the exact command to replicate the run in the terminal is also provided.
Suggested solution
I've opened a pull request here: #707, on which several issues have been pointed out.
I am willing to contribute and find the most idiomatic way to solve this problem, but I would first like to know whether the maintainers see value in this proposal.
Alternative
Alternatively, I would like at least one of the following options:
- customize the test results output, or
- allow the extension to trigger tests in the CLI instead of the Test Results panel (similar to how this Jest extension does it)
Additional context
No response
Validations
- Follow our Code of Conduct
- Read the Contributing Guidelines.
- Read the docs.
- Check that there isn't already an issue that requests the same feature, to avoid creating a duplicate.