Proposal: Improve Test Results Output for AI and Manual Debugging #709

@GermanJablo

Description

Clear and concise description of the problem

The Problem

The raw output in the "Test Results" panel (and the associated terminals) has room for improvement:

  1. AI-Assisted Debugging is Hindered: when sharing test failures with AI tools, the fragmented output format creates friction. Expected and actual values appear in separate boxes, and the same applies to error stack traces. Because of this, I need to manually copy from 3-4 different places and reconstruct the context before pasting into an AI chat. This breaks the flow and introduces errors.

  2. Manual Reproducibility Takes Time: currently, there is no straightforward way to get a one-click command that replicates the exact test run in a standalone terminal for deeper debugging.

  3. Comparing with CLI Behavior Creates Confusion: developers are used to running Vitest in the terminal and seeing formatted, comprehensive output. When the same test fails in VS Code, the output looks completely different and provides less context, which forces a re-run in the terminal just to get better output.

Example Workflow

Current

Screen.Recording.2026-01-09.at.22.55.28.mov

I exaggerated a bit with the copying and pasting... but it's usually not just a simple number to type out.

Suggested

Screen.Recording.2026-01-09.at.23.00.45.mov

Note that the exact command to replicate the run in the terminal is also provided.
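As a rough illustration of what such a copyable command could look like, here is a sketch using Vitest's standard CLI filters. The file path and test name are hypothetical placeholders, not taken from the recording:

```shell
# Hypothetical reproduction command the extension could surface for a failing test.
# `vitest run` executes once (no watch mode); `-t` filters by test name pattern.
npx vitest run src/math.test.ts -t "adds two numbers"
```

The point is that everything needed to reproduce the failure (file, test name, run mode) is captured in a single line the user can paste into a terminal.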

Suggested solution

I've opened a pull request here: #707, on which several issues have been pointed out.

I am willing to contribute and find the most idiomatic way to solve this problem, but I would first like to know if the maintainers find value in my proposal.

Alternative

Alternatively, I would like at least one of the following options:

  1. Customize the test results output, or
  2. Allow the extension to trigger tests in the CLI instead of the Test Results panel (similar to how this Jest extension does it).

Additional context

No response

Validations

Metadata

Assignees

Labels

p2-nice-to-have: Not breaking anything but nice to have (priority)

Type

No type

Projects

No projects

Milestone

No milestone

Relationships

None yet

Development

No branches or pull requests
