Status: Open
Labels: enhancement (New feature or request)
Description
Problem
Currently, the prompts in `AGENTS.md` focus on generating high-level proposals and tasks, but they do not specify how to verify that each task has been completed correctly. When OpenSpec generates `tasks.md`, there is no guidance on acceptance criteria, so developers must infer how to test or validate the work.
Proposal
- Update the prompts in `openspec/AGENTS.md` so that for every generated task (and, if applicable, for each section of tasks), the AI also provides a clear acceptance or verification instruction.
- The acceptance instructions should describe how to verify that the implementation meets the requirement, e.g., which tests should pass, what behavior should be observed, or what artifacts should be delivered.
- This guidance should be embedded directly in the tasks file to help developers and reviewers understand when a task is done.
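As a rough sketch (the exact layout is up to the prompt design, and the task content below is entirely hypothetical), a generated `tasks.md` entry could pair each task with a short verification note:

```markdown
## 1. Implement config validation

- [ ] 1.1 Add schema validation for the project config file
  - **Verify:** the validation test suite passes, and running the CLI
    against a config with a missing required field prints a clear
    validation error instead of crashing.
- [ ] 1.2 Document the new validation rules in the README
  - **Verify:** the README lists every required field, and a reviewer
    can reproduce the error message from the docs alone.
```

Keeping the verification note directly under each checkbox means a reviewer can mark a task done only when its stated check has been performed.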
Benefits
- Embedding acceptance criteria makes tasks measurable and reduces ambiguity.
- It facilitates QA and code review by defining what “done” looks like for each task.
- Improves the reliability of generated proposals and helps ensure that the final implementation aligns with the intended specification.
Please consider adding acceptance verification instructions to the AGENTS.md prompts.