[Feature] : +[Doc] Maintain annotations from all retries #35215
This would also permit appending data to existing descriptions. In case it seems like an edge case to you, here is an explicit example:
It's unclear what you'd like to see here. Can you clearly enumerate your desired changes in a small list?
Not sure where it lacks clarity, but the title summarizes it :/ Let me rephrase it then. I would like test annotations to be run scoped, so that annotations from every retry are kept.
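For what it's worth, here is a minimal sketch of how annotations are pushed at runtime today with the existing `test.info().annotations` API; the annotation type and the env var are made up for illustration:

```ts
import { test } from '@playwright/test';

test('checkout flow', async () => {
  // Annotations can be pushed while the test runs, but the report
  // currently only keeps the ones from the latest retry.
  test.info().annotations.push({
    type: 'server-instance',
    description: process.env.INSTANCE_ID ?? 'unknown', // hypothetical env var
  });
  // ... rest of the test
});
```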
I don't understand how you would just make annotations run scoped. From your example, I would expect to see:
Run 1
Run 2
Combined all runs
Even if you deduped them, you would get
Which is definitely not what you want. As discussed in #30188, which you linked, annotations are intended to be static. You are not expected to modify them during the body of the test (only conditionally at the beginning, based on your platform or whatever). You should be using attachments for this use case. If that's not sufficient for your use case, you can also write a custom reporter.
Discussed with the team, we would like to make annotations available on
Nice to hear! Will definitely help :)
🚀 Feature Request
Hello community & maintainers!
Firstly, I would like to say that some aspects of annotations are confusing for people, which is a shame given the overall quality of the Playwright documentation.
As per #33954, it seems this had already been discussed internally, but it was closed as not planned without further indication. Not sure if this is related, but since that issue had the 1.50 tag and 1.51 brought the possibility of adding attachments to test.step, I suppose this is the expected way to add dynamic values. If so, it should definitely be mentioned in the annotations documentation.
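If I read the 1.51 change right, here is a sketch of attaching a dynamic value to a step via the `TestStepInfo` passed to the step callback; the step title and payload are made up:

```ts
import { test } from '@playwright/test';

test('dynamic values on a step', async () => {
  await test.step('call SMS sender', async (step) => {
    // Since 1.51 the step callback receives a TestStepInfo,
    // which supports attachments scoped to that step.
    await step.attach('api-response', {
      body: JSON.stringify({ status: 'token rejected' }), // illustrative payload
      contentType: 'application/json',
    });
  });
});
```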
Secondly, as has already been said in #30188 and #32411, users tend to think that annotation values are kept for each run, or that their values are updated, but it is trickier than that: only the ones from the latest run are kept.
It almost sounds like a bug: runtime annotations added only on previous runs will not appear at all, which loses information a developer found important enough to add manually. Please put this in a warning block in the docs.
Yet this could be fixed by keeping every annotation across runs and simply overriding its description when it is updated.
Some would argue that we would lose traceability of which run an annotation came from, but we could simply add testInfo.retry to its type.
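A purely illustrative sketch of the proposed shape; the `retry` field does not exist in Playwright today:

```ts
// Hypothetical annotation type: keep annotations from every run
// and record which retry produced each one.
type ProposedAnnotation = {
  type: string;
  description?: string;
  retry?: number; // proposed: would mirror testInfo.retry at push time
};
```

The report could then group or dedupe annotations per retry instead of silently dropping all but the latest run's.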
You may think this is a duplicate of the aforementioned issues. In fact, I would hugely prefer the first one to be reopened, as it would be better to treat each annotation block as a separate object for each retry, or at least to do so for runtime ones while keeping the static ones on top; I understand this may be difficult to implement and that attachments can fulfill those needs. However, I think the solution I propose would be less complex to implement, while removing a counter-intuitive behaviour.
Example
Let's say we have 2 runs, because an error occurred during the first one: one of the server instances has an invalid SMS sender API token.
Actual behaviour:
No mention of the error event here.
With the feature:
Here, the error event and all visited nodes are kept.
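The report screenshots are not reproduced here, but here is a sketch of how the annotations in this example could be produced; the instance id and Sentry URL are made up:

```ts
import { test } from '@playwright/test';

test('send verification SMS', async () => {
  const info = test.info();
  info.annotations.push({ type: 'server-instance', description: 'instance-2' }); // hypothetical id
  info.annotations.push({ type: 'started-at', description: new Date().toISOString() });

  try {
    // ... exercise the SMS sender ...
  } catch (error) {
    // Pushed only on the failing first run; with today's behaviour this
    // annotation vanishes from the report once the retry passes.
    info.annotations.push({
      type: 'error-event',
      description: 'https://sentry.example.com/issues/12345', // illustrative URL
    });
    throw error;
  }
});
```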
Motivation
As you can see above, I put several key pieces of information in the annotations, as they are the first element to appear in the report that allows sharing clickable URLs. Unfortunately, some of them are dynamic between runs, and I can't track them all this way.
Putting them in test steps or attachments is not ideal, as it forces you to go to the bottom of the report.
For example, when an error event shows up, I link the id of the issue generated by Sentry, to reduce the number of clicks devs need to reach those precious debugging logs in critical situations.
I also use them to track the ID of the tested server instance, as we use a load balancer, and when the test started. I don't find it convenient to put these in attachments, but I could accommodate that if attachments were natively added to the report for each run/retry, the same way they are displayed on the main page of the HTML report.