crawler skip links nested within a shadow tree #1216
Hi @yinonov, thanks for filing an issue! I believe this is not a bug and is in fact by design. Please check out our docs here, which discuss why Accessibility Insights does not support elements that are contained within a closed shadow DOM: https://accessibilityinsights.io/docs/web/reference/faq/#are-automated-checks-compatible-with-shadow-dom
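For context, a minimal sketch (hypothetical element name, not code from the linked docs) of why a closed shadow root is out of reach: with `mode: 'closed'`, the host's `shadowRoot` is `null` from the outside, so in-page queries cannot reach the link at all.

```typescript
// Hypothetical custom element whose only link lives in a *closed* shadow root.
class ClosedCard extends HTMLElement {
  connectedCallback() {
    // With mode: 'closed', the returned root is reachable only via this local
    // variable; this.shadowRoot stays null for any outside code.
    const root = this.attachShadow({ mode: 'closed' });
    root.innerHTML = '<a href="/unreachable">link inside a closed shadow tree</a>';
  }
}
customElements.define('closed-card', ClosedCard);

// From the page's point of view (e.g. a crawler's injected script):
const host = document.querySelector('closed-card');
console.log(host?.shadowRoot);                                  // null
console.log(document.querySelector('a[href="/unreachable"]'));  // null
```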
The team requires additional author feedback; please review their replies and update this issue accordingly. Thank you for contributing to Accessibility Insights!
Thanks @sfoslund, this section you shared refers to a
This issue has been marked as ready for team triage; we will triage it in our weekly review and update the issue. Thank you for contributing to Accessibility Insights!
This issue requires additional investigation by the Accessibility Insights team. When the issue is ready to be triaged again, we will update the issue with the investigation result and add "status: ready for triage". Thank you for contributing to Accessibility Insights!
Investigating with engineers
Any clue why this happens? Does it happen in axe itself?
You are referring to which links are crawled, not which links are scanned, correct? Not crawling open shadow DOM links is a current limitation of accessibility-insights-service. The most realistic path to picking up links in open shadow DOMs would be switching the service from Puppeteer to Playwright (which would pierce open shadow DOMs by default with the sort of query apify uses to scrape for links). Piercing closed shadow DOMs is a lot harder implementation-wise; that's functionality that neither Puppeteer nor Playwright facilitates. This does make sense for us to address long-term. I have filed an issue in the service repo to track the request: #2170. I will close this issue, but the community should feel free to leave comments in the service repo to help us prioritize this change.
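To make the Puppeteer vs. Playwright distinction concrete, here is a rough sketch (not code from accessibility-insights-service; the helper name is a placeholder) of how Playwright's selector engine surfaces links inside open shadow roots by default. Closed shadow roots stay invisible to it either way.

```typescript
import { chromium } from 'playwright';

// Hypothetical helper: collect every href on a page, including anchors that
// live inside *open* shadow roots, which Playwright locators pierce by default.
async function collectLinks(url: string): Promise<string[]> {
  const browser = await chromium.launch();
  const page = await browser.newPage();
  await page.goto(url);

  const hrefs = await page
    .locator('a[href]') // matches light-DOM and open-shadow-DOM anchors alike
    .evaluateAll(anchors => anchors.map(a => (a as HTMLAnchorElement).href));

  await browser.close();
  return hrefs;
}

// Example usage against the site from this issue:
// collectLinks('https://vivid-j0vxkmsqmhn0.deno.dev/').then(console.log);
```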
Thanks for the detailed explanation. In the meantime, is there a workaround where I could specify a list of links for the action to visit?
Hello @yinonov, sorry for the delayed response. There is a mechanism for specifying a list of links for the action to visit: you can use the `input-urls` input. An example of the step with the URLs provided might look something like this:

```yaml
- name: Scan for accessibility issues
  uses: microsoft/accessibility-insights-action@v3
  with:
    url: https://your-website-url
    input-urls: https://your-first-link https://your-second-link https://your-third-link
```

This will put the four listed URLs in the queue to be scanned first, then any crawled URL will be added to the end of the queue and will be scanned after those first four. Hopefully that solves your issue for now. Let us know if you have any further questions!
Thanks @brocktaylor7, that's exactly what I did.
Describe the bug
Running the action on a website with links that are nested within a shadow tree skips visiting those links; only links nested within the light tree are visited.
CodePen repro example
It's not a repro, but the actual site (preview channel) where this occurs and shadow links are skipped:
https://vivid-j0vxkmsqmhn0.deno.dev/
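In lieu of a CodePen, a minimal, hypothetical sketch of the structure being described (element name and paths invented): one link in the light tree and one inside an open shadow tree. The reported behavior is that the crawler visits the first but skips the second.

```typescript
// Hypothetical reduced page: a light-DOM link next to an open-shadow-DOM link.
class NavCard extends HTMLElement {
  connectedCallback() {
    const root = this.attachShadow({ mode: 'open' });
    root.innerHTML =
      '<a href="/shadow-page">link inside an open shadow tree (reported as skipped)</a>';
  }
}
customElements.define('nav-card', NavCard);

document.body.innerHTML = `
  <a href="/light-page">link in the light tree (crawled as expected)</a>
  <nav-card></nav-card>
`;
```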
Expected behavior
Shadowed links should be visited and checked
Screenshots
This gets visited.
This doesn't.
Both are equivalent according to the accessibility tree view (DevTools) and share the same computed properties.
Are you willing to submit a PR?
Of course, with a little guidance
Did you search for similar existing issues?
yes, none to be found
Additional context
This bug might not be related to the GH action but to the underlying Accessibility Insights service; forward to the relevant repo if needed.