[Proposal] Add automated accessibility testing #86
Comments
With the changes to sample apps in #5, and given how we see the sample app working in 'dev'/single-experiment mode vs. 'release'/browse mode, I see some options here. We could enable the first option when running the sample app in single-experiment mode, as an icon on the TabView container, and when running the WinAppSDK head. We could document this as a tool available to developers and as part of the Labs check-off list/process. I would imagine that if accessibility works well with Axe on WinUI 3, it should also work when running in UWP. We can investigate a more full-featured approach, like option 3, in the main repo later as part of the transition out-of-labs process (see #47). In theory, if the control itself is already tested, then adding this as part of the transition process wouldn't be much extra effort and shouldn't raise many new/surprise issues. Adding it to our sample shell process also means the basic templates don't need to be modified, and we can add this later without needing it for the initial launch of Labs.
Yes, I think this is a good idea, but it's essential to be clear about the goals here, how they fit into the wider project, and how we balance them against the other challenges and constraints. Goals:
Challenges:
Constraints:
My suggestion:
Note: I have set up automated accessibility testing of UWP apps using Axe for Template Studio, so this doesn't need to be limited to the WinUI 3 builds. It is done as part of the integrated (BUILD*) tests, which run on the CI pipeline but are slow. *️⃣ I use "fully" accessible in quotation marks because this is hard to define.
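For context, an integrated (BUILD-style) accessibility test along these lines could be sketched as below. This is only a hedged illustration: it assumes the Axe.Windows automation NuGet package and xUnit, and the app path (`SampleApp\SampleApp.exe`) is a placeholder. The exact API shape (`Config.Builder`, `ScannerFactory`, `ScanResults.ErrorCount`) varies between Axe.Windows versions, so check the version you actually pull in.

```csharp
// Hedged sketch of an integrated accessibility test: launch the sample app,
// run an Axe.Windows scan against its process, and fail the build on errors.
// The executable path and the Axe.Windows API names are assumptions.
using System.Diagnostics;
using Axe.Windows.Automation;
using Xunit;

public class AccessibilityTests
{
    [Fact]
    public void SampleApp_HasNoAxeFailures()
    {
        // Placeholder path to the built sample-app executable.
        using var app = Process.Start(@"SampleApp\SampleApp.exe");
        try
        {
            app.WaitForInputIdle();

            // Point Axe.Windows at the running app's process.
            var config = Config.Builder.ForProcessId(app.Id).Build();
            var scanner = ScannerFactory.CreateScanner(config);
            var results = scanner.Scan();

            // Any reported accessibility error fails the test (and the CI run).
            Assert.Equal(0, results.ErrorCount);
        }
        finally
        {
            app.Kill();
        }
    }
}
```

Because this drives a real UI process, it is inherently slow, which matches the caveat above about these tests running on the CI pipeline.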
The big question is what we want to achieve with accessibility testing in the Toolkit Labs. Should it just be a quick check that, a11y-wise, controls are good enough? Or should it ensure that they are accessible (in the short or long term)? If the goal is to ensure accessibility through and through, I don't think that WinUI 3 option 1 is enough, especially for controls on UWP.
@chingucoding The goal of what I was proposing with the WinUI 3 option is an easy-to-add method for developers to check a11y principles during initial development in Labs, validating some basics ahead of time. Then, in the main repo, as part of the transfer process, we could have a step/template for adding a more rigorous integration test that can run on both UWP and WinUI 3. I guess the questions are around how Axe works and whether we can run it in CI too. Could you share more docs/details of what you currently have integrated in the WinUI repo?
This is good. Since we're multi-targeting and devs can fairly easily provide completely different code for the different targets, we can't assume that automated tests passing on any one platform are an automatic green light for the others.
@mrlacey If I understand the proposal correctly, this sounds like something that would belong in a Labs wiki. Unless this suggestion is for directions that the control author writes out, like a checklist? I'd need more details on what you're thinking here.
Adding Axe tests as part of an automated test strategy should be the same for WinUI 3 and UWP once we've solved how to start the respective test apps (and how to interact with them). For WinUI, we have dedicated pages that cover the different states of the control, though the current shortcoming is the transitions. That at least lets us see whether controls are accessible in their different states, and it might be a good start for the main repo. These tests are indeed slow, but that is due to the nature of interaction tests; in the case of WinUI, they are still run for every PR even though they literally take an hour. So, to recap the previous comments: the next step would be to add a small button somewhere on the page that runs an Axe scan against the page?
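A "run an Axe scan against the page" button could look roughly like the sketch below. This is a hypothetical illustration, not a concrete proposal: it assumes the Axe.Windows automation package scanning the sample app's own process, and the handler name, page class, and output directory are all made up. The builder methods shown (`ForProcessId`, `WithOutputDirectory`) may differ by Axe.Windows version.

```csharp
// Hedged sketch of a "Scan for accessibility" button handler in the sample-app
// shell. It runs Axe.Windows against the current process and reports the
// number of accessibility errors found. All names here are illustrative.
using System.Diagnostics;
using Axe.Windows.Automation;
using Microsoft.UI.Xaml;

public sealed partial class ExperimentPage
{
    private void OnScanForAccessibilityClick(object sender, RoutedEventArgs e)
    {
        // Scan the app we are running in.
        var config = Config.Builder
            .ForProcessId(Process.GetCurrentProcess().Id)
            .WithOutputDirectory("AxeResults") // results file for later review
            .Build();

        var scanner = ScannerFactory.CreateScanner(config);
        var results = scanner.Scan();

        // Surface the result to the developer; an InfoBar or dialog would
        // be a friendlier choice than debug output in a real implementation.
        Debug.WriteLine($"Axe scan finished: {results.ErrorCount} error(s).");
    }
}
```

One design note: because the scan inspects the live UI Automation tree, the page being scanned has to be in the visual state you want to verify, which is why the per-state pages mentioned above matter.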
A core part of writing new components should be ensuring that they are accessible. While manual testing is good, automated tests like Axe that verify components are accessible can be very helpful during development.
The big question is when and where to automate accessibility testing. In my opinion, this boils down to the following possibilities:
WinUI 3
UWP
Options 1 and 2 from the WinUI 3 list will not work as easily, since Axe calls into code that is not allowed to run from the app container; implementing them would require a bridge that gets called and orchestrates Axe. While that could work, the cost and reward might not be in proportion, especially given the added complexity. This leaves the only other option:
Non-Windows Uno Platform
As of right now, I think this is out of scope for this project and proposal.
In my opinion, starting with a "Scan for accessibility" button on WinUI 3 would be the fastest way to add this, though it would still require manual testing in the form of someone running the app and clicking the button. Due to the instability surrounding WinUI 3 unit tests, right now might not be the time to add WinUI 3 option 2. Interaction tests for both UWP and WinUI 3 are considerably more complicated and require more infrastructure around the apps, though this would not be the first time that has been done. What are your thoughts on this? What do you think would give the biggest benefit?