
H44: Radio button label positioning #1005

Closed
standardspace opened this issue Jan 8, 2020 · 12 comments

standardspace commented Jan 8, 2020

Checking the success criteria after the following markup failed an automated accessibility test.

<label for="radio-1">
   <input type="radio" name="search_option" id="radio-1" value="" checked="checked">
   All
</label>

For context, I chose to wrap the input as I wanted to make use of the label focus to set the (visually hidden) input checked state.
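A minimal sketch of that pattern (the .visually-hidden and .option-text class names are illustrative, not from the component under discussion):

```html
<!-- Sketch only: class names are illustrative -->
<style>
  /* Keep the input focusable and announceable while hiding it visually */
  .visually-hidden {
    position: absolute;
    width: 1px;
    height: 1px;
    overflow: hidden;
    clip: rect(0 0 0 0);
    white-space: nowrap;
  }
  /* Reflect the hidden input's focus and checked state on the visible text */
  input:focus + .option-text { outline: 2px solid; }
  input:checked + .option-text { font-weight: bold; }
</style>

<label for="radio-1">
  <input class="visually-hidden" type="radio" name="search_option" id="radio-1" value="" checked="checked">
  <span class="option-text">All</span>
</label>
```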

The failure was due to:

Note that the label is positioned after input elements of type="checkbox" and type="radio".

https://www.w3.org/TR/WCAG20-TECHS/H44.html
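For reference, the positional note in H44 amounts to something like the following (ordering only; the ids here are illustrative):

```html
<!-- H44's layout guidance: label before text inputs,
     label after checkboxes and radio buttons -->
<label for="name">Name:</label>
<input type="text" id="name" name="name">

<input type="radio" id="opt-all" name="search_option" value="">
<label for="opt-all">All</label>
```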

I'm wondering about the reasoning behind this. To my mind the input is both implicitly and explicitly labelled, so I fail to see the need for this extra requirement for radios and checkboxes when standard labelling would appear to be sufficient. Am I missing something?

JAWS-test commented Jan 8, 2020

Your testing tool is not perfect. Firstly, the position of the label in H44 is only a recommendation, and secondly, the position is correct in your case (first the radio button, then the label).

If you are using an enclosing label, you can omit the for attribute from the label.
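That is, with an enclosing label the association is implicit and the for/id pairing can be dropped, e.g.:

```html
<!-- Implicit labelling: the input is associated with the label
     that wraps it; no for/id pairing needed -->
<label>
  <input type="radio" name="search_option" value="" checked="checked">
  All
</label>
```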

standardspace (Author) commented:

Thanks @JAWS-test for your response.

Yes, I get that no automated test will be (anything like) perfect. The tool however quotes H44 as its source and on checking H44, the tool appears to adhere to the letter of the recommendation. I've tested my component extensively and I'm pretty sure it's as accessible as I can make it. I'm rather questioning the reasoning behind the recommendation.

Previous versions made reference to the Window-Eyes 5.5 screen reader not being able to understand implicit labelling, so I have typically used "for" as a belt-and-braces approach when nesting inputs within labels. Given that Window-Eyes 5.5 is a commercial software product from 2005, the justification (which is missing from the latest version of H44) looked weak in 2018, let alone 2020.

Is there still a case for this aspect of the recommendation? If so, it would be good to know so I can update my markup and improve accessibility. If not, the recommendation as applied by automated tools causes a minor issue.

JAWS-test commented:

Regarding your testing tool:

  • A violation of a WCAG recommendation is not automatically a violation of a WCAG SC. It is not mandatory to follow all recommendations in order to comply with WCAG.
  • Also, in your case the recommendation is not violated, because the label text comes after the radio button (at least in the source code; the visual presentation is probably not checked by the tool).

Regarding label for and enclosing label:

  • There are always bugs in browsers and screen readers that occur in connection with label
  • These are not limited to old versions of Windows Eyes
  • According to WCAG you are not responsible for the bugs of the browsers and AT
  • If your client requires not only compliance with WCAG but also actual accessibility, a test environment should be defined and the site empirically tested in it.


awkawk commented Jan 8, 2020

According to WCAG you are not responsible for the bugs of the browsers and AT

This isn't accurate. Accessibility Supported requires that you use techniques that actually work. You may make a conformance claim that references Windows 10, NVDA, and other current tools and not include Window-Eyes 5.5 in that claim, and as long as there is accessibility support in the browsers and AT that are part of your conformance claim, that claim can stand up.

If there was a major issue in a major browser rendering engine that defaulted to hiding the keyboard focus for interactive controls, for example, authors would need to find a way to restore that even if this was regarded as a bug in the browser (again, unless the conformance claim didn't include that major browser rendering engine).
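As an illustration (not from the thread), an author-supplied focus style that re-asserts a visible indicator might look like:

```html
<style>
  /* Illustrative only: restore a visible keyboard focus indicator
     if a UA default were missing or suppressed */
  a:focus,
  button:focus,
  input:focus {
    outline: 2px solid currentColor;
    outline-offset: 2px;
  }
</style>
```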

@standardspace - maybe the issue in error reporting is due to your opening label element being incorrect? Not sure if that is copied/pasted directly...

standardspace (Author) commented:

Typo fixed now. Yes, I cleaned up the code for the question and deleted a bit too much. Corrected now.
Thanks for your input - useful guidance on the reporting approach.

JAWS-test commented Jan 8, 2020

Hi @awkawk

Accessibility Supported requires that you use techniques that actually work

that surprises me greatly, because

  • whenever I have raised a UA or AT problem (e.g. in "Error of the User Agents part of WCAG or not" #866), all parties involved have stated that authors cannot be held responsible for browser and AT bugs and that there is therefore no violation of WCAG
  • WCAG does not require, in its SCs, any empirical testing with specific browsers and specific AT.

Also in the Conformance Claim the specification of a test environment is only optional.

The only thing the WCAG requires is Accessibility-Supported Ways of Using Technologies. However, I would say that the label element is accessibility supported, and problems encountered here are not violations of the WCAG.


awkawk commented Jan 8, 2020

Conformance claims are optional (you don't need to make a conformance claim), but #5 is "A list of the Web content technologies relied upon". The accessibility-supported ways of using technologies need to align with the relied on technologies.

JAWS-test commented Jan 8, 2020

"Web content technologies" are not test environments (like AT and UA), but things like "HTML, CSS, SVG, PNG, PDF, Flash, and JavaScript".

In the above example with the radio buttons, I would only have to state that I rely on the web technology HTML (and assume that, of course, all browsers and screen readers interpret HTML correctly).


awkawk commented Jan 8, 2020

:) right. Hasty answer.

Really, conformance criterion 4 is the main thing - you can't pass an SC without it.

JAWS-test commented Jan 8, 2020

Exactly. I already mentioned CC4 above and stressed that the label element certainly fulfills CC4, even if there are some bugs in screen readers, right?

If not, I would suggest that CC4 be clarified. Even when I read https://www.w3.org/TR/WCAG21/#dfn-accessibility-supported and https://www.w3.org/TR/UNDERSTANDING-WCAG20/conformance#uc-accessibility-support-head, it is not clear to me when something is considered accessibility supported.


alastc commented Jan 20, 2020

@standardspace The techniques show ways to pass the criteria, but there can be other ways to pass as well. (I.e. they are a subset of possible ways to pass.)

If you have tested it and are confident it works (and from memory, it looks fine to me), then I would ignore the automated test. I wouldn't fail that in a manual test if it worked with common browsers & AT.

You might even report it as a false positive to the provider; they will probably either agree and adjust the test, or disagree, in which case you might find out why they consider it a failure. I've done that before in another case, and we discovered the check was there due to an old IE6 problem, so the test was dropped.


fstrr commented May 10, 2024

This issue looks like it can be closed. If it needs to be re-opened, please do that and convert it to a Discussion.

fstrr closed this as completed May 10, 2024