
tests(report): add axe-core validation of report output #9421

Merged · 1 commit · Jul 30, 2019

Conversation

@johnemau (Contributor) commented Jul 20, 2019

Summary

Adds a test that runs axe-core validation on the output of a sample report; any violations are logged as the assertion failure message.

Example failure output: [screenshot: axe-core failure message]

Axe-core is slow, so I needed to bump the timeout to 10 seconds.
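
For reference, here's a minimal sketch of what such a test can look like, assuming a Jest jsdom test environment. The ReportGenerator path, its generateReportHtml method, and the sample-results file are illustrative stand-ins and may not match the exact files this PR touches:

    const axe = require('axe-core');
    const ReportGenerator = require('../../report/report-generator.js'); // hypothetical path
    const sampleResults = require('../results/sample_v2.json'); // hypothetical path

    describe('report accessibility', () => {
      it('passes axe-core validation', async () => {
        // Render the sample report and hand the markup to jsdom's global
        // document (jsdom tolerates a full document string here).
        document.documentElement.innerHTML =
            ReportGenerator.generateReportHtml(sampleResults);

        // axe-core inspects the global document it was loaded into.
        const results = await axe.run(document);

        // Surface every violation in the assertion message so failures are readable.
        const violationMessages = results.violations
            .map(v => `${v.id}: ${v.help}`)
            .join('\n');
        expect(violationMessages).toEqual('');
      }, 10000); // axe-core is slow; bump the per-test timeout to 10 seconds.
    });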

@johnemau changed the title from tests(reports): add axe-core validation of report output to tests(report): add axe-core validation of report output on Jul 20, 2019
@johnemau (Contributor, Author)

I am not sure how this slowed down boot time when all it does is add a test; does the boot time come from test runs?

 ✘ difference at bootup-time audit.details
              expected: {"items":{"0":{"scripting":">1000"}}}
                 found: undefined
          found result:
      {
        "id": "bootup-time",
        "title": "JavaScript execution time",
        "description": "Consider reducing the time spent parsing, compiling, and executing JS. You may find delivering smaller JS payloads helps with this. [Learn more](https://developers.google.com/web/tools/lighthouse/audits/bootup).",
        "score": null,
        "scoreDisplayMode": "error",
        "errorMessage": "Fatal trace logic error - expected start event, got X"
      }

@connorjclark (Collaborator)

@johnemau iirc that particular test is flaky. Let's see if it happens again on your next push.

@brendankenny (Member)

I'm not sure about the fidelity of the axe tests when run against jsdom, since most of styling and layout isn't simulated. Some structural stuff will be caught, but it might be easier to run this as one of the smoke tests against a generated report (one of which is generated for at least the now.sh deployment)
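
A quick illustration of that fidelity gap, assuming Node with the jsdom package available: jsdom parses markup and inline styles, but it performs no layout, which is why checks like color-contrast have little real to measure.

    const {JSDOM} = require('jsdom');

    const {window} = new JSDOM('<p style="color: #eee; background: #fff;">low contrast</p>');
    const p = window.document.querySelector('p');

    // Inline styles are visible to getComputedStyle...
    console.log(window.getComputedStyle(p).color); // e.g. rgb(238, 238, 238)

    // ...but layout is never computed, so all geometry comes back zeroed.
    console.log(p.getBoundingClientRect().width); // 0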

@connorjclark (Collaborator)

> smoke tests against a generated report

❤️

@googlebot

We found a Contributor License Agreement for you (the sender of this pull request), but were unable to find agreements for all the commit author(s) or Co-authors. If you authored these, maybe you used a different email address in the git commits than was used to sign the CLA (login here to double check)? If these were authored by someone else, then they will need to sign a CLA as well, and confirm that they're okay with these being contributed to Google.
In order to pass this check, please resolve this problem and have the pull request author add another comment and the bot will run again. If the bot doesn't comment, it means it doesn't think anything has changed.


@johnemau (Contributor, Author)

I made a web edit and forgot that it would default to my private GitHub email, which triggered the CLA bot message. I will rebase the branch on Monday with my corrected author email to resolve the issue.

@johnemau (Contributor, Author)

> I'm not sure about the fidelity of the axe tests when run against jsdom, since most of styling and layout isn't simulated. Some structural stuff will be caught, but it might be easier to run this as one of the smoke tests against a generated report (one of which is generated for at least the now.sh deployment)

If that is a better approach I can move this over to the smoke tests.

I didn't look closely at the smoke tests before starting this work; instead, I treated adding this test as a learning exercise in how the reports are generated.

@googlebot

CLAs look good, thanks!


@johnemau (Contributor, Author)

@brendankenny I took a look at the smoke tests. Since they currently require a pre-checked-in test fixture, would it be preferred to check in a snapshot of the report view once and update it periodically, or to add a step to the smoke-test infrastructure that generates a report on the fly?

If we want to continue down that road it might be better to consider it a new kind of test altogether, because axe-core relies on WebDriver.

That would definitely catch more types of issues (at the cost of runtime/flakiness), and it depends on what the project's appetite is for WebDriver.

I think this is a good starting point, with a TODO to replace it with a WebDriver-based test in the future.

@brendankenny (Member)

> @brendankenny I took a look at the smoke tests. Since they currently require a pre-checked-in test fixture, would it be preferred to check in a snapshot of the report view once and update it periodically, or to add a step to the smoke-test infrastructure that generates a report on the fly?

Yes, I think adding that step would be the best fix available. Lighthouse runs axe-core against any site, and the smoke tests are already set up to run Lighthouse against a URL and assert on the results, so all the pieces are in place if we can get a report built.

yarn now-build builds a set of reports from the checked-in sample_v2.json, so that could be the thing we use. #9434 moves the generated reports to a nicer directory (dist/now/).

So I think we'd need:

  • add a new report (or whatever) target to smoke-test-defns.js
  • add a way for run-smoke.js to run lighthouse-core/scripts/build-report-for-autodeployment.js if it gets report as one of the requested types of smoke tests to run
  • add a special path to static-server.js to serve files from the generated report directory (similar to the filePath.startsWith('/dist/viewer') special case for viewer files; a rough sketch follows below)
  • add config/expectations for the report type of smoke tests

The ugliest part of this is the second bullet point, but I can't think of a great way around it that wouldn't require anyone wanting to run the smoke tests to always run yarn now-build && yarn smoke.

Anyone else have thoughts on this?
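
A rough sketch of the static-server.js special case from the third bullet, modeled on the existing /dist/viewer handling; the variable names and the dist/now output directory are assumptions based on this thread and #9434, not the file's actual contents:

    const path = require('path'); // already available in static-server.js

    // Inside the request handler, alongside the existing
    // filePath.startsWith('/dist/viewer') special case for viewer files:
    if (filePath.startsWith('/dist/now')) {
      // Serve the reports generated by `yarn now-build` from the repo root
      // instead of the usual test-fixtures directory.
      absoluteFilePath = path.join(__dirname, '../../../', filePath);
    }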

@brendankenny (Member)

(spoiler: we also fail the contrast check)

Other benefits of this approach:

  • we can easily do a11y testing in other states, e.g. make sure a report filled with error messages, error icons, warning boxes, etc. still passes (and test with other languages, though I'm not sure we'd get much different there except maybe RTL languages)
  • we've wanted a report smoke test for a while to catch e.g. performance regressions or other bad behavior, so we can easily expand this beyond a11y audits in the future

@exterkamp (Member) left a comment

LGTM

@johnemau (Contributor, Author)

> (spoiler: we also fail the contrast check)
>
> Other benefits of this approach:
>
> • we can easily do a11y testing in other states, e.g. make sure a report filled with error messages, error icons, warning boxes, etc. still passes (and test with other languages, though I'm not sure we'd get much different there except maybe RTL languages)
> • we've wanted a report smoke test for a while to catch e.g. performance regressions or other bad behavior, so we can easily expand this beyond a11y audits in the future

Makes sense to me, but it will take me a while to get ramped up on the smoke-test infrastructure and get any PRs out.

In the meantime this test is ready to go and will provide some basic coverage, so I will leave this PR open in case anyone wants to move forward with it / merge it.

@patrickhulce (Collaborator) left a comment

Agreed @johnemau! This is a great start, and a smoke test seems like a perfect candidate for a future follow-up.
