Make clear that opportunities and diagnostics do not factor into score #8717
Comments
Hi @halukkaramete, yes there is! See it here: https://github.com/GoogleChrome/lighthouse/blob/master/docs/scoring.md#how-are-the-scores-weighted
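To illustrate what that scoring doc describes (the metric names, weights, and scores below are made up for demonstration; the real values live in Lighthouse's scoring config), the performance score is simply a weighted average of the individual metric scores. Opportunities and diagnostics never enter the sum:

```javascript
// Illustrative only: these weights and metric scores are invented;
// see scoring.md for the real weighting.
const weights = { fcp: 0.2, speedIndex: 0.27, interactive: 0.33, fmp: 0.2 };
const metricScores = { fcp: 0.9, speedIndex: 0.8, interactive: 0.6, fmp: 0.85 };

// The performance score is a weighted average of the metric scores.
// No opportunity or diagnostic audit appears in this sum.
function performanceScore(scores, weights) {
  let total = 0;
  for (const [metric, weight] of Object.entries(weights)) {
    total += weight * scores[metric];
  }
  return Math.round(total * 100);
}

console.log(performanceScore(metricScores, weights)); // 76
```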
I also found this a common source of confusion, e.g. among management/business stakeholders, but even among developers using Lighthouse or PageSpeed Insights (PSI) for the first time. The Lighthouse HTML reports (neither v4 nor v5) don't explicitly call out that opportunities and diagnostics don't influence the score directly.

Proposed solution

I think it should be clearly stated that the score is purely based on metrics and that no other audit directly influences it. Of course, passing audits helps improve the metrics. The current metrics disclaimer (
I can cut a quick PR for this if you agree.

A note on PageSpeed Insights reports

PSI currently has the following text that links to the scoring page:
Even the PSI message is easily overlooked and not explicit enough. Messaging similar to the proposed Lighthouse message above could be adopted. Ideally, PSI and LH could also agree on terminology (i.e. speed vs. performance score).
Just realized there's another opportunity here. Certain audit results could probably be used to run another simulation assuming they're fixed, to estimate actual score gains. Probably not for all audits, though, and the definition of 'fixed' isn't that clear either... This would also definitely slow down execution time.
That's exactly how most of the opportunities compute the "estimated savings" time already :) We could extend this to the number of points the overall score might change by, but there are a few issues with this approach: making that connection explicit might promise too much to users, and it's not an exact science. Good idea though, we'll have to think about it!
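A rough sketch of that idea (all names, weights, and the "improved" metric score below are hypothetical): re-run the score computation pretending one metric improved, and report the point delta an opportunity might be worth. In reality the estimated savings are in milliseconds, which would first have to be mapped back through the metric's scoring curve, so this is a big simplification:

```javascript
// Hypothetical weights and scores, invented for illustration.
const weights = { interactive: 0.5, speedIndex: 0.3, fcp: 0.2 };

// Weighted-average score on a 0–100 scale.
function score(metricScores) {
  return Object.entries(weights)
    .reduce((sum, [metric, w]) => sum + w * metricScores[metric], 0) * 100;
}

const before = { interactive: 0.6, speedIndex: 0.8, fcp: 0.9 };
// Suppose fixing one opportunity is estimated to lift the
// interactive metric's score from 0.6 to 0.75.
const after = { ...before, interactive: 0.75 };

// Point delta the fix might be worth.
console.log((score(after) - score(before)).toFixed(1)); // "7.5"
```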
Thanks @patrickhulce! Re: the updated messaging for metrics - are you in favour of something like that?
There are a lot of things we need to balance when communicating the role of the metrics, and I'm not quite sure what specific path will be the best for us, but I'm definitely in favor of making it clearer that only the metrics impact the score. I've heard this point of confusion many, many times as well 👍
@csabapalfi we like this proposal. Let's doooo it |
Feature request summary
Fixing some items might be more beneficial than others, but we just don't know.
For example, is "deferring images" or "reducing TTFB" more bang for the buck :)?
At the end of the day, one should fix as many problems as possible, but I think knowing that insight would help. Or we could go in reverse: for example, when there's an 85 on performance, it would be nice to know how that 85 came about - 40 points from X, 20 points from Y, and so on. That too could be helpful. As time goes on, we'd get familiar with which items are more important than others.
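A hypothetical sketch of such a breakdown (the weights and metric scores here are invented for illustration): each metric contributes weight × metricScore × 100 points toward the overall score, so an 85 could decompose like this:

```javascript
// Invented weights and per-metric scores for a hypothetical 85 overall.
const weights = { interactive: 0.5, speedIndex: 0.3, fcp: 0.2 };
const metricScores = { interactive: 0.8, speedIndex: 0.9, fcp: 0.9 };

// Each metric's point contribution: weight × its 0–1 score × 100.
for (const [metric, weight] of Object.entries(weights)) {
  const points = weight * metricScores[metric] * 100;
  console.log(
    `${metric}: ${points.toFixed(1)} of ${Math.round(weight * 100)} possible points`
  );
}
// interactive: 40.0 of 50 possible points
// speedIndex: 27.0 of 30 possible points
// fcp: 18.0 of 20 possible points
// → 40 + 27 + 18 = 85 overall
```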
What is the motivation or use case for changing this?
How is this beneficial to Lighthouse?