Oftentimes the issue harming SEO is site performance. Technically speaking, the fault could lie with the developer (unoptimized scripts, uncompressed images, etc.), but it could also lie with content editors who inject things like tracking scripts or third-party HTML into the page. Sometimes it's the hosting provider or configuration not setting cache TTLs on static/media files.
It could be useful to run some of the open source Lighthouse tools on a page of a Wagtail site, and then curate the results for the editor, explaining them in non-technical terms. Problems could be grouped by who can fix them, making that the first line of communication to the team, rather than the first line being "Emergency: Google says our site has a low score! Fix it now!".
In my experience, what usually happens is that a site owner/editor sees a low score pop up in Google Search Console and immediately blames either Wagtail or the developer. Numbers labeled "good"/"bad" can put high tension on a team - especially when the team can't do anything about said numbers. If editors could proactively monitor this, it could reduce tensions.
For example:

1. A marketer adds a tracking script to the site.
2. Google crawls it and lowers the site's score.
3. The marketer is alarmed by this and immediately blames the developer, or Wagtail.
4. The developer may try to explain the problem, but it is viewed as simply "passing the buck" or hand-waving to avoid responsibility.
Often what we need to do in these situations is break down the Google PageSpeed / Lighthouse score and create a manual report showing which metrics are important. This is a very time-consuming task. If our tools were able to clearly provide such a report, in a more non-technical format, it could be a big help to editors in understanding their site better.
Similar to how Wagtail does an accessibility check (and soon a content check), we could do a performance check using open source frontend tools: https://github.com/GoogleChrome/lighthouse/blob/main/docs/readme.md#using-programmatically
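To illustrate the "group problems by who can fix them" idea, here is a minimal Python sketch. It assumes we already have a Lighthouse JSON report (e.g. produced by the Lighthouse CLI with `--output=json`); the `AUDIT_OWNERS` mapping and the role names ("developer", "editor", "hosting") are my own assumptions, not anything Lighthouse or Wagtail provides, though the audit IDs themselves are real Lighthouse audit names:

```python
"""Sketch: group failing Lighthouse audits by the role best placed to fix them.

The ownership mapping below is a hypothetical example; a real implementation
would need a much fuller (and configurable) mapping.
"""

# Assumed mapping from Lighthouse audit IDs to a responsible role.
AUDIT_OWNERS = {
    "unminified-javascript": "developer",
    "unused-javascript": "developer",
    "uses-text-compression": "developer",
    "uses-optimized-images": "editor",
    "uses-responsive-images": "editor",
    "third-party-summary": "editor",
    "uses-long-cache-ttl": "hosting",
}


def group_audits_by_owner(report: dict) -> dict:
    """Given a parsed Lighthouse JSON report, return failing audits
    grouped by the role that can address them.

    Lighthouse audit scores are floats in the range 0..1 (or None for
    informational audits); here anything below 0.9 counts as "failing".
    """
    grouped: dict = {}
    for audit_id, audit in report.get("audits", {}).items():
        score = audit.get("score")
        if score is not None and score < 0.9:
            owner = AUDIT_OWNERS.get(audit_id, "developer")
            grouped.setdefault(owner, []).append(audit.get("title", audit_id))
    return grouped
```

The per-role groups could then be rendered in the page editor as plain-language panels ("These items need your hosting provider", "These items are content changes you can make"), which is the non-technical framing this issue is asking for.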