Disallow indexing of a few pages for crawling robots #6283
yangwao added labels: $, ~<50usd, p1 preventing everyone from using app, A-seo-search-socials (Jun 19, 2023)
imo an update on
👋
ASSIGNED - @floyd-li 🔒 LOCKED -> Wednesday, June 21st 2023, 14:07:10 UTC -> 36 hours
maybe it needs some time to take effect?
Closed
Currently, we are fighting to get good content ranked in Google Search and gain a bit of control. We received advice to drop a lot of "bad" pages that are not interesting to bots, which should raise our quality in their eyes and speed up re-index turnaround times.
Pages we should disallow for robots:
- beta is currently 60% of URLs
- nft. has a fair portion
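Note that robots.txt is evaluated per host, so the beta and nft. subdomains would each need to serve their own file. A minimal sketch of a blanket disallow for such a subdomain (hypothetical, not the project's actual config):

```
# robots.txt served on the subdomain to be excluded (hypothetical sketch)
User-agent: *
Disallow: /
```

Keep in mind a `Disallow` rule only stops crawling; URLs that are already indexed may need a `noindex` directive (which the crawler must be able to fetch) before they drop out of the index.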
We will follow up with pre-render integration, as I suspect the culprit is that CF doesn't do it.
Ref
https://support.google.com/webmasters/answer/7440203#crawled
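The reference above covers the "crawled - currently not indexed" status. For pages that should be removed from the index rather than merely blocked from crawling, Google honors a `noindex` directive, delivered either as a meta tag or as an `X-Robots-Tag` response header; a minimal sketch:

```html
<!-- In the page <head>: ask robots not to index this page -->
<meta name="robots" content="noindex">
```

The header equivalent is `X-Robots-Tag: noindex`. Google has to crawl the page to see either directive, so combining `noindex` with a robots.txt `Disallow` on the same URL is self-defeating.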