In GitLab by @fbastian on Feb 12, 2018, 17:19
Suggestions from Iakov about managing bots making too many requests:
- `Crawl-delay` in robots.txt for Yahoo and Bing (if they are the problem); see the robots.txt sketch after this list.
- Google Webmaster Tools (now Search Console) lets you set a crawl-rate limit for Google, which ignores `Crawl-delay`.
- HTTP code 429 + a `Retry-After` header for nasty bots (if they are useless, just disallow `/*`); see the filter sketch below.
- Questionable: robots.txt + `Disallow` for certain pages.
- Advanced (perhaps we already discussed this): if I were you, I would a) cache all standard page elements, and b) save all gene pages to files (e.g. excluding header/footer). It is also possible to cache SQL query results if you have enough HDD/SSD space; see the caching sketch below.
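A minimal robots.txt sketch combining the crawl-delay and disallow points above; the delay values, bot names, and paths are placeholders, not actual Bgee URLs:

```
# Ask Yahoo (Slurp) and Bing to wait between requests (in seconds).
# Note: Googlebot ignores Crawl-delay; use Search Console instead.
User-agent: Slurp
Crawl-delay: 10

User-agent: bingbot
Crawl-delay: 10

# Hypothetical example of keeping all crawlers away from expensive pages.
User-agent: *
Disallow: /some/expensive/page/

# Blocking a useless bot entirely.
User-agent: SomeUselessBot
Disallow: /
```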
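A rough sketch of the 429 + `Retry-After` idea as a servlet filter. The per-IP counter, one-minute window, and threshold are made up for illustration; the actual Bgee servlet setup may look different:

```java
import java.io.IOException;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;
import javax.servlet.*;
import javax.servlet.http.HttpServletResponse;

/** Very naive per-IP rate limiting: count requests in a fixed one-minute window. */
public class RateLimitFilter implements Filter {

    private static final int MAX_REQUESTS_PER_MINUTE = 60; // arbitrary threshold
    private final Map<String, AtomicInteger> counts = new ConcurrentHashMap<>();
    private volatile long windowStart = System.currentTimeMillis();

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        long now = System.currentTimeMillis();
        if (now - windowStart > 60_000) { // reset all counters every minute
            counts.clear();
            windowStart = now;
        }
        String ip = req.getRemoteAddr();
        int n = counts.computeIfAbsent(ip, k -> new AtomicInteger()).incrementAndGet();
        if (n > MAX_REQUESTS_PER_MINUTE) {
            HttpServletResponse resp = (HttpServletResponse) res;
            resp.setStatus(429);                 // Too Many Requests
            resp.setHeader("Retry-After", "60"); // ask the client to come back later
            return;
        }
        chain.doFilter(req, res);
    }

    @Override public void init(FilterConfig cfg) {}
    @Override public void destroy() {}
}
```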
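And a sketch of point b) of the last item: writing the rendered gene-page HTML (without header/footer) to a file and serving it from disk on later hits. The cache directory and the `renderGenePage()` call are hypothetical placeholders for the existing page-generation code:

```java
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;

/** File-based cache for the expensive part of a gene page (everything but header/footer). */
public class GenePageFileCache {

    private final Path cacheDir;

    public GenePageFileCache(Path cacheDir) throws IOException {
        this.cacheDir = Files.createDirectories(cacheDir);
    }

    /** Returns the cached HTML fragment, generating and storing it on a cache miss. */
    public String getOrRender(String geneId) throws IOException {
        Path file = cacheDir.resolve(geneId + ".html");
        if (Files.exists(file)) {
            return new String(Files.readAllBytes(file), StandardCharsets.UTF_8);
        }
        String html = renderGenePage(geneId); // hypothetical: the real rendering logic
        Files.write(file, html.getBytes(StandardCharsets.UTF_8));
        return html;
    }

    private String renderGenePage(String geneId) {
        // Placeholder for the real rendering (DB queries + templating).
        return "<div>gene " + geneId + "</div>";
    }
}
```

The same pattern would apply to caching SQL query results, keyed by the query and its parameters, if disk space allows.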