Due to the nature of the distributed index, it becomes increasingly difficult to fetch results as the page number grows.
Probably for this reason, Google appears to limit queries to roughly 300-400 results. There is also a good argument that few users are willing to go much further.
With the current default of 25 results per page, should we add a maximum page number of 20?
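To make the proposal concrete, here is a minimal sketch of what such a cap could look like. The names `MAX_PAGE`, `RESULTS_PER_PAGE`, and `clamp_page` are hypothetical, not taken from the actual codebase:

```python
# Hypothetical sketch of the proposed cap: 25 results/page, max page 20,
# i.e. at most 500 results reachable per query.
MAX_PAGE = 20
RESULTS_PER_PAGE = 25

def clamp_page(requested_page: int) -> int:
    """Clamp a 1-based page number into the allowed range [1, MAX_PAGE]."""
    return max(1, min(requested_page, MAX_PAGE))
```

A request for page 50 would then be served page 20 instead of attempting an expensive deep fetch against the distributed index.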
Google has a max of 300-400 results because of the way it executes queries: each shard returns only a handful of candidate answers. All web-scale indexes are built that way, with many shards. Only a few percent of searches go to the 2nd page, although I think around 80% of people doing searches occasionally go to the 2nd page. The fraction of non-robots that go to deep pages is vanishingly small. I don't think there's any reason to go beyond 50 results, especially if you think your robot defenses are weak.
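The shard-merge pattern described above can be sketched as follows. This is an illustrative toy, not the actual query path of any particular engine; each "shard" is just a list of scores, and `k` stands in for the small per-shard candidate budget:

```python
import heapq

def merged_top_k(shards, k):
    """Each shard contributes only its own top-k candidates; the merger
    keeps the global top-k. Because no shard ever returns more than k
    results, pages beyond the first few cannot be served without
    re-querying every shard with a larger budget - which is why deep
    pagination is expensive in a sharded index."""
    candidates = []
    for shard in shards:
        candidates.extend(heapq.nlargest(k, shard))  # per-shard top-k only
    return heapq.nlargest(k, candidates)             # global merge
```

With a per-shard budget of a few hundred candidates, the reachable result set naturally tops out around the 300-400 mark regardless of how many documents actually match.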