Improve search performance #129
This bothers me a bit. Is the performance really that terrible?
I would even dare to say: since the data we send in each response is not all that large, we could consider having a larger "window" in the back-end and paginating it in the front-end. That may increase implementation complexity in the front-end, but it could help with potential performance issues if that were the core focus. Since the problem only appears after several tens of pages, and our goal is to have many filtering possibilities, I don't see us reaching that level of active offers quickly enough to justify losing random page access.
Nothing in this issue is really relevant at our current scale. However, I have found that in NoSQL databases the offset + limit pattern is not the way to go, hence the creation of this issue. It's a learning opportunity if someone wants it. Regarding the "cannot jump to page x" problem: that is not a concern in this scenario, since random page access is not a use case of the application. We use "infinite scroll", so we load results continuously.
Currently, searching many offers relies on an offset + limit combination to simulate pagination of results. This can be slow, and we might lose results if elements are added or removed between page changes. I propose switching to keyset-based search.
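To illustrate the consistency problem described above, here is a minimal sketch in plain Python (not the project's actual stack; the offer list and names are hypothetical). The slicing mirrors MongoDB's skip() + limit(): when a document is removed between page loads, the offsets shift and an item is silently skipped.

```python
# Hypothetical sorted collection of offer IDs.
offers = [f"offer-{i:02d}" for i in range(10)]

def page(data, offset, limit):
    """Offset/limit pagination, analogous to MongoDB's skip() + limit()."""
    return data[offset:offset + limit]

first = page(offers, 0, 3)    # pages 1: offer-00 .. offer-02
offers.pop(0)                 # an offer is deleted between page loads
second = page(offers, 3, 3)   # page 2 starts at a shifted offset

print(first)   # ['offer-00', 'offer-01', 'offer-02']
print(second)  # ['offer-04', 'offer-05', 'offer-06'] -- offer-03 was skipped
```

The same shift in the other direction (an insertion before the current offset) would instead repeat an item across pages.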
To the one implementing this, please read this article, which explains it very well: https://medium.com/swlh/mongodb-pagination-fast-consistent-ece2a97070f3
The basic idea is to take advantage of ObjectIds being naturally sorted: we use the ID of the last returned object as the threshold for fetching the next page (the next page contains the elements whose IDs are greater than the last element of the previous page).
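The keyset approach can be sketched the same way, again in plain Python with hypothetical data; in MongoDB this would be a find({"_id": {"$gt": last_id}}) query with a limit, relying on an index on _id. Because the next page is anchored to a value rather than a position, a deletion no longer shifts the results.

```python
# Hypothetical collection, sorted by ID (as ObjectIds naturally are).
offers = [f"offer-{i:02d}" for i in range(10)]

def keyset_page(data, last_id, limit):
    """Return up to `limit` items with IDs strictly greater than last_id.

    Analogous to: collection.find({"_id": {"$gt": last_id}}).limit(limit)
    """
    return [x for x in data if last_id is None or x > last_id][:limit]

first = keyset_page(offers, None, 3)        # offer-00 .. offer-02
offers.pop(0)                               # deletion between page loads
second = keyset_page(offers, first[-1], 3)  # offer-03 .. offer-05, nothing lost
```

Note the trade-off mentioned in the comments: since each page is defined by the previous page's last ID, you cannot jump directly to page x, which is fine for infinite scroll.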