Root cause Bandersnatch hitting Stale Cached JSON API Requests #4892
Comments
We have the same problem here: 2018-10-23 16:30:58,066 INFO: bandersnatch/3.0.0.dev0 (cpython 3.6.6-final0, Linux x86_64)
Another report of a stale cache: #5017.
Yet another: #5323.
@cooperlees checking in since it's been a few months -- you said on the Bandersnatch issue
You mean logging from the Bandersnatch side or from the PyPI side?
From the Warehouse logs. We should be able to record where requests from the "bandersnatch" User-Agent hitting this endpoint are coming from and see if that helps work out why this is happening. Bandersnatch logs stay on individuals' machines, so we cannot analyze them centrally the way we can with Warehouse.
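For anyone picking this up, here is a rough idea of what that logging could look like. This is a minimal sketch of a generic WSGI middleware, not Warehouse's actual code; the class name, logger name, and log fields are all made up for illustration:

```python
# Hypothetical sketch: log requests whose User-Agent mentions bandersnatch
# so stale-cache reports can be correlated with origin traffic.
import logging

logger = logging.getLogger("bandersnatch-requests")


class BandersnatchLoggingMiddleware:
    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "")
        if "bandersnatch" in user_agent.lower():
            logger.info(
                "bandersnatch request: path=%s client=%s ua=%s",
                environ.get("PATH_INFO"),
                environ.get("REMOTE_ADDR"),
                user_agent,
            )
        return self.app(environ, start_response)
```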
Does anyone have time to debug this? Do we want to remove the hack in pypa/bandersnatch#57, where Bandersnatch tries to clear stale caches?
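For context, the purge hack boils down to issuing an HTTP PURGE against the CDN when the JSON response looks older than the serial Bandersnatch already knows about. A minimal sketch of that approach, assuming the requests library (the helper name is hypothetical, not the exact code from pypa/bandersnatch#57):

```python
# Hypothetical sketch of the stale-cache purge approach.
import requests


def fetch_json_with_purge(url: str, expected_serial: int) -> dict:
    resp = requests.get(url, timeout=10)
    resp.raise_for_status()
    data = resp.json()
    # If the CDN returned data older than the serial we already know about,
    # ask the CDN to drop its cached copy, then refetch from origin.
    if data.get("last_serial", 0) < expected_serial:
        requests.request("PURGE", url, timeout=10)
        resp = requests.get(url, timeout=10)
        resp.raise_for_status()
        data = resp.json()
    return data
```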
Hi, I also have the same issue. I recently uploaded my first package to PyPI but it is not appearing in search. I tried searching by name with no luck. Then I filtered by license and sorted by date last updated, and I still cannot find the package. I am assuming this is not the expected behavior. Is there something I am missing, or is something wrong with my upload? Here is the link: https://pypi.org/project/contextualSpellCheck/
@R1j1t - I don't think your issue and this CDN stale cache issue are related ... This is about forcing the CDN to refetch from origin. I need to work out if this is still happening. I might try adding more verbose logging to bandersnatch around this and see if people notice.
With #13936 merged now (along with a few follow-up PRs to fix some deadlocks), I think the primary cause of this has been fixed. The tl;dr is that our mirroring relied on the serial being a monotonically increasing integer, but due to the way PostgreSQL works, concurrent transactions could end up with serials being "out of order". #13936 changes that so that transactions that generate new serial numbers are serialized behind what is effectively a mutex. I'm going to close this now, but if anyone sees any new reports of this happening after today, we can re-open this issue.
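For readers following along, the kind of serialization described above can be expressed with a transaction-scoped PostgreSQL advisory lock. The sketch below is illustrative only and not the actual change in #13936; the lock key and table name are made up:

```python
# Illustrative only: serialize serial-number generation behind a
# transaction-scoped advisory lock so serials cannot commit out of order.
import psycopg2

SERIAL_LOCK_KEY = 42  # arbitrary application-defined lock key (made up)


def next_serial(conn) -> int:
    with conn.cursor() as cur:
        # Blocks until any other transaction holding the same lock commits
        # or rolls back, so serial numbers are handed out one at a time.
        cur.execute("SELECT pg_advisory_xact_lock(%s)", (SERIAL_LOCK_KEY,))
        cur.execute("UPDATE journal_serial SET value = value + 1 RETURNING value")
        (serial,) = cur.fetchone()
    return serial
```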
Wow. Thanks Donald. Will holler if anyone sees anything.
Describe the bug
Summary: Bandersnatch installs hitting stale JSON API Data
Full Details: pypa/bandersnatch#56
Purge Hack: pypa/bandersnatch#57
Expected behavior
Bandersnatch not having to PURGE the cache.
To Reproduce
Unknown here, but I'd suggest adding logging / collection of more data about the Bandersnatch user-agent issuing PURGE requests.
My Platform
Bandersnatch with requests (EDIT: now aiohttp)