`if logged_in` wrapped with `nocache` condition: Memory usage gets bigger and bigger #7039
Hm. This seems to be a general issue with all nocache tags, the …
We had two site outages over the weekend after introducing the {{ nocache }} tag on a half-measure statically cached site.
Do you see any errors in the log? Like …
@arthurperton after about 10 refreshes it says: …
Can you provide the actual template where you're using the nocache tag?
layout.antlers.html
default.antlers.html
Btw: my jobs collection for testing has 292 entries.
Thanks!
Anything new on this one? This needs to be addressed, as it more or less makes Static Caching unusable.
Nothing? We are running a relatively small site, about 100-200 pages in total, and the load is insane. The site is unusable without static caching, since load times explode to 10 seconds or so. With full measure caching everything is fine, besides the fact that warming the static cache with …
Do more get created on every request, or only on the first request for a page?
@jasonvarga …
Do you even have static caching enabled?
We do have static caching enabled, and after a few days of usage the nocache directory looks like this:

```
$ ls -l storage/framework/views/nocache/ | wc -l
521017
```

Again the … Here is the info from …
After warming the cache there are only a few:

```
$ ls -l storage/framework/views/nocache/ | wc -l
303
```

We have 128 URLs right now and use the …
Maybe this is another topic, but wouldn't it be nice to have the possibility to specify a key? Something like:

```
{{ nocache key="announcements" }}
    // output dynamic announcements here
{{ /nocache }}
```
I was directing the question to @j3ll3yfi5h. Since more stuff is generated on every request, it seems like maybe Static Caching wasn't enabled at all. Especially since you say you can sit there and hit refresh and watch the files grow. If the page was cached, there shouldn't have been another …
This is sorta the exact topic. 😄 The reason so many files are being generated is that keys are unique for each instance of the nocache tag. We give each region a unique ID so that if you're in a loop, we can be sure that instance of the loop gets its proper data. (See commit.) A separate file is generated even if it's identical to another. It should be possible to re-use the files but store the data uniquely. That would only solve the issue of there being lots of temporary view files, though. I'm not 100% sure the number of files is the actual issue. I wonder if the array of cached URLs ends up being the problem, maybe if the URLs are being hit with different variations of query parameters. Would you be able to run …
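If anyone else wants to check the same thing, here is a minimal sketch for inspecting that array from `php artisan tinker`, assuming it lives under the `nocache::urls` cache key that a workaround later in this thread uses:

```php
// Run inside `php artisan tinker`; assumes the URL list lives under the
// `nocache::urls` cache key (see the workaround later in this thread).
use Illuminate\Support\Facades\Cache;

$urls = Cache::get('nocache::urls', []);

count($urls);               // total number of recorded URLs
count(array_unique($urls)); // a lower number here means there are duplicates
```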
I ran …
That's probably it, then. 😄 Can you look at the values in there? Are there any duplicates?
@jasonvarga Hm, I hope I had. But without enabling it, it definitely happens... 😅 I'm not sure anymore if I missed this during testing, sorry!
@jasonvarga Am I right that the nocache folder is never emptied by any artisan command? It just gets bigger again, because after clearing the cache new files are generated?
There's currently nothing that would delete files from the nocache directory, so that'll just keep growing. We should add some garbage collection for that, but I'm not convinced the number of files in that directory is the issue.
Absolutely, I checked some of the files and they contain recurring contents. There are …
A million redundant files is a problem for sure. As described above, deleting these files and rebuilding the static cache fixes the problems for a few days.
Nothing attached.
Ok, yes, it's definitely a problem too, but it sounds to me like the size of that cache file becomes a problem before the files do. It's chewing up memory when it needs to read and write that growing array over and over.
Sorry, here is the screenshot. Just in case: there are …
Oh, I thought it was going to be a screenshot of the cached array. I believe you that there are lots of files.
Here is a screenshot of part of the array. The array seems to be blown up by non-existing URLs, which is expected behavior afaik, even though this seems to be some kind of brute-force vector if this is what messes up the site. The pages like … I cannot tell if the array had an enormous size when the site broke down the last time, but I will check next time.
Little update, 5 more days down the road: …
I'd be very interested to learn about the underlying issue here, since we've run into the same problem. From what I've read here, I'd also argue that the amount of files doesn't seem to be the limiting factor. Our memory exhaustion error occurs just like described above, in FileStore.php, line 77 (see file on GitHub).

The issue occurred after the website had been live for a couple of weeks. We had to clear the cache via … We're using a nocache partial on some sub-pages and in the navigation of our website, which is included on every page. The website does not have very many pages (< 50), not much going on there.

The problem occurred with caching completely disabled at first. Now we've enabled 'half' caching and are waiting to see what happens. Haven't been able to trigger the error. Not very much fun to anticipate the customer's website going down, tho. 🫣
We ran into this issue big time in a recent project (edit: using half static caching). There is a good amount of traffic on that page and the server response time got worse and worse within minutes.

As observed by others, the cause of the problem doesn't look to be the amount of files in the nocache directory, but the ever-growing array in the cache. Same as with the files, this array never gets cleaned, and with every request to a page, data for every … What currently makes the application work fine is running this command every five minutes:

```php
collect(Cache::get('nocache::urls', []))->each(function ($url) {
    Cache::forget('nocache::session.' . md5($url));
});
```

This effectively clears the whole … Obviously we would prefer to have a more appropriate solution than this.
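For anyone wanting to automate that, here is a minimal sketch of running the same cleanup through Laravel's standard scheduler, assuming the `nocache::urls` / `nocache::session.*` cache keys from the snippet above:

```php
<?php

// app/Console/Kernel.php -- a sketch; the cache keys are the ones from
// the workaround snippet above, everything else is stock Laravel.

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;
use Illuminate\Support\Facades\Cache;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule)
    {
        $schedule->call(function () {
            // Forget the per-URL nocache session payload for every cached URL.
            collect(Cache::get('nocache::urls', []))->each(function ($url) {
                Cache::forget('nocache::session.' . md5($url));
            });
        })->everyFiveMinutes();
    }
}
```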
The proposal I made a while back would not help with the problem that the cache never gets cleared, but it could prevent hundreds or thousands of redundant copies in a lot of cases and give developers control to consolidate regions.
I'm running out of memory and I think it's related to this issue. I'm using full measure caching and have just one URL that is excluded from the static cache; when I hit that URL I run into the issue. The error seems to be in FileStore.php, line 77 (see file on GitHub). The funny thing is... I'm getting this error more or less 13 hours after I clear the cache. The solution for me was to set up a cron to clear the cache again every 12 hours. If I include all URLs in the static cache, it doesn't run into issues.
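A clear like that can also be expressed with Laravel's scheduler instead of a system crontab; a sketch, with the 12-hour cadence approximated via `twiceDaily`:

```php
// In app/Console/Kernel.php's schedule() method: clears the application
// cache at 01:00 and 13:00, roughly the 12-hour cron described above.
$schedule->command('cache:clear')->twiceDaily(1, 13);
```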
Are you using full measure static cache? I tried to run your code for clearing the nocache cache, but that gives me errors when loading those tags.
Sorry, I didn't mention that. We're using half measure caching. For full measure caching it seems fine to create new files and add data to the … Makes the problem even more urgent, it seems.
Yes, as you mentioned, when you have pages excluded from the static cache and those pages use the {{ nocache }} tag, they will run out of memory at some point. However, you can use the code you mentioned, clearing the nocache::session cache when you visit those pages. I have fixed it (as a workaround) by creating a custom tag that clears the nocache::session only for excluded pages. That fixes the issue for me on static cache full measure; I'm not sure about those using half measure, I think it's worse there. I'm not sure if I should open a new issue about this, but I have detailed steps on how to reproduce it.
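That custom tag isn't shown here, but a minimal sketch of what it might look like (the class and tag name are hypothetical; the `nocache::session.` key prefix comes from the workaround earlier in this thread, and how the hashed key is built may differ from what the middleware actually uses):

```php
<?php

// app/Tags/ClearNocache.php -- a hypothetical sketch. Statamic
// auto-discovers tag classes in app/Tags; this one would be used in a
// template on an excluded page as {{ clear_nocache }}.

namespace App\Tags;

use Illuminate\Support\Facades\Cache;
use Statamic\Tags\Tags;

class ClearNocache extends Tags
{
    public function index()
    {
        // Forget the nocache session payload for the currently requested
        // URL only, instead of looping over every cached URL.
        Cache::forget('nocache::session.' . md5(request()->getRequestUri()));

        return ''; // render nothing
    }
}
```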
Thanks for commenting it here, @jasonvarga. I like the solution of having it be opt-in.
Hi, I was wondering about the support for keys on the …
Bug description

Using a logged_in condition inside nocache tags causes memory usage to get bigger and bigger. The nocache tag files (storage/framework/views/nocache) are created on every page request and never get cleaned.

How to reproduce
Add a collection with entries, and wrap a logged_in condition in a nocache pair (see the sketch below).
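A minimal template matching the title's description might look like this sketch; the markup around the conditional is made up:

```
{{# default.antlers.html -- a hypothetical repro sketch #}}
{{ nocache }}
    {{ if logged_in }}
        <p>Welcome back!</p>
    {{ else }}
        <p>Please log in.</p>
    {{ /if }}
{{ /nocache }}
```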
Logs
No response
Environment

Installation: Fresh statamic/statamic site via CLI
Antlers Parser: runtime (new)
Additional details
No response