
Magento 2.4.1 - HUGE Cache Sizes grows quickly #32118


Closed
amitvkhajanchi opened this issue Feb 11, 2021 · 74 comments
Labels: Area: Content, Component: Cache, Issue: needs update (additional information is required, waiting for response), Reported on 2.4.1 (indicates the original Magento version for the issue report)

Comments

@amitvkhajanchi

amitvkhajanchi commented Feb 11, 2021

Magento CE 2.4.1
Ubuntu 20.04 LTS
PHP 7.4.3
MySQL 8.0.22 (Amazon RDS)
Elasticsearch 7.6.2
Redis 5.2.1
Varnish 6.2.1
Nginx 1.8

P.S. I have over 5,000 products across 2 websites (EU/US). The EU site has 5 store views (EN, FR, IT, DE, ES) and the US site has 1 store view (US), for a total of 6 store views.

During testing, the site performed well on my staging server. When I finally deployed it to production this week, set the mode to production, and started seeing real traffic, my cache began growing at an alarming rate. Right now it fills up 100GB in 2 hours. That never happened with Magento 2.3.3 on my old box.

My old box had only 30GB of disk, ran in production mode, and was running fine with Varnish; it was much smaller.
When I run du -h /var/cache on my old box, it totals 741MB.
When I run du -h /var/cache on my current box, it grows from 0 to 10GB in 1 minute, and within 2 hours it is at 100GB.
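For anyone who wants to track how fast the cache directory fills up, a minimal sketch using the same path as above (note that on many installations the Magento cache lives in var/cache inside the Magento root rather than /var/cache):

# Print the cache size once a minute to watch how quickly it grows.
watch -n 60 'du -sh /var/cache'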

Attached are the results of both du commands showing the file sizes.
du-output-2.4.1-mage--d.txt
du-output-2.3.3-mage--d.txt

@m2-assistant

m2-assistant bot commented Feb 11, 2021

Hi @amitvkhajanchi. Thank you for your report.
To help us process this issue, please make sure that you have provided the following information:

  • Summary of the issue
  • Information on your environment
  • Steps to reproduce
  • Expected and actual results

Please make sure that the issue is reproducible on a vanilla Magento instance by following the steps to reproduce. To deploy a vanilla Magento instance on our environment, please add a comment to the issue:

@magento give me 2.4-develop instance - upcoming 2.4.x release

For more details, please, review the Magento Contributor Assistant documentation.

Please add a comment to assign the issue: @magento I am working on this


⚠️ According to the Magento Contribution requirements, all issues must go through the Community Contributions Triage process. Community Contributions Triage is a public meeting.

🕙 You can find the schedule on the Magento Community Calendar page.

📞 The triage of issues happens in the queue order. If you want to speed up the delivery of your contribution, please join the Community Contributions Triage session to discuss the appropriate ticket.

🎥 You can find the recording of the previous Community Contributions Triage on the Magento Youtube Channel

✏️ Feel free to post questions/proposals/feedback related to the Community Contributions Triage process to the corresponding Slack Channel

@hostep
Contributor

hostep commented Feb 11, 2021

Since your block caches are very big, I believe this might be a duplicate of #29964, which was fixed in Magento 2.4.2.

You can try temporarily disabling the Magento_Csp module and see if that solves it. If it does, you can be fairly certain it's the same bug, and upgrading to 2.4.2 should fix it.
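A minimal sketch of that workaround, run from the Magento root (the change is reversible with module:enable):

# Temporarily disable the CSP module and flush caches to test the theory.
php bin/magento module:disable Magento_Csp
php bin/magento cache:flush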

@amitvkhajanchi
Author

@hostep

Thanks for the quick response. I have disabled the module. I am using Amazon EFS for the pub/media folder and realized that I had encryption turned on for it. I created another, unencrypted volume and mounted it on pub/media. This helped somewhat: I had been filling 100GB of space in 30 minutes, and that changed to 60GB in 3-4 hours, which was still crazy.

I will check in the morning (as it's night in my time zone) to see how much disk has filled. So far it seems to have slowed down, but I will report back in the morning on whether disabling Magento_Csp improved it.

@mrtuvn
Contributor

mrtuvn commented Feb 12, 2021

As I remember, this is caused by a heavy plugin loaded during design loading.
A possible workaround is to recheck and disable this plugin, or move it to the proper place with the correct interception point:
https://github.com/magento/magento2/blob/2.4-develop/app/code/Magento/Theme/etc/di.xml#L109 <= default codebase
@hostep @amitvkhajanchi: this is a confirmed issue, but I don't remember the exact issue ID.

@amitvkhajanchi
Author

@hostep @mrtuvn

Thanks for all the feedback. I can confirm that disabling the Magento_Csp module resolved the issue on the Magento 2.4.1 version I have. My cache only grew to 600MB and is now stable, compared to multiple GB on previous days. This is in line with my experience on Magento 2.3.3. Thanks for the help and prompt feedback; I really appreciate it!

@IbrahimS2

This is still an issue on M2.4.2!

@DavorOptiweb

DavorOptiweb commented Jul 7, 2021

We have the same issue with 2 sites:
One is 2.4.2
The second is 2.4.2-p1 (upgraded from 2.3.6)

It generates a huge number of Redis cache keys, and it all started with the upgrade from 2.3.6 to 2.4.2.
[screenshot: Redis memory usage graph]

It keeps going higher, and the server had to shut down after running out of memory (32GB in total), so now I'm clearing Redis every night to keep the site alive.

@mrtuvn
Contributor

mrtuvn commented Jul 7, 2021

@DavorOptiweb

> #29964 (comment) @DavorOptiweb

We have had Magento_CSP disabled from the start (it was never enabled).

@mrtuvn
Contributor

mrtuvn commented Jul 7, 2021

Hmm! Interesting case.
From the comment above I see the CSP module was used but Redis was not; in your case, you're saying Redis is enabled by default, right?
Maybe we need to investigate the root cause of why Redis writes so much data.
Can you monitor this case on the latest code base (git 2.4-develop)? I think this case requires real data and a big site to be reproducible.

@IbrahimS2

IbrahimS2 commented Jul 7, 2021

> We have the same issue with 2 sites:
> One is 2.4.2
> The second is 2.4.2-p1 (upgraded from 2.3.6)
>
> It generates a huge number of Redis cache keys, and it all started with the upgrade from 2.3.6 to 2.4.2.
> [screenshot: Redis memory usage graph]
>
> It keeps going higher, and the server had to shut down after running out of memory (32GB in total), so now I'm clearing Redis every night to keep the site alive.

@DavorOptiweb We have investigated this further. It appears that what is consuming the memory in Redis is basically the 'page_cache': it stores the page source code, which consumes most of the memory. Once we disabled it / kept it off Redis, usage came down from 40GB to just 200-300MB.

The whole concept seems to defeat the purpose.
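For reference, a minimal env.php sketch of what keeping page_cache off Redis could look like, reusing the '46c_' prefix and connection details quoted elsewhere in this thread (a sketch only, not an official recommendation):

'cache' => [
    'frontend' => [
        'default' => [
            'id_prefix' => '46c_',
            'backend' => 'Cm_Cache_Backend_Redis',
            'backend_options' => [
                'server' => '127.0.0.1',
                'database' => '0',
                'port' => '6379'
            ]
        ],
        // No backend is declared for page_cache, so Magento falls back to file
        // storage for it (and Varnish can handle full-page caching entirely).
        'page_cache' => [
            'id_prefix' => '46c_'
        ]
    ]
],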

@DavorOptiweb

DavorOptiweb commented Jul 7, 2021

We are using Varnish for the page cache.
The Redis database for page_cache is empty; only the "default" cache is going to Redis.

If I have a look at the Redis keys, this is what I see (most of the "k" entries are blocks):
[screenshots: Redis key listing]

@IbrahimS2

@DavorOptiweb Please share the cache declaration in your env.php file.

@DavorOptiweb

DavorOptiweb commented Jul 7, 2021

The cache is on Redis (port 6379, host redis):
[screenshot: env.php cache configuration]

Redis DB 0 has around 200K records after 10 hours.
Redis DB 1 is empty.

And sessions are also on Redis, but in their own instance (port 6380, host redis):
[screenshot: env.php session configuration]

@mrtuvn
Contributor

mrtuvn commented Jul 7, 2021

The only difference in my Redis setup is in the session configuration:
database => 2
log_level => 1
max_concurrency => 20
Maybe you have some tweaks in your Redis config.

How many products do you have? And how many CMS blocks?

@IbrahimS2

IbrahimS2 commented Jul 7, 2021

@DavorOptiweb Please try the following:


    'cache' => [
        'frontend' => [
            'default' => [
                'id_prefix' => '46c_',
                'backend' => 'Magento\\Framework\\Cache\\Backend\\Redis',
                'backend_options' => [
                    'server' => '127.0.0.1',
                    'database' => '0',
                    'port' => '6379',
                    'persistent' => '',
                    'force_standalone' => '0',
                    'connect_retries' => '1',
                    'read_timeout' => '10',
                    'automatic_cleaning_factor' => '0',
                    'compress_data' => '1',
                    'compress_tags' => '1',
                    'compress_threshold' => '20480',
                    'compression_lib' => 'gzip',
                    'preload_keys' => [
                        '46c_EAV_ENTITY_TYPES',
                        '46c_GLOBAL_PLUGIN_LIST',
                        '46c_DB_IS_UP_TO_DATE',
                        '46c_SYSTEM_DEFAULT'
                    ]
                ]
            ],
            'page_cache' => [
                'id_prefix' => '46c_'
            ]
        ],
        'allow_parallel_generation' => false
    ],

@mrtuvn
Contributor

mrtuvn commented Jul 7, 2021

I think this one should look like below:
'preload_keys' => [
    '46c_EAV_ENTITY_TYPES',
    '46c_GLOBAL_PLUGIN_LIST',
    '46c_DB_IS_UP_TO_DATE',
    '46c_SYSTEM_DEFAULT'
]
Just curious: why do you use the prefix bcf? Correct me if I'm wrong.

@IbrahimS2

@mrtuvn I missed that, just corrected it.

@DavorOptiweb

I have changed the env.php cache settings, but after 1 hour I don't see any change.
After 1 hour there is already 1GB+ of memory usage and over 70k keys (the same pattern as on previous days).

Here is a graph that makes it clear that Redis usage was stable before the upgrade:
[screenshot: Redis memory usage graph]

@IbrahimS2

@DavorOptiweb Could you execute the following command: "redis-cli --bigkeys"? I can tell you right now that it cannot be the session keys.

@DavorOptiweb

DavorOptiweb commented Jul 8, 2021

# Scanning the entire keyspace to find biggest keys as well as
# average sizes per key type. You can use -i 0.1 to sleep 0.1 sec
# per 100 SCAN commands (not usually needed).

[00.00%] Biggest hash found so far 'zc:k:46c_AB140E4DCDEBB91C4004178C10A7A5D3BD7C179B' with 4 fields
[00.00%] Biggest set found so far 'zc:ti:46c_CATALOG_PRODUCT_VIEW_SKU_09035' with 2 members
[00.01%] Biggest set found so far 'zc:ti:46c_CAT_P_25297' with 6 members
[00.01%] Biggest set found so far 'zc:ti:46c_CONFIGURABLE_21830' with 9 members
[00.06%] Biggest set found so far 'zc:ti:46c_CAT_P_35517' with 16 members
[00.15%] Biggest set found so far 'zc:ti:46c_CAT_P_33770' with 26 members
[00.29%] Biggest set found so far 'zc:ti:46c_CAT_P_34905' with 31 members
[00.41%] Biggest set found so far 'zc:ti:46c_CAT_P_34878' with 33 members
[00.64%] Biggest set found so far 'zc:ti:46c_DEFAULT' with 7568 members
[04.67%] Biggest set found so far 'zc:ti:46c_COLLECTION_DATA' with 13786 members
[11.88%] Biggest set found so far 'zc:ti:46c_CAT_C' with 14970 members
[16.62%] Biggest set found so far 'zc:ti:46c_MAGE' with 69201 members

-------- summary -------

Sampled 80175 keys in the keyspace!
Total key length in bytes is 5420897 (avg len 67.61)

Biggest set found 'zc:ti:46c_MAGE' has 69201 members
Biggest hash found 'zc:k:46c_AB140E4DCDEBB91C4004178C10A7A5D3BD7C179B' has 4 fields

0 strings with 0 bytes (00.00% of keys, avg size 0.00)
0 lists with 0 items (00.00% of keys, avg size 0.00)
14909 sets with 362628 members (18.60% of keys, avg size 24.32)
65266 hashs with 261064 fields (81.40% of keys, avg size 4.00)
0 zsets with 0 members (00.00% of keys, avg size 0.00)
0 streams with 0 entries (00.00% of keys, avg size 0.00)

Session keys are in a separate Redis instance (using only 12MB).
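As a rough way to confirm that block cache entries are what is piling up, a sketch of a couple of redis-cli checks (the '46c_' prefix and database 0 are taken from the output above):

# Count the block cache keys for this prefix in DB 0.
redis-cli -n 0 --scan --pattern 'zc:k:46c_BLOCK_*' | wc -l
# Show overall memory usage of the instance.
redis-cli -n 0 info memory | grep used_memory_human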

@pmonosolo

pmonosolo commented Jul 8, 2021

Having the same issue after upgrading from 2.3.3 to 2.4.2-p1. The cache overfilling happens both in Redis and in file storage (if you don't have the Redis cache enabled).

This would fill my hard drive to the brim while Catalog Search indexing was running; it only happens while the Catalog Search index is running.

When I do redis-cli monitor, the output shows this:

1625771650.924712 [0 172.18.0.7:59056] "expire" "zc:k:c8a_BLOCK_81EE233A1B32FD871F6A4DD3CC91A5BFD107B28B_215321_FINAL_PRICE_LIST_CATEGORY_PAGE" "86400"
1625771650.924729 [0 172.18.0.7:59056] "sadd" "zc:tags" "c8a_BLOCK_HTML" "c8a_CAT_P_215321" "c8a_CAT_P" "c8a_MAGE"
1625771650.924738 [0 172.18.0.7:59056] "sadd" "zc:ti:c8a_BLOCK_HTML" "c8a_BLOCK_81EE233A1B32FD871F6A4DD3CC91A5BFD107B28B_215321_FINAL_PRICE_LIST_CATEGORY_PAGE"
1625771650.924746 [0 172.18.0.7:59056] "sadd" "zc:ti:c8a_CAT_P_215321" "c8a_BLOCK_81EE233A1B32FD871F6A4DD3CC91A5BFD107B28B_215321_FINAL_PRICE_LIST_CATEGORY_PAGE"
1625771650.924754 [0 172.18.0.7:59056] "sadd" "zc:ti:c8a_CAT_P" "c8a_BLOCK_81EE233A1B32FD871F6A4DD3CC91A5BFD107B28B_215321_FINAL_PRICE_LIST_CATEGORY_PAGE"
1625771650.924762 [0 172.18.0.7:59056] "sadd" "zc:ti:c8a_MAGE" "c8a_BLOCK_81EE233A1B32FD871F6A4DD3CC91A5BFD107B28B_215321_FINAL_PRICE_LIST_CATEGORY_PAGE"
1625771650.924776 [0 172.18.0.7:59056] "exec"
1625771650.925294 [0 172.18.0.7:59056] "hget" "zc:k:c8a_BLOCK_81EE233A1B32FD871F6A4DD3CC91A5BFD107B28B_215321_FINAL_PRICE_LIST_CATEGORY_PAGE" "d"
1625771650.940544 [0 172.18.0.7:59056] "hget" "zc:k:c8a_LAYOUT_FRONTEND_STORE1_46F1B068EC7CCF4878F9284DD1137AFD1_PAGE_LAYOUT_MERGED" "d"
1625771650.940892 [0 172.18.0.7:59056] "hget" "zc:k:c8a_STRUCTURE_LAYOUT_FRONTEND_STORE1_46F1B068EC7CCF4878F9284DD1137AFD1" "d"
1625771650.941389 [0 172.18.0.7:59056] "hget" "zc:k:c8a_BLOCK_81EE233A1B32FD871F6A4DD3CC91A5BFD107B28B_215322_FINAL_PRICE_LIST_CATEGORY_PAGE" "d"
1625771650.943766 [0 172.18.0.7:59056] "hget" "zc:k:c8a_Zend_LocaleC_en_US_currencynumber_" "d"
1625771650.943950 [0 172.18.0.7:59056] "hget" "zc:k:c8a_Zend_LocaleL_en_US_symbols_" "d"
1625771650.948630 [0 172.18.0.7:59056] "hget" "zc:k:c8a_BLOCK_81EE233A1B32FD871F6A4DD3CC91A5BFD107B28B_215322_FINAL_PRICE_LIST_CATEGORY_PAGE" "t"
1625771650.998865 [0 172.18.0.7:59056] "multi"
1625771650.999604 [0 172.18.0.7:59056] "hmset" "zc:k:c8a_BLOCK_81EE233A1B32FD871F6A4DD3CC91A5BFD107B28B_215322_FINAL_PRICE_LIST_CATEGORY_PAGE" "d" "gz:\x1f\x8bx\x01t\xbd\xc7\xd2\xabH\x1b\x06v/\xff\x96r\x91D\x1a\x97\x17\xe4 \x84\xc8\bv\x88\x1cD\xce.\xdf\xbb\xfb\xd8\xdb\x8f\x9a\xc5\xcc\x99\xa9\xa1\x04\xdd\xfd\xf6\x1b\x9e\x10\xff\x87\xfd\xf7\x7f\xcf\xff\xd1\xff\xfdo\xe8\xdb*\xa9\xb2\xf9\x7f\xffg\xfc\xef\xdfU\xff!\xe0\x1f\xf0\x7f\xff\x11\xfb\xef\x7fU\xfa\xbf\xffs\xfe\x8f\xf9\xef\x7f\xf3r\xb6\xd9\xff1O\xc9\xbf?\x13\xff\xfd\xaf\xec\xe7\xe5\xff\xfb_\x90\xff\xfe\xef\xffg\xfe\x8f\x04\xff&\x9e\xcb\xff\xff)(\x85`\x8f\x7f\x0fx<\xfe\xfb\x9f\xa2oF\xb3\r\xe1\x94<eB/\xe4\xf5\xb4zt\xd3\xde0\xee\\\xb4\xa1Bs\x86\x8b!\x8aXC\xb0\xff_\xff\x9e\r\x9e4\x971F\x90\xff\xfe\xf0\xef\x01\xf9\xf6\xaeV\xaaI\xd6!\x84\x9e\n5\xd2}\x9ct\xa6 %\xa5\xf2;\x1f\xa4\xc9\x8f\x84\xdf\\\xe7s\x8a\xd9\xbf\x1f\xb0\x1a\xa9G\xbfU\x8e;/\x8aGM}\xf88\x82\xf3DO\xa8\x94\xc9F\x9f\xbb 

..............

^^^ this rapidly increases and keeps storing data.

I don't think it's related to the cache type, as I see it both with Redis and with the file storage cache.

My env.php looks like this:

'cache' => [
    'frontend' => [
        'default' => [
            'backend' => 'Cm_Cache_Backend_Redis',
            'backend_options' => [
                'server' => '127.0.0.1',
                'database' => '0',
                'port' => '6379'
            ],
            'id_prefix' => '123_'
        ],
        'page_cache' => [
            'backend' => 'Cm_Cache_Backend_Redis',
            'backend_options' => [
                'server' => '127.0.0.1',
                'port' => '6379',
                'database' => '1',
                'compress_data' => '0'
            ],
            'id_prefix' => '123_'
        ]
    ]
],

Here is the output of redis-cli --bigkeys

Scanning the entire keyspace to find biggest keys as well as
average sizes per key type.  You can use -i 0.1 to sleep 0.1 sec
per 100 SCAN commands (not usually needed).

[00.00%] Biggest hash   found so far '"zc:k:t8a_BLOCK_CAB48299B47CDFA4E7BAD5C76907B5ED0F0BB1E4_21925_FINAL_PRICE_LIST_CATEGORY_PAGE"' with 4 fields
[00.00%] Biggest set    found so far '"zc:ti:t8a_CAT_P_106630"' with 1 members
[01.98%] Biggest set    found so far '"zc:ti:t8a_CATALOG_CATEGORY_VIEW_DISPLAYMODE_PRODUCTS"' with 2 members
[04.21%] Biggest set    found so far '"zc:ti:t8a_CAT_P_464510"' with 3 members
[14.59%] Biggest set    found so far '"zc:ti:t8a_MAGE"' with 8929 members

-------- summary -------

Sampled 16647 keys in the keyspace!
Total key length in bytes is 923645 (avg len 55.48)


Biggest   hash found '"zc:k:t8a_BLOCK_CAB48299B47CDFA4E7BAD5C76907B5ED0F0BB1E4_21925_FINAL_PRICE_LIST_CATEGORY_PAGE"' has 4 fields
Biggest    set found '"zc:ti:t8a_MAGE"' has 8929 members

0 lists with 0 items (00.00% of keys, avg size 0.00)
8894 hashs with 35576 fields (53.43% of keys, avg size 4.00)
0 strings with 0 bytes (00.00% of keys, avg size 0.00)
0 streams with 0 entries (00.00% of keys, avg size 0.00)
7753 sets with 42303 members (46.57% of keys, avg size 5.46)
0 zsets with 0 members (00.00% of keys, avg size 0.00)

^^^^ This was still in progress while the Catalog Search index was running. The RAM load at this point was 40GB; it went up by 20GB in a span of about 40 minutes.

@pmonosolo

> Hmm! Interesting case.
> From the comment above I see the CSP module was used but Redis was not; in your case, you're saying Redis is enabled by default, right?
> Maybe we need to investigate the root cause of why Redis writes so much data.
> Can you monitor this case on the latest code base (git 2.4-develop)? I think this case requires real data and a big site to be reproducible.

It doesn't look like it's only related to Redis; it happened with both Redis and the file storage cache for me.

@pmonosolo

Looks like you still have to disable the CSP module for the issue to disappear. This is very Magento :)

It looks like it's no longer storing a crazy amount of data in zc:k:c8a_BLOCK_81EE233A1B32FD871F6A4DD3CC91A5BFD107B28B_215322_FINAL_PRICE_LIST_CATEGORY_PAGE.

@DavorOptiweb

[screenshot: Redis memory usage]
And the issue is still present.

@mrtuvn
Contributor

mrtuvn commented Jul 9, 2021

IMHO the Magento_Csp module does not seem to be the root cause of the cache growth problem. Maybe it's something else.
cc: @orlangur @kandy @paliarush
Should we reopen this for investigation?

@pmonosolo

> [screenshot: Redis memory usage]
> And the issue is still present

Did you recompile, deploy static content, and flush the cache?

@DavorOptiweb

@pmonosolo We have an automated deploy pipeline and it has run many times since the upgrade, so that is not the issue.

@engcom-Hotel added labels on Sep 30, 2021: Issue: Cannot Reproduce (cannot reproduce the issue on the latest 2.4-develop branch), Issue: ready for confirmation
@m2-community-project bot removed labels on Sep 30, 2021: Issue: Cannot Reproduce, Issue: needs update (additional information is required, waiting for response)
@engcom-Hotel added labels on Sep 30, 2021: Issue: Cannot Reproduce (cannot reproduce the issue on the latest 2.4-develop branch), Issue: needs update (additional information is required, waiting for response)
@m2-community-project bot removed labels on Sep 30, 2021: Issue: Cannot Reproduce, Issue: ready for confirmation
@onlinebizsoft

I can't understand what is going on with the Magento 2 framework. I have a trace from _initProduct, and somehow it calls the config more than 100 times; as a result, Magento spends around 500ms just reading config values from the cache and decoding them. How can we make a website faster with such a design in core?

[screenshot: profiler trace]

@mrtuvn
Contributor

mrtuvn commented Oct 1, 2021

Not sure it's related to your problem, but I saw one pull request related to fixing config reads; I will mention you if I find it. Magento 2 has a big core, so there is a lot we can improve, and it is also easy to make mistakes by doing things improperly.
Maybe the problem has existed for a very long time and we just didn't notice it. From your evidence I would assume the issue comes from Redis and from somewhere in the code where the config is called incorrectly.

@mrtuvn
Contributor

mrtuvn commented Oct 1, 2021

Try upgrading the colinmollenhour Redis packages in your test instance.
Magento has already bumped the Redis package versions in the development branch.
See https://github.com/magento/magento2/blob/2.4-develop/composer.json#L33-L34
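A sketch of what that upgrade could look like on a test instance (package names as pinned in the linked composer.json; exact versions depend on your project's constraints):

# Update the Redis cache backend packages within the project's composer constraints.
composer update colinmollenhour/credis colinmollenhour/cache-backend-redis --with-dependencies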

@onlinebizsoft

> Try upgrading the colinmollenhour Redis packages in your test instance. Magento has already bumped the Redis package versions in the development branch. See https://github.com/magento/magento2/blob/2.4-develop/composer.json#L33-L34

I don't think it is related (I think that was for fixing an overly long array of cache keys or something).

@hostep @IbrahimS2 Any idea why this is not being cached? Basically this method can be called many times, from many places or within a loop, etc., for the same config value (like whether a module is enabled, a store setting, ...). Developers shouldn't have to care about caching the value themselves, so the value should be cached inside the object.

[screenshot: profiler trace of repeated config reads]
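To illustrate the point about caching the value inside the object, here is a hypothetical sketch (the class name and wiring are made up, not core code) of memoizing scope config reads for the duration of a request:

<?php
declare(strict_types=1);

use Magento\Framework\App\Config\ScopeConfigInterface;
use Magento\Store\Model\ScopeInterface;

// Hypothetical helper: memoizes config reads locally so repeated calls for the
// same path/store within one request do not hit the cache backend again.
class ConfigValueMemoizer
{
    /** @var ScopeConfigInterface */
    private $scopeConfig;

    /** @var array<string, mixed> */
    private $memo = [];

    public function __construct(ScopeConfigInterface $scopeConfig)
    {
        $this->scopeConfig = $scopeConfig;
    }

    public function getValue(string $path, ?string $storeCode = null)
    {
        $key = $path . '|' . ($storeCode ?? '');
        if (!array_key_exists($key, $this->memo)) {
            // Only the first call per request for a given path/store reaches the cache.
            $this->memo[$key] = $this->scopeConfig->getValue(
                $path,
                ScopeInterface::SCOPE_STORE,
                $storeCode
            );
        }
        return $this->memo[$key];
    }
}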

@mrtuvn
Contributor

mrtuvn commented Oct 1, 2021

cc: @fooman, who may know about this.
For the happy case I think it's not a problem, but in real production with a lot of data there could be some missed edge cases.
From a developer's point of view, this method is called almost everywhere (block classes, helpers, view models) whenever we need to check whether something is enabled or disabled in the backend.

@fooman
Contributor

fooman commented Oct 1, 2021

@mrtuvn Sorry, I have nothing to add to this discussion.

@engcom-Hotel
Contributor

Hello @amitvkhajanchi,

Have you tried to reproduce this issue on Magento 2.4-develop? Is it still reproducible for you?

Thanks

@amitvkhajanchi
Author

@engcom-Hotel -

No, I haven't tried to reproduce the issue on Magento 2.4-develop. I am now on 2.4.2 myself and running production with no issues; I have Magento_Csp disabled, which fixed it for me. In a month or so I will try to port my site to 2.4.3, but I don't think I should run into the issue.

@engcom-Hotel
Contributor

Hello @amitvkhajanchi,

Thanks for the clarification!

As per your comment, it seems that your issue has been resolved. Can we close this now?

Thanks

@engcom-Hotel
Contributor

Dear @amitvkhajanchi,

We have noticed that this issue has not been updated for a period of 14 days. Hence we assume that this issue is fixed now, so we are closing it. Please raise a fresh ticket or reopen this ticket if you need more assistance on this.

Regards

@Adel-Magebinary

This still happens in Magento 2.4.2. The cache grows to 20GB in 2-3 hours.

@Adel-Magebinary

Adel-Magebinary commented Apr 5, 2022

Update:

The issue seems to come from "PayPal\Braintree\Plugin\ProductDetailsBlockPlugin::aroundGetProductDetailsHtml".

This plugin lazily inserts marketing HTML, which causes the block to be regenerated for every product load at the code level. We had to disable the PayPal plugin to get back to the original state.
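If someone wants to verify the same thing, a sketch of the workaround described above, assuming the module name is PayPal_Braintree (check bin/magento module:status first):

# Confirm the module name, disable it, recompile, and flush caches.
php bin/magento module:status | grep -i braintree
php bin/magento module:disable PayPal_Braintree
php bin/magento setup:di:compile
php bin/magento cache:flush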

@Garde85

Garde85 commented Nov 17, 2023

This issue is still present on Magento 2.4.6-p2.

@simonrl

simonrl commented Jan 7, 2024

We're also having this issue on 2.4.6-p3, with the modules in question here (PayPal_Braintree, Magento_Csp) already disabled. Normal file cache storage, no Redis, etc. Has anybody found a root cause yet?

@robfico
Contributor

robfico commented Oct 3, 2024

Seeing this same issue with 2.4.6-p4, where the Redis cache grows very large: it is all block_html entries, many with FINAL_PRICE_LIST tags. It does not look like this is fixed yet...

@JDavidVR
Contributor

I'm facing the same issue on Magento Open Source 2.4.7-p3.

@onlinebizsoft

> Update:
>
> The issue seems to come from "PayPal\Braintree\Plugin\ProductDetailsBlockPlugin::aroundGetProductDetailsHtml".
>
> This plugin lazily inserts marketing HTML, which causes the block to be regenerated for every product load at the code level. We had to disable the PayPal plugin to get back to the original state.

@hostep @engcom-Hotel should we re-open this one?

@hostep
Contributor

hostep commented Feb 4, 2025

@onlinebizsoft: if the code in the latest version of that paypal/braintree module is still causing issues, I'd recommend opening a new issue with detailed steps describing the problem, rather than reopening this old ticket.
