
Cleaning ActiveInsights table periodically #2

Open
AliOsm opened this issue Nov 23, 2024 · 6 comments
Comments


AliOsm commented Nov 23, 2024

Thanks for this great gem! I'm currently using it in production, handling ~100-200 TPM.

I just noticed that the table grew by a huge margin in a short period of time and reached ~4 GB. The gem could provide an ActiveJob job that cleans the table and keeps only the last N days of data. What do you think?
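A sketch of the retention logic being proposed, framework-free so it runs on its own (the model name and job wiring are assumptions; in the gem this would likely be an ActiveJob doing something like `ActiveInsights::Request.where("created_at < ?", cutoff).in_batches.delete_all`):

```ruby
# Sketch: "keep only the last N days" retention, simulated without Rails.
RETENTION_DAYS = 30
SECONDS_PER_DAY = 86_400

# Everything created before this cutoff would be deleted.
def retention_cutoff(days, now: Time.now)
  now - days * SECONDS_PER_DAY
end

now = Time.now
# Simulated request rows (stand-ins for ActiveInsights records).
rows = [
  { path: "/old",    created_at: now - 45 * SECONDS_PER_DAY },
  { path: "/recent", created_at: now - 2 * SECONDS_PER_DAY }
]

cutoff = retention_cutoff(RETENTION_DAYS, now: now)
# Only rows newer than the cutoff remain.
kept = rows.reject { |row| row[:created_at] < cutoff }
```

In a real job the delete would run in batches (`in_batches.delete_all`) rather than one statement, to limit lock time and spread out the dead-row churn.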

@npezza93 (Owner)

love it. we can probably do something similar to what solid cache/cable do.

@thedumbtechguy

> love it. we can probably do something similar to what solid cache/cable do.

Be careful with autovacuum on postgres when you do this.

@thedumbtechguy

Alternatively, you could run a periodic aggregation that rolls older data up into a fixed resolution (e.g. 5-minute buckets) while keeping full resolution for recent data.
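A minimal sketch of that rollup idea, with an illustrative 5-minute bucket size and made-up field names (in practice this would aggregate the gem's request rows into a separate summary table):

```ruby
# Sketch: downsample older rows into fixed 5-minute buckets.
BUCKET = 5 * 60 # bucket width in seconds

# Truncate a timestamp down to the start of its bucket.
def bucket_for(time)
  Time.at((time.to_i / BUCKET) * BUCKET)
end

base = Time.at(1_700_000_000)
# Simulated raw samples (stand-ins for per-request rows).
samples = [
  { at: base,       duration_ms: 100 },
  { at: base + 60,  duration_ms: 300 },
  { at: base + 600, duration_ms: 50 }
]

# One aggregate row per bucket: row count and average duration.
rollup = samples.group_by { |s| bucket_for(s[:at]) }.map do |bucket, rows|
  { bucket: bucket,
    count: rows.size,
    avg_duration_ms: rows.sum { |r| r[:duration_ms] }.to_f / rows.size }
end
```

Recent rows would be left untouched; only rows past some age threshold get replaced by their bucket aggregates, which keeps the table small without losing the long-term trend.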


AliOsm commented Dec 11, 2024

> love it. we can probably do something similar to what solid cache/cable do.
>
> Be careful with autovacuum on postgres when you do this.

What is the concern exactly? I'm not a postgres expert :3

@thedumbtechguy

> love it. we can probably do something similar to what solid cache/cable do.
>
> Be careful with autovacuum on postgres when you do this.
>
> What is the concern exactly? I'm not a postgres expert :3

When you delete large amounts of data at once, you can trigger autovacuum, which can impact performance.

By default, autovacuum runs on a table once the number of dead rows (from updates and deletes) exceeds 20% of the table's rows plus 50. For example, a table with 1 million rows must accumulate more than 200,050 dead rows before autovacuum starts ((1,000,000 × 0.2) + 50).
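In Postgres terms, those defaults are `autovacuum_vacuum_threshold` (50) and `autovacuum_vacuum_scale_factor` (0.2); the arithmetic is just:

```ruby
# Postgres default autovacuum trigger point:
#   autovacuum_vacuum_threshold + autovacuum_vacuum_scale_factor * row_count
def autovacuum_trigger(rows, threshold: 50, scale_factor: 0.2)
  threshold + (scale_factor * rows)
end

# Dead rows needed before autovacuum kicks in on a 1M-row table.
puts autovacuum_trigger(1_000_000)
```

A mass delete of, say, half the table blows far past that trigger in one shot, which is why big cleanups are usually done in batches.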


AliOsm commented Dec 11, 2024

> love it. we can probably do something similar to what solid cache/cable do.
>
> Be careful with autovacuum on postgres when you do this.
>
> What is the concern exactly? I'm not a postgres expert :3
>
> When deleting large amounts of data, you can trigger autovacuum, which can impact performance.
>
> By default, autovacuum starts when 20% of the rows plus 50 rows are dead. For example, a table with 1 million rows must have more than 200,050 dead rows before an autovacuum starts ((1,000,000 × 0.2) + 50).

Got it, yeah this could trigger autovacuum a lot. The table accumulated a lot of rows after only 3-4 days of usage.
