pulumi versions build up over time consuming disk space #976

Open
alfred-stokespace opened this issue Jul 6, 2023 · 4 comments

Labels
kind/bug Some behavior is incorrect or out of spec

Comments

@alfred-stokespace
What happened?

I am looking at my GitHub runner's disk and seeing that the /actions-runner/_work/_tool/pulumi directory is consuming 17 GB:

[root@ip-XX-XX-X-XXX _tool]# du -h --max-depth=1 .
475M    ./node
17G     ./pulumi
418M    ./go
48M     ./buildx
18G     .

When I investigate, I see the following:

[root@ip-XX-XXX-X-XXX pulumi]# du -h --max-depth=1 .
136M    ./3.22.1
136M    ./3.24.1
136M    ./3.25.1
137M    ./3.26.1
137M    ./3.27.0
139M    ./3.28.0
138M    ./3.29.1
138M    ./3.30.0
175M    ./3.32.1
180M    ./3.33.1
179M    ./3.33.2
186M    ./3.34.0
185M    ./3.34.1
185M    ./3.35.0
185M    ./3.35.1
185M    ./3.35.2
185M    ./3.35.3
186M    ./3.36.0
186M    ./3.37.0
186M    ./3.37.1
186M    ./3.37.2
192M    ./3.38.0
198M    ./3.39.0
198M    ./3.39.1
200M    ./3.40.0-alpha.1662150830
198M    ./3.39.2
198M    ./3.39.3
199M    ./3.40.0
203M    ./3.40.2
210M    ./3.41.1
210M    ./3.42.0
210M    ./3.43.1
210M    ./3.44.2
211M    ./3.44.3
211M    ./3.45.0
211M    ./3.46.0
211M    ./3.46.1
216M    ./3.47.0
216M    ./3.47.1
216M    ./3.47.2
216M    ./3.48.0
218M    ./3.49.0
220M    ./3.50.0
220M    ./3.50.1
220M    ./3.50.2
220M    ./3.51.0
221M    ./3.51.1
221M    ./3.52.0
221M    ./3.52.1
221M    ./3.53.0
221M    ./3.53.1
220M    ./3.54.0
220M    ./3.55.0
220M    ./3.56.0
220M    ./3.57.1
224M    ./3.58.0
224M    ./3.59.0
224M    ./3.59.1
224M    ./3.60.0
224M    ./3.60.1
224M    ./3.61.0
226M    ./3.61.1
226M    ./3.62.0
226M    ./3.63.0
226M    ./3.64.0
226M    ./3.65.0
226M    ./3.65.1
226M    ./3.66.0
267M    ./3.67.0
267M    ./3.67.1
266M    ./3.68.0
306M    ./3.69.0
302M    ./3.70.0
302M    ./3.71.0
299M    ./3.72.0
302M    ./3.72.1
302M    ./3.72.2
302M    ./3.73.0
302M    ./3.74.0
17G     .
[root@ip-10-100-5-234 pulumi]#

Expected Behavior

The Pulumi action should clean up after itself rather than leave the cleanup work to me.

Steps to reproduce

Normal use of the Pulumi Action.

Output of pulumi about

...

Additional context

No response

Contributing

Vote on this issue by adding a 👍 reaction.
To contribute a fix for this issue, leave a comment (and link to your pull request, if you've opened one already).

@alfred-stokespace added the kind/bug (Some behavior is incorrect or out of spec) and needs-triage (Needs attention from the triage team) labels on Jul 6, 2023
@alfred-stokespace changed the title from "pulumi versions build up over time consuming diskspace" to "pulumi versions build up over time consuming disk space" on Jul 6, 2023
@alfred-stokespace
Author

I maintain a fleet of GitHub runners, so cleaning up the other runners is now part of my day-to-day work and will require continual cleanup from here on out.

@justinvp removed the needs-triage (Needs attention from the triage team) label on Jul 11, 2023
@justinvp
Member

Sorry for the trouble, @alfred-stokespace.

It's not entirely clear to me what the best approach is for having the action clean up after itself. We may want to change it so that, rather than installing into versioned subdirectories, we install the CLI into a single directory (replacing any existing files), which would prevent build-up over time.

@simenandre
Contributor

AFAIK, this behaviour is from @actions/tool-cache.

I'm not sure how we can change any of this without moving away from that package, which I don't think we want to do.

@simenandre
Contributor

@alfred-stokespace You could probably add a job/step to your own workflow that does this cleanup as a fix for now (see the sketch below). It might be worth opening an issue on @actions/tool-cache if we confirm that is where this behaviour comes from.
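
For anyone who needs a stopgap on a self-hosted runner, a minimal sketch of such a cleanup step follows. It assumes the tool cache layout shown in the du output above (one subdirectory per installed CLI version, plus a .complete marker file that @actions/tool-cache writes next to each version) and a keep-only-the-latest-version policy; the path and the retention policy are assumptions to adapt, not part of the action itself.

# Hypothetical cleanup step for a self-hosted runner; the path matches the
# report above -- adjust it for your own setup.
TOOL_CACHE=/actions-runner/_work/_tool/pulumi

cd "$TOOL_CACHE" || exit 0

# Newest version directory by modification time, e.g. "3.74.0".
latest="$(ls -1td -- */ | head -n 1)"
latest="${latest%/}"

# Delete every other version directory and marker file.
for entry in *; do
  [ "$entry" = "$latest" ] && continue
  [ "$entry" = "$latest.complete" ] && continue
  rm -rf -- "$entry"
done

Run it as a scheduled job or as a final step in the workflows that use the action; any version that is no longer cached will simply be downloaded again on the next run.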
