"Saving 'file.tf': Running 'Hashicorp Terraform' Formatter" - takes forever to run #1356
Comments
I expect it's the same issue as #1265; the recommended current workaround is to downgrade to v2.23.0.
@avgalani, unfortunately it is still slow on v2.23.0 :(
After gracefully stopping VSCodium, 2 processes still linger around.
If I strace them, this is what they do:
and
Just adding on: I've had this issue on Windows pretty consistently for ~3 years now, throughout different versions of VS Code, Terraform, and the Terraform extension. It's pretty infuriating. It's always terraform-ls getting held up, and it tends to act the worst when I use find-and-replace across files (Ctrl + Shift + H), as that saves multiple files at the same time. It will literally run for hours if I don't stop it. Sometimes restarting VS Code helps, but usually I just disable Format on Save and run the formatter manually.
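For anyone applying that workaround, format on save can be disabled for Terraform files only (rather than globally) with a language-scoped entry in VS Code's `settings.json`. This assumes your .tf files are associated with the `terraform` language ID, which the HashiCorp extension sets up by default:

```json
{
  "[terraform]": {
    "editor.formatOnSave": false
  }
}
```

Formatting can then be run on demand with the Format Document command, or with `terraform fmt` in a terminal.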
Having this issue on both the normal and beta releases of this extension. terraform-lsp behaves fine in Neovim, so I assume it's something VS Code-related. We do use large monorepos, and I've tried to make sure I only open VS Code as far down the directory tree as possible, but that hasn't improved anything.
Thank you for taking the time to report this issue. We released extension version 2.26.0 (and language server version 0.31.1) with a potential fix, so please try it and let us know if it solves this issue.
Same behaviour on the latest version. One reasonably large Terraform module with many (~100) resources and submodules. Sometimes (erratically) it takes up to 20 minutes to save a file.
We plan to make some changes as part of hashicorp/terraform-ls#1056 to help us understand situations like this (performance). While CPU/memory profiles can already be collected, these would not be as useful without context (such as some understanding of the workspace: number of files and folders, file sizes, etc.). In the meantime, the only suggestion I can make is to avoid opening large/deep workspaces with lots of folders, especially monorepos; i.e. open only the one or few individual folders that you actually need to edit at any given time. This may not be the most convenient, but it should help avoid most performance issues for now. Thank you for your patience and understanding.
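One way to follow that suggestion without giving up multi-folder editing is a multi-root workspace file that lists only the folders currently being worked on. A minimal sketch (the paths below are placeholders, not from any repo in this thread):

```json
{
  "folders": [
    { "path": "modules/network" },
    { "path": "envs/prod" }
  ]
}
```

Saved as e.g. `infra.code-workspace` and opened via File > Open Workspace from File, this keeps the language server's indexing scoped to those folders instead of the whole monorepo.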
I had to disable format on save because it consistently took 5+ seconds to save a file, even in a relatively small repo. While I was trying to figure out what was going on, I noticed that the extension enqueues some work on every keystroke within a .tf file. This happens without even saving the file; you can watch the output as you type.
I know "me too" doesn't help, but... me too. So frustrating. The extension is currently 2.27.1, though I've tried them all. It also happens on an Intel Mac with the same versions. I've downgraded, upgraded, uninstalled, and reinstalled so many times over the last year that I've completely lost track.

I have a small root project that sources a private registry module. That module has 5 submodules, each of which has 1-3 .tf files and creates no more than ~5 resources per module {} call. Collectively, the root project references one of the child modules ~20 times (in other words, the root project has ~20 module { source = "<>//submod" } blocks). I don't know if that's considered deeply nested, but it's deeper than some. Perhaps LOC is a useful metric to get a sense of size and complexity: the root project has 11 .tf files and 512 LOC. Doesn't seem too overwhelming. I will say there are some pretty hairy, complex statements using functions, conditional operations, for loops, etc. to build out locals. All of those are in the module. Again, a vague bullet point; no idea if it's relevant.

My observation is that IntelliSense seems to perform reasonably, in that it pops up quickly enough that I don't particularly notice. But file save with format on save is horrendously slow. A simple, 10-line .tf might take >10 minutes. I would say saving a file is never shorter than ~10 seconds and often nears 20 minutes if I have enough patience to let it ride.
We know this has been a frustrating experience and appreciate your understanding while we gathered feedback and examples to diagnose this issue. We've created #1557 as a central place to see updates on what we are doing to address performance issues in terraform-ls in both the short and long term and pinned it to the repo. We'll be adding more detail there as we implement fixes. As we work on this we'll be recording the content and then closing the individual issues so that everyone has one place to look at instead of searching for individual tickets for updates. |
We've released a fix; with it, terraform-ls should be back to its expected performance. If you have the time, please give it a try and let us know how it works for you. Please open a new issue rather than replying here, as this issue has gathered so many replies with different experiences that it is hard to process. I am going to close this, as we're using #1557 as the central tracking issue; subscribe to that to see continued updates.
I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues. |
Versions
Extension
VS Code
Operating System
Terraform Version
Steps To Reproduce
Basically every time I click Save (or press Ctrl+S) on a .tf file, I see the notification in the bottom right corner saying "Saving 'file.tf': Running 'Hashicorp Terraform' Formatter", and it takes from a couple of seconds to... so many seconds that I have to click "Cancel".
Expected Behavior
The file should be formatted and saved in an instant.
Actual Behavior
It takes from a couple of seconds to... so many seconds that I have to click "Cancel"
Additional context
I saw many instances of this bug being reported... all of them are closed, but the issue still seems to be present.