
"Saving 'file.tf': Running 'Hashicorp Terraform' Formatter" - takes forever to run #1356

Closed
scretu opened this issue Mar 11, 2023 · 14 comments
Labels
bug: Something isn't working · performance: Gotta go fast · terraform-ls: Features/bugs which will be implemented/fixed purely on the LS side

Comments

@scretu

scretu commented Mar 11, 2023

Versions

Extension

HashiCorp Terraform: v2.25.4

VS Code

Version: 1.76.1
Release: 23069
Commit: 9561d503ffa83a1498d030540c4579e75c686489
Date: 2023-03-10T18:21:47.706Z
Electron: 19.1.9
Chromium: 102.0.5005.167
Node.js: 16.14.2
V8: 10.2.154.15-electron.0
OS: Linux x64 5.15.0-67-generic
Sandboxed: No

Operating System

Ubuntu 22.04.2 LTS

Terraform Version

$ terraform version
Terraform v1.2.9
on linux_amd64

Steps To Reproduce

Basically every time I click Save (or press Ctrl+S) on a .tf file, I see a notification in the bottom-right corner saying "Saving 'file.tf': Running 'Hashicorp Terraform' Formatter", and it takes anywhere from a couple of seconds to so long that I have to click "Cancel".

Expected Behavior

The file should be formatted and saved almost instantly.

Actual Behavior

It takes anywhere from a couple of seconds to so long that I have to click "Cancel".

Additional context

I've seen many previous reports of this bug. All of them are closed, but the issue still seems to be present.

@scretu added the bug label Mar 11, 2023
@avgalani

I expect it's the same issue as #1265; the currently recommended workaround is to downgrade to v2.23.0.

@scretu
Author

scretu commented Mar 18, 2023

@avgalani, unfortunately, it is still slow on v2.23.0 :(

@scretu
Author

scretu commented Mar 23, 2023

Hopefully this helps. Even with VSCodium just sitting open, the extension is consuming a lot of memory:

(screenshot attached: Screenshot_20230323_194829)

@scretu
Author

scretu commented Mar 23, 2023

After gracefully stopping VSCodium, two processes still linger:

$ ps faxu | grep terraform
silvian   256717  1.2  7.9 4332136 2582788 ?     Sl   mar22  21:37  \_ /home/silvian/.vscode-oss/extensions/hashicorp.terraform-2.23.0/bin/terraform-ls serve
silvian  3647588  2.7  8.3 4471344 2727008 ?     Sl   13:00  11:41  \_ /home/silvian/.vscode-oss/extensions/hashicorp.terraform-2.23.0/bin/terraform-ls serve

If I strace them, this is what they do:

$ sudo strace -p 256717 -ff
strace: Process 256717 attached with 24 threads
[pid 259418] futex(0xc044c1a150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 257184] epoll_pwait(3,  <unfinished ...>
[pid 257069] futex(0xc000880950, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 257068] futex(0xc000900d50, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 257067] futex(0xc000900950, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 257033] futex(0xc000091150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256790] futex(0xc000090d50, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256770] futex(0xc000090950, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256735] futex(0xc000a80550, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256734] futex(0xc000a00550, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256733] futex(0x1395438, FUTEX_WAKE_PRIVATE, 1 <unfinished ...>
[pid 256732] futex(0xc000880550, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256729] futex(0x13c3b38, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256727] futex(0x13c3be0, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256726] read(0,  <unfinished ...>
[pid 256725] futex(0xc000a00150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256724] futex(0xc000980150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256723] futex(0xc000900150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256722] futex(0xc000880150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256721] futex(0xc000090150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256720] futex(0xc000068950, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256719] futex(0xc000068550, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256718] restart_syscall(<... resuming interrupted read ...> <unfinished ...>
[pid 256717] futex(0x1393e10, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 256733] <... futex resumed>)       = 0
[pid 256718] <... restart_syscall resumed>) = -1 EAGAIN (Resource temporarily unavailable)
[pid 256733] epoll_wait(7,  <unfinished ...>
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=40000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=80000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=160000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=320000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=640000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=1280000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=2560000}, NULL) = 0
[pid 256718] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 256718] futex(0x1395438, FUTEX_WAIT_PRIVATE, 0, {tv_sec=60, tv_nsec=0}^Cstrace: Process 256717 detached
strace: Process 256718 detached
 <detached ...>
strace: Process 256719 detached
strace: Process 256720 detached
strace: Process 256721 detached
strace: Process 256722 detached
strace: Process 256723 detached
strace: Process 256724 detached
strace: Process 256725 detached
strace: Process 256726 detached
strace: Process 256727 detached
strace: Process 256729 detached
strace: Process 256732 detached
strace: Process 256733 detached
strace: Process 256734 detached
strace: Process 256735 detached
strace: Process 256770 detached
strace: Process 256790 detached
strace: Process 257033 detached
strace: Process 257067 detached
strace: Process 257068 detached
strace: Process 257069 detached
strace: Process 257184 detached
strace: Process 259418 detached

and

$ sudo strace -p 3647588 -ff
strace: Process 3647588 attached with 24 threads
[pid 3653247] futex(0xc06ea90150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3648971] futex(0xc0120b2150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3648050] futex(0xc0007c7d50, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647996] futex(0xc0007c7950, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647954] epoll_pwait(3,  <unfinished ...>
[pid 3647953] futex(0xc000900950, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647951] futex(0xc000880550, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647724] futex(0xc0007c6d50, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647665] futex(0xc0007c6950, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647626] futex(0xc000900550, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647602] futex(0x13c3b38, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647598] futex(0xc000a00150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647597] futex(0xc000068d50, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647594] futex(0xc000880150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647591] futex(0x13c3be0, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647589] restart_syscall(<... resuming interrupted read ...> <unfinished ...>
[pid 3647588] futex(0x1393e10, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647895] futex(0x1395438, FUTEX_WAKE_PRIVATE, 1 <unfinished ...>
[pid 3647629] futex(0xc000a00550, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647628] futex(0xc000980550, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647595] read(0,  <unfinished ...>
[pid 3647593] futex(0xc0007c6150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647590] futex(0xc000068550, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647589] <... restart_syscall resumed>) = -1 EAGAIN (Resource temporarily unavailable)
[pid 3647895] <... futex resumed>)      = 0
[pid 3647596] futex(0xc000980150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647589] futex(0xc000880150, FUTEX_WAKE_PRIVATE, 1) = 1
[pid 3647594] <... futex resumed>)      = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000},  <unfinished ...>
[pid 3647895] epoll_wait(7,  <unfinished ...>
[pid 3647594] futex(0xc000880150, FUTEX_WAIT_PRIVATE, 0, NULL <unfinished ...>
[pid 3647589] <... nanosleep resumed>NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=40000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=80000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=160000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=320000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=640000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=1280000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=2560000}, NULL) = 0
[pid 3647589] nanosleep({tv_sec=0, tv_nsec=20000}, NULL) = 0
[pid 3647589] futex(0x1395438, FUTEX_WAIT_PRIVATE, 0, {tv_sec=60, tv_nsec=0}^Cstrace: Process 3647588 detached
strace: Process 3647589 detached
 <detached ...>
strace: Process 3647590 detached
strace: Process 3647591 detached
strace: Process 3647593 detached
strace: Process 3647594 detached
strace: Process 3647595 detached
strace: Process 3647596 detached
strace: Process 3647597 detached
strace: Process 3647598 detached
strace: Process 3647602 detached
strace: Process 3647626 detached
strace: Process 3647628 detached
strace: Process 3647629 detached
strace: Process 3647665 detached
strace: Process 3647724 detached
strace: Process 3647895 detached
strace: Process 3647951 detached
strace: Process 3647953 detached
strace: Process 3647954 detached
strace: Process 3647996 detached
strace: Process 3648050 detached
strace: Process 3648971 detached
strace: Process 3653247 detached

@reed-hanger

Just adding on: I've had this issue on Windows pretty consistently for ~3 years now, across different versions of VS Code, Terraform, and the Terraform extension. It's pretty infuriating. It's always terraform-ls getting held up, and it tends to act the worst when I use search find/replace (Ctrl+Shift+H), since that saves multiple files at the same time. It will literally run for hours if I don't stop it. Sometimes restarting VS Code helps, but usually I just disable Format on Save and run terraform fmt in my folder, which processes all the files in less than 150 ms. I've given up on it ever being fixed at this point.
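For anyone who wants the same workaround: turn off format-on-save for Terraform files (via VS Code's `editor.formatOnSave` setting, scoped to `[terraform]`) and format from the CLI instead. A minimal sketch, assuming `terraform` is on your PATH; the flags shown are standard `terraform fmt` options:

```shell
# Format every .tf file in the current directory and all subdirectories.
terraform fmt -recursive

# Or just list which files would change, with a diff, without rewriting them.
terraform fmt -recursive -check -diff
```

In my experience the CLI run finishes in well under a second even on trees where the extension hangs.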

@xiehan added the terraform-ls and performance labels Apr 21, 2023
@jseiser

jseiser commented Apr 25, 2023

I'm having this issue on both the normal and beta releases of this extension.

terraform-lsp behaves fine in Neovim, so I assume it's something VS Code-related.

We do use large monorepos, and I've tried to make sure I only open VS Code in as deep a directory as possible, but that hasn't improved anything.

@dbanck
Member

dbanck commented Apr 28, 2023

Thank you for taking the time to report this issue.

We released extension version 2.26.0 (with language server version 0.31.1), which contains a potential fix. Please try it and let us know whether it solves this issue.

@27Bslash6

Same behaviour on the latest version. One reasonably large Terraform module with many (~100) resources and submodules.

Sometimes (erratically) takes up to 20 minutes to save a file.

@radeksimko
Member

We plan to make some changes as part of hashicorp/terraform-ls#1056 to help us understand situations like this (performance). While CPU/memory profiles can already be collected, these would not be as useful without context, such as some understanding of the workspace: number of files and folders, sizes of files, etc.

In the meantime, the only suggestion I can make is to avoid opening large/deep workspaces with lots of folders, especially monorepos; i.e. open only the one or few individual folders that you actually need to edit at any given time. This may not be the most convenient, but it should help avoid most performance issues for now.

Thank you for your patience and understanding.
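For anyone attaching workspace context to a report like this, a quick sketch for gathering rough size numbers (excluding `.terraform` is my assumption, since vendored modules can inflate the counts):

```shell
# Rough workspace stats to attach to a performance report:
# how many .tf files and directories the language server would see,
# ignoring the .terraform vendor directory.
tf_files=$(find . -name '*.tf' -not -path '*/.terraform/*' | wc -l)
dirs=$(find . -type d -not -path '*/.terraform/*' | wc -l)
echo "tf files: $tf_files, directories: $dirs"
```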

@adrianisk

I had to disable format on save because it consistently took 5+ seconds to save a file, even in a relatively small repo. While trying to figure out what was going on, I noticed that the extension enqueues some work on every keystroke within a .tf file. This happens without even saving the file; you can watch the output as you type:

2023/06/07 16:30:42 opts.go:203: Received request batch of size 1 (qlen=60)
2023/06/07 16:30:43 opts.go:203: Received request batch of size 1 (qlen=61)
2023/06/07 16:30:43 opts.go:203: Received request batch of size 1 (qlen=62)
2023/06/07 16:30:43 opts.go:203: Received request batch of size 1 (qlen=63)
2023/06/07 16:30:44 opts.go:203: Received request batch of size 1 (qlen=64)
2023/06/07 16:30:45 opts.go:203: Received request batch of size 1 (qlen=65)
2023/06/07 16:30:45 opts.go:203: Received request batch of size 1 (qlen=66)
2023/06/07 16:30:45 opts.go:203: Received request batch of size 1 (qlen=67)
2023/06/07 16:30:45 opts.go:203: Received request batch of size 1 (qlen=68)
2023/06/07 16:30:46 opts.go:203: Received request batch of size 1 (qlen=69)
2023/06/07 16:30:46 opts.go:203: Received request batch of size 1 (qlen=70)
2023/06/07 16:30:46 opts.go:203: Received request batch of size 1 (qlen=71)
2023/06/07 16:30:46 opts.go:203: Received request batch of size 1 (qlen=72)
2023/06/07 16:30:46 opts.go:203: Received request batch of size 1 (qlen=73)
2023/06/07 16:30:47 opts.go:203: Received request batch of size 1 (qlen=74)
2023/06/07 16:30:47 opts.go:203: Received request batch of size 1 (qlen=75)
2023/06/07 16:30:47 opts.go:203: Received request batch of size 1 (qlen=76)
2023/06/07 16:30:47 opts.go:203: Received request batch of size 1 (qlen=77)
2023/06/07 16:30:47 opts.go:203: Received request batch of size 1 (qlen=78)
2023/06/07 16:30:47 opts.go:203: Received request batch of size 1 (qlen=79)

@staranto

staranto commented Aug 3, 2023

I know "me too" doesn't help, but... me too. So frustrating.

> uname
Linux pop-os 6.2.6-76060206-generic #202303130630168901512522.04~ab2190e SMP PREEMPT_DYNAMIC Mon J x86_64 x86_64 x86_64 GNU/Linux
> code --version
1.81.0
6445d93c81ebe42c4cbd7a60712e0b17d9463e97
x64
> terraform-ls --version
0.31.1

Extension is currently 2.27.1, though I've tried them all.

Also happens on an Intel Mac w/same versions.

I've downgraded, upgraded, uninstalled, and reinstalled so many times over the last year that I've completely lost track. I have a small root project that sources a private registry module. That module has 5 sub-modules, each of which has 1-3 .tf files and creates no more than ~5 resources per module {} call. Collectively, the root project references one of the child modules ~20 times (in other words, the root project has ~20 module { source = "<>//submod" } blocks). I don't know if that's considered deeply nested, but it's deeper than some.

Perhaps LOC is a useful metric to get a sense of size and complexity...

The root project has 11 .tf's and 512 LOC.
The referenced module (with the 5 sub-modules) has 16 .tf's and 505 LOC.

It doesn't seem too overwhelming. I will say there are some pretty hairy, complex statements using functions, conditional operators, for loops, etc., to build out locals. All of those are in the module. Again, a vague bullet point; no idea if it's relevant.

My observation is that IntelliSense seems to perform reasonably, in that it pops up quickly enough that I don't particularly notice. But file save with format on save is horrendously slow. A simple 10-line .tf might take >10 minutes. I would say saving a file is never shorter than ~10 seconds and often nears 20 minutes if I have enough patience to let it ride.

@jpogran
Contributor

jpogran commented Aug 18, 2023

We know this has been a frustrating experience and appreciate your understanding while we gathered feedback and examples to diagnose this issue. We've created #1557 as a central place to track what we are doing to address performance issues in terraform-ls in both the short and long term, and we have pinned it to the repo. We'll be adding more detail there as we implement fixes. As we work on this, we'll record the relevant content and then close the individual issues, so that everyone has one place to look instead of searching individual tickets for updates.

@jpogran
Contributor

jpogran commented Sep 13, 2023

We've released 2.27.2, which contains two fixes (hashicorp/terraform-ls#1369, hashicorp/terraform-ls#1372) in our continuing efforts to address this issue.

With the fix, we should be back to terraform-ls v0.29.2 levels of CPU usage while maintaining the memory improvements of v0.29.3.

If you have the time, please give it a try and let us know how it works for you. Please open a new issue rather than replying here, as this issue has accumulated so many replies describing different experiences that it has become hard to process.

I am going to close this as we're using #1557 as a central tracking issue, so subscribe to that to see continued updates.

@jpogran closed this as completed Sep 13, 2023
@github-actions

I'm going to lock this issue because it has been closed for 30 days ⏳. This helps our maintainers find and focus on the active issues.
If you have found a problem that seems similar to this, please open a new issue and complete the issue template so we can capture all the details necessary to investigate further.

github-actions bot locked as resolved and limited conversation to collaborators Oct 14, 2023