
cmd/compile: pgo can dramatically increase goroutine stack size #65532

@felixge

Description

Go version

go1.21.5

Output of go env in your module/workspace:

GOARCH=amd64
GOOS=linux
GOAMD64=v1

What did you do?

Build my application using a default.pgo CPU profile from production.

What did you see happen?

Go memory usage (/memory/classes/total:bytes − /memory/classes/heap/released:bytes) increased from 720 MB to 850 MB (18%) until rollback, see below.

[Datadog graph: Go memory usage before and after the PGO rollout]

This increase in memory usage seems to have been caused by an increase in goroutine stack size (/memory/classes/heap/stacks:bytes) from 207 MB to 280 MB (35%).

[Datadog graph: goroutine stack size, /memory/classes/heap/stacks:bytes]

This increase was not due to a rise in the number of active goroutines, but to an increase in the average stack size (/memory/classes/heap/stacks:bytes divided by /sched/goroutines:goroutines).

[Datadog graph: average stack size per goroutine]

To debug this further, I built a hacky goroutine stack frame profiler. This pointed me to google.golang.org/grpc/internal/transport.(*loopyWriter).run.

For the binary compiled without pgo, my tool estimated 2MB of stack usage for ~1000 goroutines:

[Screenshot: goroutine stack frame profile without PGO]

And for the binary compiled with pgo, my tool estimated 71MB of stack usage for ~1000 goroutines:

[Screenshot: goroutine stack frame profile with PGO]

Looking at the assembly, it becomes clear that this is due to the frame size increasing from 0x50 (80) bytes to 0xc1f8 (49656) bytes.

assembly

before pgo:

TEXT google.golang.org/grpc/internal/transport.(*loopyWriter).run(SB) /go/pkg/mod/google.golang.org/grpc@v1.58.2/internal/transport/controlbuf.go
  0x8726e0              493b6610                CMPQ SP, 0x10(R14)                   // cmp 0x10(%r14),%rsp
  0x8726e4              0f86ab020000            JBE 0x872995                         // jbe 0x872995
  0x8726ea              55                      PUSHQ BP                             // push %rbp
  0x8726eb              4889e5                  MOVQ SP, BP                          // mov %rsp,%rbp
  0x8726ee              4883ec50                SUBQ $0x50, SP                       // sub $0x50,%rsp

after pgo:

TEXT google.golang.org/grpc/internal/transport.(*loopyWriter).run(SB) /go/pkg/mod/google.golang.org/grpc@v1.58.2/internal/transport/controlbuf.go
  0x8889a0              4989e4                          MOVQ SP, R12                         // mov %rsp,%r12
  0x8889a3              4981ec80c10000                  SUBQ $0xc180, R12                    // sub $0xc180,%r12
  0x8889aa              0f82c0300000                    JB 0x88ba70                          // jb 0x88ba70
  0x8889b0              4d3b6610                        CMPQ R12, 0x10(R14)                  // cmp 0x10(%r14),%r12
  0x8889b4              0f86b6300000                    JBE 0x88ba70                         // jbe 0x88ba70
  0x8889ba              55                              PUSHQ BP                             // push %rbp
  0x8889bb              4889e5                          MOVQ SP, BP                          // mov %rsp,%rbp
  0x8889be              4881ecf8c10000                  SUBQ $0xc1f8, SP                     // sub $0xc1f8,%rsp

And the root cause for this appears to be the inlining of 3 calls to processData, each of which allocates a 16 KiB byte array on its stack.

What did you expect to see?

No significant increase in memory usage.

Maybe PGO could take frame sizes into account when making inlining decisions, especially when multiple calls to a function with a large frame are inlined into the same caller.

Meanwhile, maybe we should send a PR that adds a //go:noinline pragma to the processData func in gRPC. Given the current code structure, it seems highly undesirable to inline this function up to 3 times in the run method.

cc @prattmic

Metadata

Labels: NeedsInvestigation (someone must examine and confirm this is a valid issue and not a duplicate of an existing one), compiler/runtime (issues related to the Go compiler and/or runtime)