Probabilistic panic when create clusters with multiple nodes #1151

Closed · cumirror opened this issue on Mar 24, 2022 · 0 comments · Fixed by #1152
Labels: bug (Something isn't working)

Comments

@cumirror

What version of KubeKey has the issue?

v2.0.0

What is your OS environment?

Ubuntu20.04

KubeKey config file

No response

A clear and concise description of what happened.

When creating a cluster with multiple nodes, kk panics probabilistically. My environment has 3 master nodes and 3 worker nodes, and kk panicked while creating the cluster.

It seems that with many nodes the join tasks in JoinNodesModule are executed concurrently; each of them runs GenerateKubeadmConfig, which calls util.GetArgs, resulting in concurrent reads and writes of the map v1beta2.ApiServerArgs.
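
To make the failure mode concrete, the following is a minimal, self-contained sketch of the pattern. The names defaultArgs and getArgs are illustrative stand-ins for v1beta2.ApiServerArgs and util.GetArgs, not KubeKey's actual code. Several goroutines (one per joining node) merge per-node overrides into the same package-level map, which the Go runtime aborts with "fatal error: concurrent map writes":

package main

import "sync"

// Shared, package-level defaults (a stand-in for v1beta2.ApiServerArgs).
var defaultArgs = map[string]string{
	"bind-address":     "0.0.0.0",
	"audit-log-maxage": "30",
}

// getArgs merges per-node overrides directly into the shared map,
// the same unsafe pattern described above for util.GetArgs.
func getArgs(overrides map[string]string) map[string]string {
	for k, v := range overrides {
		defaultArgs[k] = v // concurrent write to a shared map
	}
	return defaultArgs
}

func main() {
	var wg sync.WaitGroup
	for i := 0; i < 8; i++ { // one GenerateKubeadmConfig-style task per node
		wg.Add(1)
		go func() {
			defer wg.Done()
			getArgs(map[string]string{"audit-log-maxage": "7"})
		}()
	}
	wg.Wait()
}

With enough concurrent callers this usually dies with the same fatal error shown in the log below; running it under Go's race detector (go run -race) reports the data race deterministically even when the runtime check does not fire.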

Relevant log output

Generate kubeadm config
fatal error: concurrent map writes

goroutine 3011 [running]:
runtime.throw({0x218e49b, 0xc000412fc0})
  /usr/local/go/src/runtime/panic.go:1198 +0x71 fp=0xc0008ff938 sp=0xc0008ff908 pc=0x434431
runtime.mapassign_faststr(0xc000412fc0, 0x23, {0xc000412fc0, 0x17})
  /usr/local/go/src/runtime/map_faststr.go:211 +0x39c fp=0xc0008ff9a0 sp=0xc0008ff938 pc=0x412cbc
github.com/kubesphere/kubekey/pkg/core/util.GetArgs(0x21aae00, {0xc0006bd780, 0x2, 0x4})
  /usr/src/myapp/pkg/core/util/util.go:100 +0x2d6 fp=0xc0008ffad0 sp=0xc0008ff9a0 pc=0x620216
github.com/kubesphere/kubekey/pkg/kubernetes.(*GenerateKubeadmConfig).Execute(0xc000a9b6e0, {0x257d120, 0xc000f12780})
  /usr/src/myapp/pkg/kubernetes/tasks.go:240 +0x2fe fp=0xc0008ffe58 sp=0xc0008ffad0 pc=0x1bae3fe
github.com/kubesphere/kubekey/pkg/core/task.(*RemoteTask).ExecuteWithRetry(0xc0008d72b0, {0x257d120, 0xc000f12780})
  /usr/src/myapp/pkg/core/task/remote_task.go:214 +0x11c fp=0xc0008fff20 sp=0xc0008ffe58 pc=0x9c563c
github.com/kubesphere/kubekey/pkg/core/task.(*RemoteTask).Run(0xc0008d72b0, {0x257d120, 0xc000f12780}, {0x2585cd8, 0xc00010cf40}, 0xaf26, 0x7365636375530700)
  /usr/src/myapp/pkg/core/task/remote_task.go:150 +0x1c5 fp=0xc0008fff98 sp=0xc0008fff20 pc=0x9c4da5
github.com/kubesphere/kubekey/pkg/core/task.(*RemoteTask).RunWithTimeout·dwrap·4()
  /usr/src/myapp/pkg/core/task/remote_task.go:109 +0x3e fp=0xc0008fffe0 sp=0xc0008fff98 pc=0x9c4b9e
runtime.goexit()
  /usr/local/go/src/runtime/asm_amd64.s:1581 +0x1 fp=0xc0008fffe8 sp=0xc0008fffe0 pc=0x464421
created by github.com/kubesphere/kubekey/pkg/core/task.(*RemoteTask).RunWithTimeout
  /usr/src/myapp/pkg/core/task/remote_task.go:109 +0x16f
...

goroutine 2990 [runnable]:
github.com/kubesphere/kubekey/pkg/core/util.GetArgs(0x21aae00, {0x378a9a0, 0x0, 0x0})
  /usr/src/myapp/pkg/core/util/util.go:103 +0xb2
github.com/kubesphere/kubekey/pkg/kubernetes.(*GenerateKubeadmConfig).Execute(0xc000a9b6e0, {0x257d120, 0xc000f12300})
  /usr/src/myapp/pkg/kubernetes/tasks.go:241 +0x33a
github.com/kubesphere/kubekey/pkg/core/task.(*RemoteTask).ExecuteWithRetry(0xc0008d72b0, {0x257d120, 0xc000f12300})
  /usr/src/myapp/pkg/core/task/remote_task.go:214 +0x11c
github.com/kubesphere/kubekey/pkg/core/task.(*RemoteTask).Run(0xc0008d72b0, {0x257d120, 0xc000f12300}, {0x2585cd8, 0xc00010ce60}, 0x0, 0x0)
  /usr/src/myapp/pkg/core/task/remote_task.go:150 +0x1c5
created by github.com/kubesphere/kubekey/pkg/core/task.(*RemoteTask).RunWithTimeout
  /usr/src/myapp/pkg/core/task/remote_task.go:109 +0x16f
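
Both goroutines in the trace are inside util.GetArgs for different hosts, which matches the shared-map explanation above. One conventional way to break such a race, sketched here against the defaultArgs map from the earlier example (this is only an illustration and may differ from the actual change in #1152), is to copy the shared defaults into a caller-local map before merging overrides:

// getArgsSafe never writes to the shared defaults: it copies them into a
// fresh map on every call and applies the overrides to that copy only.
func getArgsSafe(overrides map[string]string) map[string]string {
	merged := make(map[string]string, len(defaultArgs)+len(overrides))
	for k, v := range defaultArgs { // shared defaults are only ever read
		merged[k] = v
	}
	for k, v := range overrides { // writes touch only the local copy
		merged[k] = v
	}
	return merged
}

Guarding the shared map with a sync.Mutex would also remove the panic, but copying keeps the helper free of side effects on package-level state.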

Additional information

No response

cumirror added the bug (Something isn't working) label on Mar 24, 2022
cumirror pushed a commit to cumirror/kubekey that referenced this issue Mar 24, 2022
…ltiple nodes

Signed-off-by: cumirror <jacksontong@yunify.com>
ks-ci-bot added a commit that referenced this issue Mar 24, 2022
Fix #1151: Probabilistic panic when create clusters with multiple nodes