
data race in thread pool #455

Closed
CharlesCheung96 opened this issue Nov 5, 2024 · 0 comments · Fixed by #459
CharlesCheung96 (Collaborator) commented:
Data race observed when running a command against ticdc (new arch).
==================
WARNING: DATA RACE
Write at 0x00c004d1bd20 by goroutine 2651:
  github.com/pingcap/ticdc/utils/threadpool.(*waitReactor).scheduleTaskLoop()
      /root/ticdc/utils/threadpool/wait_reactor.go:129 +0x284
  github.com/pingcap/ticdc/utils/threadpool.newWaitReactor.gowrap1()
      /root/ticdc/utils/threadpool/wait_reactor.go:101 +0x33

Previous read at 0x00c004d1bd20 by goroutine 2652:
  github.com/pingcap/ticdc/utils/threadpool.(*waitReactor).executeTaskLoop()
      /root/ticdc/utils/threadpool/wait_reactor.go:182 +0x349
  github.com/pingcap/ticdc/utils/threadpool.newWaitReactor.gowrap2()
      /root/ticdc/utils/threadpool/wait_reactor.go:102 +0x33

Goroutine 2651 (running) created at:
  github.com/pingcap/ticdc/utils/threadpool.newWaitReactor()
      /root/ticdc/utils/threadpool/wait_reactor.go:101 +0x305
  github.com/pingcap/ticdc/utils/threadpool.newThreadPoolImpl()
      /root/ticdc/utils/threadpool/thread_pool.go:27 +0x184
  github.com/pingcap/ticdc/utils/threadpool.NewThreadPool()
      /root/ticdc/utils/threadpool/task.go:68 +0xc4
  github.com/pingcap/ticdc/utils/threadpool.NewThreadPoolDefault()
      /root/ticdc/utils/threadpool/task.go:63 +0x84
  github.com/pingcap/ticdc/downstreamadapter/dispatcher.GetDispatcherTaskScheduler.func1()
      /root/ticdc/downstreamadapter/dispatcher/helper.go:213 +0x8f
  sync.(*Once).doSlow()
      /root/go/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.2.linux-amd64/src/sync/once.go:76 +0xe1
  sync.(*Once).Do()
      /root/go/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.2.linux-amd64/src/sync/once.go:67 +0x44
  github.com/pingcap/ticdc/downstreamadapter/dispatcher.GetDispatcherTaskScheduler()
      /root/ticdc/downstreamadapter/dispatcher/helper.go:212 +0xa6
  github.com/pingcap/ticdc/downstreamadapter/dispatcher.newResendTask()
      /root/ticdc/downstreamadapter/dispatcher/helper.go:185 +0x7d
  github.com/pingcap/ticdc/downstreamadapter/dispatcher.(*Dispatcher).dealWithBlockEvent()
      /root/ticdc/downstreamadapter/dispatcher/dispatcher.go:409 +0x10f2
  github.com/pingcap/ticdc/downstreamadapter/dispatcher.(*Dispatcher).HandleEvents()
      /root/ticdc/downstreamadapter/dispatcher/dispatcher.go:284 +0x1037
  github.com/pingcap/ticdc/downstreamadapter/dispatcher.(*EventsHandler).Handle()
      /root/ticdc/downstreamadapter/dispatcher/helper.go:248 +0xa4
  github.com/pingcap/ticdc/utils/dynstream.(*stream[go.shape.string,go.shape.struct { github.com/pingcap/ticdc/pkg/common.low uint64; github.com/pingcap/ticdc/pkg/common.high uint64 },go.shape.struct { github.com/pingcap/ticdc/pkg/common/event.Event },go.shape.*uint8,go.shape.*uint8]).handleLoop()
      /root/ticdc/utils/dynstream/stream.go:319 +0x8f8
  github.com/pingcap/ticdc/utils/dynstream.(*stream[go.shape.string,go.shape.struct { github.com/pingcap/ticdc/pkg/common.low uint64; github.com/pingcap/ticdc/pkg/common.high uint64 },go.shape.struct { github.com/pingcap/ticdc/pkg/common/event.Event },go.shape.*uint8,go.shape.*uint8]).start.gowrap1()
      /root/ticdc/utils/dynstream/stream.go:213 +0xaa

Goroutine 2652 (running) created at:
  github.com/pingcap/ticdc/utils/threadpool.newWaitReactor()
      /root/ticdc/utils/threadpool/wait_reactor.go:102 +0x376
  github.com/pingcap/ticdc/utils/threadpool.newThreadPoolImpl()
      /root/ticdc/utils/threadpool/thread_pool.go:27 +0x184
  github.com/pingcap/ticdc/utils/threadpool.NewThreadPool()
      /root/ticdc/utils/threadpool/task.go:68 +0xc4
  github.com/pingcap/ticdc/utils/threadpool.NewThreadPoolDefault()
      /root/ticdc/utils/threadpool/task.go:63 +0x84
  github.com/pingcap/ticdc/downstreamadapter/dispatcher.GetDispatcherTaskScheduler.func1()
      /root/ticdc/downstreamadapter/dispatcher/helper.go:213 +0x8f
  sync.(*Once).doSlow()
      /root/go/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.2.linux-amd64/src/sync/once.go:76 +0xe1
  sync.(*Once).Do()
      /root/go/pkg/mod/golang.org/toolchain@v0.0.1-go1.23.2.linux-amd64/src/sync/once.go:67 +0x44
  github.com/pingcap/ticdc/downstreamadapter/dispatcher.GetDispatcherTaskScheduler()
      /root/ticdc/downstreamadapter/dispatcher/helper.go:212 +0xa6
  github.com/pingcap/ticdc/downstreamadapter/dispatcher.newResendTask()
      /root/ticdc/downstreamadapter/dispatcher/helper.go:185 +0x7d
  github.com/pingcap/ticdc/downstreamadapter/dispatcher.(*Dispatcher).dealWithBlockEvent()
      /root/ticdc/downstreamadapter/dispatcher/dispatcher.go:409 +0x10f2
  github.com/pingcap/ticdc/downstreamadapter/dispatcher.(*Dispatcher).HandleEvents()
      /root/ticdc/downstreamadapter/dispatcher/dispatcher.go:284 +0x1037
  github.com/pingcap/ticdc/downstreamadapter/dispatcher.(*EventsHandler).Handle()
      /root/ticdc/downstreamadapter/dispatcher/helper.go:248 +0xa4
  github.com/pingcap/ticdc/utils/dynstream.(*stream[go.shape.string,go.shape.struct { github.com/pingcap/ticdc/pkg/common.low uint64; github.com/pingcap/ticdc/pkg/common.high uint64 },go.shape.struct { github.com/pingcap/ticdc/pkg/common/event.Event },go.shape.*uint8,go.shape.*uint8]).handleLoop()
      /root/ticdc/utils/dynstream/stream.go:319 +0x8f8
  github.com/pingcap/ticdc/utils/dynstream.(*stream[go.shape.string,go.shape.struct { github.com/pingcap/ticdc/pkg/common.low uint64; github.com/pingcap/ticdc/pkg/common.high uint64 },go.shape.struct { github.com/pingcap/ticdc/pkg/common/event.Event },go.shape.*uint8,go.shape.*uint8]).start.gowrap1()
      /root/ticdc/utils/dynstream/stream.go:213 +0xaa
==================