
Deadlock in SharedLock #2905

Closed
project-snail opened this issue Jun 28, 2024 · 5 comments · Fixed by #2961
Labels: type: bug (A general bug)

Comments

@project-snail

Bug Report

When using the reactive (Reactor) client and performing Redis operations inside an onErrorResume block,
there is a chance of a thread deadlock that can stall all of the entry-point threads.
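
For reference, the pattern described above looks roughly like the following. This is a minimal sketch of my own, not the attached demo; the key names, the Redis URI, and the connection setup are assumptions.

```java
import io.lettuce.core.RedisClient;
import io.lettuce.core.api.StatefulRedisConnection;
import io.lettuce.core.api.reactive.RedisReactiveCommands;
import reactor.core.publisher.Mono;

public class OnErrorResumeSketch {

    public static void main(String[] args) {
        RedisClient client = RedisClient.create("redis://127.0.0.1:6379");
        StatefulRedisConnection<String, String> connection = client.connect();
        RedisReactiveCommands<String, String> commands = connection.reactive();

        // A GET whose error path issues another Redis command on the same connection.
        // According to this report, running commands inside onErrorResume while
        // Lettuce is reconnecting can end up in the deadlock described below.
        Mono<String> result = commands.get("some-key")
                .onErrorResume(e -> commands.get("fallback-key"));

        result.subscribe(System.out::println, Throwable::printStackTrace);
    }
}
```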

Reproduction steps are as follows:

  1. Start Redis normally
  2. Start the service
  3. ab -c 3 -n 100000 http://127.0.0.1:18080/lettuce/get (simulate normal traffic)
  4. Stop Redis (simulate Redis failure)
  5. Start any process on the Redis port (simulating that the Redis IP and port are reachable while Redis is not yet serving requests; see the sketch after this list for one way to do this)
  6. Wait 30 seconds (wait for Lettuce to automatically reconnect to Redis, triggering the deadlock issue)
  7. After stopping ab, the HTTP service still does not recover, and a jstack dump confirms that a deadlock has occurred.
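
One way to carry out step 5 is to listen on the Redis port without speaking RESP, so the port is reachable but no reply ever arrives. A minimal sketch (my own illustration, assuming the default port 6379; it is not part of the attached demo):

```java
import java.net.ServerSocket;
import java.net.Socket;
import java.util.ArrayList;
import java.util.List;

public class FakeRedisPort {

    public static void main(String[] args) throws Exception {
        List<Socket> held = new ArrayList<>();
        try (ServerSocket server = new ServerSocket(6379)) {
            while (true) {
                // Accept connections and keep them open without ever replying,
                // so clients see an open port but never get a Redis response.
                held.add(server.accept());
            }
        }
    }
}
```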

The jstack output shows threads stuck in SharedLock.lockWritersExclusive and SharedLock.incrementWriters.
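
For readers unfamiliar with the class, a heavily simplified, hypothetical sketch of the failure mode those frames point at is shown below. It is not Lettuce's actual SharedLock; it only illustrates how a non-reentrant writer/exclusive lock deadlocks when the thread that holds it exclusively re-enters it, for example via a callback that issues another command while the connection is being reset.

```java
// Hypothetical, simplified lock reusing the method names from the jstack frames.
// This is an illustration of the general deadlock pattern, not Lettuce's code.
class NonReentrantSharedLock {

    private int writers = 0; // > 0: shared holders, -1: held exclusively

    synchronized void incrementWriters() throws InterruptedException {
        while (writers < 0) { // wait until nobody holds the lock exclusively
            wait();           // the exclusive holder itself would wait here forever
        }
        writers++;
    }

    synchronized void decrementWriters() {
        writers--;
        notifyAll();
    }

    synchronized void lockWritersExclusive() throws InterruptedException {
        while (writers != 0) { // wait until there are no shared holders
            wait();
        }
        writers = -1;
    }

    synchronized void unlockWritersExclusive() {
        writers = 0;
        notifyAll();
    }
}
```

If the thread that called lockWritersExclusive() runs a callback that calls incrementWriters() before unlocking, writers is still -1 and the thread waits on itself; every other caller then queues behind it, which matches the symptom of all entry-point threads hanging.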

Environment

  • lettuce-core 6.1.9.RELEASE

Full demo code is attached: demo_code.zip

@tishun tishun added the type: bug A general bug label Jun 28, 2024
@tishun tishun added this to the Backlog milestone Jun 28, 2024
tishun (Collaborator) commented on Jun 28, 2024

Quite possibly the same problem as the one in #2879

tishun (Collaborator) commented on Jun 28, 2024

@project-snail thank you for the detailed bug report!
I think we actually need to prioritize this fix for one of the coming versions.

@tishun tishun modified the milestones: Backlog, 7.x Jun 28, 2024
@tishun tishun changed the title from "When using the reactor client and performing Redis operations within the onErrorResume block, there is a chance of encountering thread deadlock. This can impact all threads at the entry points." to "Deadlock in SharedLock" on Jul 2, 2024
thachlp (Contributor) commented on Jul 3, 2024

@project-snail
I am a new contributor and would like to work on this.
I tried to reproduce the issue, but I can't.

In step 3, here is the output:

➜  ~ ab -c 3 -n 100000 http://127.0.0.1:18080/lettuce/get 
This is ApacheBench, Version 2.3 <$Revision: 1903618 $>
Copyright 1996 Adam Twiss, Zeus Technology Ltd, http://www.zeustech.net/
Licensed to The Apache Software Foundation, http://www.apache.org/

Benchmarking 127.0.0.1 (be patient)
Completed 10000 requests
apr_socket_recv: Operation timed out (60)
Total of 16333 requests completed

Is this what you expect?

In step 5, what should I do? I am running Redis via Docker.

Thanks 🙇

tishun (Collaborator) commented on Aug 2, 2024

@thachlp did you also check the scenario in #2422?

thachlp (Contributor) commented on Aug 6, 2024

@tishun not yet, let me try

Roiocam added a commit to Roiocam/lettuce that referenced this issue Aug 16, 2024
Roiocam added a commit to Roiocam/lettuce that referenced this issue Aug 16, 2024
@tishun tishun modified the milestones: 7.x, 6.5.0.RELEASE Aug 30, 2024
tishun pushed a commit that referenced this issue Sep 13, 2024
* fix:deadlock when reentrant exclusive lock #2905

* confirm won't blocking other thread

* apply suggestions
tishun pushed a commit to tishun/lettuce-core that referenced this issue Nov 1, 2024
…is#2961)

* fix:deadlock when reentrant exclusive lock redis#2905

* confirm won't blocking other thread

* apply suggestions
tishun added a commit that referenced this issue Nov 1, 2024
* fix:deadlock when reentrant exclusive lock #2905

* confirm won't blocking other thread

* apply suggestions

Co-authored-by: Andy(Jingzhang)Chen <iRoiocam@gmail.com>