fix race condition in string generator helper #19875
Conversation
g.charsetLock.RLock()
charset := g.charset
g.charsetLock.RUnlock()
Can you explain why locking is needed for setting a variable like this?
The lock isn't necessary for the write to the local charset variable, but for the read of the g.charset value.

In validateConfig:

if len(g.charset) == 0 {
    g.charset = getChars(g.Rules)
}

so if there's parallel access to both validateConfig and generate, we will race reading this variable. Since this assignment isn't an atomic write (it's a []rune slice, which is a pointer type), you might read a bad pointer (for example, only half of the pointer's value could be copied over when you read, so you'd point to a garbage memory location) and thus fail.

In particular, I think when you first initialize the group and set it up to do a password rotation, if you have a lot of in-flight requests, you'll see multiple of them try to write this charset via validating the config, while some that are already past it will be trying to do this generate read.

Or maybe it is parallel modification + use of the policy...
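To make the pattern concrete, here is a minimal, self-contained sketch of the idea being discussed: validateConfig takes the write lock while it lazily initializes g.charset, and generate copies the slice header out under the read lock before using it. The StringGenerator, getChars, Rules, validateConfig, and generate names come from this thread; everything else (the Rule interface, method signatures, and the random selection) is an assumption for illustration and does not match the real helper's API.

package random

import (
	"math/rand"
	"sync"
)

// Rule is a stand-in for the real charset rule type (assumption for this sketch).
type Rule interface {
	Chars() []rune
}

func getChars(rules []Rule) []rune {
	var chars []rune
	for _, r := range rules {
		chars = append(chars, r.Chars()...)
	}
	return chars
}

type StringGenerator struct {
	Rules []Rule

	charset     []rune
	charsetLock sync.RWMutex
}

// validateConfig lazily builds the charset. The write lock ensures a concurrent
// generate call can never observe a half-written slice header.
func (g *StringGenerator) validateConfig() {
	g.charsetLock.Lock()
	defer g.charsetLock.Unlock()
	if len(g.charset) == 0 {
		g.charset = getChars(g.Rules)
	}
}

// generate copies the slice header out while holding the read lock, then works
// only with the local copy so the lock is held as briefly as possible.
func (g *StringGenerator) generate(length int) string {
	g.charsetLock.RLock()
	charset := g.charset
	g.charsetLock.RUnlock()

	out := make([]rune, length)
	for i := range out {
		out[i] = charset[rand.Intn(len(charset))]
	}
	return string(out)
}

Copying the slice header under the lock, rather than holding the lock for the whole generation loop, keeps the critical section small while still ordering the read of g.charset after any concurrent write.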
Thanks @cipherboy! That is a great summary.
@robmonte I am happy to talk more about this if you have more questions. Let me know!
👍
Makes sense to me!
A data race in vault/helper/random.(*StringGenerator).validateConfig() was exposed by new test coverage that was recently introduced.
Failure: https://app.circleci.com/pipelines/github/hashicorp/vault/51983/workflows/d654323c-baa8-4985-9838-a4131f8900f5/jobs/607405/steps
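For context on how test coverage surfaces a bug like this: the Go race detector flags the unsynchronized read/write of g.charset when validateConfig and generate run concurrently. A hypothetical reproduction, reusing the sketch types from the comment above rather than the real Vault test, might look like the following; run with go test -race, it reports a data race against the pre-fix code and passes once the lock is in place.

package random

import (
	"sync"
	"testing"
)

// lowercaseRule is a trivial stand-in rule for this sketch only.
type lowercaseRule struct{}

func (lowercaseRule) Chars() []rune { return []rune("abcdefghijklmnopqrstuvwxyz") }

// TestStringGenerator_ParallelGenerate exercises validateConfig and generate
// from many goroutines so the race detector can observe conflicting accesses.
func TestStringGenerator_ParallelGenerate(t *testing.T) {
	g := &StringGenerator{Rules: []Rule{lowercaseRule{}}}

	var wg sync.WaitGroup
	for i := 0; i < 10; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			g.validateConfig() // may write g.charset
			_ = g.generate(8)  // reads g.charset
		}()
	}
	wg.Wait()
}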