Lint to enforce no trailing zero byte on KeyUsage bitstrings #682
Conversation
Looks good to me. Not merging myself given that a few others were flagged in the original PR.
I don't think this quite checks for exactly the right thing. It seems to be looking for trailing 00 bytes. Such bytes will definitely be in violation of DER, so that's good, but I think it misses some cases.
For example, a KU extension encoded as 03 02 04 80 means "a bit string (03) of length two bytes (02) with four unused bits at the end (04) containing the bits 1000XXXX (80)". But according to DER, that should be encoded as 03 02 07 80 (aka 1XXXXXXX), trimming those three trailing zeros out of the value.
So this needs to operate not just at the byte level, but at the bit level.
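To make that bit-level rule concrete, here is a minimal sketch of such a check. This is not the PR's actual implementation; checkMinimalBitString, its error strings, and the assumption of a short-form length octet (always true for KeyUsage) are all hypothetical. Given the raw bytes of the inner BIT STRING, it flags both mis-encodings discussed in this thread:

```go
package main

import (
	"errors"
	"fmt"
)

// checkMinimalBitString reports whether der holds a DER-minimal BIT STRING
// as required for named bit lists like KeyUsage. Expected layout:
// tag (0x03), short-form length octet, unused-bit count, value bytes.
func checkMinimalBitString(der []byte) error {
	if len(der) < 3 || der[0] != 0x03 {
		return errors.New("not a BIT STRING")
	}
	if int(der[1]) != len(der)-2 {
		return errors.New("length octet does not match content length")
	}
	unused := der[2]
	bits := der[3:]

	// The unused-bit count only covers bits within one byte, so 8+ is
	// always wrong regardless of the value bytes.
	if unused > 7 {
		return fmt.Errorf("unused-bit count %d exceeds 7", unused)
	}
	if len(bits) == 0 {
		// DER encodes an empty bit string as 03 01 00.
		if unused != 0 {
			return errors.New("empty bit string declares unused bits")
		}
		return nil
	}

	last := bits[len(bits)-1]
	mask := byte(1)<<unused - 1 // the low `unused` bits of the last byte
	// BER already requires the declared unused bits themselves to be zero.
	if last&mask != 0 {
		return errors.New("declared unused bits are not zero")
	}
	// DER's named-bit-list rule: the last significant bit must be 1;
	// otherwise trailing zero bits (or whole zero bytes) were not trimmed.
	if (last>>unused)&1 == 0 {
		return errors.New("trailing zero bits not removed")
	}
	return nil
}

func main() {
	// Aaron's example: 4 declared unused bits but value bits 1000 XXXX,
	// which DER says must instead be declared as 7 unused bits.
	fmt.Println(checkMinimalBitString([]byte{0x03, 0x02, 0x04, 0x80}))
	// Output: trailing zero bits not removed
}
```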
    func init() {
        lint.RegisterLint(&lint.Lint{
            Name:        "e_superfluous_ku_encoding",
            Description: "RFC 5280 Section 4.2.1.3 describes the value of a KeyUsage to be a DER encoded BitString, which itself must not have unnecessary trailing 00 bytes.",
Description: "RFC 5280 Section 4.2.1.3 describes the value of a KeyUsage to be a DER encoded BitString, which itself must not have unnecessary trailing 00 bytes.", | |
Description: "RFC 5280 Section 4.2.1.3 describes the value of a KeyUsage to be a DER encoded BitString, which itself must not have unnecessary trailing 0 bits.", |
The example given in this thread does confuse me a bit, however I think that's because the binary for 80 is actually 0101 0000 (I was reading 80 as a decimal value). May I construct an arbitrary value to test my understanding of this encoding? Is this a correct example? Now, I am thinking that this may actually be two lints, because if my understanding is correct then it is possible to be compliant with one requirement but not the other. I believe that the above correctly encodes the "skippable" bits but violates the requirement of no trailing zero bytes.
Sorry, I was using hexadecimal encoding for all those example bytes, but forgot to say so explicitly. So byte 80 is hex for the bits 1000 0000, not decimal 80.
Yep, completely agreed.
In this case simply detecting that the third byte (number of skipped bits) is greater than 7 is an immediate indicator that something is wrong: that byte is supposed to encode the number of unused bits up to the next byte boundary, so it should never be 8 or more. I think you're right that detecting both mis-encodings is good. I don't have a strong opinion on whether it should be one or two lints.
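For completeness, the hypothetical sketch earlier in this thread sorts the examples discussed so far into exactly these two mis-encodings; this demo main would replace the one in that sketch (fmt imported as before):

```go
func main() {
	for _, ex := range [][]byte{
		{0x03, 0x02, 0x04, 0x80}, // trailing zero bits not removed
		{0x03, 0x02, 0x08, 0x80}, // unused-bit count exceeds 7
		{0x03, 0x02, 0x07, 0x80}, // minimal DER: no error
	} {
		fmt.Printf("% X: %v\n", ex, checkMinimalBitString(ex))
	}
}
```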
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
Definitely manages the "no trailing 0 byte" requirement in the title of the PR. Either in the same lint or in a second one, it'd be nice to have checks for the extra cases you've already discussed with @aarongable.
Happy to keep reviewing if/when you make any further changes to the PR.
So @aarongable I gave the other side of this coin a whack (counting and correcting unused bits), however I think I've hit (what I have to assume is) an encoding that I'm hoping you can help me understand. I'm getting over 6500 failures in the test corpus and they're all from the exact same KU. Now, all of these certs have only one KU, digitalSignature. Bytes (decimal): 5 128, that is, an unused-bit count of 5 followed by the value octet 128. My lint, as written, is pointing out that the unused-bit count should be 7, not 5. Indeed, if my understanding is correct and a bitstring reads in a left-to-right evaluation, then 128 (1000 0000) carries only the digitalSignature bit, and its seven trailing zero bits should all have been declared unused.
@christopher-henderson I believe that the encoding is invalid. X.690 2002-07, clause 11.2.2 says that where a named bit list is in use, the bitstring shall have all trailing 0 bits removed before it is encoded. Thus, it is clear that an encoder that encodes "5" as the initial octet but then asserts "128" as the following octet value has not correctly removed all trailing bits and has failed to DER encode the (named) BIT STRING value.
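A quick sanity check on that reading, as a standalone sketch rather than the lint's code (the value 128 comes straight from this thread):

```go
package main

import (
	"fmt"
	"math/bits"
)

func main() {
	// digitalSignature is the first (most significant) named bit, so the
	// lone value octet is 128 decimal (0x80, bits 1000 0000).
	last := uint8(128)

	// DER's minimal unused-bit count equals the number of trailing zero
	// bits in the final value octet.
	fmt.Println(bits.TrailingZeros8(last)) // prints 7

	// The offending certificates declare only 5 unused bits, leaving the
	// bits "100": two trailing zeros that clause 11.2.2 says must go.
}
```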
(Apologies, I'm on vacation this week but I can take a closer look next week.) On first glance, I agree with Corey and with your lint: a KU of digitalSignature encoded as 03 02 05 80 is mis-encoded and should instead be 03 02 07 80.
Thank you @aarongable and @CBonnell for the quick glance! Given that it was over 6k new failures in the test corpus I had to assume that I was the problem 😛. However, after a bit of digging through the corpus, I'm gonna reckon that these 6k+ failures are all simply certs generated by the same bug in OpenSSL or some other such common provider. I think I'm going to go ahead and merge this lint and then open up a new PR for the encoding issue. Thank you again!
This lint addresses #681 where certificates were discovered whose KeyUsage encodings violated DER's requirement on minimal bitstring representations.
The following are interested parties in this issue:
@robplee
@aarongable
@CBonnell