der: Use #[asn1(type = "OCTET STRING")] with Vec<u8> #1539
I'm trying to use the `der` and `der_derive` crates to implement encoding and decoding of a struct containing a `Vec<u8>` as an ASN.1 `OCTET STRING`. I need my struct to be owned, not borrowed from the input DER bytes.

Using `OctetString` in the struct definition is inconvenient because I must use the fallible `new()` method when manually creating an instance of the struct. I would rather the struct use `Vec<u8>` directly, and keep the DER-specific types and encoding errors in the encode and decode steps.

I thought `#[asn1(type = "OCTET STRING")]` would implement this, as the documentation states it should. This does not work: contrary to the documentation, the macro performs an intermediate conversion through `OctetStringRef`, which has no `TryFrom` and `Into` implementations for `Vec<u8>`.
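For concreteness, a minimal sketch of the two shapes described above, assuming `der` with the `derive` and `alloc` features enabled (`FooOwned` and `build_foo` are illustrative names, not from the issue):

```rust
use der::{asn1::OctetString, Sequence};

// What the documentation suggests should work, but fails to compile:
// the derive converts the field through OctetStringRef, and Vec<u8>
// has no TryFrom/Into conversions to that type.
//
// #[derive(Sequence)]
// struct Foo {
//     #[asn1(type = "OCTET STRING")]
//     bar: Vec<u8>,
// }

// The inconvenient alternative: OctetString is owned, but its
// constructor is fallible, so DER-specific error handling leaks
// into plain struct construction.
#[derive(Sequence)]
struct FooOwned {
    bar: OctetString,
}

fn build_foo() -> der::Result<FooOwned> {
    Ok(FooOwned {
        // OctetString::new is fallible (it enforces DER length limits)
        bar: OctetString::new(vec![1, 2, 3])?,
    })
}
```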
Comments

#1540 should handle the conversions from […]

Thank you, that PR fixes the first error "trait bound is not satisfied" on the call of […]

#1562 should help although I don't know if it actually solves this […]

Fixed:

```rust
#[derive(Sequence)]
struct Foo {
    #[asn1(type = "OCTET STRING", deref = "true")]
    bar: Vec<u8>,
}
```
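If the fix lands as shown, usage would presumably look like this sketch (assuming the `Foo` definition above and `der`'s `Encode`/`Decode` traits with the `alloc` feature; the round trip is illustrative):

```rust
use der::{Decode, Encode};

fn round_trip() -> der::Result<()> {
    // Plain, infallible construction with an owned Vec<u8>.
    let foo = Foo { bar: vec![1, 2, 3] };

    // DER-specific errors stay confined to the encode/decode steps.
    let der_bytes = foo.to_der()?;
    let decoded = Foo::from_der(&der_bytes)?;
    assert_eq!(decoded.bar, foo.bar);
    Ok(())
}
```

This keeps fallibility where the issue asks for it: in encoding and decoding, rather than in ordinary struct construction.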