IOBuffer of SubString #9888
It seems like this might be useful to define, for example if you don't know whether the type is a String or a SubString.
I had a go at implementing it with the following in this commit:
However, that seems not to work. :(

Comments
How does it not work?
@ivarne Great question; I will try to come up with a simple example (I was hoping it was something obvious). I think it's Unicode related. The bug showed up with Unicode in Markdown tables using the above definition (from the commit); without Unicode it works (IIRC).
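As a hedged illustration of why Unicode could matter here: for a SubString containing multi-byte characters, the character count and the code-unit (byte) count differ, so any indexing that mixes the two slices the wrong bytes. The string below is my own example, not one from the original report.

```julia
# Character indices and byte (code unit) counts diverge for multi-byte UTF-8 characters,
# so byte-range arithmetic based on character positions grabs the wrong bytes.
s = SubString("héllo wörld", 8)   # "wörld"; 8 is the byte index where 'w' starts
length(s)                          # 5 characters
ncodeunits(s)                      # 6 bytes, because 'ö' is encoded as two bytes
```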
Looking at how the SubString type stores its data, IOBuffer(s::SubString) = IOBuffer(s.string.data[(s.offset:(s.offset+s.endof)) + 1], true, false) will work much better. Edit: Changed …
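For readers on current Julia, here is a rough sketch of the same byte-range idea written against today's public API instead of the 0.3-era s.string.data and s.endof fields. The helper name substring_iobuffer is mine, not from the thread, and this variant copies the bytes rather than sharing them.

```julia
# Pull the substring's code units out of the parent string by byte offset,
# then wrap the copy in an IOBuffer (read-only by default when data is supplied).
function substring_iobuffer(s::SubString{String})
    bytes = codeunits(s.string)                        # byte view of the parent string
    range = (s.offset + 1):(s.offset + ncodeunits(s))  # byte range covered by `s`
    return IOBuffer(bytes[range])                      # indexing with a range makes a copy
end

io = substring_iobuffer(SubString("héllo wörld", 8))
read(io, String)   # "wörld"
```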
That seems suboptimal because it is creating a copy of the data (at least until #9150 lands). And if you're willing to make a copy, you could just do … Doing this without making a copy seems problematic right now because …
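A sketch of the "just make a copy" route on current Julia: String(s) copies the substring's bytes, and an IOBuffer over a String is read-only, so ownership is not a concern. The exact expression this comment had in mind is cut off above; this is only one way to spell it.

```julia
# Copy the substring into a fresh String, then wrap it in a read-only buffer.
s = SubString("héllo wörld", 8)   # "wörld"
io = IOBuffer(String(s))          # String(s) copies; IOBuffer(::String) is read-only
read(io, String)                  # "wörld"
```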
I just thought …
It looks like the code would work with any …
I think we can make IOBuffer parameterized by the type of the AbstractVector{Uint8} that it contains, rather than forcing Vector{Uint8}. That would also allow it to wrap a SharedArray too.
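This is roughly the design that current Julia ended up with via #11554 below: the buffer type behind IOBuffer (Base.GenericIOBuffer) is parameterized on its storage, and the constructor accepts any AbstractVector{UInt8}, so views and similar array types can be handed to it directly. A small sketch, with the byte range chosen purely for illustration:

```julia
# An IOBuffer over a view of a string's bytes (any AbstractVector{UInt8} is accepted).
bytes = codeunits("héllo wörld")   # read-only byte view of the string
io = IOBuffer(view(bytes, 8:13))   # the six bytes of "wörld"
read(io, String)                   # "wörld"
```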
Doesn't IOBuffer kind of need to own the vector it wraps? I guess a read-only or non-growable IOBuffer on a SubArray would be ok though.
@JeffBezanson, in the …
Right. When creating an IOBuffer from an existing object, it should typically default to read-only.
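As far as I can tell that is how the current constructor behaves: handing IOBuffer existing data gives a readable, non-writable stream unless you opt in with keywords. A brief illustration:

```julia
# Wrapping existing bytes: readable by default, writable only on request.
data = collect(codeunits("existing data"))      # a plain Vector{UInt8}
io = IOBuffer(data)
iswritable(io)                                  # false
String(read(io))                                # "existing data"
rw = IOBuffer(data; read = true, write = true)  # explicitly opt in to writing
```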
This was implemented in #11554.
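For anyone landing here now: on current Julia the original request appears to just work, since (if I am reading Base correctly) there is an IOBuffer method for SubString{String} built on the generic buffer machinery from #11554.

```julia
# IOBuffer of a SubString on current Julia.
s = SubString("héllo wörld", 8)   # "wörld"
io = IOBuffer(s)                  # no manual byte arithmetic needed
read(io, String)                  # "wörld"
```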