No sanity checking in BMPDecoder #622
That is actually not only a problem with malformed images… I don't have a good recipe for handling big images in general. It is easily possible to run out of memory when encoding big images. |
Right, but with valid big images you do need somebody to actually upload said big image to you :) The OOM reproducer is some 500 bytes in size and can easily be sent to some API endpoint many times in a row. |
I am not disputing that. |
So I'm looking at fixing this issue, and I've added some extra checks to error on invalid values, though I'm not sure how to deal with images that specify very large dimensions in a good way. The BMP format has a size field in the header, but it's not required to be set (or it could be set to a silly value), so it's not something that can be relied on. Since the decoder operates on a reader, it can't access the actual file size (at least not without specialization for certain readers, and I'm not sure that's feasible yet).
One option could be to simply specify a maximum size for the decoder; alternatively, the decoder could under-allocate the output buffer for images larger than a certain size and reallocate it as the data is being read. (Though that could slow down loading of large valid images a bit.)
ImageMagick seems to check the file size when loading from a file and see if it matches; SDL_Image doesn't seem to do any check for this. The Mozilla image library seems to limit the width to 0x0000FFFF (65535) and error out if the width is larger. Maybe that would be the best course of action for now. |
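For reference, a dimension sanity check along the lines of Mozilla's cap might be sketched like this. The helper name and error type are illustrative, not the crate's actual API; note that BMP stores the height as a signed value, where negative means a top-down image.

```rust
/// Hypothetical sanity check mirroring Mozilla's 0xFFFF cap.
/// BMP height is signed: a negative value means a top-down image.
fn check_dimensions(width: i32, height: i32) -> Result<(), String> {
    const MAX_DIM: u32 = 0xFFFF;
    if width <= 0 || height == 0 {
        return Err(format!("invalid dimensions {}x{}", width, height));
    }
    if width as u32 > MAX_DIM || height.unsigned_abs() > MAX_DIM {
        return Err(format!("dimensions {}x{} exceed decoder limit", width, height));
    }
    Ok(())
}
```

With a cap like this, both reproducers from the issue (32 x 6946848 and 1111498827 x 1083981) would be rejected before any pixel buffer is allocated.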
The Firefox approach doesn't really work here, as for Firefox the assumption that the image will eventually be shown on screen holds (or used to, anyway).
I feel like it would be fine to just allocate a vector with a size similar to the file size. The cost of reallocation, if it is ever necessary, will be extremely low anyway.
|
The issue here is that the decoder doesn't currently have a way to get the file size, as the underlying object it operates on is only required to implement Read + Seek, and the BMP header isn't required to specify it. I suppose the API could be altered so that this info is passed to the decoders (this isn't really an issue unique to BMP images), though that wouldn't work for streams, so maybe one would just have to pass some expected and/or maximum size for that case. In either case, I suppose there could be some limit on the size of the initial image data buffer, since, as you suggest, reallocating is probably not going to create a significant slowdown anyhow. |
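The "cap the initial buffer and grow as data arrives" idea could be sketched as follows. The function name, cap parameter, and chunk size are illustrative assumptions, not the crate's real API; the point is that a header claiming absurd dimensions cannot by itself trigger a huge allocation, because the buffer only grows for bytes that actually arrive.

```rust
use std::io::Read;

/// Read up to `expected` bytes of pixel data, but never reserve more than
/// `cap` bytes up front. The buffer grows only as data actually arrives,
/// so a lying header cannot force a huge allocation on its own.
fn read_capped<R: Read>(reader: &mut R, expected: usize, cap: usize) -> std::io::Result<Vec<u8>> {
    let mut buf = Vec::with_capacity(expected.min(cap));
    let mut chunk = [0u8; 8192];
    while buf.len() < expected {
        let want = (expected - buf.len()).min(chunk.len());
        let n = reader.read(&mut chunk[..want])?;
        if n == 0 {
            break; // input ended early: return what we have instead of over-allocating
        }
        buf.extend_from_slice(&chunk[..n]);
    }
    Ok(buf)
}
```

A 500-byte reproducer that claims gigabytes of pixel data would then cost only the cap plus the 500 bytes it actually contains.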
Yes, one could specify a maximum amount of memory that should be used, or specify the desired boundaries; then one could either resize on the fly or only decode parts of the image. A gigapixel image (JPEG) is a valid use case which is not covered by this library. |
Looks like the issue brought up here about very big image dimensions was addressed in #632, so I think this issue could be closed? |
It is possible to construct small but malformed BMP images which may result in denial of service.
The following two images (base64-encoded) have unusually big dimensions.
slow2.bmp
slow2.bmp: PC bitmap, Windows 98/2000 and newer format, 32 x 6946848 x 32
Takes 66 seconds to decode.
overalloc.bmp
overalloc.bmp: PC bitmap, Windows 98/2000 and newer format, 1111498827 x 1083981 x 1
=> fatal runtime error: out of memory
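The overalloc.bmp header shows why the decoded buffer size should be computed with checked arithmetic and compared against a limit before allocating. A minimal sketch, assuming a hypothetical helper and an arbitrary 1 GiB limit (neither is the crate's actual code):

```rust
/// Compute the decoded buffer size in bytes with overflow checks, and
/// reject anything above `limit` instead of attempting the allocation.
fn checked_buffer_size(width: u32, height: u32, bits_per_pixel: u32, limit: u64) -> Option<u64> {
    let bits = (width as u64)
        .checked_mul(height as u64)?
        .checked_mul(bits_per_pixel as u64)?;
    // Round up to whole bytes.
    let bytes = bits / 8 + u64::from(bits % 8 != 0);
    if bytes <= limit { Some(bytes) } else { None }
}
```

For the 1111498827 x 1083981 x 1 header above, the product is on the order of 10^14 bytes, so the check fails cleanly instead of aborting with an out-of-memory error.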