Array Buffers #34
feross/buffer#39
I think all of Level* is built on the assumption that Buffer objects are used for binary data, and the use of ArrayBuffers, Uint8Array and bops is not really needed anymore thanks to the work of @feross. Is it impractical to wrap your ArrayBuffer in a Uint8Array and then in a Buffer? What is your use case? We might want to add some docs on this in the README.
My use case involved passing buffers of images to socket.io on the server; @feross's buffer silently turns ArrayBuffers into 0-length Uint8Arrays.
I don't get whether this bug still belongs here, and if it does, how we should address it :(
One thing that would be helpful could be some custom data types, e.g. array
@calvinmetcalf I think the correct way to solve this issue is to just wrap the ArrayBuffers you're getting from your server in a Uint8Array and then in a Buffer. Then when you get the data out of level.js, you'll get a Buffer. If you need the data as a Uint8Array, you can just use the Buffer as-is.
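A minimal sketch of that wrapping, assuming `db` is a levelup instance backed by level.js with `valueEncoding: 'binary'` (the key and variable names are illustrative, not from the original comment):

```js
// Illustrative only: arrayBuffer stands in for data received from the server,
// e.g. a binary WebSocket message or an XHR 'arraybuffer' response.
var arrayBuffer = new ArrayBuffer(16)

// Wrap the ArrayBuffer in a Uint8Array, then in a Buffer, before the put.
var value = new Buffer(new Uint8Array(arrayBuffer))

db.put('image', value, function (err) {
  if (err) throw err

  db.get('image', function (err, data) {
    if (err) throw err
    // data comes back as a Buffer; because feross/buffer is built on
    // Uint8Array, it can be used directly wherever a Uint8Array is expected.
    console.log(data.length) // 16
  })
})
```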
@ralphtheninja following what @feross said, I think we should remove And same for the
That should be tagged
Btw, my comment was from 2014! I think Node.js is moving towards supporting
Interesting! I guess we might restore ArrayBuffer support at some point, but the Level ecosystem is pretty much Buffer-only, so for now we're good.
Sounds good, consistency is more important!
FTR, we actually do support You can
If I try to put an ArrayBuffer into a db that has valueEncoding set to 'binary', I get an error about
As far as I can tell this is due to it being improperly turned into a Uint8Array at some point. I managed to trace it to here, where the ArrayBuffer failed the Buffer.isBuffer test, but then when new Buffer(ArrayBuffer) is called it uses ArrayBuffer.prototype.length (which is undefined) instead of ArrayBuffer.prototype.byteLength, so I'll likely have to open an issue there as well.
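A small illustration of the mismatch described above; the zero-length result is the old feross/buffer behavior this issue reports, shown here only in comments:

```js
var ab = new ArrayBuffer(16)

console.log(ab.byteLength) // 16
console.log(ab.length)     // undefined -- ArrayBuffer has no .length property

console.log(Buffer.isBuffer(ab)) // false, so the new Buffer(ab) path runs
// In the buffer version referenced here, new Buffer(ab) sized the copy from
// ab.length rather than ab.byteLength, producing a zero-length buffer.

// Wrapping in a Uint8Array first avoids the problem, because a Uint8Array
// view does expose .length:
var view = new Uint8Array(ab)
console.log(view.length)             // 16
console.log(new Buffer(view).length) // 16
```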