
Body.arrayBuffer([begin, end]) #554

Closed
beaufortfrancois opened this issue Jun 15, 2017 · 13 comments


@beaufortfrancois

beaufortfrancois commented Jun 15, 2017

It would be very practical if the arrayBuffer() method could take optional begin and end parameters, in order to reduce memory usage for cached responses whose content is "huge" but can still be useful in chunks (video and audio).

const response = await caches.match('https://example.com/huge-file.mp4');
const slicedData = await response.arrayBuffer(0, 1024);
return new Response(slicedData);

R: @jakearchibald
FYI @paullewis

@beaufortfrancois beaufortfrancois changed the title Body.arrayBuffer(begin[, end]) Body.arrayBuffer([begin, end]) Jun 15, 2017
@mkruisselbrink
Collaborator

How is that better than getting the body as a blob and then slicing and dicing that in whatever way you want? And would calling arrayBuffer like that still consume the whole body? In that case you effectively should have done a range request, since you're never going to be able to read the rest of the body.

@guest271314

Are you trying to request and receive only a response having a Content-Length of 1024? Or to read the response in chunks of 1024 bytes?

@beaufortfrancois
Author

beaufortfrancois commented Jun 16, 2017

I've updated my code snippet to better reflect what I'm experiencing. A huge file is in my cache (fetched with "Background Fetch") and I would like to return only a subset of it on a fetch event, instead of bringing the entire file into memory.

Both of the following code snippets are equivalent, as far as I can tell in my browser, in terms of memory usage:

// arrayBuffer(), then slice the resulting ArrayBuffer
const response = await caches.match('https://example.com/huge-file.mp4');
const data = await response.arrayBuffer();
const slicedData = data.slice(0, 1024);

// blob(), then slice the resulting Blob
const response = await caches.match('https://example.com/huge-file.mp4');
const blob = await response.blob();
const slicedData = blob.slice(0, 1024);

@asutherland

The blob() case is more efficient by far in terms of what the browser can and does optimize. Speaking for Gecko, the blob() call only creates a handle to the underlying file on disk. No reads of the file's contents need to be performed before returning the blob. This handle can be given to the new Response and passed across processes, allowing the target process to perform just the 1024-byte read directly, and without needing to involve the ServiceWorker or its global.

In contrast, the arrayBuffer() call will result in the entirety of "huge-file.mp4" being read from disk and exposed to the ServiceWorker and its global. The read needs to complete before the new Response can be created and returned, and its contents may then need to be streamed between processes (unless some kind of underlying shared memory/copy-on-write thing is done, which I'm pretty confident Gecko will not do at the current time).
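
In code terms, the efficient path described above would look something like the following sketch of a service worker fetch handler (the cached URL and byte range are placeholders, not part of the original report):

self.addEventListener("fetch", event => {
  event.respondWith((async () => {
    // blob() can hand back a handle to the on-disk cache entry;
    // slice() narrows that handle without reading the whole file
    const cached = await caches.match("https://example.com/huge-file.mp4");
    const blob = await cached.blob();
    return new Response(blob.slice(0, 1024));
  })());
});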

@jakearchibald
Collaborator

I think w3c/ServiceWorker#913 is the solution here.

@guest271314

guest271314 commented Jun 16, 2017

I have little if any experience using caches, and I'm not sure I'm gathering the requirement correctly. If you are trying to get only a range of time slices from a media resource already in the browser cache, you could use a Media Fragment URI on an HTMLMediaElement, which, if I read the specification accurately, does not perform additional requests for a cached resource. If necessary, capture the stream of media playback with HTMLMediaElement.captureStream() to get a Blob or ArrayBuffer representation of the slice of media played back:

<label>Play first 30 seconds of audio</label><br><audio controls src="https://upload.wikimedia.org/wikipedia/commons/b/be/Hidden_Tribe_-_Didgeridoo_1_Live.ogg#t=0,30"></audio><br><br>
<label>Play from 30 to 50 seconds of audio</label><br><audio controls src="https://upload.wikimedia.org/wikipedia/commons/b/be/Hidden_Tribe_-_Didgeridoo_1_Live.ogg#t=30,50"></audio>

Alternatively, request the resource as a Blob or ArrayBuffer and use .slice() or .subarray() to get the from/to byte range of the Blob or of a typed array view, respectively.
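
For example, a rough sketch of both routes (hypothetical URL; top-level await for brevity):

const response = await fetch("https://example.com/huge-file.mp4");
const buffer = await response.arrayBuffer();
// typed array route: .subarray() returns a view over the same memory, no copy
const byteSlice = new Uint8Array(buffer).subarray(0, 1024);
// Blob route: .slice() returns a Blob referencing bytes [0, 1024)
const blobSlice = new Blob([buffer]).slice(0, 1024);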

@guest271314

guest271314 commented Jun 18, 2017

An alternative approach to achieve the requirement is to use a media fragment concatenated with a Blob URL.

At the initial request, create a Blob URL from the Blob result of the response's .blob():

// a Promise resolving to a Blob URL for the full resource
const blobURL = fetch("https://example.com/huge-file.mp4")
  .then(response => response.blob())
  .then(blob => URL.createObjectURL(blob));

When we want to play a specific time slice of the media, for example from 30 to 50 seconds, use the appropriate media fragment identifier specifying the range to play:

blobURL.then(url => /* do stuff with Blob URL */ new Audio(url + "#t=30,50").play())

which references the original Blob and its Blob URL (XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX) from the disk cache:

XXXXXXXX-XXXX-XXXX-XXXX-XXXXXXXXXXXX 206 media Other (from disk cache)

@jakearchibald
Collaborator

#556 - for standardising Mozilla's current behaviour with request/response blobs.

@guest271314

guest271314 commented Jun 30, 2017

@beaufortfrancois Is 1024 a valid byte length for an audio chunk? I have not been able to play back a media segment at AudioContext() smaller than 1024 * 3 bytes.

An array buffer of audio data having a .byteLength of 1024 would be far less than one second of audio; that is, inaudible, yes?
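
For rough context, assuming uncompressed 16-bit stereo PCM at 44100 Hz (an assumption; compressed formats will differ):

// 44100 samples/s * 2 channels * 2 bytes per sample = 176400 bytes per second
const bytesPerSecond = 44100 * 2 * 2;
console.log(1024 / bytesPerSecond); // ≈ 0.0058 seconds, i.e. ~6 ms of audio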

I'm curious what exactly you are trying to achieve within the application?

@guest271314

guest271314 commented Jun 30, 2017

This is probably not what you are looking for. The approach requests the resource once, then creates an AudioBuffer from the response as an ArrayBuffer.

You can then create an AudioBuffer using OfflineAudioContext containing the audio data between specific time ranges. This allows a sample of the audio data to be played even when the specified playback range does not begin at 0.

let data; // store decoded audio data


async function audioSample({
  url, from, to, when = 0, channels = 2, sampleRate = 44100
}) {
  const duration = to - from;
  const [ac, oac] = [
    new AudioContext()
  , new OfflineAudioContext(channels, sampleRate * duration, sampleRate)
  ];
  if (data === undefined) {
    // this should only be necessary once to get `AudioBuffer`
    const response = await fetch(url);
    const buffer = await response.arrayBuffer();
    data = await ac.decodeAudioData(buffer);
  }
  const source = oac.createBufferSource();
  source.buffer = data;
  source.connect(oac.destination);
  // create an `AudioBuffer` of audio between `from` and `to`
  source.start(when, from, duration);
  const ab = await oac.startRendering();
  return {
    ab, ac, oac
  }
}

// create an `AudioBuffer` of media content from 60 to 65 seconds
audioSample({
    url: "/path/to/media/resource",
    from: 60,
    to: 65
  })
  .then(({
    ab, ac, oac
  }) => {
    return new Promise(resolve => {
      console.log(ab, ac, oac);
      const source = ac.createBufferSource();
      source.buffer = ab;
      source.connect(ac.destination);
      // close `AudioContext` 
      // https://developer.mozilla.org/en-US/docs/Web/API/AudioContext/close
      source.onended = event => 
        ac.close().then(() => console.log(event, ac)).then(resolve);
      source.start();
    })
  })
  .then(() => {
    // create `AudioBuffer` from 120 to 125 seconds
    audioSample({
        from: 120,
        to: 125
      })
      .then(({
        ab, ac, oac
      }) => {
        return new Promise(resolve => {
          console.log(ab, ac, oac);
          const source = ac.createBufferSource();
          source.buffer = ab;
          source.connect(ac.destination);
          source.onended = event => 
            ac.close().then(() => console.log(event, ac)).then(resolve);
          source.start();
        })
      })
  })

plnkr

@guest271314

@beaufortfrancois Tried setting a <video> element's .src to a Blob URL created from a .slice() of a Uint8Array (with the ArrayBuffer from .arrayBuffer() as its source), looking for the smallest value which dispatched the canplay event of the <video> element, using https://nickdesaulniers.github.io/netfix/demo/frag_bunny.mp4 as the media resource.

The smallest value passed to .slice() which rendered the non-playing beginning of the media was 33700. The video time is set to 2 seconds, though pressing the play control does not play a time slice of the media but rather flashes the <video> element:

fetch("https://nickdesaulniers.github.io/netfix/demo/frag_bunny.mp4")
      .then(response => response.arrayBuffer())
      .then(ab => {
        const VIEW = new Uint8Array(ab);
        video.src = URL.createObjectURL(new Blob([VIEW.slice(0, 33700)]));
})


A .slice() of 36500 was the smallest value which actually moved the current-time control.

A .slice() of 144000 was the smallest to play back the full initial 2 seconds of media.

A .slice() of 1908 dispatches the loadedmetadata event.

I was not able to produce media playback with a .slice() of 1024, which does not appear to load enough media data for the loadedmetadata event to be dispatched or for playback.
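
A sketch of the kind of probe used to find these thresholds (a hypothetical helper, not the exact code used; it resolves with whichever media event a given slice produces first):

// create a Blob URL from the first `n` bytes and report the resulting event
function probeSlice(buffer, n) {
  return new Promise(resolve => {
    const video = document.createElement("video");
    video.src = URL.createObjectURL(new Blob([new Uint8Array(buffer, 0, n)]));
    video.addEventListener("loadedmetadata", () => resolve("loadedmetadata"));
    video.addEventListener("error", () => resolve("error"));
  });
}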

If the requirement is to play media from cache while the request is still being processed, you can use ReadableStream and MediaSource, with the SourceBuffer's .mode set to "segments", to append chunks of data as Uint8Array to the SourceBuffer during the read of the response, before the full request has completed:

    let sourceBuffer;
    const [video, mediaSource, mimeCodec, processStream] = [
      document.querySelector("video")
    , new MediaSource
    , "video/mp4; codecs=avc1.42E01E, mp4a.40.2"
    , ({value, done}) => {
        if (done) {
          mediaSource.endOfStream();
          return;
        }
        sourceBuffer.appendBuffer(value);
      }
    ];
    video.oncanplay = () => {
      video.oncanplay = null;
      video.play()
    }
    // if required respond from cache at ServiceWorker here
    fetch("https://nickdesaulniers.github.io/netfix/demo/frag_bunny.mp4")
      .then(response => response.body.getReader())
      .then(reader => {

        video.src = URL.createObjectURL(mediaSource);
        
        mediaSource.onsourceopen = () => {
          sourceBuffer = mediaSource.addSourceBuffer(mimeCodec);
          sourceBuffer.mode = "segments";
          sourceBuffer.addEventListener("updateend", event => {
            reader.read().then(processStream)
          });

          reader.read().then(processStream);
          
        }

        return Promise.all([new Promise(resolve => mediaSource.onsourceended = () => resolve(mediaSource.readyState)), reader.closed])
        
      })
      .then(([msReadyState]) => {
        console.log(msReadyState)
      })
      .catch(err => console.log(err))

plnkr

@beaufortfrancois
Author

beaufortfrancois commented Jul 5, 2017

Thank you so much @guest271314 for the explanation.
sourceBuffer.mode = "segments"; was indeed the missing bit for me.

@jakearchibald
Collaborator

Blob seems to provide the required API, although Firefox's implementation could be better.
