Is duration needed? #52
Comments
I'm not sure what API we should have for timestamps, because the options can differ between codecs and codec implementations. Typically media containers provide timestamps, and sometimes durations as well, but these can also appear in the bitstream. Reordering is always signaled in the bitstream AFAIK.

Some codec implementations allow us to correspond frames exactly with encoded chunks, while others give special meaning to timestamps. (E.g. some Android MediaCodec video decoder implementations will reorder frames to match the timestamps provided with the input chunks.) We can usually work around this by disabling post-processing filters and generating artificial timestamps in the browser implementation.

There also exist codecs without a reliable encoded-chunk-to-frame mapping. (E.g. VPx without superframes has chunks that map to no frame.) We can probably avoid this by requiring, as a matter of conformance, that each encoded chunk correspond to exactly one frame, although I still expect WebCodecs implementations to sometimes drop frames for unspecified reasons.

If we require a 1:1 mapping, then we could allow apps to specify an opaque object that flows through the encode/decode process, which could be a timestamp or some other structure. We could also determine codec-by-codec whether implementations should add additional annotations for in-band metadata that they find.

Edit: The above applies to audio and video, but not images. We don't expect apps to parse their own timestamps from animations.
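Assuming the 1:1 chunk-to-frame mapping described above, the opaque-metadata idea could look something like the following sketch. `MetadataQueue` is a hypothetical helper, not part of WebCodecs; the app would call `onChunkSubmitted` next to each `decode()` call and `onFrameOutput` in the decoder's output callback.

```typescript
// Hypothetical helper: carries opaque per-chunk metadata through a codec,
// assuming the implementation emits exactly one frame per encoded chunk
// and preserves submission order (FIFO).
class MetadataQueue<T> {
  private pending: T[] = [];

  // Call alongside decoder.decode(chunk).
  onChunkSubmitted(meta: T): void {
    this.pending.push(meta);
  }

  // Call from the decoder's output callback; returns the metadata that
  // was submitted with the corresponding chunk.
  onFrameOutput(): T | undefined {
    return this.pending.shift();
  }
}

// Example: the opaque metadata here is just a timestamp in microseconds,
// but it could be any app-defined structure.
const queue = new MetadataQueue<number>();
queue.onChunkSubmitted(0);
queue.onChunkSubmitted(33_333);
const first = queue.onFrameOutput();
const second = queue.onFrameOutput();
```

Frame drops would break the pairing silently, which is why the conformance requirement matters for this design.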
You probably also want to use a high-res timestamp for actual timestamps, no?
That's an open question. We (Chrome implementors) are using an integer type so that we can make guarantees about exact preservation through our whole media stack, but some aspects of that are Chrome-specific and we'll need to make sure all browsers can meet the requirements equally. DOMHighResTimeStamp is a double type in milliseconds. It could potentially meet our needs as well and would be easier to use correctly from JS, but may be less convenient for WASM users.
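To illustrate the integer-vs-double trade-off: integer microsecond timestamps survive a round trip through a double-valued milliseconds representation (DOMHighResTimeStamp style) for any realistic media timeline, since the values stay far below 2^53, but the intermediate millisecond value is generally inexact and a round is needed on the way back. The values below are arbitrary examples.

```typescript
// Convert an integer microsecond timestamp to double milliseconds
// (DOMHighResTimeStamp style). The result may be inexact,
// e.g. 33367 µs -> 33.367 ms, which has no exact double representation.
function usToMs(us: number): number {
  return us / 1000;
}

// Recover the original integer microsecond value.
function msToUs(ms: number): number {
  return Math.round(ms * 1000);
}

const ts = 33_367; // roughly one frame at 29.97 fps, arbitrary example
const roundTripped = msToUs(usToMs(ts));
```

The round trip is lossless here, but only because the consumer knows to round back to integer microseconds; that implicit contract is part of what makes the double representation easier to misuse.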
Re: duration use cases, one good one is ImageDecoder for animated images. The ImageDecoder will emit VideoFrames, and duration is needed to know how long the last frame is presented. |
In terms of compatibility, how about adopting |
Time as an integer fraction of seconds (i.e. an integer multiple of a timebase) is indeed the most common format for interchanging media timestamps. There isn't an existing Rational type though, so the ergonomics are not great in JS. |
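For concreteness, a minimal sketch of what such a timebase representation could look like; `Rational` and `ticksToMicroseconds` are hypothetical names, not an existing JS type or a WebCodecs proposal.

```typescript
// A timebase expressed as an integer fraction of a second,
// e.g. { num: 1, den: 90000 } for an MPEG-TS style 90 kHz clock.
interface Rational {
  num: number;
  den: number;
}

// Convert a tick count in the given timebase to integer microseconds,
// rounding to the nearest microsecond.
function ticksToMicroseconds(ticks: number, timebase: Rational): number {
  return Math.round((ticks * timebase.num * 1_000_000) / timebase.den);
}

const tb: Rational = { num: 1, den: 90_000 };
const pts = 3_003; // one frame interval at 29.97 fps, in 90 kHz ticks
const us = ticksToMicroseconds(pts, tb);
```

The rounding step is exactly where exactness is lost: 3003 ticks at 90 kHz is 33366.67 µs, so converting to a fixed microsecond scale cannot represent it precisely, which is the argument for carrying the rational timebase through the API.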
Closing this issue as the timestamp type question is now tracked separately. Conclusions on duration: let's keep it, but keep it optional. Duration isn't required by video decoders and encoders, but it is potentially a nice-to-have passthrough attribute for authors. For these uses it's fine to keep it and leave it optional.

Duration is, however, very useful for VideoFrames output by ImageDecoder in cases where the image is animated and you need to know how long to present the final frame before looping. For this use, the ImageDecoder will set it accordingly.
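The last-frame case can be made concrete with a small sketch. The frame shape mirrors the `timestamp`/`duration` attributes on VideoFrame (both in microseconds), but the helper and the values are illustrative, not taken from a real ImageDecoder.

```typescript
// Illustrative stand-in for decoded animation frames; timestamp and
// duration are in microseconds, as on VideoFrame.
interface DecodedFrame {
  timestamp: number; // presentation start time, µs
  duration: number;  // how long to present this frame, µs
}

// Total time of one animation loop. Without the final frame's duration,
// an app could only compute up to the *start* of the last frame and
// would not know when to loop.
function loopDuration(frames: DecodedFrame[]): number {
  if (frames.length === 0) return 0;
  const last = frames[frames.length - 1];
  return last.timestamp + last.duration;
}

const frames: DecodedFrame[] = [
  { timestamp: 0, duration: 100_000 },
  { timestamp: 100_000, duration: 100_000 },
  { timestamp: 200_000, duration: 150_000 }, // last frame held longer
];
const total = loopDuration(frames);
```

This is why a timestamp alone is insufficient for animated images: the final frame's hold time is only recoverable from its duration.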
The `duration` is an optional attribute on the VideoFrame object that represents the time interval, in microseconds, for which the video composition should render the composed VideoFrame.