Currently one can apply effects to videos during playback and transcoding by implementing the IBasicVideoEffect interface and adding it to the player or transcoder. However, this does not support HDR video. When using it with Direct2D, one has to specify 'ARGB32' as the video subtype in SupportedEncodingProperties; I tried using the GUID for the 64bpp RGBA floating-point video encoding format, but it didn't work. My proposal is that specifying this pixel format should work and should result in 64bpp RGBA floating-point input and output surfaces for the effect. If one specifies 'ARGB32' as well, the transcoder/player should choose 'ARGB32' unless the video actually has more than 8 bits per channel. This does not require any new API, but it would be good if it were possible to test for support.
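For concreteness, here is a minimal C# sketch of how this would look from the effect's side. Only 'ARGB32' works today; the 64bpp float subtype string below is my assumption of the GUID referred to above and is part of the proposal, not something that currently works.

```csharp
using System.Collections.Generic;
using Windows.Media.MediaProperties;

static class EffectEncodingProperties
{
    // Assumed subtype GUID string for 64bpp RGBA floating point
    // (D3DFMT_A16B16G16R16F); treat this exact value as illustrative.
    public const string Rgba64Float = "{00000071-0000-0010-8000-00AA00389B71}";

    // Returned from IBasicVideoEffect.SupportedEncodingProperties.
    public static IReadOnlyList<VideoEncodingProperties> Build()
    {
        // Works today: 8 bits per channel BGRA.
        var argb32 = new VideoEncodingProperties { Subtype = "ARGB32" };

        // Proposed: 64bpp RGBA floating-point input/output surfaces.
        var rgbaFloat = new VideoEncodingProperties { Subtype = Rgba64Float };

        // Under the proposal, the player/transcoder would pick ARGB32 unless
        // the source really is more than 8 bits per channel.
        return new List<VideoEncodingProperties> { argb32, rgbaFloat };
    }
}
```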
In addition, it would be nice if MediaPlayer in frame server mode supported HDR. All that is required is for CopyFrameToVideoSurface to work when the target surface is 64-bit RGBA floating point (with the correct color space, i.e. scRGB). I'm not sure how it works internally; it may be necessary to tell the media player about the target pixel format in advance, in which case a new API would be required. Otherwise, all that is needed is a way to tell whether a 64-bit RGBA floating-point target surface is supported. It would also be useful to have an easy way to tell whether the video is high bit depth, although one can probably use the ProfileId property of the VideoEncodingProperties on the video track. (Incidentally, when transcoding files, it would be nice if ProfileId were populated when calling MediaEncodingProfile.CreateFromFileAsync(IStorageFile), as all the other properties are, instead of having to create a MediaPlaybackItem around a MediaSource and then call OpenAsync.)
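Here is a sketch of the frame server request, assuming Win2D for the target surface; copying into an fp16 target is exactly the part that does not work today:

```csharp
using System;
using Microsoft.Graphics.Canvas;           // Win2D (assumed)
using Windows.Graphics.DirectX;
using Windows.Media.Core;
using Windows.Media.Playback;

sealed class HdrFrameServer
{
    private readonly MediaPlayer _player = new MediaPlayer();
    private CanvasRenderTarget _target;

    public void Start(Uri videoUri, CanvasDevice device, uint width, uint height)
    {
        // 64bpp RGBA floating-point target surface (the unsupported case today).
        _target = new CanvasRenderTarget(
            device, width, height, 96,
            DirectXPixelFormat.R16G16B16A16Float,
            CanvasAlphaMode.Premultiplied);

        _player.IsVideoFrameServerEnabled = true;
        _player.VideoFrameAvailable += (sender, args) =>
        {
            // Requested behaviour: copy the decoded frame into the fp16
            // surface with scRGB values.
            sender.CopyFrameToVideoSurface(_target);
            // ... then draw effects into/on top of _target with the same device ...
        };

        _player.Source = MediaSource.CreateFromUri(videoUri);
        _player.Play();
    }
}
```

And this is the ProfileId workaround mentioned above, which the CreateFromFileAsync change would make unnecessary (the helper name is mine):

```csharp
using System.Threading.Tasks;
using Windows.Media.Core;
using Windows.Media.Playback;
using Windows.Storage;

static class VideoInfo
{
    // Hypothetical helper: open the MediaSource behind a MediaPlaybackItem
    // so that the video track's ProfileId is populated.
    public static async Task<int> GetProfileIdAsync(IStorageFile file)
    {
        var source = MediaSource.CreateFromStorageFile(file);
        var item = new MediaPlaybackItem(source);
        await source.OpenAsync();   // resolves the track list
        return item.VideoTracks[0].GetEncodingProperties().ProfileId;
    }
}
```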
I should note that I am using frame server mode during playback to add effects rather than IBasicVideoEffect, because it is easier to use my own device (the one I use to load bitmap assets), and it is rather awkward to ensure that the same device is used when implementing the IBasicVideoEffect interface; any way that could be improved would be good.
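To illustrate the device issue (Win2D assumed again): with IBasicVideoEffect the pipeline hands the effect its device in SetEncodingProperties, so anything created on another device beforehand has to be reloaded.

```csharp
using Microsoft.Graphics.Canvas;                    // Win2D (assumed)
using Windows.Graphics.DirectX.Direct3D11;
using Windows.Media.MediaProperties;

public sealed partial class MyVideoEffect           // rest of IBasicVideoEffect omitted
{
    private CanvasDevice _device;
    private CanvasBitmap _overlay;                   // hypothetical bitmap asset

    // Called by the player/transcoder; the device is chosen by the pipeline,
    // not by the app.
    public void SetEncodingProperties(
        VideoEncodingProperties encodingProperties, IDirect3DDevice device)
    {
        _device = CanvasDevice.CreateFromDirect3D11Device(device);

        // Any CanvasBitmap loaded earlier on a different CanvasDevice cannot
        // be drawn with this one; it has to be reloaded (or copied) here, e.g.:
        // _overlay = await CanvasBitmap.LoadAsync(_device, "Assets/overlay.png");
    }
}
```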
Finally, I've noticed that when using frame server mode, some videos appear slightly less sharp and less saturated than in the built-in mode. Is there some sort of enhancement being applied in the built-in mode? If so, is it possible to opt in to it in frame server mode? Or is the video processing simply higher quality? If so, can we have that quality in frame server mode, either by default or as an opt-in?