Add NV12 image format support for flatbuffers! #920
Conversation
@awawa-dev Since they have included the new changes ("Add NV12 image format support for flatbuffers!") in their master branch, I am asking how to handle this now. Do I now have to allocate the entire 150 MB LUT table for the flatbuffer, or can I simply perform a LUT calibration? (Although that is not as easy under webOS as on a Windows PC or Raspberry Pi.)
Hi @satgit62
to be able to receive the NV12 stream, a full 150 MB LUT file is needed. However, you don't have to worry about resources, because only one 50 MB part is loaded into memory at a time, just as with the smaller LUTs like the one you are using now. The 150 MB file consists of three 50 MB tables, and only one is used at a time: table 1 is for RGB tone mapping (the classic RGB flatbuffers stream), and tables 2 and 3 are for YUV and YUV with tone mapping (for NV12).
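The arithmetic behind those sizes can be sketched as follows. This is a minimal illustration, assuming a 3D LUT with 256 levels per channel and 3 output bytes per entry; the names and the segment order are illustrative, not HyperHDR's actual API:

```cpp
#include <cstddef>
#include <cstdint>

// A 3D LUT with 256 levels per RGB channel and 3 output bytes per entry
// occupies 256^3 * 3 = 50,331,648 bytes (~50 MB) per table.
constexpr size_t LUT_ENTRIES = 256u * 256u * 256u;
constexpr size_t LUT_TABLE_SIZE = LUT_ENTRIES * 3; // ~50 MB

// The full 150 MB file holds three such tables; only the one needed
// for the current mode is loaded (segment names are hypothetical).
enum class LutSegment { RgbToneMapping = 0, Yuv = 1, YuvToneMapping = 2 };

constexpr size_t lutFileOffset(LutSegment seg)
{
    return static_cast<size_t>(seg) * LUT_TABLE_SIZE;
}
```

So loading, say, the YUV-with-tone-mapping table means seeking 100 MB into the file and reading a single 50 MB block, which is why memory use stays the same as with the smaller LUTs.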
Yes, it's best to perform a calibration. We are working with @s1mptom to provide an alternative calibration method: instead of a Windows PC and a browser, you would simply play a calibration test movie on the source player, which should be much simpler. There is still a lot of work needed. It is theoretically possible to calibrate Dolby Vision using such a file, but I am having problems encoding such a DV file (from slideshow boards) using ffmpeg, which only recently added support for this format, so there is very little information on how to do it; we could use some help here. Currently it's work in progress here: #896
You need a special webOS grabber that provides a raw NV12 stream; @s1mptom managed to implement and test one in his fork.
@satgit62 At the moment, my implementation in hyperion-webos is quite simple and more of a "temporary" solution for debugging the overall mechanism. That's why I haven't shared it on Discord yet. I've implemented a "direct" image transfer in my fork, bypassing the converter for the libvtcapture backend for versions 5+ (since I personally have an LG G3). I'm not entirely satisfied with the auto-calibration results so far, but I'm still testing and checking things out =) @awawa-dev has been actively helping me with this.
Until recently, the HyperHDR flatbuffers server only supported the raw RGB format, which had many limitations.
But thanks to @s1mptom's inspiration, this PR adds the NV12 image format to the HyperHDR flatbuffers server (more discussion here: #916). For now you can use it with the rather exotic and unsupported-here webOS capturing application, but it is intended to work with any client, and in the future it will be easy to extend to other formats such as MJPEG.
Advantages:
Important:
My thought after receiving the sample frame (thanks again @s1mptom): since webOS uses NV12 internally, there is only one chroma (color) sample for every four luma (brightness) samples, i.e. 4:2:0 subsampling. So even if the webOS capture converts the frame to 320x240 RGB before sending it to HyperHDR, as it does now, in reality we only have 160x120 color samples and 320x240 pure brightness values blended together to produce 320x240 RGB. This can matter for dense LED strips, where you will need a higher resolution for a better effect.
But that's how the internal webOS capture works, so it's not a fault of the format: they chose it and we can't change it. However, we can now abandon the unnecessary conversion from NV12 to RGB (which does not improve the quality at all and brings other problems) and send the original NV12 stream via flatbuffers.
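To make the subsampling concrete, here is a minimal sketch of the NV12 (4:2:0) buffer layout and the byte math behind it; the struct and function names are illustrative, not HyperHDR's actual API:

```cpp
#include <cstddef>

// NV12 stores a full-resolution Y (luma) plane followed by an interleaved
// UV (chroma) plane at half resolution in both dimensions, i.e. one U+V
// pair shared by each 2x2 block of luma samples.
struct Nv12Layout {
    size_t lumaBytes;    // width * height
    size_t chromaBytes;  // (width / 2) * (height / 2) * 2
    size_t totalBytes;   // 1.5 bytes per pixel overall
};

constexpr Nv12Layout nv12Layout(size_t width, size_t height)
{
    const size_t luma = width * height;
    const size_t chroma = (width / 2) * (height / 2) * 2;
    return { luma, chroma, luma + chroma };
}
```

For a 320x240 frame this gives 76,800 luma bytes but only 160x120 = 19,200 chroma pairs, which is exactly why a 320x240 RGB conversion cannot add color detail that was never captured.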
Some results (webOS, thanks s1mptom):
YUV coefficients can now be switched during LUT calibration thanks to the native NV12 format; this was impossible with the RGB flatbuffers stream.
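To illustrate why the coefficient set matters: the same NV12 bytes decode to different RGB depending on whether Rec.601 or Rec.709 coefficients are assumed. This is a hypothetical sketch using the standard limited-range conversion formulas, not HyperHDR's actual conversion code:

```cpp
#include <algorithm>
#include <cstdint>

// Luma coefficient sets: Kr and Kb differ between the two standards,
// so the derived R/G/B contributions differ too.
struct Coeffs { float kr, kb; };
constexpr Coeffs REC601{0.299f, 0.114f};
constexpr Coeffs REC709{0.2126f, 0.0722f};

struct Rgb { uint8_t r, g, b; };

Rgb yuvToRgb(uint8_t y, uint8_t u, uint8_t v, const Coeffs& c)
{
    const float kg = 1.0f - c.kr - c.kb;
    // Expand limited range (16..235 luma, 16..240 chroma) to full range.
    const float yf = (y - 16) * (255.0f / 219.0f);
    const float uf = (u - 128) * (255.0f / 224.0f);
    const float vf = (v - 128) * (255.0f / 224.0f);
    const float r = yf + 2.0f * (1.0f - c.kr) * vf;
    const float b = yf + 2.0f * (1.0f - c.kb) * uf;
    const float g = yf - (2.0f * c.kr * (1.0f - c.kr) * vf
                        + 2.0f * c.kb * (1.0f - c.kb) * uf) / kg;
    auto clamp8 = [](float x) {
        return static_cast<uint8_t>(std::clamp(x, 0.0f, 255.0f));
    };
    return { clamp8(r), clamp8(g), clamp8(b) };
}
```

Feeding the same saturated sample through both coefficient sets yields visibly different RGB, while neutral grays (U = V = 128) come out identical; a calibration that receives the native YUV data can therefore pick the set that matches the source, which an already-converted RGB stream cannot undo.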