Separate --target-peak-sdr for SDR content #8249
If SDR gets its own "target-peak-sdr" setting, I suggest renaming "target-peak" to "target-peak-hdr" for consistency's sake.
Well, not really. In the situation I've described, --target-peak still serves a purpose for SDR content playback, just as it does now - it defines the screen's peak brightness, which may be different from the brightness you want to display your SDR content at (--target-peak-sdr), and can be set regardless of whether it's an SDR or HDR screen. It just so happens that it's also used for HDR content tone mapping, but HDR content doesn't need another setting, because it defines the full dynamic range, so the screen's peak brightness is all that's needed. For SDR content you need both the screen's peak brightness (--target-peak) and the peak you want your SDR content to be mapped to, which is currently hardcoded at 100 nits (203 nits in master), and which --target-peak-sdr should take care of.
Ah. Makes sense.
While it's a good idea to get rid of arbitrary hardcodings, I vehemently disagree with the statement that "mastering SDR content for 100nits peak is a disaster, people are watching SDR content on their SDR displays with brightness set to the max all the time". None of the people I know do that, and I target something in the range of 80-120. I still need my eyes, thank you very much. Maxing out brightness on displays with 1000:1 contrast, which is all TN & IPS SDR displays, is just silly. Some people would do so, though. Or one can tempt luck with 3000:1 VA SDR displays, where 200-300 might make some sense. Many TVs are like that nowadays, I think. Anyway, good proposal.
I think it would be a good option to have. It would be useful in my case, as I run my display in HDR at all times. The new default of 203 is better, but my display is calibrated at around 300 nits when playing SDR content, so being able to set that peak would be helpful.
No, I'm not. HDR display users are a minority so far and will be for a while. 200-300 is ridiculously bright for displays with such low contrast; it hurts the eyes and looks washed out. And how many of those complaints about "HDR displays" come down to them maxing out at 400-500 nits and using some kind of "dynamic contrast" nonsense?
Well, 80-120 nits is what sRGB/BT.709/SDR were targeting, apparently because HDR was not a thing then. It's what SDR displays are supposed to have by default, which is why it's what SDR should be mastered for. Also, I have a bone to pick with modern video mastering. Gamma and brightness look all out of whack on newer recordings, especially on my simple VA display. It doesn't seem to be a problem on most older content. Not sure if it's a problem of the sources or of mpv's interpretation with default settings, but I'd bet mostly on the former. I have colour management enabled in Firefox (as much as it allows) and EDID-based colour profiles generated by DisplayCAL (before it was removed from distros due to being python2-only). In your Witcher screenshots, the first one looks like it has the most brightness and contrast, very crisp (but some pink petals seem to clip to white), while the third is dimmer (but no clipping) and the second is dim (maybe even more so than the third) and clipped (which it has no good reason to be). Picard looks dull but not obviously washed out; not sure if it's the player's interpretation's fault or just Hollywood's ugly brownish filter. But your use case being the opposite of mine is the exact reason why this should be configurable.
I was talking about SDR display users. Just earlier this week I had a conversation on reddit with one SDR display user arguing that SDR content looks fine on his 500 nits iMac with brightness maxed out. Insanity, I know. But that's the way it is. And the average video content consumer has basically no knowledge about video content mastering, nor about how it's supposed to look. And granted, by blowing up SDR to 200+ nits I'm not giving a damn about the original intent, but I'm trying to stay within reason, with a properly calibrated gamma curve and so on. I just don't consider 100 nits to be enough as far as dynamic range goes, not in this day and age. In regard to contrast, I respectfully disagree. As I've said, it's not all about black point. Viewing conditions matter, and the further those are from complete darkness, the more it makes sense to boost the brightness, regardless of whether it's OLED or TN. One could even argue that dynamic range influences image perception way more than pure contrast. Try telling an average person that the same display calibrated towards 100 and 300 nits has almost identical contrast in both cases and they won't believe you. Numbers aren't everything. My 250 nits SDR TN screen definitely doesn't look washed out, and I've seen some 600 nits OLEDs and 1000 nits QLEDs, so I know there are stupidly better displays around. If you're used to staring at a 100 nits display for hours, your perception may be different. I've got an IPS display in my smartphone set to 1/3 brightness, because at 1/2 it's brighter than my monitor, and at max it's a damn flashlight, so it's not like I'm trying to stare into a photon engine here.
And that's why that standard should be long dead already. Current "SDR" displays are more like half-HDR as far as dynamic range goes. Properly mastered HDR content looks much worse on them than on an actual HDR display, but at the same time it's light years ahead of SDR @ 100 nits.
Which is better than 100 nits max. One can always lower the brightness and have their 100 nits. Most people won't. The first comment you've linked mentions 120-160 nits, but that's perhaps for the average office user. The second comment suggests 120-160 nits as the SDR target, calling 100 nits "rather dark". And what's the current hardcoded target in master? 203 nits. As much as I respect people willing to set it to 100 nits, there are many for whom 203 nits will not be enough. I'm half programmer, half gamer/media consumer. For coding, I've got a low contrast theme. For everything else, going below 250 nits is just a waste of display capabilities. So yeah, this can't be hardcoded.
I very much doubt it's an "mpv's interpretation" problem. Silly question: what's your display gamma, 2.2? Because as long as files are properly tagged, mpv shouldn't have a problem. I've got Ctrl+r vf toggle format:colorlevels=full in input.conf for those that are not.
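(For anyone wanting to copy it, that binding as an input.conf line - reproduced exactly as quoted above, so the filter string is the commenter's own spelling:)

```
# input.conf: toggle forcing full-range color levels on mis-tagged files
Ctrl+r vf toggle format:colorlevels=full
```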
Those are mpv screenshots, so they should be standard sRGB. As mentioned earlier, they're calibrated towards a 250 nits peak brightness, so your display should be at least that bright to watch them properly. I can make other ones, calibrated towards a different peak, but since most are tonemapped HDR, a 100 nits peak is simply not enough. The "very crisp" first screenshot from The Witcher is standard SDR, as is. Point being, when you compare the first and second screenshots, you'll notice that brightness is the only difference, and when watching both on a 250 nits screen, the first one looks like a blown-out version of the second - which it is, only the processing went the other way around when creating those screenshots. As for Picard, it's all HDR, so it really has to be seen at 250 nits minimum - below that it's guaranteed to look dull. And yeah, one shot is overselling the "sunny vineyard" feel a bit, but overall I consider the colors there to be very good, accurately presenting the lighting in different environments. It's all about that 100 nits range being pure insanity on current SDR displays. When the content targets 200-250 nits (like those HDR scenes with tone mapping), it suddenly gets a lot more breathing room. It doesn't have to get much brighter on average to look much better, because every minor highlight here and there increases the overall dynamic range. SDR flattens everything into oblivion, which is why it gets that much brighter when blown up to 200+ nits - because the relative brightness of some scene elements is different than in HDR, with darker elements being brighter than they should be. It shouldn't be the case (it suggests their tonemapping is worse than what mpv can do), but that's how it is often mastered. If SDR content were mastered properly towards 200-300 nits, it would look much better by default. As it is, it's artificially limited to 100 nits and suffers. That may be the reason why some new content looks worse in SDR than it used to - they're mostly filming in HDR now and tonemapping afterwards.
Agreed.
For me, black point is crucial and I prefer watching things in a darker environment. But, yeah, that's how we got SDR with blown-up brightness, and I hate that trend. If it had the contrast capability to match, to actually achieve that dynamic range, then it would be fine. But even supposedly "HDR" displays are stuck with giant backlight zones and 8-bit encoding (how many of those "10-bit" displays are 8-bit+FRC, like TN with its 6-bit+FRC, with whatever dithering hardwired?).
And @haasn, who made that commit, uses 90 nits :] Ironically, SDR displays that have default brightness blown out may also have lowered default contrast, which decreases dynamic range further. And with random controls it's never clear where the panel's native settings end and distortion due to signal meddling by the control board begins.
It should be sRGB-compliant on the display, in the driver and in mpv.
Ironically, the first one looks just fine to me, clipping notwithstanding. But the second looks the worst because, unlike the first, where brightness is gradually and naturally distributed, it gets clipped while being dull - it's not even actual white but grey. The third has colour gradation preserved while being dull. If I maxed out brightness on the display itself, maybe the third one would look better, but github's background would burn my eyes - it's already not good. Guess we need that stuff for the compositor too, maybe after Wayland gets itself together and the KDE & Gnome devs actually implement it.
Yeah, alright colours but it's a gloomy day on that vineyard :]
Maybe. It almost always looks dull yet overblown and low-contrast to me; I have to decrease brightness and increase gamma in mpv. But I'm not ready for 200-300 nits of white blasting at me. @lextra2
@v-fox throw those screenshots at fullscreen, then raise brightness to ~250 nits. they're 1080p for a reason (originally jpeg though, but it's not a disaster and shows what needs to be shown).
Yeah, but that wouldn't make such settings any less impractical. I can't change physical brightness every time I run a player.
watching them in a window, with reduced peak brightness, when they're mostly tonemapped HDR shots, makes limited sense as well. sidenote regarding games: the biggest HDR-related abomination these days is games tonemapping to... you've guessed it, 100 nits SDR. not all, but at least some. like they couldn't implement a damn peak slider for SDR like they do for HDR. the difference between SDR straight out of, for example, Gears 5, and properly tonemapped HDR output from that game, is simply insane.
That's the thing though: on an SDR display it's not reduced.
Well, that's the consequence of hardcoding for 1000:1 TN & IPS LCD 96dpi displays (say hello to minuscule non-scaled text) and just removing any adaptation code. I'm baffled by what has been going on in their heads since 2010, and how good ideas leave, never to come back. It would be better if games, players and all GUI apps just rendered in some absolute colour space (RGB 16f/32f for the full visible spectrum + "translucency & self-luminance" for everyone!) and let the compositor down-sample and correct things for output devices. But that would require a Wayland-like spec for Windows & Linux & MacOS with an already-implemented colour-correction protocol.
What? Normal SDR content targets 100 nits, so you can watch it with the display set to 100 nits or whatever and it'll look fine. Those screenshots (at least the HDR ones, so most of them) are tonemapped to 250 nits. Watching them on a display set to 100 nits guarantees they'll look wrong. If they were tonemapped to 100 nits, they would be way brighter, at the cost of a flattened picture and crushed details. They're simply mapped to actual SDR display capabilities (here, set to 250 nits brightness), not to the 100 nits standard.
"Reduced" is not the same thing as just being "low". That implies that it's capable of more, like HDR limited to SDR or "limited" range of encoding instead of "full". That is not the case though, peak brightness in my window is not reduced. Which is why what may look overblown & fine to you, looks fine & dim to me. Still, I wonder if it's possible to tune up mpv on 120-160 nits SDR display so the result would be the first Witcher screenshot but without clipped highlights. Something like:
But I don't have the videos to test it out, and you don't have an SDR display & don't like lower brightness.
"Capability" isn't a word I would use for something that goes outside of standard and makes things weird. Like I've said, actually setting SDR display to that would blow-up brightness for everything but the player that reduced its own. Then it would be "reduced" but, otherwise, impractical because everything else would not know that it's being shown on an over-blown display and wouldn't correct itself. |
Here's a crude picture of The Witcher comparing the 250 nits tone mapping to the HDR passthrough (the peak is pretty low on streaming media - probably only around 250 nits anyway). You can see that it's pretty close, and with a tone mapping peak of 203 nits, the difference is possibly negligible. The real test is with something that has a much higher peak.
Either I'm confused, or you are, terribly. An SDR display capable of 200-300 nits, calibrated towards a 100 nits peak, is limited to those 100 nits. From my understanding, you're running SDR at about a 120 nits peak. My screenshots were tonemapped towards a 250 nits peak, meaning that to display them properly, you need an SDR display set to a 250 nits peak - basically, with the whole gamma curve stretched from the 100 nits standard range to 250 nits. Which is exactly what happens when you set the brightness high enough. You cannot display them properly on a display calibrated towards a 100 nits peak; it won't "cut out" those brighter values - instead the whole picture will be dimmer, because those are essentially sRGB SDR pictures, so no tonemapping is performed when displaying them. So yeah, when you set your display to a 120 nits peak when it's capable of much more, you are limiting its capabilities. Peak is peak, it won't go higher than that, despite the fact that it could with different settings. You're sacrificing dynamic range for black point, that's all, and those screenshots are calibrated towards a much higher dynamic range than 120 nits.
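(To put the "whole gamma curve stretched" bit in rough numbers - treating the panel as a pure gamma-2.2 device, which is a simplification:)

```latex
% SDR encoding is relative: a signal value v in [0,1], on a pure gamma-2.2 display,
% comes out at L(v) = L_peak * v^2.2. So the same picture on a 120 nit panel vs a
% 250 nit panel differs only by a constant factor -- nothing gets cut off, every
% pixel is simply dimmed by the same ratio:
\[
  L(v) = L_{\mathrm{peak}} \cdot v^{2.2}, \qquad
  \frac{L_{120}(v)}{L_{250}(v)} = \frac{120}{250} \approx 0.48
\]
```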
First of all, I do have an SDR display - a 250 nits TN, as I've said before. Second, hdr-compute-peak=yes? tone-mapping=hable? With all due respect, either you're unaware of what those options do, or you don't care about accuracy the way I do. hdr-compute-peak brightens scenes when it decides there's enough dynamic range left for the given target-peak, but it can cause sudden, unnatural brightness changes, even when configured to avoid them. hable tone mapping, on the other hand, was supposedly designed for games. It tries to preserve details at the cost of accuracy, while darkening the whole picture. The way I see it, gamedevs noticed that proper tone mapping of the HDR picture in their games towards a 100 nits SDR target, on a display running at 200-300 nits, gives a picture that's way too bright, so someone came up with that quick hack. It doesn't look right at all; it does not produce a correct picture. It is artificially darkened to compensate for the target peak being too low. Most of the HDR picture - often the whole scene - is usually within the 200-300 nits range. Often it's even close to 100 nits. You can easily check it by setting target-peak to your monitor's maximum brightness, setting the brightness to the max, then displaying HDR content with tone-mapping set to clip, or even mobius. Which is exactly what the second and third screenshots from The Witcher show - the SDR picture at 100 nits has about the same average brightness as the properly tonemapped HDR picture. Meaning two things: on an actual HDR screen that HDR content would have the exact same brightness when calibrated properly, and an SDR screen calibrated towards a 200-300 nits peak can show most of the HDR picture with correct brightness, minus the lack of 10-bit precision. The two things lost on SDR are highlights outside the display's range and color precision, that's it.
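(As a concrete version of that check - display brightness maxed out, target-peak set to the panel's rated peak; the 250 figure and the filename are just placeholders:)

```
# map HDR content to the panel's real peak; clip is the "no tricks" reference,
# mobius only compresses the few highlights that still exceed the panel
mpv --target-peak=250 --tone-mapping=clip   some-hdr-video.mkv
mpv --target-peak=250 --tone-mapping=mobius some-hdr-video.mkv
```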
An SDR picture at 200 nits or more is as correct as an SDR picture at 100 nits, just brighter, assuming proper gamma calibration. Of course you may argue that the top of the SDR range should be 100 nits according to the spec and everything else isn't correct, but that doesn't change the fact that this is the only difference when the display still follows gamma 2.2 at the increased brightness. The ideal would be to be able to define target-peak for the display and let the compositor manage it per app, but with 8-bit panels that makes no sense. When HDR monitors become common, I guess that's what will happen. On SDR it is what it is - if you limit the brightness of your screen to 120 nits or some lower value, it will not display more than that, so it will not display content mapped towards 250 nits properly. Basically, what you're doing by displaying those screenshots on a 120-nits-or-so screen is like trying to display HDR content on an SDR display with linear mapping. What's supposed to be 250 nits becomes 120 nits on your screen, and everything below 250 gets similarly darkened, because there's no tone mapping - because those are technically SDR screenshots, with the tone mapping of the HDR content already performed.
@unic0rn I compared your screenshot to metadata passthrough, so you could see what it was supposed to look like. A photo was necessary because a screenshot isn't possible. The peak is so low on streaming media that passing the metadata to the display gives you the exact content provider's intention, with no tone mapping. The peak of my display is around 1600 nits, well above the peak of this media. For SDR, it's calibrated to around 300 nits. So I'm able to directly compare your 250 nits screenshot to the actual peak of the frame. Something from a Blu-ray would show the difference more clearly. But this is why I called it 'crude' - just to give a very basic comparison. Even if I took an HDR photo, you still wouldn't see it correctly on a web page.
That's what I was trying to illustrate - that being able to set the target peak is a good idea, as it's pretty close to what it should look like. While there are noticeable differences, it's close. Bear in mind that I'm using the PQ curve and bt.2020, so there will be a color shift.
@Doofussy2 "around 300nits" explains the brightness difference. If it were calibrated to 250 nits for SDR, not counting colorspace differences, it should look the same - which is kinda obvious. Most HDR content should look fine actually; usually only highlights go above 300 nits. But the whole HDR discussion here is a kinda different thing from the --target-peak-sdr I'm suggesting. I was just trying to illustrate to @v-fox that SDR content at 100 nits is too dim and washed out, by comparing it to tonemapped HDR - which often targets the same average brightness but looks way better - and that boosting SDR brightness is a way to mask its lack of dynamic range a bit. Not ideal, far from it, but as I've said, for me tonemapped HDR looks better than SDR @ 200-250 nits, which in turn looks better than SDR @ 100 nits, and when you can't have HDR content - or it's terribly mastered - that's all that's left. Sidenote: it's interesting that they apparently chose reinhard (or something very similar) for the SDR mastering of The Witcher. It looks like mobius stays so close to the original HDR version that at 100 nits the lack of dynamic range becomes too obvious. If they targeted at least 200 nits for SDR, mobius would look way better, but instead we got reinhard @ 100 nits and a flattened picture. But if all this discussion doesn't prove that mastering SDR content to 100 nits is a waste of the potential of most displays on the market, then I don't know what does.
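(Roughly what I mean by that comparison, in mpv terms - the numbers mirror the guesses above, and the source file is hypothetical:)

```
# approximation of the official SDR grade vs. what a 200 nit target would allow
mpv --target-peak=100 --tone-mapping=reinhard witcher-hdr-sample.mkv
mpv --target-peak=200 --tone-mapping=mobius   witcher-hdr-sample.mkv
```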
Yes, I agree. SDR at the present 203 nits is significantly better (I think) than the original 100 nits. An option to set the peak would be nice to have.
The problem with 203 nits as the current default is that while it looks better than 100 nits, it makes no sense - it's just a number out of thin air, kinda. 100 nits is a standard, as bad as it is. 203 nits seems to have come from the fact that they've switched the default tone mapping to bt.2390 (I didn't check it out yet, not running the current build), and well... just search this pdf for "203": https://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-BT.2390-8-2020-PDF-E.pdf Then check page 51 (actual page number 49) of that pdf - they're suggesting an SDR peak of 200 nits there. So yeah, I wonder how the devs here figured 203 nits to be the right value. And I also wonder why, when MovieLabs suggests 200 nits as the right target for SDR, the industry still tonemaps HDR content to 100 nits for SDR releases. Seriously, who the hell watches TV series on a reference SDR monitor? It's all such a goddamn waste.
lol. I feel ya. It's like the people in charge never even bother to check playback on a midrange TV.
See https://code.videolan.org/videolan/libplacebo/-/commit/9d9164773 |
@unic0rn I'm confused, isn't
I use autoprofile to switch. 203 nits is a thing. The HDR analysis tool built into reference monitors uses it as the boundary between SDR and HDR. If you use these tools to check out 4K Blu-ray discs, they use 203 nits as the reference white for HDR. So I think it's correct to hardcode it as the reference white and set it as the default value for
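(An auto-profile setup along those lines might look roughly like this - a sketch with made-up nit values; profile-cond needs a reasonably recent mpv:)

```
# mpv.conf sketch: keep the reference peak for SDR, switch target-peak for HDR sources
target-peak=203

[hdr-content]
profile-cond=p["video-params/gamma"] == "pq" or p["video-params/gamma"] == "hlg"
profile-restore=copy
target-peak=800   # whatever the display can actually reach in HDR mode
```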
@unic0rn I think you are right, hdr-compute-peak changes the brightness unnaturally; it also darkens the image when it is bright. I can see lots of unnatural changes in Interstellar.
Hi, any news on this? I think this would be an awesome feature for everyone with an HDR monitor. If you are in a bright room you want this value to be greater than 203 nits; in a dark room it's perfect. Also, one interesting thing I noticed: every other player that I tried is not using a hardcoded value, but instead depends on the value that I set in the Windows HDR settings. There is a slider to set the SDR/HDR brightness balance. Every player besides mpv will get brighter if SDR content is played and I increase this slider; only mpv stays at the same luminance. This slider has no effect on HDR content. So maybe an auto mode that respects the OS setting would also be great. For non-Windows users, I mean this menu:
@mnisius Obviously none, sadly. I didn't have much time to look at the code yet, but I assume the main problem is that it's probably not an entirely trivial patch. --target-peak defines the peak brightness for both HDR and SDR target curves, as it should. The problem is, the peak for source SDR curves - so for SDR content - is hardcoded at 203 nits. --target-peak defaults to 203 nits for SDR target curves, so the end result is what people expect. For HDR target curves it's a different story. Additionally, --target-peak should still default to 203 nits for SDR target curves - it defines display brightness, not content brightness directly. --target-peak-sdr should accept values up to the current --target-peak, or display a warning and clip its value, since setting the source SDR curve peak higher than the target SDR curve peak would enforce tonemapping, and for SDR content that makes no sense. So the proper usage would be setting --target-peak to whatever the display brightness is - as it is now - and --target-peak-sdr to the brightness SDR content should be displayed at. As a result, --target-peak-sdr should also default to 203 nits, since that's the current hardcoded value. It may sound simple on paper, but a quick look at mpv's code - where 203 shows up only as a white point value - makes it clear that's not exactly the case. That value was obviously never meant to be changed - and it shouldn't be. It isn't hard to figure out what needs to be done; it's more a question of how to do it without making it look like a messy hack. As a bonus, subtitles are also affected by that 203 nits SDR source curve default, and they should follow the --target-peak-sdr setting. I may take a shot at patching it myself, but since I'm a Linux user, don't count on auto mode for this. Not sure when I'll have time for it though. Speaking of Linux, it has no HDR support in desktop environments yet - it's in the works, but on the hardware front the kernel supports some Intel chips only, so it'll take time - so until it gets standardized in wayland, setting --target-peak-sdr manually is the only way. Down the line, when Linux desktops start running in HDR, this setting will become more important. As it is, it's mostly useful for SDR displays and for users doing tonemapping of HDR content that screws up SDR content display in the process. With HDR displays, there's #8219 - but maybe it's possible to output HDR on some Intel chips without using HDR passthrough, since there's HDR display support for those in the kernel, and some support in mesa due to ongoing work on the wayland/gnome front, so who knows. If it works (I don't have the hardware to test it), it'll involve both changing the display mode and disabling desktop compositing - but in theory, it may work.
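(Usage-wise, the proposal boils down to something like this - note that --target-peak-sdr is hypothetical and doesn't exist in mpv:)

```
# hypothetical mpv.conf once --target-peak-sdr exists (the second option is the proposal, not a real one)
target-peak=800        # what the display itself can do
target-peak-sdr=300    # what SDR content (and subtitles/OSD) gets mapped to
# proposed rule: a target-peak-sdr above target-peak would be clipped, with a warning
```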
@unic0rn thank you for looking into it. Auto mode is not really important to me; I just wanted to mention it because most other players seem to respect this Windows slider out of the box. I'm not sure if they actively implemented this as a feature or it just works for them...
@mnisius Hell knows, could be the latter. mpv is pretty low-level compared to at least some video players for Windows, but that's exactly why I like it - it gives full control over the picture.
So I was just playing around with some mpv settings and made a nice discovery that I wanted to share with you. By setting d3d11-output-csp=srgb it is possible to adjust the brightness of SDR content with the Windows HDR/SDR brightness balance. So now I can control the brightness with this slider to my liking 😀
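(For anyone wanting to try it, the relevant mpv.conf bits - this assumes Windows running in HDR mode and the d3d11 gpu backend:)

```
# mpv.conf: output through an sRGB swapchain so the Windows SDR/HDR
# brightness slider applies to mpv's SDR playback as well
gpu-api=d3d11
d3d11-output-csp=srgb
```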
Thanks for sharing! I was just looking for this exact solution!
Since the operating systems have a slider, mpv should obey it - and in the absence of one, provide its own and default to the current 203 nit value. Really, the SDR mastering peak is used only for playing SDR content in forced HDR mode, which is what you have to do if you have multiple content types. In practice, users set brightness according to lighting to keep a semblance of contrast, and quite a few games provide a control for SDR brightness or "UI brightness". In the meantime, I have to use the contrast dial to avoid getting blinded by this "choice" and turn it back up for HDR content or when inverse tone mapping. That is not an acceptable situation. Forcing an sRGB surface robs you of wide color gamut and HDR in case you have varied content, and mpv cannot switch it on the fly, it appears.
As it is, --target-peak works well with tone mapping and also makes it possible to treat the display as an "HDR in disguise".
But here's the thing: every half-decent display is "HDR in disguise" (or rather "half-HDR" - not as bright, not as precise, but not 100 nits either). The whole idea of mastering SDR content for a 100 nits peak is a disaster; people are watching SDR content on their SDR displays with brightness set to the max all the time - what 100 nits peak displays are on sale these days? The average for inexpensive monitors is around 250-300.
Some people have much brighter screens though, so --target-peak should work as it does. What I propose is --target-peak-sdr, being the "maximum signal level for SDR content only". It may default to 100 for backward compatibility, and should work together with --target-peak; for example, if someone has an 800 nits peak HDR screen, --target-peak=800 --target-peak-sdr=300 should display SDR content at 300 nits max, while tone mapping HDR content to the full 800 nits range.
At this point I can't throw target-peak into the config to use different HDR tone mapping settings, because it'll mess with SDR playback. A 100 nits peak for SDR may be fine for calibration purposes, but it is not for actual video playback.
And a sidenote: --target-peak-sdr should also set the peak brightness for subtitles/OSD. As it is, those are displayed at 100 nits with --target-peak=250 on my SDR screen, which makes little sense - and I can't imagine that being enough on HDR screens either.
EDIT: just noticed that the SDR default has changed from 100 to 203 in master. Makes sense, but that's just an arbitrary default - better than the old one, but considering the variety of displays people have, my proposal stands.