Wide Gamut Color and HDR Discussion

Do you have it?
Do you want it?
What challenges did you encounter in setting it up?

>Do you have it?
no
>Do you want it?
Want many things, but there's no way I can afford one. For now I'll stick with my CRT; one day, if I become rich and wide gamut uLED becomes a thing, maybe I'll consider switching.
I'm really curious what my photos would look like on an HDR display.

Wide gamut monitors are very cheap. Sometimes they're cheaper than standard sRGB monitors because of the oversaturated colors you get if you don't have a color profile set up properly. And even if you do have a profile, it still won't look exactly right.

I own two wide gamut monitors: a Dell 2408wfp and an HP zr30w. Paid $40 and $120 for them respectively.

When everything is set up properly on them they look good. Really good. But it is very hard to find the right combination of settings, and a lot of things (like games) will look wrong whatever you do.

Still, it is fun to just tinker with for those who enjoy that sort of thing.

I had no idea, I thought they were uber expensive. How do those monitors compare to CRTs when it comes to viewing angles and gamma?

HDR is what is expensive.
>viewing angles
Not nearly as good. They're IPS displays so they are good for an LCD monitor but can't compare to a CRT.
>gamma
If you mean black level, not as good, since they are still LCDs. If you mean contrast ratio, they are 1000:1 for the most part, with fairly high peak brightness. If you mean smooth and accurate gamma curves, they are very good.
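To make the gamma curve part concrete, here's a quick sketch of mine (plain Python with numpy, nothing from the monitors themselves) of the piecewise sRGB transfer function next to a flat 2.2 power law. A display with smooth, accurate gamma tracks one of these closely across the whole range instead of wobbling around it.

import numpy as np

def srgb_eotf(v):
    # Piecewise sRGB curve: encoded value in 0..1 -> linear light in 0..1.
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def gamma22_eotf(v):
    # Plain 2.2 power law, for comparison.
    return np.asarray(v, dtype=float) ** 2.2

for v in np.linspace(0.0, 1.0, 11):
    print(f"encoded {v:.1f} -> sRGB {float(srgb_eotf(v)):.4f}, gamma 2.2 {float(gamma22_eotf(v)):.4f}")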

I see, but I can't even imagine the level of complication when I'd have to get it to work on Linux, and how photos would translate between wide gamut and sRGB.
I'm still trying to figure out how to post-process photos for Adobe RGB or whether I should stick with sRGB; that alone is another layer of confusion I'm trying to work through. Imagine having to add a wide gamut display on top of that, it would just make things even harder.
But yeah, I would buy one if I wasn't in absolute poverty (neet), where even 40 bucks is a big deal.

I've discovered one trick for viewing photos with 10-bit color on my PC with a GeForce series card. Usually you need a Quadro to view 10-bit photos (I think), since most photo viewers use OpenGL and 10-bit OpenGL output is only supported on the Quadro line.

However, 10-bit color is supported through DirectX on the GeForce series. This means video players and video games can use it.

In Media Player Classic Home Cinema I was able to open a 10 bpp test image and display the full range of colors that were not passing this test when viewed in standard photo viewers like XnView. Too bad it is just a viewer, so there's no way to edit the photo in real time.
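If anyone wants a test image of their own, here's a rough sketch of how I'd generate one (plain Python, assuming numpy and imageio are installed; the filename is made up). It writes a horizontal gray ramp with 1024 steps into a 16-bit PNG; on a true 10-bit chain it looks smooth, on an 8-bit chain you get visible banding.

import numpy as np
import imageio.v3 as iio  # assumes a recent imageio with the v3 API

width, height = 2048, 256
ramp = np.linspace(0, 1023, width).round().astype(np.uint16)  # 10-bit code values, 0..1023
row = ramp * 64                                                # scale 0..1023 into the 16-bit container
img = np.tile(row, (height, 1))                                # repeat the row to make a visible strip
iio.imwrite("ramp_10bit_test.png", img)                        # grayscale 16-bit PNG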

>Do you have it?
A TV for home theater


>What challenges did you encounter in setting it up?
It's really for movies. If you're using it with a PC for anything other than movies and maybe games, just set your display to sRGB or limited color mode, because otherwise you're going to have a hard time using it.

Yeah, I guess the bottom line is that wide gamut and HDR are still in their early days and nothing really supports them yet.
There is little to gain from being an early adopter; better to wait a few more years until standards settle, things get universal support, and the technology overall improves.

Yes.
It's neat.
I'm using the manufacturer calibration; all I use it for is shitposting and games. It looks really nice.

It does support true HDR10 too, though: IPS panel with a selective backlight. But WCG alone is totally worth it.

Being someone with an (ultrawide, HDR10) IPS panel and several CRTs, even high-end ones like LaCies, the viewing angles are pretty much the same... I see no degradation even at extreme angles.

Don't have this problem with my AMD card. Does Nvidia really lock 10-bit to Quadro only?

WCG does not give you good gamma, HDR does that.
Viewing angles have nothing to do with HDR or WCG, only with what panel technology is used.
Stop confusing shit.

Yeah, I know, I was asking about cheap IPS in general, since I don't own an IPS LCD, only TN crap.

Stop buying cheap monitors? The way Wide Color Gamut works is that it's an extension over sRGB. If it's calibrated well, it's going to look exactly the same for content that does not support it, while looking better for content that does, especially games. If yours looks different, you probably have bad calibration or just a very bad panel.
HDR, on the other hand, is the thing that can look bad if it's forced on without support. Wide Color Gamut is also supported far more widely than HDR.

lel my TV is 12 bits, what high end quadro card would I need for that? thank god for amd

>CRT

If you display sRGB content on any wide gamut monitor it will always look oversaturated. With good calibration (or good primaries and linearity) you won't notice major colour shifts, but it will still be quite oversaturated.
The fact that wsRGB is just sRGB with larger primaries doesn't fix the saturation problems when displaying sRGB content; the only way to fix that is proper colour management, or just setting the display to sRGB mode and losing all the extra gamut.
Also, all monitors follow DCI-P3 or Adobe RGB instead of wsRGB since the latter is retarded.
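To put a number on the oversaturation: here's a small numpy sketch of mine (the matrices are the usual published sRGB and Adobe RGB D65 ones, rounded; the pixel is just an example). It compares the code values a wide gamut panel gets when sRGB is simply reinterpreted as Adobe RGB against what it should get after a colour-managed conversion through XYZ.

import numpy as np

# Published D65 matrices, rounded: linear sRGB -> XYZ, and XYZ -> linear Adobe RGB (1998).
SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
XYZ_TO_ADOBE = np.array([[ 2.0414, -0.5649, -0.3447],
                         [-0.9693,  1.8760,  0.0416],
                         [ 0.0134, -0.1184,  1.0154]])

def srgb_decode(v):
    # Encoded sRGB 0..1 -> linear light.
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def adobe_encode(v):
    # Linear light -> encoded Adobe RGB (gamma 563/256, roughly 2.2).
    return np.clip(v, 0, 1) ** (256 / 563)

srgb_pixel = np.array([200, 80, 40]) / 255.0   # a warm orange, as 8-bit sRGB code values

# Colour-managed path: decode sRGB, rotate through XYZ, re-encode with the Adobe RGB curve.
managed = adobe_encode(XYZ_TO_ADOBE @ SRGB_TO_XYZ @ srgb_decode(srgb_pixel))

print("codes the panel should be driven with:", np.round(managed * 255).astype(int))
print("codes it gets with no colour mgmt:    ", np.array([200, 80, 40]))

The managed red channel comes out well below 200, and that gap is exactly the extra saturation you see when the profile isn't applied.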

>If it's calibrated well, it's going to look exactly the same for content that does not support it
This is not true, you should do more reading.

Anecdote: the zr30w is like having a mini-stove in my room. It puts out more heat than any LCD I've ever seen. You can feel the heat off of the screen from 1.5' away.

>Does Nvidia really lock 10-bit to Quadro only?
10 bpp is only supported in DirectX applications on GeForce.
>what high end quadro card would I need for that?
Any old Quadro will work, even the $20 ones. But you will want something with more power if you're going to be viewing high resolution 10 bpp video.

>The fact that wsRGB is just sRGB with larger primaries doesn't fix the saturation problems when displaying sRGB content; the only way to fix that is proper colour management
Exactly, retards, stop trying to be pseudo-intellectuals. A properly calibrated WCG display will be no different.
I mean... I'd understand if you're arguing against my eyes but I literally have half a dozen Spyders.

Recently played through FC5 and BF1 with HDR10 turned on, on a real HDR screen. Great experience.

>If it's calibrated well, it's going to look exactly the same for content that does not support it
No matter how you rationalize it, this statement is still false.

>t. confuses HDR and WCG
you're probably a winshitter and that's why; on Linux those two are separately usable

Not confusing anything. You can't exactly scale sRGB to Adobe RGB and have an identical image as the original sRGB.
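If you want it in numbers rather than words, here's a quick sketch of mine (same rounded matrices as the sketch further up, random test pixels): map 8-bit sRGB codes into 8-bit Adobe RGB the proper colour-managed way, map them back, and count how many come out untouched. The differences are mostly single code values, so whether you can see them is a separate argument, but bit-identical it is not.

import numpy as np

SRGB_TO_XYZ = np.array([[0.4124, 0.3576, 0.1805],
                        [0.2126, 0.7152, 0.0722],
                        [0.0193, 0.1192, 0.9505]])
XYZ_TO_ADOBE = np.array([[ 2.0414, -0.5649, -0.3447],
                         [-0.9693,  1.8760,  0.0416],
                         [ 0.0134, -0.1184,  1.0154]])

def srgb_decode(v):
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def srgb_encode(v):
    v = np.clip(v, 0, 1)
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

def adobe_encode(v):
    return np.clip(v, 0, 1) ** (256 / 563)

def adobe_decode(v):
    return np.clip(v, 0, 1) ** (563 / 256)

rng = np.random.default_rng(0)
src = rng.integers(0, 256, (100_000, 3))                      # random 8-bit sRGB pixels, one per row

# Forward: sRGB codes -> linear -> XYZ -> linear Adobe RGB -> Adobe codes, quantized to 8 bits.
xyz = SRGB_TO_XYZ @ srgb_decode(src / 255).T
adobe8 = np.round(adobe_encode((XYZ_TO_ADOBE @ xyz).T) * 255)

# Back again: Adobe codes -> linear -> XYZ -> linear sRGB -> sRGB codes, quantized to 8 bits.
xyz_back = np.linalg.solve(XYZ_TO_ADOBE, adobe_decode(adobe8 / 255).T)
back = np.round(srgb_encode(np.linalg.solve(SRGB_TO_XYZ, xyz_back).T) * 255)

changed = np.any(back != src, axis=1).mean()
print(f"pixels that don't survive the 8-bit round trip exactly: {changed:.1%}")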

That's exactly how it works though. It's digital numbers, not magic.

Not the guy you are arguing with, but please do us all a favour and stop. I work on movies, color calibrate my displays, and work with several color spaces for color correction, and that's definitely not how it works.
Just fucking stop spilling bullshit about things you can't even grasp.

The numbers don't work out properly. It's like trying to do an exact-multiple scale of 720p to 1080p: it can't be done because it's not an integer ratio.

I've been reading about how an 8-bit UHD video encoded with 4:2:0 sub-sampling can be converted to an 8-bit 1080p video with 4:4:4 sub-sampling (and maybe even 10 bits per channel?).

How would a person go about actually putting this into practice? Either through re-encoding or in real time.

>Do you have it?
No
>Do you want it?
What I want is a display with much better contrast and actual black that doesn't burn in or suffer from some other serious flaw. Also it should be decently priced.

>on Linux those two are separately usable
It shouldn't be. Higher luminance/intensity coincides with more colors, which is why LCDs were able to achieve a greater color gamut than CRTs. It's why gamma shift directly affects saturation and brightness on a panel. Also, no HDR spec uses sRGB as the reference color space, so whatever Linux is doing, it's not authoritative in any way.

As for colors being wonky on WCG displays: apparently Apple found a way of working around it for their iPhones or something, using a unique color profile spec, but in general the engineers in charge of color profile standards are retarded, so we often see the case where SDR content is improperly mapped against an "HDR" LUT and ends up looking like washed out shit. It should work, but often doesn't.

You should be able to come so close you can't tell the difference with human eyes. Often it's so bad you can. HDR10 and DCI-P3 call for 10-bit output from the source; there's no reason why they can't properly map sRGB within it.
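For what it's worth, the math side of putting sRGB into a 10-bit HDR10 signal is short. A rough sketch of mine (the BT.709-to-BT.2020 matrix and the ST 2084 constants are the published ones, rounded; mapping SDR white to 100 nits is just the usual convention, and this does full-range RGB quantization rather than the limited-range Y'CbCr a real encoder would use):

import numpy as np

# Linear BT.709/sRGB RGB -> linear BT.2020 RGB (published conversion matrix, rounded).
R709_TO_R2020 = np.array([[0.6274, 0.3293, 0.0433],
                          [0.0691, 0.9195, 0.0114],
                          [0.0164, 0.0880, 0.8956]])

def srgb_decode(v):
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def pq_encode(nits):
    # SMPTE ST 2084 (PQ) curve: absolute luminance in cd/m^2, normalized to 10,000 nits.
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = np.clip(nits / 10000.0, 0, 1) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

srgb_pixel = np.array([200, 80, 40]) / 255.0      # an sRGB pixel as 8-bit codes
sdr_white = 100.0                                  # nits assigned to sRGB reference white

linear_nits = srgb_decode(srgb_pixel) * sdr_white                    # absolute linear light per channel
bt2020_nits = R709_TO_R2020 @ linear_nits                            # rotate into the BT.2020 container
hdr10_codes = np.round(pq_encode(bt2020_nits) * 1023).astype(int)    # quantize to 10 bits

print("10-bit PQ/BT.2020 code values:", hdr10_codes)

The numbers themselves map over cleanly; what goes wrong in practice is the LUT or profile the display and OS layer on top, not this math.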

>and ends up looking like washed out shit. It should work, but often doesn't.
When I have used the manufacturer-supplied .icc files this has been my experience. If I bought one of those Spyder units and created a new .icc, do you think it would look a lot better?

>an 8-bit UHD video encoded with 4:2:0 sub-sampling can be converted to an 8-bit 1080p video with 4:4:4 sub-sampling
mpc-hc and mpv will do this automatically with default settings when you play back 4K on a 1080p screen, no fancy tricks needed. Getting 10-bit out of it, though, idk.
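The reason it works, in a rough numpy sketch of my own (fake random planes; this is an illustration of the principle, not what mpv actually does internally): in 8-bit 4:2:0 UHD the two chroma planes are already 1920x1080, so at 1080p every pixel gets its own chroma sample (4:4:4), and summing each 2x2 block of 8-bit luma gives you values with roughly two extra bits of precision.

import numpy as np

# Fake 8-bit 4:2:0 UHD frame: full-res luma, chroma at half resolution in each direction.
y_uhd  = np.random.randint(0, 256, (2160, 3840)).astype(np.uint16)  # luma, 3840x2160
cb_uhd = np.random.randint(0, 256, (1080, 1920)).astype(np.uint8)   # chroma is already 1920x1080
cr_uhd = np.random.randint(0, 256, (1080, 1920)).astype(np.uint8)

# Downscale luma by summing each 2x2 block: four 8-bit samples give a value in 0..1020, ~10 bits.
y_1080_10bit = y_uhd.reshape(1080, 2, 1920, 2).sum(axis=(1, 3))

# Chroma needs no work at all: one sample per 1080p pixel is exactly 4:4:4.
print(y_1080_10bit.shape)            # (1080, 1920), values in 0..1020
print(cb_uhd.shape, cr_uhd.shape)    # (1080, 1920) each: full chroma resolution at 1080p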

I wouldn't get a Spyder. Their software is basically their DRM. X-Rite's stuff is compatible with FOSS that gets updated frequently.

Also, you would need two separate color profiles, one for sRGB and one for HDR/DCI-P3, and use a program like color profile keeper to switch between them, or switch dynamically. The reason being that the entire problem apparently has to do with how the color profiles are designed, so sRGB content doesn't map right into an HDR color space, I guess.

Suuure thing buddy. Also even the other user already agreed that it's exactly how it works:
>wsRGB is just sRGB with larger primaries

>You can't exactly scale sRGB to Adobe RGB and have an identical image as the original sRGB.
There is no scaling. The reason it's hardware specific is that if one is 100%, the other is 200% while still being 100% in-hardware. That's literally the simplest explanation I can come up with for brainlets.

Scaling isn't the correct term for it. It's called mapping. I don't know what the crap you're talking about with your 100% and 200%, so whatever, you can keep thinking of me as a brainlet, but you're the one who can't put his thoughts into understandable prose.

I watched this video on the zr30w.
4kmedia.org/samsung-fashion-show-milan-uhd-4k-demo/

I made sure I was really getting true 10 bits per channel color (sometimes called 30-bit color). I
>set color depth in Nvidia control panel to 10 bpp (using GTX 980)
>installed mpc-hc with LAV filters and hardware acceleration off
>enhanced video renderer custom presenter
>Direct3D full-screen mode
>10-bit RGB output
>Force 10-bit RGB input
>used test photos and videos to make sure 10 bpp color was being displayed

My opinion? Honestly it doesn't make a lot of difference. I can tell there are a few shades of color I wasn't used to, and very little banding, but otherwise it looks 95% identical to 8 bpp. I've heard that sometimes, even though a video may be encoded in 10-bit color, it doesn't really take advantage of it much, but since this is a demo video you'd think they'd use as much as they could.

For the amount of headache involved I'd say it is not worth it at all except to the worst AV geeks and people actually working in the industry.

Did you have HDR mode on in your OS?

No, I'm not doing anything with HDR in this example. It's just 10-bit color.