THANK YOU BASED NVIDIA G-SYNC HDR

youtube.com/watch?v=RTJhZ9ZJ18U

Attached: maxresdefault.jpg (1280x720, 124K)

>4k 120/144hz panels are finally out in HDR
I guess this is a good step in the right direction. I wanted Super AMOLED but I can settle for the next best thing.

Threadly reminder that this shit is bogus: completely unusable over HDMI 2.0 and barely usable over DisplayPort 1.4

>you can't have 4K + HDR + 120Hz simultaneously
>you just can't, it's limited by the DP/HDMI bandwidth

Even if you connect with DisplayPort 1.4 you will have to compromise something.

>want 4K @120Hz? give up HDR
>want 4K+HDR? cap your refresh @60Hz
>want all three? accept "visually lossless" compression called DSC
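
Napkin math, if anyone wants to check (rough sketch: assumes 8b/10b link coding on DP 1.4 HBR3 and ignores blanking overhead, so real requirements run a few percent higher):

# Uncompressed video data rate vs. what DP 1.4 can actually carry.
def video_gbps(w, h, hz, bpc, channels=3):
    """Data rate in Gbit/s for uncompressed RGB video."""
    return w * h * hz * bpc * channels / 1e9

DP14_EFFECTIVE = 32.4 * 8 / 10          # 4 lanes x 8.1 Gbit/s, 8b/10b -> 25.92 Gbit/s

print(video_gbps(3840, 2160, 120, 10))  # 4K 120Hz 10-bit: ~29.9 Gbit/s -> doesn't fit
print(video_gbps(3840, 2160, 120, 8))   # 4K 120Hz  8-bit: ~23.9 Gbit/s -> fits (no HDR)
print(video_gbps(3840, 2160, 60, 10))   # 4K  60Hz 10-bit: ~14.9 Gbit/s -> fits (no 120Hz)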

en.wikipedia.org/wiki/DisplayPort#Display_Stream_Compression

>DSC compression works on a horizontal line of pixels encoded using groups of three consecutive pixels for native 4:4:4 and simple 4:2:2 formats, or six pixels (three compressed containers) for native 4:2:2 and 4:2:0 formats.

You're gonna spend thousands of $$ on a purportedly high-end display that won't be able to deliver what's specified. And if you force it to, your image gets COMPRESSED WITH CHROMA SUBSAMPLING, literally downgraded to bad-JPEG territory that the manufacturers want you to believe is "visually lossless".

Don't fall for this scam Jow Forums

Attached: 4K+HFR+HDR=DSC.jpg (1572x814, 828K)

Just wait until the new HDMI standard then buy a new cable lmao not that hard

low quality bait or you're literally retarded thinking that a cable would change anything

4K G-Sync HDR at 144 Hz is available at 4:2:2 chroma. It can only do 98 Hz at 4:4:4 chroma.
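
The 98 Hz figure checks out on a napkin too (sketch: assumes 10-bit RGB over DP 1.4 HBR3; the ~6% blanking overhead is my approximation, not a spec number):

DP14_EFFECTIVE = 25.92e9             # bit/s after 8b/10b coding
BITS_PER_PIXEL = 3 * 10              # 4:4:4 RGB at 10 bpc
PIXELS = 3840 * 2160

max_hz = DP14_EFFECTIVE / (PIXELS * BITS_PER_PIXEL)
print(max_hz)                        # ~104 Hz ignoring blanking
print(max_hz / 1.06)                 # ~98 Hz once blanking is counted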

Wow what a steaming pile of crap

Attached: 420vs444.png (609x334, 353K)
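
For anyone who doesn't get what 4:2:2 does to that image: chroma gets stored at half horizontal resolution, so color bleeds across sharp edges. Toy sketch of the principle (not the monitor's actual pipeline):

def subsample_422(row):
    # row: list of (Y, Cb, Cr) pixels; keep every luma sample but share
    # one averaged Cb/Cr pair between each horizontal pair of pixels.
    out = []
    for i in range(0, len(row), 2):
        pair = row[i:i + 2]
        cb = sum(p[1] for p in pair) // len(pair)
        cr = sum(p[2] for p in pair) // len(pair)
        out.extend((p[0], cb, cr) for p in pair)
    return out

# A sharp red-to-gray edge: after subsampling both pixels share muddy chroma.
print(subsample_422([(76, 85, 255), (128, 128, 128)]))
# [(76, 106, 191), (128, 106, 191)] -- luma intact, color edge smeared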

4K OLED 144Hz Freesync Display when?

Never, because OLEDs won't be able to mitigate the burn-in issue.

However 4K 144Hz Freesync in general will come pretty soon, I give it 2 years until HDMI 2.1 and next DisplayPort (1.5) are out and about.

For the next generation displays that have some minimum viability of market introduction, check pic related or the whole article:
spectrum.ieee.org/consumer-electronics/audiovideo/your-guide-to-televisions-quantumdot-future

However, for emissive QLEDs, or even more so microLED QLEDs, we will have to wait a long, long time, maybe even a decade.

Attached: PhEnhQLED-OLED-PhEmQLED-ElEmQLED-mLEDQLED.jpg (634x1856, 716K)

Boo fucking hoo

Why not just go for 1440p?

I'd still wait, it's only HDR10, not HDR10+, so no dynamic metadata. Everything will just look super saturated.

It's really mind-boggling. Why the hell did they put 4K in a 27" screen when that size is clearly the sweet spot for 1440p?

>most people won't bother with 4K
>it's easier to produce a 2.5K panel
>no compression shenanigans needed
>384 FALD zones would be more accurate for 3.7Mpix than for 8.3Mpix (numbers below)

vs.

>MUH 4K MARKETING POWER
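
The FALD point in numbers (trivial division; assumes the 384-zone count from the spec sheet):

ZONES = 384
print(3840 * 2160 / ZONES)   # 4K:    21600 pixels lit per zone
print(2560 * 1440 / ZONES)   # 1440p:  9600 pixels lit per zone -> finer control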

>However 4K 144Hz Freesync in general will come pretty soon, I give it 2 years until HDMI 2.1 and next DisplayPort (1.5) are out and about.
yeah i cbf paying $3-5k for a decent hdr 4k screen because of gsync bullshit

It's really not a relevant product at all.
A GTX 1080 Ti struggles to break 100-120 fps at 1440p, so what happens at 4K?

It's basically aimed at people who are planning on buying 2x GTX 1180 when they come out.

what is HDMI 2.1 for then?
what is a double HDMI 2.1 cable for then?
is 85.2 Gbit/s not enough?

Attached: b90.png (645x729, 91K)

>what is HDMI 2.1 for then?
this display doesn't use HDMI 2.1, and neither do any graphics cards on the market

HDMI 2.1 is a new specification: it was only finalized this January, and chances are the next NVIDIA generation won't include it either

so, you fucking spastic, useless mongoloid, read up before posting, because that 90th brainlet image from your collection is clearly aimed at you.
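
For what it's worth, once HDMI 2.1 hardware actually exists, the bandwidth does work out (sketch: 48 Gbit/s FRL with 16b/18b coding, blanking ignored):

HDMI21_EFFECTIVE = 48e9 * 16 / 18    # ~42.7 Gbit/s usable
need = 3840 * 2160 * 144 * 30        # 4K 144Hz 10-bit RGB
print(need / 1e9)                    # ~35.8 Gbit/s -> fits uncompressed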

what does "wait for it" even mean?

Attached: aa6.jpg (1000x989, 106K)

You can't just "update" a device to a new HDMI standard with a new cable. It has to be supported in hardware by the device. That's why the Fury Nano was so useless: it only supported HDMI 1.4, so no 4K@60Hz. Which was dumb, since AMD promoted it as an HTPC card and TVs don't have DisplayPorts.
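
Same napkin math shows why HDMI 1.4 was the dealbreaker there (sketch: 10.2 Gbit/s TMDS with 8b/10b coding vs 18 Gbit/s for 2.0, blanking ignored):

HDMI14_EFFECTIVE = 10.2e9 * 8 / 10   # ~8.16 Gbit/s usable
HDMI20_EFFECTIVE = 18.0e9 * 8 / 10   # ~14.4 Gbit/s usable
need_4k60 = 3840 * 2160 * 60 * 24    # 4K 60Hz 8-bit RGB, ~11.9 Gbit/s

print(need_4k60 <= HDMI14_EFFECTIVE) # False: 1.4 tops out around 4K30
print(need_4k60 <= HDMI20_EFFECTIVE) # True:  2.0 handles it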

Still waiting for AU Optronics to make a better 1440/144/IPS panel than the prototype-grade thing that's been on the market since 2015.

>visually lossless
makes me mad

Attached: 1hurr.jpg (300x300, 114K)

It is though. You can't tell the difference.

i will stick with my 1080 Predator

>Still with the sexual predator branding
It's like they don't want to sell any monitors.

Yup, looks fine to me!

I'm getting my ass wallet ready to buy this reasonably priced $3000 display!

Attached: VISUALLY_LOSSLESS_(R).jpg (1920x1080, 64K)

That's not what it looks like you dumb retarded faggot

fucking sad

Why does gamer shit always have such awful names?

Wait, I thought you retards played at 1080p and lower? Or is that just for benchmark shitposting?

Nobody plays with toys here, this isn't /v/.

sure, it's exaggerated; most likely it will look more like this, which is still a fucking eyesore of an atrocity

No, it doesn't look like that either, you dumb FUCK. 4K over DisplayPort has been a thing for ages now. I run a 4K60 monitor on DisplayPort 1.2, close to that link's bandwidth limit, and you don't get artifacts like that at all

>NVIDIA G-Sync
Into le trash

it's literally stated that it uses chroma subsampling

Don't forget the 1000 € card that doesn't even drive it at 100%, goy.

And that's not how it looks in the end, you DUMB FUCKING RETARD.
Are you the same one sperging out over UHD Blu-rays using 4:2:0 with HDR? Because you're wrong about that too. You dumb fuck.

Sounds like a massive scam. We've had several "HDR" monitors on sale lately, but they turned out to be "artificial" HDR, aka oversaturated SDR. Show proof that this monitor is actual HDR. Also, IPS panels suffer from severe issues such as backlight bleed, ghosting and more. How do we know for a fact there'll be good QC now?

Is there even a GPU to support 4k at 144hz?

This is literally early-adopter tier: DisplayPort 1.4 cannot handle 4K + 144Hz + HDR at the same time. HDMI 2.0 is even worse. Wait for DisplayPort 1.5.

It probably uses 8-bit+FRC like everything else. For low-framerate stuff like movies there's no practical difference between FRC and true 10-bit, but in 60fps+ games it might be noticeable.
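
FRC is just temporal dithering, roughly like this toy sketch (illustration only, clamped at the top end, not any panel's actual algorithm):

def frc_frames(level_10bit, n_frames=4):
    # Fake a 10-bit level (0-1023) on an 8-bit panel: alternate between the
    # two nearest 8-bit values so the average over n_frames hits the target.
    base, frac = divmod(level_10bit, 4)  # four 10-bit steps per 8-bit step
    return [min(base + (1 if i < frac else 0), 255) for i in range(n_frames)]

print(frc_frames(514))                   # [129, 129, 128, 128]
print(sum(frc_frames(514)))              # 514 -> the eye averages it back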

I got jewed by some MSI 980 once: the GTX 980 is specced for HDMI 2.0, but in practice my MSI card only did 1.4. I know it wasn't the monitor's fault, since I tested it recently with my new Vega and it maxed out the HDMI 2.0 spec.

>Because you're wrong about that too. You dumb fuck.
Enlighten us then

>4k 144hz
No machine is really gonna fully utilize it for years

Can easily run older games at that resolution and framerate. Most people keep a monitor for years, so there's plenty of time for cards to catch up when it comes to running the latest AAA garbage.