So are 4k 144hz monitors officially /memetech/ now?

2080TI looks like it will be able to handle 4k at around 60FPS w/ GSYNC. With that said, why do we have 4k monitors rated at 144hz? What is the point? Is there something I'm missing?

Attached: Asus-ROG-PG348Q.jpg (1500x949, 98K)

>why do we have 4k monitors rated at 144hz? What is the point?
Because for fast-paced games like Counter-Strike, Quake, etc., having 144hz is actually an advantage that can be fairly noticeable to people who have hundreds or thousands of hours in the game.

It also just looks smoother overall compared to 60hz. Even in non-fast-paced games it looks better, it's just not as big a deal as it is in the faster games.


Doesn't look like we'll see any GPUs capable of 4k 144hz until 2019 or 2020 though.

As trips said, 100hz+ results in better smoothness.
In any case, no one is forcing you to use the native resolution; you could run the game at half res and enjoy the higher framerate. Having options is nice.

>why do we have 4k monitors rated at 144hz
We don't. The only "4K 144Hz" monitors on the market use chroma subsampling to achieve that refresh rate. None of them can run 4K/144Hz at full 4:4:4 chroma. They're literally a scam.

Attached: BvSGsXryDee2FPQv.jpg (1236x616, 121K)

Actually the panels probably COULD do it; the issue is that there's no current cable standard that can carry the 4:4:4 144hz 4k bitrate (31.35 Gbit/s).

Maximum bandwidth of DP 1.4 is 25.92 Gbit/s

Maximum bandwidth of HDMI 2.0 is 14.4 Gbit/s

HDMI 2.1 isn't out yet as far as I am aware, though it should hit the market soon with a bandwidth of 42.6 Gbit/s.
DP 1.5 (or 2.0) hasn't been announced yet, but it's expected in the next 6-12 months with a bandwidth of around 51.84 Gbit/s.

Both of these will be able to handle 4k 144hz at 4:4:4
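
If anyone wants to sanity-check those numbers, here's a rough back-of-the-envelope sketch (Python, purely illustrative). It only counts raw pixel data at 8-bit 4:4:4, so the results come out a bit under the 31.35 Gbit/s quoted above, which also includes blanking overhead; the link bandwidths are the effective data rates mentioned in this thread.

```python
# Back-of-the-envelope video bandwidth vs. link bandwidth (illustrative sketch).
# Raw pixel data only; real signals add blanking overhead, which is why the
# 4k 144hz figure quoted above is 31.35 Gbit/s rather than the ~28.7 here.

def bitrate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed video data rate in Gbit/s (24 bpp = 8-bit 4:4:4)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# Effective data bandwidths quoted in this thread, in Gbit/s
LINKS = {"HDMI 2.0": 14.4, "DP 1.4": 25.92, "HDMI 2.1": 42.6}

for hz in (60, 96, 120, 144):
    needed = bitrate_gbps(3840, 2160, hz)
    fits = ", ".join(name for name, bw in LINKS.items() if needed <= bw) or "none"
    print(f"4k {hz:3d}hz 4:4:4: ~{needed:5.2f} Gbit/s -> fits on: {fits}")

# Chroma subsampling is how current "4k 144hz" monitors squeeze under DP 1.4:
# 4:2:2 is 16 bits per pixel and 4:2:0 is 12, instead of 24 for full 4:4:4.
for name, bpp in (("4:4:4", 24), ("4:2:2", 16), ("4:2:0", 12)):
    print(f"4k 144hz {name}: ~{bitrate_gbps(3840, 2160, 144, bpp):.2f} Gbit/s")
```

Which lines up with what's been said: 96-120hz at 4k fits DP 1.4 (barely, once blanking is added), 144hz 4:4:4 needs HDMI 2.1, and the subsampled modes are the only way today's monitors hit 144hz over DP 1.4.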

But it's irrelevant for these existing monitors, because you can't just update the firmware or buy a new cable to enable HDMI 2.1 support. There won't be any HDMI 2.1 devices until next year, because the HDMI Forum hasn't even finished its compliance testing tools yet.

It's also irrelevant with any current graphics card on the market, since those don't support HDMI 2.1 either and never will (nor will the new 2000 series from Nvidia). So even if HDMI 2.1-capable monitors arrive early next year, we'll be waiting even longer for output devices to drive them.

It's not irrelevant if you lower the resolution.

why didn't they just wait until HDMI 2.1 was available then?

why would you buy a 4k 144hz monitor and lower the resolution? might as well buy a 2k 144hz monitor for much cheaper. only thing you're missing is HDR

They already delayed the monitors by what, 12-18 months after they were first announced?
Delaying another 8-12 months for HDMI 2.1 might not have been the best idea.

Though to be perfectly honest, no one in their right mind will buy these monitors anyway, so I'm not sure what the hell they were thinking to begin with.

4k is such a stupid fucking meme.

For gaming.
Otherwise you could use 100hz instead of the full 144hz and still go 4k.

yeah but you're pretty much releasing an unfinished product. that's like releasing a 10k blu-ray player, but there's no way to watch 10k movies, so you just watch 4k movies, and when 10k movies do come out it won't matter because it won't have the necessary connections to do so.

or 96 hz for that sweet sweet 24fps content to be interpolated to 96fps without any frame skipping.

96hz would be ~20 Gbit/s, so it can be done with DP 1.4, but you wouldn't be able to use HDMI.

yes, but you're paying a ridiculous premium for a product that can't REALLY do what it says it can. you're better off going with 2k 144hz and then just waiting until real 4K 144hz monitors come out with HDMI 2.1, along with graphics cards with HDMI 2.1. otherwise I don't see the value.

Technically DP1.4 at 25.92 Gbit/s is capable of 4k 120hz, which needs about 25.82 Gbit/s.
But with only 0.1 Gbit/s of headroom left, you might run into problems with cables not actually being capable of hitting the full speeds of the specification.

Playing video games at max settings is not the only thing people do with their monitors all day.

what other use would the 144hz have? otherwise just buy a regular 4k monitor.

The value is that you can do both 144hz and 4k on a single monitor instead of having two separate monitors.
There's no doubt that things will be better once we can actually use both 144hz and 4k together, but let's be honest, no GPU has the horsepower to do both. Besides, 100-120hz is smooth enough and it's the maximum that allows for strobing (I haven't seen strobed 144hz yet).

Desktop animations, videos, etc. will be smoother. The GPU power to shitpost @ 4k 144hz may have been there, but the connection standards didn't support the bandwidth.

>What is the point?
Are you saying you don't play older games? Older games are able to run at 4K @ 120hz easily.

I agree wholeheartedly.
There is no reason to go above 1080p right now. It is the sweet spot in gaming, television, and streaming/disc.

If you want to go 144hz to play your fps then fine, but it's crazy to go 4k at high refresh to play an fps.

Now once ATSC 3.0, HDMI 2.1, new DisplayPort standards, and HDR with adaptive refresh become more general standards, then fine, buy your meme monitor. Not now.

im not buying an LCD again until it's 500hz. happy with my 120hz crt until then, and I even think I might take the CRT's smooth image and lack of screen tearing over LCD shit even at 500hz 10ms (oh btw all "1ms" LCDs are really 10-15ms on the crosshair). lewl.

I had a 1080p tv in 2001, a massive Sony Wega CRT. it's literally two-decade-old tech. while I agree chasing resolution is dumb, if you think 1080p is a sweet spot you're retarded. at least pick something contrarian like 720p or 900p or 4:3 if you're going to spout "my shit is fine".

1080ti does 60fps at 4k for most games I've tried. 2080ti should be around 85fps average most likely.

desu new games can easily run at 120hz 4k if you have just 2x SLI and don't run ultra and dumb things like AA.

lol did you forget that in the presentation they didn't show a single benchmark? you realise all the comparisons to Pascal were the Quadro cards, not the GeForce ones?


2080ti could literally get 5fps more than the 1080ti; no one knows yet.

That too.

You run those monitors at 98 or 120 Hz.

You forgot to mention it's at 720p or lower on a 19" screen. No thanks, after playing on a 27" monitor it's impossible to go back to a CRT.

It makes text readable as you scroll.

This is how first-gen tech always is. The first CD players used only 14-bit DACs with poor wave reconstruction. The first 120Hz LCDs were all 6-bit with dithering instead of 8-bit.

>want to play 4k60 for your cinematic movie-game? Go ahead.
>Want to play 1080 144hz for your online fps?
Do it

Most modern games are "meant" to be run at 1440p nowadays; if you play at 1080p you lose out on a lot of detail. It would be like playing Crysis at 800x600 or something.

>HDMI 2.1 isn't out yet as far as I am aware, though should hit the market soon with a bandwidth of 42.6 Gbit/s

At CES in January, all the good TVs will also be usable as high-end monitors.

Unfortunately the new Nvidias don't support HDMI 2.1, so you'll have to settle for either Navi or a Club3D DP1.4-to-HDMI 2.1 adapter that introduces lag.

You do realize that there are games other than current year AAA titles, right?

The monitors should be able to do 4K 120 Hz without chroma subsampling tho.

>and dumb things like AA
God I hate the "4K doesn't need AA" meme.
t. someone with a 27" 4K monitor.

You want high-ultra settings especially if you run 4K.

I'm uncertain about the new Ti. It seems to be just a better 4K60 GPU, but I barely play, and when I do it's 90s strategy and Mass Effect 3 MP. How do I justify a 1300 € GPU that doesn't even handle 4K90?

does the 1080ti handle 144fps at 1440p or nah? On high not ultra

1080p peasant. My goodness imagine being satisfied with an ancient resolution.

What I don't get is why they didn't just multiplex it.

>what do you mean you can also play older games on this monitor?
>what do you mean a 1080ti or less can run specific games at 4k 144hz?!

>I'm too poor for 4k so it's meme

okay user. I'll sit here enjoying my 4k 43" that I've been using for over 2 years now while you cry on Jow Forums

I've been gayming on 4K since 2015. I was one of the XB280 early adopters. It was horrible in the beginning, but I don't regret anything. I still need the fastest card, but a 1080 Ti is a big difference from a 780 Ti. I don't have VRAM issues and get at least 50 FPS now.

It doesn't even do 144 at 1080p

To add to what one of the first anons said, Nvidia will never ever want to use HDMI 2.1. I think mostly because HDMI 2.1 actually has VRR built in, and that would totally ruin their G-Sync upcharge jewery on monitors.

What happened to 120hz? Why is it 144hz? Why is 165hz a thing too?

The absolute best graphics cards can only barely push the latest high fidelity games to 60fps at 4k... at reduced settings. 4k ALONE is still a meme. 2k 144hz is actually viable.

It's called innovation you cunt

Higher refresh rate = better

1080p with a GTX 1070 is all anyone needs.

that is a very weird way to say 1440p

I have a non-Ti 1080 and it doesn't even come close to doing that. I bought Battlefield 5 because it was on sale for $10 the other day, and with everything set to "low" I get a stable 75fps. On ultra it dips so low it's unplayable.

144Hz was for 3D movies when those were a thing. 3D died but gaming monitors kept using it. 165Hz at 1440p maxes out DP1.2; those are usually 144Hz panels with a factory overclock to 165Hz. For some reason they're Gsync-exclusive.
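
Same back-of-the-envelope math for that DP1.2 claim, as a rough illustration (raw pixel data only, assuming 8-bit 4:4:4; the 17.28 Gbit/s effective bandwidth for DP 1.2 is my figure, not one quoted in the thread). With blanking overhead added, 1440p 165hz sits just under the limit:

```python
# Rough check of "165Hz at 1440p maxes out DP1.2" (8-bit 4:4:4, raw pixel
# data only; blanking overhead pushes the real requirement close to the cap).
DP_1_2 = 17.28  # Gbit/s effective data bandwidth (assumed figure, HBR2 x4 lanes)
rate = 2560 * 1440 * 165 * 24 / 1e9
print(f"1440p 165hz needs ~{rate:.2f} Gbit/s raw vs {DP_1_2} Gbit/s on DP 1.2")
```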