So what content is exactly available in 4K? I know of:
>some sports
>muh gaems at 30 fps (60 if you have a 1080ti and an efficiently programmed game)
>some porn
>action movies
>some youtube videos
So not much.
No major shows available live in 4K (Game of Thrones season 1 was released in 4K only this year).
Most kinos are not available in 4K, only flicks you could see in theater anyway.
Most games feel better to play at 144Hz and 1440p than 60Hz at 4K.
Most sports are broadcast at 720p, big events mostly 1080p and really big ones at 4K.
So how come people fall for this meme?

Attached: Digital_video_resolutions_(VCD_to_4K).svg.png (1024x540, 15K)

I'm with ya buddy, only in my opinion I'd rather have 120Hz with IPS than the old-style washed-out-colors panels that 144Hz monitors come with.
4K is too demanding; Macs come with 5K and it is a very nice display, but that video card is probably maxing out just running the user interface.

There are some monitors that have a VA panel running at 144hz 1440p. I have a Pixio New PX277 that I really enjoy.

Too bad plasma panel tech was abandoned, the motion resolution on my Panasonic GT60 still beats any 120Hz TN panel.

Most of the UHD movies sold as 4K are actually 2K masters upscaled to 4K.
I'm not joking, that is the only thing that turned me away from 4K.

Even newly released movies? Why would they do that?

Don't get 4k for content.
Movies and streams just give you the same bitrate as with 1080p and it will look awful.
For games, GPUs aren't there yet and the fidelity isn't worth it, go to 1440p for a more noticeable bump.
4k is for content creation.

They are digitally recorded, and mastering special effects in 4K, usually done in Eastern Europe and Asia, is the 10th circle of hell.
Storage, processing power, logistics: all major problems.

You can take a traditional 35mm movie master and transfer it to 8K, no problem.

>Most kinos are not available in 4K, only flicks you could see in theater anyway.
Worth it for the 4K versions of classic kino shot on film, mastered from new 8K transfers. From the new stuff, Dunkirk and BR2049.

>I'd rather have 120Hz with IPS than the old-style washed-out-colors panels that 144Hz monitors come with.
As a photographer I must have a high-quality IPS. But I'd also love to have a 4K 120Hz monitor with FreeSync for the vidya. Unfortunately, such a compromise doesn't exist.
Also, at my day job I do have the 5K Macs; everything else aside, the image looks fucking great. Coming back to my 1080p monitors at home feels like going back in time 10 years, image quality-wise.

Because they are literally shot and/or edited at 2K digishit to cut costs. A non-upscaled 4K (or higher) version simply does not exist.

gee what is conspicuously missing from your list of things to do on a 4k monitor
oh right, basically everything else people use a computer for
I have 1 32" 4k monitor at 100% scaling and am able to multitask with 6 fullsize usable tabs open at once, a video, a music player, terminal and more. All that productivity in a neat single monitor form factor.

>As a photographer I must have a high-quality IPS. But I'd also love to have a 4K 120Hz monitor with FreeSync for the vidya. Unfortunately, such a compromise doesn't exist.
>Also, at my day job I do have the 5K Macs; everything else aside, the image looks fucking great. Coming back to my 1080p monitors at home feels like going back in time 10 years, image quality-wise.

I run a 4k 43" IPS and a 27" 1440p 144hz VA panel.

IPS gives me the res and color accuracy for more color sensitive work. The VA panel gives me a high contrast ratio, high refresh rate, and still decent resolution for gaming, while not as hardware demanding as 4k 100hz+ would be.

>streaming 4k content
lol

Redpill me on plasma tech, why did the industry abandon it?

Where the fuck did this "content creation requires high tech" meme come from? What happened to being a photographer/editor with a Pentium II and 256 MB of RAM? "Buy our latest shit that makes no difference": a JPEG looks the same on a $20 monitor as on a $5000 one, you spergs

Attached: 1527036550756.jpg (675x637, 89K)

Updated the image to show the full 4K with common ratios.

Attached: 4K resolution.png (4096x3072, 227K)

It doesn't even matter if there is no 4k content available. 1080p content will still look better on a 4k screen than a 1080p screen.

Attached: 1525552515.png (615x317, 168K)

Further added common resolutions.

Attached: 4K resolution.png (4096x3072, 248K)

If you're talking about monitors, you're buying solely for gaming and you're sensitive to input latency... yes, you probably want 1440p144. At least until 4K120/4K144 become affordable.
If you do any kind of work with your computer, though, you really want 4K instead. You literally have 4 1080p displays in one, with no bezels. For gaming you can still run at 1080/1440 and upscale.

If you're talking about TVs, it simply doesn't make sense to buy a sub-4K TV anymore. I wouldn't rush out and buy a new one immediately if I had a good 1080p, but if you're in the market for a new TV then this isn't really in question. 4K is the default now.

Nothing. It's just a waste of power. 1920x1200 is the highest you should go.

>the only reason to buy a 4k monitor is media and gaming

Attached: 1524840022522.png (433x511, 39K)

That's bullshit, things look better at native res.
And most 4K Blu-rays look worse as they are upscaled from 1080p masters; Lord of the Rings got a really shite transfer IIRC for the 4K release.

Virtually none. The """""4K""""" content you see is actually 1080p content. See 4:2:0 chroma sub-sampling.

Attached: Downsample-Feature-Image-1-640x360 (1).jpg (640x360, 43K)
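For what it's worth, a rough numpy sketch of what 4:2:0 actually stores (a toy illustration, not a real encoder; real subsamplers filter each 2x2 block rather than just decimating): luma stays at the full 3840x2160, only the chroma planes drop to quarter resolution.

import numpy as np

h, w = 2160, 3840                                        # one UHD frame
y  = np.random.randint(0, 256, (h, w), dtype=np.uint8)   # full-res luma
cb = np.random.randint(0, 256, (h, w), dtype=np.uint8)   # full-res chroma (4:4:4)
cr = np.random.randint(0, 256, (h, w), dtype=np.uint8)

# 4:2:0: keep every second chroma sample in each direction
cb_420 = cb[::2, ::2]                                    # (1080, 1920)
cr_420 = cr[::2, ::2]

full   = y.size + cb.size + cr.size                      # raw 4:4:4 sample count
stored = y.size + cb_420.size + cr_420.size              # raw 4:2:0 sample count
print(stored / full)                                     # 0.5 -> half the raw samples
print(1 - cb_420.size / cb.size)                         # 0.75 -> 75% of chroma dropped

So "actually 1080p" only applies to the color planes; the brightness detail is still full 4K.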

>most 4K Blu-rays look worse as they are upscaled from 1080p masters
This is simply not true; they almost always use DCI 2K masters, which are higher res than the 1920x800 you get from a 1080p Blu-ray.

Which means a UHD Blu-ray, even if it's an upscaled 2K master, will look better than a 1080p Blu-ray.

Also, Lord of the Rings doesn't have a UHD/4K release, so you're talking pure shit.
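Rough numbers behind that, assuming a 2.39:1 scope film (the exact crop varies by title):

# Back-of-the-envelope pixel counts (assumed frame sizes for a scope film)
dci_2k_scope = 2048 * 858      # DCI 2K scope master
bluray_1080p = 1920 * 800      # letterboxed picture area on a 1080p Blu-ray
print(dci_2k_scope / bluray_1080p)   # ~1.14 -> the 2K master has ~14% more pixels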

By that same logic, the "1080p" content you've been watching for years is actually just 540p.

Pretty much, but chroma sub-sampling worked so well on old low-res LCD and CRT displays that we've only just started to really see how bad making up 75% of the color information on the fly is, now that things like high-res AMOLED exist.

Won't stop people from getting memed into $2,000 4K tvs.

>we've only just started to really see how bad making up 75% of the color information on the fly is
This is pure subjective bullshit; anyone who actually owns 4K and has done 1080p/4K back-to-back comparisons knows this is bullshit.

Attached: 1458056596519.png (375x375, 138K)

On LCD displays, sure, but anything like OLED is a different story. The main reason chroma sub-sampling just werks™ is that human perception of color hue is shit, and shitty displays that can't even do native 8-bit color reproduction (i.e. 6-bit + FRC) exacerbate the problem.

Also, to get 4:4:4 video you have to manually transcode a 4:2:0 video to 4:4:4, which can't be done on the fly.

>which can't be done on the fly.
Sure it can, just needs more CPU horsepower than most people can reasonably afford.

Still, your logic doesn't follow. You can be as mad as you want about there not being any proper 4K 4:4:4, but 4K 4:2:0 is STILL better than 1080p content. That's the crux of the argument: even if it's not as good as it SHOULD be in your opinion, it's better than the alternatives.
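For reference, the "on the fly" part is just chroma upsampling back to 4:4:4 for display; a crude nearest-neighbour version looks like this (an illustrative sketch only; real players typically use bilinear or better filters, usually in the GPU or hardware decoder):

import numpy as np

cb_420 = np.random.randint(0, 256, (1080, 1920), dtype=np.uint8)  # half-res chroma plane

# Duplicate each chroma sample into a 2x2 block so it lines up with full-res luma again
cb_444 = np.repeat(np.repeat(cb_420, 2, axis=0), 2, axis=1)
print(cb_444.shape)   # (2160, 3840)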

true

I'm not denying that, just trying to point out that there is no true 4K content, that's all. Sure, you can watch "4K" diarrhea on a 4K display if you want to, I won't stop you.

>1920x1200 is the highest you should go
Is it 2007?

Attached: 1495891780407.jpg (270x320, 19K)

>just trying to point out that there is no true 4K content
While at the same time claiming that the 1080p content (that isn't actually 1080p) is just fine...

Sorry, but you just seem like a pedantic autist and you've simply chosen 4k as your primary issue to focus your autism on for the time being.

Unless you have "8k" content then why bother? Affordable GPUs are still dogshit for 4K gaymes too.

What's wrong with 1440p 144hz?

Plenty of GPU horsepower for most games at that res and refresh rate.

1920x1200 is fine, but I'll take 1440p over that, or even 1600p; since you can't find those (and especially not at a high refresh rate), it's far cheaper and easier to get a high-quality 16:9 1440p display.

4K is only really for media viewing or certain professional workflows. Maybe when 4K 144Hz becomes possible at a reasonable price and with reasonable quality control I'll change my opinion, but as you pointed out, even with the new generation of GPUs, 4K gaming will struggle to manage a decent framerate.

>Unless you have "8k" content then why bother?
What do you mean by this? 4k monitors are irrelevant because we don't have 8k media?

Were 1080p monitors and TVs irrelevant until we had 4k content?

Have you actually watched a 4K movie on a 4K display? I find it hard to believe that you have done so and didn't think it was dramatically superior to 1080p.

To clarify, I completely understand what chroma subsampling is. I'm saying that, as a practical matter, 4K content still looks stunning.

Why do people always refer to the newest and most demanding games when claiming that 4K isn't ready yet? There are years' worth of games that you can run perfectly fine on midrange cards. Also, crispy text when web browsing.

Autism aside, why aren't Blu-rays 4:4:4? Now I feel kinda cheated, even though I buy them for like $20.

>tfw blew 1k on 4k monitor and got a 1050 ti instead
Why am I so stupid...

Attached: 1516772477150(1).jpg (124x122, 15K)

Because they'd be massive in size. A 1080p 4:4:4 Blu-ray would be double the size or more of a 1080p 4:2:0 one, and the apparent image quality is very, very similar. It's simply not worth it.

For mastering's sake, yes, use 4:4:4 since you're doing frame-by-frame touch-ups, but for release? 4:2:0 is more than fine.
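The "double the size" figure roughly tracks the raw sample counts; compressed output won't scale at exactly 2x, but it's the right ballpark:

# Raw samples per pixel before compression
samples_444 = 1 + 1 + 1         # Y + Cb + Cr, all at full resolution
samples_420 = 1 + 0.25 + 0.25   # Y full res, Cb/Cr at quarter resolution
print(samples_444 / samples_420)   # 2.0 -> roughly twice the raw data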

I have a 4K monitor and a GTX 960; I just don't do gaming.

How the FUCK are 4:2:0 and 4:4:4 "very similar"? Literally 75% of the color information doesn't even exist in 4:2:0. Hardware chroma upscalers have to make shit up so you don't see blocky video, and they do a horrible job of it too. Do 99% of people have severe eye damage from staring at the sun too much as kids?

You can do frame by frame comparisons and tell the difference, but if I show you a random identical 60 second clip of 24fps 4:4:4 vs 24fps 4:2:0, you'd be hard pressed to tell me which was which.

Why do you think people avoid hardcoded subtitles like the plague? There is a difference, and if you can't tell, then maybe you shouldn't have stared at the sun for 5 minutes straight. Daiz was a savior in the anime encoding scene; hardcoded subs grew to pandemic levels at one point.

It doesn't even have to be text; whenever there are two very different hues of color, or even straight lines, you'll notice the artifacts.

>No major shows available live in 4K
Most Netflix and Amazon original content is available in 4K (and often HDR), which comprises the majority of "major shows" people watch these days.

>you'll notice the artifacts
Again, in frame-by-frame comparisons, sure.

But no, 90%+ of people simply won't notice. If you think that means 90% of people have somehow damaged their vision, believe whatever you want, I guess.

But the fact remains: 4:4:4 vs 4:2:0, you're simply not going to notice a difference worth a 2x+ increase in data.

If the trade-off were smaller, sure, maybe you'd have an argument, but 4:4:4 content is simply too bloated for distributed media. Even 4:2:0 Blu-ray content gets compressed further in order to be streamed at a bitrate most people can actually use without constant buffering during playback.

People consider those things "shows"? They're on par with the SyFy channel movies. What Sharknado are we on anyway? I've lost count.

Most people have a better PC than a potato.
4k gaming since 2016.

We need somebody to do a double-blind study on this on a Samshit AMOLED display. Even I'm doubting myself now, but your statement carries no objective merit either.

>tfw I'm about to buy a tab s3 next month

Attached: whyf.png (540x540, 388K)

So is the chroma subsampling shitposter retarded, or does he legitimately not realize movies are heavily compressed in other ways as well? Have you idiots ever heard of H.264/H.265? News flash: you're not watching raw video or film footage unless you film it yourself. Everything you watch has various degrees of scaling and interpolation to make the image trick your eyes and brain into seeing a better picture than what's actually there.

Obviously we can't have the raw video because most people don't give a shit about the best quality and just want to pay the lowest price possible. That's why DVDs are still sold today. People are so fucking dumb they'll even eat up 1080p upscales to 4K Blu-rays and say "wow, look at the quality XD".

It's as some user said a couple months ago in pic related.

Attached: Screenshot_2018-06-10-23-34-49(1)(1).jpg (1141x246, 90K)

Expensive, inefficient, and you could argue unreliable because of the prior point; also shorter-lived and more prone to burn-in than OLED.

>People are so fucking dumb they'll even eat up 1080p upscales to 4K Blu-rays and say "wow, look at the quality XD".
Well it IS higher quality...

There's more room on a UHD Blu-ray, so it will be higher bitrate than the 1080p Blu-ray version. Even if they're both using the same 2K master, the 4K upscale will be less compressed than the 1080p.

Perfect scaling for 720 and 1080 content.
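That's because 3840/1920 = 2 and 3840/1280 = 3, so every source pixel can map to an exact 2x2 or 3x3 block of panel pixels. A minimal sketch of that kind of nearest-neighbour integer scaling (illustration only, not how any particular scaler is implemented):

import numpy as np

frame_1080p = np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8)

# Each source pixel becomes an exact 2x2 block, so nothing gets interpolated or blurred
frame_2160p = np.repeat(np.repeat(frame_1080p, 2, axis=0), 2, axis=1)
print(frame_2160p.shape)   # (2160, 3840, 3); use a factor of 3 for 1280x720 content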

Text. Finer, more readable fonts.
Do you even code?

4:2:0 vs. 4:4:4 is noticeable on text, UIs, CGI and games, since they contain a lot more high-contrast color edges than photography.

At work we have a shitty old 60" TV as a status display; it only does 4:2:0 1080p, so all the fonts are slightly blurry.

And one of the marketing guys has some Miracast shit; the text in his presentations is slightly blurry too.

Attached: 420-progressive-still.png (307x72, 6K)

Do you actually use pixel doubling/tripling? Or do you just keep the software/OS/driver/monitor's default scaling algorithm, and it simply looks better at integer multiples?