8 BIT FRC PANELS ARE NOT THE SAME AS TRUE 10BIT PANELS!

8 BIT FRC PANELS ARE NOT THE SAME AS TRUE 10BIT PANELS!

FUCK OFF WITH THAT BULLSHIT

>Side note where are all the 4k, HDR10, true 10bit, G-sync, 4:4:4 monitors?

Attached: 1766257102c5ce66a9.jpg (980x553, 134K)

>Side note where are all the 4k, HDR10, true 10bit, G-sync, 4:4:4 monitors?
In the box of incompatible gaymurfaggotry, mostly.
As for professionals, those screen cost more than some cars.

We were promised 4k, G-sync, HDR10 monitors like two years ago. Where the fuck are they? Consoles are currently outdoing PC in terms of technology. That's disturbing.

>G-sync

>consoles are currently outdoing pc in terms of technology.
are you actually delusional?
nothing consoles have can't be had better on PC

>muh 10bit
>muh anime
>muh mental illness

>consoles are currently outdoing pc in terms of technology
what the fuck are you talking about?
there's nothing consoles are ahead at

Let's get real here: in terms of HDR they've beaten the PC Master Race without question. They've had it for a whole two years longer than us in terms of gaming and being mainstream.

HDR, VR. Even the fake 4k looks just as good as regular 4k.

>not knowing why anime is encoded in 10bit

Attached: 1527531677785.jpg (212x249, 11K)

>They've had it for a whole two years longer than us in terms of gaming and being mainstream.

Attached: 1527892521429.jpg (336x442, 107K)

>HDR
you mean the standard?
cause that's supported all around in the PC world. In fact, we've had 10bit color depth for ages
>VR
are you kidding? The only "contender" is PSVR and it's horrendous due to low resolution and outdated tracking and input methods
>Even the fake 4k looks just as good as regular 4k
imagine having your brain damaged badly enough to think upscaling matches rendering at native resolution
besides, you said "better". Pretending you're almost as good isn't "better"

PCs have had "HDR" support forever.

Even your picture shows there's literally no difference between 8 bit and 10 bit.

>gets 4k, HDR10, true 10bit, G-sync, 4:4:4 monitor
>watches YouTube

The biggest issue is that LCD panels dominate the desktop space and most of them are 9-10 stops (1000:1). HDR requires 20000:1 to meet spec, and the best you are going to do with a VA LCD panel is 5000:1. So what you get is OEMs using subpar panels with expensive local dimming implementations as a substitute, and on top of all that a lot of them are edge-lit, compounding the uniformity issues.
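
If anyone wants to sanity check those stops figures: stops of dynamic range are just log2 of the contrast ratio. Rough sketch in Python, using the ratios from this post rather than measurements of any particular panel:

import math

# stops of dynamic range = log2(brightest / darkest), i.e. log2 of the contrast ratio
def stops(contrast_ratio):
    return math.log2(contrast_ratio)

for label, ratio in [("typical desktop LCD", 1000),
                     ("good VA LCD", 5000),
                     ("HDR spec target", 20000)]:
    print(f"{label}: {ratio}:1 ~ {stops(ratio):.1f} stops")

# typical desktop LCD: 1000:1 ~ 10.0 stops
# good VA LCD: 5000:1 ~ 12.3 stops
# HDR spec target: 20000:1 ~ 14.3 stops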

no problem with the right quality settings.

Attached: 4k.png (809x117, 41K)

> his monitor is 8bit only so he can't tell the difference
look at this faggot

joke aside, didn't YT actually plan to have 4k videos?

Do you think they don't already? This isn't 2012 it's 2018.

No idea, I've just never come across a 4k video.

>G-sync
how to spot the idiot.

What's a 4:4:4 monitor supposed to mean? We holodeck now?

4k is bust
check this out
youtube.com/watch?v=e4UcgKKl2J4

>Jow Forums -Technology
/v/edditors and consumerism really ruined this board.

>how to spot the idiot

Are you going to elaborate on that? Or do you just have no idea what you're talking about?

>it max shows the HD option

you wot

Attached: ass.png (306x440, 146K)

he's using a shit tier browser

Most TV panels are 4k, 10bit. HDR implementation still isn't up to spec, but it still looks great on mid to high end TVs. Around winter or spring you can typically find last-gen TVs at a discount. Recently got the Sony X900E for 800 bucks. Nice price and it looks great. Makes any monitor I've seen look like poo

The Samsung Q9FN with local dimming hits 19000:1, and TV VA panels are better than the ones found in monitors, which typically have a real static contrast ratio of 2500:1. TV panels are at least twice as good in static contrast alone.

>TV VA panels are better than ones found in monitors

Attached: 1528480167018.jpg (334x334, 155K)

By leaps and bounds. Media consumption on PC is a joke

>tvs are better than professional grade monitors

Attached: 1528405684518.jpg (750x1000, 54K)

It's only showing QUHD for you though.

Reminder that AMD dithers output and Nvidia doesn't, which means that you will get pronounced banding on Nvidia and almost nothing with AMD.
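
For anyone wondering what the dithering actually buys you: plain truncation from 10 bit to 8 bit leaves flat steps (banding), while adding a tiny bit of noise before quantizing breaks the steps up so the eye averages them out. Toy numpy sketch of the idea, not how either driver actually implements it:

import numpy as np

# a smooth 10-bit horizontal gradient, values 0..1023 across a 4k-wide line
grad10 = np.linspace(0, 1023, 3840)

# naive truncation to 8 bit: each output level hangs around for ~15 pixels -> visible bands
truncated = (grad10 / 4).astype(np.uint8)

# dithered quantization: add roughly one 8-bit step of noise before rounding,
# as a stand-in for a driver-side dither pass
noise = np.random.uniform(-0.5, 0.5, grad10.shape)
dithered = np.clip(np.round(grad10 / 4 + noise), 0, 255).astype(np.uint8)

def avg_run_length(a):
    # average number of pixels that sit on the same output value before it changes
    changes = np.count_nonzero(np.diff(a.astype(int)))
    return a.size / (changes + 1)

print(f"average band width, truncated: {avg_run_length(truncated):.1f} px")  # ~15 px wide steps
print(f"average band width, dithered:  {avg_run_length(dithered):.1f} px")   # a few px, no visible bands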

Yes.

The hardware, yeah. Professional displays have better calibration out of the box and extra features like an editable LUT, plus better quality control and support per unit. But TVs still have better panels. It's outright superior tech with looser quality control and fewer features that normal consumers don't need anyway.

>10bit color depth != HDR
10 bit is a requirement of HDR

Who the hell said anything about professional grade monitors? In general TVs can have better panels than most of the stuff you find in a comparable monitor. I'd much rather have a 40" 4k VA panel TV for $450 than a 32 inch IPS 4k monitor that does less for the same price

The thing is, I can use those "superior" TVs with my PC anytime. In fact, I can use those TVs and my monitors, all at the same time, so I don't see how that's a drawback for PCs.

No one is really arguing about using PCs vs something like a smart TV for media. It's mainly just a discussion about TVs vs monitors which are just both displays that you can hook a PC to.

Yeah, I was kinda going against the guy saying consoles were better than PCs when it came to HDR.

>professionals color grade films on displays worse than what consumers will watch the film on in their home

Attached: snap face.jpg (600x580, 284K)

>.jpg

Attached: digitalisfototanfolyam_hu_16okt.jpg (848x565, 43K)

Actually I would agree with that guy. Windows is currently limited to HDR10 and you need a recent graphics card to do that. If you want proper 4k HDR on Netflix, you need a recent CPU to support the DRM. Want 4k HDR Blu-ray? Fork over the $140 for the drive and software, along with the proper CPU and GPU. It's not that a PC can't do HDR stuff, but for a lot of people using a console or other smart device is just easier for accessing HDR content.
I have a 4k HDR TV and a PC connected to it, and what I found is that if I have HDR content it's just easier to let the built-in Android TV access a share on the PC and play the content from there, rather than having to mess with Windows

>4k HDR blu-ray
Who the fuck uses optical media in 2018? Also, Netflix is irrelevant for me as everything is available online.
Now, I agree that it is much more convenient to use the consoles or Android TV to watch stuff in HDR.

4K Blu-ray is really the only way to get non-bitstarved versions of movies, as streaming often only serves 4k at 15-20 Mbps, which is bit-starved to hell
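
Back-of-the-envelope numbers for anyone who wants to see what those bitrates mean in file size (the 60 Mbps disc figure is a typical ballpark, not a spec for any particular release):

# rough size of a 2-hour movie at different video bitrates (GB = 10^9 bytes)
def movie_size_gb(bitrate_mbps, hours=2.0):
    return bitrate_mbps * hours * 3600 / 8 / 1000

for label, mbps in [("4k streaming, low end", 15),
                    ("4k streaming, high end", 20),
                    ("UHD Blu-ray, ballpark", 60)]:
    print(f"{label}: ~{movie_size_gb(mbps):.0f} GB per 2-hour movie")

# 4k streaming, low end: ~14 GB per 2-hour movie
# 4k streaming, high end: ~18 GB per 2-hour movie
# UHD Blu-ray, ballpark: ~54 GB per 2-hour movie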

Marketed to cashed-up retards.
All I want is a stylish 27" prosumer 4k screen with 120Hz FreeSync and real HDR10 at 1000+ nits, with decent blacks and low lag, but they all cost stupid money, well over $1k, and it's been that way for 5+ years now

>10 bit is a requirement of HDR
But 8bit+dithering can achieve that requirement.

If you're buying an HDR panel with 8bit+FRC you're going to have a trash HDR experience anyway.

>10-bit
>rendered onto 8-bit jpeg.
it's not like the leftmost 2 are accurate either

You're not going to notice the difference, except maybe in side by side comparisons assuming all else is equal.
Visual differences between 8bit and 10bit are really subtle, FRC blurs it even more.

good point.

Don't know how Nvidia buyers keep letting themselves get scammed like this.

>I-I CAN DO IT WHENEVER I WANT
>never does

Sure, if you had a 10bit panel and an 8bit+FRC panel with everything else equal you'll probably never notice the difference. The point was that if a TV is cheap enough to be using 8bit+FRC it's going to be bad at HDR, as HDR is about more than just panel color depth

>I'm too dumb to pirate

Attached: h8rHLHo.png (684x1216, 980K)

who makes these pictures

>The point was if a TV is cheap enough to be using 8bit-FRC it's going to be bad at HDR as HDR is about more than just panel color depth
No. The contrast ratios HDR calls for are cheaper to implement than a true 10bit panel is, especially on the size scales that TVs get to.

There's a reason why the 5k iMacs use 8bit+FRC, because they'd be twice the price with a native 10bit panel.

4:4:4 means "no subsampling", as opposed to 4:2:2 or 4:2:0 or something shitty.
rtings.com/tv/learn/chroma-subsampling
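
If the notation is confusing, the numbers describe how many chroma samples survive per block of 4 luma samples. Crude sketch of what 4:2:0 throws away relative to 4:4:4 (simplified; real encoders filter the chroma rather than just grabbing every other sample):

import numpy as np

# toy frame in YCbCr: full-resolution luma plus two chroma planes
h, w = 2160, 3840
y  = np.zeros((h, w), dtype=np.uint8)   # luma, always full resolution
cb = np.zeros((h, w), dtype=np.uint8)   # chroma planes start at full res (4:4:4)
cr = np.zeros((h, w), dtype=np.uint8)

# 4:2:0 keeps one chroma sample per 2x2 block, i.e. half the resolution in both directions
cb_420 = cb[::2, ::2]
cr_420 = cr[::2, ::2]

full = y.size + cb.size + cr.size            # samples per frame at 4:4:4
sub  = y.size + cb_420.size + cr_420.size    # samples per frame at 4:2:0
print(f"4:2:0 keeps {sub / full:.0%} of the samples of 4:4:4")   # 50%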

Dithered-8 bit and 10 bit look almost identical according to this

HDR is a shit show on the PC.
DRM, mixed with Windows not wanting to do it, mixed with most of the displays that can do 10 bit and HDR not having the bandwidth to do it at 4k.
Hell, I will never buy a blu-ray drive for video again. That shit was a nightmare and half my videos won't play due to DRM; I can literally rip them and have an easier time playing them than playing them on disc, even with the software tax.

just not worth the effort to download 50+gb of data

I have my monitor in 8 bit due to 4k 4:4:4 and 10bit not being compatible with each other through HDMI

this is more that his monitor is so shit he can't see the difference than that his monitor is 8 bit.

they look near identical in real life too. implementation is the sticking point; really you just want a smooth transition between colors and no banding, be it 10 bit or 8 bit. for media consumption, quality does not matter the same way it does for professional use.

Most of your mid-range TVs have true 10 bit panels. My X900E has a true 10 bit panel, and if you look around on displayspecifications.com it's mostly the lower-range TVs that have 8bit+FRC. Hell, the X900E even has a 120Hz panel


>displays that can do 10 bit and hdr don't have the bandwidth to do it 4k
No TV currently can do HDR@4K@60Hz with 4:4:4, as HDMI 2.0 can't do it. Not that it's needed, as all 4k content is done in 4:2:0, so having the screen at 4:2:0 isn't the end of the world
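
Rough bandwidth math for anyone who wants to check: the standard 4K60 timing is 4400x2250 pixels including blanking, and HDMI 2.0's 18 Gbps link leaves about 14.4 Gbps for actual video data after TMDS encoding overhead. Sketch:

# does a format fit in HDMI 2.0's usable video bandwidth?
HDMI20_USABLE_GBPS = 18 * 8 / 10   # 18 Gbps link rate, 8b/10b TMDS encoding -> ~14.4 Gbps

def required_gbps(h_total, v_total, hz, bits_per_channel, samples_per_pixel):
    return h_total * v_total * hz * bits_per_channel * samples_per_pixel / 1e9

# standard 4K60 timing is 4400 x 2250 pixels including blanking
for label, bpc, spp in [("4K60 10-bit 4:4:4", 10, 3.0),
                        ("4K60 10-bit 4:2:0", 10, 1.5),
                        ("4K60  8-bit 4:4:4",  8, 3.0)]:
    need = required_gbps(4400, 2250, 60, bpc, spp)
    print(f"{label}: {need:.1f} Gbps needed, fits: {need <= HDMI20_USABLE_GBPS}")

# 4K60 10-bit 4:4:4: 17.8 Gbps needed, fits: False
# 4K60 10-bit 4:2:0: 8.9 Gbps needed, fits: True
# 4K60  8-bit 4:4:4: 14.3 Gbps needed, fits: True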

>Even the fake 4k looks just as good as regular 4k
In no universe could this ever be the case. If you want an upscaler, you have unlimited options right now, for both pc and console, soft and hard.

except you can plug things in that do output 4:4:4 10 bit and HDR at 4k

now whether the games/computer do it well is another story altogether, but you are able to do it

hdmi 2.1 can't come soon enough.

Call me when TVs support DisplayPort.

>2018
>doesn't have a 1000 up/down fiber connection
>doesn't have multi terabytes of HDD space for media
>can't be bothered to take a couple minutes to browse and queue up some torrents
ISHYGDDT

Attached: president-ishygddt-598b9fe59f4dc.png (671x819, 812K)

Unless you're encoding, HDR is just to fuck with people.

>being this naive

Attached: 1528520634868.png (396x206, 169K)

>G-sync,

when I'm willing to pirate something like a movie, I either don't give 1 fuck about quality and just want to get it in a way that won't cause issues, or, if I'm willing to actually get something physically, it's something I like enough that I'm willing to pay for it.

I don't have the space to piss away 50-128GB on a movie I only like enough to pirate.

Takes me a few seconds to open ptp/btn, click on the remux torrent and start watching (seq DL is a pretty nice feature). Buying/renting the BDs is such a waste of time in comparison.

again, it's either shit I could not give 1 fuck about, just have a passing interest in and it's not streaming legally, or it's something I want to own. there is next to no in-between.

Highlight the file and use the delete key or right click and find the delete file option. Don't forget to clean your recycle bin periodically.
I hope it helps!

You can watch YT or YIFY for all I care, but that was factually incorrect.

meh, he's not wrong. for the most part all that's easily findable are encodes that are sub 10gb, usually sub 5gb so they can fit in a free mega download budget. depending on who does the encode, it may be better than 1080p, it may not be.

that said, is the current 4k HDR HDCP cracked yet or not? I don't keep up with this shit at all so I have no idea.

>0.044USD per GB
128gb = 5.632USD per movie VS 50 USD per physical copy.

have never paid more than $10 for a physical copy of any movie, and the ~$4 I pay extra is worth it just because of the bullshit hoops I have to jump through to get the movie pirated.

>bullshit hoops I have to jump through to get the movie pirated.

Better than the bullshit hoops of DRM for sure; you don't have any guarantee that any of your copies will keep working in the future, just like DivX in the past.

>Jow Forumssync belongs in Jow Forums

No, I won't spoonfeed you how to find remuxes, m2ts and quality encodes, so you can stop posting.

true, bought the Die Hard movie series on blu-ray a while back, got some software to play it on the PC, and the shit would not play. However, I was able to easily rip it, strip the DRM and play it with no issue.

learned the lesson that way at least: don't pay for software that deals with DRM.

so the new hdcp is not cracked yet, good to know.

PQ wise the 49X900F is better than any monitor

Interesting, this explains a lot

>BDs are the only way to get non-bitstarved content
Good post.

>the absolute state of Jow Forums

Attached: 1548573758384.jpg (427x450, 20K)

>27" 4k screen
>not a 24" 4k screen

how did you go from 10bit to HDR?

All monitors are 4:4:4 you massive retard

Next week I'm getting three 24" Flanders Scientific monitors for my VT work

Yeah, let's re-compress 8bit 420 RAWs to 10bit 444 because reasons.

>4K Blu-ray is really the only way to get non-bitstarved versions of movies
You don't need the actual physical BD disks to watch the 4K movie, you know? And I'm not talking about streaming.

Name a monitor that can output 4:4:4 60Hz 4k with HDR, faggot. You know exactly what I mean, don't try and twist my words.

>unless you are just literally retarded

Then you gotta live with the consequences of official media consumption, like the shoved-down-your-throat DRM and other limitations.
Meanwhile I can play my movie at the best possible quality whenever I want, wherever I want, forever and still own it officially if I really like it.

>g-sync

Found your problem

There are 4k, HDR10 (not FRC), FreeSync monitors, aren't there?
But they are 60Hz so far. I heard some higher-Hz ones are coming.

But desu, I think 8bit+FRC looks decent enough as long as the panel itself is AHVA or better. Especially combined with AMD's dithering.
I'm guessing it looks shit on Gsync because of your lack of dithering.

You got scammed and should have fallen for the better color meme.

10bit x264 is a meme.
I went back to 8butt because I'm not a noob and know how to handle the banding.

LG 27UK650-W and 27UK850-W are just that, bumboi.
The only time you can't use 4:4:4 is when you are limited to HDMI 2.0 or older, or a very old version of DisplayPort.
DP 1.2 and up are good enough to run the stuff you're talking about without chroma subsampling.

>Static contrast: 1000:1

Attached: 1528316462354.png (500x552, 98K)

My panel is still 6bit dithered and I never noticed anything that it is lacking.