TV

What TV/monitor does Jow Forums have?

Attached: b7v_alt1_1.jpg (700x597, 37K)

Just bought this. It's nice desu.

Attached: file.png (1600x880, 1.02M)

I got one of these Element 50in 4K smart TVs. Pretty convenient for streaming Plex and stuff

Attached: element-4k-roku-tv-720x720.jpg (720x480, 45K)

I bought a used UN65JS8500 and after calibrating the display I have to say it's really nice

I have a 55" curved oled LG (the 1080p one) and it was one of the better purchases I've made.

>4k
dipshit

>inb4 braindead luma shills raid this thread

Attached: Colorcomp.jpg (1236x616, 212K)

This random 4k 55" TV I won at the office Christmas party. Not sure about the name but you can't beat free.

Attached: haier-55-4k-uhd-tv-rcwilley-image1_500[1].jpg (547x372, 28K)

Yeah, it's 4K with a 60Hz panel

What 8K content do you have for it or do you just love staring at smeared shit on your screen?

Attached: Downsample-Feature-Image-1-640x360.jpg (640x360, 43K)

I mostly stream 4K stuff on it. I've streamed some 8K stuff from YouTube and was quite impressed

what is this autism on about

Literally 75% of all color pixels are missing from 99% of video out there to save space and lower hw decoder costs. All that missing color has to be made up on the fly, which results in noticeable artifacts as the resolution increases. However, none of that happens when you scale video down to 50% (i.e. 4K on a 1080p monitor or 8K on a 4K monitor). Also there's no way to avoid this problem, as even blu-rays have video stored in 4:2:0 CSS.

en.wikipedia.org/wiki/Chroma_subsampling
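
Here's a toy numpy sketch (my own illustration with random data, not a real codec) of what 4:2:0 actually throws away:

import numpy as np

y = np.random.rand(2160, 3840)    # luma plane: full resolution
cb = np.random.rand(2160, 3840)   # chroma planes before subsampling
cr = np.random.rand(2160, 3840)

# 4:2:0 keeps one chroma sample per 2x2 pixel block
cb420, cr420 = cb[::2, ::2], cr[::2, ::2]
print(cb420.size / cb.size)       # 0.25 -> 75% of chroma samples discarded

# playback has to invent the missing samples, e.g. nearest-neighbour:
cb_rebuilt = cb420.repeat(2, axis=0).repeat(2, axis=1)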

Some autistic guy whining about chroma subsampling in movies, despite it being the standard for roughly 70 years and baked into the original NTSC standard. No idea why people are only kicking up a fuss now, although it's mainly one guy spamming every TV thread on Jow Forums for the past month or so.

>which results in noticeable artifacts
Haven't seen anything on my TV.

Those """4k""" RGBW LG screens are fucking dumpster fires.
I seriously hope nobody bought them.

Why do so many people on Jow Forums not know how scaling works on LCDs? 1:4 or 4:1 pixel scaling doesn't exist in the vast majority of cases. Almost all upscaling and downscaling uses some kind of filtering or post-processing, usually bilinear or Lanczos. Look through the MadVR options if you want more info. Downscaling a 3840x2160 4:2:0 image WILL NOT result in a 1920x1080 4:4:4 image, just as upscaling a Bluray on a UHD TV doesn't result in the TV just "squaring" the pixels.
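
To see why, here's a rough sketch (my own Python, not MadVR's code) of the weights a Lanczos-3 resampler applies for a 2:1 downscale. The footprint spans about 12 source pixels, not the clean 2x2 block that "perfect" pixel binning would use:

import numpy as np

def lanczos3(x):
    x = np.asarray(x, dtype=float)
    out = np.sinc(x) * np.sinc(x / 3.0)
    out[np.abs(x) >= 3.0] = 0.0          # kernel support is (-3, 3)
    return out

scale = 0.5                              # 2160p -> 1080p
taps = np.arange(-6, 7)                  # source-pixel offsets from the target pixel
weights = lanczos3(taps * scale)         # kernel stretched by 1/scale when downscaling
weights /= weights.sum()
print(dict(zip(taps.tolist(), weights.round(3).tolist())))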

He does have some merit though: CSS on NTSC worked because the resolution was so low and color representation on CRTs was so shit that it didn't matter to most people. As you increase the resolution this becomes problematic, as pixelation becomes less and less noticeable and color representation more accurate.

A 480p 40" CRT TV has a PPI of about 24
A 4K 40" OLED TV has a PPI of about 100

Now compound this with the fact that most "HD" channels are in 720p and many 4K blu-rays are just upscaled 1080p 4:2:0 CSS rips, and you have an absolute clusterfuck of poor video quality.

I think the biggest problem here is getting 4:4:4 video content to prove my point; I'm not aware of any 4K 4:4:4 video sources.

So does this mean we have to encode 4:4:4 1080p rips from 4K video to get the most 1337 experience?

PC monitors: ASUS VG248QE, Philips 277E6EDAD

TV: Panasonic TC-L47E50

Attached: 54746.jpg (590x395, 19K)

>However none of that happens when you scale video down to 50%
False. The artifacts you're talking about mess with the downscaling, so it's literally impossible to get a better picture that way.

Let's say you have two displays: one 55" 1080p and a 55" 2160p.
When you watch 4K 4:2:0 video on the 2160p display, you're getting the same color resolution you'd get on the downsampled image, except you get a shitton more sharpness and detail.
Doesn't matter if you're looking at them from 5 meters tho.

AOC I2476VWM

Had it for about a year and a half, paid $180 for it. Excellent value.

Attached: 71C1gb8IzqL._SL1200_.jpg (1200x1000, 160K)

$200 Black Friday 1080p Emerson LCD TV.

Did find a $300 used BenQ HT3050 1080p projector. I could get double if I were to sell it on ebay.

Not him, but that doesn't make any sense. 2160p 4:2:0 chroma-subsampled video has 1080p color video inside it and 4K greyscale video.

When you play said video on a 2160p screen, the 1080p color video inside it is up-scaled to 2160p res and the 2160p greyscale video is left at 100% (i.e. no scaling).

So if you played said video on a 1080p screen, the 1080p color video inside it is left at 100% (i.e. no scaling) and the 2160p greyscale video is down-scaled to 50%, right?

Attached: 1521159202722.jpg (1651x1100, 862K)
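
Here's a rough numpy sketch of what you're describing (naive nearest-neighbour/box scaling purely for illustration; as pointed out elsewhere in the thread, real players use filtered scaling):

import numpy as np

Y = np.zeros((2160, 3840))    # luma plane: full resolution
C = np.zeros((1080, 1920))    # each chroma plane: quarter resolution

# 2160p display: chroma gets upscaled to meet the luma
C_up = C.repeat(2, axis=0).repeat(2, axis=1)            # -> (2160, 3840)

# 1080p display: luma gets downscaled to meet the chroma
Y_dn = Y.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))    # -> (1080, 1920)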

>monitor
Dell U2414H
Pretty decent panel, very good for gaming because the overall lag is very low along with a high quality IPS display. Flicker free of course. Really small bezels.
It has some issues sadly.
DisplayPort daisy chaining doesn't work reliably (or at all).
The EDID is fucked up to the point that both NVIDIA and AMD cards set up limited color range, because only TV-primary resolutions are listed in it. So unless you switch it back to full range, the display looks very pale. You can rewrite the EDID with CRU so this doesn't happen (on the PC, not on the display).
When it's plugged in but not in use (e.g. when only the TV is active), it keeps turning on and off, showing the NO INPUT box for a couple of seconds every couple of minutes, which is pretty annoying in a dark room.

>TV
LG 55C7V
It's fucking awesome.
OLED. Acceptable overall lag for gaming.
The WebOS is very good. There's a dedicated netflix button on the remote, very handy.
I can't say anything bad about it. No signs of any permanent burn-in or image retention, even though I've been using it almost every day for 8 months now, mostly for gaming.

I have a sj800 and got one of my relatives to buy a uj750

That sounds about right.
So you lose image information by down-scaling the 2160p greyscale video.
What doesn't make any sense?

LG 43UD79-B

Attached: 2018-03-14 18.11.38.jpg (2689x1871, 964K)

Attached: 1522778262463.png (1953x1077, 1.62M)

Keep in mind 4K 60Hz 4:4:4 really pushes the upper boundaries of HDMI, so you need a very good quality cable, preferably a Premium Certified HDMI cable with a confirmation holo-tag; otherwise you get random blackouts or frame drops, or you can't even handshake with 4:4:4 set.
If it barely works, disabling audio in the nvidia control panel under "Set Up Digital Audio" frees up some bandwidth.
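
Rough math on why it's so tight (a sketch assuming the standard CTA-861 4K60 timing and HDMI 2.0's 18 Gbit/s TMDS limit):

# 4K60 timing is 4400 x 2250 total pixels including blanking
pixel_clock = 4400 * 2250 * 60          # = 594 MHz
tmds_rate = pixel_clock * 3 * 10        # 3 channels, 10 bits per 8b/10b symbol
print(tmds_rate / 1e9)                  # ~17.82 of the 18 Gbit/s available
# 4:2:0 halves the total pixel data, which is why it fits with headroom to spare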

I'm surprised you can specifically disable HDMI audio, but DisplayPort has no option available.

Sorry I flunked middle school and did a lot of downers. It's a miracle I can work in a warehouse desu.

Doesn't the yt chroma sub sampling transcoding make this comparison invalid?

If that were the case, wouldn't the 4k to 1080p crop look identical to the native 1080p crop?

shit man idk

Attached: 1519186398840.jpg (480x336, 23K)

Yeah, you cannot disable it on DisplayPort, but you probably don't even need that little extra bandwidth, as DisplayPort was always way ahead of HDMI and can easily carry 4K 60Hz 4:4:4 on slimmer cables than premium certified HDMI.
Sadly you don't get DisplayPort on TVs.

shit i found in a garbage bin

Attached: 1479846630997.jpg (3264x2448, 1.36M)

Yeah, youtube looks utter shit; you can easily tell the difference between 2160p and 1080p on a 1080p display.
Also 4K streams don't look that good on a 4K display, you need 8K streams for that.
But that's not because of the chroma subsampling, that's because they're using pretty low bitrates sadly.

>Doesn't the yt chroma sub sampling transcoding make this comparison invalid?
Yeah, with streaming the bitrates are so low that chroma subsampling is the least of your worries.
The discussion is only relevant when talking about 4K blurays, and the DRM involved makes it almost impossible to play one on a 1080p screen. So basically your only option is to rip a 4K bluray with MakeMKV (which is still pretty flaky) and transcode it to 1080p, which depending on the program may not actually convert 4:2:0 to 4:4:4 cleanly. Also you will need to tonemap HDR to SDR, seeing as almost no 1080p screens are true HDR10. So not only would you be losing 75% of the luma information, you'd also be tone-mapping HDR down to SDR. If you ask me it's not worth paying $25-30 for a 4K HDR 4:2:0 bluray only to watch it in 1080p SDR 4:4:4, especially considering the time investment.
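
For what it's worth, here's a hedged sketch of that transcode step (filenames hypothetical, ffmpeg driven from Python; note it skips the HDR-to-SDR tone-mapping entirely, which would be a separate filter chain):

import subprocess

subprocess.run([
    "ffmpeg", "-i", "rip_2160p.mkv",             # hypothetical MakeMKV output
    "-vf", "scale=1920:1080:flags=lanczos",      # filtered downscale, not 2x2 binning
    "-pix_fmt", "yuv444p10le",                   # keep chroma at full resolution, 10-bit
    "-c:v", "libx265", "-crf", "18",
    "-c:a", "copy",
    "out_1080p_444.mkv",
], check=True)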

Yup, especially when you take into account that 4:2:0 1080p encodes are everywhere and 1/3rd of the file size.

So we're fucked even if we used 4k pro-res shit for streaming...

Thought we'd have cheap 1PB HDD and affordable 10Gbps ISP by now mang.

Attached: USad.jpg (700x632, 197K)

some shitty samsung 55" 1080p 2014 model that's full of dark blobs. never samsung again

Get with the times, chromaposter.

Attached: shoo shoo chromaposter.png (822x1320, 374K)

Attached: retard.png (631x414, 222K)

tv/monitor = tcl p 605

really hard to argue against it or for anything else.

luma shills in full force today

Attached: typical_luma_shill.png (709x858, 30K)

You're literally the only one parroting that bullshit.

1080p 4:4:4 looks great compared directly to 1080p 4:2:0 (Bluray quality)

BUT, with 1080p 4:4:4 directly compared to 4K 4:2:0, the 1080p, no matter how good it is, won't look better than the 4K.


The evidence has been posted many times, you can ignore direct screenshot comparisons all you want, but at the end of the day, it's pretty obvious to anyone not biased (as you obviously are).

a 20" 2008 iMac for my main monitor and a 24" PlayStation TV that's generally used with my HTPC but I switch inputs and use it as a secondary monitor when I need one

And 8K 4:2:0 looks better than 4K 4:2:0 on a 4K display. If luma mattered so much we'd get blu-rays in 4:1:1

because 8k 4:2:0 is pretty much 4k 4:4:4.

You basically just repeated what I said, but you replaced 1080p and 4k with 4k and 8k.

look man, all this boils down to is that 4K isn't worth it when there's little 4K 4:2:0 content NOT upscaled from 1080p 4:2:0 rips, much less 8K content

No, the point is 4k looks better than 1080p, even if it's 1080p 4:4:4.


No one really gives a fuck about 4:4:4


We didn't care when 1080p came around, why should we care now with 4k?

>If luma mattered so much we'd get blu-rays in 4:1:1
Because 4:2:0 has the same bitrate as 4:1:1 but looks better. That's all this is about: compression. 4:4:4 increases bandwidth and filesize too much for minimal gains in picture quality.
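
The sample counts per 4x2 block of pixels, if anyone wants to check the math (my own quick table in Python):

formats = {
    "4:4:4": (8, 8, 8),   # Y, Cb, Cr samples per 4x2 block: full chroma
    "4:2:2": (8, 4, 4),   # half horizontal chroma
    "4:2:0": (8, 2, 2),   # half horizontal and half vertical chroma
    "4:1:1": (8, 2, 2),   # quarter horizontal chroma, full vertical
}
for name, (y, cb, cr) in formats.items():
    print(name, "->", round((y + cb + cr) / 24, 3), "of the 4:4:4 data")
# 4:2:0 and 4:1:1 both come out to 0.5 -- same bitrate, different sampling grid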

Monitor: Acer XB280 HK 4K GSync
TV: Samsung KU6079 50" 4K LED

pretty nice for a midrange tv, very low input lag

>TFW (You) are braindead

Attached: 4k.jpg (1454x1406, 103K)

I guess, you got me there. People are still unironically watching SD TV on 4K TVs. Can't wait for 16K TVs and people watching VHS tapes on them

Wow, this is going to magically replace the 75% missing chroma values with 100% accuracy.

He's talking about the content, not the signal sent to the TV/Monitor.

The content being 4k streamed video, and 4k UHD blurays, both of which are 4:2:0, not 4:4:4.

>much less 8K content
Why are you still under the mistaken impression that 8K 4:2:0 downsampled to 4K is 4:4:4? Or that 4K 4:2:0 is 1080p 4:4:4? That's not how upscaling and downscaling work.

Does 4:4:4 blu ray even exist? Bluray is still much better quality than any streamed content.

In the meanwhile, I'll enjoy my rip collection in 4K.
>TFW Stargate only in 480p

>TV
A "Digital Lifestyles" 42" 1080p LCD tv that I got in 2008 when Newegg ran a firesale on them. It shows up as a "LG Panel" when I plug my computer into it. It's not the greatest TV out there by a long stretch but considering the price I paid and that it's lasted for over 10 years now I'm pleased with it.

>Monitor
Yamakasi 27" 1440p IPS gookshit monitor that I bought on ebay and had shipped direct from South Korea about 6 years ago. Build quality is meh -- it has a few stuck pixels but they're only really visible when the screen is completely black, it also fails to recognize video feed from my computer sometimes when I first turn the computer on so then I have to turn the monitor on and off repeatedly until it works properly. I've been expecting it to crap out completely for the last couple of years but it just refuses to die...

It actually is, though you'll be better off capturing native 1080p 4:4:4 instead of trying to encode 4k 4:2:0 into 1080p 4:4:4

Except it is at least for chroma values. There's literally 1080p color video inside 4K video.

Good blacks for only $599

Attached: 1_P6_front_7.png (720x480, 473K)

No, he's talking about taking a 4k bluray that is encoded at 4:2:0, then re-encoding at 1080p to 4:4:4 since you have 4x the information from the 4k rip.

It technically works, but simply isn't worth the effort, and barely looks better than 1080p 4:2:0. Let alone 4k 4:2:0, which simply looks better.

>tfw this is actually the biggest reason why 4K video capture on phones is a thing now

There's no such thing as a good black.

No, 4:4:4 video content largely doesn't exist: 99% of digital video cameras, even professional ones, store video in 4:2:0 or at most 4:2:2, making his shitposting pointless. The only real way to get a 100% RGB signal is PC video games. And even then you'll probably get aliasing artifacts, which means you need MSAA or SSAA to get a perfect image.

>99% of digital video cameras, even professional ones
I wouldn't go that far, it's become fairly standard for most of the 4k res prosumer and real pro cameras out there to do 1080p 4:4:4 now.

That being said, 1080p 4:4:4 is almost never released and is used for mastering.

Same with the few 4k+ 4:4:4 cameras out there. The content never gets released in 4:4:4.

The " 72 local dimming zones" as they call them turn off automatically to produce good blacks.

Attached: 20180401_134817[1].jpg (4032x3024, 1.7M)

Your player is likely upscaling it to 2160p first, then downscaling it to fit your screen. Again, LCDs and media players don't do perfect pixel scaling; they use bilinear filtering or something similar. You're not getting a 4:1 image unless you manually transcode it.

Most VA panels can do decent black levels in a dark room. That's not exactly something new.

>unless you manually transcode it.

which is exactly what he's suggesting.

he's a moron.

I saw this in action, it looks utter shit.
Things on a black screen have a stupid glow on the black. I mean I wasn't expecting anything else.

>That's not exactly something new.

Good to know. I also realized that Dolby Vision is better than HDR10.

>stupid glow on the black
my tv doesn't have that problem

Got this brand new for $200 last month, best snag I've had in a while.

A-are you guys sure?....

Attached: 1518823788596.jpg (665x574, 29K)

>Dolby Vision is better than HDR10
in what sense?

Your TV doesn't have the specs to matter in either case. It barely meets the HDR spec.

Dolby Vision is more future-proof since it supports up to 12-bit color depth, but your TV is using an 8-bit + FRC panel, and while Dolby Vision supports up to 10,000 nits peak brightness, I bet your TV doesn't even achieve more than 800 nits peak brightness. Less than 1/10th of what Dolby Vision supports.

>my tv with that technology is a miracle that doesn't do the thing that technology does

There are some high end scalers that might treat it properly, but I don't know of any that exist in currently produced TVs/bluray players

So it's something we might see in the future, but as of right now, I don't think it exists in consumer devices.

It says it's 12-bit 60Hz

Attached: rds.png (1380x196, 35K)

It's called OLED faggot

That's the output of your GPU, it doesn't mean your display can actually use that information unfortunately.

> it doesn't mean your display can actually use that information unfortunately.

that is complete and utter bullshit, user. What TV do you have?

Yeah it's pretty cool. What streaming apps do you like to use?

I really only use Netflix.

I have it hooked up to my computer so I play whatever I want.

It's not, the things you're reading in the AMD control panel are your GPU settings, not your monitor settings. I know this because my monitor is only 8-bit but my AMD GPU can output a 10-bit or 12-bit signal as well. Granted, your TV can USE it, after all I think the other guy mentioned it's 8-bit+FRC, which means it "simulates" 10-bit RGB with dithering.

The same is true for setting RGB to full or limited. That only controls the signal your GPU sends; it doesn't control whether your TV or monitor itself is set to full or limited.

Whenever I play Rogue One in HDR and switch to Dolby Vision, the reds and whites look more realistic.

It can say whatever it wants.

Your TV supports 8 bit + FRC, for ~1.07 billion colors.

12 bit would display 68 billion colors. Sorry, your TV doesn't even come close to that.

Attached: 96.png (830x654, 17K)
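
Where those numbers come from, if you want to sanity-check them (per-channel levels cubed across R/G/B):

for bits in (8, 10, 12):
    colors = (2 ** bits) ** 3
    print(f"{bits}-bit: {colors / 1e9:.2f} billion colors")
# 8-bit:  0.02 billion (16.7 million)
# 10-bit: 1.07 billion
# 12-bit: 68.72 billion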

TCL doesn't sell any native 10 bit panels, let alone native 12 bit.

Attached: 2018-04-06 16_11_02.png (678x352, 16K)

rtings.com/tv/reviews/tcl/p-series-2017-p607
It's a good panel for the price, just not true 10-bit. Rather it uses FRC, or temporal dithering, to display more colors than it would normally be capable of. Almost all displays use this; true 10-bit displays are very expensive, and 12-bit displays that are 100% compatible with Dolby Vision don't really exist.

Attached: tcl.png (1300x1157, 547K)

money well spent then.

anyway i'm gonna get drunk and watch Mindhunter in fake dolby vision.

Attached: 20180406_154741.jpg (3640x2540, 1.63M)

>12-bit displays that are 100% compatible with Dolby Vision don't really exist
Pretty much, the best you can get at the moment is a native 10 bit OLED panel

Sony has a decent one (PVM-X550) for only about $30,000

I just got a 55 inch X900E last week on a pretty good deal (paid $750 before tax). I had previously been waiting for the new TCL 6 series to come out and I might get buyer's remorse when it does, but I liked this one a lot more than the P series.

....I do wish I could have gotten an OLED though. Just can't justify it with my other expenses right now.

Android TV seems to kind of suck, but Plex seems to work OK. I have some old hardware lying around, so I'm debating building a Plex server rather than buying a 4K HDR capable Bluray player.

Attached: XBR-55X900E-3.jpg (1100x1100, 117K)

Yes, it's an 8-bit panel with FRC, so it can display 10-bit color.

Even your image says that, just look up

>Color depth: 10 bit

Mindhunter season 2 when?

Sony 49XE7005, the 2017 model.

Great image, everything else is shit tho. At least I paid like 70% of the original price, got it on sale.

Great 4K rendering, great colours, perfect for watching 4K movies and 4k web content.
Browser sucks, apps suck, menus are a mess, the remote is basic.

Attached: 626291f426e8771c8201a905b4168234.jpg (1200x791, 114K)

TVs: Pioneer Kuro 9g (good). Sony w805 (Unplugged. Garbage like any VA-panel).
Monitors: Dell U2412M. Some Asus 144 hz tn garbage.

FRC means it displays 10-bit at half rate, so in 60Hz mode it dithers at 30Hz, and at 120Hz it might dither at 60Hz. So while it's much better than a vanilla 8-bit display, it's not "true" 10-bit. This is the same tech those old shitty 6-bit TN monitors used to use, only the technology has probably improved a lot since then.
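
A toy model of what FRC is doing (my own sketch; real panels use smarter spatial + temporal patterns than this):

target = 513                              # desired 10-bit level (0..1023)
lo, hi = target // 4, target // 4 + 1     # bracketing 8-bit codes (128, 129)
n_hi = target % 4                         # how many of every 4 frames show 'hi'

frames = [hi] * n_hi + [lo] * (4 - n_hi)  # the panel flickers between the two codes
perceived = sum(f * 4 for f in frames) / 4
print(frames, "->", perceived)            # [129, 128, 128, 128] -> 513.0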