LG UNVEILS FIRST OLED TVS TO SUPPORT NVIDIA G-SYNC FOR BIG SCREEN GAMING EXPERIENCE

KINO

Attached: 2019-OLED-TV-with-NVIDIA-G-SYNC_1[1].jpg (1742x1742, 368K)

Other urls found in this thread:

youtube.com/watch?v=KkpXgU-G994
nvidia.com/en-us/geforce/news/lg-gsync-compatible-hdmi-big-screen-gaming/

>paying an extra $150 for GoySync when Nvidia cards now work with any VRR monitor
Top cuck.

>Oled
Kek.

Based. 1440p g-sync monitor for heavier/competitive games and 4K OLED g-sync for lighter/single player games sounds like the ideal setup for me with a 2080 Ti.

>oled
>gaming

are they retarded?

>LG
Fuck no

No, OLEDs have 0.1ms response times

You retard. It's G-sync over hdmi 2.1. Hdmi 2.1 officially supports a VRR protocol and Nvidia supports it

based retards
the screens literally don't allow for burn-in with auto pixel refresh on still images. been implemented for years now.

Too bad vrr on tvs is all over hdmi, meaning you're shit outta luck until the next nvidia gpus since the rtx line stupidly doesn't support hdmi 2.1. Even with the freesync tvs out there you can only use them with an amd card and you're shit outta luck there if you wanna play at 4k.

HDMI 2.1 is one of those things where certain features are basically backwards implementable which is how Samsung's VRR works on their HDMI 2.0 TVs

With this announcement Nvidia has supposedly decided to support HDMI Forum VRR
VRR can be implemented on older HDMI versions along with other features from HDMI 2.1

Based

Color me surprised. I didn't think backporting features would be possible. Thanks for the heads up.

>defending GoySync
>calling anyone else a retard
LOL

Enjoy paying a premium for this "support" when any TV with VRR support will work.

no need for this when freesync is available. does this mean they conquered burn in though?

Who is paying a premium in this instance?
This announcement applies to TVs that already exist

The TVs were already sold and have been selling for months without this feature you mongoloid. It's a free feature in the form of a firmware update, no one's paying extra money for it, and since the TVs came out in May-ish they've dropped almost $1000 in price

No premium, G-Sync at this point just means G-Sync compatibility from the Nvidia GPU, for HDMI 2.1 ready TVs like the LG C9 that means VRR compatibility with Nvidia GPUs. There will be firmware updates for current line-up, there are no G-Sync modules involved which is what you pay a premium for.

imagine being this blinded by fanboyism that you can't even figure out what the fuck is going on

>OLED
>gaming
It will be incredibly responsive and look amazing, until the HUD and crosshairs get burned in.

With this the C9 is my new holy grail. I'm hoping it drops to sub $1k canadian by next year's black friday/boxing day

>LG
>nvidia
>G-sync
Could you please include even more garbage in that post

>OLED
No thanks. I'll only buy a TV if I can reasonably expect it to work in good condition for at least ten years.
I'm going to see if I can ride this 1080p LED until MicroLED TVs start coming out.

Sweet, I just got my C9 last week.

Attached: IMG_4261.jpg (1921x1440, 973K)

>canadian
Unfortunately your play money is no good. $1000 USD maybe. The C8 was more expensive than the C9 at this point in its lifetime but dropped to $1300 on black friday 2018. Wouldn't be surprised to see the C9 hit 1000 USD

the only thing keeping me from upgrading my current oled is the dreaded panel lottery
i also hate moving shit

it doesn't have a goysync module, you can tell by not hearing the whiny little fan when you turn 120hz mode on, it is just a goyvidia marketing buzzword slapped on a device that supports hdmi vrr
if you have enough space to position a 55 inch tv at optimal viewing distance for pc use, you can get today's best consumer monitor for 1500usd, which is cheaper than the shitty haloing fald monstrosities from acer or asus, with lower input lag on top of it

but just so you won't get overly happy about it, you still need to wait for hdmi 2.1 cards to get 4k/120, and by that time lg will offer new models with 120hz black frame injection, lower input lag, reportedly also in a 49 inch size, and with hdmi 2.1 bugs ironed out (try setting up earc now with your htpc...yeah)

It's sad that a TV is now a better gaming monitor than most monitors, including BFGDs. Hell, even laptops have had OLED for years now.

Nah it's just everyone started adding G-SYNC label now since nvidia unlocked VRR.

>with 120hz black frame injection
Is it different from strobing? I mean, since it's oled, the actual pixels are glowing, which makes it effectively strobing. That shit will make you want to kys after an hour of playing due to eye strain.

The LG 2019 OLED line-up can do 1440p@120hz. I don't know how good the upscaling is, but for a TV it should be good enough. They can even do 4k@120hz, but there's no GPU with HDMI 2.1 right now to test it.

What's up with this retarded meme? Why can't you just use an appropriate port for a PC display which is DP? DP 1.4 allows 4k@144Hz.

who cares about dumb shit like g sync seriously ?

It doesn’t combat burn in. It has absolutely nothing to do with burn in despite the myths floating about.

The short refresh performed during stand-by every 4 hours of use, simply pulses voltages across the OLEDs to remove any image retention that may have happened during those last four hours, so the screen is clean next time you switch it on.

The hour long refresh that can be done manually (or runs automatically every 2000 hours) recalibrates the brightness of your panel back to its optimum by measuring voltages and effectively “burning down” ones that are unusually high, to get an even field across the panel - it can then up the voltage back to full brightness without danger of blowing up the ones that were high. It does this in vertical batches, which is what causes the banding, and why the bands ‘move’ over time. It is the most dangerous thing (to picture quality) the panel does, as it can be influenced by many outside factors such as room temperature or power cuts, but is a necessary evil as otherwise, over months, the panel would just get dimmer and dimmer. It also shortens the lifespan of the panel. This is why you should not be using this function repeatedly, and why Sony officially recommends it only be used once per year.
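
Hedged sketch of that long-refresh idea (not LG's actual firmware, just the principle described above): measure each subpixel's relative output, then derive drive corrections that pull the unusually high ones down so the field is even again before brightness is raised back up. All numbers below are illustrative.

```python
# Toy sketch of the compensation idea described above -- NOT LG's real
# algorithm, just the principle: measure relative output per subpixel,
# then compute drive corrections that flatten the panel to its dimmest
# cell, after which overall brightness can safely be raised again.

def compensate(measured):
    """Return per-subpixel drive corrections that flatten the panel to its dimmest cell."""
    floor = min(measured)
    return [floor / value for value in measured]

# relative outputs from a (hypothetical) measurement pass; 1.06 is an aged-high outlier
panel_row = [1.00, 0.97, 1.06, 0.99]
print(compensate(panel_row))   # corrections <= 1.0; the outlier gets ~0.915
```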

Attached: 1493986915567.jpg (599x449, 38K)

Well, pixel refresher just ran its "every 2000 hours" cycle on my C7 earlier tonight, and now I have to contact LG about a panel replacement because it burned a giant rectangle into my screen (which is apparently a known issue)... That's what I've seen it "actually" do... so, not impressed :(

youtube.com/watch?v=KkpXgU-G994

Already in touch with LG, they want pictures of the issue (for sharing with reddit, the youtube video does a better job, since I don't have to have the whole display in the frame at once). But if you want a picture, here's a picture.

Attached: IMG-20190116-123915.jpg (1280x720, 73K)

Because it's not a PC display. It's a TV. It's mainly meant to be connected to cable/consoles/receivers/Blu-ray players. All the manufacturers for the aforementioned devices are part of a forum which can dictate a common interface for them and that's HDMI.
It just so happens that these TVs also make great PC displays for games.

Also, FYI, DP 1.4 can only do 4k@144hz with chroma subsampling for SDR. 4K, 10bit, full-chroma HDR is limited to 98hz. HDMI 2.1, which these TVs have, can already do 4k@120hz with HDR, but no GPU has implemented it yet.
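
Rough sanity check on those numbers (back-of-envelope, assuming reduced-blanking timings; exact figures vary slightly per display):

```python
# Back-of-envelope check of the bandwidth claims above. Blanking overhead
# is approximated (CVT-RB2-style reduced blanking); real timings vary.

DP14_EFFECTIVE_GBPS   = 25.92  # HBR3: 32.4 Gbit/s raw minus 8b/10b line-code overhead
HDMI21_EFFECTIVE_GBPS = 42.67  # FRL:  48 Gbit/s raw minus 16b/18b line-code overhead

def video_rate_gbps(h=3840, v=2160, hz=60, bpc=8, h_blank=80, v_blank=62):
    """Approximate uncompressed full-chroma RGB data rate in Gbit/s."""
    pixels_per_frame = (h + h_blank) * (v + v_blank)
    return pixels_per_frame * hz * 3 * bpc / 1e9   # 3 subpixels per pixel

for label, hz, bpc in [("4K 144Hz  8-bit", 144, 8),
                       ("4K 120Hz 10-bit", 120, 10),
                       ("4K  98Hz 10-bit",  98, 10)]:
    rate = video_rate_gbps(hz=hz, bpc=bpc)
    print(f"{label}: ~{rate:4.1f} Gbit/s | fits DP 1.4: {rate <= DP14_EFFECTIVE_GBPS}"
          f" | fits HDMI 2.1: {rate <= HDMI21_EFFECTIVE_GBPS}")
```

Which lands at roughly 30, 31 and 26 Gbit/s, so only the 98hz full-chroma case squeezes under DP 1.4, matching the limit above.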

Retard

as oled doesn't use backlighting, it turns all pixels off so the whole screen goes black; some people notice flickering, far fewer will in 120hz mode

1440p upscaled to 4k looks noticeably better than 1080p upscaled to 4k, so upscaling works

>BLIG SCLEEN GAMING EXPLERIENCE

Attached: cvbbmwwe4rzz.png (403x448, 53K)

That still is some weird workaround. Why does OLED have blur issues in the first place? It's the fastest modern display technology.

Image retention and burn-in are the same thing for OLEDs. It's just a matter of severity. Pixel refresh is 100% used to mitigate burn-in by wear leveling the cells. It's not perfect, but a lot of people have had no problem using even older models as monitors, and newer ones like the C9 improved the subpixel sizes so colors like red and green don't burn in as easily as on the old models

It completely removes cinemascope burn-in, which makes it more burn-in resistant than a local dimming backlight when mostly watching movies.

doesn't nvidia support freesync now? why ride this dead horse?

it will be more beneficial for low framerate content, where stutter can be more visible because the image stays displayed for longer, but it is always nice to have better motion if it doesn't come at the cost of too much brightness or input lag increase

they supported it only through displayport, after this update it will work via hdmi

Technically your eyes are the problem, not the display. If you don't want flicker you get blur (at current framerates).
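
Rough numbers behind that (assumed panning speed, simple sample-and-hold model): perceived blur is roughly how long each frame stays lit times how fast the image moves, which is why ~0.1ms pixel response still looks blurry without black frame insertion.

```python
# Rough illustration (assumed numbers) of sample-and-hold motion blur:
# while your eye tracks a moving object, each frame is held static for
# its full refresh period, smearing the image across the retina.

def tracking_blur_px(refresh_hz, motion_px_per_s, lit_fraction=1.0):
    """Approximate eye-tracking blur width in pixels; lit_fraction < 1 models BFI/strobing."""
    frame_persistence_s = lit_fraction / refresh_hz
    return motion_px_per_s * frame_persistence_s

speed = 960  # px/s, an arbitrary panning speed for the example
print(tracking_blur_px(60,  speed))       # ~16 px of smear at 60Hz, full persistence
print(tracking_blur_px(120, speed))       # ~8 px at 120Hz
print(tracking_blur_px(120, speed, 0.5))  # ~4 px with 50% black frame insertion
```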

>OLED
>G-Sync
Dead on arrival.

>c7
>buy shitty prototype
>get shitty results

only on Turing tho

>they supported it only through displayport, after this update it will work via hdmi
Source needed. I can see them supporting g-sync over HDMI, but not freesync over HDMI.

It's literally all over the tech news websites.

nvidia.com/en-us/geforce/news/lg-gsync-compatible-hdmi-big-screen-gaming/

They never supported freesync. Freesync is an AMD hardware specific technology that uses adaptive sync on DP and some custom protocol for HDMI on freesync compatible monitors. Nvidia basically ported gsync over to use adaptive sync like freesync, but they're still different technologies.

Gsync over HDMI uses the HDMI 2.1 VRR protocol

You're really thick in the head, aren't you? G-sync does not equal Freesync. Nvidia has left the option on the table to lock support for freesync over HDMI because they're a business interested in making money. The only mention of freesync in your shitty link is in the comments.

Did you just abuse jannie powers to delete my post you little shit? Good luck with that, lol.

>just buy the latest one bro
>no first batch of products was ever close to being a prototype trust me bro

I can't tell if you're trolling, illiterate or retarded. Either way, you did not even bother to read the whole article even after being spoonfed like a toddler. Freesync is an AMD brand, that nvidia doesn't want to associate with, because that would probably diminish their own g-sync brand. Why would they mention Freesync by name in their own press release?

See

So no confirmation that freesync will also work over HDMI at this time.

No confirmation that you have an IQ over room temperature value either.

Finally I can get freesync on my 1080 Ti over HDMI. Thanks Nvidia!

Obviously marketing departments are doing their jobs right as people are confused as fuck about buzzwords.

tldr gsync is pretty much the same thing as freesync today, a driver suite that works on top of open standards like adaptive sync (vesa) or hdmi variable refresh rate (hdmi forum)

stop being antisemitic goy

Not yet, but if g-sync compatible works I'm sure amd will have it working shortly, if it's not working already

freesync works via hdmi like 3 years already, when will it work with these specific tvs depends on when amd releases drivers that support them

they have it even on their faq page, it would be retarded for them not to when it was also them (along with ms and intel) pushing for vrr on the hdmi forum

based retard

>doesn't allow for burn in
OLED "burn in" isn't burn in in the same sense as it was on plasmas, it's burn OUT. Those pixels will only last so many hours. If you left the whole screen on white as an experiment, it wouldn't "burn in" it would just gradually get dimmer and dimmer as the pixels slowly die until they completely die at ~10,000 hours.

in 2019? gsync is dead

>tfw you won't live to see microled TVs and monitors
They aren't even going for full microled panels, they're just planning to use them as local dimming backlights.

2080ti here, 4k isn't happening mate. I can barely get 30fps at 1440p in some games maxxed, even before rtx on shit

What, does this pixel refresh program display a non full screen rectangle for long periods?

Sometimes you have to drop to "very high" which looks almost identical but performs much better.

You'll see microLED TVs, they might not be cheap but they will exist for mere mortals, in the same league as current OLEDs; TVs usually are quicker to adopt new tech
Monitors you might as well give up on though, it's a dead market and the consumers in it are ironically some of the biggest luddites

Nope turns out rtx and dxr are just memes atm and not really ready to use even at 1080p
Normal games run 4k 60hz easy but good luck getting 120hz out of aaa+ vidya at 4k

He said even before RTX on shit

...

too bad they are all too huge to use as a monitor

no 43" option

A 2070 can run most rtx games at 1080p at around 60fps

The 2020 lineup will include a 48" model according to LG