Freesync on Nvidia GPUs confirmed

nvidia.com/en-us/geforce/news/g-sync-ces-2019-announcements/

>For VRR monitors yet to be validated as G-SYNC Compatible, a new NVIDIA Control Panel option will enable owners to try and switch the tech on - it may work, it may work partly, or it may not work at all.

ALL FREESYNC MONITORS WORKING WITH NVIDIA CARDS STARTING JAN 15TH

Attached: [email protected] (1700x820, 564K)


Freesync is AMD's implementation of adaptive sync. Freesync is a type of VRR. "G-sync compatible" != freesync

>all freesnyc monitors

Uh, no. Only panels they've tested and certified. There's 12 in that list, which will grow in time--but there are fucking hundreds of freesync panels on the market. Blindly stating it as fact is dangerous, stop.

literally read the OP greentext retard. Any a-sync monitor will be able to turn on VRR with nvidia GPUs, they just don't guarantee it'll work

read the link first before commenting

Attached: 80c.png (645x729, 77K)

find me literally any VRR monitor for sale that isn't branded as freesync or g-sync

A-Sync (Adaptive-Sync) != FreeSync

so that is cool and all but this is admitting they lost vs amd with their expensive gsync implementation vs their freesync which is or became a vesa standard

freesync and gsync are forms of adaptive sync
unless youre talking about adaptive vsync

>so that is cool and all but this is admitting they lost vs amd with their expensive gsync implementation vs their freesync which is or became a vesa standard
good, backed into a corner and forced to adapt to the industry standard and user needs

No, adaptive sync is a VESA protocol that allows the source device to control refresh rate. The original use of adaptive sync was PSR, panel self refresh, used on Intel laptops to stop refreshing the display in low power mode.

>I'm a retard who can't read

Attached: blinking guy.gif (320x376, 616K)

who amd card+ gsync monitor master race here?

>All Compatible displays are tested with GeForce 10 series and RTX 20 series graphics cards, and will be enabled automatically.

Sounds like a complete Freesync implementation, but since this will be a default feature, only models that meet a minimum variable refresh rate range will be marked.
Is there a database of Freesync activation limits?

hoping it works on my xg2401

Based nvidia levelling the playing field. nobody will have to pay the goysync tax now

About fucking time desu, this could've happened 2 years ago. Hope they sacked the idiot in marketing whose bright idea it was to brand a signalling protocol and charge fees for it. Fucking retard.

cool, but what are the benefits of a freesync/g-sync monitor?

xg2402 bro here, hoping us the best

No screen tearing in games
No input lag or framerate getting cut in half like you get with vsync

Brainlet who has never used this stuff before, can someone give me a rundown on what the following terms mean?

Freesync

Gsync

adaptive sync

VRR

Vsync

thanks

Attached: spurd.jpg (183x183, 10K)

No. Fuck off, reddirt-spacer.

>Freesync
amd's open standard
>Gsync
nvidia's proprietary standard, costs money to license and requires additional hardware, increases cost of display by $100+ compared to similar panels without
>adaptive sync
an attempt to match refresh rate to content framerate
>VRR
the concept of a refresh rate being able to be changed instead of static
>Vsync
frame rate limited to match refresh rate

i spaced it like that for viewing reasons i dont use reddit sry

Attached: prooohhh.png (960x960, 138K)

Why is Nvidia actually trying to do something for their customers for once, yet Intel continues to be the eternal jew?

Attached: 1506905228968.png (720x546, 358K)

>Vsync
Solves screen tearing, which is what happens on a fixed-refresh rate display (e.g. 60hz) when the display wants the next frame but the GPU hasn't finished rendering it yet, so it sends part of the old one along with whatever it has of the new one.

Vsync does eliminate screen tearing, but at the cost of some input lag and the possibility of pretty major framerate drops when the GPU is rendering at less than the refresh rate. The display still wants a new frame every 1/60th of a second, but instead of sending two partial frames, vsync will send only completed frames, but if it doesn't have one it just redisplays the last one. That effectively turns 60hz into 30hz.

>G-Sync
nvidia's proprietary method using both the GPU and a specialized chip in the monitor to synchronize the monitor's refresh rate with the GPU's framerate, i.e., if your GPU is outputting 40fps, your monitor runs at 40hz (or more likely a multiple of that).

>FreeSync
AMD's version of G-Sync, basically. Unlike G-Sync though, it's cheaper to implement and AMD doesn't charge licensing fees for it.

>Adaptive Sync
The VESA standard that FreeSync is built on; basically the same idea, but vendor-neutral and part of the DisplayPort spec.

>VRR
Variable Refresh Rate, generic term for any of the above (excluding vsync).
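
The rundown above can be sketched numerically. A toy simulation (heavily simplified: it ignores double/triple buffering, and the function names are made up for illustration) of why vsync can halve your framerate while VRR just follows the GPU:

```python
# Hypothetical sketch, not any vendor's API -- just the frame-pacing arithmetic.

def effective_fps_vsync(render_fps: float, refresh_hz: float = 60.0) -> float:
    """With vsync, a frame is only shown on a refresh tick, so when the GPU
    misses a tick the displayed rate snaps down to the next integer divisor
    of the refresh rate (60 -> 30 -> 20 -> 15 ...)."""
    divisor = 1
    while refresh_hz / divisor > render_fps:
        divisor += 1
    return refresh_hz / divisor

def effective_fps_vrr(render_fps: float, vrr_min: float, vrr_max: float) -> float:
    """With VRR, the panel refreshes whenever a frame arrives, so inside the
    supported range the displayed rate simply equals the render rate."""
    return min(max(render_fps, vrr_min), vrr_max)

print(effective_fps_vsync(59))           # 30.0 -- missing one tick halves the rate
print(effective_fps_vrr(59, 48, 144))    # 59   -- the panel just follows the GPU
```

So a GPU averaging 59fps shows a juddery 30fps under vsync, but a smooth 59fps at 59hz under VRR, which is the whole selling point.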

no reason at all to buy amd now. 2 of those are < $200 usd.

Oh man, there are 32 inch samsung 4k TVs out that do freesync that are like 350 bux.

If it works with my 1080 that would be sweet.

Should say adaptive sync on the TVs, not freesync; this user informed me.

>implying this isn't a PR stunt to shame FreeSync on the whole.

>shame the preferred industry standard
???????????????????????????????????

unironically hope that it does the opposite and forces monitor manufacturers into better quality control
Also hoping there won't be the shitty Freesync flickering that gets a lot of complaints

All G-Sync monitors have the level of quality control you are craving.

If they get it to work on my monitor then its HUGE. Unironically was going to blow money on a new gsync monitor.

Probably because goysync monitor sales aren't what they hoped. nvidia will likely still charge for the certification alone. Hopefully oems aren't forced to block freesync compatible graphics cards from running on them.

not the auo optronics panels

>only one 1440p@144hz
>TN
Why even live?

those are just the ones that are confirmed, if you read the article you can force the VRR to work on your monitor

Nice, they finally (kinda) did what they were supposed to do from the get go. This will be good for Nvidia users. Though they are still cunts in my book.

>someone animated this gif
god i love the internet

itll be good for AMD users as well, since itll force monitor manufacturers to have better quality control for their freesync panels

The money doesn't add up. From time to time even Jews have to cater to the goyim.

1440p@75hz here
Pray to the panel gods user for they will surely delivar...

that would cause a backlash the size of yellowstone if they did that shit, i couldnt see them doing that in a million years
that would be some EA style stock tanking

>ROG Strix XG35VQ
I got this a year ago, how fucked am I?

Attached: 1544946885830.jpg (1500x1024, 119K)

who knows, it might work, it might not, id imagine it will though, even if it doesnt get their certification

>400 tested | 12 passed
so are most of us fucked?

No, you just toggle it in CP and pray it works.

no, those are just the ones that will work off the bat when the new drivers are installed, you can still manually force VRR to turn on via a new nvidia control panel option (should work with most freesync monitors since they all use VESA adaptive sync tech)

For one, I can't. Not only do I operate from a TV, but I also have a 970.

Secondly, if OP's image is anything to go by, only 3% of the panels tested experienced no issues. I won't say those 400 monitors are the whole market, but they're a hefty share of it.

And as an aside, this is interesting on a second front, as NVIDIA opted to push HDMI 2.0b instead of the newer 2.1, which includes VRR by default along with a slew of other features. I wonder if this is an indication of them paying for that mistake.

From Nvidia:

>G-SYNC Compatible testing validates that the monitor does not show blanking, pulsing, flickering, ghosting or other artifacts during VRR gaming. They also validate that the monitor can operate in VRR at any game frame rate by supporting a VRR range of at least 2.4:1 (e.g. 60Hz-144Hz), and offer the gamer a seamless experience by enabling VRR by default.

>We will test monitors that deliver a baseline VRR experience on GeForce GTX 10-Series and GeForce RTX 20-Series graphics cards, and activate their VRR features automatically, enabling GeForce gamers to find and buy VRR monitors that will improve their gaming experience.

>For VRR monitors yet to be validated as G-SYNC Compatible, a new NVIDIA Control Panel option will enable owners to try and switch the tech on - it may work, it may work partly, or it may not work at all.

Hopefully further driver updates will add support with more monitors (i believe they did say they will keep supporting it via driver updates)

WHY DOES NVIDIA GET TO USE GSYNC AND FREESYNC ITS NOT FAIR BROS

Welp, there goes any reason to buy Navi.

>charging others for a sticker to show they're worthy of you
Are they ever going to get off their high horse, or is it a lost cause at this point?

>the 2019 setup will be AMD cpus and Nvidia GPUs
im ok with it

You do realize no one is against this right? This is what they should have done from the start.

>falling for Nvidia's marketing bullshit.
It will work on any screen, with varying quality.
Take a look at en.wikipedia.org/wiki/FreeSync for more monitor info.

Their marketing spin is:
G-Sync Compatible aka Free-Sync for the poor Plebs.
G-Sync aka just like Free-Sync but you get NoVideo's approved label cause of backdoor deals with monitor producers. For the Burghers.
G-Sync Ultimate for the true NoVideo experience. The Patricians choice.

Price-performance.

You still need a 2K USD monitor for ULTIMATE G-sync

>You do realize no one is against this right?
You really think so? Do you?

Forgot pic

Attached: 1544850351569.jpg (1349x720, 275K)

Yes? They finally got their head out of their ass and stopped cucking consumers. How is that a bad thing for anybody?

Well shit, I own a 2080, what fuckin monitor should I buy now that the playing field has changed?

Attached: 4rewfd.jpg (1086x992, 73K)

The next version will be my new monitor. It'll fully support the newest dp for full 4:4:4 4k hdr with more hdr zones.

This is the only monitor that looks genuinely like flagship oled 4k tvs. It's incredible. It's so ahead of other regular $600 "flagship" panels it isn't even funny.

So navi is THAT good huh?

>This is the only monitor that looks genuinely like flagship oled 4k tvs
It can't look like OLEDs because it's LCD.
No.

Pick one from the list in OP's picture.

Clearly it is since nVidia is pulling out a bunch of stops only 2 days before its release.

those arent the only ones that will work though, id personally wait until the community does all the work and finds out which monitors work with freesync and which dont

Navi isn't released until Computex.

Qled is comparable to oled, but I don't expect Jow Forums poorfags to follow bleeding edge $3000 tvs on avsforums.

haha lol, good one Tim.
Damage control some more :)

>Qled is comparable to oled
Not even close, it's still LCD versus OLED.

ok.

Who?
And it isn't released until Computex.
That's the usual AMD dGPU launch window anyway.

Fuck off reddit

we'll ow its goen in a few days timmo.

What.

so, are they updating drivers for all nvidia cards to work with free sync, including 9 and 10 series?

Attached: t34erg.png (640x640, 111K)

only turing and pascal confirmed so far, but im sure the driver updates will eventually let it work for 900 series, too many people still own 970s and 980s

Why wouldn't you just get the Nitro XV273K seeing as it's the same panel and just uses freesync instead of g-sync at half the price?

Probably yes.

You would, that's the point.

you did it user, you cracked the puzzle :^)

>4k monitor
FUCK
PLEASEEEE FUCKING LET THE Nitro XV272U P WORK WITH THIS SHIT AS WELL

Because you'll only get the absolute barebone features and won't be able to take advantage of vrb/ulmb.

>vrb/ulmb.
whats that

There was research done by TestUFO concluding that 144hz strobing is equivalent to around 500hz of regular refresh.

idk what that means
sounds good though

they tested 400 monitors and only 12 of them were found to perform adequately with nvidia hardware and whatever driver hack they have done to enable adaptive sync

do not buy into nvidia marketing bullshit, they want you to buy more gsync chips. it's an open standard, if they didn't fuck it up intentionally it will work on everything.

truth is probably closer to
>nvidia offers manufacturers of 400 monitors a "certification"
>companies behind 12 paid for the marketing

I guess the real "news" here is that nvidia is admitting and making it clear that g-sync was a big scam and that special nvidia chip inside g-sync monitors, which is what makes them so overly expensive, isn't needed. that was obvious from the fact that freesync monitors do the same job without that chip and price-premium.

overall it's great news for those who have nvidia cards and happen to have a freesync display. and that's not unlikely these days, my 4k display has freesync and that's not because I looked for it specifically, the one I wanted happened to have it. it's pretty common.

>2.4:1 (e.g. 60Hz-144Hz)
This disqualifies a lot of Freesync monitors. There's plenty maxing out at 75Hz or 60Hz.

Dumbest thing I read today
It flashes the image instead of holding it until the next one, to reduce motion blur

If this fucking works it's going to be awesome. I refused to buy into the GSYNC bullshit and just got 3 non-Gsync monitors to put together. If this is even partially true I'll be thrilled.

Are there any Freesync monitors that will do 30-72 Hz? Because that would technically also be 2.4:1
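
The 2.4:1 arithmetic is easy to check. A quick sketch (the function name is made up; the 2.4 figure is from Nvidia's quote earlier in the thread):

```python
# Hypothetical helper: does a monitor's VRR range meet Nvidia's stated
# "at least 2.4:1" G-SYNC Compatible requirement?

def meets_gsync_compat_range(min_hz: float, max_hz: float, required: float = 2.4) -> bool:
    """A range like 48-144Hz is expressed as the ratio max/min (here 3.0:1)."""
    return max_hz / min_hz >= required

print(meets_gsync_compat_range(48, 144))  # True  (3.0:1)
print(meets_gsync_compat_range(30, 72))   # True  (exactly 2.4:1)
print(meets_gsync_compat_range(48, 75))   # False (about 1.56:1)
```

So yes, a 30-72Hz panel would qualify on paper, while the common 48-75Hz Freesync ranges fall well short.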

wouldnt the 60hz monitors work properly?

My cheapo Crossover VA does 34-75hz.

It's a good thing for Nvidia that their consumers aren't smart enough to realize how stupid it would be to support Nvidia in this.

Remember those DIY G-sync kits for the Asus VG248QE?

We have come along way guys.