Nvidia Driver Update Will Enable G-Sync on Freesync Monitors

>Literally zero reason to ever buy AMD

This also kills the G-Sync tax you had to pay to get VRR on an Nvidia card.

pcgamer.com/nvidia-brings-g-sync-support-to-freesync-monitors/

Attached: Untitled.png (646x645, 193K)

>>Literally zero reason to ever buy AMD
No, you mean literally no reason to buy a G-Sync monitor.

>Nvidia admitting they've been scamming people with their $150 G-Sync chip when they can support Freesync with just the push of a button
>No reason to buy AMD ever again

12

>Nvidia Driver Update Will Enable G-Sync on 12 Freesync Monitors*
ftfy

Nice spin. Ignore Ngreedia shill threads.

>brings g-sync support to freesync monitors

What retard shit out this headline? They're just adding regular vesa adaptive sync support to some of their cards and branding it as g-sync.

I know reading is hard, but try making it to the end of an article before racing to comment.

>And it actually won't be limited just to FreeSync monitors that Nvidia says pass certification. Those monitors will simply have the feature turned on by default.

>"For gamers who have monitors that we have not yet tested, or that have failed validation, we’ll give you an option to manually enable [G-Sync], too," says Nvidia's press release.

My apologies then. I listened to Jensen's entire presentation last night and didn't hear a peep about any monitors beyond those 12. I figured if he didn't mention it they weren't doing it.

There's no reason to buy Nvidia, so this is only relevant for those who were scammed by the green jew.

Anybody who ever buys Nvidia again is a massive piece of shit.
I paid $600 for a Dell Gsync monitor and it was a massive piece of shit. The sync worked fine, everything else was shit.
PC Gamers are scum. And twitch is scum for feeding mental illness with their celebrity worship profit model.

What is the point of having a feature that doesn't work properly?

It will work properly.

Did Nvidia lie about testing 400 monitors?
The list of 12 is ordered alphabetically

So why would a monitor fail validation?

I fear that the list of monitors that actually get G-Sync will be really small. I lost all faith in Nvidia after that RTX bullshit.

so wait, people with Freesync monitors can use NVIDIA cards, but people with G-Sync monitors can't use AMD? lmaooo nvidia fucking their customers

It's actually a $500 chip with an entire separate SoC and like 3GB of DDR4

This is great news for some of us. I can finally see what my freesync actually does.

>MG278Q passes
>MG279 doesn't

Attached: 1373938059143.gif (369x206, 1.98M)

excuse me what the fuck
I was told that G-Sync monitors had special hardware that had to be certified and that's what made it so good and expensive
did Nvidia bamboozle me?

this

They're renaming normal g-sync to g-sync ultimate or some shit

Fucking BASED. I have absolutely no reason to ever buy AMD now. A couple of friends and I were all going to upgrade our GPUs at the same time, and we were seriously considering AMD because most of us already have 144Hz monitors (which also happen to have Freesync support), but with this we either don't have to upgrade or we can just go all out on a 2080 for 1440p 144Hz. This must be a kick in the teeth for AMD. The one major selling point they had where Nvidia was completely uncompetitive has now been taken away from them. People were already not buying AMD despite loads of sales and game bundles, so with Nvidia making this move, even fewer people will buy AMD now.

>Literally zero reason to ever buy AMD

Except for better value and not funding anti-consumer horseshit from Nvidia.

Only for the 4K 144Hz HDR version. Lower resolutions and color ranges come with cheaper (but still expensive) hardware

It has special hardware that takes care of something that every AMD card since Bonaire and every Nvidia chip after Maxwell can do on its own

You can use every adaptive sync monitor, even those without the label. They'll be failing certification for stuff like too low of an adaptive sync range or fucked overdrive settings, I suppose

Don't you have meetings to get to Jensen

This but fucking unironically. Novideo finally gave up and is instead trying to recapture marketshare. Don't fall for these scummy tricks.

>G-Sync
>5-144Hz, the whole range is adaptive

>FreeSync
>10, 20, 13, 18, 25, 9, whatever, because the ranges the adaptive sync works on are random between manufacturers and panels, and FreeSync stops working in 90% of cases when the frame rate is low enough that it's needed most

Wait what? I watched that conference last night. I didn't hear anything about them supporting FreeSync. They just referred to G-Sync as Adaptive Sync the entire conference and said they would now be testing every single monitor that carries G-Sync to certify it as not suffering from 2 common issues.

Better value? The same way a free flat tire is better value than a new one?

If I wanted 2 generation old performance then I'd buy a 700 series card.

you can have both, my dude.

NOOOO NOOOO
THE G-SYNC TAX WAS GOOD FOR US
IT WAS OUR TURN BROS

Attached: 1542461983683.png (653x726, 42K)

today is a good day Jow Forums, first Intel achieves 10nm and now Nvidia dominance!

...or you can buy nvidia and not a gsync monitor.

Novidia had the ability to do this at literally any time. They must be hurting and/or afraid of what AMD is bringing if they chose to pull this out now.

Yeah, increased market monopolization is fucking awesome. What a victory.

Yeah, better value. AMD costs less, has better performance, more features that aren't artificially disabled, no planned obsolescence, no drivers that purposely cripple older cards, no paying devs to use software to purposely cripple older cards and make games look worse. When you buy Nvidia you are funding all this G-Sync, GameWorks, etc. horseshit. Or you can buy AMD: better value and a clear conscience.

t. team loser

They do. Just google the adaptive sync ranges. It doesn't work at every frame rate; there is a floor and a ceiling of refresh rates where it can activate, and it's different between panels. At first FreeSync was awful: there were monitors that deactivated it if you went below 30 fps (when it would be needed most).

FreeSync has drastically improved, the best panels are like 15-140 now, but g-sync still has the widest range because of the module offloading some shit onto a standalone processor.

That being said, it still wasn't worth the premium even after FreeSync improved. G-Sync was adding like $300 to some monitors, and no amount of adaptive syncing at 5fps is going to make that okay.
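Since a few posts here hinge on what the "range" actually does, here's a minimal Python sketch of the floor/ceiling behaviour plus low framerate compensation (LFC), where the driver repeats frames to keep the panel inside its supported window. All numbers and names are illustrative assumptions, not any vendor's actual driver logic.

```python
# Hypothetical sketch: how a VRR driver might map game fps to panel refresh.
# vrr_min/vrr_max are the panel's advertised adaptive sync window.

def effective_refresh(fps: float, vrr_min: float, vrr_max: float):
    """Return (panel_refresh_hz, times_each_frame_is_shown)."""
    if fps > vrr_max:
        # Above the ceiling: panel caps out at its max refresh.
        return vrr_max, 1
    if fps >= vrr_min:
        # Inside the window: refresh tracks the frame rate 1:1.
        return fps, 1
    # Below the floor: LFC repeats each frame until the effective rate
    # lands back inside the window. This only works when the window is
    # wide enough (the usual rule of thumb: vrr_max >= 2 * vrr_min).
    if vrr_max < 2 * vrr_min:
        return vrr_min, 1  # no LFC possible: sync effectively stops here
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    return fps * multiplier, multiplier

print(effective_refresh(20, 48, 144))  # (60, 3): 20fps shown 3x at 60Hz
print(effective_refresh(20, 40, 75))   # (40, 1): window too narrow for LFC
```

This is why ranges like 40-75Hz or 90-144Hz are worse in practice than they look: the window is too narrow for frame repetition to rescue low frame rates.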

I'll still buy AMD because I'm not a dumb goy

no they don't care about amd, and if they did they would have done this in 2015. they're doing this to accommodate the tv market, where most tv manufacturers just announced VRR 4k 120hz support. nvidia are also trying to market their own brand of 65" big format gaming displays. if nvidia cared about what amd was doing, or were even scared, they wouldn't have sold the 2060 at $350. them doing that means they probably know amd is going to be a letdown.

Not random, but what the monitor manufacturers choose. Also clearly stated in the specs. But for people barely able to read, it's probably not feasible to spend 5 minutes informing themselves before buying something

No you're just a dumb fuck who spends money to match the performance I had in 2015

one gsync monitor costs as much as an amd gpu and a freesync monitor together

cope

good thing you can use nvidia with freesync now then.

How about wanting high end?

OP finna got dabbed on

you still buy amd because you're a literal fanboy. there hasn't been any reason to buy amd in ages and there are even fewer reasons now, especially considering nvidia is pushing the envelope hard with new tech, from ray tracing to DLSS to VRS etc. ray tracing might be a meme for now, but DLSS and VRS are brilliant technologies and they will benefit the consumer. DLSS will bring 4k gaming to the masses and you don't even need a 4k60-capable video card.

nvidia has given up; freesync has pretty much won at this point

>did Nvidia bamboozle me?
AMD made adaptive sync available right after Nvidia announced G-Sync and you still bought it. I think you bamboozled yourself.

Nvidia has no drivers

If you want "high end", you support the competitor. Or you get Sandy bridge rehashes. AMD still brings better drivers for both Windows and Linux, and you don't need to make a fucking account to use their own recording software. Buying Nvidia just supports a monopoly built on jewish practices so their gimps can work on more titles.

Fuck Nvidia and fuck Nvidia "people"

nvidia is officially a partner of OBS now so you don't even need to use shadowplay since they'll probably integrate all the shadowplay features such as the overlay into OBS.

nvidia.com/en-us/geforce/news/geforce-rtx-streaming/

It doesn't support at minimum a 40-144Hz range or have ULMB support.

Basically, Freesync monitors that wouldn't fit G-Sync 1.0 specs.

I'm no fanboy and I don't think any RTX card is a good value. But AMD hasn't released anything that competes with the 1080 Ti, which is head-up-ass priced new but can be had for a more acceptable price on eBay.

I buy whatever meets my requirements for a better price. A year and a half ago when I was shopping for a new PC, that was Nvidia. Now I'm looking forward to seeing what Navi can do, though I have no reason to replace my 1080 Ti.

I want a 1440p@144hz monitor...

Out of the 12 which one is the best?

>AMD still brings better drivers for both Windows and Linux
HAHAHAHAHAHAHAHAHAHA

Skinned overclocking utilities, still worse than 3rd party tools, are not "drivers". And Nvidia's hardware encoder has wider support everywhere.

embrace. extend. extinguish.

cause there are a lot of shit freesync monitors around. i agree with nvidia on this

How? They're accepting an industry standard and extinguishing their own proprietary overpriced one.

Jensen is under massive heat from shareholders for misleading them throughout 2018. RTX had lackluster uptake, and they are still stuck with a significant surplus of Pascal stock from the cryptocurrency bust. AMD RTG is about to unleash Navi, which will probably put a hurt on the mid-range and value market.
They did this to stop people on the fence in the whole G-Sync 1.0/Freesync debate from waiting till Navi, and to have them buy existing Pascal stock now that it will support the VESA VRR spec with a driver update, while removing AMD RTG's only advantage in the performance GPU market.

>Buying Nvidia just supports a monopoly built on jewish practices so their gimps can work on more titles.

>nvidia invests shitloads of time inventing GPGPU, and overhauls a hardware specific physics simulator to run on their GPUs while working on it year over year until it's the most complete and highest performing general purpose physics simulator
>REEEE YOU HAVE TO SHARE THAT AND REWRITE IT FOR AN INFERIOR CLONE WE MADE

Fuck off commie

most expensive one

>1070tis, 1080s, and 1080tis have been running low on supply, slower restocks, some may not be restocked, resulting in highway-robbery prices by retailers
>they overproduced pascal!

Imagine actually being this retarded.

no full 40-144Hz range. the mg279, i think, is only 40-75Hz.
the mg278q is the only 1440p (27") on that list

Better skinned "overclocking utilities" than forced telemetry utilities

Asus MG278Q

nvidia is the clone. you fell for the marketing.

When that first became a thing, no other scalers on the market could do adaptive sync; right now many freesync panels can't do full range/LFC. Making a single module is worth it in the short term for nvidia as the tech progresses so quickly. Since gsync modules are FPGA-based (fast/cheap prototyping, expensive production), nvidia doesn't have to sink time/resources into developing an ASIC (expensive prototyping, much cheaper production) every time they update the spec. They can rapidly prototype/release new shit, unlike AMD, who relies either on OEM scalers having freesync features implemented or on monitor manufacturers spending the money/time to develop their own proprietary scalers before anything can come to market. As long as Nvidia can make the modules themselves really fucking quickly ('cause FPGAs), the extra cost is arguably just the price you have to pay for the bleeding edge.

For example, variable overdrive is a feature that practically doesn't exist on the freesync side (literally 1 monitor I can think of) but is a standard feature for gsync. Until the OEM scalers with new features start production (or those new features can luckily be implemented on the old hardware), the only way for monitor manufacturers to support a new freesync feature is to dump money into developing their own proprietary scaler. Meanwhile, as long as gsync supports a feature, no extra work needs to be done for monitor manufacturers; it just werks.
tl;dr Nvidia gets the sales for creating a system to expensively mass produce high end scalers with less lead time than any freesync scaler with comparable performance/features
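To make the variable overdrive point concrete: overdrive deliberately overshoots the target pixel value to speed up transitions, and with VRR the right amount of overshoot changes every frame because the refresh interval does. A hedged Python sketch, with a made-up tuning table (real scalers tune per panel and per transition):

```python
# Illustrative only: overdrive gain tuned per refresh rate (Hz -> gain).
OD_TABLE = [(48, 0.15), (60, 0.25), (100, 0.40), (144, 0.55)]

def overdrive_gain(refresh_hz: float) -> float:
    """Linearly interpolate an overdrive gain for the current refresh rate."""
    if refresh_hz <= OD_TABLE[0][0]:
        return OD_TABLE[0][1]
    for (hz0, g0), (hz1, g1) in zip(OD_TABLE, OD_TABLE[1:]):
        if refresh_hz <= hz1:
            t = (refresh_hz - hz0) / (hz1 - hz0)
            return g0 + t * (g1 - g0)
    return OD_TABLE[-1][1]

def driven_level(current: float, target: float, refresh_hz: float) -> float:
    """Overshoot the target pixel value in proportion to the step size."""
    return target + overdrive_gain(refresh_hz) * (target - current)

# At a fixed 144Hz one baked-in gain works; under VRR it has to be
# recomputed every frame, which is the part cheap scalers skip.
print(driven_level(0.2, 0.8, 90.0))
```

A fixed-refresh monitor can bake one tuning in; doing this per frame across a whole VRR window is the kind of thing the FPGA module made easy.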

literally zero reason to buy overpriced Gsync monitors.

thanks based and redpilled nvidia

Attached: 35077633_635403400144196_4358905323084316672_n.jpg (451x406, 53K)

Does this move benefit Nvidia or AMD more?

>Literally zero reason to ever buy AMD
if you buy Nvidia in the sub-$300 price bracket right now, you're an idiot

>CUDA in developers' hands for a full year before openCL was even demoed
>two years before openCL was available to devs
>M-MARKETING

The only marketing here is your shilling. The reason CUDA won is that by the time openCL was widely available, academia had already gone all in on CUDA

This move specifically benefits Nvidia, because it's better than going down with the G-Sync ship, but overall it means AMD won with Freesync. Nvidia is pretty much surrendering instead of digging their G-Sync grave deeper.

It's a kick in the teeth for AMD because Nvidia are basically thriving off the effort AMD put into their Freesync certification and getting all the monitor manufacturers on board. Even shitty chink brands support the Freesync branding. All this will do is make the people who were legitimately considering AMD because of Freesync support stick with Nvidia, as they always do, but buy the AMD-labeled monitors instead.

>only 12/400 monitors passed even though this is a desperate nvidia PR stunt

lmao DOA. for those saying you can use it on monitors without certification (remember, they already tested 400): it's going to have a shitload of problems or not work at all, like the article says.

Freesync is AMD's implementation of VRR. As Nvidia indicated with their testing, Freesync has looser requirements than G-Sync.

The article did not mention any problems for the monitors failing the test. If it works with an AMD card, it should work the same with nvidiot. Adaptive sync is an open standard after all.

cuda wasn't the first, and opencl is objectively better than cuda and more widely used.

Trying to hijack the Freesync name and rename it G-Sync. Gotta admire Nvidia for their ballsy nature.

>For VRR monitors yet to be validated as G-SYNC Compatible, a new NVIDIA Control Panel option will enable owners to try and switch the tech on - it may work, it may work partly, or it may not work at all.
>it may work, it may work partly, or it may not work at all.

it's going to be a harsh lottery, and again this is a big PR stunt for nvidia. if it were possible they would certify as many monitors as they could to please fans, but they couldn't: only 12/400 lmao.

CUDA is a perfect example of Nvidia proprietary bullshit slowing down the industry.

Feels good buying AMD GPUs and knowing my money's not spent on developing anti-consumer software to come back and fuck me, or on advertising and shills.

Wish there was just one other competitor to AMD that's not a completely soulless corporation ready to screw over its own user base for every dime.

The upcoming Razer one. Finally a new fucking panel.

Could someone educate me on this?
Ok so there is Freesync, G-sync and then the upcoming VRR in HDMI 2.1.
Is the VRR on HDMI just Freesync, or will GPUs need added support for HDMI 2.1 VRR?

no they won't. they'll work just fine, identical to an amd card running on the same monitor. the certification mainly guarantees that there is a wide frequency range: if you look at all their certified monitors, they all have at most a 48Hz lower bound and they all have to reach 144Hz at the upper end. my monitor is just as good as their certified ones from a quality standpoint (there aren't many reports of flickering when using freesync), but it still won't ever be certified because it only has a 90-144Hz range. other monitors like the mg279q, which is pretty much identical to the pg279q gsync, won't make it either because it only has a 30-90Hz range. what this certification is, is just nvidia doing the work for retarded casuals who want a "gsync" monitor but never do the research. people expect high quality from gsync based on the higher-end models, and if someone were to buy that mg279q and the "gsync" stops working at 90Hz and they get tearing, they'll blame nvidia or the gpu or something.

tl;dr: if a monitor works fine running freesync on an amd card, it'll run just as fine on an nvidia card.
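For what it's worth, the pass/fail rule that post describes is simple enough to state in code. A hedged sketch; the ranges below are the ones claimed in this thread, not Nvidia's actual test data:

```python
# Assumed rule from the post above: lower VRR bound <= 48Hz, upper >= 144Hz.
# Monitor ranges are taken from claims in this thread and may be wrong.
MONITORS = {
    "Asus MG278Q": (40, 144),
    "anon's 90-144Hz panel": (90, 144),
    "Asus MG279Q (range as claimed above)": (30, 90),
}

def passes_certification(vrr_min: int, vrr_max: int) -> bool:
    return vrr_min <= 48 and vrr_max >= 144

for name, (lo, hi) in MONITORS.items():
    print(f"{name}: {'pass' if passes_certification(lo, hi) else 'fail'}")
```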

>literally wishful thinking

see
and the fact they can only put their name on 3% of the monitors they tested. they are gunning for freesync/AMD but can only manage to adequately replace 3% LOL what a failshow

>damage controlling this hard

i literally just explained to you why that is, and judging by your post you're praying that it doesn't work, because you know the only reason you had to defend your amd purchase was freesync and now that's been robbed from you by nvidia. VRR works the same from amd freesync to xbox VRR on tvs to now nvidia gsync compatible. there will be no difference and you're clinically retarded to think otherwise.

1080s and 1070s have been discontinued, kiddo. It is mostly 1070Tis, 1060s and 1050s that are collecting dust on shelves. There are plenty of 1070Tis in NA and EU markets.

>it theoretically should work
but it doesn't.

>let me ignore the actual info given by nvidia
lmao fuck off with your wishful thinking disinfo idiot

>Adaptive sync is an open standard after all.
Adaptive sync is just a standard for the source device to influence the refresh of the panel. That's all it is. What you're able to do with that depends on the monitor. Obviously freesync isn't open so there's no point in this conjecture.

they literally said every monitor ever made which supports VRR will be compatible you retard. you're the one who's ignoring what they said.

>However, with a wide variety of VRR ranges and in some cases a narrow VRR operating range the VRR feature may only work when the game framerate is in a narrow, very specific range. Which is often not the case, as game frame rates vary significantly from moment to moment.

>In addition, not all monitors go through a formal certification process, display panel quality varies, and there may be other issues that prevent gamers from receiving a noticeably-improved experience.

there's your whole explanation of why they're doing the certification. it's not that it won't work, it's judged on how good the implementation is by the monitor manufacturer. see

what part of

>it may work, it may work partly, or it may not work at all.

can you not understand?

what part of

>only 12 out of 400 tested weren't complete shit

do you not understand either?

>Nvidia slap your face everyday
>Finally one day Nvidia stop

Drone, hey Nvidia is good guy all along

Nvidia cuck everyone

Attached: 1272481949947.gif (151x136, 4K)

what are the odds that out of 400 monitors tested, 11 of the 12 that passed start with an A?
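For the curious, here's the arithmetic behind that rhetorical question, assuming (purely for illustration, the real count is unknown) that 60 of the 400 tested models have names starting with "A":

```python
from math import comb

def p_at_least(k: int, passes: int = 12, a_names: int = 60, total: int = 400) -> float:
    """P(at least k of the passing monitors start with 'A') if passing were
    unrelated to the name: a hypergeometric tail. a_names is an assumption."""
    return sum(
        comb(a_names, i) * comb(total - a_names, passes - i)
        for i in range(k, passes + 1)
    ) / comb(total, passes)

print(p_at_least(11))  # on the order of 1e-8: effectively impossible by chance
```

i.e. the list reads like the first slice of an alphabetical test queue, not a finished validation run.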

what part of
>However, with a wide variety of VRR ranges and in some cases a narrow VRR operating range the VRR feature may only work when the game framerate is in a narrow, very specific range. Which is often not the case, as game frame rates vary significantly from moment to moment.

>In addition, not all monitors go through a formal certification process, display panel quality varies, and there may be other issues that prevent gamers from receiving a noticeably-improved experience.

do you not understand? if they're complete shit, that's because of the MONITOR, not because of any nvidia-related thing. if it doesn't work on an nvidia gpu it won't work on an amd gpu either, you moron. the implementation is completely in the hands of the monitor manufacturer. there's no variance between amd and nvidia working on any particular monitor. if one works the other will, and vice versa.

>sells the best gpus for 3 generations in a row
I refuse to buy amd housefire gpus. No one forced anyone to buy memesync