Has nvidia shot themselves in the foot by not allowing freesync to work with their cards?

I don't think they should get rid of G-Sync, but they could obviously do both, and 90+% of adaptive sync displays are FreeSync.

I'm basically required to switch to an AMD card now when I next upgrade, just because I don't want to replace my expensive monitor.

Attached: 1526493681877.jpg (640x353, 38K)

Freesync sometimes works on Nvidia hardware, but Nvidia is always quick to "fix" it. This is one of the biggest reasons why you should go AMD.

what they make off of the nvidia tax for those displays is greater than what they lose from you not buying the next card
people that look for value in what they buy are not who nvidia is hoping to have as customers

it's a vanity brand, they use vanity brand tactics
you are apparently now poor, extremely extremely poor
what, couldn't afford it?

They can do whatever they want because there's not a proper AMD card you can buy.

>vanity brand
come now, Radeon hasn't been good for high end gayming since 2011

Has the frame loss/stutter gotten better with amd yet?

AMD has always benched well on paper, but real-world performance is rarely up to par. Frame loss in the last two cards I tested about 3 years ago was ridiculous.

No excuse for gsync, though, and considering they aren't sharing numbers, they're gonna change something in a couple years at the latest.

>frame loss
What?

It's a damn shame, but they're probably clearing a cool $100 off every Gsync monitor, so I highly doubt they'll stop.

I'll switch to pajeetsync when amd have an ulmb implementation.

you can't even use ULMB and gsync at the same time

GSync was always priced for the high-end segment and NVIDIA pretty much has free rein there at this point. I wouldn't expect anything to change in terms of GSync unless AMD becomes competitive again and pushes their shit in to force them to make their products more appealing in ways other than just having higher performance.

That's a pretty tricky problem because of how brightness is perceived. If the backlight isn't flashing, perceived brightness isn't tied to refresh rate, since there's always an image on screen. If you're strobing, however, the monitor only emits light for a brief moment as it displays each frame. Now if you change the refresh rate, you'd have to constantly adjust how bright each flash is to keep the same perceived average brightness, and that isn't even the only problem.

Say you're running ULMB at 120Hz. If you added G-Sync on top of that and the game's FPS were low, you'd get visible flicker because the time between flashes grows too long, kind of like a CRT running at a low refresh rate. It doesn't really work; ULMB is pretty much fundamentally incompatible with variable refresh rate as it works today.
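
To put rough numbers on the brightness part: with strobing, what you perceive is roughly each flash's brightness times the duty cycle, so as the flash rate drops the panel would have to boost every flash just to look equally bright. A minimal sketch, assuming a 1 ms strobe pulse and 400 nits of peak brightness at 120Hz (both numbers are made up for illustration, not from any ULMB spec):

```python
# Rough sketch of why ULMB-style strobing fights variable refresh rate.
# All numbers are illustrative assumptions, not from any NVIDIA/VESA spec.

PULSE_WIDTH_MS = 1.0          # assumed backlight flash length per frame
PEAK_NITS_AT_120HZ = 400.0    # assumed peak brightness while strobing
FLICKER_THRESHOLD_HZ = 70.0   # rough point where many people notice flicker

def average_brightness(peak_nits: float, refresh_hz: float,
                       pulse_ms: float = PULSE_WIDTH_MS) -> float:
    """Perceived (average) luminance of a strobed backlight = peak * duty cycle."""
    duty_cycle = (pulse_ms / 1000.0) * refresh_hz
    return peak_nits * duty_cycle

# Calibrate the perceived brightness at a fixed 120Hz ULMB setting.
target = average_brightness(PEAK_NITS_AT_120HZ, 120.0)

for fps in (120, 90, 60, 45):
    # With VRR on top, flashes follow the game's frame rate, so each flash
    # must get brighter to keep the same perceived average brightness.
    required_peak = target / ((PULSE_WIDTH_MS / 1000.0) * fps)
    flicker = " and the strobe visibly flickers" if fps < FLICKER_THRESHOLD_HZ else ""
    print(f"{fps:>3} fps: each flash needs ~{required_peak:.0f} nits{flicker}")
```

And even if the backlight had that much headroom, the flicker problem still kicks in once the flash rate drops below roughly 60-75Hz; the exact threshold varies from person to person.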

>everyone does HIGH end gaming.
someone needs to look at steam hardware charts

Attached: 1533863831657.jpg (500x667, 61K)

>smelt you buyin' shit like i wouldn't tax it

G-Sync was Nvidia's attempt to cash in on adaptive sync before the VESA Adaptive-Sync spec was finalized with DisplayPort 1.2a and newer.

They copied most of it and added middleware to make it work on Kepler, since Kepler predates DisplayPort 1.2a.

Nvidia is trying to use G-Sync as a means to stand up a vertical monopoly. The massive flaw in that strategy is that they don't own or make their own monitors. Monitor vendors don't even like dealing with Nvidia, which is why G-Sync-capable SKUs are completely dwarfed and overshadowed by units that support the VESA Adaptive-Sync spec.

Nvidia has even used VESA adaptive sync on its mobile GPUs since Maxwell. They could easily enable it for their desktop GPUs with a driver update.

>They could easily enable it for their desktop GPUs with a driver update.
this is what really pissed me off.
I'm an nvidia user and I've been completely cucked by this decision.

Oh, you mean the one where nVidia dominates the GPU charts with 1000 series and 900 series cards?

Attached: Untitled.jpg (1954x1448, 379K)

>he still buys nvidishit in 2019
consider seppuku

G-sync was out there first, and a more complete solution.

These days there are a ton of options when it comes to compatible monitors, and I'm not going to need to upgrade any time soon, so the Viewsonic XG2703-GS I got this year was a bargain.

I'm not concerned it won't work with AMD hardware, since my Geforce 1070 does a great job at anything I throw at it, and I can't see AMD releasing anything any time soon that would make me wanna upgrade.

>amd GPU
>an upgrade
I feel sorry for you for thinking that. Can amd GPUs even do anything other than play gaymes?

There are two real issues with AMD that caused this; firstly AMD is still stuck on GCN, and while it was great in 2012, the chips have always drawn fucktons of power and are just slow by today's standards.

But more importantly, they got FUCKED by miners prioritising AMD over Nvidia cards, so anybody who wanted a gaming card had to settle for an overpriced 1050 or 1060 instead; AMD got lots of sales but no market penetration among gamers.

Now that the mining market has completely crashed, everybody who wanted a mid range 1080p card from last generation already bought one, and everybody who wants one now can buy polaris half price with miners flooding the used market.

But for everybody buying fancy new 1440p ultramemes and 4k monitors?

Vega simply doesn't have the power necessary to drive these monitors. Everybody who needed cards for these resolutions last gen bought Pascal, because Vega kept getting delayed and then was never available due to mining, while right now AMD has nothing out at all to compete with Turing. They don't even have a codename, let alone rumors, for their high end parts yet, and the only thing anybody knows about Navi is rumors about it replacing Polaris.

If Jensen hadn't fucked up hard buying into the mining craze, we would be seeing Turing directly replacing Pascal instead of Jensen waiting until all the Pascal and GDDR5X stock is gone.

>most popular are 1060 and 1050/Ti
Good job proving his point.

AMDfags consider those high-end GPUs.

1440p Fury and still no reason to upgrade kek

If they support FreeSync, no one will buy G-Sync.

People playing at high framerates on 1440p and 4K monitors are not 10% of the market, combined. Also, I play at 1440p with a Vega 56 and have absolutely no issues, stop talking out of your ass. Vega 56 and the competing 1070/1070 Ti are more than enough for most games at 1440p. Your reasons for why Radeon got fucked don't hold up to scrutiny. Nvidia is the company with constant reports of unsold 1060 stock, while AMD has the same problem to a lesser degree. While I'm not agreeing that Nvidia is a vanity brand, there's certainly merit to that argument. AMD is bad at marketing and made terrible launches with both Polaris and Vega, but the cards weren't bad then and aren't bad now by any stretch. They just lack mindshare. Even when they had the better product (Fury X), Nvidia still outsold them 3 to 1. They need to step up their marketing game.

NVidia stock drops in half

AMD's second vice president quitting in 6 months, after wanting to make a card that would cost $750 to build and still can't compete with a 1080 Ti.

No other competitors.

It's a cluster fuck right now.

I'm playing in 1440p with my trusty old 7970GHz.

I'll wait for the new 1440p 144Hz HDR Freesync2 (with full range) monitor to come out and buy the next gen AMD card.

Freesync is such a game-changer. Pretty much has me stuck buying AMD cards now unless Nvidia ends up supporting it which is doubtful.

I'm not even buying a TV until a good one with freesync is available.

This G-Sync pricing policy by Nvidia is just stupid. Considering I have a mid-range PC, why would I spend 200$ extra on a monitor with a G-Sync chip when I could use that money to buy the next tier of Nvidia graphics card instead? It's a shame, since people with mid-range PCs would benefit the most from it.

if you mean frametime, amd has fixed their shit and are now ahead by easily 20%

I can buy a 144hz 1080p monitor for less than $200, why do i need to bother with either of these technologies?

>1080p
shoo, assfaggots

R9 290/390 is STILL god tier and aged like fine wine while the GTX 700 series aged like milk. The GTX 970 3.5GB debacle also happened. The cards are 5+ years old and still beat the GTX 1060. Literally only retards buy Nvidia

G-Sync is not a more complete solution. It was rushed out to beat the VESA Adaptive-Sync spec. The middleware chip was needed because Kepler was taped out before the DisplayPort 1.2a spec was finalized. The chip does the same job as VESA adaptive sync over DisplayPort 1.2a and newer. It is completely obsolete with newer GPUs.

Nvidia chooses to keep it around because they want a vertical monopoly. They end up fucking over the rest of the market by refusing to adopt the VESA Adaptive-Sync spec on their desktop GPUs. Adaptive sync is the best thing since the move to digital output. It solves so many of the lingering issues with motion and display output.

The only difference between VESA adaptive sync and G-Sync is that Nvidia mandates ULMB and a 60-144Hz range, while both are optional under VESA adaptive sync.

AMD demolishes NVIDIA at the sub $300 price point.
Its Vega cards are also able to compete with NVIDIA's 1000 series.

Actually, what really caused this is that ATI/AMD RTG completely destroyed their goodwill in the performance GPU market by failing to deliver with the HD 2900 XT and then having supply issues with the HD 5xxx family for almost a year.

The 8800 GT and its rebrands outsold the entire HD 2xxx-3xxx family by a factor of 4. HD 4xxx provided a temporary respite. The HD 5xxx supply issues prevented it from outselling the Fermi family before Nvidia got its shit together with Kepler. Tahiti's potential wasn't enough to overcome the damage done to their mindshare. Maxwell was the final nail in the coffin.

AMD doesn't have cuda, so they are only good for low end vidya gaming with features disabled.

>cuda
Stallman was right.

>Stallman was right
Stallman drinks Pepsi.

>sidesteps away from gaymemes
Face it, AMD GPUs are only good for console tier vidya, and people only buy them with the intent to play console tier vidya with features disabled.

Why is any sort of refresh rate syncing solution necessary when the monitor is 100hz or more anyway? Serious question here, I don't really see the point. Somehow it makes 45fps look better?
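
The usual answer is frame pacing rather than raw refresh rate: with vsync on a fixed 100Hz panel, a new frame can only appear on a 10 ms scanout boundary, so at 45fps each frame sits on screen for either 20 ms or 30 ms instead of a steady ~22 ms, and that uneven cadence is the stutter (turn vsync off and you trade it for tearing instead). Adaptive sync refreshes whenever the frame is actually ready. A rough sketch using only the numbers from the question, with idealised timings:

```python
# Frame pacing of a 45 fps game on a fixed 100Hz monitor vs. adaptive sync.
import math

REFRESH_HZ = 100
GAME_FPS = 45

refresh_ms = 1000 / REFRESH_HZ   # 10 ms between scanouts
frame_ms = 1000 / GAME_FPS       # ~22.2 ms between rendered frames

# With vsync, frame n (ready at n * frame_ms) can only appear at the next
# scanout boundary, so its on-screen time is a whole multiple of 10 ms.
display_times = [math.ceil(n * frame_ms / refresh_ms) * refresh_ms
                 for n in range(1, 10)]
holds = [b - a for a, b in zip(display_times, display_times[1:])]

print("fixed 100Hz + vsync, per-frame on-screen time (ms):",
      [round(h) for h in holds])                    # mix of 20s and 30s
print(f"adaptive sync: every frame on screen for ~{frame_ms:.1f} ms")
```

Higher fixed refresh rates shrink those pacing errors, which is why it matters less on a 240Hz panel, but they never disappear unless the frame rate divides the refresh rate evenly.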

Nvidia will cave when HDMI VRR starts to ship on millions of displays

There are AMD cards, just not high end cards. The fact of the matter is, most users probably only need the mid tier or the entry level graphics card to play their games. This can be done with any $200 card.

Well the thing is, if you'd bit the bullet and paid the tax for GoySync, then you would be stuck with NVidia. That's what they want.

I have a Titan Xp for work and the ultrawide monitors we use are freesync, lel (not for gaming so no one cared, but they happened to support freesync)

THEY DID WHAT??

Nvidia is like 90% of the industry and they're way ahead of AMD in performance and efficiency; they have no reason to care whether their customers are pissed that they won't support FreeSync, because it's not a big enough issue to make people buy AMD.

G-sync is such a slap in the face it isn't even funny. It's literally the same tech as freesync, but with a stupidly high price premium. I have a 980Ti as it is and because of Nvidia tricks, I refuse to buy another Nvidia card. I'm just waiting to see if I should get Vega 64 now, or wait for Navi. I'd then get a good 1440p freesync panel

Fuck Nvidia bunch of dirty Jews.
>inB4 lol ur poor
Having money doesn't mean you should spend it stupidly.

Oh sorry I read it wrong and thought they started allowing Freesync. My mistake.

CUDA is going to go the way of GLIDE as soon as AMD RTG or Intel puts out a GPGPU that slaps Nvidia GPGPUs shit hard using OpenCL.

Nvidia is still the better GPU until AMD and Intel catch up.

Yeah! Just look at how the Intel HD4000 is the most popular GPU Ever! Such quality!

Are there even 4K 60Hz+ G-Sync displays on the market yet? Or does Nvidia not care to deliver these to their high end market, since the 2080 Ti can only deliver 1080p-1440p at 60Hz with DSR on their highest end cards?

The problem is that more and more people are going 1440p, 4K, and ultra-wide, and AMD just has the Bulldozer-tier Vega cards for that.

Attached: 1534973234399.png (321x321, 12K)

Intel, believe it or not, might be the first company to really challenge nVidia.

imagine being a smelly nvidia shill and doing it for free

I don't believe it.

I have a GTX 1080, which I plan to use for several years. If AMD doesn't have a suitable upgrade by then and Nvidia still refuses to support FreeSync I will continue to use my 1080 until it dies, and then I'll probably buy something from AMD. The point is, the next GPU I buy absolutely will support FreeSync, and I don't give a shit who makes it as long as I'm not downgrading.

Most people are happy with 1080p60, and a 580 is perfect for that.

Navi 10 will be GTX1080/RTX2070 equivalent for 250-350$

So you save 200$ on the AMD GPU over an RTX 2070, and you save 200$ on a FreeSync monitor with the exact same specs as the G-Sync version, giving you 400$ in savings

Ryzen 3000 series will give you an i7-9900K equivalent processor for 230$ (R5 3600X), plus 120$ for an X570 motherboard.

So 230+120+350+500= 1200$ for a cpu/gpu/motherboard and 1440p, 144hz IPS gaming monitor with freesync

That same combo with G-Sync and Intel/Nvidia would be 700+530+140+500 = 1870$

1870$ - 1200$ = 670$, or close to 700$ in savings for the exact same specced system and an identical IPS 144Hz panel from the same manufacturer with FreeSync instead of G-Sync.

Even if you're an nvidia/intel shill till the day you die, you should be on your fucking knees thanking AMD for this product lineup, because it will force a massive price drop on Nvidia cards and Intel CPUs, meaning you will still have your nvidia/intel rig but you get to save 300-400$ on your builds as well
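
For what it's worth, here's that comparison written out explicitly. Every price is an assumption or rumour quoted above, not a confirmed figure, and splitting the 700$/500$ numbers between monitor and GPU follows the earlier claim that the G-Sync module adds roughly 200$ to an otherwise identical panel:

```python
# The build-cost comparison from the posts above, written out explicitly.
# Every price is an assumption/rumour quoted in the thread, not a confirmed figure.

amd_build = {
    "R5 3600X (rumoured)": 230,
    "X570 motherboard": 120,
    "Navi 10 GPU (rumoured)": 350,
    "1440p 144Hz IPS FreeSync monitor": 500,
}

intel_nvidia_build = {
    "i7-9900K": 530,
    "Intel motherboard (assumed)": 140,
    "RTX 2070": 500,
    "same panel with G-Sync module (+200$)": 700,
}

amd_total = sum(amd_build.values())             # 1200
intel_total = sum(intel_nvidia_build.values())  # 1870
print(f"AMD/FreeSync build:   {amd_total}$")
print(f"Intel/Nvidia/G-Sync:  {intel_total}$")
print(f"Difference:           {intel_total - amd_total}$")
```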

I did a blind test with my 3 roommates and not a single person could tell if G-Sync or FreeSync was turned on or off.
Therefore, they are both marketing bullshit to keep you from changing GPU brands.


>Delusional AMD drone from Reddit.
Drones always think they can get top end performance for $250.
It's always the same delusion over and over.
When will you fuckers learn?

Attached: 1544284559700.gif (655x378, 254K)

G-Sync user here, was worth every dollar, even if it did cost extra.

what does G stand for?

Attached: 1490541342964.png (112x112, 7K)

It's waterlogged insulation.

Goyim

Acer XB271HK and Asus PG27AQ

You mean RTX not DSR