Is G-Sync really worth it?

Attached: 1536525266713.png (340x190, 22K)

...

no

It's nice but at a high enough refresh rate you really won't notice any difference between it and normal vsync.

No. Ask them to adopt Freesync.

It's nice to have but the price premium is ridiculous and I wouldn't buy a GSync monitor unless it's on some crazy sale. It's most effective at lower FPS/refresh rates; at higher refresh rates and FPS, tearing is (subjectively) less of an issue and harder to notice unless you're looking for it.

I managed to get a GSync monitor for like 35% off, that put it slightly below its FreeSync equivalent in price. It works well in practice but as I've said, I would not pay the premium for it.

On shitty unoptimized games, yes

Copied from PCBG

The _sync section could be much clearer.
Gsync helps when your FPS range often falls below the panel's native refresh rate, by eliminating the lag/stutter you'd otherwise get.
If you have a range of FPS that often is well above the native panel refresh rate, then Vsync with limited buffer or just capping your FPS is fine to prevent screen tear.

Some people don't care about stutter or tear = don't get _sync
Some people care a lot about stutter = get _sync
Some people are tired of having to keep up with the latest and greatest HW to get FPS = get _sync
Some people rip at high FPS and care about tearing = just cap your FPS at the native refresh rate and buy a panel that can keep up.

Example 1: most of user's hours are spent playing shooters that get 150+ FPS easily. user should buy a 144Hz+ panel and cap FPS there to prevent tearing. No need to worry about lag/stutter.
Example 2: user plays a mix of games and struggles to get 100 FPS with most of them. user should get a _sync panel at a high refresh rate to enjoy those games without lag/stutter. Future GPUs will push the range higher, but new games will also be more taxing, so a _sync panel will be in service for a while. (See the sketch below.)
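To make that concrete, here's a minimal Python sketch of the same decision logic. It's illustrative only: the function name, thresholds and example numbers are made up, not from the guide.

```python
# Illustrative sketch of the guide above: decide whether adaptive sync is
# likely to matter, given a rough FPS range and the panel's native refresh
# rate. Thresholds and examples are assumptions, not hard rules.

def sync_advice(fps_low: int, fps_high: int, panel_hz: int) -> str:
    if fps_low >= panel_hz:
        # FPS never dips below the refresh rate: capping at the refresh rate
        # already prevents tearing, so _sync adds little.
        return "cap FPS at the refresh rate; _sync adds little"
    if fps_high <= panel_hz:
        # FPS lives below the refresh rate: this is where _sync removes the
        # stutter/lag of a fixed refresh interval.
        return "_sync is worth having"
    return "mixed range: _sync helps in the dips, cap FPS near the top"

# Example 1: shooters at 150+ FPS on a 144Hz panel.
print(sync_advice(150, 300, 144))  # cap FPS at the refresh rate; _sync adds little
# Example 2: a mix of games struggling to hold 100 FPS on a 144Hz panel.
print(sync_advice(45, 100, 144))   # _sync is worth having
```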

I play at 4K on a 580. Getting a solid 60 is not usually possible, but 40+ is usually easy enough to hit, and freesync ensures that it feels perfectly smooth as long as it doesn't fall below 40.

So for my use case, adaptive sync (obviously freesync not gsync in my case) is very important.

No?

Yes.

I'm an old fag. Over the years I've seen a few advances in hardware that have offered real-world differences in the gaming experience. They have been:

- Affordability of hard drives that led to games being run from them and no longer requiring floppy disk loads
- Colour CRTs
- Introduction of affordable Soundblaster16 cards
- Mouse input for games (while not a device advance, this changed the way the PC was considered for immersive gaming in my opinion)
- Pentium class processors (when the P75 was released, things really started getting impressive)
- Daughterboard GPUs (things changed forever once 3Dfx emerged. Quite possibly the most impressive technology leap I've seen)
- SSD storage (this has been the most recent advance that took me back to the impact of 3Dfx cards. SSD reminds me of the advances we got in the old days that really propelled things forward significantly)
- GSync

Gsync to me, as above, reminds me of the emergence of 3Dfx cards and SSD drives. It's a leap, not a gradual improvement that is barely perceptible. However, I feel Gsync is best appreciated by those who have a legitimate frustration with the way display hardware is interfaced with on PCs, particularly with respect to input lag and the overall response/feel/immediacy of inputs. Input latency is the ultimate frustration for me when it comes to all aspects of computing, as I miss the immediacy of old, despite the fact that things were far cruder and simpler then. Nothing feels better to me than using a PC with a clean, snappy response, no matter what application I am using. Gsync helps significantly in this regard, and on a well-tuned and optimised system it is a fine way to give the user a really good feel. Gsync monitors err on the expensive side in many cases, but are gradually coming down in price. I strongly recommend a 144Hz 27" panel as the sweet spot, with a native resolution of 2560x1440, so you can truly drive all applications at full frame rates with zero bottlenecks.

nope, normal vsync will make the tearing go away at the cost of input lag and frame-time bullshit. I've had games where vsync would introduce 2-3 seconds of lag, no idea why, it just did

with free/g-sync, when the frame is ready, it displays it, so it's always paced perfectly. Having tested this shit just to see how big of an impact it had, I got a game down to 18fps and freesync was still engaged, and the game felt perfectly playable. I think it was Battlefield 4. Sure, some of the effects looked low-fps, but there was none of that bad low-fps feel. This is what sold me on sync > fps

as games come out, it's not likely you will play them at 60+ fps with all the shit turned on, even at 1080p. Yeah, remember some game devs really are that shit at making games, so a free/g-sync option ultimately gives a better experience than high fps alone.

granted, if you go with a high-fps monitor, any one worth looking at also has sync options, and here is where you have nvidia's "instead of conforming to the standard, let's use a ~$100 FPGA" (or in the 4K HDR case, a $1000 FPGA) or amd...

I have no idea what you would want to get though, as gsync locks you to nvidia and costs money, but amd isn't exactly competitive on the high end, and may not be till after navi.

personally I would hold off on any monitor or gpu decision till navi at this point; nvidia pretty much released a barely-better-than-current-gen gpu, but amd is going 7nm and that could wildly redefine what we consider performance-to-cost to be.

If we just talk about games, high rpm drives too.
I would also take the re-emergence of mechanical keyboards as another one
programmable shaders
dx9, this is where 3d games started getting good to the point that high-detail 2d sprites, while a nice novelty, were getting surpassed in graphical quality

Then you have a fuckload of tech that never got used / was highly abused, like tessellation: at a time when games could have really used it, no one touched amd's implementation, and once nvidia implemented it, they made a fucking box have 500k polies and made no improvement to graphics with it.

I would also add dual-core cpus as another landmark case, as traditional multi-cpu setups kind of had compatibility nightmares, but just adding a second cpu core... many of those first ones are still very usable today, may not be the best experience, but they are usable.

yes, mainly for the light boost feature.

How old are you OldAnon that you remember color CRTs?
Even my dad, who's getting up there in age, had a color display for his CoCo 1.

>tfw nvidia is the apple of the pc

Attached: 1080fire.png (1280x720, 1.14M)

nope, it doesn't work at all in fullscreen windowed and windowed mode

can it make a non-60-feeling 60 fps game feel like actual 60 fps smoothness? i know what i said doesn't make much sense but there are some games i play locked at 60 fps and it doesn't feel smooth at all

This. It's kind of a deal breaker for me.

if you have the money to shell out on shit like gsync then odds are you can just buy a better gpu instead and get rid of the need for it with high frames

Attached: what.jpg (950x808, 135K)

You can force it in the drivers, but it does cause problems depending on the game. In my case warframe worked fine after alt tabbing, but far cry 5 was a stuttering mess

>can't use in virtual fullscreen or windowed
giant deal breaker

Completely useless 99% of the time.
The better option is to invest in a better GPU instead of paying a premium for that bullshit

>turn on gsync
>get massive performance hit
the absolute state of gsync

Attached: 1622614.png (1120x629, 314K)

what's even worse is that some games like far cry 5 run in borderless fullscreen even if you set them to fullscreen only, so it causes G-Sync to shit itself

spend the tax on a better gpu instead unless you've already ordered a 2080 ti

yes

let me elaborate a bit. The thing that makes sub-60 feel like shit is that you have over 16ms per frame, and because the interval for a monitor's refresh is traditionally set in stone, it will display whatever frame was ready every 16ms, regardless of how old it is. In a bad case you may have 59fps, but because of how it works, instead of 16ms like it is at 60fps, 59 could have up to 32ms of delay between frames.
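Back-of-the-envelope version of that arithmetic (same rough numbers as above, simplified by assuming rendering starts on a refresh tick; not a measurement):

```python
import math

REFRESH_HZ = 60
TICK_MS = 1000 / REFRESH_HZ  # ~16.7ms between refreshes at 60Hz

def shown_after_fixed(frame_time_ms: float) -> float:
    """Fixed refresh + vsync: a finished frame waits for the next whole
    refresh tick before it appears (simplified model)."""
    return math.ceil(frame_time_ms / TICK_MS) * TICK_MS

def shown_after_adaptive(frame_time_ms: float) -> float:
    """Adaptive sync (within the panel's range): the frame is shown as soon
    as it is ready."""
    return frame_time_ms

# 60fps: the frame is ready just inside the 16.7ms tick, so no extra wait.
print(shown_after_fixed(16.6), shown_after_adaptive(16.6))  # ~16.7 vs 16.6
# 59fps: the frame just misses the tick and sits until the next one.
print(shown_after_fixed(16.9), shown_after_adaptive(16.9))  # ~33.3 vs 16.9
```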

freesync: as long as the monitor is within its Hz rating, when a frame is ready, the monitor will show it as fast as it can.
for my test, we had a 30-144Hz monitor that was well known for working under its rated range; ours would work into the 17-18fps range, but we couldn't make the game run shittier if we tried, as minimum clock rates were pushing us to 18-22fps

at 18fps, shit was not smooth, but there was no feeling of lag, the game felt perfectly playable.

you never want to take a game out of the sync zone: too low and it will stop working, too high and it will stop working. What you want to do is set it up so the card never goes above its sync range, regardless of whether it can easily do so.

Just know, sync tech is something you have to experience to fully understand, because on paper it's underwhelming, and even if someone goes through it with you, it sounds underwhelming.

and just to note, here is what I value in monitors

ppi golden zone > contrast > local dimming > sync =/=144 hz > 10bit

I don't game too often, or I should say I don't often play games where the monitor's refresh and how well a frame is delivered matter much, so just on a base level, 144Hz is a bit more desirable to me than sync is. When I did the monitor tests I was playing fps games a bit more often than I do now.

will intel giving freesync to their new dedicated GPUs (in a slot) mean they will patch it into the old integrated gfx drivers? I want to make use of my integrated gfx and am wondering if I should buy some random cheap office cpu that includes the integrated gfx chip, because I imagine a 9900k won't? you can use that win10 passthrough mode thing to give yourself freesync with only 3ms of lag then.

I guess just buying a $0 amd gpu and putting it in another slot might be wiser though, shrugs.

>is gaming on PC worth it?
no

Yes and no. 70% of the time you won't need it for playing old/current games, but when that new game comes out that you've been waiting for and it's an unoptimized piece of shit, you're gonna wish you had it.

144 Hz non G-Sync monitors are cheaper than G-Sync ones, and at 144 Hz, a frame of input lag really isn't going to be very noticeable.

Whoops, disregard that. I have a GPU that can consistently pull more than 60 FPS so it doesn't make a huge difference for me. I can see how G-Sync would improve the experience with a lower-end GPU, but for the $200 extra G-Sync adds to your monitor, you could have bought a nicer GPU.

What would you recommend for a 1080ti?
An Asus MG279Q (IPS, no G-Sync, FreeSync) for 550€, or a Dell S2716DG (TN, G-Sync) for 520€?

Good goys, buy goy sync

>not using freesync with goyvidia gpu

Attached: bane6.png (346x524, 389K)

Get the IPS one if you're willing to pay the extra € (with a 1080ti you definitely are).
Adaptive sync is only worth it if your GPU isn't able to keep up with your monitor.

Yes.

Nigger.

>or just capping your FPS is fine to prevent screen tear.
It somehow works on radeon, but on nvidia GPUs you will get a fixed-position tearing line that never disappears. Not sure if leatherman did it on purpose to promote gsync or not, but I never had this problem with older nvidia gpus or radeons; pascals are doing it though.

Dude, I looooove proprietary interconnects and protocols
Fuck open standards

Have you checked whether the nvidia driver settings are set to override the application setting? Otherwise something else must be broken.

Am I the only one who really hates how lately the word "premium" is used instead of overpriced piece of crap?

Override what? I'm talking about the case where vsync, fastsync and gsync are all disabled

there's no point, g-sync is dead in the water and adaptive sync has been part of the DisplayPort standard since 2014. nvidia will eventually roll over.

what software are you using to cap the frames? I personally use nvidia inspector and never had a problem

Ok, you sold me on g-sync monitors. What are some good ones? Preferably 27", 2K, 144Hz.

Freesync is, unironically.

Adaptive sync is always worth it. It gets rid of screen tearing and minimizes the impact of those dips from intensive action scenes and/or system hic-ups.

G-Sync/Freesync have no impact on performance.

The problem with those benches is that Pascal/Maxwell's texture/color compression sucks at dealing with HDR, so they end up being memory-bandwidth starved (HDR mode requires a lot more bandwidth).

This has been known for quite some time actually. It looks like Nvidia fixed it on Volta/Turing, which is why they used HDR on their official "sneak peek" benches.

on a 144Hz monitor, you are ideally looking at about a 7ms response time, but in the worst-case scenario you are looking at 14ms. It's not like 60Hz, where you were looking at 16ms at best and 32ms at worst, but it's still there and that input lag is feelable. Personally I love the idea that when my computer is ready to display a frame, it displays the frame, as opposed to waiting on the monitor to be ready to show it.

The faster the refresh gets, the less impact sync has beyond just dealing with tearing, which in and of itself would be worth it.

but even right now, 144 with sync is noticeably better in a worst-case scenario than 144 without.
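Same interval arithmetic for a few refresh rates, just 1000/Hz, reproducing the rough 16/32ms and 7/14ms figures above (not measurements):

```python
# Best case: the frame lands right on a refresh tick (one interval to appear).
# Worst case: it just missed a tick and waits almost a whole extra interval.
for hz in (60, 144, 240):
    interval_ms = 1000 / hz
    print(f"{hz:>3} Hz: best case ~{interval_ms:.1f} ms, "
          f"worst case ~{2 * interval_ms:.1f} ms")
```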

The main problem is gsync's cost, which makes it hard to recommend, as they use FPGAs that cost $100-200 on normal monitors and $1000 on the 4K 144Hz HDR monitors. That increased cost makes it very undesirable, but amd's lack of a high end also makes it hard to recommend the freesync alternatives.

If you play with mid-range cards, I 100% recommend amd over nvidia just because of the options you have, and at mid-range prices amd is competitive, if not the outright better choice. But if you play at the high end, I wholeheartedly tell you to wait on navi and see if 7nm or any improvements amd made are worth it, as nvidia sync monitors are going to have at least a $500 premium over the amd alternatives due to the $1000 FPGA (nvidia may have bought them in such large quantities that the price came down).

and for vsync specifically, the lazier a dev is, the worse the vsync is; I have several games where the moment you aren't hitting 60fps it drops to 30fps
I like taking control away from developers in most cases due to how shit they are and how bad what they think is good enough is.

freesync is easier to recommend because you aren't paying a $100-200 premium for the FPGA.

they both have a minimal impact on fps; nvidia's is slightly larger than amd's, but in both cases users would not be able to tell without looking at a graph.

>in both cases users would not be able to tell without looking at a graph
I read it so fucking often it triggers me. Does that mean gsync is fucking unbearable? Like fastsync?

If you want to go sub 45hz...

You can get away without tearing by using values that are multiples of the screen's refresh rate: you don't get tearing on a 60Hz screen if you are at 30fps, 90fps, 120fps and so on.

in normal non-HDR gsync, the impact is between 2-4%, and freesync is 1-2%.
It's a number so small you need graphs to even see the impact.
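To put those percentages in perspective (illustrative arithmetic only, not benchmark data):

```python
for base_fps in (60, 144):
    for hit in (0.01, 0.02, 0.04):  # the 1-4% range mentioned above
        print(f"{base_fps} fps with a {hit:.0%} hit -> "
              f"{base_fps * (1 - hit):.1f} fps")
```

Even the worst case (4% of 144fps) is only about 6 fps.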

HDR though: one of the ways nvidia made do with 3.5 and 4GB of vram was by compressing the images, and HDR shits on this hard, so they need to come up with a new/better compression method or increase bandwidth, something gddr6 is doing. And since most of the HDR I have seen PC-side is 4K, and all the HDR TVs are 4K, you would need to be on the stupidly high end anyway to see it, and this is where nvidia is doing the most to increase ram bandwidth.

What do your numbers even mean? AFAIK gsync and freesync affect latency, not fps. Is it 2-4% latency increase? Or fps decrease? Or satisfaction rate?

both syncs got tested extensively when they came out, and it was found that they did impact fps, but the effect was so minimal that without specifically testing for it you wouldn't notice.

kek

So instead of looking at latencies which actually affect gaming experience you suggest looking at fps which is irrelevant, I got it

It'll never come to TVs or consoles; has it even made it to laptops yet?

They use actual adaptive sync in laptops (which is the same as freesync) but label it as gsync.

pre·mi·um /ˈprēmēəm/
noun
...
2. a sum added to an ordinary price or charge.

Yeah it's just a polite way of calling the goyim tax

follow the fucking conversation

>G-Sync/Freesync have no impact on performance.
to
>they both have a minimal impact on fps
to
>I read it so fucking often it triggers me. Does that mean gsync is fucking unbearable? Like fastsync?
to
>in normal non hdr g sync, the impact is between 2-4% and freesync is 1-2%
to
>What do your numbers even mean? AFAIK gsync and freesync affect latency, not fps. Is it 2-4% latency increase? Or fps decrease? Or satisfaction rate?
to
>both syncs when they came out got tested extensively, and it was found that they did impact fps, but the effect was so minimal without specifically testing for it, you wouldn't notice.

someone said they had no impact on performance, but they have a very minimal impact.
beyond this, they function exactly as advertised, as long as you are in the sync range, it displays a frame when its ready, imparting no additional lag other than absolute on the gameplay (normal monitors have from the mouse click to when its registered, to when the gpu can produce the frame, to the monitor, and then the monitor, then it waits on the monitor hz to tick over with said image, this removes the hz tick over wait)

freesync is already on the consoles, and by extension several samsung TVs have freesync over HDMI. I think there are a few other companies that have sync over HDMI, but samsung is the only one I know for a fact has some.

You have several factors to contend with here.

amd allowed everyone who fell within the standard to use freesync, which is why you had monitors with a 45-55Hz sync range that were dog shit

Nvidia demanded a minimum standard, and this also forced the price up because these were outright better monitors.
Then you have nvidia doing the sync through an FPGA, which adds $100-200 on top of said monitor cost, which is why you see the freesync variants of gsync monitors for several hundred dollars less.

You get a premium price because the monitors are better, along with a price premium because they add in a $100+ FPGA.

Except you can find monitors that have identical specs and only differ in having gsync or freesync, and still the difference is almost 300 bucks.

I'm getting a gsync monitor in the mail today but I didn't plan on using that feature because it introduces input lag. Should I use it anyway? Is it noticeable in shooters etc.?

I honestly have no idea how much the FPGA for normal gsync costs; I do know the one nvidia is using for 4K 144Hz HDR is a $1000 FPGA, and some people have said volume purchases got that down to $500.

for normal gsync maybe they have different FPGAs depending on the range or quality. I have no doubt that companies would charge more than needed, but not knowing the cost itself I have to give the benefit of the doubt rather than assume the company is fucking you for no reason.

Nobody said they were bad monitors, and nobody said anything about AMD for that matter

Who in their right mind would pay $300 extra for such a meme

I brought up amd because amd allowed anyone to use freesync regardless of quality, while nvidia only let quality monitors use gsync. Because of what amd did, freesync seemed shitty for a number of years, and it also made nvidia seem expensive for no reason.

as for $300, got a link to the monitor? Because the worst ones I know of were $200. That said, a meme? no
But is the effect worth the price premium? that's a hard one to say.
I honestly think adaptive sync as a standard is going to be taken up by nvidia soon, because gsync just does not scale. I mean, for fuck's sake, they are using a $1000 FPGA at 4K HDR 144Hz; shit like that isn't going to sell.

with both hdmi and dp having adaptive sync in their standard, the logical thing to do is support it.

>no mention of ULMB ITT
>widespread misinfo and ignorance
Bit sad, because this kind of technology had been wanted for years, up until the internet went full cancer. And it was delivered with pretty much no quirks except for some shitty game support. The only thing missing was talk of an adaptive ULMB/Gsync mode.
Also, that's a bit wrong about the lag. Lag was only an issue if you were forced to use vsync. The bigger advantage of gsync is no screen tearing along with effectively no lag. Also, GPU tech still isn't really there to push high frames at resolutions above 1080p unless you turn graphics settings down.

Attached: 1530038356201.png (975x1075, 1.6M)

It is a meme because it has no purpose other than reducing muh lag and reducing muh tearing

There are so many other factors out there that each provide a much better experience than having hz and fps synced, and that makes it a meme

yeah, I know...Acer Predator X34 owner. While meme-ish, no ragerts on this purchase. The X34A would have been more money, though the OSD controls are better and automatic overclock would be nice.
I can definitely see how Jow Forums would see this as pointless unless you secretly play games. So the price would very well not be worth it.
Until AMD makes a high-end GPU, G-Sync monitors will almost entirely be used by those with high-end GPUs.
Define "high frames." I would consider anything above 75 to be "high" since going any higher is pointless. 100Hz should be more than enough for anyone.

Attached: 92506_acer-predator-x34.jpg (1500x1454, 158K)

>I would consider anything above 75 to be "high" since going any higher is pointless. 100Hz should be more than enough for anyone.
Because you're ignorant about what you're talking about. 120fps static is what is needed. 240 static would be the next goal. That's static not average or high.

>Because you're ignorant about what you're talking about.
tell me why it is necessary to even want anything more than 100 Hz...
you can't see the difference. even if you magically were able to, what is the advantage? feeling obligated to use more energy?
Jow Forums is going blind from furious masturbation to yaoi, so clearly there's no point in continuing the "muh hertz" meme.

ULMB

Freesync > Gsync

Yes.

don't know about GSync but FreeSync is the tits

Too late. OP here, I bought a FreeSync 144Hz even though I have a NoVidia card.

Attached: 1535940178215.gif (232x232, 695K)

you are retarded.

nah i'll use the $250 extra towards a better gpu

or he can get a cheap radeon and use freesync via passthrough.. nvidia will patch this tho..

youtu.be/XOBXQsNM3D8?t=206

in a nutshell.

the nvidia card does the rendering and passes through to the radeon freesync gpu (onboard is easiest to do, but I've seen other cards used; the cheapest costs $100). So if a gsync monitor is going to cost you $200 more than a freesync one.. get the freesync and pick up a passthrough card?

Holy shit my based 380x still has a use

gsync is pretty much necessary for emulation, especially mame and dosbox

Costs too goddamn much
You're better off just skipping it and using blur reduction

That is completely wrong.

VRR has no impact on your frame rate or frame-time.

>ITT: /v/tards who don't understand how monitors and 2D output actually work.

No one makes 120fps content and very few games have netcode fast enough for >60fps to matter.
Gsync isn't $200 better than freesync, and freesync is standard on any monitor worth considering.

Enhanced sync is good enough for 99% of people anyway and it works on any monitor.

Attached: Enhanced-Sync-Decreases-Latency.jpg (1280x720, 178K)

>It looks like Nvidia fixed it on Volta/Turing which is why they used HDR on their official "sneak peek" benches.
Or they were trying to cherry-pick the biggest generation-to-generation gains, which is why some games used HDR for no explained reason.

wtf does adaptive sync have to do with high refresh rate?

>user should buy a 144Hz+ panel and cap FPS there to prevent tearing. No need to worry about lag/stutter.
>Yeah bro cap your fps, that won't cause any lag at all
Disgusting misinformation, kys

Attached: 1469427187265.jpg (366x380, 20K)

>wtf does adaptive sync have to do with high refresh rate
What exactly do you think all these sync methods are syncing

Pros
>no tearing

Cons
>extra cost to monitors
>slight input lag, though I only notice this on CS:GO
>some games don't play well when capping fps and can stutter
>slight performance hit, usually on esports titles due to high fps. minimal at most

Is it worth it? Yes. If you play esports games somewhat competitively, no.

I had a 144Hz monitor for years and would cap the fps at 143 or 144 and would still get tearing. It also brought along stutter and slight input lag.

They're probably not going to patch it. It's literally the same Nvidia Optimus you get in laptops that send stuff rendered on the Nvidia card to the Intel integrated one to be displayed.

>They're probably not going to patch it. It's literally the same Nvidia Optimus you get in laptops that send stuff rendered on the Nvidia card to the Intel integrated one to be displayed.

shame freesync doesn't work with Intel chipset video. I'd test it out, but then again it only works in games where you can choose which video card does the rendering.

S O Y
O
Y

It does. I tested it when fed through my iGPU, but I have to have the AMD GPU connected to something even if it's not displaying anything to the monitor. Freesync lights up even though the monitor is running off the iGPU, and the grunt of the work gets done by the AMD GPU. Theoretically you can run a high-end Nvidia card out through your iGPU as long as you can set the AMD GPU as the low-power device in Windows 10 with the latest update. It seems to work best with AMD APUs, but then you might be bottlenecking, so maybe a GTX 1060 and an APU would be the better 'cheaper' option.