So, what's the point of a 144 Hz panel without variable refresh rate technology? (be it gsync, whatever)

Even at 1080p with, let's say, a 1060 or whatever those gaming laptops rock, you'd get massive screen tearing, right?

Attached: 364211-razer-blade.jpg (1000x736, 77K)

I never use vsync and I never notice tearing at any framerate

You can always lower the graphics; I have a 1050 and get 200 fps in CS:GO.

Don't you kids still have school tomorrow?

Not if you limit the game's fps, dumbass.
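For example, in Source engine games the cap is just a console command (CS:GO shown, pick whatever value matches your panel):
fps_max 144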

implying you arent a kid. faggot. go shill mdisks some place. yes i remember.

I get fucking screen tearing on a 60 Hz panel with a 1050 Ti

>Underage faggot with 3k to his name calling anyone a kid

Unreal

I have a Razer Blade with the 1080p screen and a 1060 in it. Zero issues whatsoever, and I was even playing shit like Elite Dangerous on it.

No, I have work.
Btw, I see you're new here and filled in the name field. You should avoid that, as it makes you look like an attention whore on an anonymous board.

Limit the fps to a value that divides 144 Hz cleanly (i.e. 72)?
Because tearing happens when your fps isn't an integer divisor of your refresh rate, right?
Unless you mean cap both the frame rate and the refresh rate?
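Quick python sketch if you want the full list of caps that divide 144 evenly (plain arithmetic, nothing game-specific):
# fps caps that divide a 144 Hz refresh cleanly
refresh = 144
caps = [refresh // n for n in range(1, refresh + 1) if refresh % n == 0]
print(caps)  # [144, 72, 48, 36, 24, 18, 16, 12, 9, 8, 6, 4, 3, 2, 1]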

The one with the 144 Hz or 60 Hz panel?
Using vsync?

Attached: 1511077631575.png (196x168, 39K)

Ditto. 2016 1080p, no vsync or any of that crap and I never get tearing.

A 144 Hz panel still receives part of each newly rendered frame, even with screen tearing.

As long as that part of the frame lands somewhere near the center of the screen, you're seeing a frame that was rendered and updated after the last visible one. That means newer mouse input has been processed, and you avoid the momentary hiccups between stale pixels that would otherwise read as pseudo motion blur.

> more responsive to mouse inputs
> less jitter in visible tweening
> more clarity, more soap-opera-y
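The raw numbers behind that (python; reading "one refresh interval" as the worst-case age of the newest pixels on screen is my interpretation of the post above):
# worst-case age of the newest on-screen pixels = one refresh interval
for hz in (60, 144, 240):
    print(hz, "Hz ->", round(1000 / hz, 2), "ms per refresh")
# 60 Hz -> 16.67 ms, 144 Hz -> 6.94 ms, 240 Hz -> 4.17 ms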

Because when running over 100 fps, tearing is basically imperceptible anyway.

Attached: RBv5Z.png (1125x681, 63K)

You still get screen tearing at 144 Hz, but it's less noticeable imo, and most people are so used to screen tearing that they don't even know what it is, because they've used LCDs their entire life.

I grew up on CRT and have now switched back to it. CRT has no screen tearing, but it does show something similar if you move really fast over steps etc.

Gsync and freesync actually add about half as much lag as vsync, so they still add lag and mouse delay etc. If you're playing to win, using them is bad; that's why fewer monitors support them now.

They will probably never fix screen tearing on LCD. The closest they've got is motion blur reduction that simulates the strobing of a CRT, but it still isn't that good and most people don't like how it looks.

If you don't want screen tearing, use a CRT. If you're using an LCD, it's best to just get used to it.

>t. Someone who hasn't used gsync.
Even on a 240 Hz display, enabling and disabling gsync makes a noticeable difference.

>CRT has no screen tearing

Attached: 19180.png (856x846, 85K)

I recently reinstalled Bayonetta on my "gayming" PC, and even though the fps is already locked by the game, I still get huge screen tearing with a GTX 1070.

That's weird, I don't get tearing in that game with my 1070. Not even in Nioh, which is also locked to 60 but doesn't have vsync.

>The one with the 144 Hz or 60 Hz panel?
>Using vsync?

60 Hz. I never use vsync because I hate input lag.

there's no point. just a marketing gimmick

Laptop gsync is effectively freesync. It's not literally freesync because that name is AMD's trademark for adaptive sync over HDMI(?), I think. I don't know what kind of standard the controller boards in laptops use, but it's all just digital and you can translate between them with passive adapters anyway.

Screen tearing as a visual effect is massively diminished as the framerate increases. If your delay between frames is only a few milliseconds, each tear is on screen so briefly, and consecutive frames differ so little, that you barely notice it.
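Back-of-the-envelope python sketch (the pan speed is an assumed number, just to show how the offset shrinks):
# offset between the two halves of a torn frame while panning
pan_speed = 2000  # assumed pan speed in pixels per second
for fps in (60, 100, 180):
    print(fps, "fps ->", round(pan_speed / fps, 1), "px offset at the tear line")
# 60 fps -> 33.3 px, 100 fps -> 20.0 px, 180 fps -> 11.1 px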

This.

This runs counter to the meme that nvidia keeps pushing. Just keep the fps above 180 or so, and nothing else matters. Turn off all your fps limiters, vsync, freesync, gsync, triple buffering, other buffering, resolution scaling, motion blur, and post-processing if you want minimal latency and tearing. This bare-bones approach sounds counter-intuitive, but once you start piling these technologies on top of one another it starts to suck.

That said, if you want visual fidelity and don't care about latency, turn *everything* on and just pray it doesn't drop below 40 fps.

Thing is, let's say you've chosen to buy a gaming laptop with a 1060 and you get the choice of a 60 Hz or 144 Hz panel. However, since it comes with Optimus, neither has a gsync option.

If you can't push 144 frames (let's say you hover between 60 and 80 depending on the game), the 144 Hz panel could be pointless, since you might be better off locking the refresh rate to 60 to reduce tearing rather than leaving it at 144 Hz. (Also, what would happen if you enabled vsync in this scenario?)
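
On the vsync question: with plain double-buffered vsync every frame has to wait a whole number of refresh intervals, so your displayed rate snaps down to 144/2, 144/3, and so on. Rough python model (assumes strict double buffering, no triple buffering):
import math
# displayed fps = refresh rate / (whole refresh intervals each frame occupies)
refresh = 144
for render_fps in (60, 80, 100):
    n = math.ceil(refresh / render_fps)
    print(render_fps, "fps render ->", refresh / n, "fps displayed")
# 60 -> 48.0, 80 -> 72.0, 100 -> 72.0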

I'd still run without any vsync or buffering, and simply lower the model detail or other graphics settings to compensate. If I'm below 120 fps on any platform, I either need a new platform or to adjust my settings. Again, this is mostly for fast-paced gaming situations. If you're playing something like LoL/Overwatch/etc., you'll be 100% fine with this approach.