50 fps difference

>50 fps difference
Shitel fanboys will defend this.

Attached: 1562514044816.jpg (1920x1080, 221K)

I will because my monitor is only 240hz so I'd rather have the much higher minimum framerate

99th percentile > average

Where the hell are you getting those retarded ass numbers from?

who tf has a 350hz monitor?

CS:GO and Dota too. Weird how lots of the esports games do well on it.

It is all placebo BS. As long as the minimum doesn't drop below 100 FPS, you can't tell the difference. The input-lag benefit of going beyond 100-150 FPS is negligible; you become the bottleneck (musculoskeletal system, nervous system).

>High FPS count doesn't matter

>LTT video
guess I'm the tard since I know where it comes from

Average FPS doesn't matter

I really wish instead of reporting stuff like lowest 1%, they'd just use something like standard deviation to describe how consistent the framerate is.

The creator of Linux

look at the blue bars, bitch motherfucker

Yes the bars are blue just like the pill you shoved up your ass.

It's ok bro next time just buy an Intel.

Linus Tech Tips?

>lower average
>but muh blue bars
stay mad fagit

Attached: reeee.jpg (213x237, 10K)

Just buy Intel next time you won't have to go through this debilitating cope.

>It's ok bro next time just buy an Intel

Attached: 1563455436921.jpg (1035x1000, 189K)

ADD IT TO THE LIST

The list is literally too big, 2000 character limit reached.

You know what else is too big? Your fps drops.

It's too late. Cope levels have surpassed 11/10 because zen2 failed to btfo intel.

Attached: tenor.png (1200x846, 390K)

Hmm

Hm, now that I think about it, "performance doesn't matter" is already on there, so the "high FPS count" one isn't necessary.

Took you long enough for this comeback, it's like your brain is pozzed or something.

You know AMDpajeets are seething when they start spamming greentext wojaks

AMD peasants were looking for a "David defeats Goliath" story, but they got disappointed, and with disappointment come extreme levels of coping.

Peasants love "David defeats Goliath" stories to try to forget their peasant status.

It

>As long as the minimum doesn't drop below 100 FPS
No, you can definitely tell the difference in something like CS:GO, so I'd imagine it's the same for Siege.

Attached: 1561671382202.jpg (1000x1000, 138K)

No way people are still arguing over this lmao.

Love intelfag tears so much, when gaming was their last line of defense and they're getting crushed there too, you know they start panicking and go into maximum shilling after realizing intel is pretty much shit in every way possible.

Attached: 1537419972674.png (1037x311, 340K)

>amd drops below 200 fps constantly
literally unplayable
why are people shilling for this garbage?

Attached: bullshit.jpg (512x384, 21K)

The absolute state of shitel cope ITT

Attached: your-tears-are-delicious-360x500_1_.jpg (600x833, 80K)

>this is the absolute best case scenario for PajeetMD
>it comes with inconsistent frame times and frame drops

All are above 60fps, so all are equally good.

Noticeable how? Provided screen tearing is eliminated, there is objectively no way to tell whether you're rendering at your monitor's refresh rate or something higher.

it's hard to explain, you can feel the difference even at 60Hz. Just try it

Attached: vlcsnap-2017-09-22-16h25m23s235.jpg (963x590, 92K)

Can confirm my 2700x is a huge bottleneck at 1440p with a 2080ti uncapped fps

I have capped CS:GO at 60, 80, and 120 and ran the benchmark; I couldn't find any difference. I mean, there is no computer-science reasoning behind this supposed advantage.

Some people can notice the difference. But noticing isn't really helping. Theoretically, a higher refresh rate means the screen is updated faster, but a human won't outperform 120Hz or 144Hz.

Your monitor cannot possibly show an image on the screen faster than it's rated for.

running a benchmark isn't playing the game.
youtube.com/watch?v=hjWSRTYV8e0

this
why are people still perpetuating the 120hz meme?

120Hz, like SSDs, is a meme

Lowest 1% is easier to understand regardless of academic background and tells you right away if there are consistent performance issues present in a given benchmark.
Not saying standard deviation is some strange arcane concept mind you, just that it's relatively less clear what a high number actually means when compared to the 1% minimum metric.
I guess a meaningful comparison of the techniques would boil down to whether or not one or the other is better at identifying stutter or other performance issues.
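For what it's worth, here's a minimal sketch (Python, made-up frame-time numbers; "1% low" taken as the average FPS over the slowest 1% of frames, which is one common definition) showing what the two metrics report from the same capture:

```python
# Minimal sketch: compare the "1% low" metric with frame-time standard deviation.
# frame_times_ms is made-up example data, not a real capture.
import statistics

frame_times_ms = [6.9] * 990 + [25.0] * 10   # mostly ~145 fps, with a few spikes

# "1% low": average FPS over the slowest 1% of frames (one common definition).
worst_1pct = sorted(frame_times_ms, reverse=True)[: max(1, len(frame_times_ms) // 100)]
low_1pct_fps = 1000.0 / statistics.mean(worst_1pct)

avg_fps = 1000.0 / statistics.mean(frame_times_ms)
stdev_ms = statistics.stdev(frame_times_ms)

print(f"average: {avg_fps:.0f} fps, 1% low: {low_1pct_fps:.0f} fps, "
      f"frame-time stdev: {stdev_ms:.2f} ms")
```

Both numbers flag the simulated stutter; the 1% low just expresses it in the same fps units as the average, which is probably why reviewers prefer it.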

>and ran the benchmark
Well, you just discarded everything tied to the advantage right there. The supposed advantage is lower input latency: because extra frames keep being rendered and discarded, the frame that finally gets drawn to the monitor carries newer game state than it would if you only rendered one frame per refresh.

On a 60Hz monitor, if the game is running at exactly 60 FPS, that means at least 16.67ms of input latency from the moment you press a button to when you see the result onscreen. If the game is running at 600 FPS, only 60 frames are drawn to the monitor each second and the remaining 540 are discarded, but the information contained in the frame that is finally pushed to the screen could potentially be 15ms newer, which would represent a very clear and obvious advantage. It also means an enemy character that wasn't visible in the initial frame might have come into view by the frame that is finally pushed to the screen, which means noticing that enemy more than 30ms sooner than you otherwise would.
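The 15ms figure falls out of simple arithmetic. A minimal sketch of it (Python; idealised numbers that ignore engine, driver, and display processing delays, and assume the display always grabs the newest finished frame):

```python
# Rough arithmetic behind the "15ms newer" figure above. Idealised: no vsync,
# frames render instantly, and the display always grabs the newest finished frame.
REFRESH_HZ = 60
refresh_interval_ms = 1000 / REFRESH_HZ  # ~16.67 ms between display refreshes

for render_fps in (60, 144, 300, 600):
    frame_interval_ms = 1000 / render_fps
    # Worst case, the displayed frame is one full render interval old,
    # so the maximum "newness" gain over a locked 60 fps frame is:
    max_gain_ms = refresh_interval_ms - frame_interval_ms
    print(f"{render_fps:>4} fps: frames {frame_interval_ms:5.2f} ms apart, "
          f"up to {max_gain_ms:5.2f} ms newer than at a locked 60 fps")
```

The gain shrinks quickly: most of it is already there by 300 fps.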

>before Zen2
>intel gets 330 vs amd 280 fps
>Amd shills scream that difference is irrelevant
>1% lows count
>now amd gets more fps and sucks at 1%
>hurrdur intelaviv btfo
>it's over
What the fuck is wrong with you?
Even when you fucks get called out on it, all you can respond is
>muh intlel tears are delicious or cope
When in reality you are pathetic and coping spergs.

The 3700X numbers look very reasonable for a generational jump up from the 2700X, with greatly improved 1% minimums. If anything, it's the 3900X that is the outlier, in that it pushes the average up while having a lower minimum.
So there's something clearly odd happening with the 3900X that I feel would warrant investigation, given that it's the only AMD part listed that spans two chiplets.

AMD can do no wrong and Intel can do no right because they're blatant fanboys. It's utterly pointless to try and have an actual discussion.
You see the same type of hypocrisy in GPU threads. They'll literally tell you that power draw doesn't matter over there. Meanwhile in the CPU threads of course they're making fun of the 9900K for using a lot of power.

half of their reason for living is annoying intel owners

Attached: 1507818309681.jpg (190x1355, 91K)

That's a bunch of dookie. If you get more than 60 fps, then you have more than one frame available to be read per refresh. There is no other way. Without vsync, you output whatever is currently in the frame buffer, no questions asked, no latency added or taken. What the guy circles in red is misleading.
Around every 60th of a second, the kernel halts the game and starts outputting what's currently rendered to the monitor. Best case, you're playing at 60 FPS plus epsilon to cover process switching and the rest: you have a complete redraw of the game scene with completely new input relative to the last game frame, and you output it. Worst case, the game was in the middle of redrawing it and you get screen tearing.
This god damn myth is perpetuated by gullible gamers who don't understand computer architecture.
>15ms newer, which would represent a very clear and obvious advantage
That is the absolute best case. Even if we assume 15ms is humanly noticeable, you will experience anywhere from 0 to 15ms of it. CS:GO has a native Linux port, so you would do better to run Openbox on Xorg without a compositor, compile your drivers yourself, and build a custom-tailored low-latency Linux kernel. You can adjust the process-switching frequency and USB poll rates yourself, turn off many security features and unnecessary background processes, and run CS:GO at almost max priority. This will easily slam your input delays near 0, deterministically. Try it yourself.
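For what it's worth, a minimal Monte Carlo sketch (Python) of that "anywhere from 0 to 15" spread, using the same idealised no-vsync model as above; the numbers are made up, not measured:

```python
# Minimal Monte Carlo sketch of the "anywhere from 0 to 15" point: without
# vsync the scanout lands at a random point inside the render interval, so
# the age of the newest finished frame is spread across that interval rather
# than being a fixed saving. Idealised made-up model, not a measurement.
import random

def frame_ages_at_scanout(render_fps, samples=100_000):
    """Age (ms) of the newest finished frame at randomly timed scanouts."""
    render_interval_ms = 1000 / render_fps
    return [random.uniform(0.0, render_interval_ms) for _ in range(samples)]

for fps in (60, 600):
    ages = frame_ages_at_scanout(fps)
    print(f"{fps:>3} fps: frame is on average {sum(ages) / len(ages):.2f} ms old "
          f"at scanout (worst ~{max(ages):.2f} ms)")
```

The average age works out to roughly half the render interval in each case, which is why the real-world gain is a distribution rather than a constant 15 ms.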

the real story that you guys never mention is the 3700x vs the 9700k which is the price bracket most people will be buying at.

You're all fucking hopeless. Enjoy your overpriced Intel shit!

Attached: images (2).jpg (531x578, 55K)

That doesn't stop intel shills from claiming "best in gayming".

I'm playing csgo on 60hz monitor and the difference between 60 and 150+ fps is huge. You may check your eyesight nigger.

What I'm saying is, I want clear and concise technical explanations backing up this latency FUD.

but you can feel the difference when playing, it feels more responsive

But I've already tested this since I was 8 years old, on analogue VGA CRTs from 50 to 240 fps at 800x600, with a PS/2 mouse. There is no such benefit; for fuck's sake, this is god damn snake oil.
If anyone really does care about lowering latency, she should use a Linux environment compiled and configured for that specific PC.

It's sad how a thread that was supposed to btfo intel has now devolved into a thread where AMDpajeets are typing huge paragraphs of text to explain why the performance "doesn't matter"

There is, probably your eyes are just fucked up and you can't see the obvious difference between 50 and 240 fps lmao.

>You may check your eyesight nigger.
Except there's no way it would lead to a visible difference because your monitor is refreshing at 60Hz no matter what your fps is. Why is this so hard to comprehend for the drooling gamertards?

do you play vidya often? If you don't, you aren't gonna realise the difference because you aren't used to it

If your monitor refreshes at X Hz then you see X images a second. Prove me wrong.

This is sad.

kys kanna

Your PC is still rendering the game at a higher framerate than your monitor outputs, though; the result is more responsive mouse input.

>This is sad.

How do you think the monitor draws each frame? 60 frames instantly at the 999ms mark, or a single frame every 16.67ms?

you fucking retard i () was against intel

I don't get how people can't see the difference, especially in fast-paced games like CS:GO. You don't even need an fps counter; you just know when you are playing below 100 fps.

Reminder that NVIDIA and AMD used to work together to fix prices on GPU releases and this is why they are so expensive now.
"the good guys"

I play a lot of games, about 5 hours a week.
That's not an argument, because visually the game world exists at discrete intervals of time ~16.7 ms apart. You do something and react to it 200ms later on average if you're in top shape, which is roughly a dozen refresh intervals' worth of data. You cannot take advantage of inter-frame times.
Completely depends on your kernel; OpenGL on Linux is a single frame every ~16ms.

Performance doesn't matter, the blue logo is all that matters

Attached: 1536612142306.png (504x400, 91K)

>amd wins in one out of 10 or more games
>obsessive fanboys post it all over Jow Forums
i'm on ryzen's side btw, doesn't make you less of a lunatic
go back to shitposting on wccftech

hai i know user, this other user is just a retard

Attached: 75714283_p0.jpg (1149x1106, 177K)

>I play a lot of games, about 5 hours a week.
nice bait

>caring about gayming

is that way child

>i'm on ryzen's side btw

Enjoy your 50 C idle temperatures

It's probably down to the cross-chiplet latency, which adds to the already present cross-ccx latency.

That's already an unhealthy amount according to doctors, researchers, and the WHO, as far as video games and gaming addiction are concerned. I'm serious.

Buy Shitlake sirs

Attached: 1562603080968.png (744x420, 106K)

do you do it for free?

Attached: 1563613802042.png (882x624, 27K)

literally who cares lmao

>smaller chip = heats up and cools down faster

The blue line doesn't matter to me since I have a 1440p 144Hz monitor. With AMD at least I can still use hyperthreading and be secure.

>performance doesn't matter