We are better than Intel!

>we are better than Intel!
>blocks their own path

Attached: Untitled.jpg (2931x1102, 401K)


honesty bad

Useless benchmark

ryzen is better at ripping animus, though. i don't care about anything else.

>finding the one frame of the benchmark where "intel wins"
>forgets it's over NZD$100 more expensive for the same performance
oopsie

Attached: file.png (1904x462, 245K)

It's about the same difference as the turbo frequencies. Strange how the 3800x supposedly matches the 9900k

>games

are they using fucking fraps

Who gives a flying fuck if you have not 60, not double, but almost quadruple that FPS on both, and at 1440p no less? Pricing and power consumption are what should be considered here.

>literally looking for one frame spike in the entire video
So there are no paid shills huh
lmao

read that as ripping anuses

lmao
my 2600k @5.1ghz can do better than poozen

"That higher fps is gonna make a huge difference to my level of play," says someone who plays at the highest level alongside tens of thousands of others, where RNG alone determines the winner in 9 out of 10 games. God, I don't know, it's basically slots, and yet I believe I can refine my skills further via insanity.

No... not like this bros... we were promised equal gaming performance why is Intel still better

Bros...

Attached: Untitled.jpg (2824x1098, 336K)

>no one complains about the resolution
okay

kek might be that as well

>it's a better cpu because the number is bigger
>it's the price tag

screenshot was 3600x

lmao
>275 fps

literally unplayable

when every frag counts every frame counts

Attached: IAjGaZO.png (930x528, 493K)

Nigga I got 200-300fps in CSGO with an AMD Phenom 2 from like 2010. How is this a proper measurement?

Attached: 1559407113248.jpg (600x814, 76K)

no price tag for new zealand yet

they're probably running with maxed settings and 16:9 instead of 4:3 stretched from the looks of it.

My i7 980x From 2010 gets 300fps flat in CS:GO with a 1070. Funny how hardware built around that generation's engines actually performs far better than the new stuff.

But that's well beyond any vsync. I'd say those would both do an equally good job of 165fps on my screen.

wtf are those benchmarks. That's the old Dust 2 from years ago.

You might as well be looking at a still jpeg lmao

ARE THERE ANY REAL WORLD BENCHMARKS OUT YET?! DON'T GIVE ME FUCKING RELATIVE FPS! I WANT 99%s, 1%s and 0.1%s IN DEMANDING AAA TITLES, NOT SYNTHETIC BENCHMARKS!

Back in the day I had a full HD monitor and maxed out settings. I wonder how much of a difference the 1440p makes for that Russian toaster optimised game.

>Oh no no no no no they can't keep getting away with this! Shintel 4 life!

Attached: lmao.png (1033x767, 1.01M)

>1080p
Who plays in that 2003 resolution
intelavivs are fucked in the head

not much, probably. haven't played in a while, but valve did crazy optimizations to it all the time when i still played. what made a huge improvement in fps for me was dual channel ram. the gpu doesn't do much; csgo is supposed to be cpu bound. goes to show how far amd has come.

there is no ultra setting for csgo. it's all high

quality bait 10/10

Uhh, the screenshot clearly shows 1440p. Nice bait though. Also not sure what's wrong with 1080p. Not everyone has the best GPU money can buy.

So should I get Incel or Ayymd?

What are the buyers thinking?
Is it kids who are delusional/obsessive who want this stuff?
People who are easy to influence or manipulate tend to be ignorant, and it's easy to sell them things that aren't effective. All the seller has to do is build a marketing campaign around an image of popularity, superiority, or health to get these people to buy: best, new, proven, on sale, fixes your pain. Life in a bubble. All they know is paid marketing; there are no other sources of information, and their environment doesn't offer real choice. Brainwashing, choices kept secret and never offered, no information, and dumbing down by locking everything down so it's impossible to learn, do more, compete, or even become aware of the bubble.
is me.

found the original

Attached: 1560269986648.jpg (2931x1102, 489K)

are people really this stupid? why compare CPUs on GAMING when 90% of the work is done by the gpu???

Why is counter strike still being a reference point
Why does it matter if it runs at 290 or 300 fps

It's a CPU intensive game

I hate counter strike players so much. How can they waste that much CPU power for such a small decrease in input latency? At least cap it at 200 fps; at that rate a frame takes only 5 ms, so you'll be getting at most 5 ms of added input latency.

no it's not. a CPU-intensive program can be an encoder or a video editor, but not a game

For csgo players it makes a huuuge difference!

lol ok retard

amd is useless for gaming

INTARD BTFO

Attached: 1536694146733.png (1200x800, 164K)

they're probably showing it to make the fps seem higher; nu-dust has lower fps than the old map

Yeah... I don't understand. CSGO is mostly GPU intensive, because your GPU will always bottleneck the CPU in this game unless you're trying to bottleneck it on purpose. Also how many cores does this game even support?

1440p ultra on a CPU benchmark?

try 1080p low to be CPU intensive dip shits.

Even the legacy branch of csgo has metal doors on dust 2 though

This benchmark was at 1440p. They were more than likely GPU bound for this test.

kek

twitter.com/BitsAndChipsEng/status/1138435414421901312

Oh no no no no no no

yeah i know they changed the doors and some small things on the bomb sites at one point, theyve changed the whole map now though

>RTG IS SANDBAGGING JUST WAIT(tm)
These guys have been caught lying time and again

Fucking amd. i can oc an athlon and beat out a celeron!

it isn't though. intel has better encode speeds

ryzen has more cores and is cheaper. when i apply my gay avisynth filters i use multiple threads, and i can encode more animus at the same time. i don't think any of the avisynth filters use advanced cpu features apart from avx2, which both intel and amd cpus made in the last few years have, but i guess single-core performance is still better on intel.

you need at least 300 fps consistent to get the game playing smoothly

oh god 19 fps at 250+ jesus unplayable

>3600x
>old dust
>grainy footage
seems fake and gay.

>1440p ultra on a CPU benchmark
Retards.

Bullshit, by the highest amount possible.

Wow cool. How's your electric bill and temps?
Why not OC to 6GHz? Then you'd never need to buy a new CPU.

everyone knows the eyes don't see more than 30 fps, why would you need 10 times that amount?

Game keeps getting more shit and is bloating

you guys obviously dont play this game on a competitive level

>not testing in 1024x768
>not testing in 800x600
do they even esports?

>that generation's engines
source is a patched id tech engine from the 90s.

When Intel eventually does make smaller transistors, AMD will be BTFO. They can't even beat Intel despite their current advantage.

For all the retards trying to tell you "more than X FPS doesn't matter": the Source engine has frame-based performance, meaning someone with 300FPS will experience less latency and fewer hitreg issues than someone with 144.

300 is the engine cap, so getting as close as possible in a competitive setting is key. Just another reason not to ever go AMD.
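Setting the engine-cap debate aside, the latency claim above has a simple back-of-the-envelope form. A minimal sketch (illustrative arithmetic only, not the Source engine's actual input pipeline): if input is polled once per frame, an event lands at a random point within a frame, so on average it waits half a frame time before being sampled.

```python
# Illustrative only: average input-sampling delay if input is read once
# per frame. An event arrives uniformly at random inside a frame, so the
# mean wait before the next poll is half the frame time.
def mean_input_delay_ms(fps: float) -> float:
    """Average per-frame input sampling delay in milliseconds."""
    return 0.5 / fps * 1000.0

print(round(mean_input_delay_ms(300), 2))  # 1.67 ms at 300 FPS
print(round(mean_input_delay_ms(144), 2))  # 3.47 ms at 144 FPS
```

By this crude measure the 300FPS player shaves roughly two milliseconds of sampling delay over the 144FPS player; whether that is perceptible is exactly what the rest of the thread argues about.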

>not 1488 FPS
Missed opportunity

this, jesus christ stay in your containment board /v/eddit

>300 engine cap
source? that's just default for fps_max

Complete and utter bollocks. Anything above 100FPS is pure placebo. What actually matters is that the minimum FPS never dips below 100FPS in competitive twitch shooters.

Again none of that shit matters if your system consistently pumps well north of 100FPS.

You know about 240hz monitors?

It's true though that the engine is so shit that you need 300 fps to avoid constant frame spikes, while Rainbow Six is smooth as fuck at 60

I mean stuttering

you know that higher framerates reduce input lag right retard?

you obviously never had a 144+ hz display running a game at 300+ FPS. the input lag gets reduced by insane amounts and it's very noticeable, almost as noticeable as going from a 60 hz to a 144 hz monitor. and i'm talking about input lag when you click or use the keyboard; it's insane how much very high fps reduces input lag

how much of this is triple buffering? because 3 frames at 60hz take 50ms, but 3 frames at 300hz take 10ms - less than one frame at 60
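That buffering arithmetic can be sketched directly; with an N-frame queue the added latency is N divided by the framerate (worked out exactly, three buffered frames cost 50 ms at 60 Hz and 10 ms at 300 Hz; the 3-frame depth is the poster's triple-buffering assumption):

```python
# Worst-case latency added by a queue of buffered frames:
# N frames in flight at a given framerate cost N / fps seconds.
def queued_latency_ms(fps: float, frames_buffered: int = 3) -> float:
    """Latency in ms contributed by frames_buffered queued frames."""
    return frames_buffered / fps * 1000.0

print(queued_latency_ms(60))   # 50.0 ms: 3 frames at 60 Hz
print(queued_latency_ms(300))  # 10.0 ms: 3 frames at 300 Hz
```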

any threading/syncing features, as well as max pre-rendered frames, are set to 1 or disabled; nvidia's latest driver even calls it very low latency mode

is that a 300 Hz monitor?

Intard? I have a Ryzen CPU!

Attached: file.png (514x567, 28K)

So this is all Wintoddlers care about? Childish video toys? What a bunch of retards.

Again, you hit a wall of diminishing returns beyond 100FPS, a.k.a. a frame time of 1/100th of a second. It is all the placebo effect at work. You are the "bottleneck," not the computer or your peripheral(s).

I have played games ranging from literal slide-shows (1-3FPS) all the way up to insane 300FPS+ territory. Framerate starts becoming meaningless after 100-120FPS unless you are trying to accurately capture super-fast motion (not gayming shit). 60-85 is good enough for the overwhelming majority of users out there. Only freaks of nature (not you) notice a difference at 120-144FPS.

The real killer of the experience has always been those dreaded massive dips (going from 120+ down to under 60FPS).

The whole "more FPS = better!" meme came from Quake engine-based games, because high framerates started breaking the game: they allowed you to jump slightly higher and run slightly faster, which can make a difference in a competitive match.

Modern engines don't have this problem, making the benefits of super-high framerates nearly pointless.

Netcode is a completely different animal.

Obviously you dont you cringe faggot

it was not a meme; 125, 250, 333 and 1000 fps gave an advantage in the quake engine. games don't have that physics advantage anymore, but more fps still reduces input lag, and anyone who is at least in the top 1% of any competitive game can easily tell the difference of a few ms of delay, given how much experience they have with certain in-game actions happening very quickly.

you are a complete inbred retard and i hope you go back to Jow ForumsRust writing about how u implemented latest tranny traits,ty

to add on top of what i said above: somehow you think your opinion, larping in your RPG game, matters, while the entire thread is about CS:GO, where every millisecond matters and everyone wants to get the most out of it, hence why all good players use 240hz displays and all the proper config/windows/hardware tweaks. the spacing, the desire to express >your< opinion and the attempt to skew the thread into being about you, cloaking it all with a nice slide finisher, really reeks of reddit aids

>is all placebo effect at work. Your are the "bottleneck" not the computer or your peripheral(s).
>Framerate starts becoming meaningless after 100-120FPS
>The whole more FPS = better! meme
>Modern engines don't have this problem making the benefits of going at a super-high framerate is nearly pointless.

a.uguu.se/amT3cuWbGEby_2019-06-12_04-01-24.webm

dont reproduce worm

Attached: 1511532047716.png (741x630, 27K)

Probably not until end of July when NDA lifts. I'm betting best value will be cross flashing a 5700 with XT vbios or getting a used 1080 for like $250 and crossflashing to no powerlimit vbios.

The coping from "mah 300FPS" fags is bloody comical.

They are just as bad as audiophiles who are convinced that $10K magical stones and $2,000 wooden knobs make audio sound better.

CS:GO revolves entirely around the netcode gods (like all Source-based games), not super-fast framerates.

The point is that framerate stopped being a notable factor in competitive gameplay once CPUs got powerful enough to effortlessly maintain a minimum of 100FPS. You are going to be limited by netcode and by yourself (nervous system + musculoskeletal system). The human ego can't easily accept the fact that it is the weakest link by far.

Those charts are placebos at best. I suspect that the choice of mouse and mouse pad has a much larger impact on competitive performance than a silly-ass 240hz monitor. The human mind is a silly thing if it is able to convince itself that a 240hz monitor is somehow better than 85-144hz units when in practice it offers no tangible benefits.

>my brain is too dumb to process 240hz so anyone that can i will call a retard and say 240hz doesnt matter

blurbusters.com/gtg-versus-mprt-frequently-asked-questions-about-display-pixel-response/

MPRT doesnt matter, "mah 300 fps"

The visual cortex is indeed too slow to process 240hz. It just blurs everything together into a "best fit."

Sorry, sweetie, the human body has hard limits and the ego can't easily accept it. Time only makes those limits worse. There's a reason the esports arena is dominated by teens and young adults who often retire as they head towards middle age.

Even on a 60/144Hz monitor, limit CS to 150fps and it'll feel like mud.

Blurbusters uses tests that are designed specifically to find those "issues" which are non-existent under real-world source materials, content and conditions.

It is interesting from a purely scientific and academic standpoint but has little practical utility.

of course you can't see 240 frames frame by frame, but the difference is huge in the reduction of input lag and MPRT; there is a reason all of the pros in competitive games use 240 hz