Lol amd gets 190 fps instead of 200

Attached: untitled-17.png (667x522, 21K)

200fps is BF1's framerate cap. The 8700K could do far more without the cap in place, whilst Poozen can't even hit the cap at its max overclock.

Nice try though, Pajeet.

luh mao

Holy shit. It's like, 10 whole frames lower.

All those cores, and it can't max out BF1 with a 1080 Ti.
>le state of AMD

WHAT
I can't play BF1 on my 240Hz monitor???

Attached: 8909098098.jpg (1280x720, 82K)

>with a help of 1080ti
lmao the state of amdead

Yeah, like, 201 fps, after ignoring the stuttering ;)

and only for 20% more shekels

>2600 $165
>8700k $370
If you only knew how bad things really are

As said framerate cap.

But the important part is that it shows gamers that if they upgrade to a new tier of GPU, they're less likely to hit a CPU-to-GPU bottleneck. That means more fps from just a GPU upgrade, instead of having to regularly change the CPU as well to hit their desired fps and frame times.
This is important if you drive some >=144Hz screen.
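That bottleneck logic can be sketched in a few lines. This is a toy model with made-up fps numbers for illustration, not benchmark results:

```python
# Toy model: the delivered framerate is capped by whichever component
# is slower. All fps numbers below are illustrative, not measured.
def effective_fps(cpu_limit_fps, gpu_limit_fps):
    return min(cpu_limit_fps, gpu_limit_fps)

# With an older GPU the system is GPU-bound; a faster CPU changes nothing.
print(effective_fps(190, 120))  # 120
print(effective_fps(250, 120))  # 120

# After a GPU-only upgrade, the CPU with more headroom finally pays off.
print(effective_fps(190, 250))  # 190
print(effective_fps(250, 250))  # 250
```

The point is the last two lines: once the GPU stops being the cap, the CPU's headroom is what you actually get, which is why it matters for high-refresh screens.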

stop being poor

>720p
AMD BTFO

Attached: aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9PLzUvNzY1NTA5L29yaWdpbmFsL2ltYWdlMDA4LnBuZw==(1).jpg (712x1435, 193K)

>720p benchmark
of course

Attached: perfrel_2560_1440.png (500x970, 49K)

STOP KVETCHING GOY AND JUST BUY IT

your eyes can only see 19 fps and 190 is a multiple of 19 so amd wins again but you're too big of a brainlet to understand

Attached: 1518779418221.jpg (588x823, 109K)

Burning your house down doesn't matter.

Attached: your_off_ya_chops_mate.png (442x434, 176K)

>intel 2-core i3 chewing through more watts than an old 8-core from AMD
wtf man...

Attached: 1532119430581.jpg (1024x576, 44K)

What's the matter goy, what's wrong with housefires, too poor to buy a new house?

Attached: 1528292848450.jpg (811x1024, 223K)

>that benchmark is most likely single player mode (i.e. irrelevant)
>multiplayer is a lot more taxing for the CPU and the only thing that matters
>BF has a 200 FPS limiter hence only 200 FPS on 8700K
>unknown settings used in that benchmark
youtube.com/watch?v=YFqTGRhdu8w&t=10s
This guy is pushing 210-280 FPS on 1440p, no Poozen can touch this kind of performance.

>low settings
Anyway how do you explain a lower clocked 2700X raping the 8600K in 4K ultra preset BF1?

Attached: AMD-Ryzen-7-2700X-4K-01.jpg (720x567, 238K)

Delete this right the fuck now.

>>low settings
Yeah but the OP pic is in fucking 720p!
Look at the video description
>big difference in performance against 2700x at low and medium settings

Look at the guy's other videos, in depth comparison of both 2700X and 8700K in BF1, shows about 30% gain on 8700K.

>how do you explain a lower clocked 2700X
>4% difference
>raping
lmao
>lets remove the CPU bottleneck and run the game maxed out on unrealistic absolute dogshit framerate to cover up the fact that Ryzens are garbage
>comparing 6 threads to 16 threads, knowing full well that BF1 utilizes 8-10 threads depending on the map.
The 8600K, despite having better single-threaded performance, is behind the 7800X because it has fewer threads.
Btw you're missing an 8700K in that chart, how convenient.

Doesn't even show RAM or their timings which have a significant impact on FPS. This is 720p so it's absolutely irrelevant anyways, most people who get high end CPUs/GPUs also have high end 1440p or 4K monitors too.

If you really want to shill intel to the max why not use a 480p benchmark like pic related?

Attached: 1492016705583.png (622x336, 14K)

Is running games on the lowest settings or lowest res really the only way intel can win now?

>buy $1,000 PC
>play on lowest settings
k

youtu.be/cAsyo8gIyys?t=6m1s

This is everything you need to know about Poozens. Also keep in mind that these are stock clocks and that 8700K has a bigger overclocking headroom.

That's a beta. Here's a refined benchmark.

Attached: poo in the joo.png (1824x1026, 431K)

>beta

Attached: 1538285528291.gif (1020x797, 3.15M)

>what is competitive gaming
>sacrificing performance edge for graphic gimmicks that obscure the field of vision
AMD fags are a bunch of casuals, who would've guessed

If you play single players like fucking normies you are then go ahead and get the Pajeetzen.

>Is running games on the lowest settings or lowest res really the only way intel can win now?
Is purposely inducing a GPU bottleneck via 4K resolutions the only way AMD can get close to Intel performance (all the while having an unplayable framerate to begin with)?

Dude, Intel is shit, get over it. Do you have buyer's remorse or something?

Attached: 1525554090644.jpg (1079x784, 161K)

>professional bibeo gaymen

Attached: Euphoric+_447d95e276a5b1dd56cac5e4a5d82b6c.jpg (1600x900, 357K)

/thread
Pack it up boys.

>That's a beta.
That's a cope

Wouldn't Intel CPUs also perform like shit if this was a software issue on Battlefield's side? Fucking delusional AMDtards lmfao, thinking everyone is out to get them and they need to protect their underdog

You have BF1 evidence just a few posts above, and BF1 is amazingly optimized, and guess what Poozen is still behind A LOT.

The only thing I have is $1000 in my wallet to upgrade from my old 5820K to 9900K the day it launches.

>4k_ultra_settings_benchmark.png
cope, most people still use 1080p and not maxing out their games so they could possibly run at 60+ fps, and AMD is shit at providing that experience better than Intel
>cue muh cinebench scores

> If you play single players like fucking normies
Normies play Fortnite, basement dweebs play VTMB.

>competitive gaymen
A fucking meme that needs to be banned. We're teaching future generations it's okay to play nintendo instead of going to college or even getting a job.

Attached: bd8.png (1190x906, 178K)

stop burning your daddy's money

>arguments

Sorry gramps, I need 1000 FPS in CSGO to stay relevant. You can get a Ryzen to run your Excel sheets.

this reminds me of the 720p Skyrim benchmarks from half a decade ago

That's fine and dandy but Intel is still shit dude, get over it. Do you have stockholm syndrome or something?

>muh bibio gaymes need to run hundreds of FPS at low 480 x 240 settings else my jobless uneducated bibio buddies can't circlejerk each other over FPS
>this is the only reason why people buy high end processors

Attached: 1531019938269.gif (320x240, 2.65M)

Why is it so hard to accept that intel processors are literal dogshit?
>"b-b-buh ma bibio gaymes"
How about you get a job like an adult, francis?

Attached: x265-2.jpg (600x600, 102K)

Who plays at 720p with those processors?

Not even CS:GO professional gaymers play on 720p. Apparently they overclock i7s to run at 10 GHz with an unlimited supply of liquid nitrogen and run their gaymes on low 240p settings to get 9001 FPS... on a 75Hz monitor like a pro

BTFO

>hundreds of FPS at low 480 x 240
Poorfags can't even afford a 144hz monitor let alone 240... all of the benches I showed were either 1440p or 1080p while OP (AMD shill) shows 720p with locked frame limiter at 200 fps kek

>everyone is suddenly a content creator that needs 64 threads because AMD is pushing MOAR CORES meme because they STILL can't compete with the core arch ONE DECADE LATER.
If you look at the OP, the topic actually is "bibeo gaymes" but AMD shills couldn't pass the opportunity to post some good ol multi threaded synthetics (not the topic of this thread).
Go shit in some other street Ranjit.

So they could fool people into thinking Poozen has good vidya performance.

>Not even CS:GO professional gaymers play on 720p.
Some still do, if not 720p then definitely something under 1080p. And since we're mentioning CSGO, look at that amazing Poozen performance. Truly a great budget buy, 16 threads are the future!

Attached: disaster.jpg.f7bbda2773c01ccaf282312e587da56e.jpg (1618x998, 428K)

>Intel auto-overclocks unless it's turned off, and has bigger headroom thanks to the chiller they used
>AMD can't overclock further and stays at roughly its rated TDP
Ok

Literally unplayable on my 200Hz monitor.

Based af

Attached: 1538259469940.png (300x577, 133K)

Ok but Intel is still shit dude, get over it. Do you have OCD or something?

Okay I'll bite: if you're gonna play video games like a child why get an i7 8700K? Since most games are optimized to run on 2 or 4 threads max why not get an unlocked i3 and OC the shit out of it?

The reason why we all recommend ryzen is because it's so energy efficient and low power that the free stock coolers they come with are more than enough to cool them even under heavy multi-threaded workloads. It can even do decent gaymen at 1440p like shown in . The cream of the crop is the abundance of threads you can use to do multiple things at once.

Attached: 13124123.jpg (525x525, 54K)

It's shit because they still haven't glued together 32 cores for cinebench subreddit karma?

>Okay I'll bite: if you're gonna play video games like a child why get an i7 8700K? Since most games are optimized to run on 2 or 4 threads max why not get an unlocked i3 and OC the shit out of it?
Cmon read the fucking thread faggot, most modern games are starting to use 6+ threads.

>The reason why we all recommend ryzen is because it's so energy efficient
A fucking meme. Why would some little shit playing Fortnite care about this? He only wants smooth gameplay. For data centers though, I agree EPYC is great.

>free stock coolers
I agree, also nice

>It can even do decent gaymen at 1440p
Decent - YES, best - NO
Are Ryzens great all around CPU? Yes
Are Ryzens great content creator CPU - Yes
Are Ryzens great gaming CPU - Ehh, not really
AMD fags recommending Ryzens for pure gaming builds is what grinds my gears here, like they're made by God himself and excel at every task, and if you buy Intel you're literally funding Greater Israel.

Fuck kikes but I'm not buying this chink shit either until I see Zen 2.

No they're not, not efficiently anyway.

It's not a meme to us if it means we don't have to pay $100+ for an AIO and higher electricity bill.

cool man but back to my main question: why not just an unlocked OC'd i3?

>smooth gameplay
Then why the hell would he get an intel cpu? Do you even know what frame stuttering is and how badly it affects input?

Attached: aHR0cDovL21lZGlhLmJlc3RvZm1pY3JvLmNvbS9YL0wvNzc0OTIxL29yaWdpbmFsL0Zhci1DcnktUHJpbWFsLUZyYW1lVGltZW9C (711x445, 90K)

>now we'll start streaming on our 8700k
>15fps

kek this, I don't even care about gaymen anymore as long as I can multi-task smoothly. i3 7100 to ryzen 2700 has been a godsend for me.

>pay $100+ for an AIO
I've only owned soldered Incels, aircooled Noctua is all you need

>higher electricity bill
Really nigga? It's like $2 difference per year.

>why not just an unlocked OC'd i3?
>4 cores/4 threads
Read the thread

Ok, something interesting finally. I'll have to look more into this.
Though that same review shows this massive difference only in this one game; there are some where Ryzens are worse and some that are pretty much even, with Intel still showing a higher average. Nice find.

8700K is fine for streaming with its 12 threads, BF doesn't even use all the threads and individual cores are faster anyway.

>The only thing I have is $1000 in my wallet to upgrade from my old 5820K to 9900K the day it launches.
lmao, enjoy your buyer's remorse when Zen 2 comes and annihilates it, while you're still on 14nm trash

I'm not asking the thread I'm asking you, child.

>enjoy your buyer's remorse
I won't thanks for the concern. I don't need 32 threads for daily usage and they'll barely come close to single thread performance anyway. I'd rather have that performance now than Wait(TM) 6 months.

That's because you're a child.

>I'm not asking the thread I'm asking you, child.
I'm not spoonfeeding you with an answer I already answered literally 3 posts above, fuck off nigger.
I'm not poor I can easily switch to BASED Zen 2 if it beats Intel. I only look at the performance TODAY, I'm not some deluded fanboy who feels the need to convince others to wait half a year for hardware with unknown performance.

BF1 uses up to 12 threads according to DICE

>It's shit because they still haven't glued together 32 cores for cinebench subreddit karma?
Funny you mention it, given that Cope Lake is literally intel gluing together 2x 28-core dies and it will STILL get stomped on by Rome.

>Cmon read the fucking thread faggot, most modern games are starting to use 6+ threads.
They really aren't. Most games still run the main game simulation on a single thread. Only a select few AAA games use that many threads.

>A fucking meme why would some little shit playing Fortnite care about this, he only wants smooth gameplay.
A Ryzen is more than capable of providing smooth gameplay. I don't care that the i5 8400 has frame times a single millisecond better, as shown in .

The vast majority of people play games at 1080p@60Hz, which is not a problem. If you have special needs like hitting 144 FPS and not dipping below that, sure, go Intel. All that matters to me if I play games is that it hits my FPS target and doesn't dip below it, and since I have a 1440p@60Hz display that isn't a problem.

Since I also do software development and a fair bit of compilation, it is quite nice that I have 8c/16t.

>Are Ryzens great gaming CPU - Ehh, not really
It all depends on your requirements. If you want 144Hz on 1080p, go ahead and use Intel. However, if all you care about is hitting 60 FPS at 1080p (most people have 60 Hz displays) then Intel or AMD really doesn't matter much, and it comes down to whether you want more cores or not.

The GPU is always the bottleneck at 1440p+ and so your choice of CPU doesn't matter at all anymore, and you can just save some bucks on CPU to get a better GPU.

Attached: cooper lake.png (1920x1080, 1.04M)

>The GPU is always the bottleneck at 1440p+
Not always the case. I had Ryzen bottlenecking my 1080 Ti at 1440p 144Hz, but the settings were low for higher framerate. I'd say only at 4K, or maxed-out 1440p, does this CPU battle become irrelevant.
Basically: Intel for high-refresh gaming; AMD is fine for 60Hz.

Protip: the CPU hasn't mattered for gayming for years outside of niches (Dwarf Fortress, some console emulators, and grand strategy games)

Ryzen is perfectly fine for gayming purposes. Anything above 100FPS framerate is pure placebo epenis bullshit.

So you're waiting for Zen 2, but Intel is still shit dude, get over it.

>Funny you mention it, given that Cope Lake is literally intel gluing together 2x 28-core dies and it will STILL get stomped on by Rome.
THIS CAN'T BE HAPPENING, THIS IS PURE ANTISEMITISM

Attached: 1527629778452.jpg (679x758, 54K)

Attached: Intel_tears.png (1066x600, 573K)

ITT: pedophiles developing the vapors over video games

>S-Stop pointing out f-facts! I counter with my memejutsu! H-H-Hiyaaaa!!!

Attached: absolutely.jpg (598x448, 111K)

>1280x720

It's okay, Shlomo. People still buy Intel.

Attached: haredi-intel.jpg (636x390, 35K)

>not 640x480

>benchmarking with video output in 2018

>The GPU is always the bottleneck at 1440p+ and so your choice of CPU doesn't matter at all anymore, and you can just save some bucks on CPU to get a better GPU.
This is at least somewhat true. Also I own a 1440p 144Hz monitor so I don't even care how well games run on some meme 480p resolution. I play games but I'm not a Professional Razer(tm) Gamer Powered by Asus Republic of Gamers(tm) and Monster Energy Drink(tm). I only need to know the 1440p performance since that's what I use. I think any benchmark under 1080p is pretty much false advertising now. You're not actually seeing the big differences that are promised.

...

>The GPU is always the bottleneck at 1440p+
That isn't a given. I have a 2700X and there are certainly games where it bottlenecks my (overclocked) 1080 Ti at 1440p. Assassin's Creed: Origins is something that I'm playing at the moment, and the 1080 Ti is held back a decent amount of the time, though not always. This is with all in-game graphical options maxed. And yes, I'm using 3200MHz RAM with extremely tight timings.

Attached: inb4 lying Intel shill.png (2560x1440, 330K)

I love how all the AMDfags will go on and on about productivity yet they spend their precious time here shilling. Let’s be honest if you were really that concerned about productivity you would not waste your time shitposting on Jow Forums about CPUs. I’m not sure what’s worse the intel gaymur muh niggahertz shills or AMD corez productivity shills. For fucks sake either will get the job done just fine

bf1 beta without key features, still in development, while the lead developer is talking about ways they can improve performance... yeah, great benchmark you got there.

>4k ultra
>cope, most people still use 1080p and not maxing out their games so they could possibly run at 60+ fps

ok... so let me get this straight: most people are at 1080p and so GPU-bottlenecked they don't even start to touch CPU-bottleneck territory, yet AMD offers 8 cores / 16 threads so it keeps up even when shit starts in the background, and it goes on sale at times for $220 (someone a few days ago bought a 2700 or 2700X new on a sale), and you're telling them don't do that, buy the non-upgradeable, more expensive option?

jesus christ did you earn the

youtube.com/watch?v=6mHUWvMDcMI

call me when your cpu can run 9 games that are slashing the first thread while playing cs go

Intel :
380$/200fps = 1.9 dollar per fps

AMD:
300$/200fps = 1.5 dollar per fps

Wake me up when the numbers are the other way around.
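The arithmetic above, sketched out. The prices and the 200 fps figure are the thread's own numbers (both CPUs sitting at or near BF1's frame cap), not official MSRPs:

```python
# Price-per-frame comparison: with both CPUs at the same capped fps,
# the ratio reduces to price alone.
def dollars_per_fps(price_usd, fps):
    return price_usd / fps

print(f"Intel: ${dollars_per_fps(380, 200):.2f}/fps")  # $1.90/fps
print(f"AMD:   ${dollars_per_fps(300, 200):.2f}/fps")  # $1.50/fps
```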

>seven
>twenty
>fucking
>pee

>bf1 beta without key features still in development while the lead developer is talking about ways they can improve performance... yea great benchmark you got there.
Battlefield 1 is two years old, brainlet. It came out long before even first-gen Poozen. You're thinking of Battlefield V.

>11 more fps is worth $205

$165 Ryzen rivals Pootel's $370 stutterfire LOL
>m-m-m-m-m-m-muh niggahurtz...
>m-m-m-m-m-m-muh b-die

Attached: 1538402637126.png (2224x1369, 1.1M)

>Update your BIOS
>Games are yet to be optimized for multi-threading to utilize all Ryzen cores
>You need higher end RAM
>But Ryzen is better bang for the buck and has bigger potential

To be clear here: that's 11 more fps when you're already pulling away past 120.

funny thing is, 30 to 60fps is massive
60 fps to 75 also feels massive
75 fps to 90 is the last area where it feels like a big jump happened

90 to 144, while nice, is a much smaller jump in quality.

I have no doubt that 144 to 240 is also smooth, but my understanding from people who have used framerates that high is that the extra jump is almost inconsequential.

My understanding of the 60-144 range is first hand, because I have monitors that do it and can vouch for it myself.
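One way to put numbers on that diminishing-returns feeling: look at how many milliseconds of frame time each jump actually saves. The perceptual claims above are subjective; the math here is just 1000/fps:

```python
# Frame time shrinks hyperbolically with fps, so the absolute milliseconds
# saved per frame drop off sharply at high refresh rates.
def frame_time_ms(fps):
    return 1000.0 / fps

for lo, hi in [(30, 60), (60, 75), (75, 90), (90, 144), (144, 240)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo:>3} -> {hi:>3} fps: saves {saved:4.1f} ms per frame")
```

For instance, 30 to 60 saves about 16.7 ms per frame, while 144 to 240 saves under 3 ms, which lines up with the "smaller jump" impression described above.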

>stutterfire
>Poozen has the lower minimums
>entirely GPU-limited benchmark
Why can AMDrones never argue from an honest position? Why is it always theatricality and deception? Perhaps because the objective benchmarks of CPU power always prove Poozen inferior? Perhaps because the objective benchmarks of GPU power always prove Pooga inferior? I wonder...

Anything above 60 fps is great. Anything above 120 fps is a marketing meme.

What clocks does your 1080Ti hit? 1934 seems way too low. Should be able to do ~1980MHz at 1v or less if you have an average bin, which also gives you the benefit of power throttling significantly less if at all.

They can't even compete with Ryzen, pootel is like 3-4 times slower than Ryzen. Seriously Intel is now core2duo tier, 4 intel cores = 1 zen core

Attached: untitled-3.png (672x794, 44K)

1934MHz is the rated XMP speed of the RAM kit I'm using (3866MHz effective), though obviously you can't hit those speeds with Ryzen. My 1080 Ti hits 2063MHz max with my current overclock on stock voltage. Haven't bothered doing anything else to it, as it's more power than I need even now.