AMD BTFO

Here's your performance graph bro

Attached: lies damned lies and silly graphs.jpg (1411x786, 216K)

HOLY SHIT GREEN BAR BIG
RED BAR NOT AS BIG

SEETHING incel cope

Attached: 1563484213297.png (640x480, 76K)

dat y scale do

2 fps to 7 fps difference at a $100 lower price point and with a shitty blower... what's your point?

What's the margin of error on these fps tests? Factor that in and these minute differences look even closer.

>what is scaling
Bigger bars are better
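since apparently nobody gets why scaling matters, here's a rough matplotlib sketch (FPS numbers are made up, roughly the ones thrown around itt) of how a truncated y-axis turns a ~4% gap into what looks like a blowout:

# Same two bars, zero-based vs truncated y-axis. Hypothetical numbers.
import matplotlib.pyplot as plt

cards = ["RX 5700 XT", "RTX 2070 Super"]
fps = [94, 98]  # made-up averages for illustration

fig, (ax_fair, ax_zoomed) = plt.subplots(1, 2, figsize=(8, 4))

ax_fair.bar(cards, fps, color=["red", "green"])
ax_fair.set_ylim(0, 110)        # zero-based axis: bars look nearly identical
ax_fair.set_title("y-axis from 0")
ax_fair.set_ylabel("FPS")

ax_zoomed.bar(cards, fps, color=["red", "green"])
ax_zoomed.set_ylim(90, 100)     # truncated axis: the same ~4% gap looks like 2x
ax_zoomed.set_title("y-axis from 90")

plt.tight_layout()
plt.show()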

poorcucks lol....unbelievable.

You can't save everyone from the kikes. Some people just want to live the lie because they are subconsciously aware of how shitty the reality is.

I'm disappointed they are not measuring this in 0.0001 fps increments.

based 3600

Attached: PNG.png (883x496, 10K)

BUT THE PRICE/PERF RATIO

Monster Hunter World is a fucked port (as all Capcom ports are).
The animations/physics/etc run internally at 60fps or even 30fps.

All these cards can do 90fps or more, so who cares?
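for anyone wondering what "runs internally at 60fps or even 30fps" means: the usual pattern is a fixed-timestep loop where the sim ticks at a fixed rate and rendering just runs as fast as it can. rough generic sketch, not Capcom's actual code:

import time

SIM_DT = 1.0 / 30.0      # internal physics/animation tick, e.g. 30 Hz

def game_loop(update, render, duration=5.0):
    # Generic fixed-timestep loop: simulation advances in fixed steps,
    # rendering runs as fast as the hardware allows.
    accumulator = 0.0
    previous = time.perf_counter()
    deadline = previous + duration
    while time.perf_counter() < deadline:
        now = time.perf_counter()
        accumulator += now - previous
        previous = now
        while accumulator >= SIM_DT:
            update(SIM_DT)   # physics/animation step at a fixed 30 Hz
            accumulator -= SIM_DT
        render()             # at 90+ fps this mostly redraws the same sim state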

WOW WOW AMDRONES ON SUICIDE WATCH ! BIG GREEN BAR IS CLEARLY SO MUCH HIGHER

OH NO NO NO NO NO HOW CAN AMD EVER COMPETE BROS

DUUUDE THAT'S A BIG FUCKING BAR
BIG GREEN BAR
MY PC IS FOR #BIGGREENBAR'S ONLY
#BGBSUPERIORITY
I THINK I'LL GO BUY AN RTX 2070 SUPER RIGHT NOW

Are you retard user?

nvidia btfo

Attached: monster-hunter-world-1920-1080.png (500x650, 44K)

What BTFOing is happening here? It's barely beating a 3-year-old card, it's slightly beaten by its main competitor, and it's utterly blown out by Nvidia's higher-end cards.

>main competitor
>costs $100 more

That's a top-of-the-line card with a billion more transistors and a thousand more shader units, you brainlet

>dat vega vii

What a messy card

CUSTOM COOLERS FOR 5700 XT WHEN??

MSI's shitty one and ASRock's interesting one have leaked, look it up
If you can't wait, Morpheus or washer method

Yeah, I'm not interested in ASRock's stuff for at least another year or so, until the verdict's in on whether it's garbage like their low-end motherboards or decent like the odd good board they make.

>msi
>amd
lol, just as bad as Gigabyte. I'll wait till Sapphire announces something.

I don't really have anything comparable to measure this setup against.

But looking at it, with such a "shitty" CPU, is it really necessary to have a better one unless the specific game you're running is CPU-intensive?

Attached: Screenshot-2019-7-22 Phoronix Test Suite Results.png (1268x1470, 117K)

>3600x $249
>5700xt $399

So I can play Monster Hunter World @4K 94 fps? That's a pretty fucking good deal.

Attached: 1424900150290.png (1107x803, 66K)

I knew I had one of these somewhere

Attached: 1392523435491.png (1279x710, 144K)

Why is no one mentioning how it appears the 9900K is gimping the 5700 XT?
With a 3700X it gains 4 FPS over the 9900K, while the 2070S only loses 1 FPS.

DUDE 3 FPS HOLY SHIT

weird how the 5700 is slower on Intel than on AMD

Attached: this thread.png (879x549, 7K)

>4k 94fps
OP pic says 1440p without DLSS for the Nvidia card, or upscaled and CAS-sharpened for AMD's
tbqh MH:W already has many low-res textures so DLSS blurriness won't be too bad, waiting for the update

>75% of the graph is margin of error stretched out
holy fucking shit, can you tell which idiot did this piece of work (it was literally paid for) so I can add them to my list of scum people? thanks

That's worse than op imo.

According to TPU, the 2070 super is "the quietest card they have EVER reviewed". Compare that to the housefire jet engine that is the 5700xt.... Lord knows how bad the 5800 is going to be.

lmao these ones are missing the Tom's Hardware logo

so what you're saying is that the 5700 XT + 3600X is the ultimate gaming combo for the price? good to know I made the right choice.

you know DLSS and CAS are only for 4K displays, right? they upscale to 4K from 1440p
at least you can use AMD's sharpening at 1440p, which looks pretty decent imo
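the raw pixel math, if anyone cares (just the resolutions, nothing vendor-specific):

native_4k = 3840 * 2160          # 8,294,400 pixels
qhd_1440p = 2560 * 1440          # 3,686,400 pixels

print(native_4k / qhd_1440p)     # 2.25x the pixels for the upscaler to fill in
print(3840 / 2560, 2160 / 1440)  # 1.5 1.5 -> per-axis scale factor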

ABSOLUTELY BTFO

Why would anyone buy the 3700x for gaming?

Can anyone speculate how well a 3900X would run Arma 3 at 1440p?

Someone post "5700 is higher number than 2060" chart

Attached: 1560272812074.jpg (480x853, 141K)

OP here. People keep saying I don't understand misleading bar graphs; oh, the irony. The graph was made by Wendell btw, and he acknowledged his mistake.

nice y-axis, bro

>9900k + 2070 super = $950
>3600x + 5700 xt = $650
>32% cheaper, 4% slower
It's a fantastic deal, there's no reason to buy anything else.
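checking that math with the totals and the rough FPS figures quoted itt (approximate, single-run numbers, so take them loosely):

intel_nvidia_cost = 950          # 9900K + 2070 Super, total as quoted above
amd_cost = 650                   # 3600X + 5700 XT, total as quoted above

savings = (intel_nvidia_cost - amd_cost) / intel_nvidia_cost
print(f"{savings:.0%} cheaper")  # 32% cheaper

fps_2070s, fps_5700xt = 98, 94   # rough averages floating around this thread
deficit = (fps_2070s - fps_5700xt) / fps_2070s
print(f"{deficit:.1%} slower")   # 4.1% slower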

You're ignoring price because you don't understand the subject. Plenty of people can't buy a 2070 Super or similarly priced cards because they're poor. Paying $100 more when your minimum-wage job can't pay you more than $7.50/hr is a big deal.

In a lot of cases, yes. Look at the 6700K vs 8700K benchmarks from a few years ago, the ones with the Titan Xp or 1080 Ti.

The games that are actually GPU-limited still performed better, because the CPU overhead was removed by the far superior 8700K. That let the GPU sit at 95-100% utilization and push frames out faster for higher FPS. Some games gained as much as 25 FPS.
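crude way to model it: each frame costs roughly the longer of the CPU's work and the GPU's work, so a faster CPU only buys FPS while the CPU is the bottleneck. toy numbers, not a real profiler:

def fps(cpu_ms, gpu_ms):
    # Toy model: the frame ships when the slower of the two finishes.
    # Real pipelines overlap CPU and GPU work, so treat this as a sketch.
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_ms = 10.0                           # GPU alone could do ~100 fps

print(fps(cpu_ms=12.0, gpu_ms=gpu_ms))  # ~83 fps: slow CPU holds the GPU back
print(fps(cpu_ms=9.0, gpu_ms=gpu_ms))   # 100 fps: CPU out of the way, GPU pegged
print(fps(cpu_ms=7.0, gpu_ms=gpu_ms))   # still 100 fps: extra CPU speed buys nothing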

Why 3600x over 3600? Only real difference is what cooler it comes with. Might as well shave off another 50 bucks there.

Intbeciles will pretend the wanton waste of money means you're not "poor".

kek came for this

cooler is cooler

Yeah.. what happened there?
t. Radeon vii owner

The cooler isn't worth $50. You can get a Mugen B instead.

>Radeon vii owner
why?

It's a shit card, sorry you bought one

HBM at 1 TB/s and 16GB sounded cool
I imagine it can be put to use for some heavy GPU compute
Haven't done anything meaningful with it tho
Still procrastinating

>kfc
Because it starts to smell like KFC after 5 minutes under load?

better-binned chip; you probably won't get lucky and make a 3600X out of a 3600 at reasonable voltage
realistically, at 1440p it will be a ~10% difference in min fps at stock, so if your choice is a better video card or a better CPU, get the better video card

Fellow VII owner, the card overclocks pretty well on liquid. 2100MHz is the highest I've taken mine on an EKWB block.

>buying reference
alright user

>9900k
>rx 5700xt fps: 91
>2070s fps: 98

>3700x oc
>rx 5700xt fps: 95
>2070s fps: 97

the fuck? why did the 5700 jump 4 fps with a cpu switch?

because it's within margin of error for averages when you don't know how long the test has been running
for true averages you'd need to play the game for at least an hour, but nobody would do that, it would require actually working on reviews
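if reviewers actually published the spread you could tell. something like this is all it takes, with made-up samples just to show the idea:

# Mean and ~95% confidence interval over repeated benchmark runs.
# The FPS samples below are invented purely to illustrate the point.
from statistics import mean, stdev
from math import sqrt

def summarize(name, samples):
    m = mean(samples)
    # ~95% CI assuming roughly normal run-to-run variation (1.96 * std error)
    ci = 1.96 * stdev(samples) / sqrt(len(samples))
    print(f"{name}: {m:.1f} fps +/- {ci:.1f}")

summarize("RX 5700 XT", [91, 96, 94, 97, 92])
summarize("RTX 2070 S", [98, 95, 97, 99, 94])
# These intervals overlap, so a 3-4 fps gap from a single run means very little.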

>Card that is 3/4 the price performs 2-7% worse
oh no

Extra zoomed in graphs should be banned

Attached: 357635afe25f38a98ab15ecc5a87e9e610a1984d156dc2f4205c007fc1d760d0.png (550x393, 310K)

Wendell, can you please hire somebody to explain your content, holy fucking shit.