Upgrade every year

>upgrade every year
>undervolt
>flash GPU bios
>overclock
>get fast (read magical) RAM
>expensive cooler
>extensive tweaking of their system
>play mental gymnastics that they dont need features in games/boycott games that implement them

>Just to prove to strangers on the internet that their hardware isn't inferior to intel and nvidia

How do AMDrones live with themselves?

Attached: 1534624564050.png (1228x1502, 1.07M)


You forgot:

>wait for zen2
>wait for Navi

The irony being that ReLive is vastly superior to ShadowPlay in terms of features.

But feel free to think corporations are your friends.

>Still worse single-thread performance than Incel CPUs from 2014

We get it, you guys have crippling buyer's remorse. It's not OUR fault you're too stupid to properly research things before buying them.

techspot.com/review/1829-intel-core-i5-9400f-vs-amd-ryzen-5-2600x/

techspot.com/review/1655-core-i7-8700k-vs-ryzen-7-2700x/

Attached: ZomboDroid 26042019075355.jpg (1324x2868, 415K)

How many days of tweaking did it take to get the Ryzen system working like that?
>still inferior to last gen intel

Attached: 1539633692038.png (623x808, 418K)

>Getting 3400MHz on Ryzen 5
Talk about golden samples; buy that CPU at retail and you'll be lucky to hit 4GHz on the core and 2800MHz on the RAM.

I forgot
>Cherrypick benchmarks all day, then autistically make screenshot compilations to show his hardware isn't inferior to intel and nvidia.

Doesn't it get tedious and boring to spin the same drivel EVERY day? I bought a pair of Flare X RAM sticks and got them to 3400MHz CL15 (The Stilt's timings) with a slight OC on a $75 ASRock B450 motherboard. On that SAME motherboard I got my 2600 to OC to 4.1GHz on a $30 Hyper 212 EVO.

newegg.com/Product/Product.aspx?Item=N82E16820232530

I actually used to own an Intel i5-7600 system myself btw and noticed a really hefty performance improvement moving to my AMD system. And sure, I could have spent a couple hundred more on a Coffee Lake system, but why would I do that when Zen+ systems are cheaper and 90-95% as fast?

Attached: Screenshot_20190504-072819(1).jpg (1183x664, 326K)

Post your CPU-Z / HWiNFO screencaps then, I legit want this to be true

I would, desu, and if this thread sticks around long enough I certainly will, but I have a morning shift to get to. See you guys around 3-5 PM.

>$170 for 16GB of RAM
>pays the same price for his CPU
You could have gone 8700K with 2400MHz RAM for the same price and more performance.

>buy Ryzen 1600x
>buy a literal $10 CPU cooler
>buy whatever general recommended RAM and Mobo
Muh gaymes play just fine and my videos encode, and I didn't fork over massive money to the blue jew.

You don't have to subject us to your buyer's remorse or whatever weird tribalism thing you have going on.

$170 RAM + $160 2600 + $30 212 EVO = $360 + $75 MB
vs
$100 RAM + $370 i7-8700K + $100 AIO = $570 + ~$150 high-end Z370 motherboard to handle the """""95W""""" TDP + 5GHz OC.

ALL that just for 5-10% better performance? Why would anyone do something THAT fucking stupid?

Attached: 1552525930413.jpg (1024x943, 89K)
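The build math above can be sanity-checked in a few lines; all prices are the ones quoted in this thread, not current retail, and the "5-10% better performance" figure is the poster's claim, not a measurement.

```python
# Sanity-check of the build totals quoted above (thread's prices, not retail).
ryzen_build = 170 + 160 + 30 + 75     # Flare X RAM + 2600 + 212 EVO + B450 board
intel_build = 100 + 370 + 100 + 150   # RAM + 8700K + AIO + Z-series board

premium = intel_build / ryzen_build - 1
print(ryzen_build, intel_build)       # 435 720
print(f"{premium:.0%}")               # 66% more money for the Intel build
```

So per the thread's own numbers, the Intel build costs roughly two-thirds more for a single-digit performance gain.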

Reminder

Attached: 1535238777119.png (2000x1543, 58K)

>3-5pm

Damn, the AMD marketing team can deliver golden samples this fast?

Same IPC as Skylake-X, actually. To be surpassed in 2 months btw.

Are you sure? See . Or was there a 0% IPC jump from Skylake to Kaby Lake to Cannon Lake?

>game
Are you retarded?

>Wait for Vega
>Wait for Polaris
>Wait for Fury

Games are notoriously inefficient at using more than 2 CPU cores, so it seems like a pretty good single-threaded metric to go by. Isn't that why Haswell crushed FX?

It is, but that's not the same thing as IPC.

Kek, they used up all of their golden samples for marketing and left the shit dies as 50th anniversary 2700Xs

How so? Generally you're only ever going to max out 2 CPU cores 80% of the time, and throwing more in there won't improve performance much, if at all (see a 16-core Threadripper vs a 4-core Intel CPU at gaymes).

You realize the GPU is the bottleneck at those resolutions where they got equal fps, right? You could put a 2500K there and the results would be the same.

>"The main event though is an 18 game benchmark using the GeForce RTX 2080 Ti at 1080p, 1440p and 4K."
So what you're saying is that a $1,200+ RTX 2080 Ti is a "bottleneck" even at 1080p, right?

You mean a 3200 CL14 kit and a well-known timing preset?

Skylake, Kaby Lake, and Coffee Lake are all the exact same architecture.

amirite jewcucks unite

Why not? As long as it prevents the CPU from being 100% utilized. Also, the C in IPC stands for clock cycle, not core.
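Since IPC is instructions per clock cycle, the usual back-of-the-envelope for single-thread speed is IPC × clock. The numbers below are made up purely for illustration, not measured figures for any real CPU:

```python
# Single-thread throughput ~ IPC * clock. Illustrative values only,
# not benchmarks of any actual part.
def throughput(ipc, ghz):
    """Billions of instructions retired per second."""
    return ipc * ghz

# A lower-clocked part can match a higher-clocked one if its IPC is higher:
print(throughput(1.00, 4.3))  # 4.3
print(throughput(1.08, 4.0))  # 4.32
```

Which is why "same clocks" and "same single-thread performance" aren't interchangeable claims.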

Not him, but if the software can't properly leverage the entire CPU (all cores/threads), it'll never reach 100% load. You'll technically be limited by 1 or 2 CPU cores. However, the way Windows does its task scheduling, you'll likely never see a single core being pegged at 100% unless the developer pins their threads, because left to its own devices Windows will constantly shuffle work from one core to another. But if the GPU isn't running at 100% (pro tip: it's not), then it's not the bottleneck, and something else in the system is.
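For the curious, "pinning" just means fixing a process or thread to a specific core so the scheduler can't migrate it. A minimal Linux-only sketch using the standard library (Windows uses a different API, SetThreadAffinityMask):

```python
import os

# Linux-only sketch: pin the current process to core 0. Without this call,
# the scheduler is free to migrate the process between cores, which is why
# per-core utilization graphs smear the load instead of pegging one core.
os.sched_setaffinity(0, {0})     # pid 0 = the calling process
print(os.sched_getaffinity(0))   # {0}
```

With the thread pinned, a single-thread-bound game would actually show one core sustained at 100% in monitoring tools.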

So what GPU would NOT bottleneck a CPU? Does such an alien artifact even exist?