$100+ more with no cooler included, and it's worse

INCEL
B
T
F
O

Attached: file.png (1082x766, 191K)


>Q4 2018 vs Q3 2019
>save $115 but have to buy expensive high-speed RAM and a 15W TDP actively cooled X570 motherboard that roars like a jet engine
>will shit the bed anytime an Intel-optimized program is run
>shit at AVX-512 and emulation

blunder of the century

Reminder

Attached: 1562006469183.jpg (700x5000, 1.83M)

Yeah, I'll give you that Incel is superior at:
Power consumption
Temperatures
Vulnerabilities
Price

Why do you retards think a 3600X needs an X570 motherboard?
I use a 2600X with a B350, running at 4.3GHz. Had it over 10 months now; works fine.

>30 bucks more for RAM = bad
>1000 bucks more for an industrial chiller = good

RAM speed does not matter, you retard

>>shit at AVX-512 and emulation
Cope harder.

Attached: 1561072080694.png (2560x1440, 1.99M)

This. I'll be getting a B550 too.

RYZUN NEEDS FAST RAM TO BEAT INCEL
mfw it works on bottom-barrel DDR4

Attached: file.png (854x917, 90K)

btw, all these benches and the PassMark ones didn't run at full core boost. Also, what memory did they use? Does someone have a link?

Attached: image-2.png (2836x1764, 429K)

>AMD
is that way

Just click on it and see, you fucking mongol

But we all know that:
1) That's wrong
2) If you want fast RAM, just buy Hynix shit; it works fine with Ryzen at high frequencies even on Zen 1 (since last May)

I'm asking for a link to the bench because I can't find it, brainlet.

>+1%
DESTROYED

Now try with Demon's Souls

Literally just open up a CPU list

Attached: file.png (1079x952, 191K)

>still hasn't posted the link but went through the trouble of posting a screencap

Why should he post a link for something you can find by going to the website and typing in the component's name? Do it yourself, you tard.

Go to the two archived threads that image was posted in. Discussion about Demon's Souls is there too; apparently it's working fine.
If a game works on Intel, it works on AMD too; any performance issues are game-specific.

>still hasn't posted a link but typed out a useless comment

>Single core within margin of error between 3600 and 9900
NOOOOOOOOOOOOOOOOOOO

Attached: 00015781.jpg (337x383, 45K)

fuck away pajeet

So this pajeet ran it underclocked with completely fucked BIOS and RAM settings, it seems, and it still scored that high?

Attached: file.png (503x349, 65K)

Google Images and Yandex have failed me; care to give a sauce?

Attached: 1455179394089.gif (259x282, 1.08M)

subtle

based

price doesn't matter

Where is the guy spamming the latency graph? I thought he was working 24/7?

>latency graph?
elaborate

oh yeah, I guess he's asleep still

>3 core with far memory controller vs. 8 core
It's 3 cores to minimize input lag, FYI. Imagine paying for 6 cores but only using 3 while gaming.

>input lag
well meme'd

Or I can just get a Bx50 mobo and do it cheap just fine.
Is there no end to Intcel cope?

Try it yourself.

where do you retards congregate?

Intcel shills are just trying to meet their quota in these five days. Hasbara is already hard at work creating those shill scripts for after the 7/7 holocaust.

Go into the BIOS and set it to 4+0.

And what does CCX-to-CCX latency have to do with input lag, you mongoloid?

>OS balancing load across two distant sets of cores
>FPS decreases
>interrupt latency increases
>input lag increases
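If anyone wants to actually test that chain instead of arguing: a minimal sketch, Linux-only, assuming logical CPUs 0-2 sit on one CCX of a 3600 (the layout is my assumption, not from this thread; confirm it with lscpu -e or hwloc), that pins a process to a single CCX so the scheduler can't bounce it across the fabric:

```python
import os
import subprocess

# Hypothetical CCX layout: logical CPUs 0-2 on CCX0 of a 3600.
# Verify with lscpu -e or hwloc; Windows would need SetProcessAffinityMask.
CCX0_CORES = {0, 1, 2}

def run_pinned(cmd):
    """Launch a command and restrict it to one CCX."""
    proc = subprocess.Popen(cmd)
    # sched_setaffinity is Linux-only: it limits which logical CPUs the
    # kernel scheduler may place this PID on (set just after launch, so
    # the first few milliseconds may still run elsewhere).
    os.sched_setaffinity(proc.pid, CCX0_CORES)
    return proc

# Compare frametimes of a benchmark run pinned vs. unpinned:
run_pinned(["./my_benchmark"]).wait()   # placeholder binary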

>It's 3 cores to minimize input lag, FYI. Imagine paying for 6 cores but only using 3 while gaming.
Input lag is a meme. Core latency and RAM latency can drop FPS massively (and that's the main reason why Ryzen loses in gaming and why they've doubled the L3 cache), but it's not something that's noticeable in terms of input latency. At worst you'll feel the FPS drop.

AHHAHHAHAHAHAHAHAHAHAHAHAHAHA
*WHEEZE*
AHAHAHAHAHAHAHAHAHAHAHHAHSHAHAHAHAA
THE "TECH ILLITERATE STREET SHITTER" MEMES ARE FUCKING REAL!
AHAHAHAHAHAHAHAHAGAHAGAGAHAHAHAHAHAHAHAHAHAHAHAHAHGA

Input lag always increases when FPS decreases.
So please tell me where you retards congregate, because some people on this board parrot this retardation about certain measured memory latencies automatically leading to significantly more input lag.
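For scale, some back-of-envelope arithmetic (illustrative numbers of my own, not measurements from this thread): a nanosecond-range latency delta is about six orders of magnitude below a frame budget, which is why any real effect shows up as lower FPS rather than as a separate "input lag" term:

```python
# Illustrative scale comparison -- the 30 ns delta is an assumed gap
# (e.g. ~70 ns Zen 2 I/O-die latency vs ~40 ns monolithic), not a benchmark.
mem_latency_delta_ns = 30

for fps in (60, 144, 240):
    frame_ns = 1e9 / fps
    # Millions of (overlapping) cache misses happen per frame; an extra
    # 30 ns on each shows up as lower FPS, not as a separable lag term.
    print(f"{fps:>3} FPS -> {frame_ns / 1e6:6.2f} ms/frame; "
          f"30 ns is {mem_latency_delta_ns / frame_ns:.1e} of a frame")
```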

You absolutely will notice it, go try it yourself instead of mentally masturbating on reddit.

>1. You realize you don't have to buy an X570 mobo, right? You can use a $40 B350...
>2. You don't have to buy more expensive RAM

>Sure, anyone who optimizes just for one specific architecture will see better performance on that architecture. That's terrible software development, and major software will not be optimized just for Intel.

are you even getting paid for this?

You *DO* recognize the character? Search for a parody of that anime on any hentai site, especially sad panda; you'll find it in 2 minutes.

No, just sharing my findings. I can't lead a horse to water though.

But I heard that applications only use one core and that 4 cores is all I need? Did you lie to me before? Or are you lying to me right now? Or is it both?

kek it's currently literally the first result when you search the anime on exhe

>Where is the guy spamming the latency graph? i thought he was working 24/7?
He's overdoing it, but quite frankly memory performance looks like shit even in better scenarios. Looks like 70+ ns latency is normal now (can't say I expected anything else with the I/O die).
Here are a few 3200, 4000, and 4266 results:
userbenchmark.com/UserRun/17880340
userbenchmark.com/UserRun/18045555
userbenchmark.com/UserRun/17740648
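For anyone wondering where numbers like that 70 ns come from: latency benchmarks time a dependent pointer chase through memory. A rough sketch of the idea (Python's interpreter overhead makes the absolute numbers meaningless; real figures come from AIDA64-style tools or a C loop, but the cached-vs-DRAM gap still shows):

```python
import random, time

def cycle_perm(n):
    # Sattolo's algorithm: a single n-cycle, so the chase visits every slot
    # instead of getting stuck in a short (and therefore cached) loop.
    p = list(range(n))
    for i in range(n - 1, 0, -1):
        j = random.randrange(i)
        p[i], p[j] = p[j], p[i]
    return p

def chase_ns(n, steps=2_000_000):
    perm, i = cycle_perm(n), 0
    t0 = time.perf_counter()
    for _ in range(steps):
        i = perm[i]          # each load depends on the previous one
    return (time.perf_counter() - t0) / steps * 1e9

print(f"cache-resident: {chase_ns(1 << 10):5.0f} ns/step")
print(f"DRAM-bound:     {chase_ns(1 << 22):5.0f} ns/step")  # ~32 MB of slots
```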

>and a 15W TDP actively cooled X570 motherboard that roars like a jet
cope

Attached: jns90zhx3o731.png (1142x4398, 610K)

Memory latency doesn't matter if the results are fine.
You people are bitching about a car whose engine runs at 0.7× the RPM of another car while still being faster than the car at higher RPM, all while costing half the price and using half the gas per unit of distance.

oh no it's retarded

I most certainly will not, since I've only got a 75Hz monitor. But maybe some pros do notice it at 100+ FPS. I won't argue here because I obviously can't try it myself.

Northbridges on some Intel boards used to run way over 15W and nobody complained; we just swapped them out for huge-ass passive coolers and it was fine. Suddenly this is a huge problem, and nobody understands that you don't have to use X570 to get good performance with the 3000 series. And if you do use it, you can probably afford a passively cooled board anyway; if not, you're probably smart enough to put a passive heatsink on it yourself.

>findings
funny how the retards mentioning input lag are always either trolling or unable to substantiate anything
(probably because they don't actually understand how things work, and it literally cannot be substantiated on account of it being literal nonsense)

Testing with a 2600, disabling half the physical cores gives much worse game performance in modern games. Same with SMT.

Attached: 1530669511670.png (1500x1392, 1.63M)

>my findings
So you already have Zen 2 chips? And you're shitposting on this African basket-weaving board with cropped and pozzed el chupacabra screenshots instead of posting your "findings" or putting them on jewtube? LMAO.

Modern games use 4 cores, so of course you'll get worse performance. Latency will definitely decrease with a CCX disabled. The latency difference from SMT is negligible; the FPS increase, however, is substantial when CPU-bound.

Oh okay, so it's just a meme. Thanks.

There's something fucky going on with this RAM; it seems to be underclocked or otherwise messed up.
If the results are like this with bad RAM or a beta BIOS or whatever the fuck is going on here, this chip is going to be an absolute monster.
I can't wait to see the proper benchmarks, especially with the higher-end parts. I wouldn't be surprised at all if AMD actually became the single-core king with this lineup.

Attached: ede9582bb2dd595c392a4988662585eb.png (1424x272, 51K)

>expensive high-speed RAM
Samsung C-die costs $50 at most and easily reaches 3200.

See:
the boost is also only 3.9GHz instead of 4.4
the 3600 tests have it at 4.02 instead of 4.2

There was a good 3600 test with 3200MHz RAM + normal clocks on PassMark, but it got deleted.
The score was 21333 or similar.

Attached: 1497949036509.png (295x310, 19K)

>buys 3+3 core
>complains about being cornered in

>No, just sharing my findings

Attached: 1491873481051.png (860x650, 48K)

Best score so far

Attached: mem.png (1423x195, 41K)

It'll be embarrassing if the 3600 can clock as high as a 9700K. If it can't, which is likely, Intel will still come out on top, albeit at a much higher price.

it's not about clockspeed

just give up
post measurements or fuck off

>only muh jigahurtz matter!
By that logic, the highest-end intelaviv would be BTFO'd by an FX chip.

This. I have a 2700X myself but WAIT FOR USER REVIEWS YOU RETARDS.
>Imagine believing marketing companies
yikes

>kiketel shills are THIS desperate
Yikes!

>says the shill
Oh yes, no company has ever lied before, or been misleading.

The talk is that Zen 2 has similar or better IPC than whatever Lake the 9700K is. That leaves clock speed as the main point of differentiation for performance. Zen 2 most likely won't clock as high as Intel, at least not in this first wave of chiplets. Fuckin' nasty chiplets, m8. Wait for 5GHz+ next year.

>8 core
>8 threads
OHNONONO what year is it?

What do you need more than 8 threads for?

tested on a [email protected], 3200 CL11

Attached: ryzen in a nutshell.png (564x100, 12K)

>bought a 9700K two weeks before the AMD announcement
a-at least it does 5.2GHz

Attached: 1558224706007.jpg (1439x1439, 97K)

>the boost is also only 3.9 instead of 4.4

I'm actually a bit skeptical about that part, because these CPUs would utterly crush Intel if that were the case.
From 3.9 to 4.4 is a 12.8% increase, so we'd be looking at a single-core score of 161, and that's with fucked-up RAM. Quickly looking at the recent benchmarks, the 9900K at 5.2GHz gets 161 points. That sounds too good to be true to me: a cheap stock 4.4GHz chip trading blows with a 5.2GHz monster.
Also, if this score scales linearly, then at 4.7GHz we'd be looking at an SC of 172. If the chips hit the magical 5GHz, then it's 183 points.

I'm going to keep my expectations somewhat in check; in the best-case scenario I expect the higher-end AMD parts to inch slightly above Intel's best in SC.
Anything more is one hell of a plus. Either way, in a few days we're finally going to know the real performance. Can't wait to see the proper benchmarks.
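The linear-scaling arithmetic there checks out; a quick sketch reproducing the projections (the ~143-point base is back-derived from the 161-at-4.4GHz figure in the post, not an independent measurement):

```python
# Reproduce the post's linear frequency-scaling projections.
# Base: the leaked run's implied single-core score at 3.9 GHz,
# back-derived from "161 points at 4.4 GHz" above (~142.7).
base_ghz = 3.9
base_score = 161 / (4.4 / base_ghz)

for ghz in (4.4, 4.7, 5.0):
    projected = base_score * (ghz / base_ghz)
    print(f"{ghz:.1f} GHz -> ~{projected:.0f} points")
# 4.4 GHz -> ~161, 4.7 GHz -> ~172, 5.0 GHz -> ~183
```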

I bought one at launch; have some regret, but not really. If it's on par with Intel, I couldn't care less.

these are microseconds
microseconds
does this correspond to actual increased input lag?

How did you get CL11 at 3200? I'm stuck at 14.

Yeah, and hundreds of interrupts happen a second. Plus the interrupt-to-process latency is much higher than the DPC latency, so it's much higher than a microsecond with 8c16t. Everything gets more vague when the latencies are that high. My laptop gets 0.32µs, for reference.
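Putting those magnitudes side by side (illustrative arithmetic; the 8 µs figure is an assumed pessimistic DPC reading, not one posted in this thread): even thousands of microsecond-scale interrupts per second add up to a small fraction of one frame's budget:

```python
# Scale check (assumed figures: 1000 interrupts/s, 8 us per DPC -- pessimistic
# versus the 0.32 us laptop reading quoted above).
interrupts_per_sec = 1000
dpc_us = 8.0
fps = 144
frame_ms = 1000 / fps                      # ~6.94 ms budget per frame

handler_us_per_frame = interrupts_per_sec * dpc_us / fps
print(f"~{handler_us_per_frame:.0f} us of handler time per {frame_ms:.2f} ms frame")
# -> ~56 us per ~6.94 ms frame, i.e. under 1% even with pessimistic numbers
```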

Jesus fuck Intel is absolutely dead.

1.6V VDDR

Can anyone explain why retards still buy Intel while Ryzen is out? Some have said they were unsure about AMD, and the two "le gamers" I know that have Ryzen hate it.
>single-channel RAM
>other has 2133MHz RAM
Clearly Intel is better for normies

ha, everything he said is wrong.

>2133MHz RAM
Tell them to at least use XMP profiles

Intel fanboys are legitimately mentally ill

He doesn't want to go into the BIOS.
The other bought a single 16GB stick because it was cheaper, with no intention of getting another.
Both regret getting Ryzen because their performance is shit, but by their own doing... Ryzen isn't very user-friendly when it comes to getting top-tier performance.

This. Latency means shit when it's already performing better than the competition.

Streaming games. h264 encoding scales up to 18 threads (16 threads is more stable, though). So with 6-8 threads for the game and 16-18 threads for perfect stream quality, we've got a ways to go :). Fuck two-PC streaming setups; it's objectively a worse experience for both viewer and streamer.
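A minimal single-box sketch of that split (hypothetical core counts, capture source, and endpoint; the ffmpeg flags themselves are real), giving the encoder its own threads and cores so it stays off the game's:

```python
import os
import subprocess

# Hypothetical split on a 12c/24t part: game keeps logical CPUs 0-7,
# the x264 encoder gets 8-23 (16 threads, per the post above).
ENCODER_CPUS = set(range(8, 24))

cmd = [
    "ffmpeg",
    "-f", "x11grab", "-video_size", "1920x1080",          # Linux screen capture example
    "-framerate", "60", "-i", ":0.0",
    "-c:v", "libx264", "-preset", "medium",
    "-threads", str(len(ENCODER_CPUS)),                   # cap encoder threads
    "-b:v", "6000k", "-f", "flv",
    "rtmp://live.example/app/STREAM_KEY",                 # placeholder endpoint
]
proc = subprocess.Popen(cmd)
os.sched_setaffinity(proc.pid, ENCODER_CPUS)  # Linux-only; keeps x264 off the game's cores
proc.wait()
```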

can confirm

Okay, but how would you explain the boost being reported wrong (lower) across different runs, tests, and programs?

Also, why did the good pissmark bench get deleted? Can users do that?

>does this correspond to actual increased input lag?
None that you can measure without a 10000 FPS camera to get a result outside the margin of error.

>RAM speed doesn't matter
What a lie. That was a big deal during the Principled Technologies debacle.

The fuck are you talking about? I still use an FX-8350. These are some "retard le' gamer" friends who bought Ryzen and "built it themselves".

>streaming
Okay, but anything else? You're referring to 0.001% of the PC userbase... Besides, we need fewer retards streaming anyway.