Untitled

≧ω≦

Attached: 0af712b7-ad31-4e55-b605-fd326c3aac1a.png (1000x460, 813K)

>ballsack

Attached: SoyFace.png (53x38, 2K)

:O

>mfw buying 3900x and 3950x after selling the 3900x

Attached: are you retarded.png (420x358, 245K)

>Getting this beast the moment it comes out.
>Mfw thinking about the performance gains over my 5820k.

Attached: 1548994500965.jpg (760x718, 72K)

I don't have any proper use for this kind of performance, so I won't purchase it.

Why would you buy the same CPU (plus an additional one) after selling it? :-)

>Available September
Literally another 9/11 for Intel.

Oooooof. I hope you enjoy coarz.

Pic related.

Attached: nextholocaust.png (1021x503, 544K)

Cores don’t matter

>5820k
My nigga. Most underrated CPU from Intel's lineup in the last 10 years, easily. At the time it was a great deal, but man, are you going to cream yourself with 16 cores. Going from 4 to 8 was already mind-blowing for me; I can't imagine the amount of resources I'll have with 16.

Anyone have their cope post on how all cores are not equal?

I definitely will.
This new CPU should be good for the next 5 years, easy. Especially since I'll be doing 3D and other creative work.
Realistically it's probably going to stay relevant until it dies of a hardware failure.

I see people with these processors every now and then, but it should really be right up there as the price-to-performance king of Intel CPUs.
It offered the most longevity of any Intel chip in the last decade and was very reasonably priced, but it somehow flew under the radar for a ton of people. I guess when it came out there wasn't exactly a need for most folks to upgrade, since Intel had been putting out good CPUs before it, so it was kinda ignored.
I don't think the Ryzen lineup would have done nearly as well had the 5820k been more popular. I guess it's good that it wasn't. AMD got more resources from selling their chips and produced this monster lineup.
I'll be looking at something like +30% single-core and +300% multi-core uplift when I upgrade. Can't wait to make the switch.

Problem with Haswell-E CPUs was the HEDT platform they were stuck on because of Intlel. The upfront motherboard cost was just too much for most people, so they opted for Haswell Refresh instead. Plus DDR4 costs were way steeper compared to DDR3.

Attached: Absolutely savage.jpg (2220x1080, 508K)

>6c/12t
>TDP 140W
What the fuck

> mfw already own 16-core ripper w/ 128GB ram
> tfw ramlets
> tfw pcie-slotlets
Paid less for my ram and processor than brainlets will pay for 3950x

Yeah, but it's lower clocked, with inferior IPC and less cache.

user, there is no world where you bought both RAM and CPU new and paid less than $750 (unless you live in a Microcenter, and even then you're a stupid cunt comparing prices). A 1950x can be had for $500 almost everywhere, and 128GB of DDR4 is expensive. I'd love to have more lanes to maybe add more SATA ports, a capture card, and another GPU for passthrough and whatnot, but I don't think it's time to move to the workstation platform when so many things still need to land in future Zen iterations: mainly DDR5, PCIe 5.0, more than 2 threads per core, vertical stacking to bring memory closer than ever, and exotic chiplets unique to workstation.

GaymersNexus: AYYMD IS JUST AS VULERNALBRELELRELL AS INTEL

Attached: 1539296460877.jpg (474x415, 37K)

tfw poorfag so im getting a 3700X

Attached: 1560218217657.png (2560x1292, 3.61M)

this kills the incel

Attached: 1558920002589.png (1260x709, 781K)

That's fine. Honestly, the 3700x is probably the best buy to cover both gaymers and users who utilize moar cores. I'm very happy with a 2700x, and I'm pretty sure no one will be disappointed with a CPU that can match the 9900k for $150+ less.

>6c/6t bulldozer
>666TDP
>same era

You have absolutely no argument here.

Having doubts a 3800x is going to match this once unbiased reviews are out.

Attached: haha.jpg (1421x447, 119K)

my boi. I intend to do the exact same thing
>3900X and Asus WS-Ace X570 motherboard
>When 3950X releases, get that and the Asus ROG Crosshair Hero X570 (maybe X590?)
>Pick up 2x16GB of DDR4 ECC for the 3900X and WS board to allocate as my new server.

FUCKING HYPE

The FX-8350 had a lower TDP than that. 140W TDP for a 6-core on 22nm is crazy.

...

What about Ryzenfall?

To be fair, to make the 8350 actually competitive, you had to blow past its rated TDP.

t. guy who owned an FX-9370 @ 5.5GHz.
Are you trying to say the 3800X won't beat the 9900k? While comparing the 9900k to the R5 3600? The fact that the single core is only a little behind the 9900k is astonishing coming from their older 14nm/12nm stuff. The non-X 3600 has lower clocks and lower boost. The 3800X has more than that, better binning, and more cores for $400.

>AMDFlaws (3/12/2018) - Publication Summary Coming Soon...

What are stock office images again?

>1645 multi core score for 9900k

>987 multi core score for 3600

I don't think adding 2 more cores and adding MHz is going to make the cut. Its single-core speed is maxed out because of XFR. It's not going to get any better, and after 2 cores are loaded you can kiss that single-thread perf goodbye.

quit being reasonable retard

>sample size of 4
I'm waiting for a statistically significant result, as should anyone who isn't an early adopter (sucker).
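Quick sketch of how wide the error bars are with only four runs (hypothetical scores, plain Python, 95% t critical value for 3 degrees of freedom hard-coded):

import math
import statistics

runs = [1645, 1612, 1660, 1598]        # made-up Geekbench runs, not real data
n = len(runs)
mean = statistics.mean(runs)
s = statistics.stdev(runs)             # sample standard deviation
t_crit = 3.182                         # two-sided 95% t value for n-1 = 3 dof
margin = t_crit * s / math.sqrt(n)
print(f"{mean:.0f} +/- {margin:.0f}")  # roughly 1629 +/- 46, so a small "lead" can vanish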

Why would you want two dies on a "gaming" processor? You can't even force NUMA on Windows.

Even the 64c EPYC is UMA thanks to the I/O die.

The original TR shipped as UMA too, but it gave you an option to switch to NUMA in the BIOS because it wasn't actually uniform under the hood.

OK? The original Threadripper integrated the memory controllers into the CPU dies; with Zen 2 the IMCs are on the IO die, so the whole thing is one NUMA node.

And you still have to use the IO die to communicate from die 0 to die 1 so it would benefit many applications to force NUMA.

That's not what NUMA is. Anyway, that's the point of the Windows scheduler update: Windows is now aware of the topology and will group related threads on the same CCX or the same die if possible.
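If you don't trust the scheduler you can still pin things yourself. Minimal sketch with psutil; the core numbering here is an assumption, check your own topology first:

import psutil

# Assume logical CPUs 0-7 belong to the first CCD/CCX on this hypothetical Zen 2 chip.
# Actual numbering varies by CPU, board, and OS, so verify before copying this.
first_ccx = list(range(8))

p = psutil.Process()           # current process; pass a PID to pin something else
p.cpu_affinity(first_ccx)      # all of its threads now stay on those logical CPUs
print(p.cpu_affinity())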

Can someone please fucking explain how
>8c 4.5ghz
>12c 4.6ghz
>16c 4.7ghz

All have the same TDP of 105w? Is this Jewish magic?

...

No, but NUMA was the best way to do it. Even with the scheduler update the interrupt latency will not be as good as a single die (or fully monolithic like consumer Intel), so gaming performance will suffer.

Binning better dies for higher-end SKUs.
Makes sense, as it gives the higher core count parts the best single-threaded performance, making them a more enticing purchase.
At base clocks they should all boost up to the 105W budget; past that you need to use overclocking features or a manual OC.

Binning and almost perfect segmentation. Leaky, bottom-of-the-barrel silicon goes into cheaper products. Also, the boost clock isn't for all cores; the TDP stays similar because the number of boosted cores is similar.
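Back-of-the-envelope with invented per-core wattages, just to show why the package number can stay flat:

# Illustrative numbers only, not AMD's real power figures.
UNCORE_W = 20.0                 # IO die, Infinity Fabric, memory controller

def package_power(cores, per_core_w):
    return UNCORE_W + cores * per_core_w

# Fewer cores can each burn more power inside the same budget, so all three
# configurations land on roughly the same 105 W figure.
for cores, per_core_w in [(8, 10.6), (12, 7.1), (16, 5.3)]:
    print(f"{cores} cores @ {per_core_w} W/core -> {package_power(cores, per_core_w):.0f} W")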

Unless the game is actually able to load more than 8 cores, no it won't. The new Windows scheduler groups related threads as closely as possible, so unless a game can max out enough cores to tap out a whole die, all the threads related to the game will stay on one die.

You completely ignored the interrupt latency part, which is extremely important for gaming when there are hundreds or thousands of interrupts per second. Intel still has the upper hand in this.

I've never even heard of this being an issue with Ryzen, but everything I am finding points to software issues that were fixed in fucking 2017. Sounds like you're out of the loop m8.

The 3900X is the one that competes with the 9900k.

>amdcucks think theyve won
how cute

I have a Strix X470-F. Would my power phases melt if I put in a 3900X or 3950X?

Attached: 1558697655804.png (198x234, 35K)

Got myself a 2600x last year, and I usually keep my CPU for at least 7 years. So I need to wait till 2026 for an upgrade.

>deprecates entire Intel lineup from mainstream all the way to HEDT
I think AMD won this, no contest.
We have to wait until Intel unfucks their fabs, finds a way to compete with chiplet yields, and stops making Broadwell for the 7th time.

Attached: ryzen-9-3950x-1-1-e1560213505487-1024x455.png (1024x455, 490K)

>~20% less single core performance
>1ghz less clockspeed.
>real 3600 clocks at 4.2ghz

So a $200 processor is only 15% slower than a $500 one, nice.

Oh yeah, not to mention the 3800x actually matches the 9900k stock vs stock, at 500MHz less, for $100 less.

It was mostly bullshit and only locally exploitable via direct flashing.

Intel's ZombieLoad may be far worse than Spectre & Meltdown.

>35% more threads

Doing the math, a 3700 at 4GHz should hit a score of 1332, which is nuts considering the 9900k clocks 1GHz higher. A 3700 at its stock 4.4GHz would match the 9900k at $170 less.
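Rough version of that math, assuming the multi-core score scales linearly with threads and clock (it won't, real scaling is worse):

# Numbers from this thread; the 4.0 GHz baseline and 4.4 GHz boost are assumptions.
r5_3600_mc   = 987               # multi-core score posted above
thread_scale = 1.35              # ">35% more threads" (12t -> 16t)

at_4ghz  = r5_3600_mc * thread_scale       # ~1332, the figure quoted
at_44ghz = at_4ghz * 4.4 / 4.0             # naive clock scaling up to 4.4 GHz
print(round(at_4ghz), round(at_44ghz))     # 1332 and ~1466 under this naive model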

Great b8, brainlet. How about you do yourself a favor and google how long it takes the OS to handle an interrupt, because it seems you forget that 90% of the latency is in higher-level software, not in hardware, you absolute dumb fuck. Pay more attention in school, because multi-chip architecture isn't covered until grad school. You might also want to take an OS course so you don't sound like a fucking idiot.

He's pulling shit out of his ass. The OS/software handles interrupts; 90% of the latency is found there, not in the hardware signal traversal. Actually, I'll go ahead and update that: 99% of the latency is software/OS and 1% is the actual physical signal. Dumbass just learned about interrupts but hasn't taken an OS course, so he isn't aware of interrupt handlers (SOFTWARE).

The cost of X99 was well worth it if you had any foresight at all. 16GB of DDR4 was reasonably priced back then at the 2133 and 2400 speeds.

>yfw it still gets crushed by 2700k in every game

Going to go for the 3900X but keeping my shit GTX970, dreading the performance

>he doesn't look at 1% lows

Attached: 3fe.png (600x536, 253K)

what about the most important purpose, waving your massive epeen in the speccy threads?

For what purpose. Just get a Vega 56 for less than $300 if you're just gayming.

Are glued-together chips really an advancement?
I mean, it's mainframe tech from decades ago, and even mainframes moved away from MCMs.
I feel like we're going backward.

The all-core boost of the 16-core is going to be like 3.3GHz.

>8 vs 6 cores
>4.95GHz vs 4.05GHz turbo

Attached: 1560801676288.gif (607x609, 821K)

>needs to be 155/136 14% faster
>4.05*1.14=4617MHz
nooo AMD bros, we didn't make it
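Per clock it's the other way around, using the 155/136 scores and the turbo clocks quoted above:

# Points per GHz from the figures in this thread; a crude IPC proxy, nothing more.
i9_9900k_per_ghz = 155 / 4.95    # ~31.3
r5_3600_per_ghz  = 136 / 4.05    # ~33.6
print(round(i9_9900k_per_ghz, 1), round(r5_3600_per_ghz, 1))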

Go for a GTX 1070 then. It's going to drop big time when Super comes out. Over here it's $230 just for a 1060 6GB, which is cheaper.

oh no no no no no

it was supposed to be our time AMDbros :(

Attached: Untitled.png (876x3712, 363K)

Should I be upgrading from a 4670k to Ryzen 3000?

Attached: 1510444467140.png (483x470, 184K)

8 times the threads

>that histogram equalization visualization score
Sign of a weak cache or memory.

I thought all this shit was coming out in July? wtf

The 12-core and under are coming in July.
The 16-core is in September.

Will this make compiling gentoo quicker?

Unironically yes

Look at the bandwidth.
AMD: DDR4-2133 vs Intel: DDR4-3200, so yeah, they need to put at least the same memory in both systems.
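The raw difference is big. Rough dual-channel peak numbers (theoretical, ignoring latency and real-world efficiency):

# Peak bandwidth = transfers/s * 8 bytes per 64-bit channel * number of channels.
def peak_gb_s(mt_s, channels=2):
    return mt_s * 1e6 * 8 * channels / 1e9

print(f"DDR4-2133 dual channel: {peak_gb_s(2133):.1f} GB/s")   # ~34.1, the AMD system as tested
print(f"DDR4-3200 dual channel: {peak_gb_s(3200):.1f} GB/s")   # ~51.2, the Intel system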

>they sell out
>Intel has nothing to compete
>price jumps to 1049USD
>AMD shills:
"Lol it's still a great value AMD isn't going to turn exactly into Intel and fuck everyone via market dominance lol take this big branded plug and shove it up your ass and smile!"

This is what happens when there is no competition

Yes, that is what happens, and you also don't have to agree to overpay for something, because giving in justifies the price and turns that new price level into the new normal, ruining it for everyone in the long term.

That exact effect is why Nvidia has been getting away with $100 price increases every generation since the 900 series. People won't stop and go "Naw, fuck you, I won't buy unless it's cheaper".
They buy it anyway, damn the results, and now a midrange card is >$350 when it used to be $250 or less.

Here's one I found with more comparable RAM.

Attached: browser.geekbench.com_v4_cpu_compare_13567135_baseline=13567040.png (2080x8336, 1.32M)

cry more u intel shill

Attached: 1488221172975.png (2000x2000, 142K)

Tfw only a gayman so 3600x

Cool blog

The slightly higher clock speed on the 3900X will turn that slightly higher SC score on the 9900k into noise, I'm guessing?

With Ryzenfall you needed to flash a custom BIOS onto the board. By that standard, basically anything that uses a BIOS is affected by Ryzenfall.

>AMD basically invalidates Intel's entire desktop lineup (and most of the server lineup too)
>Jow Forums starts creating more and more absurd scenarios to justify Intel

>9900kelvin
>500$
yeah nah try 600$ at least. or did the price actually drop for once?

>/JIDF/ starts creating more and more absurd scenarios to justify Intel
ftfy

also: Intel is now bowing down to Samsung to help them out lol

Attached: samsung.png (927x468, 121K)

The fuck, how many lakes are there holy shit.

14nm+++++++++++++++++++++++++++++++++++++++++++++

And you call yourself a poorfag? Pussy.
I can't afford ANYTHING new because of the 2300X that I bought two months ago.

ROCKET LAKE LMAO

Sapphire Rapids Q4 2020 will destroy AMD

Imagine having yields so shit that you have to go to Samsung and beg for their help.

Instead of yields, perhaps they have to (partially) shut down their fabs to upgrade them for smaller nodes, and are outsourcing some things to avoid going bankrupt from the drop in production output.