32 cores

>32 cores
>64 threads
>250W TDP
>4.2 GHz
>80MB Cache
>64 PCIe lanes

A decade ago, this would be called a supercomputer.

Attached: athlon 3200+.jpg (1000x582, 262K)

Today it's just called a toaster.

And no one uses even 5% of it in day-to-day use.

>Gimped memory bandwidth
no thanks

Wtf is that file name

>ebyn
>ryzen
>threadrypper
Why is naming targeted at 15 year olds?

t. Intlel

make -j65

Attached: 1454453155670.png (508x497, 12K)

Go drown yourself in whatever the current name of the lake is

I like it, Threadripper especially. They can be named outrageously because they are outrageous. I imagine Intel HEDT would still be at most 8 cores by now if not for AMD.

imagine compiling qt on it, done in a few minutes

Attached: 1495632908929.jpg (636x466, 18K)

>8 glued together chips
lmao ok, i guess my 9900k is alien technology if that's a
>supercomputer

8 > 2

>he fell for the 9900k meme

enjoy 10!!!!! more fps than an i5 9400F for 5x the price !!!!!!!

8/16 > 8/16
550 > 290
oh wait...

>8
You suck at counting, it's 4. Experiencing a rounding error or something, bro? Also, you know very well glue logic and chiplets are the way of the future. Even Intel knows it and is just copying AMD, like nVidia with their current chiplet research.

Also, the fabrication and diffusion processes for silicon chips universally use shittons of tape and industrial adhesive, so even you TOO have LOL GLUED CPU LOLOLOL.

If you think your mesh-based core-to-core communication is superior or, in fact, any different than each single Ryzen die, you're functionally retarded.

This. Intelbabbys really fucking suck at building computers that make sense. An i7 is overkill in 2019, an i9 is a weird $550 flex.

trufax.

I went from a Phenom II X4 975 @4.2GHz to my R7 2700X @ 4.25GHz system at the Ryzen+ launch.

I even considered building an 8700K system instead but decided I would get a board I could put Ryzen 2 and Ryzen 3 CPUs on if I wanted to upgrade sometime in 2019 or 2020.

Watched as Intlel went on to release the 8086K and laughed and then laughed even harder when Coffee Lake 2: Intlelic Nigaroo came out later in the year.

I had a budget of $3500 USD and _still_ decided to spend money on a platform that will still exist for another few years.

Why?

Because I'm a fucking enthusiast, not a moron.

I don't upgrade (as in move to a new platform) my primary gaming desktop very often, only once every 7-10 years, and intermittent hardware upgrades in between keep me in a very happy position for gaming.

This time, I built a gaming rig that can dual-function as a secondary server to complement the rest of my home network, which includes a Xeon E5-2690 v2 that I was using as my render and media serving box, which I now use far less since I built this beast...

I really don't think I'll even be purchasing Intel enterprise and server-grade equipment anymore, simply because of how shit they have consistently been for the past 5 years.

If they get their shit together and release stuff at sensible prices and stop separating their enterprise and mainstream products by 2 or more generations, I may reconsider my position...but until then, literally every Intel fanboy is retarded.

Go jump off whatever the current name of the ridge is.

nice

No desktop OS can even efficiently use all cores at once; a hypervisor with 2-4 VMs is a must in this case, or a server OS like Linux.

Each core on a desktop is like a Raspberry Pi, I might have to rewire my place.

Imagine if every desktop on the planet contributed just 5% of its idle compute cycles to solving the world's biggest problems.

>not the 9900KFC
shiggy diggy

>69xxx420XEON420xxx69
>777CANNON777LAKE777
literally xbox call of duty names. fuck off

>A decade ago, this would be called a supercomputer.
A decade ago? Add another decade and we can talk about supercomputers.

You're so very enlightened.

Attached: 4L_nBxXNRVn.jpg (480x600, 22K)

10 more fps at 8K, which is 80+ more at 1080p.

>supercomputer
I don't think so.

It is as powerful as a mid level cluster from a decade ago tho

midrange parts are always best price/performance

2600/rx580 is still the best poorfag combo

RX 570 is the patrician poorfag gpu.

imagine compiling qt

>not -j48 so your video/music and shitpost experience stays smooth
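In case anyone wants that without hardcoding the number, here's a rough Python sketch (assuming GNU make is in the current directory; the 4-thread margin is an arbitrary pick, not gospel):

import os
import subprocess

# Total hardware threads the OS reports (e.g. 64 on a 2990WX).
threads = os.cpu_count() or 1

# Leave a few threads free so video/music and shitposting stay smooth.
jobs = max(1, threads - 4)

# Kick off the build with the computed job count.
subprocess.run(["make", f"-j{jobs}"], check=True)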

I'm sad that I still have to wait another year for higher core/CU Zen 2 APUs

A decade ago, THIS would be called a supershitpost! Now it is just the absolute state of Jow Forums

I want to compile a qt gf for myself ;_;

Attached: ISHAL.webm (1150x480, 710K)

>enjoy 10!!!!! more fps than an i5 9400F for 5x the price

So you're admitting AMD is inferior

I genuinely laughed out loud
thank you user
haven't had a laugh like that in months

>32 cores
>only 250w tdp
That's kind of amazing: a 2700X has a 105W TDP and this has 4x the cores at less than 2.5x the power. Based AMD.

Shame that you HAVE to use a Linux distribution of some sort to actually have the CPU be properly used, since Windows can't into NUMA access.
It is pretty sick, isn't it? To be fair, they're using binned chips to get some of the best ones.
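If you want to see what Windows is choking on, here's a minimal sketch (Linux only, assuming the usual /sys/devices/system/node sysfs layout) that prints which CPUs hang off each NUMA node:

import glob
import os

# Each NUMA node shows up as /sys/devices/system/node/nodeN on Linux.
for node in sorted(glob.glob("/sys/devices/system/node/node[0-9]*")):
    with open(os.path.join(node, "cpulist")) as f:
        print(f"{os.path.basename(node)}: CPUs {f.read().strip()}")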

>32 cores
>64 threads
still slower than intel. yeah you can get some decent multi-thread performance by grouping a whole bunch of slow single-threaded cores together like intel did, but unless you're doing the 2% of super parallelized workloads out there, that thing will lose to a 9600k in BF:V.

>binned chips
That's totally fine, and I supported it when they made those super high core count X99 Xeons and the Q6600, made for awesome stuff.

9900k flexes on autocad fucktard

>ONLY GAMES MATTER!
No one is buying a Threadripper 2990WX to game on, you fucking mouth breather. And even if they do game occasionally, I highly doubt they'll give a shit about their absolute max fps.
Yea, I'm honestly considering getting Threadripper when 7nm launches. I'm getting by fine with the 8c16t chip I have now, but I encode a lot. I'd love to have a 5.0GHz-clocked 16c32t chip. Be able to fully max out most video encoders (16 threads), while gaming, with 20+ tabs of Firefox open. So excited for 7nm Ryzen.

idk user, 32 cores is cool and all but even the 16 core threadrippers are very niche. All the retarded tech reviewers magically forget the enormous workstation market for autocad in construction etc., where Ryzen still isn't a good alternative.

I don't know what intel's gonna do if AMD does some outrageous shit like make their mid-range processor an 8 core at 65w or even 95w at the ryzen 2600 price point, shit is gonna be nuts.

and neither is mainstream Intel. A true workstation is going to have ECC memory (which neither Ryzen nor i9 stuff fully supports). So their options are marked-up Xeons or Threadripper stuff. For the price, most will go Threadripper. CAD software is still very single-core bound and that's the ONLY reason Intel still has a foothold there. I'm hoping that with some IPC increases at 7nm, maybe AMD will corner this market too.

games do matter and matter most, as gaming is what 85% of computer users do on their computers. outside that it's social media, masturbating to japanese cartoons drawn by fat japanese men who have a pedo complex, and shitposting on twitter and 4chans.orgs. only a small, top 1% of retards use something that needs all those threads at home. unless you work for a fortune 100 company doing professional work at a corporation, you don't need a 32 core processor that's almost as slow as arm in single threaded performance. you're not a professional because at home you stream on twitch. you're not a professional at home because you unironically run gentoo, because again, you're retarded, and need it to compile chrome from portage. you're not a kernel dev, you're not some indie developer. stop fooling yourself. stop being retarded.

Dude, transcoding a simple video on your pc or browsing the web will leverage those cores. this isn't 2003 where you have a P4 and they keep cranking the clocks up hoping to get more performance. RYZEN has very good ipc, and moar cores, and multithreading matters now, otherwise intel wouldn't be cramming dual-ringbus 14 core nonsense into one CPU, they'd still be using 4c4t if that's all that was needed.
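Case in point, even a dumb batch transcode eats cores for breakfast. Rough sketch of what I mean (assumes ffmpeg with libx264 is on PATH; the file names are made up):

import subprocess
from concurrent.futures import ThreadPoolExecutor

# Hypothetical input files -- substitute your own.
files = ["clip1.mkv", "clip2.mkv", "clip3.mkv", "clip4.mkv"]

def transcode(src):
    # Each ffmpeg process is itself multithreaded, so a handful of
    # concurrent encodes keeps a 32 core chip busy.
    dst = src.rsplit(".", 1)[0] + ".mp4"
    subprocess.run(["ffmpeg", "-y", "-i", src, "-c:v", "libx264", dst], check=True)

with ThreadPoolExecutor(max_workers=4) as pool:
    list(pool.map(transcode, files))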

It would be a potent workstation CPU, but a decade ago supercomputers were already pushing thousands of cores.

Actually there are a fair number of productivity workloads that can benefit from Threadripper's high core count, such as:
- Source Code Compilation (Make can take advantage of multiple cores)
- Building Docker Images
- Virtualization for development purposes
- Media encoding
- Rendering
- Finite element analysis / scientific simulation
- CAD

I do backend / server / system development and I use my 4 GHz for compiling source code and building Docker images all the time. It is totally worth it.

Whoops. I meant to say my 4 GHz 1950X not my 4 GHz.
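For the curious, the Docker side is the same idea. A rough sketch (image tags and build directories are hypothetical, assumes the docker CLI is installed) that builds a few images at once instead of one after another:

import subprocess
from concurrent.futures import ThreadPoolExecutor

# Hypothetical (tag, build-context) pairs -- substitute your own services.
images = [
    ("api:dev", "services/api"),
    ("worker:dev", "services/worker"),
    ("frontend:dev", "services/frontend"),
]

def build(item):
    tag, path = item
    # Each docker build gets its own compile work, so a high core count
    # chip chews through several contexts in parallel.
    subprocess.run(["docker", "build", "-t", tag, path], check=True)

with ThreadPoolExecutor(max_workers=len(images)) as pool:
    list(pool.map(build, images))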

Is this a Ryzen issue? I can max out my shitdozer CPU and it doesn't stutter at all

Based FX, getting a consistent framerate while bottlenecking the shit out of modern cards.

>AMD actually advances CPU design
inteltards:
>G..GLUED TOGETHER!
when Intel literally is starting to do the same thing. lmao

Attached: 1535425069376.jpg (480x472, 24K)

As opposed to the Xtreme Edition?

oh boy ive always wanted to use premiere at 16k

Remember Skull Canyon? 2edgy4me

Kek, they've been doing that gluing shit since the Pentium D and the Q6600, but they get praise for it because intel can do no wrong.

Attached: 20190317_174326.jpg (2576x1932, 1.99M)

I'll be replacing my freenas server with an EBYN at some point, mostly for efficiency and all the I/O it has built in

Ryzen supports ECC
t. Using ecc on ryzen because autism

ECC is not even needed except in some rare fluid dynamics sims for the military or aerospace.

>ecc isn't needed
tell that to me when Mr. Solar Flare flexes on your browser cache

i would never buy a 9900k but only a retard would buy it for gaming.

>buying a 32 core cpu for gaming.
People don't buy a threadripper for gaming you fucking retard.

zfs scrub.

then why do all benchmarks show threadripper running games? check and mate ayytheist

What did he mean by this?

Thats all that reviewers know how to do

This, I don't know why they do it when new multi core APIs exist that make the benchmarks closer to being GPU bound.

???????????///////

>Not living in a lead lined bunker

Good. Windows needs to die.

literally laughed out loud with this one
underrated

Attached: 1402383961240.jpg (269x319, 8K)

fuck off with your reddit spacing

Attached: 1549041940998.png (922x715, 282K)

Inshill Tears Lake

Construction autocad in general, as in running multiple single-core-dependent programs at once, is why they dominate. You don't need all this extra trash added in, so mainstream i7/i9 are the best price/perf. I'm sorry user, but ryzen isn't competitive here yet.

hell yeah brother

290% less performance

>high core count
>single threaded program

oh gee!!!

youtu.be/KmtzQCSh6xk

The fuck program did I mention that is single threaded you fucking retard?

He's claiming the threadripper gets BTFO when running single threaded apps that aren't what you specified, although threadripper would still do alright; even AMD FX does alright with one core running at 4GHz.