Threadripper 3000 BENCHMARKS LEAKED; INTEL DEAD

>It scored 68,576 points in multi-core testing. That’s near double the 36,000 that its predecessor, the 32-core 2990WX, achieves, and far more than Intel’s 28-core monster, the W-3175X, manages at around 38,000 (as per WCCFTech). In single-threaded workloads, it managed 5,523, which is again far higher than the 2990WX and W-3175X, which score around 4,800 and 5,150, respectively.


>What’s amazing is this might not even be the most capable third-gen Threadripper CPU. AMD’s Epyc server range has a new, 64-core CPU that uses the same Zen 2 chiplets. There’s always the possibility that a 64-core Threadripper is waiting in the wings, too.

>digitaltrends.com/computing/amd-threadripper-3000-cpu-demolishes-competition/
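The quoted numbers work out roughly as the article claims. A quick sanity check on the ratios (scores copied from the leak above, not independently verified):

```python
# Leaked Geekbench 4 scores quoted in the article (unverified).
scores = {
    "TR 3000 (leak)": {"multi": 68576, "single": 5523},
    "2990WX":         {"multi": 36000, "single": 4800},
    "W-3175X":        {"multi": 38000, "single": 5150},
}

leak = scores["TR 3000 (leak)"]
for name, s in scores.items():
    if name == "TR 3000 (leak)":
        continue
    print(f"vs {name}: multi x{leak['multi'] / s['multi']:.2f}, "
          f"single x{leak['single'] / s['single']:.2f}")
```

The multi-core ratio over the 2990WX comes out around 1.9x, which matches the "near double" wording.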

INTEL FANS COPE!!!

Attached: AMD-Threadripper-3000.jpg (750x422, 127K)

Other urls found in this thread:

techpowerup.com/258739/amd-readies-three-hedt-chipsets-trx40-trx80-and-wrx80
cpu.userbenchmark.com/Compare/Intel-Core-i7-2600K-vs-AMD-Ryzen-7-3700X/621vs4043
cpu.userbenchmark.com/Compare/Intel-Core2-Quad-Q6600-vs-Intel-Core-i7-2600K/1980vs621
cpubenchmark.net/compare/Intel-i7-2600K-vs-AMD-Ryzen-7-3700X/868vs3485
cpubenchmark.net/compare/Intel-i5-2500K-vs-AMD-Ryzen-7-3800X/804vs3499
browser.geekbench.com/v4/cpu/14448604
userbenchmark.com/UserRun/19698768
youtube.com/watch?v=we2oePtTGMM
cpubenchmark.net/singleThread.html
twitter.com/SFWRedditImages

Can't wait for official information.

Sounds reasonable but I wouldn't trust wccftech

it's over for intel

Attached: threadchad.png (1242x1394, 1.08M)

>multi-core testing
LEL

>In single-threaded workloads, it managed 5,523, which is again far higher than the 2990WX and W-3175X, which score around 4,800 and 5,150, respectively.

>single-core testing
>higher than Intel
lel

baited hard :^)

cope

Attached: Bread.png (1024x719, 526K)

AMD is for chumps.

>why yes i do use amd, how could you tell?

Attached: 1566344722050.jpg (1242x1528, 157K)

He looks like a deformed piece of beef jerky.

If you're a 3D modeler who renders photo-realistic scenes, is it worth it to have something like a Threadripper AND a top of the line graphics card like a RTX2080Ti or is that IDORT mode?

I'm under the impression that with rendering you have to choose CPU or GPU, it can't be both at the same time. Which raises the question: which is better for rendering, a CPU or a GPU?

Attached: maxresdefault[1].jpg (1280x720, 130K)

The W-3175X gets to 5GHz all core with a chiller. Can AyyMD do that?

Depends entirely on the rendering algorithm.

multi core performance doesn't matter

Depends on the software you use. Check its GPU support - if it supports CUDA or OpenCL or something, get a beefy GPU. You should probably get a beefy CPU anyway, though.

>I am butthurt

Then don't buy a 32-core CPU

>if not for some memory bandwidth limitations, might have snatched the performance crown from Intel in every benchmark there is

Jesus, this just goes to show how long a bit of misinformation and false assumption can endure. It's already been shown that it isn't a memory bandwidth bottleneck but a Windows 10 thread scheduler issue.

>It's already been disproven to be a memory bandwidth bottleneck
where?

Attached: Intel-AMD-Naples-Reply-8-1080.1670557580.png (1500x844, 159K)

>with a chiller
(You)

>One important caveat with these results is that the Threadripper 3000 chip was tested under Linux, which tends to lead to better GeekBench scores than Windows

I've been curious about something. Thinking about putting a 9900K in a dev workstation. I do game dev in Unity and would like to run a bunch of scripts in edit mode (essentially augmenting the existing editor), and this is not easily parallelized, hence the 9900K for its single-core performance. But would I be better off with a 2990WX or a Threadripper 3000 and doing an extreme overclock to 5GHz+ (sub-ambient with a chiller, phase change or LN2)?

Attached: 2018-10-19-product-2.jpg (1600x1600, 124K)

AMD can't into overclock my man

>literally posting intel marketing material

Nobody gets high-end consumer GPUs in VFX... There's currently no GPU renderer that can render a production scene, and you almost always use your studio farm to render anyway.
It's mostly mid-range consumer cards or Quadros, depending on how much of a discount you managed (or didn't manage) to get on the Quadros.
If you're thinking about building a workstation for CG, your money is better spent on a couple of old Xeons like the E5-2670 and a 1070 or equivalent Quadro. You want as much memory as you can fit on your board.

How the fuck is waiting for the official release and third-party benchmarks cope, you dumb shit?

are you on crack?

>still believing in Intel/AMD fake benchmark pissing wars when CPU performance increases flattened out YEARS AND YEARS ago, and all high-end CPUs now are basically the exact same, but Intel/AMD still gotta make their fucking money from idiots, so they pull this goddamn garbage to convince people to buy their brand-new garbage that isn't any better than their old garbage
You are all idiots.

care to explain why clocks and IPCs have increased? core count? i/o?

SHITEL STUTTERING HOUSEFIRES BTFO
>SHITEL STUTTERING HOUSEFIRES BTFO
SHITEL STUTTERING HOUSEFIRES BTFO

>poisoning the well

Threadripper sounds chad as fuck

> b-but muh gaming

Eh. Multicore doesn't really matter. I have a 2600K and GTA V runs fine.

As much as I hate Intel, how is this shit feasible? They don't call AMD CPUs "housefires" for nothing.

Wrong.gif

Hi Intel marketing :)

Hi Userbenchmarks!

>If you're a 3D modeler who renders photo-realistic scenes, is it worth it to have something like a Threadripper AND a top of the line graphics card like a RTX2080Ti or is that IDORT mode?
>I'm under the impression that with rendering you have to choose CPU or GPU, it can't be both at the same time. Which raises the question: which is better for rendering, a CPU or a GPU?
Depends on what you're rendering. For small scenes GPUs can be quite good, but VFX often uses hundreds of millions of polygons and tens of gigabytes of textures, and GPUs can't fit that in VRAM. The latest GPU renderers are supposed to support streaming data out of system RAM, but I haven't tried them. Nvidia pushing up GPU prices while AMD drives down the price per core is eroding much of the GPU advantage. CPUs are better per watt, GPUs per dollar, mainly because you can put 3 or 4 in one PC while AMD/Intel charge thousands of dollars for a bit of microcode that allows 2-CPU support.
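The VRAM argument above can be sketched as a back-of-the-envelope fit check. All the constants here (bytes per triangle, overhead factor) are rough illustrative assumptions, not measurements:

```python
def fits_in_vram(tri_count, texture_gb, vram_gb, bytes_per_tri=64, overhead=1.3):
    """Rough estimate of whether a scene fits on the GPU.

    bytes_per_tri and overhead are guessed constants: geometry plus
    acceleration structures, times a fudge factor for working buffers.
    """
    geometry_gb = tri_count * bytes_per_tri / 1e9
    return (geometry_gb + texture_gb) * overhead <= vram_gb

# A "hundreds of millions of polygons, tens of GB of textures" film asset
# blows straight past an 11GB RTX 2080 Ti, while a small product shot fits:
print(fits_in_vram(300e6, 40, 11))
print(fits_in_vram(5e6, 2, 11))
```

Under these assumed constants the big scene needs roughly 77GB resident, which is why out-of-core streaming (or CPU rendering with lots of system RAM) comes up at all.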

fuck off

Attached: 1527629778452.jpg (679x758, 54K)

intel is quite literally for retards only, by this point.

Do you expect retards to realize how retarded they are?

You're just used to the stutters.

not really, but it's fun to point and laugh at them.

Do you want me to post the pasta?

Pasta doesn't matter

t. fatty

Attached: 1413913323099.png (1000x1000, 162K)

>It scored 68,576 points in multi-core testing. That’s near double the 36,000 that its predecessor
This looks a bit too good to be true.

At that score it has to be a 64 core TR
With 64 cores that score isn't outside the realm of possibility

but can it play df?


more than 32 cores are basically confirmed
techpowerup.com/258739/amd-readies-three-hedt-chipsets-trx40-trx80-and-wrx80

Based and redpilled.
cpu.userbenchmark.com/Compare/Intel-Core-i7-2600K-vs-AMD-Ryzen-7-3700X/621vs4043
>56% faster
Only a cuck would buy for such low gains; unless you're a gamer who wants an APU for Smash Bros, this is absolutely stupid. Tell me when they make it at least twice as fast. People were still holding onto their Q6600 back then, and the performance gains were way better. cpu.userbenchmark.com/Compare/Intel-Core2-Quad-Q6600-vs-Intel-Core-i7-2600K/1980vs621

That just means we have a consumer platform with 4 or 8 memory channels, and a workstation platform with 8 channels.

They could easily leave it at 48 cores, which is enough.

>userbenchmark

can i use this on a motherboard that has a ryzen 1600?

>pants end above ankle
hipster detected

At this point of diminishing returns, there's no real reason to upgrade past a certain point. They've long reached the point where it doesn't matter. APUs are weaker but way more balanced if you want a computer that does everything in one chip, but if you have an Intel processor from the i7-920 era or an FX, there's very little reason to upgrade. Sure, I can't convert my porn as fast, but in real-world use with an SSD I can't tell the difference; my laptop feels just as fast as my desktop and it's way slower in benchmarks.

>intelbenchmark
Even pissmark is less garbage these days, but it will probably be """""updated""""" soon

cpubenchmark.net/compare/Intel-i7-2600K-vs-AMD-Ryzen-7-3700X/868vs3485

no

NOOO MUH SANDY
cpubenchmark.net/compare/Intel-i5-2500K-vs-AMD-Ryzen-7-3800X/804vs3499

It's 32 cores.
browser.geekbench.com/v4/cpu/14448604

Attached: 1541137318172.jpg (693x870, 95K)

Fake

IRL, when does that score matter? I usually use my phone, I downgraded my main computer to a dual core, and you're only downloading anime and posting on Jow Forums anyway. I'm not against AMD, but if you got a decent processor in the last 7 years it's not worth upgrading. Most people here are shitposting from their X220/X230 because memes like 4K, VR and multicore gaming are a joke, and anyone who has a brain is already using a render farm or just overclocking because it's good enough.

>64 core cpu
>worry about single core performance.
But yeah, Intlel is fucked.
What's wonderful with zen2 is that AMD's yields must be fabulous, and what they can't sell, they'll put in next-gen consoles.

Quad core? What do you need dual core for? Here's your 1c/2t CPU bro.

>32 cores boosting to 4.2GHz
MUH DICK
For reference, 2990WX scores 4138 in the 64-thread benchmark.
userbenchmark.com/UserRun/19698768

Attached: 1561921578141.png (1449x219, 31K)

You get T W O threads on your single core CPU bro

JUST BUY IT

Attached: 1488933859408.png (800x523, 604K)

>two threads? you already have one!

>still giving a shit about intelbenchmark
It's rigged beyond repair.

Well, they're not hiding the multi-core score yet, so it's still somewhat valid for comparison.

>they're not hiding the multi-core score yet
Soon

Doesn't matter as long as you ignore the % scoring.

ONE CORE AND ONE THREAD ARE ENOUGH FOR EVERYBODY

They already hid the 64-thread score from the list on the main page; you have to add the column manually. They only show single-thread and 8-core tests, just so the 3900X doesn't score higher than the 9900K due to its extra cores, lmao.
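The complaint here is about composite weighting: a site can flip a ranking just by dropping the all-core result from the formula. A toy illustration with made-up normalized subscores and made-up weights (none of these numbers are the site's real formula):

```python
# Illustrative normalized subscores, not real benchmark data.
chips = {
    "9900K": {"single": 100, "octa": 100, "multi": 100},
    "3900X": {"single": 97,  "octa": 99,  "multi": 148},
}

def composite(s, w_single, w_octa, w_multi):
    """Weighted sum of single-thread, 8-core, and all-core subscores."""
    return w_single * s["single"] + w_octa * s["octa"] + w_multi * s["multi"]

# Hypothetical old weighting that still counts the all-core result:
old = {name: composite(s, 0.3, 0.3, 0.4) for name, s in chips.items()}
# Hypothetical new weighting that ignores it entirely:
new = {name: composite(s, 0.5, 0.5, 0.0) for name, s in chips.items()}

print(old)  # 3900X ahead
print(new)  # 9900K ahead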

Oh, I see.
But dude, think about it.
What if you ran an online database surviving only on ad revenue, and suddenly you're bathing in sponsor money?
>32 threads
>1flop/s each
>How come my processor no fast? >:(
Single-thread performance is important or there's nothing to multiply, genius. Also, single-threaded programs are still extremely prevalent.
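The "nothing to multiply" point is essentially Amdahl's law: the serial fraction of a workload caps total speedup no matter how many cores you throw at it. A minimal sketch:

```python
def amdahl_speedup(parallel_fraction, cores):
    """Amdahl's law: speedup over 1 core given the parallelizable fraction."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even a 95%-parallel workload tops out below 20x, regardless of core count:
for n in (8, 32, 64, 1_000_000):
    print(n, round(amdahl_speedup(0.95, n), 1))
```

So both posters are partly right: more cores help parallel work, but per-core speed still sets the ceiling.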

>In single-threaded workloads, it managed 5,523, which is again far higher than the 2990WX and W-3175X, which score around 4,800 and 5,150, respectively.
just stop

which company has the better engineers though?

But it downclocks under all-core loads, Einstein.

What's your point, retard? This guy is saying single-thread performance is meaningless; he's a complete buffoon. I didn't say Intel was better, I said that single-thread performance is meaningful for multi-thread performance.

Look, TR is there if you need more than the 16-core 3950X.
If you're there, your workload most likely doesn't give a shit about single-core performance.

Single-thread performance IS meaningless for multi-thread performance: when a single core is stressed, it's allowed to boost to whatever clock speed it's capable of, but that doesn't mean anything when all the cores are stressed and the chip has to stay within its 250W TDP.
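The power-limit argument can be sketched with a crude model. The cube-law scaling (power rising with voltage, voltage rising with clock) and all the numbers below are illustrative assumptions, not real silicon data:

```python
def all_core_clock(boost_ghz, single_core_watts, cores, tdp_watts):
    """Crude sketch: per-core power scales roughly cubically with frequency
    (voltage must rise with clock), so the sustainable all-core clock is the
    boost clock scaled by the cube root of the per-core power budget."""
    per_core_budget = tdp_watts / cores
    scale = (per_core_budget / single_core_watts) ** (1 / 3)
    return boost_ghz * min(1.0, scale)  # never exceed the rated boost

# Hypothetical 32-core part: 4.5GHz boost at ~20W per core, 250W TDP.
# All 32 cores loaded -> roughly 3.3GHz sustained under this model:
print(round(all_core_clock(4.5, 20.0, 32, 250.0), 2))
```

Under the same assumptions, a light 8-core load leaves enough budget per core that the chip holds full boost, which is exactly the single-core-vs-all-core gap the post describes.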

Post the other chad with Navi

Who cares
Intel 10nm will even the playing field and Intel 7nm will be the end for AMD

INTEL 7NM SUPERPOWER WILL DESTROY AMD IN 2077

So AMD will be selling all their trash CPUs for consoles - actually selling the whole wafer.
I wonder what's left of an Intel wafer when they're trying to make an 8-core.

By the time Intel finally releases 10nm desktop products, TSMC will already be shitting out 5nm EUV parts for Apple, AMD and Xilinx.

>I wonder what's left of an Intel wafer when they're trying to make an 8 cores
Lies and bribes

kek
youtube.com/watch?v=we2oePtTGMM

cpubenchmark.net/singleThread.html

Attached: 1558853140201.jpg (800x618, 185K)

By the time Intel gets to 7nm, AMD will already be on 5nm. Intel was sitting on their lazy asses for too long and they've fucked themselves.

I've seen your posts over the past few days and you seriously need to take a break from this site.

That's the good shit that 7600k, 8600k and 9700k owners wish they could buy

>n-next year I p-promise
>p-please start saving your p-pennies
>you'll also need a new m-motherboard of course

>implying I don't post here from my 2009 1.6GHz single-core Atom netbook with a 160GB HDD and 1GB of RAM

Unironically, 5% more single-thread performance is more important than 300% more multi-thread performance.