>It scored 68,576 points in multi-core testing. That’s near double the 36,000 that its predecessor, the 32-core 2990WX, achieves, and far more than Intel’s 28-core monster, the W-3175X, manages at around 38,000 (as per WCCFTech). In single-threaded workloads, it managed 5,523, which is again far higher than the 2990WX and W-3175X, which score around 4,800 and 5,150, respectively.
>What’s amazing is this might not even be the most capable third-gen Threadripper CPU. AMD’s Epyc server range has a new, 64-core CPU that uses the same Zen 2 chiplets. There’s always the possibility that a 64-core Threadripper is waiting in the wings, too.
If you're a 3D modeler who renders photo-realistic scenes, is it worth it to have something like a Threadripper AND a top of the line graphics card like a RTX2080Ti or is that IDORT mode?
I'm under the impression that with rendering you have to choose CPU or GPU; it can't be both at the same time. Which raises the question: which is better for rendering, a CPU or a GPU?
The W-3175X gets to 5GHz all core with a chiller. Can AyyMD do that?
Liam Russell
Depends entirely on the rendering algorithm.
Grayson Jones
multi core performance doesn't matter
Lucas Johnson
Depends on the software you use. Check its GPU support - if it supports CUDA or OpenGL or something, get a beef GPU. You should probably get a beef CPU anyway though.
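To make the "check its GPU support" advice concrete, here's a minimal sketch of one way to probe for an NVIDIA GPU by looking for nvidia-smi. The helper name has_cuda_gpu is my own; a real renderer would enumerate devices through its own API instead.

```python
import shutil
import subprocess

def has_cuda_gpu() -> bool:
    """Rough check for a CUDA-capable GPU: find nvidia-smi and list devices."""
    smi = shutil.which("nvidia-smi")
    if smi is None:
        return False  # no NVIDIA driver tools on PATH
    try:
        out = subprocess.run([smi, "-L"], capture_output=True, text=True, timeout=5)
    except (OSError, subprocess.TimeoutExpired):
        return False
    # "nvidia-smi -L" prints one "GPU 0: ..." line per device
    return out.returncode == 0 and "GPU" in out.stdout

print(has_cuda_gpu())
```

If this returns False, that GPU renderer checkbox in your software won't do anything for you anyway.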
Isaiah Moore
>I am butthurt
Andrew Sanchez
Then don't buy a 32-core CPU
Liam Hill
>if not for some memory bandwidth limitations, might have snatched the performance crown from Intel in every benchmark there is
Jesus, this just goes to show how long a bit of misinformation and false assumption can endure. It's already been shown that it isn't a memory bandwidth bottleneck but a Windows 10 thread scheduler issue.
Aaron Evans
>It's already been disproven to be a memory bandwidth bottleneck
where?
>One important caveat with these results is that the Threadripper 3000 chip was tested under Linux, which tends to lead to better GeekBench scores than Windows
Landon Watson
I've been curious about something. Thinking about putting a 9900k in a dev workstation. I do game dev in Unity and would like to run a bunch of scripts in edit mode (essentially augmenting the existing editor), and this is not easily parallelized, hence the 9900k for its single core performance. But would I be better off with a 2990WX/3000/3900wx and doing an extreme overclock to 5GHz+ (sub-ambient w/chiller, phase change or LN2)?
Nobody gets high-end consumer GPUs in vfx... There's currently no GPU renderer that can render a production scene, and you almost always use your studio farm to render anyway. It's mostly mid-range consumer cards or Quadros, depending on how much discount you managed to (or not) get on the Quadros. If you're thinking about building a workstation for cg, your money is better spent on a couple of old Xeons like the E5-2670 and a 1070 or equivalent Quadro. You want as much memory as you can fit on your board.
Julian Jones
How the fuck is waiting for the official release and third-party benchmarks cope, you dumb shit?
Luis Miller
are you on crack?
Benjamin Myers
>still believing in Intel/AMD fake benchmark pissing wars when CPU performance increases flattened out YEARS AND YEARS ago and all high end CPUs now are basically the exact same but Intel/AMD still gotta make their fucking money from idiots so they pull this goddamn garbage to convince people to buy their brand new garbage that isn't any better than their old garbage
You are all idiots.
Nicholas Ortiz
care to explain why clocks and IPCs have increased? core count? i/o?
Eh, multicore doesn't really matter. I have a 2600K and GTA V runs fine.
Samuel Hill
As much as I hate Intel, how is this shit feasible? They don't call AMD CPUs "House fire" for nothing
Nathan Howard
Wrong.gif
Elijah Miller
Hi Intel marketing :)
Aiden Peterson
Hi Userbenchmarks!
Benjamin Peterson
>If you're a 3D modeler who renders photo-realistic scenes, is it worth it to have something like a Threadripper AND a top of the line graphics card like a RTX2080Ti or is that IDORT mode?
>I'm under the impression that with rendering you have to choose CPU or GPU, it can't be both at the same time. Which begs the question, which is better for rendering? a cpu or gpu?
Depends on what you're rendering. For small scenes GPUs can be quite good, but VFX often uses hundreds of millions of polygons and tens of gigabytes of textures, and GPUs can't fit that in VRAM. The latest GPU renderers are supposed to support streaming data out of system RAM, but I haven't tried them. Nvidia pushing up GPU prices while AMD drives down the price per core is eroding much of the GPU advantage. CPUs are better per watt, GPUs per dollar, mainly because you can put 3 or 4 GPUs in one PC while AMD/Intel charge thousands of dollars for a bit of microcode that allows 2-CPU support.
That just means we have a consumer platform with 4 or 8 memory channels, and a workstation platform with 8 channels.
They could easily leave it at 48 cores, which is enough.
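Back-of-the-envelope math for the VRAM argument above. The 64 bytes per polygon, the 20 GiB of textures, and the helper name are my own rough assumptions for illustration, not numbers from any particular renderer.

```python
def scene_footprint_gib(n_polys: int, bytes_per_poly: int = 64,
                        texture_gib: float = 20.0) -> float:
    """Rough memory footprint of a scene: geometry plus textures, in GiB."""
    geometry_gib = n_polys * bytes_per_poly / 2**30
    return geometry_gib + texture_gib

# A mid-sized VFX scene: 300M polygons plus 20 GiB of textures
footprint = scene_footprint_gib(300_000_000)
vram_2080ti = 11.0  # GiB on an RTX 2080 Ti

print(f"{footprint:.1f} GiB needed, fits in VRAM: {footprint <= vram_2080ti}")
```

Even with generous rounding, the scene is several times larger than the card's VRAM, which is the whole point about production scenes and GPU renderers.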
Juan Clark
>userbenchmark
Luis Bell
can i use this on a motherboard that has a ryzen 1600?
Grayson Turner
>pants end above ankle moslem detected
Andrew Sanders
At this point of diminishing returns, you have no real reason to upgrade at a certain point. They've long reached points in which it doesn't matter. APUs are weaker and are way more balanced if you want a computer that does everything all in one chip, but if you had an intel process from the 920 or an FX there's very little reason to upgrade. Sure I can't convert my porn as fast but in real world performance with an SSD I can't tell the difference, my laptop feels just as fast as my desktop and it's way slower in benchmarking.
Gabriel Williams
>intelbenchmark
Even pissmark is less garbage these days, but it will probably be """""updated""""" soon
IRL when does that score matter? I usually use my phone, I downgraded my main computer to a dual core, and you're only downloading loli and posting on Jow Forums anyway. I'm not against AMD but if you got a decent processor in the last 7 years it's not worth upgrading. Most people here are shit posting from their x220/x230 because memes like 4k, VR and multicore gaming are a joke, and anyone who has a brain is already using a rendering farm or just overclocking because it's good enough.
Lucas Morris
>64 core cpu
>worry about single core performance
But yeah, Intlel is fucked. What's wonderful with Zen 2 is that AMD's yields must be fabulous, and what they can't sell they'll put in next-gen consoles.
Eli Barnes
Quad core? What do you need dual core for? Here's your 1c/2t CPU bro.
>still giving a shit about intelbenchmark
It's pozzed beyond repair.
Easton Campbell
Well, they're not hiding the multi-core score yet, so it's still somewhat valid for comparison.
Robert Morris
>they're not hiding the multi-core score yet Soon
Christopher Ramirez
Doesn't matter as long as you ignore the % scoring.
Sebastian Gomez
ONE CORE AND ONE THREAD ARE ENOUGH FOR EVERYBODY
Elijah Murphy
They already hid the 64-thread score from the list in the main page, you have to add the column manually. They only show single thread and 8 core tests, just so the 3900X doesn't score higher than the 9900K due to the extra cores, lmao.
Parker Nguyen
Oh, I see. But dude, think about it. What if you ran an online database only surviving on ad revenue, and suddenly you're bathing in Jewish moneys?
Nicholas Sanders
>32 threads
>1 FLOP/s each
>How come my processor no fast? >:(
Single thread performance is important or there's nothing to multiply, genius. Also, single thread programs are still extremely prevalent.
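The "nothing to multiply" point is basically Amdahl's law. A minimal sketch, where the 95% parallel fraction is just an arbitrary example:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup when only part of the work parallelizes."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# Even 95%-parallel work tops out far below 64x on 64 cores;
# the serial 5% runs at single-thread speed no matter what.
print(round(amdahl_speedup(0.95, 64), 1))  # → 15.4
```

And that whole curve scales with single-thread speed, which is why it still matters on a 64-core part.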
Jose Lee
>In single-threaded workloads, it managed 5,523, which is again far higher than the 2990WX and W-3175X, which score around 4,800 and 5,150, respectively.
just kys
Hudson Wright
which company has less pajeets though?
Michael Williams
But it downclocks with more cores loaded, Einstein.
Brayden Green
What's your point, retard? This guy is saying single thread performance is meaningless; he's a complete fucking buffoon. I didn't say Intel was better, you drooling fucking mongoloid, I said that single thread performance is meaningful for multi-thread performance.
Mason Howard
Look, TR is there if you need more than the 16-core 3950X. If you're there, your workflow most likely doesn't give a shit about single core performance.
Jose Watson
Single thread performance IS meaningless for multi-threaded performance. When a single core is stressed it's allowed to boost to whatever clock speed it's capable of, but that doesn't mean fuck all when all the cores are stressed and the chip has to stay within its 250W TDP.
Jaxon Perez
Post the other chad with Navi
Luis Hall
Who cares? Intel 10nm will even the playing field, and Intel 7nm will be the end for AMD.
Grayson Peterson
INTEL 7NM SUPERPOWER WILL DESTROY AMD IN 2077
Eli Anderson
So amd will be selling all their trash CPUs for consoles. Actually selling the whole wafer. I wonder what's left of an Intel wafer when they're trying to make an 8 cores.
Jaxon Carter
By the time Intel finally releases 10nm desktop products, TSMC will already be shitting out 5nm EUV parts for Apple, AMD and Xilinx.
Nathan Lewis
>I wonder what's left of an Intel wafer when they're trying to make an 8 cores Lies and bribes