THANK YOU BASED AMD

Attached: Kekspot.png (1920x1080, 1.72M)

Why no mainstream dual socket motherboards?

Attached: Dual-AMD-EPYC-in-Supermicro-AS-2023US.jpg (800x472, 203K)

For what reason?
EPYC is locked, so no OC.
What other features could you possibly want that couldn't be better implemented on an add-in board?

more cores

youtube.com/watch?v=K4xctJOa6bQ

ebyn

They exist though, you can buy dual socket boards right now...

not dual socket core i7/i9 or ryzen though

BIG RED BARS

Attached: openssl.png (631x511, 41K)

Why though?
Dual socket EPYC and Xeon exist.

EPYC yes, but not Threadripper, and the same goes for consumer Intel CPUs like the 7980XE.
You cannot run two Threadrippers or two 7980XEs.
And lots of server CPUs in fact have lower clock speeds.
But even now there are dual-EPYC workstation solutions; IIRC it's only a matter of money.

>What other features could you possibly want that couldn't be better implemented on an add-in board?
What do you mean here?

>Steven "AMD Unboxed" Walton
(lol

He's a fucking hypocrite calling out Linus and some reviewers as Intel shills when his reviews are clearly skewed towards AMD

Nothing wrong with being an AMD shill, just be transparent about it as opposed to pretending that he's """unbiased"""

So basically you want 'muh overclocking dual socket CPUs!', which is a market both Intel and AMD have tried and found never makes money.
What you want will never happen.

That is strange. Why doesn't AMD just fucking unlock EPYC?
If it sells to some enthusiasts who want muh 128 threads, great; if not, no harm done.

Because it would cannibalize the server market.

>THANK YOU BASED AMD
grammar-monster.com/lessons/vocative_case_commas.htm

>Why no mainstream dual socket motherboards?
What do you mean by "mainstream"?

>dual socket 64c/128t Rome
Cinebench can't even handle that many cores and threads.

delid dis

Time to pull sc2 out of the drawer

PCIe v4 is going to make AMD shine so bright Intel will literally have a meltdown.

Starcraft 2 has terrible multicore support.

>Goyim know
Oy Shut it down

Quite frankly I don't care that much about the 2970WX and 2920X; what makes this launch cool to me is that the 1920X and the 1950X are on sale locally.

Attached: performance-per-dollar-fs8.png (1045x1030, 39K)

>Nothing wrong with being an AMD shill, just be transparent about it as opposed to pretending that he's """unbiased"""
You mean like all reviewers do? They all claim to be unbiased and try to appear as such, yet consistently pull stunts to make one side or the other look better. Just look at how Linus recently showed a temperature delta of -20°C compared to everyone else reviewing the 9900K. Oh, what a coincidence: Linus is sponsored directly by Intel!
Humans are never free of bias, some more than others. You're only seeing one case of bias because of your own internal bias. Take your head out of your own ass before accusing anyone else of bias.

>Effectively eliminating Skylake-X
How much did AMD pay him to put that in a headline?

I'm looking at other tech publications and none of them have a cringy title like that.

Nice try Steve

Next time add a "sponsored by AMD" on your videos so you won't get called out

>2018
>caring about youtube reviewers

The guy got exposed when it turned out he didn't know POWER CPUs with 4 and 8 threads per core exist.

Shouldn't this test be against Xeons though?

>2018
>Caring about performance at all
Intel is the better lifestyle brand, goy. Just buy it.

better not
one epyc is worth 2x xeons so i imagine the amount of salt from inlets

The real issue here is that Intel has stretched the "official spec" for their CPUs wider than goatse so that they can lie about official TDPs and boost clock/all core clock speeds to try and hide the fact that their CPUs are an overheating mess.

Oh no

OH NO NO NO

Attached: AMDBTFO.jpg (558x836, 119K)

Kill yourself back to /v/.

Go shill for diversity jewish people and feminism some more.

>something twice as expensive, twice as hot and twice as power consuming performs slightly better
oh wow, really?

Still gonna buy Intel, AMD sucks

>save $20 on his CPU and $1 a month on power
>has to spend $500 more on 4200mhz RAM so his poozen doesn't bottleneck at stock clocks

pajeet cope

Nobody cares how you waste your money.

all amdrones can do is cry about muh 10 jiggahurtz 7nm and how the 2700x is being sold next to nothing

no wonder amd stocks tanked, imagine if you can actually buy the 9900k at normal msrp

2700x+bdie ram is cheaper than 9900k+cheapest 2133 micron.

Can't wait for the 3700x on 7nm with 5ghz boost

>Threadripper thread
>Incels show up whining "MUH GAYMES" like useless faggots

Every fucking time.

The vast majority of the planet will still buy Ryzen even if Intel can reach 10GHz with passive cooling.

That's right! Intel's brand identity is the most compelling, so I'm going to keep being cool and buying blue, instead of being lame and poor and undesirable by associating myself with gross red!

Attached: npcmeme.jpg (1400x1400, 211K)

yeah but intards can't into math, you see

BASED INTEL PUSH FOR MORE GGGGGGGGHHHHZZZZZZZZZZ DESTROY THE PLANET FOR GGGGGGHHHHHHHZZZZZZZZZZZZZZ

he's doing it for free
>check'em

Attached: 1518659378509.jpg (2880x1608, 541K)

intel 10ghz superpower by 2020

6 gorillion jewgahurtz

>GAMES DON'T MATTER
>except when AyyMD has a lead
Kek

intel gpoo superpower by 2020

INTLEL 10NM SUPERPOWER BY 2020

>only games matter!*
*at 1080p

YIKES

Attached: 1540835698533.png (1327x1222, 69K)

MOAR HOUSEFIRES

>incels
tits or gtfo

So if I'm reading this correctly, it shows that you can buy a 9900K, slap on a D15, overclock it to 5GHz, and it will outperform any pajeetware poozen rig?

Attached: choice.jpg (992x1024, 132K)

>ONLY 4K GAMING MATTERS
>T-TTHANKS NVIDIA FOR HAULING OUR SHIT CPU

Attached: Screenshot.png (930x394, 45K)

>pajeetware
meanwhile intel dominates amazon sales in india and china while ryzen does the same in the west
>meanwhile intel is about to get a pajeet ceo

Sure. If you want to pay twice the price for 10% better performance.

okay, now post refresh rate of those monitors

Not twice the price after factoring in needing super expensive high speed RAM

Yeah, it's faster we got it.

BUT

Only in synthetic tests! They trick them with toy cores.

Intel's real world superiority is reflected in GAMES. Games tests are much harder to rig.

Good luck with your propaganda tho, haha ha.

It is twice the price considering you need a mobo capable of handling that much power and a watercooled setup.

It's in the same product range as the i9s
Ryzen 3/5/7 = Core i3/5/7
Threadripper = Core i9
EPYC = Xeon

How's that a fair comparison, though? You're comparing a CPU that's two months old to one that's over a year old. You should wait until November for the 9980XE for an honest benchmark.

March 2019 can't come soon enough. I wonder what Intel NPC's will be repeating then.

>when IntelAviv's last bastion is literally a manchild activity

Nobody's fault when companies release products months apart from one another.

The refreshed Skylake-X isn't any different from the original other than having solder.

the cheapest 2133 16 GB RAM is $100
3200 Vengeance LPX is $135
no wonder you're getting jewed so hard. you can't even do basic maths
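For what it's worth, the total-platform math sketched here can be checked directly. The RAM prices are the ones quoted above; the CPU prices are my own assumptions (approximate late-2018 US street prices), not numbers from the thread:

```python
# Rough platform-cost comparison. RAM prices are from the thread;
# CPU prices are assumed approximate street prices, not quoted here.

ram_2133_16gb = 100   # cheapest 2133 16 GB kit (quoted above)
ram_3200_lpx = 135    # 3200 Vengeance LPX (quoted above)

cpu_2700x = 330       # assumption: ~2700X street price
cpu_9900k = 580       # assumption: 9900K was selling above its $488 MSRP

amd_total = cpu_2700x + ram_3200_lpx
intel_total = cpu_9900k + ram_2133_16gb

print(amd_total)                 # 465
print(intel_total)               # 680
print(intel_total - amd_total)   # 215
```

Under those assumed CPU prices the faster RAM nowhere near closes the gap, which is the point being made; swap in your own local prices and the conclusion may differ.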

>muh manchild
>RYZEN
>THREADRIPPER
>EPYC

Lol 10nm was meant to come out what 6 years ago?

There is literally no point in arguing with Intel NPC's.

>XEON
>intel core EXTREME
>Skylake X

>Ryzen 3/5/7 = Core i3/5/7
Except none of those Intel lines support SMT.

Does nobody care about power consumption over here? The upcoming extreme editions from Intel will allegedly be 15% more power efficient. So why would I buy a 2990WX when I can't even use all the cores, and have to feed it more energy because it consumes 250W for 16 extra cores that will most likely sit at 0%?

Someone post it.

they're within the same power envelope, MSRP, platform, and overall performance

>15% more power efficient
15% better than a nuclear furnace? It does not matter anyhow as Zen 2 will be raping Intel for the next few years at least.

then don't get the 2990WX; the 2950X costs half the price with half the cores, and there's no way you'll ever save $900 worth of electricity (not including the price hike from the 7980XE to the 9980XE) no matter how intensive the workload is over its lifespan
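A back-of-the-envelope check of that electricity claim, assuming an 85 W load delta between the chips and $0.12/kWh (both my numbers, not the thread's):

```python
# How long would a lower-power CPU need to run 24/7 to save $900
# in electricity? Wattage delta and rate are illustrative assumptions.

delta_watts = 85        # assumed extra draw of the hotter chip under load
rate_per_kwh = 0.12     # assumed electricity price, $/kWh

hours_per_year = 24 * 365
kwh_per_year = delta_watts * hours_per_year / 1000   # 744.6 kWh
cost_per_year = kwh_per_year * rate_per_kwh

print(round(cost_per_year, 2))        # 89.35 dollars/year
print(round(900 / cost_per_year, 1))  # 10.1 years to reach $900
```

Even running flat out around the clock, it takes roughly a decade for the power difference to add up to $900, which supports the post's point.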

Well, on paper it's 165W i9 vs 250W TR2. We'll see when they ship.

7980XE is $1999, while 9980XE is $1979, so there's no price hike. The reason I mentioned the 2990WX is because that's the corresponding alternative to 9980XE both in performance and in price; AMD's being ~$150–200 cheaper, depending on region.

For those few seconds before thermal shutdown, at twice the price and power draw, it will be 10-12% faster in Intel's sole winning use case. Congratulations.

There's no reason for that to exist, if you can buy a 32 core CPU.

This, games are the single most important thing in the universe

>The only use case that exists is the use case where the multibillion dollar corporation I have identified my ego with wins.
(You)

That's TDP, not power consumption, and a single Zen core draws much less power than a single Intel core.

It's no use replying to NPC's trying to spread lies and falsehoods that other NPC's told them were true.

While the difference is not very large, it's there, it's undeniable, and it adds up if your workstation is on almost 24/7.

Attached: Annotation.png (707x532, 174K)

>32 core part vs Intel's 18 core part (i9-7980XE)
>uses only a few percent more power yet hands Intel its ass on a plate

As I've said, you should wait for 9980XE, the real contender for 2990WX.

>excuses, excuses, excuses

>18 cores vs 32 cores
>can only boost to 4.5Ghz housefire
>Industrial chiller not included
Intel TDP is meaningless to anyone but Linus and OC3D paid shills.

So the truth is an excuse now? You are really reaching.

Yet with all those 32 cores it only does two things well: Blender and C4D. Even the 2950X outperforms it in many cases, which should never have happened.

There's literally no point for a CPU to even exist other than video games

4K gaming won't matter until Nvidia puts out a midrange card (price-wise, not the current clusterfuck that is the RTX 2000 series) that is going to reliably provide 60 FPS at 4k.

>People on HEDT usually buy a CPU based on the workloads they will be using them for.
No kidding!
>But muh games!