SINGLE THREAD PERFORMANCE DOESN'T MATTER!

SINGLE THREAD PERFORMANCE DOESN'T MATTER!

>5GHz wasn't even needed for AMD to have better performance than Intel
It's like Intel vs Bulldozer (which clocked higher than Intel back then but had worse performance due to bad IPC).

Attached: 1558929062636.jpg (2053x1025, 137K)

I wonder how Intel's 10nm is going to clock. For some reason I'm expecting less than 5GHz

0ghz because it'll never come out

There's already the 8121U, so at least 3.2GHz

We'll have to wait for next year :^)

but it has higher single thread performance as shown in the pic

>been out for a year

where?

Imagine, 2 years ago the 9900K would have been $1500
thanks AMD.

Why would you buy a slower, more expensive, security-hole-ridden, power-hungry CPU?
I don't know why anyone would buy Intel at this point

wtf are you talking about? I literally only said intel massively dropped their prices cuz of AMD..

why is the single thread performance so bad?

You mean 2023?

Is this before or after the latest mitigations?

Ayymd barely beats a CPU on 14nm while being on 7nm, in their own benchmarks
I'll see when the real benchmarks come out
Also lower clocks, lower core count and higher price than what AMDrones have been shilling for the past months

That no one should buy Intel even at these lower (but still higher than AMD) prices? Including all the "benefits" I mentioned previously

Why would you need 10nm in 2023? 2024 it is
Will be releasing in 2025
>2026. Intel releases 10nm desktop CPUs

10nm is DOA and relegated to low-power mobile applications. There will never be a high-performance desktop unit on 10nm.

AND sucks
nothing new here

Based phoneposter.

>It's like Intel vs Bulldozer (which clocked higher than Intel back then but had worse performance due to bad IPC).
Not really, the only thing that killed Bulldozer was the retarded decision to delay it for a year,
because we know for a fact now that it was shitting on Lynnfield big time, but...

>believing anything a big company has to say
>inb4 seething Intel retard
I own a 2700X. Wait for user benchmarks.

stay poor

Attached: Untitled.jpg (726x472, 80K)

It's never going to happen.
Intel is a diverse workplace now. Its stock price is going to tank to zero while asians take over.

>SINGLE THREAD PERFORMANCE DOESN'T MATTER!
SHUT UP GOY SHREEEEEEEEEEEEEEEEEEEEEEEEEEEE

shut up goy

at least try to hate with a good think_again pasta.

Depends how hot you like your PC components.

They themselves said it's going to be lower

Attached: intel-suicide-watch.png (2880x1620, 369K)

10NM BTFO

>AMDrones now care about single core performance
Keep moving the goalposts, shill

>Intel lost the single instance where they were still ahead
keep moving the goalposts, shill

Prepare to be disappointed when actual fair reviews and benchmarks are published... again

You Intel dudes are like the bitconnect people when it comes to denial, and is making faster chips that cost less and use less energy, that is all.

AMD*

Like Principled Technologies benchmarks?

Intel 7nm in 2077 will BTFO AMD

>AMDrones now care about single core performance

Okay, so even if it doesn't, Intel still gets obliterated in multi-core, as it has been for years now.

Where do we go from here?

Attached: 10764815.gif (195x229, 1.21M)

Uhhhhhhhhhhhhhhhhhh

CPUs don't matter

checked

Based and redpilled

Attached: 1558194662434.png (166x166, 14K)

Unironically this. Intel will start pushing for GPGPU instead

Nothing about frequency there?

Inb4 they start making game consoles.

Based quads for truth

GLOBALFIRES

Ryzen 2 doesn't have a 15% IPC boost.
Their single-core score is 15% better in SYNTHETIC benchmarks,
something Ryzen always does well in due to their SMT being so efficient,
so real-world IPC gain is maybe 5%

Low because it doubles as a furnace.
The SDPs require liquid cooling or double-size heatsinks, and all of them are BGA.
Icelake must have only been named out of irony.
t. works at intel
At the rate of development this could be true. If Intel's stock keeps dropping I can only see Icelake being rushed even more than it currently is. One of my colleagues had 2/3 of his team laid off and we've been short-staffed as it is.
And ICX isn't even that much faster than Skylake.

>t. works at intel

Attached: 1528269341610.jpg (466x591, 23K)

Are you stupid? SMT has no effect on single threaded scores.

You don't have to believe me. I obviously have no way of proving it without losing my job.

Attached: e1b.jpg (680x383, 46K)

>firecuck
>botnet edition

Attached: 8567478765867598679876.jpg (741x402, 41K)

Attached: 1457369900063.gif (600x313, 860K)

>3900X +4% single thread score over 9700k
NO INTELBROS IT WAS OUR LAST DEFENSE

Attached: 205af7691b98b54bac4c566fa9a13f146f820541c77ca7976c44988bc3f54d2f.png (800x618, 937K)

IPC is good but IPC plus moar cores is better yet

this

NAGATORO NO

techspot.com/news/80241-amd-announces-ryzen-9-3900x-flagship-desktop-cpu.html

AMD is light-years superior to Intel in APUs. Intel uses AMD for their NUCs.

The Ryzen 2200G and 2400G are capable of getting close to Xbone/PS4 console performance.

ok? who gives a fuck? nobody was talking about buying their shitty CPUs

>Intel CEO walks out on stage
>Hey hey heeeeeyyyyy, WASSUP Intlets!
>Nononononono Massive SCAM

>SMT has no effect on single threaded scores
*Breathes in*
that's a YIKES from me.

why is 9920x single thread so garbage?

Low turbo?

cheeky cunt

It doesn't, but AMD still kicks ass in multithreaded workloads too.

Yeah but Intel 10nm > TSMC 3nm

How the fuck has Intel been getting spanked this hard by AMD over the past couple of years? It's not even slowing down; in fact, it's accelerating.

Here's how I understand it:
>make a CPU that skips mandatory stuff while running which makes it super fast
>competition can't understand how you got so far ahead
>competition starts pushing for more cores instead
>it fails to get interest
>dominate the market for years and years while making bank
>competition hires the superhuman mastermind engineer to make a new CPU from the ground up
>it's an awesome design that's easy to make and cheap for end users
>you have nothing as a backup plan even when you had a decade to prepare something legit
>try to downplay the competition since it's a new process and the first generation is still rough around the edges
>suddenly your jewish tricks got exposed and people know how you got your speed advantage
>it can't be left open and patching it takes half off the performance
>you still have nothing to fall back on and now your CPUs are slow
or in a single word: greed
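The "skips mandatory stuff" part is basically the Spectre/Meltdown class of bugs. Below is a minimal sketch of the published Spectre v1 bounds-check-bypass gadget; the names (array1, array2, array1_size) and the 4096-byte stride follow the public write-up and are purely illustrative, not anything from Intel's or anyone else's actual code.

/* Sketch of the classic Spectre v1 "bounds check bypass" gadget.
 * Illustrative only: names follow the original public disclosure. */
#include <stddef.h>
#include <stdint.h>

uint8_t array1[16];
uint8_t array2[256 * 4096];   /* probe array: one cache line per possible byte value */
size_t  array1_size = 16;

uint8_t victim_function(size_t x) {
    uint8_t y = 0;
    if (x < array1_size) {    /* the "mandatory" bounds check */
        /* If the branch predictor guesses "in bounds" for an out-of-bounds x,
         * both loads still execute speculatively. The secret byte array1[x]
         * selects which line of array2 gets pulled into the cache, and that
         * cache footprint survives even after the CPU rolls the result back,
         * so an attacker can recover the byte by timing accesses to array2. */
        y = array2[array1[x] * 4096];
    }
    return y;
}

The mitigations the post is talking about (fences, retpolines, kernel page-table isolation) exist to stop exactly this kind of speculative leak, which is where the performance cost comes from.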

Intel put all their eggs into the 10nm basket and then got fucked by low yields and other failures, while AMD came out of nowhere with a competitive CPU and caught them off guard, since they had it easy for the previous 6 or so years vs Bulldozer. They then had to fall back on squeezing out everything they could from Skylake to stall for time while they tried to salvage things, hence the 14nm+++ etc. There is only so much you can do without either a node shrink or a whole redesign, and they are locked in waiting for 10nm, which keeps getting delayed.

It's implied in the perf graph, and even without the graphs it's a no-brainer: all of the Skylake revisions can only hit 5GHz because of node improvements. If it weren't for increased clocks, the 9900K would perform almost exactly the same as the 6700K; Intel's "IPC" gains over the past 4 years are effectively zero.
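The "effectively zero" claim is just perf ≈ IPC × clock rearranged: divide each chip's single-thread score by the clock it actually ran at and compare. A rough sketch of that arithmetic is below; the scores and clocks are placeholder numbers, not real measurements, so plug in figures from an actual benchmark run.

/* Back-of-the-envelope IPC comparison: normalize single-thread scores
 * by the clock each chip ran at. All numbers below are placeholders. */
#include <stdio.h>

int main(void) {
    double score_6700k = 100.0, clock_6700k = 4.2;   /* placeholder score, GHz */
    double score_9900k = 119.0, clock_9900k = 5.0;   /* placeholder score, GHz */

    double per_ghz_6700k = score_6700k / clock_6700k;
    double per_ghz_9900k = score_9900k / clock_9900k;
    double implied_ipc_gain = per_ghz_9900k / per_ghz_6700k - 1.0;

    printf("score/GHz: 6700K %.2f, 9900K %.2f -> implied IPC gain %.1f%%\n",
           per_ghz_6700k, per_ghz_9900k, implied_ipc_gain * 100.0);
    return 0;
}

If the score ratio tracks the clock ratio, the implied IPC gain comes out near zero, which is the point being made.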

You forgot the part about the revolving door between the NSA and Intel, and how the Intel engineers had NSA clearance

Without going into detail: Intel got lazy to the point where it affected their product pipeline. Their latest products are on a nearly 8-year-old architecture, and they had no impetus to change it until this year.

It'll hit around 3GHz. Not enough to beat AMD.

Cope. :)

Well, in no uncertain terms, they went with a special element from the periodic table to handle the vias in the die's interconnect layers as they built up the circuitry, only to find that said element, though theoretically insanely performant, is in practice not such a good idea. It'd basically fracture like something brittle after a certain number of layers were deposited, at which point the entire fucking die goes into the trash, because you can't fix such breakages.

It's why first gen Ice Lake CPUs were

Intel needs 180W to break even with an AMD chip, and they have the balls to charge me twice the money?

Can anyone back this up with sources? If this is true, why hasn't their stock tanked?

:)

It matches with other things I've heard about Intel's 10nm. Their stock hasn't tanked because until AWS, Microsoft, and Apple stop buying Intel they've got a license to print money.

10nm is a dumpster fire. It has worse thermals, worse clocks and worse yields. The first batches had to have the iGPU disabled to have any yields at all.

What element?

>worse clocks
nope
JIM KELLER said himself their new 10nm will have no clock regression from 14nm+++

>new 10nm
I'm talking about their current 10nm.

Why would Microsoft et al. buy an inferior product? Investors aren't stupid

If clocks are the same then expect no IPC gains whatsoever

Because of brand value. Some people only buy stuff with Intel in it.

Attached: 1558536766998.gif (498x280, 2.27M)

Hmm, true, but if what you say is true long-term, then Intel is doomed anyway and investors should see the writing on the wall.
I'm an Intel employee, but I don't work on CPUs.

>their current 10nm.
Yes, the current in-production 10nm is not the same as the original.
It's less dense and doesn't use cobalt interconnects.

>expect no IPC gains
Jim Keller wouldn't talk about IPC, but he did say the new 10nm Sunny Cove could do a lot more at the same time.
Whatever that means, but it's Jim Keller, it's gonna be good

10nm is 14nm now.

The amount of Intel cope in the comments is palpable.

>Jim Keller wouldn't talk about IPC, but he did say the new 10nm Sunny Cove could do a lot more at the same time.

It doesn't matter; IPC gains can only be made in a specific way, and since there won't be any clock reduction, it means the bus will remain the same

:-)

cobalt

(-:

hexus.net/tech/news/industry/113090-intel-use-cobalt-interconnect-layers-10nm/

Can confirm, they chose cobalt. Another problem with cobalt is its low thermal conductivity (about one fourth of copper's), which leads to hotspots in the deepest layers and eventually causes interconnects to fracture.
Cobalt's thermal conductivity is so low that it's also used in heat-resistant alloys.
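To put rough numbers on that: by Fourier's law the temperature drop across a conductor of fixed geometry and heat flow scales with 1/k, so swapping copper (~400 W/m·K bulk) for cobalt (~100 W/m·K bulk) roughly quadruples the temperature rise across the line. The sketch below just does that ratio; the conductivities are textbook bulk values, and real thin-film interconnects behave worse.

/* Fourier's law: dT = q*L / (k*A). For identical geometry and heat flow,
 * the temperature rise across a line scales with 1/k.
 * Bulk conductivities used here; thin films are generally worse. */
#include <stdio.h>

int main(void) {
    double k_copper = 400.0;   /* W/(m*K), approximate bulk value */
    double k_cobalt = 100.0;   /* W/(m*K), approximate bulk value */

    double ratio = k_copper / k_cobalt;   /* dT_cobalt / dT_copper */
    printf("Same heat flow and geometry: cobalt runs a ~%.1fx larger\n"
           "temperature drop than copper.\n", ratio);
    return 0;
}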

KELLER?!