14nm +++++++++++

Why can't Intel do 10nm again? I really don't know why they're stuck on that 14nm process

Attached: somefrog.gif (336x468, 199K)

they're gay

because they cant do better

The way I see it, they became complacent as the top dog. They also own and operate their own semiconductor foundries, meaning their foundries were equally complacent. AMD, on the other hand, doesn't make their own silicon; they contract foundries like TSMC, which, regardless of AMD, keep getting better and better, since that's their sole objective in the first place. AMD doesn't have to worry about creating a new node; they use TSMC's nodes while focusing on design. Either that or Intel is just fucking dumb.

One of the issues (there are others) is that when you get down to 10nm you start getting quantum effects (electron tunneling and leakage) in the state of your gates. So you end up spending more time recalculating and you lose some of your performance.
I'm sure there are other issues going on, but when it starts affecting the bottom line with defect increases, I think that's when management steps in.
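If you want a feel for how fast tunneling leakage blows up as the gate stack shrinks, here's a back-of-the-envelope Python sketch using the textbook WKB-style estimate T ~ exp(-2*kappa*d). The 3 eV barrier and the widths are made-up illustrative numbers, not Intel process data:

import math

# Rough WKB-style estimate of electron tunneling through a rectangular
# barrier, T ~ exp(-2*kappa*d). Purely illustrative numbers below.
HBAR = 1.054571817e-34   # J*s
M_E = 9.1093837015e-31   # kg, electron mass
EV = 1.602176634e-19     # J per eV

def tunneling_probability(barrier_ev, width_nm):
    """Approximate transmission through a rectangular barrier (WKB)."""
    kappa = math.sqrt(2 * M_E * barrier_ev * EV) / HBAR  # decay constant, 1/m
    return math.exp(-2 * kappa * width_nm * 1e-9)

# Thinning an assumed 3 eV barrier from 2 nm to 1 nm raises the
# tunneling probability by roughly eight orders of magnitude:
for width in (2.0, 1.5, 1.0):
    print(f"{width:.1f} nm barrier: T ~ {tunneling_probability(3.0, width):.2e}")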

simplified explanation:
When Intel designed the 10nm architecture they used cobalt instead of copper for some layers, because at 10nm the cladding required by copper starts being a problem. But cobalt brought some significant problems with it: it's more brittle and conducts heat poorly compared to copper. This led to hotspots forming, causing the more brittle cobalt to fissure. This is the reason they're going with low-power parts first. Also, the limited earlier releases of CPUs reportedly had really high failure rates in use, so expect Intel 10nm to work a max of 1-2 years before it shits the bed.

Attached: nn0h9os8n7f31.jpg (856x584, 63K)
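To put rough numbers on the copper vs cobalt heat argument, here's a quick 1-D conduction sketch (dT = flux * thickness / k). The conductivities are ballpark bulk textbook values (thin films are worse), and the heat flux and layer thickness are assumptions picked just to show the roughly 4x gap:

# Temperature drop across a thin metal layer under 1-D steady conduction.
# Bulk conductivities are ballpark textbook values; flux/thickness are assumed.
CONDUCTIVITY = {"copper": 400.0, "cobalt": 100.0}  # W/(m*K), approx. bulk

def delta_t(heat_flux, thickness, k):
    """dT across a layer: heat flux [W/m^2] * thickness [m] / conductivity."""
    return heat_flux * thickness / k

heat_flux = 1e7    # W/m^2, assumed local hotspot flux
thickness = 1e-6   # 1 micron of metal, assumed

for metal, k in CONDUCTIVITY.items():
    print(f"{metal}: dT ~ {delta_t(heat_flux, thickness, k)*1000:.1f} mK per micron")

Same heat, same geometry: the cobalt layer runs about four times hotter across its thickness, which is where the hotspot argument comes from.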

Too much diversity.

oh and for good measure, some analysis I lifted from a forum.

If, in what must have looked to some people at Intel like a touch of genius, you build a stack
- using metals with significantly different thermal expansion coefficients (roughly 16.5 for Cu vs 12-13 for Co, in 10^-6/K),
- with one of them being brittle and having 4x worse thermal conductivity on top, making hot spots even hotter,
how in the world are you going to fix that?!
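Just to make the mismatch concrete, here's a tiny sketch of the differential strain those CTE figures imply over a temperature swing. The swings are assumed numbers and this is nowhere near a real reliability calculation:

# Differential thermal strain between Cu and Co layers for a given temp swing.
# CTE figures from the post above; the swings are assumptions for illustration.
CTE_CU = 16.5e-6   # 1/K
CTE_CO = 13.0e-6   # 1/K (quoted range 12-13)

def mismatch_strain(delta_t):
    return (CTE_CU - CTE_CO) * delta_t

for swing in (40, 60, 80):   # assumed junction temperature swings, K
    print(f"dT = {swing} K -> ~{mismatch_strain(swing)*1e6:.0f} microstrain of mismatch")

A couple hundred microstrain per swing doesn't sound like much, but a brittle metal gets to eat it on every single thermal cycle.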

Even if your eventual indicator of chip health is its performance on a test rig under conditions closely matching those likely in the real world, the rig is one thing: every single setpoint is your choice, and even if you manage to end up with a stepping which begins to look, if not yet sellable, at least reliable as long as you don't hit some nasty corner cases, the real world is a different thing. Where thermal cycling is concerned, every app has unique "fingerprints": it exercises the various blocks of a chip in a specific manner, activating them with a certain probability, and even that probability is usually a variable, not a constant.

See what I'm getting at? Different people use different apps; there are millions of apps and billions of people. Some never power down their PC, while those working on the run can flip the switch a dozen times a day. Wanna simulate that or build a math model for the distribution of service life and probability of failure? Good luck with that.

Besides, if you know for a fact you have corner cases where your chip crumbles, you simply can't guarantee they won't happen with real applications, no matter how unique and improbable the cases look (let alone when you know killer apps do exist, pardon the pun). To add insult to injury, in the real world fans tend to slow down or stop, TIM dries out, voltage regulators fail in different ways, dirt-cheap or faulty electrolytic caps age quickly, and the list goes on. All of this starts to matter a lot more for a chip with a razor-thin reliability margin.
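And to illustrate why the "good luck with that" jab lands: even a toy Monte Carlo with a crude Coffin-Manson-style fatigue law (cycles-to-failure ~ C / strain^2) spread over randomly varying usage patterns produces a huge spread of predicted lifetimes. Every constant here is a made-up assumption; only the spread is the point:

import random
import statistics

# Toy Monte Carlo of service life under thermal cycling. All constants are
# invented for illustration; only the shape of the result matters.
C = 5e5   # assumed fatigue coefficient
random.seed(0)

def simulate_user():
    """Assumed service life in years for one randomly drawn usage pattern."""
    cycles_per_day = random.uniform(2, 30)        # power/load cycles per day
    strain_per_cycle = random.uniform(1.0, 4.0)   # arbitrary strain units
    cycles_to_failure = C / (strain_per_cycle ** 2)
    return cycles_to_failure / (cycles_per_day * 365)

lifetimes = sorted(simulate_user() for _ in range(10_000))
print(f"5th percentile:  {lifetimes[len(lifetimes)//20]:.1f} years")
print(f"median:          {statistics.median(lifetimes):.1f} years")
print(f"95th percentile: {lifetimes[19*len(lifetimes)//20]:.1f} years")

When the low tail of that distribution lands inside the warranty period, that's when management gets the phone call.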

Wtf is this thread

Intel literally announced a full lineup of new 10nm chips with Gen11 coming out in ultrabook laptops and 2-in-1s in the next month. Clocks are up to 4.1GHz on 4 cores / 8 threads, but with much better single-core perf due to the new arch.

Info from Tom's Hardware, posted yesterday

>only four cores

Attached: 9th-Gen-Slide-1-640x339.jpg (640x339, 46K)

Stop embarrassing yourself, user. I said the 10nm chips were up to 4 cores / 8 threads, and I was right.

Attached: x.jpg (1510x849, 1.06M)

>miracle 10nm core still slower in real world applications than last gen 14nm+++++ even under 100% fan closed-door testing

Attached: 111761.png (650x250, 18K)

>NOT 6GHZ
>not the 10-20ghz we should have got with moore's law

I don't give a shit. I just posted to stop OP's bullshit that Intel isn't shipping 10nm. They are, and they're very possibly going to be in the new XPS 13, which I want.

Why is your slide for 9th gen, retard?

That's not how Moore's law works.
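Quick illustration of the difference, since this comes up every thread: Moore's law is about transistor count roughly doubling every couple of years, not clock speed; clocks stopped scaling when Dennard scaling broke down in the mid-2000s. The 2006 baseline figures below are rough illustrative numbers, not exact product specs:

def project(start_value, start_year, end_year, doubling_years=2.0):
    """Compound one doubling every `doubling_years`."""
    return start_value * 2 ** ((end_year - start_year) / doubling_years)

# Transistor counts did keep (roughly) doubling:
print(f"transistors: ~{project(300e6, 2006, 2019) / 1e9:.0f}B by 2019, from ~300M in 2006")

# Naively applying the same doubling to a ~3 GHz 2006 clock gives a number
# nobody ever promised, because Moore's law says nothing about frequency:
print(f"naive clock: ~{project(3.0, 2006, 2019):.0f} GHz")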

They've been "shipping" the 10nm Cannon Lake Core i3-8121U for a long time. Too bad it's terrible, and those new 10th gen chips will be bad as well.

Attached: 105628.png (650x350, 34K)

How did you get from me posting a press release about ninth gen CPUs to thinking I'm talking about the new shit? I'm bitching that they're halving the core counts on 10nm.

Attached: 8130-vs-8121-Power.png (678x444, 78K)

But Intel aren't really shipping 10nm. You can't buy a single chip directly from them yet, and the number of chips given to OEMs is pitiful. It's a paper launch, and even then it's a product slower than what it's meant to replace. And from what I've read, their yields are still god awful and barely getting out of the single digits.

It's really not that they can't, it's just that they got way too complacent and thought they wouldn't need it any time soon.
They've been top dog for so long they decided to rest on their laurels, because "lol AMD will never reach us".
Compare them to Nvidia: I'm pretty sure the Super cards Nvidia released were supposed to be the original, normal RTX cards, but Nvidia sold us detuned cards simply because they could.
Then when Navi came around they just gave us better cards outta nowhere at the Navi price point.
I honestly dread to see what Nvidia can do with 7nm and how they might rape the market again.
tl;dr: Intel didn't work on anything while they were ahead.

Yeah, that was about 2 years ago I think; they shipped to a Chinese OEM with zero fanfare because the yields were awful. I reckon things will be different this year, especially since their new GPU perf is so much better. That will be a real boon for ultrabooks.

You can't call a product announcement a paper launch. They never said it's on sale now, and the whole thing was under embargo until yesterday anyway. We'll know soon enough what the actual performance is, because they'll start showing up in actual laptops this year. Just because you can't buy one doesn't mean Dell hasn't.

There was a thread about a year ago where an alleged Intel insider listed some reasons, like bad management and sticking to techniques for the shrink that caused more trouble than they solved.

There will never be 10nm desktop chips, exactly because their 10nm is so fucked up.
Intel plans to keep producing 14nm until 7nm (2022, hopefully).

>company announces product
>product is not available anywhere for 1year+
>10nm has been delayed for over 4 years now
it is a paper launch, you dumb kike.

Meanwhile Intel is adding more 14nm fabs.

Their current, rather small 10nm production is just so they can tell their investors "look, we got 10nm to the mass-production phase just like we promised". At this point it looks like they're gonna ride that 14nm horse until they get their 7nm process ready for launch.

Yesterday's announcement is not a paper launch, you muppet

>let's make it even smaller
>but it will be lava hot, don't we need alternative solutions and better engineering?
>don't worry, you will just need to drop a bucket of ice on it every 30 minutes and connect a room size AC directly to the side of your new PC
>genius

>Intel literally announced a full line up of new 10nm
Like they have for the last 7 years? Okay.

1 - Engineering is a lot more complicated than your hobbyist programming junk.
2 - We already have way more processing power than we need, and a lot of applications use brute force when they could just be better made.

Attached: laps cock.png (299x181, 85K)

to make progress you need white people, asians are only good to copy and make small tweaks and even that only for a short period of time, then it's all memes and rgb keyboards in overheating laptops

Very interesting, thanks for posting this

So you don't think any of these chips are going to be in ultrabooks by the end of the year?