Can someone explain to a tech newbie how AMD is close to releasing 7nm already and Intel is still on 14nm...

Can someone explain to a tech newbie how AMD is close to releasing 7nm already and Intel is still on 14nm? Isn't Intel supposed to be ahead in research and development? Don't they have more resources? How did this happen? I might go with AMD for my next build for this reason.

Attached: 0a56b8099a5247d3fcca707eafb3d43e_XL.jpg (750x563, 122K)

Intel was cruising with their legs on the table, shipping 4-core CPUs with the occasional architecture tweak and node shrink for 10 years while AyyyyMD was BTFO. Then AMD released Ryzen, that got a lot of people to switch, and Intel didn't expect it.

I watched a video yesterday that said Intel's 10nm is actually going to be just slightly denser than AMD's 7nm. If that's true, the naming is kind of misleading.

Intel are struggling because they haven't had to put in any serious work for the last 7-8 years. Sandy Bridge dominated the Bulldozer chips and Intel were in a very comfortable position with virtually no competition, which is why there have been barely any gains between successive generations of their chips.

Also Intel has their own fabs, so they have to do process research on their own, while AMD uses the third-party fabs everyone else uses, so they all share the node-shrink tech, I think.

Because TSMC is ahead of the curve while Intel became lazy and decadent.

That's false. Intel's original 10nm isn't quite as dense as TSMC's 7nm, and if the rumors are to be believed, Intel has been forced to alter their 10nm node in ways that reduce density to get it to work at all.

Because they're all lying; the nm figure is just marketing.

Intel's fabs are going downhill because most of their engineers left and TSMC gets heavy cash injections from other tech giants so their fabs will easily get ahead.

So basically Intel is shit and AMD is the best, that's all you need to know.

Attached: 1528292848450.jpg (811x1024, 223K)

TSMC 5nm risk production in 2019

Attached: Fin-vs-GAA-FET.jpg (600x249, 34K)

Intel's 10nm is denser on logic, less dense on SRAM; it probably evens out. 10nm+ will be 10% denser. The only rumors I've heard that Intel is faking 10nm are from Charlie, who's half full of shit.

What? It's the other way around. TSMC has better density metrics across the board compared to Intel's 10nm: total logic cell area is lower with TSMC, and SRAM is denser.
Intel isn't struggling because they've been lazy, either.

Because Intel went with balls-to-the-wall insane targets for their 10nm while firing their experienced process engineers and replacing them with H-1Bs.

Is GloFo even making money?

Intel has basically squandered their monopoly for 10 years and have been milking their product line for massive gains with no real improvements.

meanwhile even a fucking phone has a 7nm cpu inside

Attached: apple-a12-descriptor.jpg (2628x1508, 265K)

Yes, they'll have a profitable operation without needing cash infusions from now on. As of late August they had over 50 customers on their 22FDX process. It's likely that the upcoming 12FDX will net them even more design wins.

Global Foundries didn't give up on high performance nodes, they just went with SOI instead of bulk FinFETs.

Girl power ;)
Step away silly boys let me show you how a real woman innovates ;)

Attached: fb1e4b34-7ff5-47e1-af0f-8a23e11e7eac.png (349x456, 190K)

I wish women were, on average, at least a quarter as competent as Lisa is.

Kek. She'd be making coffee for execs at Intel, the only reason she is at AMD is because she couldn't get a job at Intel

t. BS

lots of free time brian, eh?

Attached: t.JUSTnich.png (1002x1480, 1.25M)

> meanwhile even a fucking phone has a 7nm cpu inside

So? They're just using TSMC's process. Just because Intel is a bunch of cucks doesn't mean the rest of the industry has to be.

she cute

Attached: 1525104475161.jpg (757x627, 87K)

the basic principle of 3nm AMD is this:
if (electron.is_tunneling()) {
    fix_with_magic();
}

Intel got taken over by poos

>another redditor meming about muh quantum tunneling without knowing anything about it
Current leakage isn't magic. You know how you keep the current under control in an increasingly short channel? You use a GAA. Retard.

AFAIK other fabs are richer than Intel, simply because everyone is buying from them.

Probably doesn't help that most of Intel's profits go to corrupt CEOs receiving massive severance pay.

Because Intel is the last cutting-edge stronghold of the "fabbed" model, while AMD is fabless and has their products made by other companies like GlobalFoundries and TSMC.

so 7nm+ means more nanometers than 7?

Intel rested on their laurels for too long. They also did not foresee the Koreans and Taiwanese eventually catching up and surpassing them. Intel is always held back by its own hubris.

AMD is a fabless company that was sunk for a decade; they piggybacked off TSMC, which got really big from Apple, Huawei, Qualcomm and Nvidia contracts.

It means 7nm with EUV integration. It decreases costs and die area, increases density, and slightly reduces power consumption.

>TSMC wasn't big before the smartphone boom

Literal underage.

No one knew about TSMC 10+ years ago, it was only Intel and GlobalFoundries.

They only started surfacing a few years ago.

DESU he made some alright decisions, like Intel diversifying and entering other markets besides regular PC CPUs. Although I still don't understand why they bought McAfee.

Jesus Fucking Christ

They've made Nvidia and ATI gpus for longer than that.

how the fuck do you have this little awareness

everyone has known about TSMC for the past 10 years.

Attached: 1444849627475.jpg (272x285, 22K)

Let's see if they can actually make FDSOI work this time. STM shipped some 28nm FDSOI designs, but no one else was able to make it work. NXP's chips have all been in pre-production for fucking years.

What year are you from?
FD-SOI has always worked. What you're trying to refer to is a subset of FD-SOI called ET/UTBB that utilizes body biasing to create a tunable double-gate effect. Even then, GlobalFoundries' 28FDX was a cash cow for them; the process had no issues. Samsung is offering their own competing family of advanced SOI nodes they call FDS.

>The only rumors I've heard that Intel's faking 10nm is from like Charlie who's half full of shit
Right, but how do you know his info on this is in the bullshit half?

Charlie posted info straight from Intel in his paywalled articles; he didn't post any made-up rumor-mill WCCFtech shit. He didn't say they were faking anything, either. You couldn't make it any more obvious that you saw the headlines and never read the articles.
Intel isn't faking anything, but they are not going to put designs into volume production with the pitches they publicly detailed when they first presented their 10nm process to investors.
Intel's original 10nm node was behind TSMC's 7nm in density.
Intel's current 10nm node is even further behind.
TSMC's 7nm EUV will improve on density even further over their current 7nm.
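For what it's worth, the density comparisons above mostly come down to pitch arithmetic: logic density scales roughly as 1/(CPP × MMP), where CPP is the contacted poly pitch and MMP the minimum metal pitch. A minimal sketch of that comparison, with made-up placeholder pitches rather than real vendor figures:

```python
# Crude logic-density proxy: density scales roughly as 1 / (CPP * MMP),
# where CPP = contacted poly pitch and MMP = minimum metal pitch.
# The pitch values below are illustrative placeholders, NOT vendor figures.

def density_proxy(cpp_nm, mmp_nm):
    """Relative logic density in arbitrary units (1 / pitch area in nm^2)."""
    return 1.0 / (cpp_nm * mmp_nm)

node_a = density_proxy(54, 36)  # hypothetical "node A" pitches
node_b = density_proxy(57, 40)  # hypothetical "node B" pitches

# With these placeholder pitches, node A comes out about 17% denser:
print(round(node_a / node_b, 2))  # 1.17
```

Real comparisons also weigh cell heights, SRAM bit cell area and fin depopulation, which is why two nodes with similar headline pitches can still differ in shipped density.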

It's been 6 years since STM got 28nm FDSOI running and shipped some of their products; in the meantime Sony shipped a GPS chipset ... what else has shipped, exactly?

The biggest design win was NXP and they've shipped nothing but engineering samples.

TFW AMDPOO raja is in charge of intel now

AMD doesn't do research and development on fabs because AMD doesn't fucking have any; they just check who has the denser fabs and go with them instead.

GF didn't even exist 10 years ago you fucking underage retard.

I'm thinking about switching to team red, but I was chatting to this kid the other day and he kept calling his processor an APU.

I'm not going to have to get into the marketing BS to fit in at the AMD boy meet-ups, am I? Or was this kid just being a fag?

Maybe you should stop seeing boys

Attached: 1521353331668.jpg (796x1000, 60K)

>nobody has posted the intel pasta yet

Attached: 1531336678558.png (1400x1608, 399K)

>shit that never happened

legendary and housefirepilled

Attached: yeah it did.png (614x561, 64K)

I have a huge suspicion Intel is sandbagging. It's only logical that a company as huge as they are has a lot of ongoing secret research into cutting-edge future-gen technologies. I predict Intel will be stealing AMD's thunder once again after Zen 2 is released, just as Intel did with Phenom and Bulldozer.

XX00G SKUs are APUs, as in a combined CPU and GPU.
AMD CPUs generally do not contain an iGPU, so this is a necessary distinction.

AMD is now powered by China, that's how.

Do you even know what is being measured as 7nm or 14nm?

Do you?
I'm betting you don't.

You skipped the part where AMD cancelled literally everything in development, relying on revenue from consoles and a fuckton of long term debt to dump EVERYTHING at 7nm. Looks like it paid off.

AMD doesn't have to share the designs, and these contracts are locked in. Intel can't just knock on TSMC's door and tell them to figure out how to copy AMD. Right now TSMC might have excess 7nm wafer capacity available, but that doesn't mean fucking shit.

>ARM soc is the same as x86
Braindead retard

GlobalFoundries was literally funded by AMD and IBM, with some backing from the Saudis, you fucking dense zoomer.

retardalert.jpg

I realize it will never happen, but I wish I could see a brief glimpse of the universe where Intel gives up and contracts TSMC for their next line of chips.

Because fitting in more transistors doesn't matter as much when you can't use all of them at once. Look up dark silicon and Dennard scaling.
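To unpack the dark silicon point: under classic Dennard scaling, shrinking every dimension and the supply voltage by a factor k keeps power density constant; once voltage stops scaling, power density grows as k², so you can't switch all your transistors at once. A normalized sketch (no real process numbers, just the scaling relations):

```python
# Switching power density under ideal (Dennard) vs post-Dennard scaling.
# All quantities are normalized to 1 at the starting node.

def power_density(k, voltage_scales=True):
    """Normalized C*V^2*f per unit area after scaling dimensions by k."""
    c = 1.0 / k                              # capacitance shrinks with dimensions
    v = 1.0 / k if voltage_scales else 1.0   # supply voltage may stop scaling
    f = k                                    # frequency rises as channels shorten
    area = 1.0 / k ** 2                      # transistor footprint shrinks as k^2
    return (c * v ** 2 * f) / area

print(power_density(2, voltage_scales=True))   # classic Dennard: stays at 1.0
print(power_density(2, voltage_scales=False))  # voltage stuck: grows to 4.0
```

That 4x jump in heat per area for a single full shrink is why modern chips leave large regions idle or downclocked ("dark") instead of lighting up every transistor.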

Leakage current, both static and dynamic, exists even before you consider quantum tunneling. For all intents and purposes they're treated as completely separate things.
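To make that concrete: classical subthreshold leakage follows an exponential in gate voltage, roughly I = I0 · 10^((Vgs − Vth)/S), where S is the subthreshold swing in mV/decade, and no tunneling is involved. A sketch with illustrative values (I0, Vth and S are made-up placeholders, not measured device data):

```python
# Simple subthreshold leakage model: below threshold, drain current drops
# exponentially with gate voltage. This classical effect is separate from
# gate-oxide tunneling. All numbers here are illustrative placeholders.

def subthreshold_current(vgs_mv, vth_mv, i0_na=100.0, swing_mv_per_dec=70.0):
    """Leakage current in nA for a gate voltage at or below threshold."""
    return i0_na * 10 ** ((vgs_mv - vth_mv) / swing_mv_per_dec)

# Dropping Vgs from a 350 mV threshold down to 0 cuts current by 5 decades:
print(subthreshold_current(350, 350))  # at threshold: 100 nA
print(subthreshold_current(0, 350))    # well below: ~0.001 nA
```

The problem at small nodes is that S cannot go below ~60 mV/decade at room temperature for a conventional MOSFET, so as Vth is lowered for speed, the off-state floor rises; better electrostatic control (FinFET, then GAA) is how fabs fight that.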

I think Intel would first have to abandon their monolithic designs, because the yields would otherwise be too terrible.

It's really not. Quantum tunneling is a meme; it's literally just the short-channel effect.

If you've got a graphics card, will you ever use the integrated GPU?

Attached: brainlet 4.jpg (638x1000, 92K)

Not likely. It's useful to have it there as a backup in case anything happens to your graphics card. That said, you could probably buy a used GPU for $10 to serve the same backup purpose.

Broadly speaking, no. There are use cases that would involve both but if you have to ask that sort of question they are almost certainly not relevant to you.
Beyond it serving as a fallback in case your dedicated GPU died it would just be wasted money presuming you have the option of going without.

There are some applications (not games) that make use of everything you throw at them, in which case a decent iGPU could serve as a performance boost.

Now, is that significantly more beneficial compared to the iGPU that pretty much comes standard on Intel chips? I'd imagine whatever you'd gain in latency you'd lose in thermals.

Intel massively fucked up their 10nm node.

Also, the 7nm TSMC node AMD is using is equivalent to, or only marginally better than, the 10nm node Intel would be using.

So basically 7nm ≈ 10nm, more or less.

AMD got to this size first simply because Intel are moronic.

Also, the reason to get AMD isn't that it has a better node; it's that the Zen core design is superior to the obsolete Skylake architecture Intel is currently using.

That will change when Sunny Cove comes out on 10nm, but until then AMD has the better design, and with the 3000 series they will have the better design and the better node.

AMD didn't develop 7nm process, TSMC did.