90nm carbon nanotubes 50x better than current 7nm

spectrum.ieee.org/nanoclast/semiconductors/devices/first-3d-nanotube-and-rram-ics-come-out-of-foundry

>the future is 90nm and it's government-cucked already

Attached: 80P.jpg (450x632, 112K)

Attached: e8de1fa7761b239e954776ddcbc04d8289afed59fb035513b0da052cec0812d4.png (991x917, 864K)

And it's going to cost a fortune for the next 50 years because the silicon industry is going to fight it to the death. That, and carbon nanotubes are notoriously hard to mass-produce.

you must not have read the article.
they took an old fab and retrofitted it on the cheap.

>on the cheap
Eyes eel watt yolk deed tar.

>6 gorillion cores

better how

>posting IEEE
This board does not deserve such high quality posts.
Btw, I'm doing my PhD in a lab where some guys work on nanotubes. Quite hard work; I'm thinking about moving into this field.

50x faster, while being even more efficient.

how can you be faster when the distance between transistors is 10x larger?

Attached: 1013423_164144617097928_2086563000_n.jpg (960x945, 55K)

>because the silicon industry
There is a bunch of fabs, not an industry. Never mind that the PCB and microchip industry is going nowhere for another few decades.

Fantastic prototype shit means nothing; what matters is mass-production equipment and processes. We are inundated with this kind of news across all areas of technology, and yet 99.9%+ of it goes nowhere.

>prototype
Are you retarded? Read the article, fuckwit. They're already producing it under the contract/funding of DARPA. These are literally going into the next gen of killer bots.

IQ level - below 21.

Imagine falling for the carbon nanotube meme when quantum chips are around the corner

>quantum chips are around the corner
How is it there, in 2048?

processor is still Si. it's just memory bandwidth increasing by 50x

Attached: .png (936x1239, 447K)

>memory bandwidth increasing by 50x
They've said PERFORMANCE.

and it'll still only be arm/riscshit

Attached: .png (523x294, 60K)

hello fren, you seem confused. the current year is 5779

t. brainlet

Attached: .png (932x1193, 295K)

>publicly funded research
>watch them hand it to israel

>it's just memory bandwidth increasing by 50x
>just
I wish I could increase the length and girth of my dick by just 50%...

>DARPA
>Jude
Weeeeeeeeeeeeeeeeeeeeeeew

Misleading shit thread.
This DARPA project did not create an SoC with 50X the performance of any 7nm part on the market. They are still speculating that a chip on this process, using an entirely different architecture and approach to design, "would" have 50X the performance of some random 7nm target.
They didn't create a chip that does that; they believe that if they built one with more layers of stacked logic, it would.

>multiple layers of stacked logic can produce higher throughput than a single layer of logic!
Nebulous non-statements like this should be a red flag to anyone with a three-digit IQ.

consumer devices haven't been memory-limited for the past decade. this will mean nothing for consumer-level products (desktop/laptop/smartphone CPUs) since the processor is still Si.

Wow, more military vaporware that doesn't scale and doesn't address the core problem of multilayer chips: heat dissipation.

>This DARPA project did not create an SoC with 50X the performance of any 7nm part on the market
No one said DARPA created any shit, you retard. It's MIT guys that did it, UNDER contract and funding from DARPA.

>i-its not DARPA!
>it was just a DARPA contract funded by DARPA
Ask me how I know you're a drop out retard.

OP, that's a grant-farming article. It won't be practical for real-world use outside a lab.

Lol, if they could be even close to competitive on 90nm, that's going to mean a lot of foundries retooling ASAP on smaller nodes than that, since retooling is much cheaper than building new facilities. If it pans out we'll see an explosion of new designs and performance improvements, and maybe the big players folding.

so we either won't see consumer nanotube CPUs, or they'll be pozzed pieces of shit

We will see a pozzed POS that's 50x better than 7nm.

It's almost like the medium you move through affects the speed, genius.

Electricity travels at the speed of light, you dumb gorilla.

brainlet, that's only in a perfect vacuum you subhuman retard.

Yes, it's basically clickbait-tier bullshit.
They didn't meet the project goal, but they claim they *could* if they increased the stack size by using many more layers. They also state that the cost is comparable to 7nm nodes, which is retardedly high.

Signals do not travel at the speed of light in a conductive medium; their speed is limited by the medium. The moar you know.
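Rough numbers, purely illustrative (the dielectric constant and the wire length below are assumed, not from the article): an on-chip signal propagates at roughly c divided by the square root of the surrounding dielectric's relative permittivity, so around half the vacuum speed of light in typical oxide.

```python
# Illustrative sketch: signal speed in an interconnect dielectric vs. vacuum.
# The relative permittivity and wire length are assumed values, not from the article.

C_VACUUM = 3.0e8  # speed of light in vacuum, m/s


def signal_velocity(relative_permittivity: float) -> float:
    """Approximate propagation velocity in a dielectric: v = c / sqrt(er)."""
    return C_VACUUM / relative_permittivity ** 0.5


def propagation_delay(length_m: float, relative_permittivity: float) -> float:
    """Flight time of a signal over a wire of the given length, in seconds."""
    return length_m / signal_velocity(relative_permittivity)


if __name__ == "__main__":
    er_oxide = 3.9   # typical SiO2-like dielectric (assumed)
    route = 1e-3     # a 1 mm on-chip route (assumed)
    v = signal_velocity(er_oxide)
    print(f"velocity: {v / C_VACUUM:.2f} c")                                        # ~0.51 c
    print(f"flight time over 1 mm: {propagation_delay(route, er_oxide) * 1e12:.1f} ps")
```

And that's only the flight time; on real chips, RC delay in the wiring usually dominates, which is exactly why shorter routes matter.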

>REEEEEEEEEEEEEEEEEEE, STAHP INVENTING!!!11

|
|>
|
|
|

Wow, I thought it might still be a decade or two away, but suddenly it's here, outta nowhere. Gonna take a while still until it gets adopted into consumer products, but it's here.

What isn't cucked by governments from the word go these days? For instance, say I were a billionaire who wanted to make my own CPU and CPU line. Would I get a visit from some G-men in suits telling me to give them access/backdoors "or else"? Does anyone think the government is above killing off anyone in tech who might present a threat? I don't.

>Gonna take a while still until it gets adopted into consumer products
I'd say 2024 in the WORST case, because 5nm is already out next year (in mobile) and 3nm is the absolute wall (by late 2022/early 2023).

>doesn't address the core problem of multilayer chips - heat dissipation
Yes it does. The vast majority of heat isn't generated in the logic; if more of your architecture is logic, then you generate less heat.

If you take the same amount of logic and build it as two layers instead of one, it will dissipate heat worse.
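Back-of-envelope version of that point (the wattage and die area are made-up numbers, not from the article): the same total power pushed through half the footprint doubles the heat flux the package has to pull out.

```python
# Illustrative sketch: power density when the same logic is folded into more layers.
# Total power and single-layer die area are assumed numbers, not from the article.


def power_density(total_power_w: float, single_layer_area_cm2: float, layers: int) -> float:
    """W/cm^2 through the package footprint, assuming total power stays constant
    and the footprint shrinks in proportion to the number of stacked layers."""
    footprint_cm2 = single_layer_area_cm2 / layers
    return total_power_w / footprint_cm2


if __name__ == "__main__":
    power = 50.0   # W of switching power, assumed
    area = 1.0     # cm^2 of logic laid out as a single layer, assumed
    for layers in (1, 2, 4):
        print(f"{layers} layer(s): {power_density(power, area, layers):.0f} W/cm^2")
    # 1 layer: 50 W/cm^2, 2 layers: 100 W/cm^2, 4 layers: 200 W/cm^2
```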

Unless you have a material that conducts heat much better than just silicon.

With the logic stacked vertically, the signal pathways that connect it to other SoCs are shorter. The shorter these pathways, the less heat they generate. In fact, the SoCs could be stacked as well.
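Minimal sketch of why the shorter pathways burn less power (capacitance per millimetre, supply voltage and activity factor are assumed ballpark values, not from the article): dynamic energy per transition scales with the wire capacitance, which scales with wire length, so a 10x shorter route dissipates roughly 10x less on that net.

```python
# Illustrative sketch: dynamic power of a wire vs. its length.
# Capacitance per mm, supply voltage, clock and activity factor are assumed values.

CAP_PER_MM = 0.2e-12  # F per mm of on-chip wiring, a ballpark assumption


def switching_power(length_mm: float, vdd: float, freq_hz: float, activity: float) -> float:
    """Dynamic power of one wire: P = activity * C_wire * Vdd^2 * f."""
    c_wire = CAP_PER_MM * length_mm
    return activity * c_wire * vdd ** 2 * freq_hz


if __name__ == "__main__":
    vdd, freq, alpha = 0.8, 1e9, 0.1   # assumed operating point
    for length_mm in (2.0, 0.2):       # a long planar route vs. a short vertical one
        p = switching_power(length_mm, vdd, freq, alpha)
        print(f"{length_mm} mm wire: {p * 1e6:.2f} uW")
    # the 10x shorter route burns ~10x less dynamic power
```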

Grasp at straws harder, you fucking shill.
DARPA wastes billions on dead-end projects annually, literally robbing taxpayers.

3D stacked logic isn't new. In that respect nothing new is presented here at all. The only mildly interesting aspect of this is the use of CNTs. They didn't even develop a new interconnect, electing to use standard copper vias.
Virtually all heat is generated by switching logic and SRAM cells, guy.

Silicon is a great thermal conductor.

Carbon nanotubes are the next asbestos and will be banned in first world countries within the next 10 years.

Attached: 1548219511236.jpg (1826x1795, 191K)

...what?

> imagine falling for the quantum meme