Will graphene ever make it into regular consumer products?

I’ve been seeing people hyping this shit for years and there’s still no shit to show

Attached: A5C9D881-3AA6-4480-9D8F-16E8D23002CA.jpg (750x744, 319K)

Other urls found in this thread:

youtube.com/watch?v=YpphKzmDiJM
reddit.com/r/science/search?q=graphene&restrict_sr=1
youtu.be/_oEFwyoWKXo

shitcoins die overnight thank god

>Will graphene ever make it into regular consumer products?
Yes. In 20-30 years or so.

Cost benefit analysis is clear, graphene isn’t worth the cost. Anything graphene can do great, something cheaper can do good enough.

>i have no idea what i'm talking about

>I’ve been seeing people hyping this shit for years and there’s still no shit to show
Probably because it's currently difficult to mass produce. When mass production is viable we will see more research that may eventually lead to consumer products.

"No."
Graphene is already seeing small scale adoption for some industrial and scientific purposes. It just hasn't hit the consumer market yet because it's still ludicrously expensive and researchers are trying to figure out the best way to utilize it.

>Cost benefit analysis is clear
source?

>New technology is prohibitively expensive for widespread adoption
Stop the fucking presses. Give it another 30 years senpai.

>something cheaper can do almost as good as terahertz CPUs

Attached: alu1bEp.jpg (1688x2000, 86K)

there goes openssl

Attached: 1519092054562.png (800x800, 84K)

No, you don't. Graphene is the best at everything, therefore there is nothing that actually needs it. It's the limit, not the standard. We use copper wiring, not gold.

at the end of the millennium

>integrated circuits aren't worth it, vacuum tubes are good enough and much cheaper

Attached: 1512307313602.jpg (645x729, 81K)

my x220 with a 9-cell battery pack running GNU/Linux Gentoo can run screenfetch in mere seconds. Hell, it only takes a week to compile cowsay. Graphene is ridiculous meme tech only "researchers" pretend to need. No one needs more than an underclocked core 2 duo with 4 gigs of RAM.

Attached: 1523304736606.jpg (418x410, 31K)

Imagine games running on a 2 terahertz 4-core CPU!
OMG! It will be like reality!

>We use copper wiring, not gold.
don't know about you but most computer contacts i've seen are gold plated at the very least

Gold is worse than copper for wiring, assuming you're comparing conductivity.

Looks like most of the cost is refining graphite into graphene. So they really just need to develop new processes. That's probably a billion-dollar idea. I don't know if foundries would need special equipment that'd drive up costs.

>No one needs more than an underclocked core 2 duo with 4 gigs of RAM.
So you're just an idiot, ok.

>4 gigs of RAM
that’s too much. 2 gb is more than enough for anyone’s computing needs.


also disable swap

copper's a cheaper conductor that's better than gold, but gold doesn't oxidize, so that's why you plate copper with it

>Will graphene ever make it into regular consumer products?
It will be in consumer products way before intel's 10nm process node

the wires inside the CPU are copper iirc?

It's not 1,000 gigahertz, it's one terahertz

There are already processors which work at hundreds of GHz, but they're a different kind of processor than the ones in consumer electronics. Shit doesn't mean anything.

people have different needs than ricing i3 on Arch

moar coars btfo.

The problem is moving the lattice around without damaging it.

Stop bullying Intel

Attached: laughing Dalian.webm (240x135, 63K)

>Imagine games running on a 2 terahertz 4-core CPU! OMG! It will be like reality!
>It will be like reality!
>like reality

you want boring games?

Captcha should really be reconfigured to stop retards like you. We don't use gold because it's rare. There's shitloads of carbon around.

It's one thing to say you have a 1 terahertz CPU, but if you don't have a similar GPU to match it, you can't render real-life graphics in real time.

If I'm not mistaken, it's in that solid thermal conductor thing Linus was shilling the other day.
That thing looks quite good for cooling the shit we have, but sadly it also looks great for making phones even thinner, as you can pretty much get rid of the heatpipes and solve it with glue.

You could still do a lot of neat stuff with CPU power, better physics for instance.

If you have a 1 THz CPU, you don't actually need a GPU.
Just do realtime raytracing in software.
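Back of the envelope says it's not even crazy. A minimal cycle-budget sketch in Python, where the rays-per-pixel and cycles-per-ray figures are pure assumptions for illustration:

# Does a 1 THz scalar core have the cycle budget for realtime raytracing?
clock_hz = 1e12                      # the hypothetical graphene CPU
width, height, fps = 1920, 1080, 60  # target: 1080p at 60 fps
rays_per_pixel = 4                   # assumed, incl. shadow/bounce rays
cycles_per_ray = 1000                # assumed cost to trace + shade one ray
cycles_needed = width * height * fps * rays_per_pixel * cycles_per_ray
print(f"need {cycles_needed:.1e} cycles/s, have {clock_hz:.1e}")
# need ~5.0e11, have 1.0e12 -- about half the budget, so it checks out
# under these made-up numbers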

gpu would offer a better resolution

Yep, and you could most likely use the same process to make a 300 GHz GPU with it, but I was assuming the apocalyptic "CPU ONLY" scenario.

>graphene
>expensive

I think the most expensive part must be developing the tooling and robots and stuff for mass production of chips on graphene.
It's a shitload of research that needs to be done.
We'll probably see someone pull off a retarded-fast Z80 first.

Yes, it will. A couple of weeks ago a team at MIT figured out a manufacturing process that can turn out 5 cm of decent-quality graphene per minute via CVD. While not much, this is a prototype of industrialisation.

>We use copper wiring
Intel is going to partially transition to cobalt interconnect for their 10nm cpus, and they used aluminum before copper.

youtube.com/watch?v=YpphKzmDiJM

>2 gb is more than enough for anyone’s computing needs.
I haven't had to put more than 1gb into any of my or my family's builds tho I use Linux so ymmv

Attached: phil.jpg (1320x415, 102K)

>It will be in consumer products way before intel's 10nm process node
Kind of funny you say that, since Intel is going full cobalt for their 10nm. Their 10nm is already matching or exceeding existing 7nm lines in everything but voltage

>you want boring games?
becauze errything needs its own coarz!!!

Attached: cheetos.gif (360x202, 3.52M)

If you can make chips for a 1 THz CPU, can't you develop similar chips for a GPU?

You won't get your shekels posting the same bullshit thread after thread; you're not convincing anyone.

>implying that intel ever makes a 10nm CPU

Doesn't mean shit, it could be 1000 terahertz but just running clock cycles without executing actual instructions. Don't get bamboozled.

That's actually a decent question.
The primary clock wall is the cumulative delay through chains of transistors feeding other transistors.
You need to wait for every transistor in the circuit to stabilize before sending the next clock pulse, and that "1000 GHz" figure may refer to the switching speed of a single transistor.
Of course, you can reduce this lag by adding moar pipeline stages, but you lose performance with too many of them.
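A toy model of that tradeoff in Python; the 1 ps gate delay and 40-gate path depth are assumed numbers:

gate_delay_s = 1e-12             # a "1000 GHz" transistor: ~1 ps per gate
gates_in_critical_path = 40      # assumed depth of the slowest gate chain
# The clock can only tick once the whole chain has stabilized:
f_max = 1 / (gate_delay_s * gates_in_critical_path)
print(f"unpipelined: {f_max / 1e9:.0f} GHz")   # 25 GHz, not 1000
# Pipelining shortens the chain per stage, raising the clock...
for stages in (2, 4, 8):
    f = stages / (gate_delay_s * gates_in_critical_path)
    print(f"{stages} stages: {f / 1e9:.0f} GHz")
# ...but each extra stage costs latch delay and branch-miss penalties,
# which is why this clean scaling breaks down in practice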

Silicon ain't expensive either. There's like 2 cents worth in a chip.

Who cares? Wasn't memory access the bottleneck anyway? I'm not sure speeding up chips will do much.

oh, the graphene hype hit CPUs
So in 5 years someone will recommend using machine learning, and in 5 more, blockchain

IBM did this like 5 years ago and we still have yet to see one.

>We use copper wiring, not gold
You know copper is a better conductor than gold, right? Silver is the only metal with lower resistivity.
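The handbook numbers back that up; a quick Python listing of approximate room-temperature resistivities in µΩ·cm (lower = better conductor):

# Standard ballpark resistivity values at 20 °C, in µΩ·cm:
resistivity = {"silver": 1.59, "copper": 1.68, "gold": 2.44, "aluminum": 2.65}
for metal, rho in sorted(resistivity.items(), key=lambda kv: kv[1]):
    print(f"{metal:9s} {rho}")
# silver < copper < gold < aluminum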

True. We aren't even close to silicon transistors' maximum switching frequency. CPUs are more limited by clock skew and TDP.

1000 ghz not 1 thz

why come nobody don't uses silver tracers than?

Things like OP - possibly.
Regular-ass consumer shit - no, it's got the same problems as asbestos.

Silver tarnishes, and it's pricier than copper.
Come to think of it, Intel would love processors that die in ~3 years.

Hey, what if it's a retarded fast 6502? We can use c64 computers again and they'll be able to run Java and c++ and GEOS 2020 and port TempleOS over for dual boot? Someone email Terry A. Davis, he was right all along. Wow, the future is so exciting!

Attached: terryadavis.png (400x800, 1.24M)

I wonder how much they paid this fgt to say that. Why won't he stroke out already?

>porting a true 64bit OS to an 8bit microcontroller

How tf is everyone this dumb. We still have the speed of light to overcome: in a 10 GHz CPU a signal can only travel about 3 cm per clock cycle. Imagine fetching several million bytes from RAM. That CPU will waste a fuckton of clock cycles before it can do anything.
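The numbers, for anyone who wants to check them (using vacuum light speed as the bound; real on-chip signals are slower, so it's actually worse):

c = 3e8  # m/s, speed of light -- an upper bound on signal speed
for f_hz in (10e9, 100e9, 1e12):
    print(f"{f_hz / 1e9:4.0f} GHz -> {c / f_hz * 100:.2f} cm per clock cycle")
# 10 GHz -> 3.00 cm, 100 GHz -> 0.30 cm, 1000 GHz -> 0.03 cm
# At 1 THz a round trip to RAM several cm away costs hundreds of cycles.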

uhm just integrate the ram into the cpu have you never heard of computronium???

I bet with 1.21 gigawatts processing power under the hood that thing will run like butter.
Are you kidding me? Imagine segmenting everything, like threads, various strings in a sense, try to picture it user, not a street where a single car can go down in one direction at a time, but a 16 32 or 64 lane super highway going both ways, you see where I'm getting at sport?

If it runs at 1 THz, and the 6502 takes around 20-50 cycles to emulate an x86 instruction, that would make it equivalent to an 8 GHz modern CPU.
Make it quad-core and we have something that can go up against the 2600K.
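Spelling that arithmetic out (the 50 cycles per emulated instruction and the ~2.5 IPC of a modern core are assumptions, not measurements):

thz_clock_hz = 1e12
cycles_per_x86_insn = 50     # assumed interpreter overhead on the 6502
emulated_insn_per_s = thz_clock_hz / cycles_per_x86_insn   # 2e10 insn/s
modern_ipc = 2.5             # assumed instructions/cycle for a modern core
equiv_clock_ghz = emulated_insn_per_s / (modern_ipc * 1e9)
print(f"~{equiv_clock_ghz:.0f} GHz-equivalent")   # ~8 GHz, as claimed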

RAM being slow as shit is already a given.
This is why we use caches everywhere.
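Rough demo of what the caches are hiding; the 5M-element size is arbitrary, the gap is muted in CPython but usually still visible, and it's far larger in C:

import random, time

n = 5_000_000
data = list(range(n))
orders = {"sequential": list(range(n)), "random": random.sample(range(n), n)}
for name, idx in orders.items():
    t0 = time.perf_counter()
    total = 0
    for i in idx:        # same work either way; only the access order differs
        total += data[i]
    print(f"{name}: {time.perf_counter() - t0:.2f}s")
# random order tends to run noticeably slower: cache misses, not arithmetic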

Maybe a 16-bit instruction from an 8086.
A 6502 isn't doing 128-bit AVX in 50 cycles.
It's not doing it in 5000 cycles.

It's nothing but theory. If they were to release 1 THz CPUs in the near future they would need a huge cache around the entire die, not the unnecessarily long trip to RAM that we have today

Indeed, Daniel Jackson, indeed.

LEAVE INTEL ALONE

Attached: Crocker_shit.jpg (608x400, 27K)

Wouldn't need a huge cache, the cache would just need to scale in speed with the CPU
Also HBM, also HBM made from Memephene.

It's an academic argument anyway, no one is releasing a 1 THz CPU in the next 20 years.

I'm actually releasing one tomorrow

I have a fab in my shed

Cool Beans, link me your Etsy later.

t. doesn't understand how shitcoins work

IBM already made a 1 terahertz chip.
You can't make a several-billion-transistor chip with that shit.

Buttmad poorfag

You probably don't need to.
If you can make a Pentium Pro with it, that's pretty much enough to trounce modern machines, even if it runs at a lower clock like 100 GHz.
But that's still some millions of transistors.

A 1 THz 8-core CPU with 256-bit FPUs would have several times the 32-bit FLOPS of a Titan V.
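For what it's worth, the back-of-envelope math (assuming one FP32 op per lane per cycle with no FMA, and ~14 TFLOPS FP32 for the Titan V):

clock_hz = 1e12
cores = 8
fp32_lanes = 256 // 32        # 8 single-precision lanes per 256-bit FPU
ops_per_lane_per_cycle = 1    # assumed: single-issue, no fused multiply-add
tflops = clock_hz * cores * fp32_lanes * ops_per_lane_per_cycle / 1e12
print(f"{tflops:.0f} TFLOPS")  # 64 TFLOPS vs ~14 on a Titan V; FMA doubles it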

>2009 article

>Terahertz
>probably requires rocket engines for cooling fans
>switch on computer
>it flies away

Eat coins.
Shit coins.
What else is there to it?

1000 Gigahertz processors to scan your anal pulses and evening ejaculations. Thanks for the new tech. The NSA kikes will pay you well.

Graphene already has: hobby batteries are seeing a massive longevity improvement and discharge-rate increase. You can buy them today. I use a couple on quadcopters; they're fantastic and don't drop power until there's no juice left in them

Yes.

Source: reddit.com/r/science/search?q=graphene&restrict_sr=1

1024 gigahertz is one terahertz you idiot

Oh man this is so exciting, just imagine how awful and inefficient we can make our code now and it'll still run fast enough to sell!

No I want my dragon waifu to be real so I could pound her pussy in vr

>no 10nm
>next major vulnerability, even normie media covers it
>amd objectively the better choice

Attached: 1470004910231.jpg (600x885, 36K)

youtu.be/_oEFwyoWKXo

Graphene is extremely hard to synthesize in large quantities.

frequency is not comparable to amount of memory, retard

This will be how India finally becomes a world superpower.

GiHz > GHz

Imagine being this retarded

Graphene isn't synthesized. It's made from pencils.

GaAs will be the next transistor technology for high-frequency transistors, though I don't know much about CPU transistors. Silicon carbide will replace silicon for power-transistor applications. Also, never trust what journalists say about potential "applications", especially if it comes from MIT. According to them, MIT would have us all in the 4th dimension.
There's a huge difference between graphene sheets and using graphene as a transistor. One of the big problems is getting a good crystalline form of multiple sheets.

>graphite

lol. Try running an N64 emulator.

Attached: angrylion.png (1275x775, 798K)