Moore's law will end in 2020

Moore's law will end in 2020.

How can you justify buying a laptop or desktop now when you could just wait 15 months and buy a computer that will most likely never* become obsolete?

*at least until the distant future when things like optical/quantum/DNA/graphene computing become a reality.

Attached: image.jpg (518x388, 57K)

Other urls found in this thread:

thechipcollective.com/posts/juanrga/parallel-computing/

>How can you justify buying a laptop or desktop now when you could just wait 15 months and buy a computer that will most likely never* become obsolete?
For the same reason that people buy the newest iShit every other year: because it's shiny.

>Graphene
No band gap (and no good way to induce one), so not suitable for semiconductor logic.

>DNA
Too costly, too complex, and highly susceptible to degradation.

>Quantum computing
Literal meme.

>Optical computing
So, basically what HDDs and fiber optic cables are used for?

Truth

CARBON NANOTUBES

You don't know shit about computers, you cancer newfag bitch. Moore's law won't end in 2020. Go back to your Xbox, poser fag.

Attached: 2834511.jpg (220x213, 31K)

Does this apply to building my own PC as well?

It's been that way since 2006, you idiot.

I used a Core 2 Duo OC'd to 4.5 GHz from 2008 to 2017. It ran basically everything at 60 fps. I upgraded because PUBG ran like shit, but now that it's optimized it would run at 50+ fps.

How long will my i7-6700K, GTX 1080 and 16 GB of RAM last me?

>Moore's law will end in 2020.
that's okay, history is over anyways

It depends on how spoiled you are and what you do with your life. In my case it would last me at least 5+ years, since I mostly make music and play some esports titles, but some faggots will find it unbearable to play without RTX. Maybe you're that spoiled faggot, maybe you're easily satisfied. It depends.

I don't care much for RTX shit. I just play stuff at 1080p 60 fps, and I don't play many AAA titles.

/thread

But I do think waiting for 2020 is the right choice.

Are you retarded? If so, you might actually believe OP.

We're going megazord.
Dies over dies over dies over dies, until you have this cube with copper pipes crossing it.

>>Quantum computing
>Literal meme.
Can somebody explain to a brainlet why it's a meme?

>AMD infinity fabric causes issues in all applications relevant to me
So OP is right then

If Moore's law ends, it will be the only way left.

MCM and massively parallel computing are the future.

Wishful thinking.

thechipcollective.com/posts/juanrga/parallel-computing/

>AMD infinity fabric causes issues in all applications relevant to me
No one cares about your games, faggot.

Moore's law will never end.

>Moore's law will end in 2020
time to short amd/intel?

So why exactly is thinner so much better? I get that you can fit more powerful components in the same amount of space, or the same components in less space, but CPUs are already so small, so why not just make them bigger?

Multicore is the future. Intel and Microsoft have been holding back progress, but multithread will prevail.
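Here's what that actually looks like in practice: a minimal sketch of splitting CPU-bound work across cores. The prime-counting function and chunk sizes are made up for illustration, and it uses processes rather than threads because of CPython's GIL.

from multiprocessing import Pool

def count_primes(limit):
    # deliberately naive CPU-bound work: trial division
    count = 0
    for n in range(2, limit):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    chunks = [200_000] * 8           # eight independent piles of work
    with Pool() as pool:             # one worker process per core by default
        results = pool.map(count_primes, chunks)
    print(sum(results))

On a quad-core box this should finish close to 4x faster than doing the chunks one after another, and that's the whole pitch for multicore once single-thread gains dry up.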

Look at this fag he doesn't progame AAA

IT'S NOT A FUCKING REAL LAW, STOP SAYING IT WILL END LIKE PEOPLE WILL VOTE ON IT.
thank you

So if Moore's law ends in 2020, then my 2011 HP laptop won't be too obsolete?

>why buy something you need now when you can buy it later
This is an 18+ site.

I wish I had the cooling to get my old Core 2 box overclocked. Would be nice to enjoy some gaming with my friends with that box.

>tfw you're a NEET but you're still social

5nm will take AT LEAST a decade to hit the consumer market at sub-$10,000 prices. 7nm is hard and slow to manufacture. 5nm will be harder. Anything smaller than 5nm will give you quantum tunneling that will make your CPU either malfunction or be designed to correct for noise at a level that will greatly slow it down. So 5nm is the practical limit.
In 2030 you will see a 5nm 16-core, 32-thread 5.5 GHz AMD CPU going for $400. It will have a 100 W TDP.

Attached: 1531028464948.png (800x894, 41K)
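For anyone wondering why tunneling blows up at those sizes, the standard WKB barrier-transmission approximation gives the gist. This is only a back-of-the-envelope sketch; the barrier height and mass below are illustrative assumptions, not real process figures.

T \approx e^{-2\kappa d}, \qquad \kappa = \frac{\sqrt{2m\phi}}{\hbar}

With m the free-electron mass and a barrier of roughly 3 eV, \kappa comes out to about 9 per nanometer, so shaving one nanometer off the barrier thickness d multiplies the leakage T by something like e^{18}, i.e. tens of millions. That exponential dependence is why shrinking past a few nanometers turns into a leakage and error-correction problem instead of a free speed boost.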

>most likely never* become obsolete
but user, they already "fixed" that problem by using unleaded solder.
Your components degrade every time they're heat-cycled.

Attached: Nokia7610.jpg (1536x2048, 424K)

>games
Audio production software, frend. Haven't done gaymen in years and years.

Because you can't do shit with it other than cracking crypto keys.

Tell me about that user, looks interesting.

This is what will probably happen: chips will just get bigger and bigger.

Seriously? If you could just make a 2-"layer" processor that is twice the size and twice the performance, with of course twice the material cost, why aren't they already on the market? Why isn't Linus Tech Shills showing off his 64-layer CPU?

I realized this would happen, so I bought new hardware 6 months ago, so I can enjoy the last years where I can still have an edge with my PC.

Patterson says it's over already

>t. brainlet

That's it bros, pack it up, computing ends in 2020. We've reached the pinnacle of technology and can't improve any further.

Attached: 1488681628222.jpg (1048x960, 233K)

3 months. Upgrade your processor

I have an i7-3770

Still no need for a new CPU. At this pace I'll be looking for a new one by 2022.

Why would it have ended?
It's the best thing possible for competent programmers. Now we will have to improve the efficiency of our programs to get better performance, and not every Pajeet is ready to do that.
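As a toy example of what "optimize the software instead" buys you, here's a minimal sketch with made-up data sizes: the same membership lookups done against a list and against a set.

import time

data = list(range(10_000))
targets = list(range(0, 20_000, 4))

# naive: O(n) scan per lookup, O(n*m) overall
start = time.perf_counter()
hits_slow = sum(1 for t in targets if t in data)
slow = time.perf_counter() - start

# better: build a set once, O(1) average per lookup
start = time.perf_counter()
lookup = set(data)
hits_fast = sum(1 for t in targets if t in lookup)
fast = time.perf_counter() - start

print(hits_slow == hits_fast, f"roughly {slow / fast:.0f}x faster")

Same answer, orders of magnitude less work, no new silicon involved. When the hardware stops doubling for free, this kind of thing stops being optional.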

AMD still has their 7nm Zens up their sleeve. After that, only God knows what will happen.

That entirely depends on what you use your PC for. If you don't play vidya or do A/V work, you've probably hit your own personal singularity and don't need to upgrade until something physically breaks. And because material quality has improved significantly since the '80s, your tech might well outlive you.

>5nm will take AT LEAST a decade to hit the consumer market at sub-$10,000 prices
Maybe for Intel

What a great time to be alive yaa

Attached: 1536343774787.png (900x506, 473K)

I have been using my i5-2500K since 2011, and I think it's lasted me longer than any other desktop, which is crazy to think about in terms of it still being satisfactory to my needs.

I still use my 3570K, though granted, I only game non-AAA strategy games, code very light applications, and the most intensive thing I do with my computer is editing RAW digital photographs. I really don't see any benefit from a faster CPU.

I could probably overclock it a bit for fun. If anything, I could use moving from 8 to 16 gigs; some things don't like 8 anymore.

The regression is real. My brother got a laptop in 2011 with 8 GB of RAM. In 2018 even high-end laptops still come with 8 GB of RAM. Not to mention cheaper laptops have given up on hard drives and give you a shitty 32 GB SSD instead. With Intel reverting to 22nm we can expect hotter, slower computers bogged down with Spydows 10.

MCM CPUs are already a thing. MCM GPUs are likely less than 3 years away. This isn't science fiction. Monolithic dies are the only thing that will go the way of the dinosaur in the performance segment within the next few years.

>I have been using my i5-2500K since 2011 and I think it's lasted me longer than any other desktop

Literally the same. I have the exact same chip at 3.7 GHz, 8 GB of RAM, a GTX 960, and everything works fucking amazing.

Glad I got myself an i7-2600. I installed a modular water cooler when I built the PC. About two or three years ago the CPU was temp-throttling very quickly and I thought it was about to go belly up at any moment. I thought maybe the lead or whatever is between the metal case of the CPU and the die had melted away, so it couldn't properly transfer the heat to the cooler anymore. When it started to hit 95°C at idle it finally gave up and halted right after or during boot.


I wanted to check if I could reuse my GPU and other PCI stuff. As I removed the CPU cooler to get the tubes out of the way, one tube broke. Turns out there wasn't a single drop of liquid left in the water cooling system. After replacing the CPU cooler with a $30 no-name cooler it's still working, and still rocking the OC profile I made years ago.

The SSD I bought with the CPU is also still working. However, S.M.A.R.T. reports some funny values, like a power-on-hours count of almost 90k (which is more than 10 years, for an SSD first sold in 2011).
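Quick sanity check on why that counter has to be bogus (assuming the drive really was bought new in 2011 and the thread is from around 2018, which is an assumption):

hours_reported = 90_000
hours_per_year = 24 * 365
print(hours_reported / hours_per_year)   # about 10.3 years of claimed power-on time
print((2018 - 2011) * hours_per_year)    # 61,320 hours even if it never powered off

So even running 24/7 since launch it couldn't have racked up 90k hours; that SMART attribute is just reporting garbage.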

4th year physics PhD working on optical computing. AMA

>Moore's law will end in 2020.
Fucking Trump. Let's make a petition to reinstate it!

Why would you want lead on your motherboard?

>Just wait™ for the last build you'll ever need
>Until the next new thing comes out

Moore's law ended over 10 years ago, nigger.

>Moore
who?

Why did you decide to waste your life?

Nah, not the same guy but he's right. There's no reason whatsoever for quantum computers to reach the general market.

I think it's time for all of us to start thinking deeply about the future. Let's drop the memes pushed by clickbait articles.

I want a fucking Threadripper workstation so bad!

Moore's law could end tomorrow, but Intel and AMD would still be the sole providers of x86.
Stock values are not somehow pegged to meme laws.

Not true. There are plans for 3nm using GAA transistors instead of FinFET.

>muh Moore
CPUs aren't doubling in transistor count because no one knows where to put the transistors other than more cores. It's incredibly hard to aptly spend transistors to get more performance from where we are right now.
GPUs, however, are still scaling nicely; so long as foundries keep churning out nodes, GPUs will keep piling on billions of transistors.

On the CPU front we could see companies abandon area scaling altogether, and we'd still see performance increases and power decreases. Making smaller gates isn't the only way to lower costs and add performance.

why not just make the cores bigger?

I expect they will start doing this eventually for the sake of progress.

Why didn't they already start doing it 20 years ago?
CPUs aren't fucking iPhones; who gives a shit if it's physically thinner? This isn't a rhetorical question, I'm begging for an answer here.

Think about what you're asking.
Where are you going to spend those transistors, user? Adding cache will bring negative returns past a point. Adding more execution units does nothing if you can't feed them. Designing a front end that can actually feed more instructions per clock is ungodly hard; the front end is the most critical area of core arch design.
There is no "just make it bigger".

I don't get it, what will happen in 15 months?

So then the halting of progress has nothing to do with reaching a thinness barrier, and everything to do with diminishing returns on transistors? If so, why does Jow Forums constantly bring up how max thinness is going to be reached when "quantum tunneling" becomes a problem?

lmao if all your computing isn't done in the cloud

Because people have no idea what they're talking about and just regurgitate popsci headlines.
Foundries aren't having any issues scaling down to 5nm and 3nm class nodes, Intel aside. Different gate topology and different BEOL metals are used to ensure devices function at any given feature size. That's been true of every single process node, and it's no different at 5nm or below.

>not knowing the difference between a law and a prediction

Nice try, Tzeentch

Well, that's a bit depressing. Is RISC-V a highly parallel ISA, or will something like EPIC have to be reworked, or a new design take its place? Do x86 makers space out their transistors more for higher voltage and faster clocks? With branch prediction being gimped, is there any hope of salvaging serial ISAs?

22nm is for their mobo chipsets. Better to offload that production burden onto something that can take the hit rather than end up with fewer actual CPUs.

>For the same reason that people buy the newest iShit every other year: because it's shiny.
Why doesn't Xiaomi slap gold and diamonds onto their Pocophones and sell them at a premium price?

I'm glad the singularity (probably) won't happen soon. I still believe that it could eventually happen.
If things start to slow down then developers will start trying to optimize what they've got.
Less chance of fucking up and creating a cyberpunk dystopia.