If you buy a PC in 2019 then you're an idiot

According to most experts, Moore's law will end either in 2020 at 7 nm or in 2022 at 5 nm. If 7 nm proves to be the limit of Moore's law, then a computer purchased in 2020 will never become obsolete. Likewise, if 5 nm is the limit, then a computer from 2022 will never become obsolete.
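
The arithmetic behind that claim is just inverse-square scaling. A minimal Python sketch, assuming density tracks the node name, which real "nm" marketing labels only loosely do:
```python
# Illustrative only: idealized transistor-density gain from a node shrink,
# assuming density scales with the inverse square of the feature size.
def density_gain(old_nm: float, new_nm: float) -> float:
    """Ideal density multiplier for a shrink from old_nm to new_nm."""
    return (old_nm / new_nm) ** 2

print(density_gain(14, 7))  # 4.0x, the kind of jump Moore's law promised
print(density_gain(7, 5))   # ~1.96x, the last doubling if 5 nm is the end
```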

Buying a computer today is like buying a Pentium D machine in 2006 just before the Core 2 was released. Or buying a Celeron 300 just before the 300A was released.

Why not just wait another year and buy a computer in 2020? Or even better, why not hold off buying anything until 2022 in order to be 100% certain that your expensive computer will always remain modern and up to date?

Attached: 000192-00.png (550x348, 50K)

because AM4 will be supported for a while still

I think the main reason that I haven't bought a computer since 2015 is that I just don't need more processing power. It's fast enough for what I do. For me the real progress I'd like to see is with storage.

>a PC from 2020 will never be obsolete
Yes because no one will try new architectures, and memory and cache will just freeze.

You’re wrong and I'm not even going to bother telling you why

I once had a 286 with a turbo button and a lock key in the 80s. The turbo button changed game speed like switching between Game A and Game B on a Nintendo Game & Watch. I also had a Pentium II 266 and overclocked it to 400 on the first day I got it, back in the 90s. Those were the days.

>when moores law no longer holds true, technology will no longer progress
What the fuck is this thread

Yeah, and the Earth is still the center of the universe and gravity is the end all be all for physical interactions.

5nm is already on track
then they'll just vertically stack cores; by 2030 people will laugh at 64-core computers

Do you seriously think Moore's law is some sort of natural phenomenon? Do you seriously think that slowed progress is no progress?
God damn it, user. Delete your post before more people see this embarrassing shit.

> Wait 3+ years to buy a computer with more processing power/storage/... than you need
> Computers in 3 years will never become obsolete because that's what happens once Moore's conjecture ends
Are you on some srs drugs?

holy shit I have been waiting since 2009 though

Attached: 1554976607827.jpg (256x256, 12K)

finally programs will start to use multiple cores if they want to do more

If you actually believe that after Moore's law there will be no more improvements then you need to go back immediately. And I don't mean plebbit, I mean the clinic for mentally retarded people you escaped from.

>a computer purchased in 2020 will never become obsolete
What is quantum computing

You waiting for SSDs to come down too?

Something that will never be consumer ready, due to the simple fact that people won't install a vacuum chamber and cooling system to keep their PC at near 0 K, and it's also not valid for all applications
Any other retarded questions I can help you with?

That's OK, they'll quickly make something that defies conventional understanding of physics and engineering, just so the 500MB - 1GB JavaScript calculator apps can run, they HAVE TO.

>they'll quickly make something that defies conventional understanding of physics and engineering, just so the 500MB - 1GB JavaScript calculator apps can run

Attached: you asked for this.jpg (567x378, 324K)

>a computer purchased in 2020 will never become obsolete.
holy fuck how retarded are you, OP? a limit in node size won't limit performance, but packing more power into the same node will cause heating issues
and even if 5nm is the limit for silicon dies we will still be able to bypass it with graphene until we hit 1nm

Likewise

Yeah. Cheaper and bigger. I think it's just demand that's stopping that from happening; so much is "in the cloud" and everyone is streaming media. I'm a bit of an outlier, still storing movies and music.

If you want to wait until SSDs are on par with HDDs you should just stop already. SSDs won't be able to compete with HDDs on price per terabyte for a looooong time.
And you really shouldn't use an SSD for movie or music storage anyway.
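
The gap is just price per terabyte. Toy numbers (2019-ish street prices, purely illustrative, check current listings):
```python
# 2019-ish street prices, purely illustrative.
hdd_per_tb = 160 / 8.0   # roughly $160 for an 8TB HDD
ssd_per_tb = 110 / 1.0   # roughly $110 for a 1TB SATA SSD
print(f"HDD ${hdd_per_tb:.0f}/TB vs SSD ${ssd_per_tb:.0f}/TB, "
      f"a {ssd_per_tb / hdd_per_tb:.1f}x gap")
```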

I have about 12TB of movies and 8TB of music; hard to even manage it with HDDs. At some point I need to buy a NAS, but that's expensive.

People said the same thing back in the 1970s when computers took a whole room and ran off of the output of 2 power plants. You now have a computer a thousand times more powerful, running on a thin battery that fits in your pocket. Tech evolves, new discoveries are made, and the stupid assumption that the fundamental technology is bounded by its primitive form is why you're an idiot. Advancements will be made. A way will be found to miniaturize the tech and keep it under vacuum and 0 kelvin.

>hard to even manage it with HDDs
That's when you either buy an old server (like me) or buy a cheap office PC, put a SAS/SATA controller in it, and hook up a ton of 4TB/8TB drives.
For that much data HDDs are the only way right now. And trust me, don't get a NAS; they almost all suck ass.
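
If you go that route the capacity math is trivial. A rough sketch; the drive counts, sizes and two-drive-parity layout (RAID-6/RAIDZ2 style) are just example assumptions:
```python
# Back-of-the-envelope capacity for a DIY disk box (example numbers only).
def usable_tb(drives: int, drive_tb: float, parity: int = 2) -> float:
    """Usable space with `parity` drives of redundancy (RAID-6/RAIDZ2 style)."""
    return (drives - parity) * drive_tb

# The ~20TB collection above fits on eight 4TB drives with room to grow:
print(usable_tb(8, 4.0))  # 24.0 TB usable; any two drives can fail
```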

5-year moratorium on hardware development so software devs are forced to actually put some effort into efficiency when?

>We will just invent™ a way to get around the very laws of quantum mechanics, fucking physics, thermodynamics and how superconductors work
The only retard I am seeing here is you. But surely, if we managed the step from vacuum tubes to semiconductors, we'll somehow bend the universe to the will of some retard on a Mongolian cave drawing forum, because he said we'll just do it anyways.

Attached: 1554938255155.jpg (700x700, 209K)

Why not hold off to 2040 and buy a quantum computer?

Imbecile. People can use a 3770K for another 10 years. I don't care what trash CPUs come out.

>expensive
the fuck are you buying
gaymer shit?

Quantum computers are only good at particular computing problems. They aren't necessarily faster/better.

Woah, look at Mr. PhD in physics over there. Who the fuck do you think you are to say what we can and cannot accomplish?

>We will just invent
No, the key word is discover. Get your retarded ass out of here. You think we've discovered everything there is to know about the universe? People said the same thing until we started using printed circuit boards to make circuits smaller. Then we came up with the first transistor. Then we came up with integrated circuits. Then we perfected that. We keep making breakthroughs year after year that allow us to make things smaller, faster and more efficient. All it takes is the discovery of new materials, new processes and new science. There is so much that we don't know, and making claims like yours is arrogance at its finest. You think you have the answer to everything, but you don't. All you need is a breakthrough that eliminates the need for supercooling.

nextbigfuture.com/2017/11/key-component-for-quantum-computers-miniaturized-by-1000-times.html
2 years ago you would have said such a thing was impossible. Yet here they are, proving you wrong every single day.

But they're *new* and *exciting*, unlike your old silicon crap. We should all wait for that and stick to our Athlon APUs.

embarrassing

my sides
my fuckin sides
no

It's been dead. They just keep tip-toeing around it with massively parallel supercomputers that do basically nothing for 99.99999999% of the world's computing.

Next gen computing will be functional programming on massively parallel systems
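
Something like this, presumably; a toy sketch of the pure-function-mapped-over-cores style that claim points at (the work function is a made-up stand-in):
```python
# Toy sketch: a pure (side-effect-free) function mapped across all cores.
from multiprocessing import Pool

def work(n: int) -> int:
    """Made-up stand-in for real work; no shared state, so it maps cleanly."""
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool() as pool:  # defaults to one worker process per CPU core
        print(pool.map(work, range(100_000, 100_008)))
```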

>Woah, look at Mr. PhD in physics over there
>Hurr physics hard makes my feefees feel hurt durr
>All you need is a breakthrough that eliminates the need for supercooling.
>All you need is to find a way for thermodynamics to not apply anymore hurr durr it's easy
And where did I say or even just IMPLY that miniaturisation is impossible? Oh yeah, that's right, I didn't; you just straw-man because you have neither a fucking clue what you are talking about nor any arguments.
All that you wrote is so severely handicapped that I want to know where you live so I can send you money, because I am sure I can get that off my tax as a charitable donation.
Especially "All it takes is the discovery of new materials, new processes and new science" actually made me laugh out loud for the first time in probably years on this shit website, simply because of how fucking retarded that sentence was. Just stop; everyone already knew you have no fucking clue what you are talking about, but you just keep going.

Attached: 1550895921362.jpg (490x515, 44K)

>Never Obsolete

Attached: never.jpg (3024x4032, 942K)

>And where did I say or even just IMPLY that miniaturisation is impossible?

Let me break it down for you:
I claim miniaturization is possible
You say sarcastically and I quote:
>We will just invent™ a way to get around the very laws of quantum mechanics, fucking physics, thermodynamics and how superconductors work
As if we'd need to break the laws of physics to be able to miniaturize quantum computers, and therefore it's impossible. So yes, you did IMPLY that miniaturization is impossible, and called me a moron for claiming otherwise.

Guess what nigger, they are working on designs that don't even require superconductors
fastcompany.com/90242006/old-school-silicon-could-bring-quantum-computers-to-the-masses

>nor any arguments.
Oh the irony; this is coming from the guy who replies to my post by calling me a retard and handicapped while providing no argument at all.

>All you need is to find a way for thermodynamics to not apply anymore hurr durr it's easy
WRONG, dipshit. All you need is to find a way to design the computer without the need for supercooling. The link I provided shows a design that can operate at higher temperatures and does away with superconductors altogether.


I don't know what makes a faggot like you so sour at the idea of having a quantum computer that fits on a desk in the future. Are you afraid you'll get vanned for the kiddie porn on your hard drive? Lmao, stay mad.

Wasn't Moore's law supposed to die out a few years ago?
Also, who cares about some old cunt's observation on trends at that time?

>Don't spend your money, reeee
>When the economics teacher is a commie

Why do they suck? My dad has one and it seems to wig out sometimes, and it was also very expensive...

Yes goy, don't buy a CPU now. Wait for genuine Intel ™ processors.

They tend to fail just after the warranty is up, usually use impossible-to-replace proprietary parts (so once it breaks you go buy another $300 shitbox that can only hold 4 drives and nothing larger than 4TB), usually have shit thermal performance, and so on. Consumer NAS boxes are probably the biggest meme there is. Never bought one myself because they seemed way too overpriced; family and friends did, and all regretted it.

aapl_128 will exist

>according to (((experts)))
Fuck off with this nonsense. There is an engineering problem, and engineers will work nonstop to solve it. 5nm EUV FinFET is bringing a solid transistor density increase and die area reduction over 7nm EUV FinFET. After this the industry will transition away from FinFETs for high performance processes, and they will adopt GAA topology. GAAs will bring one of the largest generational power reductions we've ever seen. 3nm GAA processes will likely again double transistor density over 5nm EUV FinFET while having a power reduction of 50% or greater depending on VT.
If you want to opine about muh Moore's Law ending, then wait till the mid-2020s. Silicon has a lot of life left in it. No one is halting R&D for high performance processes. All these issues the industry faces will be solved with adequate time and money, and they're putting tens of billions per year into it.
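
Compound the multipliers that roadmap claims and the headroom is obvious. A quick sketch; the 7nm-to-5nm factors are rough assumptions, the 3nm ones are the figures asserted above, none of it measured data:
```python
# Compounding claimed per-node multipliers (assumptions, not measured data).
nodes = [
    # (transition, density multiplier, power multiplier)
    ("7nm FinFET -> 5nm FinFET", 1.8, 0.7),  # assumed "solid" density gain
    ("5nm FinFET -> 3nm GAA",    2.0, 0.5),  # "likely again double", "50% or greater"
]

density, power = 1.0, 1.0
for name, d, p in nodes:
    density *= d
    power *= p
    print(f"{name}: cumulative density {density:.1f}x, power {power:.2f}x")
```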

It would still be usable if websites didn’t use megabytes of JavaScript and browsers that don’t require SSE2 were still common.

>when a code monkey pretends to know about quantum physics and what will or won't be possible in the coming decades
Cringe

>le code monkey cringe btfo'd xDDDDDDDD
Fuck off you glow in the dark consumer.
Quantum computers have very limited applications and that won't change anytime soon. And again I know you niggers have problems with simple maths already but thermodynamics won't change no matter how much you niggers ignore it.

Attached: 9c8.jpg (499x499, 130K)

>Vertically stacked cores

Enjoy your bottom cores overheating like fuck.

>Quantum computers have very limited applications
Just like conventional computers when they started...

Thanks. Since I'm a techlet sometimes I wonder if my current system of USB drives really is the best for me.

>heat descends

>not understanding insulation
It's a real problem; silicon is not a perfect thermal conductor. HBM stacks have the same issue: clocks are limited by the temp of the bottom DRAM slice.
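
You can see it with a crude one-dimensional stack model; every power and thermal-resistance number below is made up for illustration:
```python
# Crude 1-D model of a die stack with the heatsink on top (made-up numbers).
POWERS = [15.0, 15.0, 15.0, 15.0]  # watts per die, bottom (index 0) to top
R_LAYER = 0.5                      # K/W across each die/interface layer
T_SINK = 45.0                      # heatsink temperature, Celsius

for k in range(len(POWERS)):
    # Heat from die k crosses every layer above it, and each of those layers
    # also carries the heat of all the dies underneath it.
    rise = sum(R_LAYER * sum(POWERS[: j + 1]) for j in range(k, len(POWERS)))
    print(f"die {k}: {T_SINK + rise:.0f} C")  # bottom die comes out hottest
```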

>LoL
GAA is still planned for 2024, and then 3D chips and more things.

"No more compute power" is a reddit meme
staticwww.asml.com/doclib/investor/asml_3_Investor_Day-Many_ways_to_shrink_MvdBrink1.pdf

Are you fucking dumb? There is still room for improvement with 3D stacking.

>silicon
Get with the times, gramps; graphene is (for once) unironically the future.

>Why not just wait another year and buy a computer in 2020?
1) I'm not poor.
2) My time isn't worthless.

If Moore's law taps out in 2020, how are we going to reach the level of simulation we're all in right now?

>regurgitating meme shit you have no clue about
The entire industry, all its infrastructure, all logistical supply chains, are all centered around silicon substrates. This is not changing any time soon. Silicon will not be abandoned as a substrate just because you read some popsci trash clickbait article, reddit.

>Getting this ass blasted over literally nothing
Did I say it will instantly replace silicon? No, I didn't even imply it. The simple truth is that graphene is currently our best hope once silicon reaches its final limit, aka anything sub-5nm as it currently seems. But I am sure you have extensive knowledge about CPU architectures, lithography and actual quantum mechanics, sure thing kiddo.

when am I getting a Joi?

user, just stop, you're embarrassing yourself.

>I'm talking out of my ass, you must be just as stupid as me!
What a great argument. Have fun with that outlook.

>Yes because no one will try new architectures
I remember when they tried these new architectures
>Itanium, Pentium 4, Bulldozer
good times

holy shit this guy is retarded

Attached: 1491096820035.jpg (498x500, 46K)

I agree that the difference in performance is not much, but sometimes hardware fails.
I've used one computer from 2008 and replaced it this year; I spent only the equivalent of $400 on it. I haven't bought a new GPU though, but I will when I save some money.
Also I'm not stupid enough to buy Intel; for compiling, AMD is better, and if I ever want to play a game, ~5 fps doesn't make a difference.

We will start to go up.
Chip on chip on chip on chip on chip, until it turns into something that sounds exactly like a dick measuring contest.

Nice try, intel, but I'm going to buy Zen 2 when it comes out in a few months.

I'm using a 10 year old X58 motherboard, still not obsolete.

youtube.com/watch?v=VMbUXKsMKKA

Attached: X58_will_never_die.jpg (485x485, 45K)

there is a point at which average user programs can no longer make use of increased computational power, you know?

for example, in what way can Microsoft Office or LibreOffice improve? if anything it would be marginal

same with all software out there; there is increasingly little room for improvement

They will just compensate for it by using shittier programming languages and meme AI stuff.