According to most experts, Moore's law will be ending in either 2020 at 7 nm or 2022 at 5 nm. If 7 nm proves to be the limit of Moore's law, then a computer purchased in 2020 will never become obsolete. Likewise, if 5 nm is the limit then a computer from 2022 will never become obsolete.
Buying a computer today is like buying a computer with a Pentium D in 2006 just before the Core 2 was released. Or buying a Celeron 300 just before the 300A was released.
Why not just wait another year and buy a computer in 2020? Or even better, why not hold off buying anything until 2022 in order to be 100% certain that your expensive computer will always remain modern and up to date?
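OP's premise is just doubling-cadence arithmetic, so it can at least be put in numbers. A quick back-of-the-envelope sketch in Python (the base year, transistor count, and the ~0.7x-per-node shrink are illustrative assumptions for the sake of the thread, not roadmap data):

```python
# Back-of-the-envelope Moore's law arithmetic (illustrative numbers only).
# Classic cadence: transistor count doubles roughly every 2 years, which
# corresponds to a ~0.7x linear shrink per full process node.

def project(base_year, base_count, year, period=2.0):
    """Transistor count projected from base_year, doubling every `period` years."""
    return base_count * 2 ** ((year - base_year) / period)

# e.g. start from a nominal 2016 chip with 2e9 transistors
for year in (2018, 2020, 2022):
    print(year, f"{project(2016, 2e9, year):.2e}")

# Node shrink: each full node is ~0.7x the previous linear dimension,
# so 14 nm -> 10 nm -> 7 nm -> 5 nm is just repeated multiplication by ~0.7.
node = 14.0
for _ in range(3):
    node *= 0.7
    print(f"{node:.1f} nm")
```

Whether the cadence actually holds past 7 nm or 5 nm is exactly what the thread is arguing about; the arithmetic itself is the easy part.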
I think the main reason that I haven't bought a computer since 2015 is that I just don't need more processing power. It's fast enough for what I do. For me the real progress I'd like to see is with storage.
Landon Ward
>a PC from 2020 will never be obsolete
Yes, because no one will try new architectures, and memory and cache will just freeze.
Levi Miller
You’re wrong and I'm not even going to bother telling you why
Henry Miller
I once had a 286 with a turbo button and a lock key in the '80s. Made game speed work like switching a Nintendo Game and Watch from Game A to Game B. Also had a Pentium II 266 and overclocked it to 400 on the first day I got it in the '90s. Those were the days.
Juan Baker
>when moores law no longer holds true, technology will no longer progress
What the fuck is this thread
Aaron Sanders
Yeah, and the Earth is still the center of the universe and gravity is the end all be all for physical interactions.
Xavier Parker
5nm is already on track, and then they'll just vertically stack cores. By 2030 people will laugh at 64-core computers.
Dominic Rivera
Do you seriously think moore's law is some sort of natural phenomenon? Do you seriously think that slowed progress is no progress? God damn it, user. Delete your post before more people see this embarrassing shit.
Levi Collins
>Wait 3+ years to buy a computer with more processing power/storage/... than you need
>Computers in 3 years will never become obsolete because that's what happens once moore's conjecture ends
Are you on some srs drugs?
finally programs will start to use multi-core if they want to do more
Brayden Hughes
If you actually believe that after Moore's law there will be no more improvements then you need to go back immediately. And I don't mean plebbit, I mean the clinic for mentally retarded people you escaped from.
Charles Walker
>a computer purchased in 2020 will never become obsolete
What is quantum computing
Jonathan Edwards
You waiting for ssds to come down too?
Daniel Butler
Something that will never be consumer ready due to the simple fact that people won't install a vacuum chamber and cooling system to keep their PC at near 0K and also not valid for all applications Any other retarded questions I can help you with?
Connor Lopez
That's OK, they'll quickly make something that defies conventional understanding of physics and engineering, just so the 500MB - 1GB JavaScript calculator apps can run, they HAVE TO.
Sebastian Phillips
>they'll quickly make something that defies conventional understanding of physics and engineering, just so the 500MB - 1GB JavaScript calculator apps can run
>a computer purchased in 2020 will never become obsolete.
holy fuck how retarded are you op? a limit in node size won't limit performance. however, packing more power into the same node will cause heating issues, and even if 5nm is the limit for silicon dies we will still be able to bypass it with graphene until we hit 1nm
Jason Bell
Likewise
Jordan Davis
Yeah. Cheaper and bigger. I think it's just demand that's stopping that from happening; so much is "in the cloud" and everyone is streaming media. I'm a bit of an outlier, still storing movies and music.
Wyatt Gray
If you want to wait until SSDs are on par with HDDs you should just stop already. SSDs won't be able to compete with HDDs on price per terabyte for a looooong time. And you really shouldn't use an SSD for movie or music storage.
Zachary Reyes
I have about 12TB of movies and 8TB of music, hard to even manage it with HDDs. At some point I need to buy a NAS but that's expensive.
Sebastian Nelson
People said the same thing back in the 1970s when computers took a whole room and ran off the output of 2 power plants. You now have a computer a thousand times more powerful, running on a thin battery that fits in your pocket. Tech evolves, new discoveries are made, and the stupid assumption that the fundamental technology is bounded by its primitive form is why you're an idiot. Advancements will be made. A way will be found to miniaturize the tech and keep it under vacuum and near 0 kelvin.
Benjamin Mitchell
>hard to even manage it with HDDs
That's when you either buy an old server (like me) or buy a cheap office PC, put a SAS/SATA controller in it, and hook up a ton of 4TB/8TB drives. For that much data HDDs are the only way right now. And trust me, don't get a NAS, they almost all suck ass.
Landon Cook
5-year moratorium on hardware development so software devs are forced to actually put some effort into efficiency when?
Sebastian Martin
>We will just invent™ a way to get around the very laws of quantum mechanics, fucking physics, thermodynamics and how superconductors work
The only retard I am seeing here is you. But surely, if we managed the step from vacuum tubes to semiconductors, we'll somehow bend the universe to the will of some retard on a Mongolian cave drawing forum because he said we'll just do it anyways.
Why not hold off to 2040 and buy a quantum computer?
Imbecile. People can use a 3770K for 10 more years from now. I don't care what trash CPUs come out.
Jaxon Hall
>expensive
the fuck are you buying, gaymer shit?
Hudson Rodriguez
Quantum computers are only good at particular computing problems. They aren't necessarily faster/better.
Gabriel Sanders
Woah look at mr Ph.d in physics over there. Who the fuck do you think you are to say what we can and cannot accomplish?
>We will just invent
No, the key word is discover. Get your retarded ass out of here. You think we've discovered everything there is to know about the universe? People said the same thing, until we started using printed circuit boards to make circuits smaller. Then we came up with the first transistor. Then we came up with integrated circuits. Then we perfected that. We keep making breakthroughs year after year that allow us to make things smaller, faster and more efficient. All it takes is the discovery of new materials, new processes and new science. There is so much that we don't know, and making claims like yours is arrogance at its finest. You think you have the answer to everything, but you don't. All you need is a breakthrough that eliminates the need for supercooling.
But they're *new* and *exciting*, unlike your old silicon crap. We should all wait for that and stick to our athlon APUs.
James Diaz
embarrassing
Joshua Bailey
my sides my fuckin sides no
Justin Morris
It's been dead. They just keep tiptoeing around it with massively parallel supercomputers that do basically nothing for 99.99999999% of the world's computing.
Jeremiah Sanders
Next gen computing will be functional programming on massively parallel systems
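For what that looks like in practice: a minimal sketch of the functional-parallel idea using Python's standard library process pool. The point is that a pure (side-effect-free) function mapped over independent inputs can be spread across cores freely; the Collatz workload here is just a made-up example.

```python
# Functional-style parallelism: map a pure function over a dataset.
# Because each call is independent and side-effect-free, the runtime
# can distribute the calls across cores without coordination.
from concurrent.futures import ProcessPoolExecutor

def collatz_steps(n):
    """Pure function: number of Collatz steps needed to reach 1."""
    steps = 0
    while n != 1:
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        # Behaves like the built-in map(), but runs across worker processes.
        results = list(pool.map(collatz_steps, range(1, 10)))
    print(results)  # same answer as serial map, order preserved
```

Whether "next gen" software actually gets written this way is the open question; the primitives have existed for a while.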
David Morris
>Woah look at mr Ph.d in physics over there
>Hurr physics hard makes my feefees feel hurt durr
>All you need is a breakthrough that eliminates the need for super cooling.
>All you need is to find a way for thermodynamics to not apply anymore hurr durr it's easy
And where did I say or even just IMPLY that miniaturisation is impossible? Oh yeah, that's right, I didn't; you just straw man cause you have neither a fucking clue what you are talking about nor any arguments. All that you wrote is so severely handicapped I want to know where you live so I can send you money, cause I am sure I can write that off my taxes as a charitable donation. Especially "All it takes is the discovery of new materials, new processes and new science" actually made me laugh out loud for the first time in probably years on this shit website, simply because of how fucking retarded that sentence was. Just stop, everyone already knew you have no fucking clue what you are talking about, but you just keep going.
>And where did I say or even just IMPLY that miniaturisation is impossible?
Let me break it down for you: I claim miniaturization is possible. You say, sarcastically, and I quote:
>We will just invent™ a way to get around the very laws of quantum mechanics, fucking physics, thermodynamics and how superconductors work
As if we need to break the laws of physics to be able to miniaturize quantum computers, and therefore it is impossible. So yes, you did IMPLY that miniaturization is impossible, and called me a moron for claiming otherwise.
>neither any arguments.
Oh the irony, this is coming from the guy who replies to my post by calling me a retard, handicapped, and provides no argument at all.
>All you need is to find a way for thermodynamics to not apply anymore hurr durr it's easy
WRONG, dipshit. All you need is to find a way to design the computer without the need for supercooling. The link I provided shows a design that can operate at higher temperatures, and does away with superconductors altogether.
I don't know what makes a faggot like you so sour at the idea of having a quantum computer that fits on a desk in the future. Are you afraid you'll get vanned for the kiddie porn on your hard drive? Lmao, stay mad.
Eli Scott
Wasn't Moore's law supposed to die out a few years ago? Also, who cares about some old cunt's observation on trends at that time?
Adrian Gomez
>Don't spend your money, reeee
>When the economics teacher is a commie
Isaac Butler
Why do they suck? My dad has one and it seems to wig out sometimes, it was also very expensive...
Justin Evans
Yes goy, don't buy a CPU now. Wait for genuine Intel ™ processors.
Carson Green
Tend to fail just after the warranty is up, usually use impossible-to-replace proprietary parts, so once it breaks go buy another $300 shitbox that can only hold 4 drives and nothing larger than 4TB, usually have shit thermal performance, and so on. Consumer NAS boxes are probably the biggest meme there is. Never bought one myself cause they seemed way too overpriced; family and friends did, all regretted it.
Daniel Cox
aapl_128 will exist
Grayson Cooper
>according to "experts"
Fuck off with this nonsense. There is an engineering problem, and engineers will work nonstop to solve it. 5nm EUV FinFET is bringing a solid transistor density increase and die area reduction over 7nm EUV FinFET. After this the industry will transition away from FinFETs for high performance processes, and they will adopt GAA topology. GAAs will bring one of the largest generational power reductions we've ever seen. 3nm GAA processes will likely again double transistor density over 5nm EUV FinFET while having a power reduction of 50% or greater depending on VT. You want to opine about muh Moore's Law ending, then wait til the mid-2020s. Silicon has a lot of life left in it. No one is halting R&D for high performance processes. All these issues the industry faces will be solved with adequate time and money, and they're putting tens of billions per year into it.
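The generational claims in this post compound multiplicatively, which is easy to sketch. Note the 2x density / 0.5x power step is applied to both upcoming nodes here purely for illustration; the post only pins those specific numbers on the 3nm GAA transition, not on 5nm.

```python
# Compound node-to-node scaling (figures are the post's rough claims,
# applied uniformly as an assumption, not vendor data).
density = 1.0   # relative transistor density at the 7nm baseline
power = 1.0     # relative power at iso-performance

for node in ("5nm", "3nm"):
    density *= 2.0   # ~2x transistor density per full node step
    power *= 0.5     # ~50% power reduction per full node step
    print(node, f"density x{density:.0f}, power x{power:.2f}")
```

Two such steps compound to 4x the density at a quarter of the power, which is why "slowed progress" is not the same thing as "no progress."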
Josiah Stewart
It would still be usable if websites didn’t use megabytes of javascript and browsers that don’t require sse2 were still common.
Angel Sanchez
>when a code monkey pretends to know about quantum physics and what will be or not be possible in the coming decades Cringe
Jayden Reyes
>le code monkey cringe btfo'd xDDDDDDDD Fuck off you glow in the dark consumer. Quantum computers have very limited applications and that won't change anytime soon. And again I know you niggers have problems with simple maths already but thermodynamics won't change no matter how much you niggers ignore it.
>Quantum computers have very limited applications Just like conventional computers when they started...
Jacob Wood
Thanks. Since I'm a techlet sometimes I wonder if my current system of USB drives really is the best for me.
Isaac King
>heat descends
Luke Hill
>not understanding insulation
It's a real problem; silicon is not a perfect thermal conductor. HBM stacks have the same issue; clocks are limited by the temp of the bottom DRAM slice.
Samuel James
>LoL
GAA is still planned for 2024, plus 3D chips and more.
Are you fucking dumb? There is still room for improvement with 3d stack.
Zachary Murphy
>silicon Get with the times gramps, graphene is (for once) unironically the future.
Noah Bennett
>Why not just wait another year and buy a computer in 2020?
1) I'm not poor.
2) My time isn't worthless.
Nolan Sanchez
If Moore's law taps out at 2020 how are we going to reach the level of simulation we're all in right now?
Thomas Foster
>regurgitating meme shit you have no clue about
The entire industry, all its infrastructure, all logistical supply chains, are all centered around silicon substrates. This is not changing any time soon. Silicon will not be abandoned as a substrate just because you read some popsci trash clickbait article, reddit.
William Foster
>Getting this ass blasted over literally nothing
Did I say it will instantly replace silicon? No, I didn't even imply it. Simple truth is graphene is currently our best hope once silicon reaches its final limit, aka anything sub-5nm as it currently seems. But I am sure you have extensive knowledge of CPU architectures, lithography and actual quantum mechanics, sure thing kiddo.
Evan Cooper
when am i getting a Joi?
Andrew Powell
user just stop you're embarrassing yourself.
Ethan Long
>I'm talking out of my ass, you must be just as stupid as me!
What a great argument. Have fun with that outlook.
Chase Butler
>Yes because no one will try new architectures
i remember when they tried these new architectures
>Itanium, Pentium 4, Bulldozer
good times
I agree that the difference in performance is not much, but sometimes hardware fails. I used one computer from 2008 and replaced it this year, and I spent only the equivalent of $400 on it. I haven't bought a new GPU yet, but I will when I save some money. Also I'm not stupid enough to buy Intel; for compiling, AMD is better, and if I ever want to play a game, ~5fps doesn't make a difference.
Luke Perry
We will start to go up. Chip on chip on chip on chip on chip, until it turns into something that sounds exactly like a dick measuring contest.
Levi Carter
Nice try, intel, but I'm going to buy Zen 2 when it comes out in a few months.
Landon Stewart
I'm using a 10 year old X58 motherboard, still not obsolete.