Why does everyone keep telling me not to upgrade my CPU now because we're on the cusp of a huge advancement?

Are they just talking out of their ass? What advancement?

Attached: 1532776109758.jpg (552x754, 59K)

CPU performance has plateaued significantly in the last 7 years. Don't even bother.

7nm Ryzen Gen 3. Just wait.

FPBP

If you bought a medium to high range CPU, you likely have all the CPU power you need. It isn't like the 90's where medium range between years gave you performance boosts by leaps and bounds.

Attached: 1293135032181.jpg (887x904, 38K)

If you've got a bottomless wallet, the Intel i9-9900K might be interesting... or a house fire... probably a house fire.

We're always on the "cusp" of advancement because of marketing.

But my 2500k is finally dying after all these years.

Attached: 1531429819550.png (1106x1562, 1.1M)

More like Windows is being designed to work faster on cheap dual-core tablets.

Depends what you're doing, to be honest; there are certainly situations where the new-gen CPUs just make sense. A lot of professional work has seen huge benefits from getting cheap 16+ core CPUs with relatively low power draw compared to the offerings we had back in 2011.
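
Whether those extra cores actually help depends on how parallel the workload is. A quick Amdahl's-law sketch (the parallel fractions here are made-up illustrative numbers, not benchmarks):

```python
def amdahl_speedup(p, n):
    """Upper bound on speedup with n cores when a fraction p
    of the work can run in parallel (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / n)

# A render job that's ~95% parallel scales well to 16 cores...
print(round(amdahl_speedup(0.95, 16), 1))   # ~9.1x
# ...but a 50%-parallel desktop workload barely benefits.
print(round(amdahl_speedup(0.50, 16), 2))   # ~1.88x
```

So "huge benefit" is real for rendering, compiling, and simulation, and mostly irrelevant for typical desktop use.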

I was wondering the same thing, OP. My 4770k has been working like a charm for ages and covers all my daily use cases/gaming. Question is, how long until I have to upgrade?

Yes, they're talking out their asses. Intel shills be like "skylake++++ on 14nm++++ has more plusses than the last one! AMD BTFO!"

Ignore and/or mock at your leisure. If you bought an Intel CPU in the last 5 years, or have a Ryzen of any sort, you can ignore all CPU shilling threads - Intel /and/ AMD.

What the fuck

>What advancement?
Quantum processors

...?

Not for general use.

Attached: 1530141218458.jpg (1280x720, 88K)

Because Q1 2019 AMD is releasing 7nm Zen with revised lithography. That includes an IPC increase and the possibility of more cores per CCX.

Just wait bitch.

I believe that one day there will be more uses for quantum computing: not only scientific and medical research, but 3D modeling, faster renders for film, CGI effects, and insane game graphics and physics from an engine that simulates atoms. Probably a massive business, selling quantum GPUs and CPUs; every architect, doctor, film producer, graphic designer, and game developer would be using a quantum computer in their office.

I want to make her joycons click, if you know what I mean

>pic

Attached: 1516131470720.png (205x246, 14K)

How is it dying? It either works or it doesn't.

It keeps overheating.

This.
When will people finally accept that the original Moore's law no longer applies?

chips are being designed to run more efficiently
yet we're still nowhere near reversible computing
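
For reference, the original Moore's law is just transistor counts doubling roughly every two years. A toy projection from the Intel 4004's ~2,300 transistors shows what strict doubling would predict (the 2-year period and the start point are the only inputs; this is an illustration, not real data):

```python
def moores_law(n0, years, doubling_period=2.0):
    """Projected transistor count after `years`, assuming a
    doubling every `doubling_period` years (classic Moore's law)."""
    return n0 * 2 ** (years / doubling_period)

# Starting from ~2,300 transistors (Intel 4004, 1971), strict
# doubling every 2 years predicts ~77 billion transistors by 2021.
print(int(moores_law(2300, 50)))
```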

Depends on what you do and what you have.
I went from Phenom II X4 940 -> i7 3770k -> Threadripper 1950X. Each of those were big upgrades.

Whether newer CPUs are actually significantly more powerful or not, don't upgrade unless a newer CPU will actually allow you to do stuff you couldn't do before.

Change the thermal paste and clean the heat sink.

Boom, another 10 years of service.
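
Before blaming the paste, it's worth checking actual numbers. A minimal sketch that pulls per-core temperatures out of `lm-sensors`-style output (the sample text and the 80 °C threshold are assumptions; run `sensors` yourself for real readings):

```python
import re

def core_temps(sensors_output):
    """Extract per-core temperatures (°C) from `sensors`-style text."""
    return [float(m) for m in
            re.findall(r"Core \d+:\s*\+?([\d.]+)°C", sensors_output)]

# Sample text in the format lm-sensors prints; real values will vary.
sample = """\
coretemp-isa-0000
Core 0:  +92.0°C  (high = +80.0°C, crit = +100.0°C)
Core 1:  +89.0°C  (high = +80.0°C, crit = +100.0°C)
"""
temps = core_temps(sample)
if temps and max(temps) > 80.0:
    print("running hot, repaste and clean the cooler:", temps)
```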

Tell that to my Richlands APU, which appears to have faulty graphics - it works, then it doesn't, then it works again... and then it doesn't again.

Pretty much this.
If you have a quad core you're probably still good.

time travel feature for windows 10 is coming soon
you will need a good cpu for that

If you've ever used one of those, you know that's a lie.

If you don't want advice, don't ask for it. If people are telling you not to, it's probably because you have a solid processor and it's a waste of time and money to upgrade. Otherwise shut up, get one, and do what you want.

Your thermal paste is drying out, dumbfuck.
Get a new CPU fan and put some Noctua or Arctic Silver paste on it.

This basically.

I don't really feel like I need to upgrade my 4670k.

I definitely need a GPU upgrade to keep up with VR, but that's all I'll be upgrading.
Just paste should be fine. No need for a new cooler.

Why are you so worried about what other people say?

Do you need more power? Upgrade.
You don't? Don't upgrade.

It's not that hard.

this

Attached: 1532784212618.png (378x370, 120K)

>we're on the cusp of a huge advancement?

Intel's plan 1.5 years ago was to keep selling 4c8t CPUs for ~$300 for the next decade, with like 5% year-over-year perf improvements. Then last Spring, AMD starts selling $300 8c16t chips with about the same single-threaded perf as Broadwell, and Intel finally starts selling 6c chips at non-premium prices, and 8c chips (i.e. i7-9700k) in the near future.

HOWEVER, there is a very strong chance that AMD will go for the jugular in early 2019 with 12+c/4.5+GHz (plus maybe 10-15% IPC uplift) consumer chips for non-insane prices, which is something that Intel probably hoped to postpone till somewhere around the year 2030. Intel will have to counter (maybe in 2019, maybe in 2021, but who knows), which finally gets us out of the rut we've been in for a decade.
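
Back-of-the-envelope on what that would buy: single-thread performance scales roughly as IPC times clock. A sketch (the 12% uplift and the 4.2 -> 4.5 GHz clocks are guesses in line with the rumors above, not announced specs):

```python
def relative_perf(ipc_gain, base_clock_ghz, new_clock_ghz):
    """Single-thread perf ratio: (new IPC / old IPC) * (new clock / old clock)."""
    return (1.0 + ipc_gain) * (new_clock_ghz / base_clock_ghz)

# Assumed 12% IPC uplift plus a 4.2 -> 4.5 GHz clock bump:
print(round(relative_perf(0.12, 4.2, 4.5), 2))  # ~1.2x
```

That compounding of IPC and clock, multiplied across more cores, is the whole "go for the jugular" scenario.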

/thread

but that's wrong you retard

Zen 2 is looking like the CPU to get. Which worries me because I know AMD will find a way to fuck things up.

>which finally gets us out of the rut we've been in for a decade
Do you really mind the rut? Be honest with yourself. Because Intel has been happy to just merrily Jew along, your old hardware remains useful for a longer period of time. There are tons of people who use 10-year-old laptops and have to make comparatively few sacrifices in everyday usage. An i5-2500k is still perfectly fine for most games. Sure, if you operate in a field where the benefits from faster processors are disproportionately large relative to the increase in processing power, then Intel's laziness is probably hurting you. But if you are like 95% of people on Jow Forums and the vast majority of consumers, then it's good that AMD has been stuck in the doldrums and doing nothing for the better part of 7 years. It forces programmers to be more efficient with the chips they have available for use, and you get to keep your shit longer.

I do not have particularly fond memories of the exponential increases in processing power from the 1990s and 2000s. A state-of-the-art, $2,000+ system was behind the curve within 2 years and was completely outclassed, obsolete and useless within 4-5 years. Meanwhile I am on the same T500 I bought in 2009 and the same gaming PC I built in 2011, and aside from a new graphics card, I haven't had the urge to replace anything.

The funny thing is that Intel's severe Judaism is killing Microsoft.
>slower refresh cycles lead to cratering OS sales as most people don't change OSes once they get a PC
>even forcing upgrades on people for years wasn't enough for 10 to dethrone 7
>stagnant PC CPU performance means after a certain point the only way to increase FLOPs is to offload the serious computation to GPUs or "the cloud" (a cluster of remote Linux systems)
>anything that runs in "the cloud" is accessed over a network, usually via standard protocols like HTTP, S3, SSH, or RDP that all have clients for multiple platforms, including Linux and mobileshit
>GPGPU APIs are all available on Linux, and are the basis for Linux's sole occupancy of the entire TOP500 supercomputer list
All of these things weaken the stranglehold Win32 has on applications, which is the only reason people put up with Microsoft's bullshit on the Windows desktop. The Windows desktop in turn underpins the value proposition for all of Microsoft's portfolio. Windows Server in particular is a dire mess propped up only by the need for Active Directory to manage Windows PCs.

> being able to play slightly old games at framerates and resolutions undreamt of at release, on not bleeding edge hardware
> so awful

Attached: cuomo on no.jpg (653x477, 58K)