Why have processor clock speeds (GHz) stayed basically the same for the last 10 years? Did we hit a wall in CPU development?

Attached: bowie.gif (220x170, 448K)

bump

Because moar coars

Imagine I give you 3 billion incels. Every second the incels pull a collection of wheels to generate power/energy/processing. Then I give you a choice. You can either have a billion more incels or I can take the 3 billion incels back and give you 3 billion gym-bros. Which do you go with?

yes we did, a GHz is a billion cycles per second; the resonance of the crystal material and the speed of light prevent us from going to a trillion cycles per second
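
Rough numbers for the speed-of-light part, assuming a signal moves at about half of c in on-chip wiring (the exact fraction depends on the interconnect, so treat this as a sketch):

# How far can a signal travel in one clock cycle?
C = 299_792_458          # speed of light in vacuum, m/s
v = 0.5 * C              # assumed propagation speed in on-chip wiring (rough guess)

for f in (1e9, 5e9, 1e12):           # 1 GHz, 5 GHz, 1 THz
    cycle = 1 / f                    # seconds per cycle
    print(f"{f/1e9:>6.0f} GHz -> {v * cycle * 1000:.3f} mm per cycle")

At a terahertz the signal covers well under a millimetre per cycle, smaller than a typical die, so you couldn't even get a clock edge across the chip in one tick.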

you can't vibrate something infinitely fast

Dynamic power consumption scales roughly with C·V²·f, and since higher clocks need higher voltage, power grows with something like the square of the frequency or worse (rough numbers in the sketch below). So the main issue is cooling the CPU at these high frequencies.
For this reason the main developments in recent years have been in IPC and the number of cores. Supporting a high core count in a high-end processor requires both more memory bandwidth and a good cooling strategy.
In general, a high number of CPU cores in a high-end processor will also have to be faster if you want a high-end gaming environment such as the GeForce GTX 780 Ti or AMD FX-8850. In the case of the GTX 780 Ti, the performance increase is significant and it's probably more significant for gamers trying to play at 1080p resolutions where pixel jitter is not an issue. Since many PC gamers can't run our system with the card at its max quality, they can play at low or average settings with a small improvement in performance.
AMD recently released their "Titan X" series of high-end graphics cards. Like the previous "Gigabyte" lineup, the "Gigabyte HD 7870 GHz Edition" is a card designed for high-end gaming.
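
To put toy numbers on the power/frequency claim above: dynamic power goes roughly as C·V²·f, and voltage has to rise with clock speed to keep transistors switching reliably. The constants here are made up; only the scaling matters:

# Toy model of dynamic CPU power: P ~ C * V^2 * f
# Assume voltage scales roughly linearly with frequency,
# so P ends up scaling roughly with f^3. Numbers are illustrative only.
def power(freq_ghz, base_freq=3.0, base_volts=1.0, base_watts=65.0):
    v = base_volts * (freq_ghz / base_freq)   # crude voltage/frequency assumption
    return base_watts * (freq_ghz / base_freq) * (v / base_volts) ** 2

for f in (3.0, 4.0, 5.0, 6.0):
    print(f"{f:.1f} GHz -> ~{power(f):.0f} W")

Under that assumption a 3 GHz / 65 W part turns into a ~500 W part at 6 GHz, which is why cooling, not transistor count, became the binding constraint.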

Why is cooling a problem? Just make liquid nitrogen the standard

I wish. Maybe in the future when we all live in the cold of space.

You don't need more than 4 GHz. You need more cores and more per-core performance.

no
I need to maintain 30 FPS in dwarf fortress despite having 50 cats. Cores are for zoomer software

>You don't need more than 4 GHz.
that is an arbitrary limit, in computing terms, imposed by practical crystal-vibration issues. If those issues had occurred at 400 MHz you would say the same thing about that speed, and if 100 GHz were readily available cheaply you would think it was the minimum standard for a laptop

I'm kinda glad we reached the limit because now devs will actually have to focus on optimization

yeah bro that's totally what's happening!

Attached: 1565454110395.png (112x112, 24K)

Attached: wishful-thinking.jpg (2480x1748, 945K)

I think you'd want a whole bunch of L1/L2/L3 cache on the CPU for Dorf Fortress (and faster system RAM wouldn't hurt either). More GHz can help, but only to a certain degree, because the main issue is that your CPU constantly has to wait for your system RAM (which is much slower than the CPU) thanks to the huge number of objects this game makes your computer manage. Rough sketch of the effect below.
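
A crude way to see the "CPU waiting on RAM" effect: chase pointers through an array that fits in cache versus one that doesn't. Sizes and timings are assumptions and will vary a lot by machine:

# Pointer chasing: random hops through a small array (cache-friendly)
# vs a large array (mostly cache misses / RAM-bound).
# Sattolo's shuffle guarantees one big cycle, so every hop is a
# genuinely unpredictable memory access.
import random, time

def one_cycle_permutation(n):
    p = list(range(n))
    for i in range(n - 1, 0, -1):
        j = random.randrange(i)          # j < i -> a single cycle covering all elements
        p[i], p[j] = p[j], p[i]
    return p

def chase(n, hops=2_000_000):
    nxt = one_cycle_permutation(n)
    i, t0 = 0, time.perf_counter()
    for _ in range(hops):
        i = nxt[i]
    return time.perf_counter() - t0

print(f"small (fits in cache): {chase(10_000):.2f}s")
print(f"large (RAM-bound):     {chase(5_000_000):.2f}s")

Python overhead blunts the gap compared to native code, but the large array still comes out measurably slower, and none of that gap is fixed by a higher clock.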

> cant vibrate something infinitely fast
You've never seen me flicking your moms bean kiddo.

"cold of space" would be terrible for supercomputers because dissipating the heat becomes a huge problem.

Yes, we hit limits in physics. Collections of electronic transistors can only go so fast before they start breaking down.

Not true, the first problem encountered is mentioned above. We are literally just talking about the speed of light through silicon.

This, DF is a memory issue first and foremost.
We need to work on having actual SRAM to replace DRAM. If the Ryzen trend continues then DF will be even harder to play in 10 years.

At the beginning they actually downgraded the requirements. Now if you want something different you need to think of a server

lol

The best way to understand what's going on in today's industry is to use an analogy.

Let's say we run a restaurant with two floors and a big kitchen. Each floor serves one dish; it's always an appetizer. Each table seats 10 diners, has a waitress, and gets the same menu, the same thing every time. Orders keep coming in while we prepare the food, and it's very clear that each table should get something different from the one before. That being said, we still can't move many people from one floor to another; there is only so much space. That's where we are now in the computing world.

There hasn't been a huge paradigm shift since 2000 or so, when Intel launched its Turbo architecture—which went on to become Intel's "Haswell" architecture. But there were also plenty of changes in the way the chip industry developed. Why isn't that change happening in the computing market?

Intel's strategy has been to build chips in two main areas: low-power and high-power design. At the end of the day, the goal is "performance over power."

Surely it's the opposite?
Heat would dissipate fastest in a vacuum.

Increased frequency means even shorter wavelengths inside the die, causing problems with clock propagation and other transmission-line effects, not to mention increased capacitive effects. Transistors themselves also have a limited switching time.
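
On the switching-time point, a back-of-the-envelope look at how much logic fits in one cycle. The gate delay here is an assumed ballpark figure, not data for any specific process:

# How many gate delays fit in one clock cycle?
gate_delay_ps = 5.0                      # assumed per-gate delay, illustrative only

for f_ghz in (1, 3, 5, 10):
    cycle_ps = 1000.0 / f_ghz            # 1 GHz -> 1000 ps per cycle
    levels = cycle_ps / gate_delay_ps
    print(f"{f_ghz:>2} GHz -> {cycle_ps:6.0f} ps per cycle -> ~{levels:.0f} gate delays of work")

Past a point the whole cycle goes on a handful of gate delays plus clock distribution margin, which is part of why pipelines got so deep in the NetBurst era and why raw frequency scaling stalled.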

There's little issue in generating ~10 GHz signals today
Stop posting

Isn't 5 GHz about the practical limit for silicon? If they switch to GaN or something similar, transistor capacitance is 10x less.
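
The capacitance point as a toy RC figure. Switching delay scales roughly with R·C, so cutting C by 10x cuts the delay (and lifts the achievable clock) by roughly the same factor, all else being equal. The resistance, capacitance and logic depth below are invented; only the ratio matters:

# Toy RC switching-delay comparison. Values are illustrative, not real device data.
R = 2_000                     # ohms, assumed effective driver resistance
C_si = 2e-15                  # farads, assumed per-gate load on "silicon"
gates_per_cycle = 30          # assumed logic depth of one pipeline stage

for name, c in (("silicon-ish", C_si), ("10x lower C", C_si / 10)):
    stage_delay = gates_per_cycle * R * c        # seconds for one pipeline stage
    f_max = 1 / stage_delay                      # rough clock ceiling
    print(f"{name:12s} stage delay {stage_delay*1e12:6.0f} ps -> ~{f_max/1e9:.0f} GHz")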

There are 3 ways to move heat.

1. Thermal flow (convection). Imagine you have a hot pocket of air and a wind. The wind moves the warm air (and therefore the heat energy) from place A to B. Not applicable for CPU cooling here because we want the energy out of our CPU, not the CPU somewhere else.

2. Thermal radiation. Everything with heat energy radiates EM waves. This is slow as fuck; EM waves can only carry so much energy out of the system (rough numbers after this post). And you can't just say "let's output only X-rays". Have you ever seen a CPU glowing?

3. Thermal contact (conduction). When you touch a hot spoon you burn your hand, because a hot source and a cold drain are in contact, so the heat flows from warm to cold. This can take away the most heat.

When you enter space there is no matter that can "accept" your heat. So you won't freeze to death right away; other stuff kills you first.
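
To put a number on point 2: a Stefan-Boltzmann estimate of how much a bare CPU-sized surface could radiate. The die area, emissivity and temperatures are assumptions for illustration:

# Net thermal radiation from a hot plate: P = e * sigma * A * (T_hot^4 - T_env^4)
SIGMA = 5.670e-8          # Stefan-Boltzmann constant, W / (m^2 K^4)
emissivity = 0.9          # assumed
area = 0.02 * 0.02        # assumed 2 cm x 2 cm exposed surface, m^2
t_die = 373.0             # 100 C die, in kelvin
t_env = 293.0             # 20 C surroundings, in kelvin

p_rad = emissivity * SIGMA * area * (t_die**4 - t_env**4)
print(f"radiated: ~{p_rad:.2f} W vs a ~100 W TDP part")

Under those assumptions the die radiates a fraction of a watt while it needs to shed on the order of 100 W, so in a vacuum you'd need enormous radiator panels, which is exactly how spacecraft do it.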

based analogy, fuck incels

idk but my old-ass i3 I bought back in 2014 still works just fine and runs the latest games without any issues
explain that techtards

Then riddle me this:
If radiation is so weak, how come we lose most of our body heat via radiation?

your body isn't made out of silicon and metal

We don't lose heat as thermal radiation...
We lose heat through thermal contact with air or whatever else your skin is touching

speak for yourself

Attached: Screenshot_20190812-160855_Wolfram_Alpha.png (1080x1920, 110K)
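
For comparison, the same Stefan-Boltzmann formula applied to a person, which is presumably the kind of estimate in that screenshot; the area, temperatures and emissivity below are my own assumed round numbers for bare skin in a 20 C room:

# Net radiative heat loss from skin: P = e * sigma * A * (T_skin^4 - T_room^4)
SIGMA = 5.670e-8      # W / (m^2 K^4)
emissivity = 0.97     # assumed; skin is close to a black body in the infrared
area = 1.7            # m^2, assumed body surface area
t_skin = 306.0        # ~33 C skin temperature, in kelvin
t_room = 293.0        # ~20 C room, in kelvin

p = emissivity * SIGMA * area * (t_skin**4 - t_room**4)
print(f"~{p:.0f} W lost by radiation")   # roughly the order of a resting metabolic rate

A body gets away with radiating a meaningful share of its heat because it has a couple of square metres of near-black-body surface at a modest temperature; a few square centimetres of silicon does not.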

we can always go 3D, adding another axis to the CPU

>Fails Intro to Computer Architecture

That would certainly reduce clock sure but heat management would suck.

*clock skew

>hurp let's just drown our computers in oil or liquid nitrogen
Or we could stop writing inefficient and bloated code?

>if you want a high-end gaming environment such as the GeForce GTX 780 Ti or AMD FX-8850
Nice copy-pasta, why would you try so hard to look smart on an anonymous board? Are you retarded?

Attached: whatyear.jpg (480x360, 70K)

This opinion is so ridiculous yet so prevalent. That post is correct. A 500GHz core is far superior to 500x1GHz cores.

Because frequency really doesn't matter.

Try running a CPU-demanding game, like Dead Rising 4, with high settings (assuming you have at least a 1060/580), and then come back.

This post assumes "increase CPU frequency" isn't matched by increased memory speeds. If we could clock a core to 500GHz we'd clock the RAM to 100GHz and Dwarf Fortress would run crazy fast.

OP is a fag

youtube.com/watch?v=9eyFDBPk4Yw

Attached: Hopper-3 square.jpg (300x300, 17K)

>stagnation for 7 years
technology is dead

Attached: chrome_ZuVEBty7ii.png (1709x282, 64K)

based

>... or around their neck ...
damn she's brutal

>A 500GHz core is far superior to 500x1GHz cores.
Source: Your ass.
Even though it depends a lot on how the software is written, the general rule is that parallelism allows higher performance than raw frequency in multitasking applications; the exceptions are programs that are naturally sequential, like compression or encoding/decoding.
The NetBurst vs K7/K8 generation proved this very clearly: even in single-core, being able to perform more operations per cycle was more efficient than having higher frequencies for most applications.
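
The cores-vs-frequency trade-off in the abstract is basically Amdahl's law: the speedup from N cores is capped by the serial fraction of the program, while a frequency bump (to a first approximation) speeds up everything. A minimal sketch:

# Amdahl's law: speedup(N) = 1 / ((1 - p) + p / N)
# p = fraction of the work that can run in parallel
def amdahl(p, n):
    return 1.0 / ((1.0 - p) + p / n)

for p in (0.5, 0.9, 0.99):
    print(f"parallel fraction {p:.0%}: "
          f"8 cores -> {amdahl(p, 8):.1f}x, 500 cores -> {amdahl(p, 500):.1f}x")

Which is why something naturally sequential like a compressor barely benefits from more cores, while a huge clock bump (if physics allowed it) would help it directly.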

Good luck getting over 50fps at 1080p on any game made since 2016 on high. Not to mention the stuttering even if you do get those framerates. Even quad cores better than your i3 stutter nowadays.

Depends on the workload.

As usual, OP is a fucking faggot.
en.wikipedia.org/wiki/Instructions_per_second#Timeline_of_instructions_per_second

>A 500GHz core is far superior to 500x1GHz cores.
depends on the application; if you're running 500 small tasks, the 500 cores save you the time wasted in the overhead of task switching

Yep, on a 3rd gen i5 with an RX 580 I get stuttering in DR4, and occasionally in the new RE2, at 900p.

Because Moore's Law is on suicide watch.

software.intel.com/en-us/blogs/2014/02/19/why-has-cpu-frequency-ceased-to-grow

that's literally what is going on
the 7700K at 5GHz was a literal housefire
after that, you won't see any 6+ core chip running at 5GHz that can be used on a regular basis

I don't think this analogy is remotely true. And 4 billion incels would produce more power than 3 billion alpha chads. 1 billion more dudes far outweighs the extra performance per person.

Kek. Anon btfo