
First half of the 00s. Software developers predicted CPUs with more than 10 GHz. Instead, hardware developers introduced multicore processors.

What happened?

Attached: maxresdefault.jpg (1280x720, 22K)

>What happened?
reality caught up

Single-thread performance is hard to scale because of latency; core count, on the other hand, scales roughly linearly with transistor count.
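
Not from the thread, but the usual back-of-the-envelope for this is Amdahl's law; a minimal Python sketch, assuming a made-up 90% parallel fraction:

```python
# Amdahl's law: extra cores only speed up the parallel fraction p of a task,
# which is why single-thread latency still matters (p = 0.9 is an arbitrary example)
def speedup(p, n_cores):
    return 1.0 / ((1.0 - p) + p / n_cores)

for cores in (1, 2, 4, 8, 16):
    print(cores, round(speedup(0.9, cores), 2))
# 16 cores on a 90%-parallel task give only ~6.4x, nowhere near 16x
```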

>What happened?
physics

muh Crysis 1

Attached: maxresdefault.jpg (1280x720, 49K)

Scaling to multiple threads is also an issue because of RAM latency.

greed

Attached: 25A0046D-D058-4076-B0A8-2B06AA134044.jpg (1024x576, 82K)

Yup.
Still manages to give trouble in 4k to a 2080 Ti 10 years after release

AMD happened

2080 Ti shits itself on Fallout 76

Companies got lazy and cheaped out.
This.

>sucking on the teat of the frequency jew

I'd shit myself too if I had to render frames for such a shit game


ive never played it

Blame the Fallout 4 engine.
Manages to fuck up with any GPU and with G-Sync/FreeSync.
It's a marvel of autism

the fact that the most powerful gaymen GPU on the market struggles running a modded 2015 game (that looked like a 2012 game when it came out) is fucking fascinating though
just how much do you have to fuck up to even achieve something like that?

Some of the rendering techniques weren't as efficient as modern ones either. They also used forward rendering with dynamic lighting and lots of postprocessing, which tends to kill performance.
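
Rough illustration of that point (every number below is invented, only the scaling shape matters): classic forward rendering shades each object once per dynamic light, so cost grows with objects × lights, whereas deferred shading decouples the two.

```python
# toy cost model, not a real renderer: counts are made up to show the scaling
def forward_cost(draw_calls, lights, pixels_per_call):
    # classic forward: each object is shaded once per light affecting it
    return draw_calls * lights * pixels_per_call

def deferred_cost(screen_pixels, lights, gbuffer_ops):
    # deferred: write a G-buffer once, then shade per light over covered pixels
    return screen_pixels * gbuffer_ops + lights * screen_pixels

for n_lights in (1, 4, 8):
    print(n_lights, forward_cost(2000, n_lights, 10_000), deferred_cost(1920 * 1080, n_lights, 4))
# forward work multiplies with the light count per object; postprocessing is paid on top
```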

Woops. Forget that, the bench was with the 1080 Ti.
A 2080 Ti should handle it well.

Phones

Let's take a moment to laugh at AyyMD

Attached: 1.png (711x455, 58K)

it's called CRYTEK because it runs so bad it makes you cry

I mean it's simple really

Just revert to ...and you have your answer

You can juxtapose a myriad of different problems that made things this way but every single negative outcome stems from these 7 deadly... problems

Lust, Gluttony, Wrath, Greed, Envy, Sloth, & Pride

In this case you can accurately pinpoint it to greed.

Attached: F84C442D-FDF3-4284-BFDE-39B3194A37CB.jpg (305x458, 23K)

There's no reason we can't do 10 or even 100 GHz clock speeds. There are loads of transistors with specially doped junctions capable of those speeds. They're typically used in RF oscillator design. I'd guess they aren't really used in PCs because it's hard to build nanometer-scale waveguides.

speed of light becomes an issue at those speeds though; you can't keep the whole die in the same clock phase. also, at those frequencies even the smallest fucking trace becomes an antenna/receiver

this
Just add moar coars

>100GHz clock speeds CPU
So 100 000 000 000 clock signals per second.
Speed of light is 100 000 000 m/s.
Light can only travel 0.1 cm (= 1mm = 0.039 inch) during each clock tick.
And that's assuming you have a perfect electronic circuit in which electrons move at their max speed.

>Speed of light is 100 000 000 m/s.

Kek. How did I fuck that up.
it's about 300 000 000 m/s
So 0.3 cm (= 3mm = 0.118 inch)
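
Quick sanity check of the arithmetic above, using the corrected c and the hypothetical 100 GHz clock from this thread:

```python
c = 3.0e8         # speed of light in vacuum, m/s
f = 100e9         # hypothetical 100 GHz clock
per_tick = c / f  # distance light covers in one clock period
print(f"{per_tick * 1e3:.1f} mm")  # 3.0 mm, i.e. 0.3 cm / ~0.118 inch as corrected above
# real on-die signals propagate well below c, so the usable radius is even smaller
```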

>Instead, hardware developers introduced multicore processors.
>What happened?
multicore allows better multitasking; companies are what matter, and they need multitasking
also, you can balance the load of a single task across many cores (see the sketch below), but you can't do more multitasking on a single core
multicore processors are the way to go
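
A minimal sketch of the "balance the load of a single task across many cores" point, using Python's standard multiprocessing module (the workload and chunking are made up for illustration):

```python
from multiprocessing import Pool, cpu_count

def partial_sum(bounds):
    lo, hi = bounds
    return sum(range(lo, hi))

if __name__ == "__main__":
    n = 10_000_000
    cores = cpu_count()
    step = n // cores
    # split one big task into per-core chunks; the last chunk absorbs the remainder
    chunks = [(i * step, n if i == cores - 1 else (i + 1) * step) for i in range(cores)]
    with Pool(cores) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total == n * (n - 1) // 2)  # same result as doing it on one core
```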

And Intel followed suit because?

silicon limits

You do realize clock speed isn't the only factor when determining processing speed, right?

>moar memory sticks

Transistors simply can't go this fast.
And the funny part is that the Pentium 4 was designed for 15 GHz and made all sorts of performance sacrifices so it could theoretically be clocked at that speed, but it never ended up happening.

The Cell processor is another one that fell for that meme, being designed for 6 GHz.

3e8 is a lot more than 1e8, anon

We were supposed to be here 7 years ago

Attached: 10Ghzby2011.png (1080x1920, 157K)

Intel probably already did that but isn't releasing it to juice out more money
(I bet NASA, illuminati and communist government are already using them)

>Transistors simply can't go this fast.
There are specialized transistors that work at dozens of GHz.

It wasn't just software developers. I have a textbook on microprocessor design written in ~2004 that also parrots the "10GHz by 2010" meme, and it was written by a pair of microarchitecture engineers.

You forgot the NSA and the Air Force.
Even though NASA and the military unironically use the most primitive hardware under the sun.

When they start making graphene chips, we'll finally be able to run Crysis at 60 fps.

We couldn't make gate oxides thinner due to physics, and we couldn't make supply voltages lower due to leakage, so power exploded.
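
That's the standard first-order story in numbers: CMOS dynamic power scales as C·V²·f, so if leakage stops you from lowering V, cranking f alone blows the power budget. The capacitance and voltage below are made-up examples:

```python
def dynamic_power(c_farads, v_volts, f_hz):
    # first-order CMOS switching power: P ~ C * V^2 * f
    return c_farads * v_volts ** 2 * f_hz

C = 1e-9   # hypothetical switched capacitance per cycle, 1 nF
V = 1.3    # hypothetical supply voltage that leakage keeps us from dropping
for f in (3e9, 10e9):
    print(f"{f / 1e9:.0f} GHz -> {dynamic_power(C, V, f):.1f} W")
# same voltage, 3.3x the clock => 3.3x the dynamic power, before leakage is even counted
```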

>First half of the 00s. Software developers predict CPUs with more than 10 GHz.
Because that's what the hardware vendors, namely Intel, told them.

Tejas was supposed to be 5 GHz, Jayhawk 7 GHz+

>Instead, hardware developers introduced multicore processors.
Physics ruined Intel's plans.

Yes, but they're fucking big.

>what is laws of physics
>what is Intlel promising shit they don't deliver like they always do
>what is OPs reckless faggotry as this shit takes 2 seconds to look up

I can accept getting c wrong, but using inches is inexcusable

>using inches is inexcusable
Mutts throw a tantrum if you use measurements they can't understand