This is still more than enough today, but the Pentium 3 at 550 MHz was terribly obsolete by 2007

why?

Attached: intel-core-i7-2700k-socket-1155-lga1155-quad-core-4cores-processor-cpu-electgen-1806-20-electgen@2.j (500x450, 32K)

Other urls found in this thread:

pcgamesn.com/amd/zen-3-cpu-tsmc-7nm

Jews.
4 cores is enough goy

Moore's law was repealed.

They aren't really being pushed to innovate very fast anymore. When it comes to architecture it feels like we are in a slump; no one is really being creative. Intel is going to start working on hybrid chips with a bunch of stuff normies would want, but I would like to see something for enthusiasts: a dual-CPU board with one slot for a single massively over-engineered core that can withstand higher voltages, with a quality soldered IHS and heat dissipation designed at the enthusiast level, and a second slot for a standard multi-core chip, the two working together since so much is still dependent on single-core brute force. End result: a machine with an independent single core capable of hitting very high clocks without melting.

diminishing returns

kinda no need, CPUs have been able to do everything a consumer needs for a while now

Because pc gaming is dead.

I still have a Pentium 4 computer and applications such as Firefox are dog slow on it. The reason why Pentium 3s and 4s were so quickly obsolete is twofold. Firstly, they are single core so all processes have to share the CPU. Secondly, modern software is so badly written that if even one process is a hog, all the other processes starve. Heaven help you if you have more than one process that's a CPU hog.

Which brings me to the browsers. They're all hogs in one way or another. Firefox is just beyond belief, sucking up tons of RAM and CPU slices even when the only page being viewed is about:blank. Almost all software is written using crappy frameworks and there is a very heavy Pajeet influence on the software. That's why even trivial programs take 30 MB or more of RAM nowadays to run.

Firefox and Chrome regularly take up over 4 GB of RAM each on my computer.

>I still have a Pentium 4 computer

Lol poorfag, cry more.

idiot

imagine being unable to read

I worded the thing badly. I happen to still have a Pentium 4 computer as well as a Celeron computer. However my main computers are Core i7 machines.

/thread

Intel had no incentive to dump lots of money in R&D after Nehalem; they already owned the market

It's no coincidence that we didn't see Core i9 until Intel had some real competition again.

You still see people rocking overclocked i7 920's literally a decade later, and they're still decent enough for modern games

>can play anything u throw at it

Attached: chad fx .jpg (225x225, 14K)

Prioritization of mobile chips with low power usage over desktop performance.
Amdahl's law (quick sketch below).
The stagnation of advances in clock speed-enabling technologies.

Why would anyone spend the time to optimize their application for desktop usage when by far the biggest market is mobile? If your application can't run effectively on 25W CPUs then it's dead or a niche market in 2019.
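
Since Amdahl's law got name-dropped above, here's a minimal sketch of what it actually says, nothing authoritative; the parallel fractions are made-up illustration values, not measurements of any real program:

# Amdahl's law: ideal speedup on n cores when only a fraction p of the work is parallel.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# Assumed parallel fractions, purely for illustration.
for p in (0.50, 0.90, 0.99):
    for n in (2, 4, 8, 64):
        print(f"p={p:.2f}  n={n:2d}  speedup={amdahl_speedup(p, n):5.2f}x")

Even a 90%-parallel workload tops out below 10x no matter how many cores you throw at it (the ceiling is 1/(1-p)), which is why the thread keeps circling back to clocks and IPC.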

Cause pajeets can't into multithreading.

and also stutter like a motherfucker

Based FX

Mine is smooth as hell because moar cores.

Attached: moar cores.png (439x439, 40K)

>Cause pajeets can't into multithreading.
Fucking this, if applications scaled perfectly, the 2600K would be obsolete because its multicore performance is insanely weak.

Processors couldn't do this...

Attached: PowerDensityWithClockScaling.png (560x486, 27K)

So they did this...

P.S. anyone who replies otherwise is a brainlet or trolling

Attached: ThermalPlateau.jpg (871x868, 86K)
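
To put rough numbers on those two pics: dynamic power goes roughly as P ≈ C·V²·f, and cranking the clock usually means cranking the voltage too, so power density blows up way faster than frequency. Toy sketch only; the capacitance, voltage and frequency constants are made up for illustration, not real chip parameters:

# Rough dynamic-power model: P ≈ C * V^2 * f (toy constants, not a real chip).
C_EFF = 1.0e-9    # effective switched capacitance in farads, assumed
F_BASE = 2.0e9    # base clock in Hz, assumed
V_BASE = 1.0      # core voltage at the base clock, assumed

def dynamic_power(freq_hz, volts):
    return C_EFF * volts ** 2 * freq_hz

# Assume voltage has to rise roughly with frequency to keep the chip stable.
for mult in (1, 2, 4):
    f, v = F_BASE * mult, V_BASE * mult
    print(f"{f / 1e9:4.1f} GHz @ {v:.1f} V -> {dynamic_power(f, v):6.1f} W")

Doubling the clock this way costs roughly 8x the power (2x from f, 4x from V²), which is the wall the second image is showing.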

Computer technology and especially CPUs have hit various limits, that's why. They aren't getting much smaller anymore, clock speeds aren't increasing and so on. You'll obviously be able to use your 5 year old CPU to run current software when current CPUs are barely faster.

Take the leap from the 80286 to the 80386 as an example: that was an above-2x increase in performance from one CPU generation to the next. Intel gave us 1% gains for years while AMD had nothing, and only barely started trying when AMD finally delivered something with Ryzen.

Not saying it's all bad and boring these days, but it's not exactly worth getting excited when a new CPU comes out.

There are some other areas where things aren't exactly progressing, for other reasons, btw. We had Gigabit Ethernet in 1999. Today desktops and laptops still come with... gigabit ethernet. Given the difference in things like RAM, CPU power, storage and so on between 1999 and today... shouldn't we be using faster than gigabit ethernet?

Stagnation. AMD only recently caught up with Ryzen and the improvements will now continue as before.

I just hope the devs will write games for moar and moar cores because that's where Ryzen is going; Intel doesn't know what to do and keeps increasing the clock speed like a P4, thinking it's gonna help.

because CPUs can't clock 10 times higher than they did in 2007

It's not just "muh we're too lazy", you physically cannot just double the clock speed like the old days without burning the CPU to a crisp. Any improvements now have to come from better architectures, which will give marginal but not transformational improvements. You're an idiot if you think CPU companies wouldn't double the performance every generation if they still could.

Oh right, intel magically stopped producing any improvements once they virtually killed off AMD, but now suddenly as AMD put forth Ryzen, they are producing pre-sandybridge level improvements.

Go be a kike elsewhere.

>are producing pre-sandybridge level improvements
but they're literally not

clock for clock there's like zero improvement between the latest chips and Sandy Bridge

You're right that it's also becoming harder to shrink transistors (and in fact, geometric scaling in the old sense is also pretty much dead since new shrinks require exotic geometries like FinFETs), but that is irrelevant for PC compute performance since you can't clock them anywhere near their full potential anyway. The next big digital thing is probably memory but there's no "magic bullet" in that field. Moore's law was a one-off phenomenon that we'll probably never see another equivalent of in our lifetimes.

>You're an idiot if you think CPU companies wouldn't double the performance every generation if they still could.
You're totally right about clock speeds: going from 4.77 MHz to 16 MHz was a huge increase, just like going from 66 MHz to 133 MHz was. We've hit a wall there; clocks aren't increasing.

Yet I disagree with you when it comes to Intel. Four cores, pay extra for the i7 to get eight threads (but still just four cores), for years and years. Then AMD shows up with Ryzen and Intel's suddenly magically able to rapidly push 6 cores to the consumer market. Don't tell me they couldn't have done that years earlier; they would have if there was competition (= a reason to do it).

Exactly.

How many cores does the i7 have now? Is it four? Or more than four? If it's not four, but more than four.. when exactly did this change occur? Was there some competing product launched right before that change?

I'm not claiming Intel wouldn't hold back (or software-lock) some *architecture* improvements due to lack of competition, but again, that's a ~10%-a-generation deal, not a 2x deal. If it was possible to make something 2x as good as a late-model i7 with a mere lithography improvement (i.e. minimal need to change the die layout), someone would be building a fab right now to do it.

What's crazy is that intel could've had 6 core mainstream chips since Gulftown, they're just so Jewish they refused to deliver it.

Attached: CPU Store.png (720x720, 277K)

Intel had the capability to build 8-core/16-thread CPUs in the Sandy Bridge era. They had Xeons. Heck, they even had 6-core/12-thread CPUs marketed for HEDT and selling for $1000+. But they kept it that way for as long as they could, as long as AMD remained dead.

The excuse just doesn't work.

The amount of CPU power needed for day-to-day use was finally satisfied with dual cores around 2007; after that, the only real push in day-to-day performance was SSDs.

Any performance problem in day-to-day usage is because of continuous bloat on all fronts.

>How many cores does the i7 have now?
They only lowered prices and changed names, the tech is fundamentally unimproved

rem Windows timer tweaks (run from an admin prompt): don't force the HPET as the clock source, disable the dynamic tick
bcdedit /set useplatformclock false
bcdedit /set tscsyncpolicy enhanced
bcdedit /set disabledynamictick true
bcdedit /set useplatformtick false

> Core count is CPU performance
More cores only help for some applications; that's why 4 is more than enough for most "normal" PCs. I'm talking about pure single-core performance, because any application, no matter how poorly written, will run 2x as fast on a CPU with 2x the clock speed and the same architecture. Stuff like how many cores to put in is, at this point, a business/marketing and systems engineering decision, not a key "target spec" of the CPU.
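
To make the clock-vs-cores point concrete with the same Amdahl formula from earlier (parallel fractions are assumed, and memory bottlenecks are ignored, as the post above assumes):

# Compare doubling the clock vs doubling the core count (4 -> 8) when only a
# fraction p of the runtime is parallel. p values are assumed for illustration.
def speedup_from_cores(p: float, before: int, after: int) -> float:
    return ((1 - p) + p / before) / ((1 - p) + p / after)

for p in (0.3, 0.6, 0.9):
    clock_gain = 2.0  # 2x clock at the same IPC helps everything
    core_gain = speedup_from_cores(p, 4, 8)
    print(f"p={p:.1f}: 2x clock -> {clock_gain:.2f}x, 4->8 cores -> {core_gain:.2f}x")

Doubling the clock speeds up the serial part too; doubling cores only helps the parallel slice.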

>Core count is CPU performance
It kinda is desu senpai

Attached: Multicore.png (422x460, 43K)

because there is no use for more processing power for "casuals" even with terribly inefficient web coding and HD videos
if only coders bothered to optimize then even much older cpus would be enough for everything

also multicore was a really significant jump and Intel's chips OCed real good to 5 GHz, but since then it's been pathetic 5-10% improvements, Intel got lazy

You guys do know that the Core architecture is just a continuation of the P3 after Intel realized that NetBurst was dumpster juice and getting it up to 10GHz or whatever crazy speed it was they had in mind would require an insane amount of power and cooling to match, right?

>10GHz
Lmao those dumbasses wanted to get the shit to 20GHz, maybe if the 1771w cooler that doesn't matter would've been available at the time, we would've seen that.

>1771w cooler that doesn't matter
made me laugh, thank you user

>2008
>open facebook, youtube and 4chanz
>500MB ram usage at most

>2019
>open facebook, youtube and 4channel
>4GB ram usage

Attached: 1555207268266.gif (300x300, 1.2M)

also the architecture and its legacy are holding everyone back; all the progress has been in GPUs
and ARM has probably progressed more too

What you're arguing is that Intel held back innovation because they had a monopoly - I'm not claiming this is false (in fact, it's probably true). I'm saying there *wouldn't be a monopoly* if there was still a single, easily understandable way to increase core performance. That was the promise of Moore's law, and it is a promise that has now run dry in every parameter except maybe cost and power consumption.

Is the max die size of the process limiting the number of cores? If not, then core count is just a measure of what the processor company feels is most profitable to produce (since a bigger die means lower yield and throughput, which means higher cost). That has nothing to do with technological innovation; it is purely a business decision. In the era when multicore was just getting started you'd have a valid point, but that is no longer the case.
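
On the bigger-die-means-lower-yield bit, the usual back-of-envelope is a Poisson defect model: yield ≈ exp(-D·A) for defect density D and die area A. The defect density below is a made-up illustration value, not any foundry's real number:

import math

D = 0.2  # defects per cm^2, assumed purely for illustration

def die_yield(area_cm2: float, defect_density: float = D) -> float:
    # Poisson model: probability a die of this area has zero fatal defects.
    return math.exp(-defect_density * area_cm2)

for area in (1.0, 2.0, 4.0, 8.0):
    print(f"{area:4.1f} cm^2 die -> ~{die_yield(area) * 100:5.1f}% yield")

Yield drops off exponentially with die area, so past a point core count really is an economics call, not a technology one.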

It's not about how long your dick is, but how much it stretches them walls

>implying women don't want dicks in their cervix-pussy

Attached: 1543285382450.jpg (250x250, 24K)

> 20GHz @ 1771W
Kek, I wish they'd done this just to give a giant middle finger to physics, they'd probably need like liquid helium or something to keep it from melting

this

Windows Aero

There hasn't been a fundamental revolution in computing use cases since the Internet. Hardware improved, but in the end we're still throwing the same fundamental algorithms and data at it and past a certain point you just don't notice something that took 2 milliseconds to execute now taking only half a millisecond.
Yes, many of us had access to Wikipedia in 2005 as well.

> One CPU for moar cores, one for really fast clocks

Damn, I didn't know I wanted this but now I really want this

>run dry in every parameter except maybe cost and power consumption
don't forget security. We hit the performance ceiling, but not without cheating.

5ch

I lived it, thanks. I had several 486-P4 machines. Clock for clock, the P3 ate the P4's lunch, and everyone outside Intel's Marketing department knew it.

I'm not really sure where you got the impression that I was denying that, but okay. I was more poking fun at you for trying to be a know-it-all.

>Firefox and Chrome regularly take up over 4 GB of RAM
How the fuck do you people do this? You'd have to have like 30 tabs open.

A single blank tab does it for me. If I start up another program that is a memory hog, no issues. So what I think is happening is that Firefox requests X amount of RAM but doesn't actually use it, just keeping it around just in case, I guess, until another program with a higher priority asks for it.
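
Part of what you're describing is the gap between memory a process has merely reserved (virtual size) and what it's actually holding in RAM (resident set). If you want to peek at both, here's one way using the third-party psutil package; the choice of psutil is mine, nothing Firefox ships:

# Print resident (rss) vs virtual (vms) memory for every running Firefox process.
# Needs psutil installed: pip install psutil
import psutil

for proc in psutil.process_iter(["name", "memory_info"]):
    name = (proc.info["name"] or "").lower()
    mem = proc.info["memory_info"]
    if "firefox" in name and mem is not None:
        print(f"pid={proc.pid:6d}  rss={mem.rss / 2**20:8.1f} MiB  vms={mem.vms / 2**20:8.1f} MiB")

rss is what's actually occupying RAM right now; vms counts everything the process has mapped or reserved, which is usually the scarier-looking number.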

My point was that a lot of the 12 year-old Fortnite addicts on Jow Forums probably did not know Core's heritage, and certainly weren't browsing Wikipedia before they were even born.

Moore's law wasn't a one-off phenomenon; it was a consequence of a new field of knowledge being dictated by market forces. Eventually there will be a common mass-produced design, like bicycles.

it'll happen when we figure out fusion reactors too

how do you know this?

>quad-core CPU that barely clocks over 4.5 GHz and has the IPC of a potato is still more than enough today
Ah yes. I, too, greatly enjoy waiting for things.

>IPC of a potato
>still beats ryzen when slow memory is used

Attached: 1485463825623.jpg (125x123, 3K)

>still beats Ryzen under this specific condition if you chose parts like a retard when building your PC
So don't use slow memory with Ryzen then.

>let me cripple a processor's performance by not using decent ram, then say that it sucks because i'm not using decent ram
>let me also make such a blanket statement about every single ryzen processor
>cores don't matter
>price of system build doesn't matter
here's your (((you)))

>Yes my main CPU is a 2600k, how did you know?

Attached: gigachad.jpg (1080x1331, 97K)

> Still on P4
Mad respect, my man

Attached: Boomers95.png (380x349, 180K)

liar. you're using an i7 right now, 8gb of ram and a sub $300 gpu. all on a win10 system.

The main reason is that a few companies have full control over the national infrastructure. So we all have shit internet because it's priced to get the most money out of you. It's also kinda reflective of the people that use the internet: if we give you 10 Gbps for $100-200 a month, what are we going to charge the boomer that barely needs 5 Mbps to send emails and post on fb? If I'm a company that wants to make money off infrastructure I already have, I'd make 10 Gbps seem like a myth.

>You'll obviously be able to use your 5 year old CPU to run current software when current CPUs are barely faster.

I have two machines here, my i5-4570 and the one I'm typing on, a Xeon 3220 (aka C2Q 6600). I do 90% of my client work on the i5 and everything else under the sun with my Xeon. The Xeon would have been bleeding edge tech 12 years ago and I still use it today just fine.

Pic related, my Xeon/C2Q machine still up and running just fine.

Attached: Screenshot from 2019-04-15 21-12-58.png (1920x1080, 1.7M)

Exactly. The browsers are written by brainlets because I'm pretty sure they're requesting memory from the OS and never releasing it until you quit the browser. Perhaps this has changed but a few years ago, the operating system (OS X, yeah, yeah. Applefag BTFO.) actually popped up a message box saying that the system is out of RAM. I've got 16 GB of RAM and hundreds of GB of available space on the SSD for a swap file if the OS needs it. Nope. Due to a combination of OS X being written by faggots like Tim Cook and browsers just sucking up RAM even if they're not using it, I actually ran out of RAM.

I hadn't seen an out of RAM error since the MS-DOS days and have never, ever seen it on a Unix that can use swap files to create virtual memory.

Ivy Bridge and Haswell should never even have been made. We could have gone straight from Sandy Bridge to Skylake and started increasing cores for the i7s.

what use are those on the side for you? do they play an active role in your decision making? embrace minimalism, and let it go
download more, ezpz

>They aren't really being pushed to innovate very fast anymore
Because the cost of R&D has skyrocketed. There are new breakthroughs in manufacturing that have taken years to develop. The industry didn't think they would be stuck on 193 nm lithography for so long before shifting to EUV - we're finally going to start seeing EUV chips enter production. It's incredibly expensive and difficult to scale. AMD's chiplet approach is the next step for scaling multicore chips. It has much better yields, though it's still an interconnect and cache coherence nightmare. We'll likely see 3D stacking within the next decade. Until we move off silicon, expect progress from now on to be slower.

The only companies licensed to produce x86 processors are Intel, AMD, and VIA. Maybe not a monopoly but absolutely an oligopoly. If there was more pressure we would absolutely have better processors today.

>what use are those on the side for you

what are you referring to?

heat

Transistor density hadn't peaked

The stats on cpu freq, temps, usage etc

I like to keep an eye on the state of the machine. Gkrellm has been handling that for me for the last 19 years across every distro I've ever used since Mandrake 6.1.

Nice pajeetOS, wintoddler.

>AMD’s Zen 3 CPUs can increase transistor count by 20% over Ryzen 3000 thanks to TSMC’s 7nm+ node
pcgamesn.com/amd/zen-3-cpu-tsmc-7nm

that's a really nice neofetch

/thread

I like Linux, user, but you have to admit Windows does real work (on the desktop) and basically Jow Forums is just thinkpad shitposting.

they really really really dont

hahahahahaha

Maybe try to play ARMA

>AYYMD IS GOOD TOO RIGHT GUISE?

Attached: 2f7.jpg (601x508, 31K)

Intel focused on power saving instead of performance
Turns out that most normalfags are happy with a celeron so marketing faster CPUs makes no sense

The fuck was in the itanium?

Intel knew back during the Pentium III days that the fuckton of x86 instructions would hold everything back; that's why, starting with the P6 arch, the CPUs all turned RISC internally and CISC instructions were turned into micro-ops to run on the RISC cores