Why are we going back to terminals

and why is everything a mainframe in the cloud?

Is it because personal computing has peaked? Desu, my 2008 4.5GHz Core 2 Duo, which I did push to 5GHz (it ran at that for like 2 hours before it died), isn't really that much slower than an 8350K at 5GHz.


WTF is going on... I know the IPC of 8th gen is a fair bit better, but in BF4 it only translates to like 20% more FPS.


WTF is going on? Is Intel's chip stacking going to save us? Are programmers just hoarding multicore programming knowledge? Is it some conspiracy?


Obviously big companies would rather charge us $15 a month for some cloud program than have something running on our own machines.

Attached: Early_Macintosh_Prototype_Computer_History_Museum_Mountain_View_California_2013-04-11_23-45.jpg (2592x1936, 1.66M)

The Core 2 I pushed to 5GHz for a few hours had a default clock of 2.8 or something, I forget the model. It ran at 4.5GHz for like literally 8 years though.

>Desu, my 2008 4.5GHz Core 2 Duo, which I did push to 5GHz (it ran at that for like 2 hours before it died), isn't really that much slower than an 8350K at 5GHz.
HAHAHAHAAH

desu I run my Q9505 at 2.004V

>muh meg...gigahurtz! xD
Yes, parallelization and cache are the next big things for improving performance; it's software that's lagging behind (see the sketch at the end of this post for what multicore code actually takes). Clock rate peaks around 5GHz on average but means little on its own: there are chips that run at 3GHz and still beat 5GHz ones in single-core performance.
Better fabs, higher clock rates, more efficiency, bigger caches, more cores. CPU development is going faster than ever again since competition in the market is at a peak again.
Games are the wrong thing to use for comparison, since very few are well optimized, and the ones that are use that optimization to allow for more detail/graphics.

The next big things will be Intel's first 3D chip and AMD's next 2.5D chip, and after that probably AMD's 3D chip if Intel still lags behind, probably in the early 2020s.
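
And no, it's not a conspiracy: extra cores only help when software explicitly splits its work across them, and a lot of code never does. Here's a rough, minimal C++ sketch of what that splitting looks like, using a toy workload (summing a big array) purely for illustration:

[code]
// Minimal sketch: extra cores only help if the program divides its own work.
// Sums a big array once on a single thread, then again across all hardware threads.
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    const std::size_t n = std::size_t{1} << 26;        // ~67M elements
    std::vector<std::uint32_t> data(n, 1);

    // Single-core version: one thread walks the whole array.
    std::uint64_t serial =
        std::accumulate(data.begin(), data.end(), std::uint64_t{0});

    // Multicore version: each thread sums its own slice, results merged at the end.
    unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::uint64_t> partial(workers, 0);
    std::vector<std::thread> pool;
    for (unsigned t = 0; t < workers; ++t) {
        pool.emplace_back([&, t] {
            std::size_t begin = n * t / workers;
            std::size_t end   = n * (t + 1) / workers;
            std::uint64_t sum = 0;
            for (std::size_t i = begin; i < end; ++i) sum += data[i];
            partial[t] = sum;
        });
    }
    for (auto& th : pool) th.join();
    std::uint64_t parallel =
        std::accumulate(partial.begin(), partial.end(), std::uint64_t{0});

    std::printf("serial=%llu parallel=%llu threads=%u\n",
                (unsigned long long)serial, (unsigned long long)parallel, workers);
}
[/code]

Build with something like g++ -O2 -pthread. Both paths give the same answer; how much the threaded one actually gains depends on how memory-bound the workload is, which is exactly why "just throw cores at it" isn't free.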

>going faster than ever again
>5% real-world improvement in a year if you're lucky
>vs real-world performance literally doubling year over year in the 90s
Pull the other one.

Be thankful your teacher is not making you go back to oscilloscopes, wanker.

Concentration of power/data. The owners of the servers will be the gatekeepers to knowledge.

The 8-core Ryzen is 75% faster than my 5-year-old 4770K. Meanwhile, 2600K OC vs 7700K, both 4c/8t and 5 years apart, and virtually no difference.

Except power efficiency.
Plus, you can't compare two different microarchitectures from different companies; AMD was behind in the game for a long time.

Bloat is giving the impression that personal computers are "not enough" to handle software these days.

Attached: gates.png (612x688, 621K)

Because remotely accessing another machine doesn't need GUI overhead.

Our reptilian overlords are addicted to spying on us and collecting every bit (haha) of data on us, to the point where they measure the thickness of our turds.

>We stopped making gains where it counts, so we'll trivialize their importance and focus on the lesser gains we have made, to try and hide that this shit is really the same as ten years ago with bells and whistles added.
Fuck Intel and fuck AMD; you've been scamming us for a decade and a half.

>competition is at a peak
>three-company oligopoly with heavy regulation preventing anyone else from joining in a profitable manner, on purpose, because of the oligopoly's lobbying power.
Fuck off.

>reptilian overlords
disinfo detected

Processors are so fucking good that they literally can't be fed data fast enough to process it as quickly as they could, so they sit idle like 60% of the time. This has been the case for decades.
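
A rough C++ sketch of that, just as illustration: the loop below does the same number of additions either way, but shuffling the access order turns cache-friendly streaming into a cache miss on nearly every read, so the CPU spends most of its time waiting on DRAM instead of computing.

[code]
// Minimal sketch of the memory wall: identical work, very different speeds,
// depending only on whether the CPU has to wait on memory.
#include <algorithm>
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const std::size_t n = std::size_t{1} << 24;   // 16M entries, far bigger than any cache
    std::vector<std::uint32_t> idx(n);
    std::iota(idx.begin(), idx.end(), 0u);        // idx = 0, 1, 2, ...
    std::vector<std::uint32_t> data(n, 1);

    auto run = [&](const char* label) {
        auto t0 = std::chrono::steady_clock::now();
        std::uint64_t sum = 0;
        for (std::size_t i = 0; i < n; ++i) sum += data[idx[i]];
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      std::chrono::steady_clock::now() - t0).count();
        std::printf("%s: sum=%llu in %lld ms\n",
                    label, (unsigned long long)sum, (long long)ms);
    };

    run("sequential");   // reads data in order: prefetcher and caches keep the core fed

    std::shuffle(idx.begin(), idx.end(), std::mt19937{42});
    run("shuffled");     // same reads, same sum, but almost every access misses the cache
}
[/code]

Same instructions, same result, but on a typical desktop the shuffled pass comes out several times slower, purely because the core is stalled waiting for data.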

Wrong. It's because applications continue getting more complex while processing power has reached a plateau due to energy constraints on mobile devices. This calls for cloud processing.

Attached: 1429912232149.jpg (720x439, 47K)

To tell the truth, I left out: *depending on the application.

For the same reason the PC became popular, you won't see any movement away from those CPUs anytime soon in desktop use.
They will only decrease in popularity when desktop computers fall out of favor with normies.

Yes, competition is at its highest point in the last 10 years, since the last time it had any relevance. We are talking about recent x86 here, not other architectures or companies from the '90s and '00s.

Uselessly complex*
It's also known as bloat.

>applications continue getting more complex

*inhales*

HAHAHAHAHAH!!!!

Attached: Lucas Hershlag Pointing Laughing.jpg (500x426, 84K)

>bought a, "AMD Phenom II X6 1090T Black" and built a PC nearly 10 years ago
>haven't updated it since, don't even remember what graphic card I'm using.......ah, "2048MB ATI AMD Radeon HD 6900 Series" 6950 I think?
>play latest games on full graphics (not modded) and everything works fine
>bloated web browsers slow it down now (wtf firefox you used to be fast and you do nothing new)

Everything is stagnant and bloated like a dead hog in the sun. There was a time when I had to buy a new CPU and GPU every year to play the latest games. On one hand, I'm glad I don't need to do that now. On the other hand, that says a lot about how bad things really are.

Name these other companies replacing the obvious oligopoly. Not happening.