
Remember when you could spend $3,000 to $5,000 on a high-end machine and two years later it would be a hopelessly obsolete piece of shit because the new hardware was an order of magnitude more powerful?

Whatever happened to that?

Attached: hqdefault.jpg (480x360, 10K)

Other urls found in this thread:

cnet.com/news/nvidia-cuts-prices-on-gtx-260-280-graphics-boards/
forbes.com/sites/jasonevangelho/2013/10/28/nvidia-issues-aggressive-33-percent-price-drops-for-geforce-gtx-780-770-graphics-cards/#6ddf1d5a1a17

Overall cost, blame IBM, they fucked it all up.
Quick advancements, blame no competition, shitty consumers enabling it, predatory practices removing competition from the scene (Cyrix, VIA, NEC, etc.)
De facto monopolies with no challengers. This is how shintel managed to only get 5% gains every year for the past 8 years.
Novideo at least makes better shit but holds back until they have competition to actually price gouge, or when they're beat, slash their prices by $150 to maintain marketshare, because they have infinite money to burn, and retards keep buying their shit even when other stuff was better. (cnet.com/news/nvidia-cuts-prices-on-gtx-260-280-graphics-boards/)
Also, shitty OS monopoly (ok, ok, duopoly) forces software to be written for a shitty OS, new generations of retards who don't care about how underlying things work making bloated dogshit in java, etc.
I'd go on, but it gets too depressing.

CPU-wise, Intel not having any serious competition stifled advancement for a while, but now it's finally gotten interesting again.
Not an expert when it comes to GPUs, but competition in that market doesn't seem too great either.

> Whatever happened to that?
You are a single desktop user with few needs, computers just got fast enough for you & personal computing is strongly focused on smartphone-type computers anyhow, which are still not really up to par with even 10 year old computers.
This isn't a jab at these amazing low-power and compact machines, but it partly explains the situation: a desktop that is 10 years newer offers a lot more power than any single home user actually needs.

The recent high end still "obsoleted" the 2-5 year old high end if you dearly needed the processing power.

Competition is irrelevant for GPUs.
Even when AMD had superior products at lower prices, damn near no one bought them, because Nvidia's marketing is as strong as AMD's is terrible.

Things have still been improving, it's just that everything other than the CPU is what improved.

>amd
>not ATI
>not 3dfx
fuck off newfag
forbes.com/sites/jasonevangelho/2013/10/28/nvidia-issues-aggressive-33-percent-price-drops-for-geforce-gtx-780-770-graphics-cards/#6ddf1d5a1a17
don't forget this one. the same thing happened AGAIN

Attached: pc-9800.jpg (1200x900, 123K)

so many words for so little understanding
other than more detailed games, higher resolution cartoons and whatever trendy fad technology is working you up into a lather this month, you’re doing the same exact shit you were doing 20 years ago, with algorithms that were designed 40 years ago
the few applications that do take advantage of faster hardware are getting fucked by trendy bloat, and also by the laws of physics causing the hardware itself to hit a wall in advancement

it’s not all some fucking ebil corborate consbiracy bullshit no matter how easy it is to blame it on that when you’re too stupid to understand reality and you just want something to feel victimized by for the sake of it

AMD is positioning its Ryzen line as future-proof in the literal sense. Technological advancement in processors is dead.

>Whatever happened to that?
sandy bridge happened

A part of it was Windows XP's long run, and for a while they couldn't think of new ways to bloat stuff up.

Mah mate, most AMD GPUs are overengineered, lotsa shaders and quite good specs, but devs always fucking optimize for Nvidia. So yeah, there aren't many developers embracing DX12 and Vulkan, where AMD shines, and by the time they do, Nvidia will have a workaround for the problem it has with Maxwell and Pascal, where in most cases it actually performs worse than in DX11. All in all it's all about dem moneyz.

Attached: 1524877306425.jpg (1024x1018, 78K)

>Whatever happened to that?
2 decades

Because evolutionary progress slowed down for technical reasons. And there is no point in expecting anything revolutionary from anyone, since normies don't care that processors barely gain 5% performance gen-to-gen.

3000 dollars wasn't a lot a few years ago.

>Whatever happened to that?
amd fucking up for almost a decade and leaving nvidia and intel to do nothing.

At least Ryzen is goat and Vega scales down well (doesn't scale up well, obviously)

>Overall cost, blame IBM, they fucked it all up.
The manufacturing process requires more precision now and has become more complex
>Quick advancements, blame no competition, shitty consumers enabling it, predatory practices removing competition from the scene (Cyrix, VIA, NEC, etc.)
According to you, more market fragmentation is a good thing. These manufacturers couldn't stay competitive in the face of change. And even if they stayed, what would be their purpose? Yet another x86_64/ARM CPU manufacturer? Everyone would pick the best option on the market regardless and they'd go bankrupt anyway.
>De facto monopolies with no challengers. This is how shintel managed to only get 5% gains every year for the past 8 years.
We've reached a plateau and need a new manufacturing process for the advancement to be kickstarted again. Going beyond 16 cores/4GHz with current tech isn't worth it.
>Novideo at least makes better shit but holds back until they have competition to actually price gouge, or when they're beat, slash their prices by $150 to maintain marketshare, because they have infinite money to burn, and retards keep buying their shit even when other stuff was better. (cnet.com/news/nvidia-cuts-prices-on-gtx-260-280-graphics-boards/)
What an argument. They're a company, their goal is to make money.
>Also, shitty OS monopoly (ok, ok, duopoly) forces software to be written for a shitty OS,
Linux is too fragmented to be accessible, and I'm pretty sure you'd be whining too if everybody used Ubuntu instead of Windows.
>new generations of retards who don't care about how underlying things work making bloated dogshit in java, etc.
Hey grandpa, nobody programs in Java outside of enterprise anymore

If software efficiency improves alongside hardware, we should reach a point where it doesn't make sense to upgrade hardware for decades.

Are you implying money used to be worth *less* before, you fucking retard? Ever heard of inflation?

fuck off you child, do you know what inflation is?

>you must be over 18 to post here

I'm pretty sure he meant that you could buy more with 3000, therefore it wasn't a lot. Now you can't buy much with 3000, therefore it's a lot of money. Like when you look at something, see the price tag and think "that's a lot of money". Back then, seeing the same price tag wasn't that big of a deal, since money was worth more and you could afford it more easily. It's a bit of backwards retard logic, but it sorta makes sense if you think about it long enough.
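A quick illustrative calculation (using an assumed ~2.5% average inflation rate, not actual CPI figures):

# assumed average inflation rate; purely illustrative, not official CPI data
years, rate = 20, 0.025
print(3000 * (1 + rate) ** years)  # ~4916: a $3,000 price tag from ~20 years ago is roughly $4,900 in today's money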

>According to you, more market fragmentation is a good thing

Are you seriously implying that *less* competition is a good thing?

I'm not even trolling, I'm genuinely curious. Are you serious?

Attached: 1520708229878.jpg (460x350, 19K)

>2 main x86_64 CPU manufacturers
>30 different sockets
>can't upgrade CPU without getting a new motherboard unless you were one of the 0.1% who got a mobo update
>old CPU makers return
>8 main x86_64 CPU manufacturers
>30^n different sockets
etc. etc.
Competition in the ARM world works slightly better, but x86_64 is a clusterfuck and it's going to stay that way until all the CPU makers agree to use a single socket, which is never going to happen

nobody bought itanium so nobody bothered fixing the compiler for itanium

isn't the only push in hardware advancement for vidya and ai? 15 yr old computers run most shit fine, the bloat has increased, so it's a false slowdown? i honestly don't trust a reasonably priced system bought right now, everything runs like shit out of the box: phones, tablets, and computers

Mostly it's datacenters, the advancements just trickle down to consumers as well.

So are you saying that it would be better if there were just one company?

We aren't talking about fantastic hypotheticals of everyone using the same socket, user. Are you saying things would be better if there were fewer CPU manufacturers?

Attached: 1519290754608.jpg (562x530, 62K)

what good is the advancement when its only purpose is to lock people into planned obsolescence and an information gulag? they sell all those nifty buzzwords on cpus, ram and gpus, but firmware and backdoors kill the pure power

>Remember when you could spend $3,000 to $5,000 on a high-end machine and two years later it would be a hopelessly obsolete piece of shit because the new hardware was an order of magnitude more powerful?

What era are you talking about? Before the 90s it was features and software you paid for. After the 90s, with Microsoft and Apple in the field, we had established standardised features and software, and then performance became the huge factor. I'm not aware of a time when a 3k-5k machine was obsolete after two years, so if you could please provide dates, specify hardware and price, and give an example of hardware released two years later that made it hopelessly obsolete. Performance has always been gradual, 5-15% at most annually, and for the last 15 years it's only been 5%.

Failing to do so will mean you're a newfag underage b&
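For a sense of how those growth rates compound (a back-of-the-envelope sketch; the doubling-every-18-months figure is the usual Moore's-law-era rule of thumb and an assumption on my part, the 5% figure is the one cited above):

# illustrative compounding of performance gains; assumed rates, not measured benchmarks
years = 8
print(2 ** (years / 1.5))  # ~40x over 8 years if performance doubles every ~18 months
print(1.05 ** years)       # ~1.48x over 8 years at 5% per year
# over just 2 years: 2 ** (2 / 1.5) ~= 2.5x vs 1.05 ** 2 ~= 1.10x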

No, I'm saying we don't need more than we already have. It's too much of a clusterfuck already and introducing more competition where each company would desperately try to lock out the other ones from its tech in an attempt to seize a part of the market would create a compatibility nightmare and be a pain in the ass for the consumer.

With GPUs it's different because they all use the same slot and it all just werks, all you need is hardware-specific drivers
If the CPU world was like this then I'd be more than happy to see more competition

>1998 PC, fucking unbearable by replacement time in 2005
>2005 PC, fucking unbearable by replacement time in 2009
>2009 PC, still going strong in 2018
On one hand, it would be nice if new PCs were faster. On the other, I love this computer, and plan to use it at least until its 10th birthday.

performance increases exponentially but so does development "difficulty"

But a 2006 machine was good until the mid-2010s.
Core2 and the 64 X2 were pretty big leaps over the single-core offerings.

>2005 PC, fucking unbearable by replacement time in 2009

Disagree. My dad gave me his old T40 (2003 machine) when I started Grade 9 in 2008, and I used it until I started university in 2012. With the 1400x1050 screen, 1 GB of RAM and a faster 7200 RPM hard drive it was perfectly usable that entire time. By the time I replaced it with a T430, it was already 9 years old, and only struggled a bit with the more faggy resource-intensive websites. Was totally fine for 95% of what I needed it to do.

If you're referring to gaming PC upgrade cycles though, I agree. My 2005 Pentium 4 rig was completely useless by 2009-2010. Which was probably good, because not being able to play the latest games incentivized me to study and get into a good school.

Little of both. It was pretty damn slow for general use, and at games it was just awful, even with an upgraded GPU. I remember playing the Batman Arkham Asylum demo in slow motion. Didn't realize it wasn't supposed to be slow like that until I got the new PC. It also had a tendency to overheat to the point of shutting down in hotter weather, because P4, prebuilt cooling system, and the aforementioned upgraded GPU. I'm not actually sure if the HDD was 7200rpm or 5400, so that could definitely be a factor.
>tfw my dad had a similar Thinkpad, but had a security clearance, so no way he could give it to me

competition isn't the magical cure-all Jow Forums and gamer faggots probably told you it was. fragmentation was shit and it ended because it was shit: rampant vendor lock-in, zero standards, unnecessary incompatibility, lack of stability. there are a lot of disadvantages that come from having 30 companies desperate to carve out a niche, and they make having to pay $50 extra for an upgrade to a gaming shitbox you didn't actually need sound pretty tame

Oh man, nostalgia hit me hard. My 2006 athlon 64x2 6000 with upgraded geforce 9800 sounded like a jet engine when starting up. I got addicted to eternal upgrading after the 9800 got toasted playing Witcher 2. Fuck it was a happier time not thinking about hardware and just playing what I could.

amd / ati never recovered from the hd 2000 series. the 2900 really fucked amd over, and it's sad because the hd 2000 series was 100% ati, before the amd acquisition. when amd finally recovered in the mid-range with the 4000 series and the high-end with the 5000 series, everyone was already knee deep in nvidia's cock. see the 400 series vs the 5000 series: nvidia only had a small ass ~10% lead with the 480, while using more power, running hotter and costing more, but nvidia still sold more 480's / 470's than amd sold of their entire 5000 series. can't even say drivers, as drivers between the two at the time were nearly identical, until nvidia did their multithreaded driver in 2011 for a free performance increase in single-thread-bound games that amd was too retarded to do, and amd entered that toxic funk of only 4 drivers a year for two years straight that brought horrible driver stability during the r-200 series.

nvidia fans not only didn't care about the heat and power but wore it as a badge that they could cook an egg on their 480. they didn't start caring about power consumption until the 600 series, when nvidia gutted out half the hardware in their gpus and moved it over to software, such as the scheduler. amd had a very competitive line with the 7000 series: first to release the new generation of cards, and had the faster single gpu with the 7970 ghz edition by like 10%, but it used 30 more watts than the 680. but now people "cared" about power consumption because nvidia cared. didn't even matter that amd had sweet spot cards like the 7950 or the 7870 with great power-perf-price ratio. nvidia had better power consumption with the 680 vs the 7970 ghz edition and therefore amd was garbage.

the cpu market, on the other hand, well, intel and amd ran into limitations with silicon. you can only shrink the process so much before you run into too many diminishing returns. it's why they're now focusing on higher clocks and MOAR CORES! you can't just cram in more transistors for free ipc gains anymore.
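which is also why MOAR CORES only takes you so far. a rough Amdahl's law sketch (my own illustrative numbers; the post above doesn't mention Amdahl by name) shows that extra cores only help the part of a program that actually runs in parallel:

# Amdahl's law: maximum speedup when only part of a program can use extra cores
def amdahl_speedup(parallel_fraction, cores):
    return 1.0 / ((1 - parallel_fraction) + parallel_fraction / cores)

# if 80% of the work parallelizes, core count hits a wall fast:
for cores in (2, 4, 8, 16, 64):
    print(cores, round(amdahl_speedup(0.80, cores), 2))
# 2 -> 1.67, 4 -> 2.5, 8 -> 3.33, 16 -> 4.0, 64 -> 4.71, capped at 5x no matter how many cores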

Attached: 1486342337991.jpg (1816x1710, 861K)

The traditional way of doing processors is hitting the limit. x86's domination doesn't help.

600 vs. 7000 series was really weird to read about as I was getting into consumer hardware after a five year pause. Nvidia fans in every fucking hole praising the slightly lower TDP and the 2GB VRAM of the 680, and both "sides" just acting like children. First contact with GPU fans. Eugh.

>market concentration is better because I don't understand things
Great non-arguments, retard

Attached: ahwa.jpg (2420x2480, 160K)