Hardware moves so slow these days

So in late 2013 I went and decided to build a beast gaming PC.

Ended up with a 4770k, 780 ti, 16gb of ram, 1tb evo ssd, etc. I'm just surprised that almost 6 years after I built this thing, I'm not compelled at all to replace it with what's out there currently. It still plays modern games in 2019 just fine with some settings tweaked, and I find I really don't game much like I used to; when I do it's mostly older stuff anyways.

Overall though it's kind of sad looking at the progression in hardware over the last 6 years compared to the past. If you compared top of the line hardware in 1993 to 1999, there was a huge jump. 1993 hardware wouldn't even be able to run a game like Quake 3 in 1999. Same with 2003 to 2009: you couldn't run a game like GTA IV on a top of the line 2003 machine.

I think I'm going to see how 2020 plays out in the hardware space and make the jump to 4k then. Is 4K worth the jump? Looking at the current landscape right now the only thing my machine can't handle is 4k and Minecraft RTX.

Attached: download.jpg (337x150, 9K)

Other urls found in this thread:

gpu.userbenchmark.com/Compare/Nvidia-GTX-780-Ti-vs-Nvidia-GTX-1070/2165vs3609
anandtech.com/show/14590/vesa-announces-displayport-20-standard-bandwidth-for-8k-monitors-beyond
anandtech.com/show/14447/pcisig-finalizes-pcie-50-specification
anandtech.com/show/12095/hdmi-21-specification-released
twitter.com/IntelGraphics/status/1167622125412392960
lists.freedesktop.org/archives/amd-gfx/2019-July/036727.html

I went from a 3770K system to a 9900K system and you are right that the jumps are not drastic, at least not for the average user (like me). When I upgraded from some Core 2 Duo to the 3770K it was a similar span of time as going from the 3770K to the 9900K, and it felt MUCH more drastic.

Since I'm a boomer I like to think of it like going from a regular car to a sports car, and now I've upgraded from a sports car to a better sports car but the speed limits are still the same.

I just replaced my 4790k/980ti with a 9900k/2080ti and it was worth it

i'm just going to vent here that in the 90's my childhood was ROBBED by nintendo, I saw a gameboy pocket and just somehow assumed that this thing was the source of all games, until I saw the next nintendo unit, and the next one
and I assumed for some reason that people only made (((educational games))) for pc like freddie fucking fish or whatever it was called, and I had no idea what doom, quake, starcraft or anything even was.
I didn't play my first pc game until the slow retards at microsoft finally ported halo CE to windows and by then it was only LAGGING

The prices of top of the line hardware are really outrageous compared to the past as well. Both Intel and Nvidia are really price gouging these days.

I'm sure it is, I'm just saying the jumps in performance and new stuff just isn't nearly as exciting as the jumps relative to the past were.

god fucking damnit

Attached: 1564025258476.png (300x298, 210K)

Finally, a fucking good thread on Jow Forums, I've been waiting weeks for this.
That aside, maybe this hard wall in consumer technological progress is due to a lack of competition? AMD really isn't doing that well in the GPU area (maybe the supposed intel GPU will do something, but I doubt it)

Same here. Still using my 4770k+R9290. Dark ages of computing.

I feel like AMD hasn't released a decent gpu since the 7970. Looking at their top end offerings today, it's pathetic really, and it has allowed Nvidia to price gouge their cards.

It's unironically a good thing brainlets. Less upgrading and devs forced to optimize their games

Last time I felt a notable CPU performance difference was when I upgraded to a Core 2 Duo E8400 long ago

Attached: 1552578436059.png (419x421, 194K)

>Is 4K worth the jump?
For the stellar crisp and clear fonts?
Definitely.
For the impeccable looking visuals?
100%
For Gaymen?
Fuck no. Even 2K is a struggle.

Going back to the q6600 in my parents' desktop in the living room: I swapped the drive for an ssd, and yeah, not much difference for daily tasks. Maybe for some gaming or heavy video editing.

Agreed. Pic related is left to right: 1997, 2003, 2008

Attached: 2019-08-30 14.28.48.jpg (1331x998, 380K)

I finally upgraded from sli 760 to a 2070 super. Took me years to do it

>devs forced to optimize their games
We both know that's not happening in the slightest.

Attached: 1528774550634.jpg (520x519, 76K)

Their position in the market is more solidified now, so it's natural that if you're buying components from them you're paying somewhat of a premium these days. Worse is the artificial reduction in features to create a segmented lineup of products, like the 9700K being the same as the 9900K only with HT turned off.

I have a 4K 60hz monitor and a 155hz 1440p monitor and would strongly recommend the latter.
Only for AAA shit. While my 1070 was across the country I had a temporary 2400G machine and the fucking integrated GPU could run things like the original Payday and a lot of other shit I play at 4K medium settings. And if someone mostly plays indie shit, 4K is achievable with basically anything. The idea that you need something crazy for 4K is a meme. Most people play esports shit like CS:GO, Overwatch, etc. I'd say a 1060 could easily handle those at 4K.

I've got almost the same specs: i7 2600, 16gigs ram, 780Ti, 256GB Evo.

>right now the only thing my machine can't handle is 4k

I'm literally running dual 4k displays off of this card, but I only play shit games like CSGO, Minecraft, WoT and GTA V. The card is not as bad for 4k as u might believe. I'm certainly not gonna upgrade anytime soon, this setup is still very good Imo.

Attached: tonk.jpg (685x641, 79K)

The biggest limitation I felt with the 780 ti is the 3gb of vram at 4k, but for the games you listed it shouldn't be an issue. I just want to be blown away like I was by the jump from my core 2 quad with a 5770 to my current setup.

Personally waiting at least a year before upgrading, but possibly longer. Hoping for a compelling reason to upgrade in the next year or so.

>I'm just surprised how almost 6 years from the date that I've built this thing, I'm not compelled at all to replace it with what's out there currently.
This is what poor people tell themselves about why they use shit hardware, drive the 30 year old beater, or live in the roach infested shack.

It has always moved slow. People were using Apple IIs from 1980 until 1990

Did you throw your computer away?

Are you 16 years old, faggot? The point isn't that I can't afford to. The point is that the hardware jumps aren't nearly as big or as exciting as they were in the past. The 4770k to 9900k and 780 ti to 2080 ti really isn't that impressive compared to the 90's and early 00's in a similar time frame.

Older people don't value novelties as much as they did when they were growing up. I still get excited for new hardware, but I can think of a thousand other ways to better spend that money that, say, a meme-tracing card would cost me.

Attached: religion.jpg (474x844, 107K)

Yeah, but even a 1070 shits on a 780 Ti. Especially noticeable in games.
I'd say CPUs have aged well, but GPUs age really quickly.

Depends what you do with your PC.
Going from a 3rd gen i7 to 9th gen is noticeable for example when you run emulators like Yuzu.

gpu.userbenchmark.com/Compare/Nvidia-GTX-780-Ti-vs-Nvidia-GTX-1070/2165vs3609

But it doesn't. I mean it's averaging 30% better frames in games and 44% better across all sorts of benchmarks. Great. But it's not exactly a mind blowing jump and it's not moving any game on the market from unplayable to playable.

5700/xt are pretty decent for their price, at least you're not overpaying for some useless memecores that tank image quality and performance when used.

>userbenchmark
Use something legitimate next time

Wait for the PS5 specs to be officially confirmed some time next year.

As long as your PC is significantly faster than the average console you'll be fine for the whole next game generation.

Attached: bork bork bork.gif (500x562, 1.06M)

But games for the first half of the generation are mostly just going to be ps4 ports

you really only need top of the line hardware at this point if you want higher refresh rates

if you're at 1080p/60hz you're still good for the most part. i'd probably upgrade that gpu to an rx 580 or something a little higher though.

keep telling yourself that. maybe one day it will be true [spoiler] it won't [/spoiler]

Not really. I paid $1000 for an i7 980. The fastest i7 or i9 now is a lot cheaper.

The Law of More performance is slowing or something.
Good, I only play PlayStation games.

Haha just kidding, More Law slowing down doesn't mean RISC-V and no proprietary software. It actually means all code and compilers in the cloud under lock and key, since you can't run it locally without a $5 trillion infrastructure.

Attached: tumblr_mnmk5vjW8E1r7amvbo1_400.gif (400x462, 96K)

>Gaymes

Except you can buy a card 40 to 50% faster for half of the 780 ti's price,
while the 2080 is almost 3 times as powerful as the 780 ti.
What exactly do you want? For top tier gear to be obsolete the very next year?

>1993 to 1999, there was a huge jump
It's almost like it's harder now. Hmmm big brain moments for OP. Fucking retard

I'm using a 2600k & 16gb of based Samsung ram & an ssd. Did purchase (steal) a gtx 1070 reference card tho.

Feels good not having spent money since 2011 on pc hardware

>i7 3930k
>dual HD 6970
>24GB RAM
>360GB SSD
>still runs everything I do just fine

I understand that there's a host of factors involved in the slowing down of the current hardware landscape, numbnuts. I'm just bummed that the speed and tech jumps seem to have slowed down compared to what I was used to during the 90's and 00's.

anandtech.com/show/14590/vesa-announces-displayport-20-standard-bandwidth-for-8k-monitors-beyond

anandtech.com/show/14447/pcisig-finalizes-pcie-50-specification

anandtech.com/show/12095/hdmi-21-specification-released

You should wait for 2020 anyway, next gen GPUs will support the new features as a baseline

i bought a 2nd hand dell t3500 2 years ago for 150 bucks. It is awesome how some 6c12t old-ass xeon cpu performs almost as well as a 2nd gen ryzen.

The i9-9980XE is like 1900 doll hairs

Yes he wants to be forced to (((buy))) a new gpu every year

Your 2013 pc can't run dx12 dxr rtx rt stuff though so there are games u can't run
T. 2080ti fag

2k looks nice on my dell u3011

>everyone ITT using gimped intel cpus

twitter.com/IntelGraphics/status/1167622125412392960

>Intel supports integer scaling
>Gimped AYYMD HOUSEFIRES doesn't

KILL YOURSELF AYYMDPOORFAGS
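
For anyone wondering what integer scaling actually is: it's just nearest-neighbour upscaling locked to whole multiples, so each source pixel becomes an exact NxN block instead of getting smeared by the bilinear filtering a GPU scaler normally applies. Rough sketch of the idea in Python (the frame-as-list-of-rows representation is purely illustrative, not what any driver actually does):

[code]
# Illustrative sketch only: integer scaling = repeat each pixel by a whole factor.
def integer_scale(frame, factor):
    """Upscale a frame by an integer factor with no filtering.

    Every source pixel becomes an exact factor x factor block, so e.g.
    1080p content on a 4K panel maps to clean 2x2 blocks and stays sharp.
    """
    scaled = []
    for row in frame:
        # Widen the row: repeat every pixel `factor` times.
        wide_row = [pixel for pixel in row for _ in range(factor)]
        # Then repeat the widened row `factor` times vertically.
        scaled.extend(list(wide_row) for _ in range(factor))
    return scaled

# A 2x2 "frame" scaled 2x becomes 4x4.
print(integer_scale([[1, 2], [3, 4]], 2))
# [[1, 1, 2, 2], [1, 1, 2, 2], [3, 3, 4, 4], [3, 3, 4, 4]]
[/code]

That's why emulator and pixel art people care about it: no blur, just bigger pixels.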

>intel shill posting intel propaganda

Anyone else had the hots for Freddi?

>More Law
Please tell me this is bait.

Attached: bio-moore_gordon_2005.jpg (262x262, 45K)

Not in my case. I went from 970 to rtx2070 and the jump in terms of resolution was massive.

In the late 90s top tier gear WAS obsolete in a year, that's how fast improvements were coming. Clockspeeds topping out around 2003 were the end of the crazy performance gains.

I'm still on my shitty 1080p monitor. I have a 1440p monitor at work, but even there I can see the pixels, which is why I'm desperately waiting for 120hz uhd. Does anyone know when the technology will finally be ready for it??

And that's a good thing. Literally the best time to be a pc gamer.

Going from a 780 to a 1080ti was a great upgrade. Going from a 4770k to an 1800X, not so much...

planning to go from a 3770k to a 3700x or a 9900k + a 2070s. this is very likely gonna be my last major platform upgrade, since as I'm getting older I really don't have as much of a desire to play games. I could maybe swap out some parts if they fail, but aside from that it's a crapshoot for me

>30% - 44% better performance in 3 years going from a high end product to a mid range product
>not impressive
You're dumb

>my PC is still very usable despite being 5 years old
Some people will complain about the damnedest things.

Freddie Fish is a good game tho.

Stale pasta

I played Freddi Fish. It was a little childish to me when it came out though.
I grew up on Midnight Rescue.

Attached: super-solvers-midnight-rescue_16.png (320x200, 3K)

sure?

Why would you do this?
>Going from a 4770k to an 1800X
The only noticeable gain between the two is multi-core performance, that's it.

This guy gets it. You could buy a computer for 2,000 dollars in the 90's and in 12 months buy a 1,000 dollar computer that was twice as powerful.
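
Putting rough numbers on that, using only figures already quoted in this thread: twice the performance at half the price after one year is a 4x jump in performance per dollar, while the 780 Ti to GTX 1070 comparison upthread (~40% faster at ~half the price, roughly 3 years later) works out to about 1.4x per year. Quick back-of-envelope in Python:

[code]
# Back-of-envelope: annualized growth in performance per dollar.
# The ratios below are the ones quoted in this thread, nothing official.
def perf_per_dollar_growth(perf_ratio, price_ratio, years):
    """Annualized perf/$ multiplier given total perf and price ratios."""
    total = perf_ratio / price_ratio
    return total ** (1 / years)

# Late 90s anecdote: twice the performance at half the price, one year later.
print(perf_per_dollar_growth(2.0, 0.5, 1))  # 4.0x per year

# 780 Ti -> GTX 1070: ~1.4x the performance at ~0.5x the price over ~3 years.
print(perf_per_dollar_growth(1.4, 0.5, 3))  # ~1.41x per year
[/code]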

test

i feel like AMD released a new generation of cards right as higher refresh rate and higher resolution monitors started getting cheaper, so there's a lot more reason for them to segment their cards heavily, mostly so they can keep price/performance the same while the bar keeps rising

the last card i bought was a 6870, and when i bought it for $200 it was a solid midrange card that could play games at 1080p on medium-low at 50-ish frames per second. even high end cards couldn't play things maxed out smoothly. nowadays a $200 card will. not at higher resolutions, but that also means you don't have that limitation slapping you in the face every time you use it. it's almost better than the way things used to be, even though it's blatantly anti-consumer

this was basically my adolescence, so looking at "8TH GENERATION INTEL CORE" doesn't make me feel like we're living in the future. it feels like some kind of extended purgatory where nothing has improved in an unambiguous fashion

this is only a likely outcome if people are stupid and go along with things that are blatantly against their best interest with no benefit to doing so

but i can see it happening to people like you that just regurgitate what you heard somewhere else without any real regard for making it understandable to other people, especially someone with a bad enough sense of proportion to pull "5 trillion" out of your ass

that's half the GDP of the united states you dipshit

>half

Attached: TRUMP.jpg (1000x548, 53K)

I built one in 2009 and still don't need to upgrade it. I recently stopped playing video gaymes so I will probably never upgrade now unless there's a failure. The main thing now is to make good hardware obsolete by introducing some gimmick shit for rendering 3D in video games. For everything else it really doesn't fucking matter. lol

Attached: Do-you-think-this-is-a-game.jpg (500x377, 37K)

hardware gets cheaper, faster, cooler, quieter. games get patched and discounted. there's really no downside to waiting.

it's a great day to make some data backups, test the batteries in your UPS, and call your parents for a chat.

>780 ti
Kepler has aged like milk though. You're right that 2GB GDDR5 GCN cards have held up very well, and any 4c/8t at 4.7 GHz since sandy bridge will still do just fine. I blame the duopolies for the stagnation. If VIA, IBM, Matrox, Silicon Graphics, 3DLabs, or 3dfx were still making desktop PC hardware we'd see more progress annually.

We should all hope Intel's Xe gpus are competitive, and that RISC-V cpus become widely available in the near future.

There haven't been a lot of advancements in processing power compared to the jumps seen in the 90s and 00s, but there have been serious advancements in efficiency, resulting in less heat and power consumption.

Does it? Also you should have bought the dual CPU variant

It was because everyone knew the path forward and there were a few companies actively pursuing it. Iterations were obvious engineering challenges, solved quickly for low cost. Once all the easy problems were solved, and shit became more theoretical, the improvements slowed. Exponential growth/improvement cannot happen forever. At this point, making a slightly better CPU or GPU takes tens of millions to attempt, and success isn't even close to guaranteed.
This is natural. The technology has matured.

>it's a great day to make some data backups, test the batteries in your UPS, and call your parents for a chat.
comfy

now let's compare a more "reasonable" build option. an r5 3600 + 5700xt build is what, half the cost? how much worse does it perform? from what I've seen you get like 80% of the performance for 50% of the cost. lack of competition from AMD had pushed upper mid-range builds into the $2000+ area, but now it's back down to not much over $1000. adjusted for inflation that's not much more than a higher end build from when I started getting into PC gaming, which was 2011-ish.

new-ish GPUs are really all you need. processors from 8+ years ago still break 60 fps easily, even at 1440p in most games.

Built my pc in 2012 and only bothered to upgrade my gpu/ssd/monitor.

My cpu has never bottlenecked me and everything else just works. My friend has a new 4k build and it's just not worth it. I run games at 1080p 144hz and he does it at 1440p with better settings. It's just not comparable to old upgrades.

...

They haven't officially confirmed it yet, but there was a commit adding support back in July, earlier than Intel's announcement. I'm guessing the display side is done but they have to add more stuff to the kernel driver to have it work everywhere.

lists.freedesktop.org/archives/amd-gfx/2019-July/036727.html