Has Moore's law failed us? Why am I still able to play most games with an FX and a Radeon 7970...

Has Moore's law failed us? Why am I still able to play most games with an FX and a Radeon 7970? It's a 7-year-old rig, ffs. Imagine trying to play anything in 2006 with a 500 MHz Slot A and a Rage 128 Pro.

Attached: s-l300[1].jpg (300x216, 13K)

Fine wine.

Nah, a 2500K and a 780ti will still run pretty much anything on medium/high. Development has just slowed down.

Mostly because of console gaming. Since most games are multiplat a PC that mostly matches consoles will be in the clear.
Hell, even PC game development hasn't made that many strides this console generation. I built my PC for about 600 bucks in October of 2013 and while I have to switch to low settings in newer Total War games or Pathologic 2, I still usually get 60fps.

Fine wine

This, it's consoles that hold game development back. Most of the code in these games is just middleware, and that's all made for consoles.

Exquisite vintage, my good Xir.

Console gaming and lootbox bullshittery force all recent games to run on the most potato PC you could get your hands on 7 years ago.

A fine wine with a complex bouquet.

>still running fx 4100 because i'm poor and frugal as fuck
>about to upgrade to R 3600 next month

My body is ready.

Attached: 1400833321343.gif (170x207, 80K)

That's a good thing

And a velvety mouth-feel.

I would describe its typicity as: ahead of its time.

Console gaming I can deal with, because eventually consoles will up their graphics to match customer demand, but lootbox crap will downgrade future games hardcore, because devs need to expand their player base as much as possible, even to people still running Optiplexes, for that paypig/whale bux.

>rig

Attached: 1559677506219.png (772x579, 576K)

-console gaming
-making more graphically complex games costs devs much more money than it used to

It's actually not a good thing, because it means games aren't using all the technology they could be.

I'm getting earthy flavors with a subtle hint of leather.

>fx and a radeon 7970
Based Vishera.

This.

Attached: 4500MHz.png (1920x1200, 1.13M)

I'm still using an i5 2500k and an R9 Nano.

The 2500k is still great, but I despise the R9, it's a pain in the ass to deal with.

Once I have the money I'm either gonna get a 2700x, or i7 8700k, plus a 1660ti or 1070.

>Have a 5.5GHz @ 1.53v capable prime95 stable FX-9370 rotting in my closet
>Had pushed for 5.6GHz because greedy
>Blew 2 VRMs on my Asus 990FX Sabertooth
>Now all the halfway decent boards are $150+ on eBay and DDR3-2400 kits cost the same as DDR4-3000 kits.

Nooooo.

Because you play with a GPU, not a CPU.

Well you will bottleneck after a GTX 1060, but FX-8350 still gets it done.

>prime95 stable
Wish I had that chip; would love to see what it would do at 1.45 or 1.4 V.

asus.com/Motherboards/SABERTOOTH_990FX/specifications/

Supports AM3+ 32 nm CPU
Supports CPU up to 8 cores
Supports CPU up to 140 W

How did it even boot? Isn't the 9xxx series rated at over 200 watts?

Moore's law slowed down. Not that that's a bad thing, of course. Right now we're at a fairly good point, where development is slow to the point where you don't have to get a new system every 3 years, but also fast enough to the point where there's a reason to buy the top end.

Attached: 2990WX.jpg (1000x582, 262K)

>8350 + 7970
Had the exact same build before getting a 580

Attached: HunkofJunkMKVI.jpg (2403x960, 379K)

Can you play games on wine tho? The 7970 has crap support on Linux.

The original Sabertooth 990FX was made before the 220 W 9000 series, but the VRM was really strong, stronger than my 990FXA-UD3, which claims to support the 9590.


Goddamn, that is a cool build compared to my cobbled-together machine that only looks decent because motherboard.

Attached: IMG_0084.jpg (4032x3024, 2.28M)

Nope, you need a GCN card or better for real Wine performance.

7970 is GCN, the OG GCN 1.0 core.

> Imagine trying to play anything in 2006 with a 500 MHz Slot A and a Rage 128 Pro.
I don't have to. San Andreas couldn't run on a 700 MHz Celeron / Intel 815 / 128 MB RAM. However, Vice City could run at a whopping 2 SPF (seconds per frame). Managed to collect 100 hidden packages in a year and a half.

What case m8?

It's the Corsair 300R, basically a baby version of the 750D. I'd actually suggest this, if you can find one new, over some tempered-glass bullshit case.

Attached: CC-9011014-WW-300R_03.png (800x1000, 334K)

Reminder that Sandy Bridge is EVEN faster in gaming than FX.
The gap between 2012 and 2019 is even worse.
FX gets its dick stomped on.
Sandy aged like fine wine. FX didn't.

>Sandy aged like fine wine
>Have to disable HT to mitigate Zombieload MDS crap

definitely an oaky after-birth

And it was based as fuck. The 1080p/high settings/60 fps card from 2013-2018.

The 1080p resolution died when the 7970 came out; even an FX-8350 with its somewhat crappy single-core performance got crazy frame rates in gaymen.

The 9000 series was literally just binned samples of the 83xx stuff. It booted fine; it was just the under-load part that had AMD slapping a 220 W TDP on it.
It did 5.1 GHz with 1.49 V but strangely would only do 4.8 GHz with 1.45 V. Was a fantastic chip.

>would only do 4.8GHz with 1.45
>tfw 8350 only does 4.6 with 1.45v and it's a bit warm on air
Damn I thought I did okay on the lottery but my 8350 is shit, it was made well after 8370/9370/9590 got binned.

A lot of games are shit now, just rehashes of PS1/PS2 games. So why upgrade anything? You've still got the original game; just go to eBay, buy a used PS2 for like $80, and be done with it. The PS2 has component output, and TVs still come with the jacks, so you're in good shape.

So just don't install the fix; only an absolute idiot actually gives a shit about vulnerabilities.

>it doesn't matter
Dude, you know if the FX-8350 or the 2700X were pulling such shit with performance-killing mitigations, we'd be laughed off the board.

That's going to be one hell of an upgrade.

Be grateful that a PC now can last more than three years.

Jow Forums isn't one person, engaging in brand rivalry bullshit just reeks of /v/eddit, the only thing that matters is performance.

Funny thing, I'm actually getting a Ryzen 2700 soon to upgrade my i3 8100, because 4 cores ain't enough anymore, pic related if you don't believe me.

Attached: Speccy64_1Od72mrT9K.png (473x389, 18K)

A lot of people misrepresent Moore's law.
It was just an observation about the number of transistors in a processor doubling roughly every couple of years (often quoted as every 18 months). Not performance, not clock speed.
They got around limitations by adding more cores, but not everything utilizes multiple cores well.
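
As a rough illustration of what the observation actually predicts (a sketch only; the 18-month doubling period is the popular figure, and the starting count and years here are made up for illustration):

```python
# Rough Moore's-law projection: transistor count doubling every
# 18 months (the popular figure; Moore's own 1975 revision said
# roughly every two years). Start count/year are illustrative only.
def projected_transistors(start_count, start_year, target_year, doubling_months=18):
    months = (target_year - start_year) * 12
    return start_count * 2 ** (months / doubling_months)

# e.g. ~1 billion transistors in 2012, projected out to 2019:
print(round(projected_transistors(1e9, 2012, 2019) / 1e9, 1), "billion")
```

Note it says nothing about clock speed or game performance, which is the point of the post above.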

Hopefully now that there seems to be a plateau in what games can do technically, they'll get back to making games that are actually fun and well written. I don't see the importance of pushing hardware beyond where it is now. PBR shading as the new standard makes games look great; raytracing might make them look better, but after that, how good does it have to get? Realism isn't a selling point for most games, and a bigger army of artists and voice actors for every triple-A game just means SSDs had better become more affordable, since game data is 90% texture and audio files.

>fx8350
>R9 fury
I have no reason to upgrade. Nothing bottlenecks this bad boy.

That's a generation newer and was priced a couple segments higher than the HD 7970.

Kepler aged like milk; it barely supports DX12 and Vulkan.

techspot.com/review/1418-geforce-gtx-780-four-years-later/

Fortnite kiddie spotted

>Imagine trying to play anything in 2006 with a 500 MHz Slot A and a Rage 128 Pro.
Well, I mean, it's not like you've upgraded your resolution either. We're still stuck on 1080p because it's good enough.

Can you disable the Spectre & Meltdown patches?

I'm pretty sure they're built into modern Intel CPUs.

This. Back in '99 you were a fucking chad if you had 1024x768; then by '06 it was possible to get a 2560x1600 screen.

99 chads had 1600x1200

I did with InSpectre, but I didn't really notice any change in performance on my 2500k. I tested with Final Fantasy XV, a pretty CPU-heavy game, and it seemed the same with and without, so I guess these patches affect workstations more.

Don't know if the patches would affect the user with the i3 8100 more, but I think an i3 8100 is probably better than my 2500k.

guru3d.com/files-details/download-inspectre.html

Attached: InSpectre_HhWRetmD42.png (309x80, 5K)

>I guess these patches affect workstations more
Just fuck my enterprise market share up family

Both aged like shit.
The 2500k because Intel didn't release anything else other than bumping clocks.
The 780ti got fucked by Tahiti multiple times, and again by the GHz Edition; then in 2014 Mantle came and put the last nail in the 700 series' coffin. Finally Nvidia decided to switch GameWorks to DX12 and the grave got two feet deeper.
That's what happens when you design shit GPUs: either Vulkan/Mantle/DX12 comes out, or MS releases a new version of DX, and your overpriced, overheating crap gets crappier.

What I want to know is why CPUs are still in the 3.0-3.7 GHz range. Why haven't base clocks gotten higher?

>"I don't know either user"

Attached: 1523433903969.jpg (1038x1000, 140K)

Isn't that higher with boost? I mean straight out of the box.

Is upgrading from an FX 6300 to an FX 8350 worth it?

Thinking of doing that while saving money to get a Ryzen 4xxx in the future.

An FX-8350 is a 4 GHz base clock straight out of the box, but that doesn't really mean anything; the IPC of new chips is awesome, so I wouldn't worry about clock speed.

The extra module the 8350 has, with two more cores, is absolutely worth it if you want to stay on FX, and you can get it new in box with a nice Wraith cooler for $75. Set the CPU vcore to at least 1.38125 V, crank the bus speed to 210 MHz, and you've got a little beast.
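
For reference, the resulting core clock is just the bus speed times the multiplier, so raising the bus from stock to 210 MHz scales everything proportionally. A minimal sketch (the stock 200 MHz bus and the 8350's 20x multiplier are the assumptions here):

```python
# FX core clock = bus clock x multiplier.
# Stock FX-8350: 200 MHz x 20 = 4000 MHz. Raising the bus also scales
# the CPU/NB and HT links, which is why it's a cheap all-round boost.
def core_clock_mhz(bus_mhz, multiplier):
    return bus_mhz * multiplier

print(core_clock_mhz(200, 20))  # stock: 4000 MHz
print(core_clock_mhz(210, 20))  # 210 MHz bus: 4200 MHz
```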

Right, but is there any reason that's not the norm at this point? Many years ago I had a 3.2 GHz chip. Why aren't we at like 5 GHz base now? Is there something holding it back?

I don't know; Ryzen (at least Ryzen 1000 and 2000) gets really hot once you get past 4 GHz. Hopefully Ryzen 3000 will change that.

Heat is holding it back.
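
The "heat" answer has a concrete basis: dynamic CPU power scales roughly as P ≈ C·V²·f, and higher clocks usually need higher voltage, so heat grows much faster than frequency. A rough sketch (the capacitance and voltage figures below are made up purely to show the scaling):

```python
# Dynamic power: P ≈ C * V^2 * f, with C = switched capacitance,
# V = core voltage, f = frequency. Numbers are illustrative only.
def dynamic_power_w(cap_farads, volts, freq_hz):
    return cap_farads * volts ** 2 * freq_hz

base = dynamic_power_w(1e-9, 1.20, 4.0e9)  # ~4 GHz at a modest voltage
oc   = dynamic_power_w(1e-9, 1.45, 5.0e9)  # ~5 GHz needing more voltage
print(round(oc / base, 2))  # roughly 1.8x the heat for 25% more clock
```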

good ye olde gif
enjoy upgrade

>DDR3
>900MHz
retard

It's DDR3-1800, user.

Yet it clearly says in Speccy it's running at 900 MHz.

user i...

It's the double-data-rate RAM chips on the sticks running at 900 MHz; Speccy is just dumb.

Games simply can't make good use of more than 2 cores. GPU compute took over as the actually important part.

This is my exact rig, with 16gb ram. Built in 2011 and upgraded gpu once to the 780ti, threw in a couple of SSDs.
See no reason to upgrade again if I can still get 40+ fps on my 1080p display.

>5.5GHz @ 1.53v
They don't make them like they used to.

Double Data Rate.....

Nope. Just add a computer to a chip.

>tfw 1.53v isn't even high enough to degrade an FX core and it goes to 1.55

Attached: ayymdeee.png (777x818, 54K)

The FX series and Ryzen share the same front end, user; you just got a free boost, that's all.

>When your 2011 vintage fine wine gets better and better with each sip

Well, the first Bulldozers were bullcrap anyway; I'm talking about the good last ones, the 8xxx and 9xxx.

Yeah, original Bulldozer is ass. It's a shame Vishera wasn't released first, because the 8350 is 15% faster than an 8150 while still being 125 W, and it came out only a year later; it would've competed with Sandy a whole lot better. The 3.6 GHz of an 8150 with the terrible IPC just didn't cut it.

> >tfw 1.53v isn't even high enough to degrade an FX core and it goes to 1.55
To become degraded

Attached: 1560155690940.jpg (437x404, 49K)

Reminder that the 8350 holds the world record for the highest overclock at 8.5GHz

Attached: 1547851835584.jpg (1616x2048, 277K)

Please try any modern game like the latest Total War and cry whenever you hit end turn.

It's mostly about what you're used to. I had an FX-8120 system with an RX 480 until last year, and I was used to 30 fps. Then I upgraded to a Ryzen and an Nvidia GPU; now I'm used to 60 fps, but I also want a stable 60 with no dips, and that got me looking at high-end systems. I'll probably buy a 9900K unless the new Ryzen is a much cheaper deal.

How tf? I have an FX 8350 and a 750 Ti and I can't handle shit. I installed Linux because it's a lightweight OS; my specs couldn't even fucking handle Windows 10.

SAME HERE user

Attached: excited.jpg (196x250, 8K)

I was doing just fine 2 years ago with a Phenom II and an R7 265.
I bet you don't have an SSD.

I have an 8350 and a 1050ti
runs wow at 60fps medium settings, and that's the only bideo game I play so idk
probably won't upgrade until the next couple gens of ryzen so they get cheaper

You know, I remember hearing this same song, almost word for word, more than 10 years ago in this very same place (except it was on /v/ not Jow Forums, you know, back before MOBAs and twitch when /v/ was slightly less shit) and I can't tell whether that's an indicator of it actually being true or just people using the same excuse over and over again. Time really puts things into perspective.

Why not get an i7 8700 and not replace the motherboard?

I've run 3 video card upgrades through an i5 2500k @ 4.7 GHz with 8 GB of 2000 MHz DDR3:

>HD 5870 1gb
>R9 290 4gb
>GTX 1070 8gb

And in most games I'm still bottlenecked by the GPU playing at 1440p, always sustaining 60 fps. I doubt I'll run a 4th GPU through this thing, but it's funny how long this shit has lasted.