I upgraded my R9 290X to RX 580. And I barely have any improvements. My display is 1080p...

I upgraded my R9 290X to an RX 580, and I'm barely seeing any improvement. My display is 1080p. I basically get 5-6 fps more in some games. Is something wrong with the GPU, or is the RX 580 not really that much better than the R9 290X and I wasted my money? :/

Attached: 64WMtTYIxGngVmkR.jpg (3000x1500, 600K)

Depends on which 580 you got; the Sapphire Nitro+ can clock up pretty well. The power savings, smaller form factor, and lower thermals are also improvements over the 290X.

t. Sapphire 580 8GB Nitro+ owner who used to run two 7970XT Tahitis in Xfire

That's actually fairly impressive. It should be a tad slower across the board.

If you were expecting major improvements, sorry, take a look at the numbers in pic related. Even though it's older hardware, the 290X packs more power.

Attached: 290x_580.png (742x221, 12K)

With a Strix RX 580 at 1080p you will see no improvement whatsoever. However, if you are planning to upgrade your display to 1440p or 4K in the future, you didn't waste your money.

The reference 290X sounds like a jet engine; you should be happy with just the difference in noise levels.

AMD has been utter shit for GPUs since 2011.

Tbh you bought a GPU as good as the 290X. At least it's a lot more efficient. The RX 580 is no flagship, but it's the best ayymd can even offer atm.

The only upgrade you should have gone for is either a Vega 56 or a 1070 Ti. I did the same. Went from a 290 to a 480. Saw no difference apart from less noise and lower power usage/thermals. I eventually went for a Vega 56 (the 1070 Ti was not out then) because I got a good price on it, and it does compete well with a 1080 and 1070 Ti despite what 'some' benchmarks will tell you.

You didn't waste your money because you're saving half on your power bills.

With a caveat: it requires you to flash the Vega 64 BIOS to get near-1080 performance. But it is well worth it.

The R9 290X can literally run everything at 1080p. The RX 580 is better for higher resolutions because it has 8GB of memory compared to the 4GB that the 290X has.

Why don't you niggers research jesus fucking christ

lel its genuinely sad that you actually think and believe this

Just to add that when I compare benchmarks, I always compare the 1080 against my Vega 56, because with the 64 BIOS mine is so close to Vega 64 performance, which is what was made to compete with the 1080 (ignore all the shills who said it was meant to compete against the 1080 Ti; it never was, that was pure AMD shilling from AMD's marketing idiots).

You fucked up. The equivalent of the 290x is Vega64. A 580 is a sidegrade.

At least you don't have a plane in your house anymore.

The R9 290X actually is not that loud. It's loud over 60% fan. But it's a Hawaii chip; it can easily go to 95C, and that's something AMD confirmed is safe because the chip won't throttle. And usually when the GPU is maxed in Uber mode the fan is around 55%. So he wasted money on the RX 580 and got nothing because of his 1080p monitor.

1. idles at 60-75 celsius, depending on the season
2. costs 3x more than what it's supposed to, since it's meant for crypto mining
3. AMD GPUs are so shit that AMD has enabled Nvidia to run a monopoly for 5 years now
4. freesync is garbage compared to gsync

it's genuinely sad to see autistic pieces of shit such as yourself defend AMD and attack anyone who exposes AMD products. try to refute points, instead of sperging at people, you dumb pajeet.

>trying to look smart and failing
A sidegrade is an equivalent, you moron.

R9 290X is a f'n beast...

You should've saved up more money and gotten a GTX 1080 Ti, but only if you also wanted to get a new monitor that's over 1080p, because the R9 290X is destroying games at 1080p.

Look at the AMDrone. Look at him, and laugh!


>R9 290X actually is not that loud.
nice try shill
>It's loud over 60% fan.
Nvidia GPUs barely go over 30% fan under load and are barely audible. Meanwhile AMD GPUs can be heard from 10 meters away at 10% fan activity.
>It can easily go to 95C and that's something AMD confirmed is safe
HOUSE FIRES ARE SAFE BECAUSE AMD SAID SO , LMAO

I hope you get paid for shilling, otherwise, this is just pathetic and sad to see.

>

Why didn't you check benchmarks before this?

The R9 290X was released ahead of its time; that GPU will be fine for at least 2-3 more years at high settings on a 1080p display. You really screwed up by upgrading to an RX 580.

NVIDIA fanboy

Wow, you totally refuted the well-known facts about AMD GPUs being shit by calling him a "NVIDIA fanboy"!

AMDrones are so pathetic and weak, it's genuinely sad and terrifying to see so many of you autistic no-lifers shill AMD products every day on Jow Forums.

Do you get paid for shilling AMD? Just curious.

Dude, if you're talking about the R9 290X, it's sad for you to say that, because that GPU was released in 2013 and still kicks ass. If you're talking about the RX 580 then I totally agree, bcuz that GPU is shitty and overpriced.

The RX 580 has way lower power usage than an R9 290X and you can overclock it pretty hard.

That's what user meant.
290X of its day was the highest end card you could get.

Now the highest end AMD card is a Vega 64. RX580 is mainstream range. You get better power efficiency, but that's it.

Certain 290X models weren't loud. Reference and shit gigabyte models and stuff were though, sure.
Nvidia reference blowers are also trash. At least AMD PCBs are good.

That's funny because as I was typing out the first few characters of my post it struck me that the reward of a friendly & helpful post on Jow Forums is some asshole is going to try to milk me for (you)s with aggressive shitposting

What are you a fucking woman? Quit your bitching, faggot

they were all loud, hot and consumed more energy than two titans

(you)

Loud if you crank the fan over 60%. It was never that loud under 60%. But 100% was insane.

Hawaii is a fucking beast and polaris is only an upgrade if you really care about power draw. If you want to brute force compute something NOTHING from AMD is as well balanced as hawaii.

Attached: 290x overclock 08.04.18.png (1275x691, 497K)

I just tested my system over at Userbench. Once I overclocked my Vega 56 on 64 BIOS it comes in at 87th percentile with 127% score. The average score of a GTX 1080 is 122% with a 99th percentile. Mind you I have to push my GPU pretty hard and it gets noisy to get there. I was considering a water cooler on it but the price of one is a bit silly and anyhow the power usage gets beyond a joke. I prefer to keep mine a little over 1070 levels of performance and save some energy, heat and be less noisy. Vega really does need that 7nm refresh.

It was fairly obvious that what he means is: one cannot compare a top end flagship card that cost $549 at launch vs a midrange card which cost $199 at launch.

It's like going from a GTX 780 Ti to a GTX 1050 Ti and complaining it's not as much of an upgrade as you hoped and your frames are the same.

OP is a retard for not doing research, not for buying an amd card. He just bought the wrong one.

Yes a 1080ti is still king and the 1180 is coming soon so prices will go down even more in the near future. But I've seen aftermarket vega 64s for $530 this past week, while 1080s are still $625+

so depending on where you live, amd could be the better choice. no brand loyalty here... In recent years I've had a 8800gt, 5850hd, gtx 760, and an r9 390x that I'm currently using... Used to have 3dfx cards too, so fuck off before calling me a shill. This is my first post in this thread btw.

>upgraded
That's a WORSE card you fucking retard. Jesus christ.

Wasn't bashing AMD, I was bashing his verbiage because he sounded like a high schooler.

>1180 is coming soon

Attached: 1515108735319.gif (200x234, 2.85M)

Wat did u expect?
>a titan killer can't be replaced with a midrange card; even if it's decent, it isn't enough

>Max 326

You think thats hot, how about this?

Attached: ahaha nvidia housf....oh shit.gif (443x665, 23K)

Why tho, bad sensor?

I'm guessing that, like the informed person you seem to be, you tested them all one by one to know that literally each and everyone of them is loud and hot, right?

Bug caused by Unigine Valley. Valley's own inbuilt thermal sensor once reported 17000C as core temp on my GPU. Running basically any other stress test, none of the monitoring software reports unrealistic results.

People like things to come quickly rather than quality

Man, am I proud that the 290X still causes as much butthurt as it did when it was new 5 years ago.

Attached: aKCAW5x.jpg (600x315, 26K)

>spend $350 on a sidegrade during a really shitty time for buying a GPU
>WHY AM I NOT SEEING TEH IMPROOVMENTS

The absolute state of AMDrones.
Less than a minute of research would have spared you all this.

(you)

Attached: 0a4.png (200x294, 68K)

You won't see much improvement in performance, except with newer APIs, and the specific game will have to support that. 5-6 fps is a reasonable increase when you came from an R9 290X.
The RX 580 is pretty much just a cooler-running card that takes less power and overclocks better, but has only a slight performance increase.

RX590 could have been amazing if it had the same 44CUs.
Oh fucking well I guess.

RX580 is a solid mainstream range card, but putting 8GB VRAM on something with that performance is so meh.
The VRAM probably costs more than the GPU itself at this point.

>overclocks better
What? I have a 580 Nitro+ and all I hear about is the wall you hit when you try to OC.
If I remember correctly, the Hawaii cards could overclock pretty well

How likely are we to see a Vega refresh that actually reduces power consumption to acceptable levels? I don't mind if we don't gain much performance from the die shrink, but a Vega 64 with, say, half the power draw would be very desirable.

amd announced a polaris refresh on 7nm in 2019

Why polaris though

cuz they probably just used some automation to shrink the design to 7nm with ease

and they don't have a new architecture in the pipeline till 2021. the whole graphics division raja built up has been kicked out with their plans to do bollywood gfx

HAHAHAHAHAHAHAHAHAHAHAHAHAH
AHAHAHAHAHAHAHAHAHAHA
HAHAHAHAHAHAHAH
AHAHAHAHA
HAHA

>own 290 (not X though) still slaughters everything thrown at it, and wont replace it until the inevitable house fire

The low end ones overclock better. Pretty much guaranteed 1400Mhz. Lmao.
But yeah the high end models are mostly clocked to the limit. 1425-1475 is the usual limit.
A rare few break 1500.

But if you can get 1500 core and 2100 memory, that's fury performance right there for ~2/3rds the power consumption.
Or you can get 90% Fury performance for half the power consumption.
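These perf-per-watt claims are easy to sanity-check with some quick arithmetic. A minimal sketch; the baseline (Fury = 100 "performance points" at ~275W board power) and the derived wattages are assumptions taken from the claims above, not benchmark data:

```python
# Rough perf-per-watt comparison of the claims above.
# All figures are the poster's estimates, not measurements.

def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Relative performance points per watt of board power."""
    return relative_perf / watts

fury      = perf_per_watt(100, 275)           # baseline Fury (assumed ~275W)
oced_580  = perf_per_watt(100, 275 * 2 / 3)   # "Fury performance for ~2/3 the power"
stock_580 = perf_per_watt(90,  275 / 2)       # "90% Fury performance for half the power"

print(f"Fury:         {fury:.3f} perf/W")
print(f"OC'd RX 580:  {oced_580:.3f} perf/W")   # 1.5x the Fury's efficiency
print(f"Stock RX 580: {stock_580:.3f} perf/W")  # 1.8x the Fury's efficiency
```

So, taking the claims at face value, the stock card actually comes out ahead on efficiency; overclocking to "Fury performance" trades some of that away.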

Huh, I'll have to see what mine can do then

I also have a 290X and it still runs. It doesn't run fine, though, because it will emergency-underclock itself as soon as it reaches 100 degrees (which happens fairly often, unless you cap games to a maximum fps).
The result is frame drops, because the card somehow aborts drawing the frames it's supposed to when the underclocking happens.
You may still average 90 fps or higher in most games, but it feels like 20, because you're missing frames, essentially being stuck on the same frame for the length of what 20 fps would feel like.
I have to use MSI Afterburner to forcefully underclock the core to about 800 MHz so this "doesn't" happen.

also, my room is retarded hot all the fucking time, its actually no joke
but I will still run this card until it craps out on me because the price performance is insane

Attached: idle.png (467x631, 32K)
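The "averages 90 fps but feels like 20" effect above can be shown with a toy frametime calculation. The numbers here are made up for illustration, not measured from a real card:

```python
# Toy illustration of why average fps hides throttling stutter.
# 95 fast 8 ms frames plus 5 stalls of 50 ms, as if the card
# emergency-underclocked a few times (hypothetical numbers).
frametimes_ms = [8.0] * 95 + [50.0] * 5

avg_fps = 1000 * len(frametimes_ms) / sum(frametimes_ms)
worst_frame_fps = 1000 / max(frametimes_ms)

print(f"average:     {avg_fps:.0f} fps")         # looks great on paper
print(f"worst frame: {worst_frame_fps:.0f} fps")  # what the stutter feels like
```

The average still comes out near 99 fps, but every stall is a frame held for as long as 20 fps would hold it, which is what you actually perceive.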

replace your thermal paste. Christ.

Maybe get an Arctic Accelero or Morpheus II.

are you retarded? There's nothing that can touch nvidia flagship GPUs and hasn't been for awhile.

2013**

Southern Islands had a great lineup. They were space heaters, but great nonetheless.

>this thread
Man, I wish the gpu scene was more active, particularly on the AMD side. I really do miss the shitflinging of old.

I would kill for an AMD gpu that performed as well as the second or even third best NVIDIA GPU for 80-90% of the price but those days are gone.
Nvidia has 1080, 1080ti, Titan, and Titan V that outperform the Vega 64 and the 1070ti outperforms Vega 56 too. And the Vega cards aren't even fucking cheap.

You no longer run a space heater so yes it was worth it

7970 vs 780ti threads haha

Been switching back and forth between Nvidia and AMD since the GeForce 2. ATI cards were shit during the X series until the HD 3xxx series. HD 4xxx was pretty good up until the HD 5xxx, then shit from there until the release of the HD 7xxx, and it ended there. AMD has been shitting the bed since the rebranding of HD 7xxx to R7/R9. Currently there is no point getting AMD thanks to miners; otherwise they can be a good contender at the lower price points, like how the Athlon XP kicked the P4 back then. Lucky crypto didn't happen to run on Athlons back then.

>Idles at 60-75, depending on the season
Nice try, it's summer out here

Attached: muhhousefire.png (799x545, 176K)

>It should be a tad slower across the board.
it should not. It's clocked WAY higher than Hawaii, so the smaller core doesn't matter. Also there are architecture improvements that increase performance per clock slightly.

RX580 should be 10-15% faster than 290X, so 5-6fps increase sounds about right. RX580 uses about half the power of 290X so if you game a lot you'll save on bills.
If you wanted a significantly faster GPU you should have gotten Vega 56. See and kys
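The "save on bills" part is easy to ballpark. A rough sketch; the load wattages (290X ~290W vs RX 580 ~185W), the gaming hours, and the electricity price are all assumptions, not measurements:

```python
# Ballpark yearly savings from the lower power draw while gaming.
# Wattages are rough TDP-class figures; hours and price are assumed.
old_watts, new_watts = 290, 185
hours_per_day = 4
price_per_kwh = 0.15  # USD, assumed

saved_kwh_per_year = (old_watts - new_watts) / 1000 * hours_per_day * 365
saved_dollars = saved_kwh_per_year * price_per_kwh
print(f"saves about {saved_kwh_per_year:.0f} kWh/year (~${saved_dollars:.0f}/year)")
```

Under these assumptions that's roughly 150 kWh or about $23 a year: real, but nowhere near "half your power bills", since the GPU is only one part of total household consumption.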

>comparing fan percentages

>hot and consumed more energy than two titans
They used 20-30W more while being half the price.

390 here.

Attached: 1528330012050.png (500x374, 114K)

2 years newer top of the line card still being shit on by a $400 2 year old card from AMD was great.
Meanwhile the 290X was shitting on the Titan.

I can't believe Nvidia even had the guts to market those cards as "high end" when they couldn't compete against older generation cards. But fanboys still bought them.

>I can't believe Nvidia even had the guts to market those cards as "high end" when they couldn't compete against older generation cards. But fanboys still bought them.

But it's okay for AMD to market Vega as "the Volta killer" when it struggles to compete with the upper half of the Pascal lineup at twice the power consumption, kek.

Attached: 1527343332149.jpg (640x586, 41K)

try overclocking that 580 and slightly overvolting

fucked up by not keeping that 290x and getting a 7nm vega/navi/mcm whatever next year or so

Yeah the
>poor volta
thing was retarded, I agree.

But Nvidia gets away with it and AMD doesn't, is the difference.
Nvidia outsold the superior 7970, 280, 290, etc cards despite their cards at the time being vastly inferior until they finally got the breakthrough with Maxwell.

Replace the fucking paste/fans man jesus

you dun goofed lad
you bought a card of roughly equal power to the one you own during the giant spike in GPU prices despite the fact you already own a card that can run every game there is in 1080p
and it's because you were too fucking lazy to google it before impulsively buying a new toy, you fucking inbred

This. He could have gotten a decent cooler for it and kept it till something better came out.

>superior 7970, 280, 290

Attached: 1497989417041.jpg (9990x10000, 2.33M)

That's exactly what I thought, he's a fucking tard

Gigabyte R9 290x here, 1080p panel

It will play every new title at ultra settings on 1080p most over 60fps. I can do NuDOOM over 180fps on the stock cooler, I like it so much I'm getting another one in Xfire and watercooling them.

For a 5 year old card it's pretty fucking decent

The latter. 480 barely had any improvements over 290X, and 580 is oced 480, so the only reason to upgrade is lower power consumption and more vram.

BS marketing for sure. I think they were banking on culling at the driver level happening, which it never did.

at least you get linux drivers

Lmao this. Why are Nvidia's Linux drivers so god awful? Makes me want to buy the next AMD 7nm GPU thing.

Wtf, those are the same specs as my 7990, which I bought during summer 2013, or FIVE YEARS AGO.

It's clocked like 200mhz higher. A 290x should be running at 1200mhz.

7990 - rx 580
Are all same cards with tiny changes.

This is why Nvidia fucks AMD in the ass.

Personally I will never buy AMD GPU because I want the best and fastest card not some renewed Pajeet shit from 5 years ago

Who Sapphire Tri-X 290x masterrace here?

Attached: 11226-00_R9_290X_TRI-X_4GBGDDR5_DP_HDMI_2DVI_PCIE_C02_635225387958378267.jpg (1000x759, 76K)

7970 was faster than the 680 and later the 780
the 290x was faster than the 780/titan/780ti
the 280 was faster than the 770

I honestly don't understand AMD's value proposition atm. I upgraded from a 750 Ti to a GTX 770 a while back, which is a bit of a housefire card (warms up the room quickly). I recently thought of upgrading, but I'm povo, so I thought I'd look at the AMD cards for a better price/performance ratio. What I found was that the RX 570/580 have the same price/perf as Nvidia's 1060/1070 offerings, yet they use way more energy, surprisingly even more than my GTX 770 housefire. So why would someone buy them? And fuck the current GPU market; it seems so stagnant that my 3-4 year old card is still ranked ~50th for gaming on UserBenchmark.

>I like it so much I'm getting another one in Xfire and watercooling them.
Why when you could get Vega? It's more performance than 2 in crossfire.

It took Nvidia 3 years to catch up to GCN, though.

>I honestly don't understand AMDs value proposition atm.
They're still the best in the $100-$550 range due to Freesync.

How can you understand Nvidia's value proposition of spending 300 more on a monitor so you can save a single lightbulb's worth of electricity under peak load?

Arctic Accelero on 7970 master race here.
Though I have a Sapphire Nitro+ Special Edition being shipped to me to hold me over until 7nm MCM GPUs come.

I actually hit 1320 stable on my 7970, though 1200 for daily driving, until I eventually realized the heat isn't worth it and I can just run everything at 60fps for way less power consumption.

280 is a 7950.

Attached: 2017 FT03 build rev2.jpg (935x1080, 192K)

WHY

The 580 is a sidegrade; it has more VRAM but not more performance. The 290X might even be slightly faster depending on the usage.
The 290 is still a good card. It's just hot, draws a lot of power, and has no HDMI 2.0.
My friend bought the 290 Tri-X after I recommended it. He is still happy with it. But the Tri-X isn't masterrace; the Vapor-X is the one and only true AMD masterrace.

Attached: DSC_0061.jpg (2160x3840, 1.75M)

290x doesn't clock as well as 7970. That's why 1200mhz is a good target

>They're still the best in the $100-$550 range due to Freesync.
>How can you understand Nvidia's value proposition of spending 300 more on a monitor so you can save a single lightbulb's worth of electricity under peak load?

I can't, but I'm not interested in gsync/freesync anyway, so it's irrelevant to me. As for the power difference, Nvidia's 1060/1070 cards draw 1/2 to 1/3 of the RX 5xx equivalents from what I just read an hour ago, ~120W vs ~300W; that's about 17 bright CFL bulbs.
I'm not even an Nvidia fan; they're all too pricey considering how old their offerings are. As I said, it all looks stagnant and crap to me. I thought graphics cards were the one area of computing still seeing major performance gains. Am I missing something here?

> 300Watts
More like 300 wats
Citation needed. 300W is full system under load.

Attached: 86529.png (650x500, 38K)

I had a 290, a 390, and a 390X; the fuckers, OC'd, killed at least 2 PSUs over the course of 5 years. I'm talking 1kW and 850W beefy $250USD+ PSUs.

Not the cards' fault, but those old buggers REALLY put strain on your PSU. OP did a decent sidegrade; I hope he sold that 290X to a minertard.