$280

videocardz.com/newz/amd-radeon-rx-590-official-launch-slides-have-been-leaked

>$280
>for a $200 REBRANDEON HOUSEFIRES card with no real microarchitecture improvements

AYY LMAO

Attached: AMD-Radeon-RX-590-NDA-Slides-1.jpg (1364x738, 220K)


14% upgrade over the 580. It's meh as a normal GPU upgrade, but from just a small node shrink from 14nm to 12nm it's neat. Hope AMD has something new in the works; the current architecture looks dated.

That's going to be a yikes from me.

Imagine re-releasing a 3 year old architecture for the same fucking MSRP with only a 14% performance boost.

Absolutely disgusting. The GPU market is absolutely fucked. Your options are overpriced out the ass Nvidia shit or outdated and still expensive AMD shit.

At this point I might as well just buy a console.

Has AMD ever thought about having Vega displace Polaris in the midrange as well? Or is that just not feasible?

So is vega still worth it when this hits?

Vega is probably too expensive given the HBM. Some of the Vega56 cards sell for $400, but these are old stock, and likely have razor thin margins.

Vega56 is considerably faster than Polaris, no matter how high Polaris is clocked. A Vega56 with a little undervolt gives you way more performance, and the power consumption of the two cards would be pretty close.
It just comes down to price.

>reddit spacing

>not even GDDR5x

>inb4 that impossible

AMD Radeon pulling an Intel... but it's okay if AMD does it.

Blue and Green bad, Red gud.

So AMD is releasing the equivalent of a GTX 980 in late 2018/early 2019.
I'm impressed

Why does it always switch with AMD? Either the GPU is good but the CPU is shit, or the other way around (the latter being the case now). Nothing AMD has is comparable to the 1080 Ti, and now that the 20 series is out, they have nothing to put up against it.

>AMD Radeon pulling an Intel
they're charging twice as much as their competitor because there's no competition?

ill prob still end up buying it. im in desperate need of a new GPU, currently running a dying 760. want to go AMD because my new monitor is 144hz freesync, and depending on price + game bundle (which i can hopefully sell), i might as well get this over a 580. Vega is out of my price range.

Yeah, the card is fine but the price is crazy. You can't have an incremental upgrade 2.5 years after the original and INCREASE the price ffs.

This really is the dark age of GPUs.

Vega beats 1080 Ti in Battlefield V. Never underestimate Fine Wine Technology.

>Lying on the Internet

KILL YOURSELF FAGGOT

Attached: 2160.png (500x690, 46K)

>he was one of the retards that opted in to the beta ray tracing program.
enjoy your shitty overpriced GPU of fail.

So go do it, faggot. One less gaymer retard shitting up this board.

>240Hz monitor
>Implying the RX590 will reach even half of 240 FPS in most games

That fucking graph you just posted has the 2070 higher than Vega despite Vega having the higher value in the chart. So it's obviously not a very well-made graph, lmfao. Nice try retard

I wouldn't know much about modern gaming, but when it comes to specs for video editing and multimedia needs, the 1080ti was beating out the Vega64 in nearly everything.

>53.5 FPS is higher than 55.3 FPS

KILL YOURSELF BLIND FAGGOT

ayy lmao, enough for NoVideo to rush out a 1060 with GDDR5X to not look too bad

>alienware monitor
>1080p
>240hz
Cringe and blue-pilled. How are you even going to get a game to go that high in fps, except crapshit-go and Quake 3?

It's just that amd cards are "weaker". The fact that amd cards don't have a CUDA or NVENC alternative that's worth using kind of kills them off for me.

>Imagine re-releasing a 3 year old architecture for the same fucking MSRP with only a 14% performance boost.
Same price, better performance, don't see the problem. And it's 14% over the 580, which is not by any means 3 years old.

>tfw sold my rx 480 during the mining craze and bought a 1070Ti for muh games
>tfw cucked myself with a gsync monitor
>tfw eternally cucked by nvidia

I want to go back to AMD, the nvidia cards are too expensive

Attached: 1541281105388.png (633x874, 323K)

Polaris is 3 years old.

Why bother? Just make more Vega 56/64 cards. Or hell, try and make a midrange card based on Vega with GDDR5X instead of HBM. Something, ffs.

AYYMD can't make GDDR5 or GDDR5X cards because they bet the whole company on HBM2, which has been nothing but a massive fail. Their high power consumption boxed them into using HBM2 just to not look bad when compared to the superior competition on performance and power efficiency.

It's faster than a GTX 1060 so why would they price it lower?

Attached: 89984984984989849874.jpg (625x341, 39K)

All I want is a card with 8GB of VRAM and good Linux support for relatively cheap, I don't even care if it's the third iteration of the same cores, chances are it'll beat my GTX980 anyway

>rebrand of a rebrand of a rebrand of a rebrand
jesus christ, i bet it's just a renamed fuji card with bumped-up overclock settings

Are you retarded? Poolaris is over 200W and Pooga is 300W

The reason they went with HBM2 was the hope that its lower power consumption would beat GDDR5's higher power consumption, but it hasn't, and HBM2 has other issues with yields and implementation complexity, like connecting the HBM2 die to the interposer, which causes yields to plummet

>why would they price it lower?
Because it has the performance of a GTX 980 from 4 years ago

Then why doesn't Nvidia price the GTX 1060 lower?

because Jensen is saving up for a new leather jacket.

So they can sell it. Pretty common for AMD to have 10%+ faster cards and sell them for less.

Because they're fucking Jews.
AyyMD is supposed to not be

>GDDR5
>0.9v
>high power consumption

AYYMD themselves said so, faggot

Attached: HBM_10_Energy.png (3999x2250, 761K)

How many pin connectors? How much power draw?

Fuck AMD cards are trash.

Compared to HBM, yeah, GDDR5/X is much more power-hungry. Especially when you start having to make up for lack of bandwidth with clock speed.

>HBM
Yeah, this is most likely the biggest thing preventing HBM2 in the mainstream; it's about $175 for the interposer + 8GB HBM2. HBM3 is aimed at reducing this cost while still doubling performance, so I'm looking forward to that.
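To put that figure in perspective, a minimal back-of-the-envelope sketch. The $175 figure is the post's own estimate, and the $400 card price is the Vega56 street price mentioned earlier in the thread, not any official BOM data:

```python
# Rough share of a Vega56 card's street price eaten by memory alone:
# $175 for interposer + 8GB HBM2 (estimate from the post above)
# against a ~$400 Vega56 street price (figure from earlier in the thread).
memory_and_interposer = 175  # USD, claimed estimate, not confirmed BOM
card_price = 400             # USD, assumed street price

share = memory_and_interposer / card_price
print(f"{share:.0%}")  # 44% of the price before GPU die, PCB, cooler, margin
```

Nearly half the street price going to memory and packaging alone is exactly why HBM2 hasn't shown up in midrange cards.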

Wait, so because you don't like buying 3 year old hardware, your answer is to buy an even older console?

Attached: 1469664762970.png (498x724, 758K)

>compares performance per watt instead of full powerdraw

Are you retarded?

>tfw the rx 480 was $200 on release

what in the fuck were they thinking

4gb was 200. 8gb was 250.

No, you're retarded if you don't understand shit

Anyway, AYYMD made the wrong bet with HBM2. Nvidia made the smart bets with GDDR5X and GDDR6, and won the GPU wars by not betting on HBM2 for consumer cards.

>that price

who do they think they are? Nvidia?

>Might as well buy a console
>A console that potentially has even older AMD hardware or a Tegra, if you're a nintodler.
You'd sure show us, buddy.

>$500 Vega vs $700 2070
>2 frames difference
Money well spent.

>Anyway, AYYMD made the wrong bet with HBM2 and Nvidia made the right smart bets with GDDR5X and GDDR6 and Nvidia won the GPU wars by not betting on HBM2 for consumer cards

None of that has anything to do with why AMD GPUs have struggled at the high end since Fury. The problem is that GCN's front end is limited to 4 triangles per clock, and this bottlenecks the shit out of high-CU-count GCN cards in gaming applications. Until RTG fixes GCN, any CU count above about 36 is a waste.

Nvidia uses HBM as well. You don't understand what you're trying to argue. HBM is not why Vega has high power consumption, it would be even higher if they tried to use any variant of GDDR.

No, you are a faggot that don't understand how to read or understand context

No one said HBM2 was the cause of power consumption issues on Pooga, it's the complexity of HBM2 and connecting HBM2 to interposer that is problematic

FUCK OFF FAGGOT IF YOU CAN'T READ

You're flat out wrong. The interposer isn't causing high power consumption or any issue, and neither is the memory. You're just a reddit-spacing retarded shill who has no idea what he's even trying to argue. You don't have the IQ for it, kid.

>At this point I might as well just buy a console.
do it then, AMD makes all console hardware so they get rich either way

Again, it's not about power; the complexities of HBM2 cause yield issues

FUCK OFF FAGGOT

Nigger there is no yield issue, its just an MCM with micro bumps. Its no more complex than building any other MCM.
Cost is the only limitation with HBM.

Why are you pretending to know what you're talking about, little kid?

>HOUSEFIRES
that's rtx series

No, RTX is very power efficient while AYYMD HOUSEFIRES is 300W

You're the little kid that doesn't know shit, it's too bad your cunt whore bitch mother didn't abort you when she could

You’re not a real gamer if you don’t suck DiCK.

BLOODY BASTARD

That's what you got out of that cool.
Should put that GPU power into researching a cure for autism instead of mining,

I FUCKING YOU BLOODY, BASTARD. BLOODY BASTARD FUCKING. I BLOODY.

Yet four generations of Rebrand Lake 14nm++++++ is perfectly A-OK, I'm sure.
>us poor joos, why has everyone been trying to genocide us for centuries

turn down settings and have an i7-8700K at 5GHz

That's actually pretty nice.

Cuz Nvidia has a stranglehold on the market, with games supporting Nvidia more than AMD and feature sets like CUDA acceleration for content creators and SHADOWPLAY, which gamers use and advertise.

$280 is way too much, it should be the price that the rx580 is now. nvidia will just release the 2060 and shit all over it.

i have no sympathy for rtg any more, they don't even bother to try.

>2060
for $379, sure

Would have bought this instead of the 1060 6gb if I'd known it was coming out. Bastards baited me by saying they wouldn't release a new gpu till next year.

>REBRANDEON HOUSEFIRES
Meanwhile, in reality:

>$280 is way too much
That's about the price of RX580 Nitro+ now.

The rx 480 was supposed to be a 200-250 dollar card at launch. It replaced the similarly performing r9 290 and r9 390, which also sold for about 250-300 dollars. The rx 590 looks like an upgrade, but not enough after all these years. Too many years of similar performance at more or less the same price.

Yikes

it comes with RE2, DMC5 and The Division 2. you can sell them for $50 each on ebay if you want a $150 price cut
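Quick math on that claim, assuming the $50-per-key resale estimate above actually holds (eBay prices for bundle codes fluctuate, so treat this as the best case):

```python
# Effective RX 590 price after flipping the bundled game codes.
# The $50/key resale value is the post's estimate, not a guarantee.
msrp = 280
bundle_resale = {"RE2": 50, "DMC5": 50, "The Division 2": 50}

effective_price = msrp - sum(bundle_resale.values())
print(effective_price)  # 130
```

At an effective $130 the card is a very different proposition than at $280, but only if all three keys actually sell at that price.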

Why aren't there GPUs that use GDDR and HBM?

>same performance as an overclocked $250 card from over two years ago
>marginally lower power consumption due to new lithography
>"upgrade"
Technically it is, but you'd need to be a special kind of red to buy into that.

AMD had an opportunity. AMD squandered that opportunity. Again. As is tradition.

Fucking idiots.

It's good for games like LoL.
I played on a 144hz monitor for a month, and when I had to go back home I felt uncomfortable on my 60hz monitor for a week.
Well, I'm thinking about a 144hz/240hz monitor, but then I'd also need to change my GPU + PSU. Still thinking I'll wait for the good 7nm stuff.

>Poolaris is over 200W and Pooga is 300W
Why would you do something like this? Go on the internet and tell lies?

Did your bitch cunt whore mother never told you about shutting up when the adults are talking?

tomshardware.com/reviews/amd-radeon-rx-590,5907.html

>What we can see, however, are the benchmark results, power consumption measurements, value comparison, and efficiency calculation. Radeon RX 590 may benefit from a tuned process, but it’s still being flogged for a few percentage points of additional performance and sold at a higher price. It’s sucking down GeForce RTX 2080 power to generate frame rates between GeForce GTX 1060 and 1070. As a result, the Radeon RX 580 and 590 both look bad when we look at performance per watt.

techpowerup.com/reviews/XFX/Radeon_RX_590_Fatboy/

> Gaming power consumption is high, looks like AMD used all the potential from the 12 nm process to run higher clocks, and not reduce power draw. With 232 W power draw in gaming, more than twice the power is needed compared to GTX 1060, with similar performance. Look for RTX 2080 Ti in these charts—the RX 590 is not far away from those, but RTX 2080 Ti is almost 3 times as fast!
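The quoted ratios work out like this. A sketch using only the review's rounded figures: similar performance to a GTX 1060, at 232 W versus roughly half that for the 1060. The absolute performance value is a normalized placeholder, only the ratios come from the quote:

```python
# Relative performance-per-watt from the TechPowerUp quote above.
# Only the ratios are sourced from the review; perf is normalized to 1.0.
rx590_power = 232.0    # watts in gaming, per the review
gtx1060_power = 116.0  # assumed ~half the 590's draw, per "more than twice"
rx590_perf = gtx1060_perf = 1.0  # "similar performance", normalized

ppw_ratio = (gtx1060_perf / gtx1060_power) / (rx590_perf / rx590_power)
print(round(ppw_ratio, 1))  # 2.0 -> the 1060 delivers about twice the perf/watt
```

Same frame rates at double the power is exactly the perf-per-watt gap the Tom's Hardware quote complains about as well.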

Running the 1070ti here as well, brother. Fucking hate the novideo software. That GeForce Experience is literal botnet. I liked ReLive, but fuck rtg and their incompetence.

Attached: 1542096170576.png (623x808, 418K)

kek

Attached: rx590_performance-per-watt_1920-1080.png (500x970, 52K)

144hz makes sense, but 240 is overkill, especially when you consider the performance requirements for running any other game at such high frame rates. I guess you could just buy it for League, but that's such a waste for only one game. Though it could also apply to the crapshit-go audience, since they usually play at an excessive 300+ fps.