Radeon what the fuck are you doing?

Radeon what the fuck are you doing?
>consumes more power than a 1070 with the performance of a 1060

Attached: RX 590 POWER CONSUMPTION.png (744x838, 63K)

Nobody cares about power consumption except Nvidia marketers.

And the cost of a 1050. Kill yourself faggot

>And the cost of a 1050. Kill yourself faggot
Are you fucking high? Here in Australia they cost more than a 1060

the whole world doesn't revolve around your shitty prison island

>Implying anyone cares about strada cunts.
KEK
E
K

>non-qwerty mongoloid
>talking shit about anyone else

Where do you live? In Germany the cheapest one is 260 euros while the cheapest 1050ti is 140 euros.

If you live in your mom's basement and she pays the power bill, of course you don't care. Any remotely intelligent person takes into account the long-term costs of a graphics card (power consumption) on top of the immediate costs (actually buying the card). People who think immediate costs are vastly more important tend to be poorfags, and therefore either children or simply stupid. Unsurprisingly, most amdrones only look at the immediate costs. Retards are utterly terrified of things that cost more upfront but save you money in the long term.

The AMD shilling on Jow Forums is absolutely unreal. Are they paid?
>AMDtards on cpus
>lmao intel housefires using 1000W
>AMDtards on gpus
>n-no one cares about power consumption

And then you have the blatant lies that a 1050ti is as expensive as a 590 lol. You can get a 6gb 1060 for cheaper than a 590 in most of the western world.

Ah yes, the massive savings, because I'm running my card at 100% load 24/7, every month... Not.

Cheaper amd gpu plus a much cheaper freesync monitor vastly offsets the increased energy cost. It's barely more power usage considering gaming is not something you do at all times of the day.
Single-digit savings at best.

The net present value of your initial saving of a few dollars will be higher than that of the electricity savings spread across the next three years, even if the nominal amounts were exactly equal.
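To illustrate (numbers purely made up for the example): $30 saved on the purchase today versus $10 a year of electricity savings for three years at a 5% discount rate works out to 10/1.05 + 10/1.05^2 + 10/1.05^3 ≈ 9.52 + 9.07 + 8.64 ≈ $27.23, which is worth less than the $30 in hand now even though both sum to $30 nominally.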

Not a chance! I bought a new rx580 8gb for $30 less than the cheapest 1060 6gb

Incorrect. When I had to decide between a 670 and an equivalent AMD card, I would've spent an extra 50-60 bucks with the AMD card over the next 3 years, simply because those cards had garbage efficiency when running multiple monitors: 3 times the power consumption when simply idling. But math is too hard for amdrones anyway. And so is taking into account various usage scenarios.
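Ballpark, with illustrative numbers: an extra ~35 W at idle for 8 hours a day is 35 × 8 × 365 ≈ 102 kWh a year, which at $0.15/kWh is roughly $15 a year, or about $45-50 over three years.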

you do know amd has a hardware scheduler right?

There is nothing incorrect in my message. Turns out math is too hard for you.

Oh you AMD shills.
>intel has high power consumption
>rant about it for weeks

>AMD high power consumption
>nah doesnt matter

Why are AMD GPUs so power hungry anyway?

I'll call your bluff: tell me exactly how much more you would pay on your electricity bill in a year by choosing that GPU instead. Also tell me how much each of the same-tier GPUs costs for you. You sound so smug, so you DO have the numbers, right? You aren't just shitposting?

It is indeed a bit of a double standard, but people were mostly memeing about that gigantic industrial fridge that Intel dared to show while presenting their newest product, as if it requires that abomination to run at all. I don't think people care about their desktop CPU temps.

Fuck all you niggers cause I bought a 1080 for $380.

undervolt.
my rx 570 runs at 1.02 V, 1400 MHz.
massive headroom.

It's actually like 12% faster than a 1060.
But yeah, polaris is clearly being pushed to its limit. It scales poorly with power delivery. That's why most people undervolt polaris (and vega).
they aren't always / weren't always, but recently it's because their architecture doesn't scale any higher, and even with the maximum number of shader processors and such, they need to push clocks to insane levels to match nvidia equivalents. the RX 480 is around where polaris "should" be, and it uses way less power and runs way cooler than a 580 and "590" (680? radeon RX580ti*?) - the benefits of the 12nm process are wasted because they're pushing it harder.
it would be more efficient if it simply matched a 480, but they push it so it is the "higher throughput" card in its bracket. imho it's not worth a buy. save the extra ~50-100 and get a 1070ti or vega56, OR just wait for navi.

lol
are all correct
see pic related

Attached: Math Is Hard.jpg (1200x1600, 140K)

This is nothing
head over to /pcbg/, the resident namefag shill is like a special attraction in some zoo. He has to be seen to be believed

there is still the shitty temps on Radeon cards
>OR just wait for navi.
they confirmed Navi will be midrange

yes? and? it would be a much better midrange purchase than a 580/590/1060, probably a better purchase than a 1070/vega56.

based Intel retard
The memeing about the 9900k is about the fact that it's way too hot, the power consumption is just icing on the cake

>namefag
>retarded
i dont need to see it

>there is still the shitty temps on Radeon card
Be careful not to strain your back with all that goalpost moving

Vega is such a shitty naming scheme, what's wrong with AMD?

>there is still the shitty temps on Radeon cards
techpowerup.com/reviews/Sapphire/Radeon_RX_590_Nitro_Plus/36.html
??? seems pretty reasonable for a card sucking over 300w of power
That Asus 590 with 3 fans probably runs at 60C but who the fuck spends that much on a mid-tier card.

How's your mom's basement m8?

>I don't think people care about their desktop CPU temps.
I don't know. Those recent 8 core CPUs that were blue screening on benches with a NH-D15 due to temps were pretty bad.

amd makes their cards to handle FAR more than they push through them; nvidia had to discontinue the 2080ti because they were catching fire.

AMD gpus are shit, amd cpus are good and zen 2 appears to be great.

amd has done a piss-poor job at binning ever since gcn came out.

SO MANY cards can be undervolted and overclocked to some stupid extent, but amd does none of this in house; instead they just blanket everything at the settings of the most retarded gpu they are willing to sell.

Imagine a world where amd did this

amd 400 and amd 500 lines happened at the same time

the 400 line would be just as powerful as it actually was, but the 500 line would be where all the binned gpus went. so you'd get a 480 that gets 100 fps and uses 200 watts,
a 580lp that gets 100 fps and possibly 120 watts,
and a 580 that gets 120-150 fps and uses 200 watts

possibly a switch on the gpu to go in between the two base settings.

If I was amd, I would invest some money into an auto-oc application: something that, if an oc failed, would restart and try again at a lower oc. you have it run during the night, and by morning you have the most stable oc the gpu can handle; more time just makes sure the gpu stays stable longer. it would go through the clock range at its normal voltage, and then it would drop the voltage and find other stable points.

Hell, it may be possible to have on-hardware oc logic, something that will monitor voltages and know if the gpu is heading toward a crash and adjust accordingly, or if it's not preventable, at least know why it failed and fix itself. this would be the best thing amd could do, because a properly undervolted and overclocked 56 competes favorably with a 1080, while at stock it competes unfavorably with a 1070
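Rough sketch of that overnight search loop, purely illustrative (the stress test and the gpu knobs here are faked stand-ins, not any real driver API):
import random
def is_stable(clock_mhz, millivolts):
    # stand-in for one stress-test pass: this fake gpu needs more voltage
    # the higher the clock, plus a little run-to-run noise
    required_mv = 700 + (clock_mhz - 1100) * 0.55
    return millivolts >= required_mv + random.uniform(-5, 5)
def find_stable_clock(millivolts, start_mhz=1600, step=10, passes=3):
    # walk the clock down until several consecutive runs pass,
    # i.e. "if an oc failed it would restart and try at a lower oc"
    clock = start_mhz
    while clock > 1100:
        if all(is_stable(clock, millivolts) for _ in range(passes)):
            break
        clock -= step
    return clock
# sweep the voltage down overnight and log the best stable clock at each step
for mv in range(1150, 899, -50):
    print(mv, "mV ->", find_stable_clock(mv), "MHz")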

navi confirmed mid range or navi speculated to be mid range?

because honestly, when the 480 came out it was very desirable: it did everything it needed to, and it's still able to keep up. if the 2060 lands in the 1070 performance range, and that's where the new midrange is, and amd puts their gpu there (a 30% uplift minimum over a 580), then potentially we could get a 1070-performing gpu in the $200 range, and with vega 56 we would be getting 1080ti/titanxp performance for around $400

And this is if they do nothing but a die shrink.
I'm perfectly ok with this performance, seeing that the 2070, 2080 and 2080ti were massively overkill for anything shy of 4k, and even then unable to do rt without shitting the bed (I have little hope, but there is some chance that dice fucked the implementation up badly, as everything they did was in house and rtx is a translation wrapper for them) while costing... wew, $600

amd could fairly easily shit on nvidia's price to performance, fairly hard in fact, if they just put out a vega die shrink. the last time they did this we got the 4850 and 4870. amd could potentially make a short-term comeback, though I'm highly skeptical they can, due to gamers being retarded.

Honestly, dollar for dollar, it scales in line with the best price/performance gpus. If I was in the market for a new gpu, I would also consider the game bundle:
Devil May Cry 5, Resident Evil 2, and The Division 2
2 of these games I would love to play, and if I can't sell the third, I would just look at it as a new gpu getting some new games to play on it, like I usually do. so I can write $120 of the $280 gpu off completely, and if division 2 is good, possibly $180 off, bringing the gpu to a perceived value of $180 or $120

honestly, I really do like it for its price; if it was just 11-12% faster than what I currently have, I would likely buy it.

amd gpus are usually at or better than nvidia in price/performance; however, since the 900 series nvidia has held the best-gpu title outside of some pro applications.

>Radeon 590 FatGoy

The 580 performs the same as a 1060, the 590 is about 12% better. The power consumption and price of the 590 are bullshit, though. Terrible value.

Wrong answer

so it's ok when intel does it? that's how it works for you?

At least when intel does it, you get something in return (good performance)

When AMD does it, you get nothing but shit.

>intel
>good performance

Attached: 8400 btfo.png (1275x714, 348K)

wtf

Intel is shit and shills have a vested interest in making ryzen look bad (literally, most of them are intel investors) who would have thought?

Attached: 9900khousefire.png (1112x833, 74K)

That sucks, I've had an i3-7100 for a while now but was reluctant to upgrade since I don't want to buy a water cooler. Pretty impressed how much AMD has improved their stuff, might switch to them if zen 2 beats intel.

yeah, I don't know how people make a big deal out of power consumption. I rent my own apartment and it took me two months to get my new computer together, so for those two months I only had a shitty apple MacBook air.

once I got my new computer built, my electric bill went up a whole $4 more a month. the fucking washer, dryer, and stove are the only things that truly give the biggest increase on my bill, not my computer, which draws maybe 500 watts at most. the average is far less than that when gaming.

vega 64 and 2700x.
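Rough sanity check with made-up but plausible numbers: a 300 W average draw for 3 hours of gaming a day is about 27 kWh a month, which at $0.12/kWh is a bit over $3, so a ~$4 bump is about what you'd expect.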

I wish a new global player would step up in the GPU game, AMD GPUs are pure trash.

Samsung we need you

>complains about amd gpus being trash
>goes on crying for Samsung to make gpus
yes, because we all want gpus that spontaneously catch on fire like Samsung microwaves, top-loader washers whose drums literally pop off and explode, front-loader washers that leak like crazy, more than your typical front loader, cell phones whose batteries legit explode, and fridges plagued by too many stupid problems to list. oh, and can't forget about their stupid $800 robot vacuum with cameras that shits itself left and right. failing batteries, retarded robot ai, etc.

Samsung might be able to make ram, but that's as far as they can go.

yes, lets have Samsung make a gpu.

Yes it's pretty bad but you can always increase the efficiency (sometimes a lot) by undervolting.

I've got a Vega 64 BIOS flashed onto my Vega 56, with the core clock set to 1380 MHz at ~835 mV and the HBM2 at 1050 MHz. This gives me a 17% increase over the stock settings at 1440p, while the GPU-Z reported power consumption is ~135 watts (real power consumption is around 170 watts).
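Assuming a reference Vega 56 board power of roughly 210 W (purely for a ballpark), that's about 1.17/170 vs 1.00/210 in performance per watt, i.e. somewhere around 45% better efficiency from the flash and undervolt alone.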

pic related

Attached: witcher 3 novigrad benchmarks - ultra settings - no hairworks - 1440p.png (2560x1440, 700K)

this is with RAM that costs more than the cpu itself (and there's no guarantee it will work at high speeds on budget mobos) + an aggressive 4.1ghz overclock which requires an aftermarket cooler (the wraith stealth will throttle).
in the end the ryzen build WILL perform better but only when you put way more $$ into it, killing the whole "muh price/perf" argument.
t. 2600 owner

>implying diminishing returns don't start at 2933MHz
>implying the AM4 socket isn't better to invest into
>implying a QVL wont guarantee that it WILL work
T. 1600 owner, fuck off retard.

>this is with RAM that costs more than the cpu itself
unless you are buying a workstation with a $1000 cpu, the ram is always more expensive than the cpu. welcome to 2018 dram price fixing

Holy shit, this convinced me AMD sucks right now.

I hope their new lineup stops being so power hungry, because that extra wattage adds up!

No, it does not. See

>The AMD shilling on Jow Forums is absolutely unreal. Are they paid?

they are '''paid''' in store credit towards a GPU. all of the most vocal shills here and on plebbit are redteam plus marketers. AMD's marketing really is pathetic

Just install some LEDs.

Attached: Capture+_2018-11-06-20-58-07.png (887x468, 273K)

Not really kiddo
Memory is still cheaper than the CPU unless you are going for a budget model.
There isn't DRAM price fixing; the problem is that the memory cartel shifted their production towards flash, not UDIMMs. There was an unexpected demand for UDIMM DDR4 when people were upgrading their Sandy Bridge-Ivy Bridge rigs en masse to Ryzen-Coffee Lake.
People got too spoiled by bargain-basement prices when demand for UDIMMs was practically non-existent and there was massive supply everywhere (DDR2-DDR3)

>Nobody cares about power consumption
Thanks for the heads up, Intel.

hey look the intelfag is projecting again

Attached: intel persuasion kit.jpg (2083x1405, 462K)

>power consumption doesn't matter
>latency doesn't matter
>high-end gpu doesn't matter
>1440p and 4k doesn't matter
>memory with high clock doesn't matter

>primitive shaders doesn't matter

Radeon graphics are dead

Complete failure of a division

Intel will come save us all when they release their discrete GPUs in 2020

AMD fans are no life losers who cling to AMD fanboyism in order to give their useless lives meaning

>overclocking is cheating! even though they're the same price!

they package hbm
hbm will be important soon
I was hyped for AMD GPUs since Fury X
they do stuff the same way I tend to, ignoring the reality of present constraints in favor of heading in an ontological direction.

Lisa Su sketches me out a little bit, but she's a sweet little Asian lady who does computers.
AMD is right not to rush their products into production before they're mature.

I tend to play one MMO at a time and don't enjoy single player games or FPS as much as I did in the past.
ESO runs smoothly enough on my 580.

the next evolution in 3d software is already well in the works and previous generations of graphics cards have been predominantly gambits in the market, not much at stake technologically.
the architectures from both NVIDIA and AMD are very mature in the current releases.
dropping to 7nm will do AMD a lot of good. I wouldn't even worry that they don't produce cards to drive the latest titles beyond their limits; it's not their cue to enter yet. NVIDIA is the clear choice at the 14nm node if you don't have budget constraints or aren't expecting an impending revolution.
if you want to invest in freesync and you're content letting the newest AAA titles run on 1080p high, maybe AMD is enticing enough to hold you over until
>the next thing

Attached: 1489869107948.jpg (500x452, 13K)

>t. seething nvidiot butthurt about AyyMD wearing the midrange crown

>after 3 years

>Intel will come save us all when they release their discrete GPUs in 2020
Ah yes, because hiring Raja into an environment where no one else knows enough about discrete GPUs to realize when he starts blowing smoke up everyone's ass about magic features is *sure* to turn out well for Intel :^)

>implying diminishing returns don't start at 2933MHz
except the 2600 won't outperform the 8400 at 2933MHz; that mostly happens when it's paired with b-die ram.
>implying the AM4 socket isn't better to invest into
we're discussing performance, not what's a better buy you nigger
>implying a QVL wont guarantee that it WILL work
actually it doesn't lmao

>the ram is always more expensive than the cpu
are you blind? 2600 is $160 while 16gb of 3200mhz ram is $115. meanwhile b-die costs about $180

Yikes. Didn't know my 390x uses as much power as a Vega 64 lmao

This. This post right here. This is why I no longer take this board seriously. Fucking nailed it. this is all I see here anymore, it's nuts.

Attached: IMG_20181110_213637_016.jpg (556x556, 116K)

7nm fucking when? Vega 20 looks awesome
Polaris is shit
No big polaris, and even what they have is shit tier, so what do they do? Respin the 480 3x
I like amd but Vega 10 and small polaris suck ass

Are you kidding? The 7nm Vega looks awful unless its TDP is 150w or below. It's a paltry clock speed increase over the current Vega 64; likely only a few things were fixed or changed.

Polaris was made to be cheap, not to be high performance. AMD designed a cheap, simple layout and pushed clocks high to make up for it. That's why the ~150w RX 480 turned into the 225w RX 590. Even the original RX 480 was a factory-OC card. Its 32 ROPs meant it needed high clocks to have acceptable pixel throughput or it would cripple frame rates, and at the time the 480 was the highest-clocked GCN GPU around. GCN still has an 8-stage pipeline; it isn't made for high clocks. It's a total miracle that AMD managed to push it this high, breaking 1500/1600mhz in some cases with the latest Polaris refresh and good Vega dies.
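Back-of-the-envelope, going by reference boost clocks and assuming I have the ROP counts right: 32 ROPs × ~1.27 GHz is roughly 40 Gpixel/s of fill rate on the 480, while a 1060 with 48 ROPs at ~1.7 GHz boost sits around 82 Gpixel/s, which is why Polaris has to lean so hard on clock speed to keep pixel throughput from becoming the bottleneck.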