Why do AMD graphics cards use ~2x the wattage of equivalent Nvidia cards? Please explain it like I'm an idiot.

Attached: Radeon-RX-480-57793fe73df78cb62c3cf053.jpg (768x522, 46K)

They are poorly made

They are poorly designed

They didn't get made so good

GCN isn't optimized for power usage.

s/GCN isn't optimized for power usage./GCN isn't optimized./

Attached: 1483515669909.jpg (854x640, 318K)

I'm looking for a more specific explanation. I've read before that they typically have higher compute and even better perf/W in compute, but Nvidia compensates very strongly with software solutions. Is this true?

>7nm
>still consumes more power than equivalent nvidia card

the absolut vodka state of ayyyyymd design

Polaris was a trash design and the guy who governed its conception is thankfully gone, but his impact will last till at least Navi.

They don't.

A Vega 56 gets roughly the same performance in games as a 1080 while drawing about 20W more than the 1080's 180W, for example.

The RX 570 and 590 cards are also regarded as highly efficient in terms of watts per FPS.
Just another dumb meme.

Attached: 1550929736341.jpg (490x480, 23K)
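Napkin math on that perf-per-watt claim, with purely illustrative numbers (the FPS figure and board powers below are assumptions, not measurements):

[code]
# Back-of-the-envelope FPS-per-watt comparison.
# All numbers are illustrative assumptions, not measured results.
cards = {
    "GTX 1080": {"fps": 100, "watts": 180},
    "Vega 56":  {"fps": 100, "watts": 200},  # ~20W more at similar FPS
}

for name, c in cards.items():
    print(f"{name}: {c['fps'] / c['watts']:.3f} FPS per watt")

# GTX 1080: 0.556 FPS per watt
# Vega 56: 0.500 FPS per watt  -> roughly 10% worse, nowhere near "2x the wattage"
[/code]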

They don't, this is just cringey bait from a novideo shill.

why are all the rx cards so hot and noisy then? just cheap cooling?

You get what you pay for.

Agree, except the 590.

>AYYMD
>compute

Please stop that meme, it's long dead

Attached: AYYMD.png (630x173, 14K)

Depends on the manufacturer of the card and what UV / OC you run them at.

>Agree, except the 590.
Well yeah, out of the box it isn't the most efficient card.

nvidia has generally had the better cards/arch the past few years, which means they can spend more time on optimisations, which also generally means less wattage
if amd had more competitive offerings, nvidia wouldn't be able to get away with minor power-saving tweaks and an extra $200 premium over last gen, and would instead be putting out housefire-tier cards too

Attached: NVIDIAnews.jpg (676x437, 35K)

kelelek ecksdee

Attached: power_peak.png (500x770, 46K)

OH NO NO NO NO NO NO NO NO NO NO NO

AHAHAHAHAHAHAHAHAHAHAHAHAHAHAHA

Attached: performance-per-watt_1920-1080.png (500x810, 44K)

Never mentioned the VII. It's a shitty paper launch. AMD does the same dumb shit that Nvidia does.
Stop being a fanboy of a company; instead, look at hardware in terms of power/bang/buck.

it all checks out

Attached: 1551087746481.png (500x810, 171K)

oh no, nvidia isn't putting 4K amounts of VRAM on 1080p/1440p GPUs

As I said in a previous thread, 29/30 titles are GameWorks, plus Doom.
Take your shill benchmarks elsewhere.

So what? AMD has the same option to gimp games on competitor cards by offering developers their own team of graphics-specific devs to work on their game. This is not a real excuse.

>BUTTMAD AYYMDPOORFAGS WITH NO GPUS AND NO DRIVERS DETECTED

>no one mentions the hardware scheduler
Oy vey, does no one remember the great Fermi housefires anymore?

according to the minimum and recommended requirements of current cards they objectively do. feel free to post evidence to the contrary
>Depends on the manufacturer
not really. the only 580 that isn't a leafblower is the nitro+ and that's still just acceptable under load, not quiet

did they eventually fix the Radeon VII OC and other driver issues GN were reporting?

Because they are allowed to.

The leather jacket master doesn't allow his nvidiots to overclock his GPUs; AMD users are permitted to use any amount of power they see fit, while nvidiots have to settle for what the master allows them.

go buy amd gpus from back then if you're stuck in the past

>nitro+
Both Strix and Gaming X had better heatsinks than Nitro+ on RX480, did they change it that much with 580?

I don't get where this meme comes from. I've had 2 RX 580's so far. One was Gigabyte's 8GB and the other an Aorus 8GB. Neither of them made much sound, even when playing games or doing something that put them under load. People must be buying used cards and attributing the previous owner's abuse to the card or something.

>Both Strix and Gaming X had better heatsinks than Nitro+ on RX480, did they change it that much with 580?

Nitro+ 480 shipped with a merely adequate cooler, which is uncharacteristic of Sapphire, who hold the reputation of being the very best.
With the 580 they corrected it.

>tfw someone with elementary school level of reading comprehension replies to your post
OP asked why AMD cards suck up more juice, I mentioned one of the reasons, and illustrated it with the last mainstream offering from Nvidia that used similar components and had similar problems.

All AMD cards from the last few years have had horrible power settings out of the box.
Vega cards, for example, could eat up 300W when you could get away with 200W while only losing a few FPS.

AMD cards are usually the cheap ones that aren't good competitors on the market, but good for people who know how to use them properly without having to pay for the extra performance.

>not really. the only 580 that isn't a leafblower is the nitro+ and that's still just acceptable under load, not quiet
RX 580 is an example of a bad card, it's just a factory-OC'd RX 480. Both the 570 and 590 are better in those terms. As stated before, AMD is just another company on the market; they aren't some magical unicorn and neither is Nvidia. Fanboys are just dumb.

>Meanwhile, taking a look at power efficiency, it’s interesting to note that for the GTX 1660 Ti NVIDIA has been able to hold the line on power consumption: performance has gone up versus the GTX 1060 6GB, but card power consumption hasn’t. Thanks to this, the GTX 1660 Ti is not just 36% faster, it’s 36% more efficient as well. The other Turing cards have seen their own efficiency gains as well, but with their TDPs all drifting up, this is the largest (and purest) efficiency gain we’ve seen to date, and probably the best metric thus far for evaluating Turing’s power efficiency against Pascal’s.

THANK YOU BASED NVIDIA
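The 36%/36% line checks out: if power stays flat, perf-per-watt scales directly with performance. A minimal sketch, with a made-up FPS baseline (only the +36% and the flat power draw come from the quote):

[code]
# Perf up 36% at unchanged power => efficiency up 36%.
# The 100 FPS baseline is an arbitrary assumption.
fps_1060, watts_1060 = 100.0, 120.0
fps_1660ti, watts_1660ti = 136.0, 120.0   # +36% perf, same board power

gain = (fps_1660ti / watts_1660ti) / (fps_1060 / watts_1060) - 1
print(f"efficiency gain: {gain:.0%}")   # prints: efficiency gain: 36%
[/code]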

Well, if you reduce the cross-sectional area of a conductor the resistance increases, so...
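For reference, wire resistance goes as R = rho * L / A, so halving the cross-section doubles the resistance. A toy calculation with completely made-up dimensions, just to show the relation (real interconnect geometry is nothing like this):

[code]
# R = rho * L / A: resistance is inversely proportional to cross-sectional area.
# rho, L and A are arbitrary example values, not real interconnect numbers.
rho = 1.7e-8   # resistivity in ohm*m, roughly copper
L = 1e-3       # wire length in metres

for A in (1e-12, 0.5e-12):   # cross-sectional area in m^2
    print(f"A = {A:.1e} m^2 -> R = {rho * L / A:.1f} ohm")

# A = 1.0e-12 m^2 -> R = 17.0 ohm
# A = 5.0e-13 m^2 -> R = 34.0 ohm
[/code]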

This too. It seems like AMD has more variance between chips, and doesn't bin them particularly aggressively. My 480 Nitro+ came with a stock voltage of 1.175V, while it was stable at stock clocks with 1.053V. Shitload of wasted power.
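Dynamic power scales roughly with the square of voltage at a fixed clock, so that undervolt is worth a lot. Rough sketch below; the V^2 rule is a simplification and the 160W core power figure is an assumption (leakage ignored):

[code]
# Dynamic power ~ V^2 * f: same clock, lower voltage => quadratic savings.
v_stock, v_undervolt = 1.175, 1.053
scale = (v_undervolt / v_stock) ** 2

print(f"~{1 - scale:.0%} less dynamic power")                  # ~20% less
print(f"e.g. a 160W GPU core drops to ~{160 * scale:.0f}W")    # assumed 160W core power
[/code]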

>tfw bought a V56 half a year ago for the same price that a 1660 Ti is now, still in warranty
>tfw better performance than the 1660 Ti
>tfw only marginally higher power usage with a UV
>tfw had a FreeSync monitor anyways before Nvidia even allowed FreeSync that apparently doesn't even work with Nvidia even now
I'm glad I went the route I went at the time, the deal is still paying off

Attached: 1551079232075.png (446x422, 403K)

Also... *cough* 8GB *cough*

>Still posting this lie

Pooga56 is not the same price as GTX 1660 Ti, it is $400 and no one is buying that garbage at that price

That lie has been debunked by many reviewers

youtube.com/watch?v=kSqZ0IxMuCQ

Best part is r/AMD imploded again when they saw the GTX 1660 Ti benchmarks

Salty everywhere

Reading comprehension?
>still in warranty
Meaning it was used.

Open up any sales site: even higher-end Vega 56's like Sapphire's go in the low 300's, down to even the 280's. And they are pretty much all still in warranty, since even the lowest-end warranties are longer than the card has even been out.

I can buy a brand new shitty blower Vega 56 for €275 right now. If you can't find deals for new parts or good used parts, you shouldn't be on Jow Forums.

Attached: 1541076562434.jpg (1000x1230, 759K)

>Posts an argument against the Vega 56 for why the 1660 is better
>Posts a 1660 benchmark video that says you should go for a Vega 56 if you can find a good bargain on one
No no no no... Nvidiots can't be this dumb?!

Attached: 1529518833923.jpg (362x346, 38K)

they don't in terms of TFLOPS (think of it as the raw strength of a GPU). it isn't AMD's fault that huge portions of the tech industry literally collude with nvidia and develop software that runs better on nvidia cards.
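For reference, FP32 TFLOPS is just shader count x 2 ops per clock (FMA) x clock speed. Quick sketch; the boost clocks are ballpark assumptions and vary by board:

[code]
# Peak FP32 throughput = shaders * 2 (fused multiply-add) * clock.
# Clock figures are rough boost-clock assumptions, not guaranteed values.
def tflops(shaders, clock_ghz):
    return shaders * 2 * clock_ghz / 1000.0

print(f"Vega 56 : {tflops(3584, 1.47):.1f} TFLOPS")   # ~10.5
print(f"GTX 1070: {tflops(1920, 1.68):.1f} TFLOPS")   # ~6.5
[/code]

The gap between that raw throughput and actual game results is basically what the rest of the thread is arguing about.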

GCN is power hungry and has lazy power management, plus AMD have been factory OC'ing and over-volting their chips since 2012 to keep up with Nvidia's newer, faster, more efficient archs, mainly Pascal and Turing.
Graphics Core Next is fucking ancient; they say it launched on the 7 series back in 2011, but in reality it's much older, with roots going back to the HD 5xxx era and being tweaked a lot since.
Weird, as pre-GCN AMD/ATI cards sipped power and mostly kept up with or creamed Nvidia, so I dunno why the fuck they are still making such garbage.
Vega 2 / Radeon VII was a fucking abortion: uses 30%+ more power and is 30% slower even on 7nm, while being 30%+ more expensive, with the real kicker being that it's next to impossible to buy even though the mining boom is gone.
And binned.
I've had my fair share of AMD cards and the only one I liked was my 6970 2GB; it was cool and ran great from 2010-2015, and most importantly it was fucking cheap, like 200USD new.
Try finding a mid-to-high-end card for that price now; they are all 400+ for low-end 6-series-tier shit.

>huge portions of the tech industry literally colludes with nvidia and develops software that runs better on nvidia cards

Probably has something to do with the fact that Nvidia develops drivers that improve game performance and AMD doesn't.

they need extra power to run on linux, unlike nvidia which has no linux drivers

t. linux user

not him but AMD have a very small team in comparison who likely focus on console optimisation

AMD drivers are slow to mature; Vega literally took over a year, with each driver release improving performance.
Nvidia has GameWorks that benefits Nvidia cards, which is the only real difference in benchmarks between AMD and Nvidia in games where the cards should perform similarly.

AMD's optimization is basically fixing broken directx and vulkan calls.

Nvidia provides rewritten proprietary replacements for DirectX/OpenGL calls that are usually way faster; it isn't that AMD drivers are slow, they just suck, period. The crap that AMD spends time "optimizing" already works perfectly on Nvidia.

when your "new" vega 56 blows up and you ask for warranty from the now ghostship youll regret it

The retailer deals with the warranty on these things, a retailer that has existed here for decades and has highly positive feedback, including from me. It's an MSI card too; never had problems with their motherboard warranty.

Looks like cope to me.

But AMDs Vulkan implementation is better than Nvidia's though.

>it isn't that AMD drivers are slow, they just suck period.
But I haven't had a single problem with them, not after AMD acquired ATi at least.
Their drivers are just slow to catch up, Vega cards have gotten better and better with each driver update, compared to launch. Yes AMDs biggest problem is their drivers, not the GPUs themselves, but compared to what they used to be, they are "ok" now, otherwise we wouldn't be able to say that they "used to suck".
This is until Navi comes out and all this shit repeats again thanks to AMDs driver team being shit like that.

No it isn't. Nvidia implements the spec far more quickly and on a wider range of card models.

Vulkan is literally Mantle with a new name, AMD GPUs have always had better performance in Vulkan.
Just look at any title that scores the same between an AMD and an Nvidia card in OGL or DX, and then compare the same cards in the same game's Vulkan benchmark.

Oh yeah, explains why it's so broken on older Nvidia cards while working fine on AMD ones from the same period.

They don't. AMD just gives you the option in the drivers to turn up the power target to 150%, giving the crazy results reviewers show.
In actuality 110% power target is all that you need to reach peak clocks on the majority of cards, which on Vega translates to about 220W under heavy load, not 300.
But nvidia shills and reviewers will say that a 20W difference is massive because it suits their narrative.
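Putting rough numbers on that (the ~210W reference board power and the percentages are assumptions based on the post above; actual draw depends on load and silicon):

[code]
# The power-target slider scales the board power limit.
# 210W reference board power for Vega 56 is an assumption for illustration.
base_power = 210

for target in (1.00, 1.10, 1.50):
    print(f"power target {target:.0%}: ~{base_power * target:.0f}W limit")

# power target 100%: ~210W limit
# power target 110%: ~231W limit
# power target 150%: ~315W limit
[/code]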

I used to know someone that used a 980ti and it had the loudest fucking coil whine ever. Loading any game or anything graphically intensive caused this high pitched sound to be generated and he even returned it for another one that also had coil whine. Fucking joke.

>Oh yeah, explains why it's so broken on older Nvidia cards while working fine on AMD ones from the same period.

I remember it a bit differently.

I.E. Games with Vulkan engines not working on AMD at all while Nvidia showed off DOOM running on Vulkan.

I'm pretty confident that Nvidia's support is better than AMD's; they have an ultra-long history of always being the first to implement basically everything.

Yes, fixed last week.

>I.E. Games with Vulkan engines not working on AMD at all while Nvidia showed off DOOM running on Vulkan.
What you remember is that AMD drivers broke Doom's bad implementation and it took a patch from id to fix it; it wasn't AMD's fault. Do you mean that one?

>I'm pretty confident that Nvidia's support is better than AMD's they have an ultra long history of always being the first to implement basically everything.
Yeah, except Vulkan as mentioned is just a new Mantle; AMD was the one who initiated the Vulkan program and gave Mantle to be worked into Vulkan. Nvidia couldn't even have supported it faster, since AMD already supported it before it was Vulkan.

These days both companies' GPUs from 2012 onwards support it.

>AMD
>4K
pick one

>novideo
>4k
Pick one

>1660 Ti 6GB (120W)
>Vega 56 6+2GB (Undervolted) (200W)
They need moar power for that extra 2GB

>fury x 4gb
>now vram matters

This stupid VRAM meme should stop. GCN is outdated

Most Nvidia cards whine really badly, since most of the partner boards are horribly made.

>>Still posting this lie
DELUSIONAL

overclockers.co.uk/msi-radeon-rx-vega-56-air-boost-oc-8gb-hbm2-pci-express-graphics-card-gx-343-ms.html

HAHAHAHAHAHAHA

Attached: 1550897447512.jpg (443x550, 75K)

Moar VRAM
>pcgamer.com/dooms-nightmare-graphics-image-quality-compared-and-benchmarked/
>And then there's the Ultra + Nightmare quality. It may not look substantially different, but at least it doesn't tank performance too badly...except for the Fury X. The 980 Ti loses about 1-2 percent in performance is all, the 980 and 390 drop about 3 percent, but the Fury X takes a 30 percent hit. Maybe we'll see a 16.5.2.2 hotfix, as it doesn't seem to be a problem for the 4GB 980 card. Outside of the Fury X (and Fury and Nano, if you're wondering), Nightmare mode isn't a problem, at least on cards that have a chance of running such settings in the first place; that's good news of a sort, but there's bad news as well
>it doesn't seem to be a problem for the 4GB 980 card
>it doesn't seem to be a problem for the 4GB 980 card
>it doesn't seem to be a problem for the 4GB 980 card

Attached: Doom's 'Nightmare' graphics.png (650x433, 108K)

>Moar VRAM
Indeed, even my 8GB card at ultrawide 1080p easily eats over 7000MB playing Far Cry 5.

correction: at 1.2 resolution scale

Poorly designed pieces of shit. I work in IT and also support our customers when they have issues with the programs we sell (it's all high-end engineering/design/architecture software), and 99% of the time when they have weird graphical glitches they're using one of these pieces of shit AMD cards. The other 1% is because they're using ancient Intel iGPUs on laptops from 10 years ago.

I also had issues the only time I fell for the AMD meme: I had the HD 4870, and it's the only time I've had to update the fucking BIOS on a GPU (and I've been using computers for nearly 20 years) to stop it crashing my entire PC while I played Crysis. By the way, I remember I asked here and some Jow Forumsentleman pointed me to the BIOS update because he had the same issue.

Tl;dr fuck AMD and all of their shitty products. Nvidia/Intel may overcharge, but at the very least their shit works.

Nice claims, anon. Got any proof?

Buy the RX 570 8GB sir 8GB good

Attached: download.jpg (480x370, 37K)

HOW WILL AMDRONES EVER RECOVER

>www.overclockers.co.uk

Attached: yikes.jpg (480x270, 40K)

I agree about the V56 statement but
>590
kek, that trash out of the box uses more power than a V56 because they keep cranking up the Vcore.
470 and 480 are the peak polaris cards, 580 and 590 are better only when downclocked and downvolted

I'm still mad

fuck amd fuck nvidia fuck intel

I member

Attached: lqk63so7d6ly.jpg (676x437, 24K)

Attached: amdrones.png (1118x214, 26K)

But AMD has the most power efficient x86 CPUs.

This isn't even true.

Attached: efficiency-singlethread.png (500x1410, 81K)

Based.

1) AMD allocates execution resources the way they think things should be (heavy on pixel/compute shaders, light on everything else)
2) Nvidia allocates things the way they want (very heavy on geometry, fairly heavy on ROPs)
3) Nvidia gives game devs middleware to make tons of tiny pixels with tessellated hair or whatever
4) AMD has to factory overclock every card they sell to compensate for fixed-function bottlenecking
5) HOUSEFIRES

special bonus rounds:
6) AMD never increases the raw geometry throughput of GCN beyond 4 triangles per clock, even though the trend of ever-heavier geometry loads has not and will not stop (rough numbers sketched below)
7) AMD tried to do a geometry pre-filter but couldn't get it to work in legacy DX/OGL/Vulkan APIs
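To put rough numbers on point 6 (clock figures are ballpark assumptions, and this only covers the fixed 4-triangle front end, not culling or shader-based workarounds):

[code]
# GCN front-end geometry throughput = 4 triangles per clock * core clock.
# Core clocks below are approximate values, assumed purely for illustration.
for name, clock_ghz in (("Tahiti (HD 7970)", 0.925), ("Vega 64", 1.55)):
    print(f"{name}: ~{4 * clock_ghz:.1f} Gtris/s peak setup rate")

# Tahiti (HD 7970): ~3.7 Gtris/s peak setup rate
# Vega 64: ~6.2 Gtris/s peak setup rate
# The per-clock ceiling never moved; only higher clocks move the absolute number.
[/code]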

>single threaded
COPE

99.9% of all applications are still single-threaded.

GCN is old, badly optimized for power usage and has some serious bottlenecks that tank the performance.

But they literally are, it's performance per watt we are talking about. Overall performance, not just single core.

Like what? Serious question btw. Browsers and games are all multi-threaded, no game runs on a single thread anymore.
For workstation use like rendering and en/decoding, it's all multi-core.

if your architecture is inferior to your competitor's, you need to ramp up the frequency to stay competitive, which leads to increased power consumption.
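The usual rule of thumb is dynamic power ~ C * V^2 * f, and chasing frequency normally means raising voltage too, so power climbs faster than performance. A toy sketch; the voltage/clock pairs are invented purely to illustrate the shape of the curve:

[code]
# Dynamic power ~ C * V^2 * f. Higher clocks usually need higher voltage,
# so power rises superlinearly with frequency. All V/f pairs are made up.
def rel_power(v, f_ghz, v0=1.00, f0=1.40):
    return (v / v0) ** 2 * (f_ghz / f0)

for v, f in ((1.00, 1.40), (1.10, 1.55), (1.20, 1.65)):
    print(f"{f:.2f} GHz @ {v:.2f}V -> {f / 1.40:.2f}x clock, {rel_power(v, f):.2f}x power")

# 1.40 GHz @ 1.00V -> 1.00x clock, 1.00x power
# 1.55 GHz @ 1.10V -> 1.11x clock, 1.34x power
# 1.65 GHz @ 1.20V -> 1.18x clock, 1.70x power
[/code]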

It isn't even architecture, it's just outright bad drivers.

>hardware scheduler
>better hardware in general
>needs brute force to battle gameworks

really? i thought they've been fixing the whole bad drivers thing for over a decade now.

nvidia needs 2x the amperage

Oh, the drivers are better than what they used to be for sure, but they are still not good. Plus the drivers always lag behind the hardware; AMD GPU drivers at launch are much worse than Nvidia's and take a long time to catch up.

Quality post

now that's a FUD benchmark if i've ever seen one. it's not only some obscure metric no one ever uses, it also takes the infinity fabric's power consumption and charges it all to a single thread, as if it would go up multiplicatively

Attached: IF Power 2950X.png (1527x999, 133K)

I have no idea about the current situation, but a couple of years ago AMD GPUs used to be more efficient for computational usage; that's why miners loved them. It's just that they suck for gaming in an Nvidia-dominated market. Look at the few games AMD actually helped develop: cards like the Vega 56 even beat a 1080 Ti in those titles, making the power usage difference something you'd expect.

Attached: 1510683968980.jpg (708x1000, 417K)