Why are AMD GPUs so fucking inefficient?

They literally use 2x the power and make 2x the noise compared to Nvidia's equivalents

Attached: 1544796810728.png (500x970, 46K)

Why are Nvidia shills so obvious?

Well yeah, an Nvidia shill on a board bursting at the seams with AMD shills would stand out

85/66 ≈ 1.29, so that's ~29% more power, not 2x.
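A quick sanity check, assuming the chart's figures really are 85 W vs 66 W (the only numbers cited in this thread):

```python
# Sanity-checking the "2x the power" claim, assuming the chart's
# figures are 85 W (AMD) vs 66 W (Nvidia) as cited above.
amd_watts = 85
nvidia_watts = 66

ratio = amd_watts / nvidia_watts
print(f"ratio: {ratio:.2f}x")          # ~1.29x, nowhere near 2x
print(f"extra draw: {ratio - 1:.0%}")  # ~29% more, not +100%
```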

Enough with the lies, bro.

Yeah there sure are five AMD shill threads up right after AMD announced a new product you fucking cunt.

where is the 2050 ti?

based nvidia

Attached: 1420100070807.png (425x251, 197K)

they got a point

>power efficiency doesn't matter

cringeworthy amd retard

It didn't matter back when Nvidia was shitting their pants with Fermi.

It didn't, because the cards delivered amazing performance and value per dollar. That fucking GTX 460 carried me for a long time.

Nice projecting.

GCN is a mess

>moving the goalposts

It's obviously a less efficient architecture, but if you're not gaming or mining constantly, who really cares? I had a 1070 Ti but sold it for an RX 580 because it was all I needed, and I pocketed the extra $200. It's been more than adequate even if it does use more power.

AMD overclocks and overvolts its GPUs to compete against Nvidia.

The goalposts are still the same: Vega is an expensive piece of shit that arrived late to the party thanks to miners. You couldn't even buy them where I live until the memecoin market crashed; at that point the RX 570 was selling for the same price as a GTX 1070 because of its higher performance in memecoins.

Polaris didn't have enough CUs for the targeted segment, and high-CU-count GCN (Vega 56/64) is front-end bottlenecked due to being limited to 4 triangles per clock.

If AMD revised GCN's front end to handle 6+ triangles per clock, the perf/watt of 48+ CU GCN cards would improve significantly.
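As a rough illustration of why that would matter, here's a toy model (all numbers are made up for illustration, not measured): effective throughput in a geometry-heavy workload is the minimum of shader capacity, which scales with CU count, and the fixed front end.

```python
# Toy model of a fixed front end capping a scaling shader array.
# The 4-tris/clock limit comes from the post above; everything else
# (units, the 40-CU saturation point) is an illustrative assumption.
FRONT_END_TRIS_PER_CLOCK = 4

def effective_perf(cus, front_end_tris=FRONT_END_TRIS_PER_CLOCK,
                   saturation_cus=40):
    """Relative perf in a geometry-bound workload (arbitrary units)."""
    shader_limit = cus  # shader throughput scales with CU count
    # Hypothetical: a 4-tri/clock front end can only feed ~40 CUs'
    # worth of work in this workload; a wider one scales that up.
    front_end_limit = saturation_cus * (front_end_tris / 4)
    return min(shader_limit, front_end_limit)

for cus in (36, 48, 56, 64):
    print(f"{cus} CUs: 4 tris/clk -> {effective_perf(cus):.0f}, "
          f"6 tris/clk -> {effective_perf(cus, front_end_tris=6):.0f}")
```

Past the saturation point the 4-triangle front end flatlines while power keeps scaling with CU count, which is exactly the perf/watt problem described above.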

>If AMD revised GCN's front end to handle 6+ triangles per clock, the perf/watt of 48+ CU GCN cards would improve significantly.
why don't they do it?

Didn't you just say the only reason to buy a GPU is video games? Uh, what?

Time and money.

Not released.
They're doing this very inefficiently, as you can see: people undervolt and overclock their Vegas for a much better result. Still drawing considerably more power than Turing/Pascal, but the arch isn't that incredibly bad, though Vega is still bottlenecked like Fury, which also hurts perf/watt.
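For anyone wondering why undervolting pays off so well: dynamic power scales roughly with C·V²·f, so a voltage drop saves power quadratically while the clock (and thus performance) stays put. A minimal sketch with made-up Vega-ish numbers:

```python
# Why undervolting helps perf/watt: dynamic power ~ C * V^2 * f.
# Voltages and clocks below are illustrative, not measured values.
def dynamic_power(voltage, clock_mhz, c=0.25):
    # c is an arbitrary scaling constant standing in for capacitance
    return c * voltage**2 * clock_mhz

stock = dynamic_power(voltage=1.20, clock_mhz=1500)
undervolted = dynamic_power(voltage=1.05, clock_mhz=1500)
print(f"saved ~{1 - undervolted / stock:.0%} at the same clock")  # ~23%
```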

Which is why I bought the GTX 1070; it made way more sense than buying an RX 570, which was both inefficient and had much worse performance due to being a whole generation older, yet cost the same.

AMD may as well not have bothered making Vega, because no one other than miners bought them at the very end of the bubble.

Because AMD expects their users to undervolt it themselves.

Now that games are going to use more and more compute and FP16 math, I really wanna see how Nvidia will stack up as a gateway to hell.

How many times have you made this thread by now?

Attached: 1524641255368.jpg (500x373, 51K)

I went AWOL for about 2 days and this guy is still making the same threads...

You make this same thread every day on both /g/ and /v/.

Funny, because that's exactly what (most) AMD cards do... the RX 480, for example.

>revised GCN's front end to handle 6+ triangles per clock
The amount of effort this would take is so ridiculous that they might as well just give up on GCN entirely, which is what they've already done.

Dunno where you live, but the RX 570 is like $129 on Newegg now. Mining shit sucked up prices on both sides, but admittedly I remember the RX 580 costing far above MSRP.

Just look at Far Cry 5 or Forza benches if you want FP16 performance (Vega 56 beats a 1080 in both).

Just Wait™ for 7nm this year

Why are Nvidia shills suddenly so spooked about AMD? Their presentation didn't go so well, I guess.
Now check pic related and eat a giant dick, team green.

Attached: Radeon-RX-Vega-56-Efficiency.png (1016x622, 32K)

You probably live in a cold country and it doubled as a heater

It still doesn't matter.
So go AMD, you shill.

Imagine if Nvidia didn't use die space for tensor cores and didn't limit TDP for PR: GPUs 3x more powerful than now.
Why can't Nvidia give us good things? They've got the tech.

>I don't know how GPUs work
the post

Yes, that is indeed the case. Every time some guy looking to get traffic on his shitty website writes an article 'leaking' fake rumors, 10 threads go up instantly talking about it and how it's the end of Nvidia/Intel.

more ROPs -> more shaders -> more clock -> bigger fps
A GPU is a giant parallel calculator

So you really don't know

Because instead of making better cards they just bin them and overvolt, resulting in 300-watt cards that still underperform. Fuck AMD.

What's with AMD conning people into fixing their products?
>just OC it bro
>just OC and manually tune your RAM timings for a month
>just undervolt and overclock your GPU

At the same time, with no irony, they will laugh at Intel fags that delid their CPUs.

God, the complete lack of a sense of irony.

Why didn't you post the performance-per-dollar chart too, Pajeet? You will never, ever come close to making up the extra cost of an Nvidia card through power savings, unless you plan to keep the same card for ten years and run it at 100% load 24/7/365.
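For the skeptics, the break-even arithmetic is easy to run yourself. All inputs here are illustrative assumptions (price gap, wattage gap, electricity rate); plug in your own:

```python
# Break-even time for paying extra up front vs paying extra at the wall.
# All inputs are hypothetical; swap in your own numbers.
price_premium_usd = 100    # assumed extra cost of the Nvidia card
extra_watts = 85           # assumed extra draw of the AMD card at load
rate_usd_per_kwh = 0.12    # roughly typical US residential rate

cost_per_hour = (extra_watts / 1000) * rate_usd_per_kwh
break_even_hours = price_premium_usd / cost_per_hour
print(f"break-even after {break_even_hours:,.0f} hours at full load")

hours_per_day = 3          # assume a few hours of gaming per day
print(f"~{break_even_hours / (hours_per_day * 365):.1f} years at "
      f"{hours_per_day} h/day")
```

With those assumptions it takes roughly a decade of daily gaming to claw back a $100 premium, which is the point being made above.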

t. zoomer

The 480 was 10% faster than the 5870 in a best-case scenario, $100 more expensive, and used literally twice the power.

Attached: performance-per-dollar_3840-2160.png (500x1010, 52K)
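In ratio terms (taking the post's +10% perf, +$100 price, and 2x power at face value, with an assumed ~$380 baseline price for the 5870):

```python
# The 480-vs-5870 comparison as ratios. The +10% perf, +$100 price,
# and 2x power figures come from the post above; the $380 baseline
# price for the 5870 is an assumption for scale.
perf_ratio = 1.10
price_5870 = 380
price_480 = price_5870 + 100
power_ratio = 2.0

print(f"perf/$ vs 5870: {perf_ratio / (price_480 / price_5870):.2f}x")  # ~0.87x
print(f"perf/W vs 5870: {perf_ratio / power_ratio:.2f}x")               # 0.55x
```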

Undervolting doesn't potentially damage my card or void the warranty.

Not him, but I apparently don't know how GPUs work either. Why isn't his suggestion correct?

But with Thermi you saved at least $500 on heating.

>amd literally worse in perf per dollar too

Feels good to be RX 580.
Also
>4K

>Nvidia
THEY'RE lowering GFX settings in drivers

keep telling yourself that buddy

Attached: 298347234234236.png (636x481, 44K)

Novidia on suicide watch.
Your funeral is happening now.

AMDrones will tell you the heat is negligible and the power consumption won't bankrupt you.
Why are they so stupid? Why? It confounds me.