THIS KILLS THE AYYMD

Attached: performance-per-watt_1920-1080.png (500x810, 44K)

Other urls found in this thread:

linustechtips.com/main/topic/825782-vega-56-undervolting-can-beat-gtx-1080-in-benchmarks/
youtube.com/watch?v=CuwAAwfAZFo
techpowerup.com/252900/amd-partners-cut-pricing-of-radeon-rx-vega-56-to-preempt-geforce-gtx-1660-ti

Just imagine when Nvidia releases the 7nm version of turing!

There is no 7nm version of Turing

Nvidia is moving on to the next-gen GPU microarchitecture: CUDA Compute Capability 8.0 to replace the 2-year-old Volta (Compute Capability 7.0), launching soon at GTC 2019 in March, and Compute Capability 8.x consumer GPUs to replace Turing (Compute Capability 7.5) in 2020

Your post gave me a headache

>implying literally anyone was buying the 590 anyway

Everyone thinks it was shit, including fanboys. It was crap compared to the 580 in terms of value, and crap compared to the Vega 56 in terms of performance. Shit card.

>1080p

It's a great low-end card now, but you still have the Vega 56, which will generally overclock to Vega 64 levels if you want, bringing 1440p and possibly 4K into the mix.

And to add: frankly, if you ever have to factor power consumption into gaming, then maybe you shouldn't be buying a gaming desktop. Which is the only logical reason you would buy this card.

They deliberately gimp radeon by disabling DX12 on CIV VI for no reason
>1080p
yeah nvidia's shitty products are always better at lower resolutions

Attached: civilization-vi_2560-1440.png (500x570, 38K)

This is why AMD's gayming laptops are a mess and Nvidia has 95% of this market

4K

Attached: performance-per-watt_3840-2160.png (500x810, 43K)

I just don't care about their paid reviews

You don't need to OC a Vega 56 to do 1440p though

>vega VII
Worst card ever made, literally an unsold Mi25

Attached: 1549849950733.png (1920x1080, 494K)

>be techpowerup reviewer
>never turn on dx12 to make nvidia's shitty products look better

Attached: a2ffbf77-cfa8-413e-bce1-29a91fe8c3a8.png (757x831, 47K)

Power draw doesn't affect gameplay or other tasks. It's an extra expense added on over the cost of the card's lifetime.
Not to mention the card has baked-in OC performance. You can't change the power draw.
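For a rough sense of scale (every number below is an assumption, not a measurement), that lifetime electricity cost works out to something like:

extra_watts = 70         # assumed extra draw under load vs a 1660 Ti
hours_per_day = 3        # assumed gaming hours per day
price_per_kwh = 0.13     # USD per kWh, varies a lot by region
years = 3                # assumed time you keep the card
kwh = extra_watts / 1000 * hours_per_day * 365 * years
print(round(kwh * price_per_kwh, 2))   # roughly $30 over three years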

Are we going to have to wait 2 years for the next generation of GPUs?

* that is an added-on cost over the card's lifetime

AMD doesn't know how to make an efficient GPU

>nvidiots grasping at straws
Every time.

Radeon is shit

>6GB vRAM
>barely beats the 590 in most DX12 titles
>still curb stomped by vega 56 in most DX12 titles
>most newer titles will support DX12
>vega 56 with mild UV gets fairly close to 1660ti efficiency
>both AMD cards now dropping in price
If anything this card just helped AMDfags like us get cheaper better cards. Thanks nvidia.

It's a stupid card at a stupid segment.

I would rather just have them keep making the GTX 10 series, or push out a 1070 Ti-ish equivalent for around the same or slightly higher price point than the 2060, but without RTX

If anything Ryzen just helped Intelfags like us get cheaper better CPUs. Thanks AMD.

$150 RX 590 fucking when

Its over, Polaris is fucking finished.

based hexusposter

don't worry user, Navi will come in 2020

soon, brother.

See, everybody wins? Competition is good. The i5-8400 was constantly going for $200+ and now you can get it for $180. All because intel shit their pants over how an OC'd 2600 with expensive 3200 CL14 RAM was curb stomping it.

Turing was a stunt to sell ridiculous overstock of pascal cards

it worked

we'll be moving on to the next gen before the end of 2019

It also kills RTX.

Fucking retards at AMD keep pushing GPU clock and thus power draw, instead of pushing for better memory clock, which would dramatically improve performance without increasing power draw. I have an RX480 that matches or exceeds most 580s without ever using more than 110W, just by overclocking the memory.
The RX 480 needs about 30% more memory bandwidth than it has, let alone the 580 or 590.
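As a rough sanity check on that bandwidth claim (the bus width and memory speed below are the commonly quoted reference-card figures, used here as assumptions):

bus_width_bits = 256       # RX 480 memory bus
effective_gbps = 8.0       # GDDR5 effective data rate on the 8GB reference card
bandwidth_gbs = bus_width_bits / 8 * effective_gbps    # = 256 GB/s
target_gbs = bandwidth_gbs * 1.30                      # ~333 GB/s for +30%
needed_gbps = target_gbs / (bus_width_bits / 8)        # ~10.4 Gbps effective
print(bandwidth_gbs, round(target_gbs), round(needed_gbps, 1))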

Extra power cost is less important than the fact that the card doesn't dump an extra 100W of heat into your system and produces a lot less noise, too.

>Housefires are okay when AMD does it.

>Housefires are not okay when Intel does it

To be fair vega is pretty overvolted in general.

>"The undervolting potential of the Radeon RX Vega 56 is somewhat higher.Here, we were able to reduce the GPU voltage from 1,200 mV to 1.070 mV (-12%) and maintain the clock at 1.613 MHz, an increase of almost 25% in extreme cases.In part, the Radeon RX Vega 56 reduces its clock rate in our tests to 1,300 MHz.In this respect the 1.613 MHz achieved is a very good result.This is also reflected in the reduction of the power consumption by 73 W."

linustechtips.com/main/topic/825782-vega-56-undervolting-can-beat-gtx-1080-in-benchmarks/

That means you go from 220W to ~150W power consumption AND you get higher performance from sustained turbo clocks as well. In fact just UV'ing a vega 56 can make it curb stomp a gtx 1080.
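Back-of-envelope check, assuming dynamic power scales roughly with V² at a fixed clock (a simplification; real draw also depends on clocks, leakage and load):

v_stock, v_uv = 1.200, 1.070     # volts, from the quoted review
p_stock = 220.0                  # W, rough stock board power from the post above
scale = (v_uv / v_stock) ** 2    # ~0.80
print(round(p_stock * scale))    # ~175 W from the voltage drop alone; the review's
                                 # 73 W reduction (~147 W) also reflects less
                                 # throttling and leakage at the lower voltage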

You have to hand it to Nvidia. They know how to segment well and they also know when to start dropping each of those segments for maximum sales. I imagine if they had dropped the 1660 or 2060 a couple of months ago they would have just eaten into their 2080/Ti & 2070 sales, but now they've managed to shift units at a higher price to early adopters and will clean up the rest with these cards.

That applies to Polaris as well, btw. An RX 590 UV can get it pretty close to 100W power consumption AND increase sustained turbo clocks, which bumps up performance before OC'ing.

Silicon lottery too just like overclocking.

>That means you go from 220W to ~150W power consumption AND you get higher performance from sustained turbo clocks as well. In fact just UV'ing a vega 56 can make it curb stomp a gtx 1080.
why was amd so far off the mark? something doesn't add up.

Big if true

They're not. Undervolting is the same as overclocking to the brim of your voltage, except you're pushing the voltage down until stock clocks are just stable. In 6 months it's not going to be stable anymore and you're going to have to increase it. Nvidia does a much better job with the device power management, which is why they locked it down, but even then they always have an "overvolt" to keep it safe.

Honestly an undervolted Vega 56 is amazing and if you get lucky with Samsung HBM it comfortably beats the 1070Ti.
It does require some effort from the user though, if you're lazy and just want to install and play then go with the 1660Ti.

>you're pushing the voltage down until stock clocks are just stable. In 6 months it's not going to be stable anymore
Absolute unmitigated garbage.
AMD simply set far too aggressive voltage settings by default, pulling them back allows boost frequencies to reach much higher and vastly reduces power consumption with ZERO impact on short or long term stability. Find actual proof showing otherwise instead of posting nonsense.

Turing is actually a big overhaul from Pascal. Reworked cache and memory controller, 2 to 3x the L2 and L1 cache, reworked CUDA cores, dedicated FP16 units (tensor cores in the big Turings, but the small TU116 apparently has dedicated non-tensor FP16 units too) that can work in tandem with, but independently from, the CUDA cores, and of course RT cores in the pipeline. It's a true next-gen architecture and nothing to sneeze at. They probably produced it on 12nm because 1) it's a significant change and more complex than Pascal, and 2) 7nm is still very expensive and everything about Turing is bigger and wider than Pascal.

Async compute in Volta crushes Pascal and really fuckin crushes Maxwell

>2 years

2020 is next year, you know; you just have to wait 1 year for Nvidia's 7nm GPUs, which will benefit from a more mature 7nm process

>That means you go from 220W to ~150W power consumption
Source: my ass

Nope, 100-200mV UV is regularly reported on vega 56s. The silicon lottery here is getting those 10% of cards that can't UV at all.

From what I gather AMD simply wanted more GPUs to make it out alive so they applied that 100-200mV overvolt to the binning process, which does help with turbos but was mainly done to ensure higher output volume. Unfortunately this stunted actual gaymen performance, along with shoddy drivers at launch, but AMD was still able to make a killing due to the buttcoin mining craze.

Now, the part about undervolts failing after a few months seems very unlikely given how miners cranked the voltage down to 900mV on most 56s and were still able to maintain 1,100-1,200 MHz clocks under 100% torturous mining loads for more than a year.

Hard to believe isn't it?

youtube.com/watch?v=CuwAAwfAZFo

>Performance per watt
Literally who gives a fuck. Just post performance.

>~150W power consumption

Attached: undervolt.jpg (1246x788, 165K)

There's no point in a 2060 without RTX because then RTX 2060 is pointless.

It does matter unless you use water cooling, which is expensive and cumbersome. Higher power consumption leads to auto-lowering of clocks to prevent thermal damage to the GPU, which causes frame stuttering AND lower average frame rates.

Luckily, both Polaris and Vega respond well to UV'ing.
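Purely as an illustration of that throttling behaviour (this is a sketch, not any vendor's real boost algorithm): once the die hits its temperature limit the boost controller steps clocks down, which is what shows up as stutter and lower average FPS; an undervolt lowers heat output, so that branch fires less often.

def boost_step(clock_mhz, temp_c, temp_limit_c=84, step_mhz=25, max_mhz=1630):
    if temp_c >= temp_limit_c:
        return clock_mhz - step_mhz             # throttle down to shed heat
    return min(clock_mhz + step_mhz, max_mhz)   # otherwise boost toward the cap

print(boost_step(1600, 86))   # 1575 - throttling
print(boost_step(1600, 70))   # 1625 - boosting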

Correct me if I'm wrong but that includes an OC of the P7 state as well.

>From what I gather AMD simply wanted more GPUs to make it out alive so they applied that 100-200mV overvolt
AMD only does preliminary binning and sets reference power at a given frequency. OEMs would have done the binning themselves and undervolted if they thought it was suitable. There's a reason they didn't. You can even undervolt stock Intel CPUs by 5-15%, and Intel is leading in power management.

Didn't they get pushed out of the PC gaming market a long time ago?

I'd rather spend countless hours and months undervolting and waiting for good drivers than pay the nvidia premium price.

They only went down to 1070mV, which is a very conservative UV.
>that includes an OC of the P7 state as well.
AND +50% PL

It's even easier now. The 2019 Adrenalin drivers include one-click auto-undervolting. You might still be able to squeeze out an extra 50-100mV doing it manually tho.

>buying Radeon
j-just undervolt it sir

Attached: housefire.jpg (1024x576, 113K)

>only 6 gb of vram
lmfao even the 570 8gb is a better future-proof card than this crap, nvidiots are retarded

>6GB VRAM

>570
>better future-proof
Amdrone at its finest

Where's the 480? It's more efficient than the 1060s.

Posters like this unironically have slower Nvidia cards than AMD.

Chinese mining rigs

Nobody cares about paying an extra $1 on their power bill each month; what matters is the heat generated. I shouldn't be warming my house for something that performs as poorly as an RX 590.

even a 1060 is good enough for WQHD thanks to the abysmal GPUs in the consoles right now

>AMD CPU
Moar cores!
>AMD GPU
Moar VRAM & heat!

>It's a stupid card at a stupid segment.
No it's not, since Nvidia plans to release two cheaper cards below it in the next few months; that way they don't cannibalize their own cards at launch.

>actual RTX 2080 Ti owners don't have anything to prove, so they are never shitflinging
>mfw you realize the amdrone shouters own 5 year old nvidia cards

Attached: 1517734730923.jpg (400x386, 25K)

Enjoy your small victory, nvidiots, because soon you will get BTFO really hard.

Attached: justwaitfornavi.jpg (1080x1397, 283K)

Can we finally kick AMPoojeet shills off this board once this "leak" gets BTFO and Navi is a flop?

This. 6GB of vRAM is a hard pill to swallow.

>Believing the lies of an AYYMD asslicker Scottish cuck living on Swedish welfare

see

>6GB of vRAM
I don't know senpai, is that really enough for another 4 years? Because that's how long I plan to keep my GPU.

It's not. You should buy a Raiden 7 with 16GB of ram. The perfect 1080p card.

That was my point. Who gives a fuck about a performance-per-watt chart.

>is that really enough for another 4 years?
Next gen consoles come out in November 2020. You tell me.

Buy a 2070/2080 instead

Their hardware is already set in stone.

Don't buy Pooga56!

AYYMD SCAM EXPOSED!

Attached: Pooga.png (1280x290, 47K)

See
Though if it does stay at $400, that makes it a hard sell against the RTX 2060, which does have gaytracing.

Overall I'm convinced the 1660 Ti is a hard sell when it only has 6GB of vRAM and is only 25% faster than a $250 RX 590 (aka an OC'd $200 RX 580), especially with the 100-200mV undervolting Polaris and Vega usually give you.

Why do you keep babbling about the vram? Are you the same people who believe more GHz on cpus means it's better?

>moar vram
What do you expect from amdrones?

8 > 6, simple as

because the 2080 is choking on games at 4k

This is a 1440p/1080p card, not 4k.

LMAO

Just buy the RX 570 8gb if you want to play at 4k

Attached: AHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHHH.png (396x408, 165K)

Because we don't want a card that is gonna be useless two years from now.

>2018+1
>moar VRAM
A R7 240 2GB is better than the GTX750 1GB, right?

Cope harder

Attached: 1522362940763.jpg (400x285, 33K)

Do the needful sirs, buy Raiden 7 16gb ram

HWU is the most delusional and retarded reviewer

Vega 56 has been dropping in price in the EU since DECEMBER

holy shit he is retarded

more like pajeetpowerup

hue hue hue hue

techpowerup.com/252900/amd-partners-cut-pricing-of-radeon-rx-vega-56-to-preempt-geforce-gtx-1660-ti

>this
Power consumption is a tie breaker.